43 Commits
2.0 ... 3.0

Author SHA1 Message Date
geoffsee
1dab5aaa14 Bun server handles static assets and API 2025-06-25 16:46:46 -04:00
geoffsee
a295c208e9 Update React, React-DOM, and related dependencies to latest versions. 2025-06-25 16:30:42 -04:00
dependabot[bot]
713f0ffe8b Bump @anthropic-ai/sdk from 0.32.1 to 0.54.0
Bumps [@anthropic-ai/sdk](https://github.com/anthropics/anthropic-sdk-typescript) from 0.32.1 to 0.54.0.
- [Release notes](https://github.com/anthropics/anthropic-sdk-typescript/releases)
- [Changelog](https://github.com/anthropics/anthropic-sdk-typescript/blob/main/CHANGELOG.md)
- [Commits](https://github.com/anthropics/anthropic-sdk-typescript/compare/sdk-v0.32.1...sdk-v0.54.0)

---
updated-dependencies:
- dependency-name: "@anthropic-ai/sdk"
  dependency-version: 0.54.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-25 16:22:59 -04:00
dependabot[bot]
a793bfe8e0 Bump react-dom from 18.3.1 to 19.1.0
Bumps [react-dom](https://github.com/facebook/react/tree/HEAD/packages/react-dom) from 18.3.1 to 19.1.0.
- [Release notes](https://github.com/facebook/react/releases)
- [Changelog](https://github.com/facebook/react/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/react/commits/v19.1.0/packages/react-dom)

---
updated-dependencies:
- dependency-name: react-dom
  dependency-version: 19.1.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-25 16:22:51 -04:00
dependabot[bot]
d594929998 Bump @testing-library/react from 14.3.1 to 16.3.0
Bumps [@testing-library/react](https://github.com/testing-library/react-testing-library) from 14.3.1 to 16.3.0.
- [Release notes](https://github.com/testing-library/react-testing-library/releases)
- [Changelog](https://github.com/testing-library/react-testing-library/blob/main/CHANGELOG.md)
- [Commits](https://github.com/testing-library/react-testing-library/compare/v14.3.1...v16.3.0)

---
updated-dependencies:
- dependency-name: "@testing-library/react"
  dependency-version: 16.3.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-25 16:22:36 -04:00
geoffsee
6d9bf79ba3 Update tests to use updated HUMAN/ASSISTANT format instead of **Human**/**Assistant**. 2025-06-25 16:16:23 -04:00
geoffsee
6b5928de7f Update AssetService SSR handling tests: refine mocks and add edge cases 2025-06-25 16:12:59 -04:00
geoffsee
f9249f3496 - Refactored to introduce handleSsr function in @open-gsio/client/server/index.ts for streamlined SSR handling.
- Replaced inline SSR logic in `AssetService.ts` with `handleSsr` import.
- Enhanced `build:client` script to ensure server directory creation.
- Updated dependencies and devDependencies across multiple packages for compatibility improvements.
2025-06-25 16:03:13 -04:00
geoffsee
93bec55585 Add bun wrangler tail log script and filter non-text models 2025-06-25 14:32:54 -04:00
geoffsee
8cdb6b8c94 - Refine assistant output formatting by removing bold headers and adjusting response template.
- Update `package.json` across multiple packages to include missing newline and add package manager metadata.
- Minor README formatting fixes to remove unnecessary trailing spaces.
2025-06-25 14:15:01 -04:00
geoffsee
48bedb8c74 fix nonexistent suite 2025-06-25 14:00:16 -04:00
geoffsee
068d8614e0 tests updated with new import 2025-06-25 14:00:16 -04:00
geoffsee
554096abb2 wip 2025-06-25 14:00:16 -04:00
geoffsee
21d6c8604e GitHub button targets repo 2025-06-24 20:56:08 -04:00
geoffsee
de3173a8f8 add missing files to last commit 2025-06-24 20:46:36 -04:00
geoffsee
c6e09644e2 **Refactor:** Restructure server package to streamline imports and improve file organization
- Moved `providers`, `services`, `models`, `lib`, and related files to `src` directory within `server` package.
- Adjusted imports across the codebase to reflect the new paths.
- Renamed several `.ts` files for consistency.
- Introduced an `index.ts` in the `ai/providers` package to export all providers.

This improves maintainability and aligns with the project's updated directory structure.
2025-06-24 20:46:15 -04:00
geoffsee
0b8d67fc69 remove package manager spec 2025-06-24 17:36:39 -04:00
geoffsee
f76301d620 run format 2025-06-24 17:32:59 -04:00
geoffsee
02c3253343 adds eslint 2025-06-24 17:32:59 -04:00
geoffsee
9698fc6f3b Refactor project: remove unused code, clean up logs, streamline error handling, update TypeScript configs, and enhance message streaming.
- Deployed
2025-06-24 16:28:25 -04:00
geoffsee
004ec580d3 Remove unused ResumeComponent, ServicesComponent, and related sections. Update theming for SupportThisSiteModal, adjust DogecoinIcon, and refine Cloudflare worker references. 2025-06-24 15:51:39 -04:00
geoffsee
bdbc8de6d5 **Remove dead links and redundant comments; improve styling and clarity across multiple files**
- Removed outdated links and unused properties in Sidebar and Welcome Home Text files.
- Dropped extraneous comments and consolidated imports in server files for streamlined code.
- Enhanced MarkdownEditor visuals with a colorful border for better user experience.
2025-06-24 15:23:34 -04:00
geoffsee
a367812fe7 update prompts and ollama endpoint 2025-06-24 15:12:12 -04:00
geoffsee
22bf2f1c2f Fix provider endpoints 2025-06-24 15:01:43 -04:00
geoffsee
02ede2b0f6 Refactor ServerCoordinator and project structure for clearer durable objects organization and module imports. 2025-06-18 15:53:17 -04:00
geoffsee
afc46fe2c3 fix tests 2025-06-18 15:02:29 -04:00
geoffsee
b7f02eb4fb fix mlx omni provider 2025-06-18 14:33:07 -04:00
geoffsee
f1d7f52dbd fixes model initialization for mlx 2025-06-18 13:30:38 -04:00
geoffsee
38b364caeb fix local inference config 2025-06-18 12:38:38 -04:00
geoffsee
3d16bd94b4 **Refactor imports and improve type annotations**
- Adjusted import statements across the codebase to align with consistent use of `type`.
- Unified usage of `EventSource` initialization.
- Introduced `RootDeps` type for shared dependencies.
- Commented out unused VitePWA configuration.
- Updated proxy target URLs in Vite configuration.
2025-06-18 12:34:16 -04:00
geoffsee
7454c9b54b fix build 2025-06-18 10:41:39 -04:00
geoffsee
0c999e0400 fixes tests 2025-06-09 23:18:52 -04:00
geoffsee
362f50bf85 remove faulty test execution pattern 2025-06-09 23:18:52 -04:00
geoffsee
9e79c488ee correct README 2025-06-09 23:18:52 -04:00
geoffsee
370c3e5717 adjust README and local inference configuration script 2025-06-09 23:18:52 -04:00
geoffsee
f29bb6779c improves interoperability of model providers; local and remote providers can be used together seamlessly 2025-06-09 23:18:52 -04:00
Geoff Seemueller
ad7dc5c0a6 Update README.md
improve semantics

Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-05 14:04:08 -04:00
Geoff Seemueller
059e7d3218 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-04 20:19:12 -04:00
geoffsee
6be0316e75 add some missing files to the last commit 2025-06-04 20:09:39 -04:00
geoffsee
5bd1e2f77f add Acknowledgments section to README 2025-06-04 20:05:02 -04:00
geoffsee
03aae4d8db fix static fileserver 2025-06-04 19:00:10 -04:00
geoffsee
5d7a7b740a fix package script for server:dev 2025-06-04 18:52:39 -04:00
geoffsee
31d734d4f6 fix incorrect constructor usage 2025-06-04 18:50:59 -04:00
247 changed files with 6693 additions and 6333 deletions

41
.eslintignore Normal file

@@ -0,0 +1,41 @@
# Dependencies
node_modules
.pnp
.pnp.js
# Build outputs
dist
build
out
.next
.nuxt
.cache
# Test coverage
coverage
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local
# Editor directories and files
.idea
.vscode
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
# TypeScript
*.d.ts

49
.eslintrc.cjs Normal file

@@ -0,0 +1,49 @@
module.exports = {
root: true,
parser: '@typescript-eslint/parser',
parserOptions: {
ecmaVersion: 2021,
sourceType: 'module',
project: './tsconfig.json',
},
env: {
browser: true,
node: true,
es6: true,
},
globals: {
Bun: 'readonly',
},
plugins: ['@typescript-eslint', 'import', 'prettier'],
extends: [
'eslint:recommended',
'plugin:@typescript-eslint/recommended',
'plugin:import/errors',
'plugin:import/warnings',
'plugin:import/typescript',
'prettier',
],
rules: {
'prettier/prettier': 'error',
'@typescript-eslint/explicit-module-boundary-types': 'off',
'@typescript-eslint/no-explicit-any': 'warn',
'@typescript-eslint/no-unused-vars': ['warn', { argsIgnorePattern: '^_' }],
'import/order': [
'error',
{
'newlines-between': 'always',
alphabetize: { order: 'asc', caseInsensitive: true },
groups: ['builtin', 'external', 'internal', 'parent', 'sibling', 'index'],
},
],
},
settings: {
'import/resolver': {
node: {
extensions: ['.js', '.jsx', '.ts', '.tsx'],
moduleDirectory: ['node_modules', 'packages/*/node_modules'],
},
},
},
ignorePatterns: ['node_modules', 'dist', 'build', '*.d.ts', '*.min.js'],
};

3
.gitignore vendored

@@ -13,4 +13,5 @@ packages/client/public/static/fonts/*
**/.dev.vars
packages/client/public/sitemap.xml
packages/client/public/robots.txt
wrangler.dev.jsonc
wrangler.dev.jsonc
/packages/client/public/static/fonts/

47
.prettierignore Normal file

@@ -0,0 +1,47 @@
# Dependencies
node_modules
.pnp
.pnp.js
# Build outputs
dist
build
out
.next
.nuxt
.cache
# Test coverage
coverage
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local
# Editor directories and files
.idea
.vscode
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
# Package files
package-lock.json
yarn.lock
pnpm-lock.yaml
bun.lock
# Generated files
CHANGELOG.md

19
.prettierrc.cjs Normal file

@@ -0,0 +1,19 @@
module.exports = {
semi: true,
singleQuote: true,
trailingComma: 'all',
printWidth: 100,
tabWidth: 2,
useTabs: false,
bracketSpacing: true,
arrowParens: 'avoid',
endOfLine: 'lf',
overrides: [
{
files: '*.{json,yml,yaml,md}',
options: {
tabWidth: 2,
},
},
],
};


@@ -1,60 +1,60 @@
Legacy Development History
---
## Legacy Development History
The source code of open-gsio was drawn from the source code of my personal website. That commit history was contaminated early on with secrets. `open-gsio` is a refinement of those sources. A total of 367 commits were submitted to the main branch of the upstream source repository between August 2024 and May 2025.
#### **May 2025**
* Added **seemueller.ai** link to UI sidebar.
* Global config/markdown guide cleanup; patched a critical forgotten bug.
- Added **seemueller.ai** link to UI sidebar.
- Global config/markdown guide cleanup; patched a critical forgotten bug.
#### **Apr 2025**
* **CI/CD overhaul**: auto-deploy to dev & staging, Bun adoption as package manager, streamlined blocklist workflow (now auto-updates via VPN blocker).
* New 404 error page; multiple robots.txt and editor-resize fixes; removed dead/duplicate code.
- **CI/CD overhaul**: auto-deploy to dev & staging, Bun adoption as package manager, streamlined blocklist workflow (now auto-updates via VPN blocker).
- New 404 error page; multiple robots.txt and editor-resize fixes; removed dead/duplicate code.
#### **Mar 2025**
* Introduced **model-specific `max_tokens`** handling and plugged in **Cloudflare AI models** for testing.
* Bundle size minimised (re-enabled minifier, smaller vendor set).
- Introduced **model-specific `max_tokens`** handling and plugged in **Cloudflare AI models** for testing.
- Bundle size minimised (re-enabled minifier, smaller vendor set).
#### **Feb 2025**
* **Full theme system** (runtime switching, Centauri theme, server-saved prefs).
* Tightened MobX typing for messages; responsive breakpoints & input scaling repaired.
* Dropped legacy document API; general folder restructure.
- **Full theme system** (runtime switching, Centauri theme, server-saved prefs).
- Tightened MobX typing for messages; responsive breakpoints & input scaling repaired.
- Dropped legacy document API; general folder restructure.
#### **Jan 2025**
* **Rate-limit middleware**, larger KV/R2 storage quota.
* Switched default model → *llama-v3p1-70b-instruct*; pluggable model handlers.
* Added **KaTeX fonts** & **Marked.js** for rich math/markdown.
* Fireworks key rotation; deprecated Google models removed.
- **Rate-limit middleware**, larger KV/R2 storage quota.
- Switched default model → _llama-v3p1-70b-instruct_; pluggable model handlers.
- Added **KaTeX fonts** & **Marked.js** for rich math/markdown.
- Fireworks key rotation; deprecated Google models removed.
#### **Dec 2024**
* Major package upgrades; **CodeHighlighter** now supports HTML/JSX/TS(X)/Zig.
* Refactored streaming + markdown renderer; Android-specific padding fixes.
* Reset default chat model to **gpt-4o**; welcome message & richer search-intent logic.
- Major package upgrades; **CodeHighlighter** now supports HTML/JSX/TS(X)/Zig.
- Refactored streaming + markdown renderer; Android-specific padding fixes.
- Reset default chat model to **gpt-4o**; welcome message & richer search-intent logic.
#### **Nov 2024**
* **Fireworks API** + agent server; first-class support for **Anthropic** & **GROQ** models (incl. attachments).
* **VPN blocker** shipped with CIDR validation and dedicated GitHub Action.
* Live search buffering, feedback modal, smarter context preprocessing.
- **Fireworks API** + agent server; first-class support for **Anthropic** & **GROQ** models (incl. attachments).
- **VPN blocker** shipped with CIDR validation and dedicated GitHub Action.
- Live search buffering, feedback modal, smarter context preprocessing.
#### **Oct 2024**
* Rolled out **image generation** + picker for image models.
* Deployed **ETH payment processor** & deposit-address flow.
* Introduced few-shot prompting library; analytics worker refactor; Halloween prompt.
* Extensive mobile-UX polish and bundling/worker config updates.
- Rolled out **image generation** + picker for image models.
- Deployed **ETH payment processor** & deposit-address flow.
- Introduced few-shot prompting library; analytics worker refactor; Halloween prompt.
- Extensive mobile-UX polish and bundling/worker config updates.
#### **Sep 2024**
* End-to-end **math rendering** (KaTeX) and **GitHub-flavoured markdown**.
* Migrated chat state to **MobX**; launched analytics service & metrics worker.
* Switched build minifier to **esbuild**; tokenizer limits enforced; gradient sidebar & cookie-consent manager added.
- End-to-end **math rendering** (KaTeX) and **GitHub-flavoured markdown**.
- Migrated chat state to **MobX**; launched analytics service & metrics worker.
- Switched build minifier to **esbuild**; tokenizer limits enforced; gradient sidebar & cookie-consent manager added.
#### **Aug 2024**
* **Initial MVP**: iMessage-style chat UI, websocket prototype, Google Analytics, Cloudflare bindings, base worker-site scaffold.
- **Initial MVP**: iMessage-style chat UI, websocket prototype, Google Analytics, Cloudflare bindings, base worker-site scaffold.

103
README.md

@@ -1,42 +1,29 @@
# open-gsio
[![Tests](https://github.com/geoffsee/open-gsio/actions/workflows/test.yml/badge.svg)](https://github.com/geoffsee/open-gsio/actions/workflows/test.yml)
[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)
</br>
<p align="center">
<img src="https://github.com/user-attachments/assets/620d2517-e7be-4bb0-b2b7-3aa0cba37ef0" width="250" />
</p>
> **Note**: This project is currently under active development. The styling is a work in progress and some functionality
> may be broken. Tests are being actively ported and stability will improve over time. Thank you for your patience!
This is a full-stack Conversational AI. It runs on Cloudflare or Bun.
This is a full-stack Conversational AI.
## Table of Contents
- [Stack](#stack)
- [Installation](#installation)
- [Deployment](#deployment)
- [Local Inference](#local-inference)
- [mlx-omni-server (default)](#mlx-omni-server)
- [Adding models](#adding-models-for-local-inference-apple-silicon)
- [Ollama](#ollama)
- [Adding models](#adding-models-for-local-inference-ollama)
- [mlx-omni-server (default)](#mlx-omni-server)
- [Adding models](#adding-models-for-local-inference-apple-silicon)
- [Ollama](#ollama)
- [Adding models](#adding-models-for-local-inference-ollama)
- [Testing](#testing)
- [Troubleshooting](#troubleshooting)
- [History](#history)
- [Acknowledgments](#acknowledgments)
- [License](#license)
## Stack
* [TypeScript](https://www.typescriptlang.org/)
* [Vike](https://vike.dev/)
* [React](https://react.dev/)
* [Cloudflare Workers](https://developers.cloudflare.com/workers/)
* [itty-router](https://github.com/kwhitley/itty-router)
* [MobXStateTree](https://mobx-state-tree.js.org/)
* [OpenAI SDK](https://github.com/openai/openai-node)
* [Vitest](https://vitest.dev/)
## Installation
1. `bun i && bun test:all`
@@ -46,30 +33,34 @@ This is a full-stack Conversational AI. It runs on Cloudflare or Bun.
> Note: it should be possible to use pnpm in place of bun.
## Deployment
1. Setup KV_STORAGE binding in `packages/server/wrangler.jsonc`
1. [Add keys in secrets.json](https://console.groq.com/keys)
1. [Add keys in secrets.json](https://console.groq.com/keys)
1. Run `bun run deploy && bun run deploy:secrets && bun run deploy`
> Note: Subsequent deployments should omit `bun run deploy:secrets`
## Local Inference
> Local inference is achieved by overriding the `OPENAI_API_KEY` and `OPENAI_API_ENDPOINT` environment variables. See below.
> Local inference is supported for Ollama and mlx-omni-server. OpenAI compatible servers can be used by overriding OPENAI_API_KEY and OPENAI_API_ENDPOINT.
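As a rough illustration (not from the repo), the override simply points an OpenAI-compatible client at a local endpoint. The defaults below are the local Ollama and mlx-omni-server addresses listed elsewhere in this changeset; the client construction itself is an assumption.

```ts
import { OpenAI } from 'openai';

// Hypothetical sketch: build an OpenAI-compatible client from the overridden
// variables. Defaults shown are the local endpoints (Ollama on :11434,
// mlx-omni-server on :10240); real key handling may differ.
const client = new OpenAI({
  baseURL: process.env.OPENAI_API_ENDPOINT ?? 'http://localhost:11434/v1',
  apiKey: process.env.OPENAI_API_KEY ?? 'not-needed-for-local',
});
```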
### mlx-omni-server
(default) (Apple Silicon Only) - Use Ollama for other platforms.
~~~bash
(default) (Apple Silicon Only)
```bash
# (prereq) install mlx-omni-server
brew tap seemueller-io/tap
brew install seemueller-io/tap/mlx-omni-server
brew tap seemueller-io/tap
brew install seemueller-io/tap/mlx-omni-server
bun run openai:local mlx-omni-server # Start mlx-omni-server
bun run openai:local:configure # Configure connection
bun run server:dev # Restart server
~~~
```
#### Adding models for local inference (Apple Silicon)
~~~bash
```bash
# ensure mlx-omni-server is running
# See https://huggingface.co/mlx-community for available models
@@ -81,21 +72,22 @@ curl http://localhost:10240/v1/chat/completions \
\"model\": \"$MODEL_TO_ADD\",
\"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]
}"
~~~
```
### Ollama
~~~bash
```bash
bun run openai:local ollama # Start ollama server
bun run openai:local:configure # Configure connection
bun run server:dev # Restart server
~~~
```
#### Adding models for local inference (ollama)
~~~bash
```bash
# See https://ollama.com/library for available models
use the ollama web ui @ http://localhost:8080
~~~
```
## Testing
@@ -103,20 +95,44 @@ Tests are located in `__tests__` directories next to the code they test. Testing
> `bun test:all` will run all tests
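For illustration only, a colocated test file might look like the sketch below; the path and test subject are hypothetical.

```ts
// Hypothetical file: packages/ai/src/utils/__tests__/example.test.ts
import { describe, expect, it } from 'vitest';

describe('example', () => {
  it('is picked up by `bun test:all`', () => {
    expect([1, 2, 3].includes(2)).toBe(true);
  });
});
```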
## Troubleshooting
1. `bun clean`
1. `bun i`
1. `bun server:dev`
1. `bun client:dev`
1. Submit an issue
1. Submit an issue
## History
History
---
A high-level overview for the development history of the parent repository, [geoff-seemueller-io](https://geoff.seemueller.io), is provided in [LEGACY.md](./LEGACY.md).
## Acknowledgments
I would like to express gratitude to the following projects, libraries, and individuals that have contributed to making open-gsio possible:
- [TypeScript](https://www.typescriptlang.org/) - Primary programming language
- [React](https://react.dev/) - UI library for building the frontend
- [Vike](https://vike.dev/) - Framework for server-side rendering and routing
- [Cloudflare Workers](https://developers.cloudflare.com/workers/) - Serverless execution environment
- [Bun](https://bun.sh/) - JavaScript runtime and toolkit
- [itty-router](https://github.com/kwhitley/itty-router) - Lightweight router for serverless environments
- [MobX-State-Tree](https://mobx-state-tree.js.org/) - State management solution
- [OpenAI SDK](https://github.com/openai/openai-node) - Client for AI model integration
- [Vitest](https://vitest.dev/) - Testing framework
- [OpenAI](https://github.com/openai)
- [Groq](https://console.groq.com/) - Fast inference API
- [Anthropic](https://www.anthropic.com/) - Creator of Claude models
- [Fireworks](https://fireworks.ai/) - AI inference platform
- [XAI](https://x.ai/) - Creator of Grok models
- [Cerebras](https://www.cerebras.net/) - AI compute and models
- [(madroidmaq) MLX Omni Server](https://github.com/madroidmaq/mlx-omni-server) - Open-source high-performance inference for Apple Silicon
- [MLX](https://github.com/ml-explore/mlx) - An array framework for Apple silicon
- [Ollama](https://github.com/ollama/ollama) - Versatile solution for self-hosting models
## License
~~~text
```text
MIT License
Copyright (c) 2025 Geoff Seemueller
@@ -138,7 +154,4 @@ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
~~~
```

887
bun.lock

File diff suppressed because it is too large

BIN
bun.lockb

Binary file not shown.


@@ -12,19 +12,32 @@
"clean": "packages/scripts/cleanup.sh",
"test:all": "bun run --filter='*' tests",
"client:dev": "(cd packages/client && bun run dev)",
"server:dev": "(cd packages/server && bun run dev)",
"build": "(cd packages/cloudflare-workers && bun run deploy:dry-run)",
"deploy": "(cd packages/cloudflare-workers && bun run deploy)",
"server:dev": "bun build:client && (cd packages/server && bun run dev)",
"build": "(cd packages/cloudflare-workers/open-gsio && bun run deploy:dry-run)",
"build:client": "(cd packages/client && bun run vite build)",
"deploy": "(cd packages/cloudflare-workers/open-gsio && bun run deploy)",
"deploy:secrets": "wrangler secret bulk secrets.json -c packages/cloudflare-workers/open-gsio/wrangler.jsonc",
"openai:local:mlx": "packages/scripts/start_inference_server.sh mlx-omni-server",
"openai:local:ollama": "packages/scripts/start_inference_server.sh ollama",
"openai:local:configure": "packages/scripts/configure_local_inference.sh"
"openai:local:configure": "packages/scripts/configure_local_inference.sh",
"lint": "eslint . --ext .js,.jsx,.ts,.tsx",
"lint:fix": "eslint . --ext .js,.jsx,.ts,.tsx --fix",
"format": "prettier --write \"**/*.{js,jsx,ts,tsx,json,md}\"",
"format:check": "prettier --check \"**/*.{js,jsx,ts,tsx,json,md}\"",
"log": "(cd packages/cloudflare-workers/open-gsio && bun wrangler tail)"
},
"devDependencies": {
"@types/bun": "latest"
"@types/bun": "^1.2.17",
"@typescript-eslint/eslint-plugin": "^8.35.0",
"@typescript-eslint/parser": "^8.35.0",
"eslint": "^9.29.0",
"eslint-config-prettier": "^10.1.5",
"eslint-plugin-import": "^2.32.0",
"eslint-plugin-prettier": "^5.5.1",
"happy-dom": "^18.0.1",
"prettier": "^3.6.1"
},
"peerDependencies": {
"typescript": "^5"
},
"packageManager": "pnpm@10.10.0+sha512.d615db246fe70f25dcfea6d8d73dee782ce23e2245e3c4f6f888249fb568149318637dca73c2c5c8ef2a4ca0d5657fb9567188bfab47f566d1ee6ce987815c39"
"typescript": "^5.8.3"
}
}


@@ -1 +0,0 @@
export * from "./supported-models.ts";


@@ -1,4 +1,48 @@
{
"name": "@open-gsio/ai",
"module": "index.ts"
}
"type": "module",
"module": "src/index.ts",
"exports": {
".": {
"import": "./src/index.ts",
"types": "./src/index.ts"
},
"./chat-sdk/chat-sdk.ts": {
"import": "./src/chat-sdk/chat-sdk.ts",
"types": "./src/chat-sdk/chat-sdk.ts"
},
"./providers/_ProviderRepository.ts": {
"import": "./src/providers/_ProviderRepository.ts",
"types": "./src/providers/_ProviderRepository.ts"
},
"./providers/google.ts": {
"import": "./src/providers/google.ts",
"types": "./src/providers/google.ts"
},
"./providers/openai.ts": {
"import": "./src/providers/openai.ts",
"types": "./src/providers/openai.ts"
},
"./src": {
"import": "./src/index.ts",
"types": "./src/index.ts"
},
"./utils": {
"import": "./src/utils/index.ts",
"types": "./src/utils/index.ts"
}
},
"scripts": {
"tests": "vitest run",
"tests:coverage": "vitest run --coverage.enabled=true"
},
"devDependencies": {
"@open-gsio/env": "workspace:*",
"@open-gsio/schema": "workspace:*",
"@anthropic-ai/sdk": "^0.55.0",
"openai": "^5.0.1",
"wrangler": "^4.18.0",
"vitest": "^3.1.4",
"vite": "^6.3.5"
}
}
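For context, a workspace consumer could import through the exports map above roughly as sketched here; the subpaths are taken from the map, while resolution of `workspace:*` packages by Bun is assumed.

```ts
// Hypothetical consumer of @open-gsio/ai inside the workspace.
import { ChatSdk } from '@open-gsio/ai'; // "." -> src/index.ts
import { ProviderRepository } from '@open-gsio/ai/providers/_ProviderRepository.ts';
```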


@@ -0,0 +1,155 @@
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { AssistantSdk } from '../assistant-sdk';
import { Utils } from '../utils/utils.ts';
// Mock dependencies
vi.mock('../utils/utils.ts', () => ({
Utils: {
selectEquitably: vi.fn(),
getCurrentDate: vi.fn(),
},
}));
vi.mock('../prompts/few_shots', () => ({
default: {
a: 'A1',
question1: 'answer1',
question2: 'answer2',
question3: 'answer3',
},
}));
describe('AssistantSdk', () => {
beforeEach(() => {
vi.useFakeTimers();
vi.setSystemTime(new Date('2023-01-01T12:30:45Z'));
// Reset mocks
vi.resetAllMocks();
});
afterEach(() => {
vi.useRealTimers();
});
describe('getAssistantPrompt', () => {
it('should return a prompt with default values when minimal params are provided', () => {
// Mock dependencies
Utils.selectEquitably.mockReturnValue({
question1: 'answer1',
question2: 'answer2',
});
Utils.getCurrentDate.mockReturnValue('2023-01-01T12:30:45Z');
const prompt = AssistantSdk.getAssistantPrompt({});
expect(prompt).toContain('# Assistant Knowledge');
expect(prompt).toContain('### Date: ');
expect(prompt).toContain('### Web Host: ');
expect(prompt).toContain('### User Location: ');
expect(prompt).toContain('### Timezone: ');
});
it('should include maxTokens when provided', () => {
// Mock dependencies
Utils.selectEquitably.mockReturnValue({
question1: 'answer1',
question2: 'answer2',
});
Utils.getCurrentDate.mockReturnValue('2023-01-01T12:30:45Z');
const prompt = AssistantSdk.getAssistantPrompt({ maxTokens: 1000 });
expect(prompt).toContain('Max Response Length: 1000 tokens (maximum)');
});
it('should use provided userTimezone and userLocation', () => {
// Mock dependencies
Utils.selectEquitably.mockReturnValue({
question1: 'answer1',
question2: 'answer2',
});
Utils.getCurrentDate.mockReturnValue('2023-01-01T12:30:45Z');
const prompt = AssistantSdk.getAssistantPrompt({
userTimezone: 'America/New_York',
userLocation: 'New York, USA',
});
expect(prompt).toContain('### User Location: New York, USA');
expect(prompt).toContain('### Timezone: America/New_York');
});
it('should use current date when Utils.getCurrentDate is not available', () => {
// Mock dependencies
Utils.selectEquitably.mockReturnValue({
question1: 'answer1',
question2: 'answer2',
});
// @ts-expect-error - is supposed to break
Utils.getCurrentDate.mockReturnValue(undefined);
const prompt = AssistantSdk.getAssistantPrompt({});
// Instead of checking for a specific date, just verify that a date is included
expect(prompt).toMatch(/### Date: \d{4}-\d{2}-\d{2} \d{1,2}:\d{2} \d{1,2}s/);
});
it('should use few_shots directly when Utils.selectEquitably is not available', () => {
// @ts-expect-error - is supposed to break
Utils.selectEquitably.mockReturnValue(undefined);
Utils.getCurrentDate.mockReturnValue('2023-01-01T12:30:45Z');
const prompt = AssistantSdk.getAssistantPrompt({});
// The prompt should still contain examples
expect(prompt).toContain('#### Example 1');
// Instead of checking for specific content, just verify that examples are included
expect(prompt).toMatch(/HUMAN: .+\nASSISTANT: .+/);
});
});
describe('useFewshots', () => {
it('should format fewshots correctly', () => {
const fewshots = {
'What is the capital of France?': 'Paris is the capital of France.',
'How do I make pasta?': 'Boil water, add pasta, cook until al dente.',
};
const result = AssistantSdk.useFewshots(fewshots);
expect(result).toContain('#### Example 1');
expect(result).toContain('HUMAN: What is the capital of France?');
expect(result).toContain('ASSISTANT: Paris is the capital of France.');
expect(result).toContain('#### Example 2');
expect(result).toContain('HUMAN: How do I make pasta?');
expect(result).toContain('ASSISTANT: Boil water, add pasta, cook until al dente.');
});
it('should respect the limit parameter', () => {
const fewshots = {
Q1: 'A1',
Q2: 'A2',
Q3: 'A3',
Q4: 'A4',
Q5: 'A5',
Q6: 'A6',
};
const result = AssistantSdk.useFewshots(fewshots, 3);
expect(result).toContain('#### Example 1');
expect(result).toContain('HUMAN: Q1');
expect(result).toContain('ASSISTANT: A1');
expect(result).toContain('#### Example 2');
expect(result).toContain('HUMAN: Q2');
expect(result).toContain('ASSISTANT: A2');
expect(result).toContain('#### Example 3');
expect(result).toContain('HUMAN: Q3');
expect(result).toContain('ASSISTANT: A3');
expect(result).not.toContain('#### Example 4');
expect(result).not.toContain('HUMAN: Q4');
});
});
});


@@ -1,24 +1,29 @@
import { Schema } from '@open-gsio/schema';
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { ChatSdk } from '../chat-sdk.ts';
import { AssistantSdk } from '../assistant-sdk.ts';
import Message from '../../models/Message.ts';
import { getModelFamily } from '@open-gsio/ai/supported-models';
import { AssistantSdk } from '../assistant-sdk';
import { ChatSdk } from '../chat-sdk';
import { ProviderRepository } from '../providers/_ProviderRepository.ts';
// Mock dependencies
vi.mock('../assistant-sdk', () => ({
AssistantSdk: {
getAssistantPrompt: vi.fn()
}
getAssistantPrompt: vi.fn(),
},
}));
vi.mock('../../models/Message', () => ({
default: {
create: vi.fn((message) => message)
}
vi.mock('@open-gsio/schema', () => ({
Schema: {
Message: {
create: vi.fn(message => message),
},
},
}));
vi.mock('@open-gsio/ai/supported-models', () => ({
getModelFamily: vi.fn()
vi.mock('../providers/_ProviderRepository', () => ({
ProviderRepository: {
getModelFamily: vi.fn().mockResolvedValue('openai'),
},
}));
describe('ChatSdk', () => {
@@ -30,16 +35,16 @@ describe('ChatSdk', () => {
describe('preprocess', () => {
it('should return an assistant message with empty content', async () => {
const messages = [{ role: 'user', content: 'Hello' }];
const result = await ChatSdk.preprocess({ messages });
expect(Message.create).toHaveBeenCalledWith({
expect(Schema.Message.create).toHaveBeenCalledWith({
role: 'assistant',
content: ''
content: '',
});
expect(result).toEqual({
role: 'assistant',
content: ''
content: '',
});
});
});
@@ -47,7 +52,7 @@ describe('ChatSdk', () => {
describe('handleChatRequest', () => {
it('should return a 400 response if no messages are provided', async () => {
const request = {
json: vi.fn().mockResolvedValue({ messages: [] })
json: vi.fn().mockResolvedValue({ messages: [] }),
};
const ctx = {
openai: {},
@@ -56,13 +61,13 @@ describe('ChatSdk', () => {
env: {
SERVER_COORDINATOR: {
idFromName: vi.fn(),
get: vi.fn()
}
}
get: vi.fn(),
},
},
};
const response = await ChatSdk.handleChatRequest(request as any, ctx as any);
expect(response.status).toBe(400);
expect(await response.text()).toBe('No messages provided');
});
@@ -70,22 +75,22 @@ describe('ChatSdk', () => {
it('should save stream data and return a response with streamUrl', async () => {
const streamId = 'test-uuid';
vi.stubGlobal('crypto', {
randomUUID: vi.fn().mockReturnValue(streamId)
randomUUID: vi.fn().mockReturnValue(streamId),
});
const messages = [{ role: 'user', content: 'Hello' }];
const model = 'gpt-4';
const conversationId = 'conv-123';
const request = {
json: vi.fn().mockResolvedValue({ messages, model, conversationId })
json: vi.fn().mockResolvedValue({ messages, model, conversationId }),
};
const saveStreamData = vi.fn();
const durableObject = {
saveStreamData
saveStreamData,
};
const ctx = {
openai: {},
systemPrompt: 'System prompt',
@@ -93,22 +98,19 @@ describe('ChatSdk', () => {
env: {
SERVER_COORDINATOR: {
idFromName: vi.fn().mockReturnValue('object-id'),
get: vi.fn().mockReturnValue(durableObject)
}
}
get: vi.fn().mockReturnValue(durableObject),
},
},
};
const response = await ChatSdk.handleChatRequest(request as any, ctx as any);
const responseBody = await response.json();
expect(ctx.env.SERVER_COORDINATOR.idFromName).toHaveBeenCalledWith('stream-index');
expect(ctx.env.SERVER_COORDINATOR.get).toHaveBeenCalledWith('object-id');
expect(saveStreamData).toHaveBeenCalledWith(
streamId,
expect.stringContaining(model)
);
expect(saveStreamData).toHaveBeenCalledWith(streamId, expect.stringContaining(model));
expect(responseBody).toEqual({
streamUrl: `/api/streams/${streamId}`
streamUrl: `/api/streams/${streamId}`,
});
});
});
@@ -118,21 +120,21 @@ describe('ChatSdk', () => {
const messages = [{ role: 'user', content: 'Hello' }];
const dynamicMaxTokens = vi.fn().mockResolvedValue(500);
const durableObject = {
dynamicMaxTokens
dynamicMaxTokens,
};
const ctx = {
maxTokens: 1000,
env: {
SERVER_COORDINATOR: {
idFromName: vi.fn().mockReturnValue('object-id'),
get: vi.fn().mockReturnValue(durableObject)
}
}
get: vi.fn().mockReturnValue(durableObject),
},
},
};
await ChatSdk.calculateMaxTokens(messages, ctx as any);
expect(ctx.env.SERVER_COORDINATOR.idFromName).toHaveBeenCalledWith('dynamic-token-counter');
expect(ctx.env.SERVER_COORDINATOR.get).toHaveBeenCalledWith('object-id');
expect(dynamicMaxTokens).toHaveBeenCalledWith(messages, 1000);
@@ -142,96 +144,94 @@ describe('ChatSdk', () => {
describe('buildAssistantPrompt', () => {
it('should call AssistantSdk.getAssistantPrompt with the correct parameters', () => {
vi.mocked(AssistantSdk.getAssistantPrompt).mockReturnValue('Assistant prompt');
const result = ChatSdk.buildAssistantPrompt({ maxTokens: 1000 });
expect(AssistantSdk.getAssistantPrompt).toHaveBeenCalledWith({
maxTokens: 1000,
userTimezone: 'UTC',
userLocation: 'USA/unknown'
userLocation: 'USA/unknown',
});
expect(result).toBe('Assistant prompt');
});
});
describe('buildMessageChain', () => {
it('should build a message chain with system role for most models', () => {
vi.mocked(getModelFamily).mockReturnValue('openai');
const messages = [
{ role: 'user', content: 'Hello' }
];
// TODO: Fix this test
it('should build a message chain with system role for most models', async () => {
ProviderRepository.getModelFamily.mockResolvedValue('openai');
const messages = [{ role: 'user', content: 'Hello' }];
const opts = {
systemPrompt: 'System prompt',
assistantPrompt: 'Assistant prompt',
toolResults: { role: 'tool', content: 'Tool result' },
model: 'gpt-4'
model: 'gpt-4',
env: {},
};
const result = ChatSdk.buildMessageChain(messages, opts as any);
expect(getModelFamily).toHaveBeenCalledWith('gpt-4');
expect(Message.create).toHaveBeenCalledTimes(3);
expect(Message.create).toHaveBeenNthCalledWith(1, {
const result = await ChatSdk.buildMessageChain(messages, opts as any);
expect(ProviderRepository.getModelFamily).toHaveBeenCalledWith('gpt-4', {});
expect(Schema.Message.create).toHaveBeenCalledTimes(3);
expect(Schema.Message.create).toHaveBeenNthCalledWith(1, {
role: 'system',
content: 'System prompt'
content: 'System prompt',
});
expect(Message.create).toHaveBeenNthCalledWith(2, {
expect(Schema.Message.create).toHaveBeenNthCalledWith(2, {
role: 'assistant',
content: 'Assistant prompt'
content: 'Assistant prompt',
});
expect(Message.create).toHaveBeenNthCalledWith(3, {
expect(Schema.Message.create).toHaveBeenNthCalledWith(3, {
role: 'user',
content: 'Hello'
content: 'Hello',
});
});
it('should build a message chain with assistant role for o1, gemma, claude, or google models', async () => {
ProviderRepository.getModelFamily.mockResolvedValue('claude');
const messages = [{ role: 'user', content: 'Hello' }];
it('should build a message chain with assistant role for o1, gemma, claude, or google models', () => {
vi.mocked(getModelFamily).mockReturnValue('claude');
const messages = [
{ role: 'user', content: 'Hello' }
];
const opts = {
systemPrompt: 'System prompt',
assistantPrompt: 'Assistant prompt',
toolResults: { role: 'tool', content: 'Tool result' },
model: 'claude-3'
model: 'claude-3',
env: {},
};
const result = ChatSdk.buildMessageChain(messages, opts as any);
expect(getModelFamily).toHaveBeenCalledWith('claude-3');
expect(Message.create).toHaveBeenCalledTimes(3);
expect(Message.create).toHaveBeenNthCalledWith(1, {
const result = await ChatSdk.buildMessageChain(messages, opts as any);
expect(ProviderRepository.getModelFamily).toHaveBeenCalledWith('claude-3', {});
expect(Schema.Message.create).toHaveBeenCalledTimes(3);
expect(Schema.Message.create).toHaveBeenNthCalledWith(1, {
role: 'assistant',
content: 'System prompt'
content: 'System prompt',
});
});
it('should filter out messages with empty content', async () => {
ProviderRepository.getModelFamily.mockResolvedValue('openai');
it('should filter out messages with empty content', () => {
vi.mocked(getModelFamily).mockReturnValue('openai');
const messages = [
{ role: 'user', content: 'Hello' },
{ role: 'user', content: '' },
{ role: 'user', content: ' ' },
{ role: 'user', content: 'World' }
{ role: 'user', content: 'World' },
];
const opts = {
systemPrompt: 'System prompt',
assistantPrompt: 'Assistant prompt',
toolResults: { role: 'tool', content: 'Tool result' },
model: 'gpt-4'
model: 'gpt-4',
env: {},
};
const result = ChatSdk.buildMessageChain(messages, opts as any);
const result = await ChatSdk.buildMessageChain(messages, opts as any);
// 2 system/assistant messages + 2 user messages (Hello and World)
expect(Message.create).toHaveBeenCalledTimes(4);
expect(Schema.Message.create).toHaveBeenCalledTimes(4);
});
});
});
});


@@ -1,5 +1,6 @@
import { describe, it, expect } from 'vitest';
import { Utils } from '../utils.ts';
import { Utils } from '../utils/utils.ts';
describe('Debug Utils.getSeason', () => {
it('should print out the actual seasons for different dates', () => {


@@ -1,13 +1,14 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import handleStreamData from '../handleStreamData.ts';
import handleStreamData from '../utils/handleStreamData.ts';
describe('handleStreamData', () => {
// Setup mocks
const mockController = {
enqueue: vi.fn()
enqueue: vi.fn(),
};
const mockEncoder = {
encode: vi.fn((str) => str)
encode: vi.fn(str => str),
};
beforeEach(() => {
@@ -16,75 +17,77 @@ describe('handleStreamData', () => {
it('should return early if data type is not "chat"', () => {
const handler = handleStreamData(mockController as any, mockEncoder as any);
handler({ type: 'not-chat', data: {} });
expect(mockController.enqueue).not.toHaveBeenCalled();
expect(mockEncoder.encode).not.toHaveBeenCalled();
});
it('should return early if data is undefined', () => {
const handler = handleStreamData(mockController as any, mockEncoder as any);
handler(undefined as any);
expect(mockController.enqueue).not.toHaveBeenCalled();
expect(mockEncoder.encode).not.toHaveBeenCalled();
});
it('should handle content_block_start type data', () => {
const handler = handleStreamData(mockController as any, mockEncoder as any);
const data = {
type: 'chat',
data: {
type: 'content_block_start',
content_block: {
type: 'text',
text: 'Hello world'
}
}
text: 'Hello world',
},
},
};
handler(data);
expect(mockController.enqueue).toHaveBeenCalledTimes(1);
expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('Hello world'));
// @ts-expect-error - mock
const encodedData = mockEncoder.encode.mock.calls[0][0];
const parsedData = JSON.parse(encodedData.split('data: ')[1]);
expect(parsedData.type).toBe('chat');
expect(parsedData.data.choices[0].delta.content).toBe('Hello world');
});
it('should handle delta.text type data', () => {
const handler = handleStreamData(mockController as any, mockEncoder as any);
const data = {
type: 'chat',
data: {
delta: {
text: 'Hello world'
}
}
text: 'Hello world',
},
},
};
handler(data);
expect(mockController.enqueue).toHaveBeenCalledTimes(1);
expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('Hello world'));
// @ts-expect-error - mock
const encodedData = mockEncoder.encode.mock.calls[0][0];
const parsedData = JSON.parse(encodedData.split('data: ')[1]);
expect(parsedData.type).toBe('chat');
expect(parsedData.data.choices[0].delta.content).toBe('Hello world');
});
it('should handle choices[0].delta.content type data', () => {
const handler = handleStreamData(mockController as any, mockEncoder as any);
const data = {
type: 'chat',
data: {
@@ -92,23 +95,24 @@ describe('handleStreamData', () => {
{
index: 0,
delta: {
content: 'Hello world'
content: 'Hello world',
},
logprobs: null,
finish_reason: null
}
]
}
finish_reason: null,
},
],
},
};
handler(data);
expect(mockController.enqueue).toHaveBeenCalledTimes(1);
expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('Hello world'));
// @ts-expect-error - mock
const encodedData = mockEncoder.encode.mock.calls[0][0];
const parsedData = JSON.parse(encodedData.split('data: ')[1]);
expect(parsedData.type).toBe('chat');
expect(parsedData.data.choices[0].delta.content).toBe('Hello world');
expect(parsedData.data.choices[0].finish_reason).toBe(null);
@@ -116,7 +120,7 @@ describe('handleStreamData', () => {
it('should pass through data with choices but no delta.content', () => {
const handler = handleStreamData(mockController as any, mockEncoder as any);
const data = {
type: 'chat',
data: {
@@ -125,64 +129,66 @@ describe('handleStreamData', () => {
index: 0,
delta: {},
logprobs: null,
finish_reason: 'stop'
}
]
}
finish_reason: 'stop',
},
],
},
};
handler(data);
handler(data as any);
expect(mockController.enqueue).toHaveBeenCalledTimes(1);
expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('"finish_reason":"stop"'));
expect(mockEncoder.encode).toHaveBeenCalledWith(
expect.stringContaining('"finish_reason":"stop"'),
);
});
it('should return early for unrecognized data format', () => {
const handler = handleStreamData(mockController as any, mockEncoder as any);
const data = {
type: 'chat',
data: {
// No recognized properties
unrecognized: 'property'
}
unrecognized: 'property',
},
};
handler(data);
handler(data as any);
expect(mockController.enqueue).not.toHaveBeenCalled();
expect(mockEncoder.encode).not.toHaveBeenCalled();
});
it('should use custom transform function if provided', () => {
const handler = handleStreamData(mockController as any, mockEncoder as any);
const data = {
type: 'chat',
data: {
original: 'data'
}
original: 'data',
},
};
const transformFn = vi.fn().mockReturnValue({
type: 'chat',
data: {
choices: [
{
delta: {
content: 'Transformed content'
content: 'Transformed content',
},
logprobs: null,
finish_reason: null
}
]
}
finish_reason: null,
},
],
},
});
handler(data, transformFn);
handler(data as any, transformFn);
expect(transformFn).toHaveBeenCalledWith(data);
expect(mockController.enqueue).toHaveBeenCalledTimes(1);
expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('Transformed content'));
});
});
});


@@ -1,5 +1,6 @@
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { Utils } from '../utils.ts';
import { Utils } from '../utils/utils.ts';
describe('Utils', () => {
describe('getSeason', () => {
@@ -42,10 +43,11 @@ describe('Utils', () => {
beforeEach(() => {
// Mock Intl.DateTimeFormat
// @ts-expect-error - mock
global.Intl.DateTimeFormat = vi.fn().mockReturnValue({
resolvedOptions: vi.fn().mockReturnValue({
timeZone: 'America/New_York'
})
timeZone: 'America/New_York',
}),
});
});
@@ -102,10 +104,10 @@ describe('Utils', () => {
it('should select items equitably from multiple sources', () => {
const sources = {
a: { 'key1': 'value1', 'key2': 'value2' },
b: { 'key3': 'value3', 'key4': 'value4' },
c: { 'key5': 'value5', 'key6': 'value6' },
d: { 'key7': 'value7', 'key8': 'value8' }
a: { key1: 'value1', key2: 'value2' },
b: { key3: 'value3', key4: 'value4' },
c: { key5: 'value5', key6: 'value6' },
d: { key7: 'value7', key8: 'value8' },
};
const result = Utils.selectEquitably(sources, 4);
@@ -117,10 +119,10 @@ describe('Utils', () => {
it('should handle itemCount greater than available items', () => {
const sources = {
a: { 'key1': 'value1' },
b: { 'key2': 'value2' },
a: { key1: 'value1' },
b: { key2: 'value2' },
c: {},
d: {}
d: {},
};
const result = Utils.selectEquitably(sources, 5);
@@ -135,7 +137,7 @@ describe('Utils', () => {
a: {},
b: {},
c: {},
d: {}
d: {},
};
const result = Utils.selectEquitably(sources, 5);
@@ -148,10 +150,10 @@ describe('Utils', () => {
it('should insert blank messages to maintain user/assistant alternation', () => {
const messages = [
{ role: 'user', content: 'Hello' },
{ role: 'user', content: 'How are you?' }
{ role: 'user', content: 'How are you?' },
];
const result = Utils.normalizeWithBlanks(messages);
const result = Utils.normalizeWithBlanks(messages as any[]);
expect(result.length).toBe(3);
expect(result[0]).toEqual({ role: 'user', content: 'Hello' });
@@ -160,11 +162,9 @@ describe('Utils', () => {
});
it('should insert blank user message if first message is assistant', () => {
const messages = [
{ role: 'assistant', content: 'Hello, how can I help?' }
];
const messages = [{ role: 'assistant', content: 'Hello, how can I help?' }];
const result = Utils.normalizeWithBlanks(messages);
const result = Utils.normalizeWithBlanks(messages as any[]);
expect(result.length).toBe(2);
expect(result[0]).toEqual({ role: 'user', content: '' });
@@ -183,10 +183,10 @@ describe('Utils', () => {
const messages = [
{ role: 'user', content: 'Hello' },
{ role: 'assistant', content: 'Hi there' },
{ role: 'user', content: 'How are you?' }
{ role: 'user', content: 'How are you?' },
];
const result = Utils.normalizeWithBlanks(messages);
const result = Utils.normalizeWithBlanks(messages as any[]);
expect(result.length).toBe(3);
expect(result).toEqual(messages);


@@ -0,0 +1,56 @@
import Prompts from '../prompts';
import { Common } from '../utils';
export class AssistantSdk {
static getAssistantPrompt(params: {
maxTokens?: number;
userTimezone?: string;
userLocation?: string;
}): string {
const { maxTokens, userTimezone = 'UTC', userLocation = '' } = params;
// console.log('[DEBUG_LOG] few_shots:', JSON.stringify(few_shots));
let selectedFewshots = Common.Utils.selectEquitably?.(Prompts.FewShots);
// console.log('[DEBUG_LOG] selectedFewshots after Utils.selectEquitably:', JSON.stringify(selectedFewshots));
if (!selectedFewshots) {
selectedFewshots = Prompts.FewShots;
// console.log('[DEBUG_LOG] selectedFewshots after fallback:', JSON.stringify(selectedFewshots));
}
const sdkDate = new Date().toISOString();
const [currentDate] = sdkDate.includes('T') ? sdkDate.split('T') : [sdkDate];
const now = new Date();
const formattedMinutes = String(now.getMinutes()).padStart(2, '0');
const currentTime = `${now.getHours()}:${formattedMinutes} ${now.getSeconds()}s`;
return `# Assistant Knowledge
## Current Context
### Date: ${currentDate} ${currentTime}
### Web Host: open-gsio.seemueller.workers.dev
${maxTokens ? `### Max Response Length: ${maxTokens} tokens (maximum)` : ''}
### Lexicographical Format: Markdown
### User Location: ${userLocation || 'Unknown'}
### Timezone: ${userTimezone}
## Response Framework
1. Use knowledge provided in the current context as the primary source of truth.
2. Format all responses in Markdown.
3. Attribute external sources with footnotes.
4. Do not bold headers.
## Examples
#### Example 0
HUMAN: What is this?
ASSISTANT: This is a conversational AI system.
---
${AssistantSdk.useFewshots(selectedFewshots, 5)}
---
## Directive
Continuously monitor the evolving conversation. Dynamically adapt each response.`;
}
static useFewshots(fewshots: Record<string, string>, limit = 5): string {
return Object.entries(fewshots)
.slice(0, limit)
.map(([q, a], i) => {
return `#### Example ${i + 1}\nHUMAN: ${q}\nASSISTANT: ${a}`;
})
.join('\n---\n');
}
}
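A minimal usage sketch of the class above; the few-shot pairs and parameter values are illustrative, not taken from the repository.

```ts
// Illustrative inputs only.
const examples = AssistantSdk.useFewshots(
  { 'What is 2 + 2?': '4', 'Name a prime number.': '7' },
  2,
);
// => "#### Example 1\nHUMAN: ...\nASSISTANT: ..." blocks joined by "---".

const prompt = AssistantSdk.getAssistantPrompt({
  maxTokens: 2048,
  userTimezone: 'America/New_York',
  userLocation: 'New York, USA',
});
```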


@@ -0,0 +1,3 @@
import { AssistantSdk } from './assistant-sdk.ts';
export { AssistantSdk };


@@ -0,0 +1,137 @@
import { Schema } from '@open-gsio/schema';
import type { Instance } from 'mobx-state-tree';
import { OpenAI } from 'openai';
import { AssistantSdk } from '../assistant-sdk';
import { ProviderRepository } from '../providers/_ProviderRepository.ts';
import type {
BuildAssistantPromptParams,
ChatRequestBody,
GenericEnv,
PreprocessParams,
} from '../types';
export class ChatSdk {
static async preprocess(params: PreprocessParams) {
// a slot to provide additional context
return Schema.Message.create({
role: 'assistant',
content: '',
});
}
static async handleChatRequest(
request: Request,
ctx: {
openai: OpenAI;
systemPrompt: any;
maxTokens: any;
env: GenericEnv;
},
) {
const streamId = crypto.randomUUID();
const { messages, model, conversationId } = (await request.json()) as ChatRequestBody;
if (!messages?.length) {
return new Response('No messages provided', { status: 400 });
}
const preprocessedContext = await ChatSdk.preprocess({
messages,
});
// console.log(ctx.env)
// console.log(ctx.env.SERVER_COORDINATOR);
const objectId = ctx.env.SERVER_COORDINATOR.idFromName('stream-index');
const durableObject = ctx.env.SERVER_COORDINATOR.get(objectId);
await durableObject.saveStreamData(
streamId,
JSON.stringify({
messages,
model,
conversationId,
timestamp: Date.now(),
systemPrompt: ctx.systemPrompt,
preprocessedContext,
}),
);
return new Response(
JSON.stringify({
streamUrl: `/api/streams/${streamId}`,
}),
{
headers: {
'Content-Type': 'application/json',
},
},
);
}
static async calculateMaxTokens(
messages: any[],
ctx: Record<string, any> & {
env: GenericEnv;
maxTokens: number;
},
) {
const objectId = ctx.env.SERVER_COORDINATOR.idFromName('dynamic-token-counter');
const durableObject = ctx.env.SERVER_COORDINATOR.get(objectId);
return durableObject.dynamicMaxTokens(messages, ctx.maxTokens);
}
static buildAssistantPrompt(params: BuildAssistantPromptParams) {
const { maxTokens } = params;
return AssistantSdk.getAssistantPrompt({
maxTokens,
userTimezone: 'UTC',
userLocation: 'USA/unknown',
});
}
static async buildMessageChain(
messages: any[],
opts: {
systemPrompt: any;
assistantPrompt: string;
toolResults: Instance<typeof Message>;
model: any;
env: GenericEnv;
},
) {
const modelFamily = await ProviderRepository.getModelFamily(opts.model, opts.env);
const messagesToSend = [];
messagesToSend.push(
Schema.Message.create({
role:
opts.model.includes('o1') ||
opts.model.includes('gemma') ||
modelFamily === 'claude' ||
modelFamily === 'google'
? 'assistant'
: 'system',
content: opts.systemPrompt.trim(),
}),
);
messagesToSend.push(
Schema.Message.create({
role: 'assistant',
content: opts.assistantPrompt.trim(),
}),
);
messagesToSend.push(
...messages
.filter((message: any) => message.content?.trim())
.map((message: any) => Schema.Message.create(message)),
);
return messagesToSend;
}
}
export default ChatSdk;


@@ -0,0 +1 @@
export * from './chat-sdk.ts';

2
packages/ai/src/index.ts Normal file

@@ -0,0 +1,2 @@
export * from './providers';
export * from './chat-sdk';


@@ -1,50 +1,49 @@
/* eslint-disable no-useless-escape */
const a = {
"Can you search for the latest quantum computing breakthroughs?":
"I don't have web access.",
'Can you search for the latest quantum computing breakthroughs?': "I don't have web access.",
};
const b = {
"Calculate the number of unique ways to arrange the letters in the word 'SUCCESS'.": `
Alright, let's tackle 'SUCCESS' and see just how unique it can get!
1. Step 1: Count the Letters
'Calculate the number of unique ways to arrange the letters in the word SUCCESS.': `Alright, let's tackle 'SUCCESS' and see just how unique it can get!
### Step 1: Count the Letters
The word 'SUCCESS' has 7 letters, but it's no free-for-all; there are repeats!
- S: 3 times
- U: 1 time
- C: 2 times
- E: 1 time
Erdős would call this a combinatorial party with some guests arriving multiple times.
2. Step 2: The Formula for Permutations with Repetition
### Step 2: The Formula for Permutations with Repetition
When letters repeat, we divide to avoid overcounting. The magic formula is:
$$
\frac{n!}{k_1! \cdot k_2! \cdot \ldots \cdot k_m!}
$$
Here, (n) is the total number of letters, and (k_i) are the frequencies of each unique letter.
3. Step 3: Plug in the Values
### Step 3. Plug in the Values
Let's compute:
$$
\frac{7!}{3! \cdot 1! \cdot 2! \cdot 1!} = 420
$$
4. Step 4: Reveal the Result
There are **420 unique arrangements** of 'SUCCESS'. Mathematically satisfying, isn't it? It's as if every version of success gets its own unique permutation!`,
"Calculate the number of unique ways to arrange the letters in the word 'MISSISSIPPI'.": `Ready for some permutation wizardry? Let's decode 'MISSISSIPPI'!
1. Step 1: Count the Letters
### Step 4: Reveal the Result
There are 420 unique arrangements of 'SUCCESS'. Mathematically satisfying, isn't it? It's as if every version of success gets its own unique permutation!`,
'Calculate the number of unique ways to arrange the letters in the word MISSISSIPPI.': `Ready for some permutation wizardry? Let's decode 'MISSISSIPPI'!
### Step 1: Count the Letters
'MISSISSIPPI' has 11 letters. But like a social network with cliques, some letters appear in multiples:
- M: 1
- I: 4
- S: 4
- P: 2
2. Step 2: Apply the Permutation Formula for Repetition
### Step 2: Apply the Permutation Formula for Repetition
The formula for such a case is:
$$
\frac{n!}{k_1! \cdot k_2! \cdot \ldots \cdot k_m!}
$$
3. Step 3: Plug in the Values
### Step 3: Plug in the Values
Here's where we let the math do its magic:
$$
\frac{11!}{1! \cdot 4! \cdot 4! \cdot 2!}
$$
4. Step 4: Simplify Step-by-Step
### Step 4: Simplify Step-by-Step
Let's break it down:
- (11! = 39,916,800)
- (4! = 24)
@@ -53,19 +52,17 @@ Let's break it down:
$$
\frac{39,916,800}{1 \cdot 24 \cdot 24 \cdot 2} = 34,650
$$
5. Step 5: The Grand Finale
There are **34,650 unique arrangements** of 'MISSISSIPPI'. Erdős would marvel at the sheer elegance of these numbers; much like the way rivers meander through the land, permutations flow through possibilities.`,
### Step 5: The Grand Finale
There are 34,650 unique arrangements of 'MISSISSIPPI'.`,
};
const c = {
// c remains unchanged as it contains image generation prompts
"Render a Greek statue with warm marble tones and realistic proportions.":
'Render a Greek statue with warm marble tones and realistic proportions.':
"I don't have the ability to generate images right now. ",
};
const d = {
"Demonstrate all outputs.": `
# Comprehensive Markdown Showcase
'Demonstrate markdown formatted text.': `
## Headers
~~~markdown
# Large Header
@@ -73,80 +70,48 @@ const d = {
### Small Header
~~~
## Ordered Lists
~~~markdown
1. First Item
2. Second Item
1. Subitem 1
2. Subitem 2
3. Third Item
~~~
## Unordered Lists
~~~markdown
- First Item
- Second Item
- Subitem 1
- Subitem 2
~~~
## Links
~~~markdown
[Visit OpenAI](https://openai.com/)
~~~
## Images
~~~markdown
![Example Image](example.jpg)
~~~
![Example Image](example.jpg)
## Inline Code
~~~markdown
\`console.log('Hello, Markdown!')\`
~~~
## Code Blocks
\`\`\`markdown
~~~javascript
console.log(marked.parse('A Description List:\\n'
+ ': Topic 1 : Description 1\\n'
+ ': **Topic 2** : *Description 2*'));
+ ': Topic 2 : Description 2'));
~~~
\`\`\`
## Tables
~~~markdown
| Name | Value |
|---------|-------|
| Item A | 10 |
| Item B | 20 |
~~~
## Blockquotes
~~~markdown
> Markdown makes writing beautiful.
> - Markdown Fan
~~~
## Horizontal Rule
~~~markdown
---
~~~
## Font: Bold and Italic
~~~markdown
**Bold Text**
**Bold Text**
*Italic Text*
~~~
## Font: Strikethrough
~~~markdown
~~Struck-through text~~
~~~
---
## Math: Inline
This is block level katex:
## Math
~~~markdown
$$
c = \\\\pm\\\\sqrt{a^2 + b^2}
$$
~~~
## Math: Block
This is inline katex
~~~markdown
$c = \\\\pm\\\\sqrt{a^2 + b^2}$
~~~
`,
$$`,
};
export default { a, b, c, d };


@@ -0,0 +1,5 @@
import few_shots from './few_shots.ts';
export default {
FewShots: few_shots,
};


@@ -0,0 +1,96 @@
import type { GenericEnv, ModelMeta, Providers, SupportedProvider } from '../types';
export class ProviderRepository {
#providers: Providers = [];
#env: GenericEnv;
constructor(env: GenericEnv) {
this.#env = env;
this.setProviders(env);
}
static OPENAI_COMPAT_ENDPOINTS = {
xai: 'https://api.x.ai/v1',
groq: 'https://api.groq.com/openai/v1',
google: 'https://generativelanguage.googleapis.com/v1beta/openai',
fireworks: 'https://api.fireworks.ai/inference/v1',
cohere: 'https://api.cohere.ai/compatibility/v1',
cloudflare: 'https://api.cloudflare.com/client/v4/accounts/{CLOUDFLARE_ACCOUNT_ID}/ai/v1',
claude: 'https://api.anthropic.com/v1',
openai: 'https://api.openai.com/v1',
cerebras: 'https://api.cerebras.com/v1',
ollama: 'http://localhost:11434/v1',
mlx: 'http://localhost:10240/v1',
};
static async getModelFamily(model: any, env: GenericEnv) {
const allModels = await env.KV_STORAGE.get('supportedModels');
const models = JSON.parse(allModels);
const modelData = models.filter((m: ModelMeta) => m.id === model);
return modelData[0].provider;
}
static async getModelMeta(meta: any, env: GenericEnv) {
const allModels = await env.KV_STORAGE.get('supportedModels');
const models = JSON.parse(allModels);
return models.filter((m: ModelMeta) => m.id === meta.model).pop();
}
getProviders(): { name: string; key: string; endpoint: string }[] {
return this.#providers;
}
setProviders(env: GenericEnv) {
const indicies = {
providerName: 0,
providerValue: 1,
};
const valueDelimiter = '_';
const envKeys = Object.keys(env);
for (let i = 0; i < envKeys.length; i++) {
if (envKeys.at(i)?.endsWith('KEY')) {
const detectedProvider = envKeys
.at(i)
?.split(valueDelimiter)
.at(indicies.providerName)
?.toLowerCase();
const detectedProviderValue = env[envKeys.at(i) as string];
if (detectedProviderValue) {
switch (detectedProvider) {
case 'anthropic':
this.#providers.push({
name: 'claude',
key: env.ANTHROPIC_API_KEY,
endpoint: ProviderRepository.OPENAI_COMPAT_ENDPOINTS['claude'],
});
break;
case 'gemini':
this.#providers.push({
name: 'google',
key: env.GEMINI_API_KEY,
endpoint: ProviderRepository.OPENAI_COMPAT_ENDPOINTS['google'],
});
break;
case 'cloudflare':
this.#providers.push({
name: 'cloudflare',
key: env.CLOUDFLARE_API_KEY,
endpoint: ProviderRepository.OPENAI_COMPAT_ENDPOINTS[detectedProvider].replace(
'{CLOUDFLARE_ACCOUNT_ID}',
env.CLOUDFLARE_ACCOUNT_ID,
),
});
break;
default:
this.#providers.push({
name: detectedProvider as SupportedProvider,
key: env[envKeys[i] as string],
endpoint:
ProviderRepository.OPENAI_COMPAT_ENDPOINTS[detectedProvider as SupportedProvider],
});
}
}
}
}
}
}
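To make the env-key detection above concrete, here is a minimal sketch of the expected behavior, assuming a plain object stands in for `GenericEnv` and that the sketch sits next to `_ProviderRepository.ts`; the key values are placeholders and the KV-backed static helpers are not exercised:

```typescript
import { ProviderRepository } from './_ProviderRepository.ts';

// Placeholder secrets; only the *_KEY suffix and the prefix before the first "_" matter here.
const env = {
  GROQ_API_KEY: 'gsk-example',         // detected as "groq" via the default branch
  ANTHROPIC_API_KEY: 'sk-ant-example', // remapped to the "claude" provider
  CLOUDFLARE_API_KEY: 'cf-example',
  CLOUDFLARE_ACCOUNT_ID: 'acct-123',   // not a *_KEY entry; only used for endpoint substitution
};

const repo = new ProviderRepository(env);
console.log(repo.getProviders());
// Expected, in env-key insertion order:
//   { name: 'groq',       key: 'gsk-example',    endpoint: 'https://api.groq.com/openai/v1' }
//   { name: 'claude',     key: 'sk-ant-example', endpoint: 'https://api.anthropic.com/v1' }
//   { name: 'cloudflare', key: 'cf-example',     endpoint: 'https://api.cloudflare.com/client/v4/accounts/acct-123/ai/v1' }
```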

View File

@@ -1,6 +1,11 @@
import { describe, it, expect, vi } from 'vitest';
import { BaseChatProvider, CommonProviderParams, ChatStreamProvider } from '../chat-stream-provider.ts';
import { OpenAI } from 'openai';
import { describe, it, expect, vi } from 'vitest';
import {
BaseChatProvider,
CommonProviderParams,
ChatStreamProvider,
} from '../chat-stream-provider.ts';
// Create a concrete implementation of BaseChatProvider for testing
class TestChatProvider extends BaseChatProvider {
@@ -29,16 +34,16 @@ vi.mock('../../lib/chat-sdk', () => ({
buildAssistantPrompt: vi.fn().mockReturnValue('Assistant prompt'),
buildMessageChain: vi.fn().mockReturnValue([
{ role: 'system', content: 'System prompt' },
{ role: 'user', content: 'User message' }
])
}
{ role: 'user', content: 'User message' },
]),
},
}));
describe('ChatStreamProvider', () => {
it('should define the required interface', () => {
// Verify the interface has the required method
const mockProvider: ChatStreamProvider = {
handleStream: vi.fn()
handleStream: vi.fn(),
};
expect(mockProvider.handleStream).toBeDefined();

View File

@@ -1,17 +1,19 @@
import {OpenAI} from "openai";
import {BaseChatProvider, CommonProviderParams} from "./chat-stream-provider.ts";
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class CerebrasChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: "https://api.cerebras.ai/v1",
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.cerebras,
apiKey: param.env.CEREBRAS_API_KEY,
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
// models provided by cerebras do not follow standard tune params
// they must be individually configured
// models provided by cerebras do not follow standard tune params
// they must be individually configured
// const tuningParams = {
// temperature: 0.86,
// top_p: 0.98,
@@ -23,18 +25,18 @@ export class CerebrasChatProvider extends BaseChatProvider {
return {
model: param.model,
messages: safeMessages,
stream: true
// ...tuningParams
stream: true,
// ...tuningParams
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === "stop") {
dataCallback({ type: "chat", data: chunk });
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: "chat", data: chunk });
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
@@ -46,14 +48,13 @@ export class CerebrasSdk {
param: {
openai: OpenAI;
systemPrompt: any;
disableWebhookGeneration: boolean;
preprocessedContext: any;
maxTokens: unknown | number | undefined;
messages: any;
model: string;
env: Env;
env: GenericEnv;
},
dataCallback: (data) => void,
dataCallback: (data: any) => void,
) {
return this.provider.handleStream(
{

View File

@@ -1,5 +1,7 @@
import { OpenAI } from "openai";
import ChatSdk from "../lib/chat-sdk.ts";
import { OpenAI } from 'openai';
import ChatSdk from '../chat-sdk/chat-sdk.ts';
import type { GenericEnv } from '../types';
export interface CommonProviderParams {
openai?: OpenAI; // Optional for providers that use a custom client.
@@ -8,42 +10,37 @@ export interface CommonProviderParams {
maxTokens: number | unknown | undefined;
messages: any;
model: string;
env: Env;
env: GenericEnv;
disableWebhookGeneration?: boolean;
// Additional fields can be added as needed
}
export interface ChatStreamProvider {
handleStream(
param: CommonProviderParams,
dataCallback: (data: any) => void,
): Promise<any>;
handleStream(param: CommonProviderParams, dataCallback: (data: any) => void): Promise<any>;
}
export abstract class BaseChatProvider implements ChatStreamProvider {
abstract getOpenAIClient(param: CommonProviderParams): OpenAI;
abstract getStreamParams(param: CommonProviderParams, safeMessages: any[]): any;
abstract async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean>;
abstract processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean>;
async handleStream(
param: CommonProviderParams,
dataCallback: (data: any) => void,
) {
async handleStream(param: CommonProviderParams, dataCallback: (data: any) => void) {
const assistantPrompt = ChatSdk.buildAssistantPrompt({ maxTokens: param.maxTokens });
const safeMessages = ChatSdk.buildMessageChain(param.messages, {
const safeMessages = await ChatSdk.buildMessageChain(param.messages, {
systemPrompt: param.systemPrompt,
model: param.model,
assistantPrompt,
toolResults: param.preprocessedContext,
env: param.env,
});
const client = this.getOpenAIClient(param);
const streamParams = this.getStreamParams(param, safeMessages);
const stream = await client.chat.completions.create(streamParams);
for await (const chunk of stream) {
for await (const chunk of stream as unknown as AsyncIterable<any>) {
const shouldBreak = await this.processChunk(chunk, dataCallback);
if (shouldBreak) break;
}
}
}
}
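A minimal sketch of what a concrete subclass of `BaseChatProvider` has to supply, assuming an OpenAI-compatible endpoint; the class name, base URL, and `EXAMPLE_API_KEY` variable are illustrative, not part of the repository:

```typescript
import { OpenAI } from 'openai';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';

// Hypothetical provider used only to illustrate the three abstract members.
class ExampleChatProvider extends BaseChatProvider {
  getOpenAIClient(param: CommonProviderParams): OpenAI {
    return new OpenAI({ baseURL: 'https://example.invalid/v1', apiKey: param.env.EXAMPLE_API_KEY });
  }

  getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
    return { model: param.model, messages: safeMessages, stream: true };
  }

  async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
    dataCallback({ type: 'chat', data: chunk });
    // Returning true tells handleStream to stop iterating the stream.
    return chunk.choices?.[0]?.finish_reason === 'stop';
  }
}
```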

View File

@@ -1,14 +1,17 @@
import Anthropic from "@anthropic-ai/sdk";
import {OpenAI} from "openai";
import {
import Anthropic from '@anthropic-ai/sdk';
import type {
_NotCustomized,
ISimpleType,
ModelPropertiesDeclarationToProperties,
ModelSnapshotType2,
UnionStringArray,
} from "mobx-state-tree";
import ChatSdk from "../lib/chat-sdk.ts";
import {BaseChatProvider, CommonProviderParams} from "./chat-stream-provider.ts";
} from 'mobx-state-tree';
import { OpenAI } from 'openai';
import ChatSdk from '../chat-sdk/chat-sdk.ts';
import type { GenericEnv, GenericStreamData } from '../types';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class ClaudeChatProvider extends BaseChatProvider {
private anthropic: Anthropic | null = null;
@@ -33,20 +36,20 @@ export class ClaudeChatProvider extends BaseChatProvider {
stream: true,
model: param.model,
messages: safeMessages,
...claudeTuningParams
...claudeTuningParams,
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.type === "message_stop") {
if (chunk.type === 'message_stop') {
dataCallback({
type: "chat",
type: 'chat',
data: {
choices: [
{
delta: { content: "" },
delta: { content: '' },
logprobs: null,
finish_reason: "stop",
finish_reason: 'stop',
},
],
},
@@ -54,32 +57,30 @@ export class ClaudeChatProvider extends BaseChatProvider {
return true;
}
dataCallback({ type: "chat", data: chunk });
dataCallback({ type: 'chat', data: chunk });
return false;
}
// Override the base handleStream method to use Anthropic client instead of OpenAI
async handleStream(
param: CommonProviderParams,
dataCallback: (data: any) => void,
) {
async handleStream(param: CommonProviderParams, dataCallback: (data: any) => void) {
const assistantPrompt = ChatSdk.buildAssistantPrompt({ maxTokens: param.maxTokens });
const safeMessages = ChatSdk.buildMessageChain(param.messages, {
const safeMessages = await ChatSdk.buildMessageChain(param.messages, {
systemPrompt: param.systemPrompt,
model: param.model,
assistantPrompt,
toolResults: param.preprocessedContext,
env: param.env,
});
const streamParams = this.getStreamParams(param, safeMessages);
if (!this.anthropic) {
throw new Error("Anthropic client not initialized");
throw new Error('Anthropic client not initialized');
}
const stream = await this.anthropic.messages.create(streamParams);
for await (const chunk of stream) {
for await (const chunk of stream as unknown as AsyncIterable<any>) {
const shouldBreak = await this.processChunk(chunk, dataCallback);
if (shouldBreak) break;
}
@@ -104,9 +105,9 @@ export class ClaudeChatSdk {
maxTokens: unknown | number | undefined;
messages: any;
model: string;
env: Env;
env: GenericEnv;
},
dataCallback: (data) => void,
dataCallback: (data: GenericStreamData) => void,
) {
return this.provider.handleStream(
{

View File

@@ -0,0 +1,142 @@
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class CloudflareAiChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
apiKey: param.env.CLOUDFLARE_API_KEY,
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.cloudflare.replace(
'{CLOUDFLARE_ACCOUNT_ID}',
param.env.CLOUDFLARE_ACCOUNT_ID,
),
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
const generationParams: Record<string, any> = {
model: this.getModelWithPrefix(param.model),
messages: safeMessages,
stream: true,
};
// Set max_tokens based on model
if (this.getModelPrefix(param.model) === '@cf/meta') {
generationParams['max_tokens'] = 4096;
}
if (this.getModelPrefix(param.model) === '@hf/mistral') {
generationParams['max_tokens'] = 4096;
}
if (param.model.toLowerCase().includes('hermes-2-pro-mistral-7b')) {
generationParams['max_tokens'] = 1000;
}
if (param.model.toLowerCase().includes('openhermes-2.5-mistral-7b-awq')) {
generationParams['max_tokens'] = 1000;
}
if (param.model.toLowerCase().includes('deepseek-coder-6.7b-instruct-awq')) {
generationParams['max_tokens'] = 590;
}
if (param.model.toLowerCase().includes('deepseek-math-7b-instruct')) {
generationParams['max_tokens'] = 512;
}
if (param.model.toLowerCase().includes('neural-chat-7b-v3-1-awq')) {
generationParams['max_tokens'] = 590;
}
if (param.model.toLowerCase().includes('openchat-3.5-0106')) {
generationParams['max_tokens'] = 2000;
}
return generationParams;
}
private getModelPrefix(model: string): string {
let modelPrefix = `@cf/meta`;
if (model.toLowerCase().includes('llama')) {
modelPrefix = `@cf/meta`;
}
if (model.toLowerCase().includes('hermes-2-pro-mistral-7b')) {
modelPrefix = `@hf/nousresearch`;
}
if (model.toLowerCase().includes('mistral-7b-instruct')) {
modelPrefix = `@hf/mistral`;
}
if (model.toLowerCase().includes('gemma')) {
modelPrefix = `@cf/google`;
}
if (model.toLowerCase().includes('deepseek')) {
modelPrefix = `@cf/deepseek-ai`;
}
if (model.toLowerCase().includes('openchat-3.5-0106')) {
modelPrefix = `@cf/openchat`;
}
const isNueralChat = model.toLowerCase().includes('neural-chat-7b-v3-1-awq');
if (
isNueralChat ||
model.toLowerCase().includes('openhermes-2.5-mistral-7b-awq') ||
model.toLowerCase().includes('zephyr-7b-beta-awq') ||
model.toLowerCase().includes('deepseek-coder-6.7b-instruct-awq')
) {
modelPrefix = `@hf/thebloke`;
}
return modelPrefix;
}
private getModelWithPrefix(model: string): string {
return `${this.getModelPrefix(model)}/${model}`;
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
export class CloudflareAISdk {
private static provider = new CloudflareAiChatProvider();
static async handleCloudflareAIStream(
param: {
openai: OpenAI;
systemPrompt: any;
preprocessedContext: any;
maxTokens: unknown | number | undefined;
messages: any;
model: string;
env: Env;
},
dataCallback: (data: any) => void,
) {
return this.provider.handleStream(
{
systemPrompt: param.systemPrompt,
preprocessedContext: param.preprocessedContext,
maxTokens: param.maxTokens,
messages: param.messages,
model: param.model,
env: param.env,
},
dataCallback,
);
}
}
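Because `getModelPrefix`/`getModelWithPrefix` are private, their effect is easiest to read as data; the pairs below are traced by hand from the branches above and are not produced by calling the class:

```typescript
// Hand-traced examples of the model-prefixing logic in CloudflareAiChatProvider.
const expectedPrefixedIds: Record<string, string> = {
  'llama-3-8b-instruct': '@cf/meta/llama-3-8b-instruct',                               // "llama" -> @cf/meta, max_tokens 4096
  'hermes-2-pro-mistral-7b': '@hf/nousresearch/hermes-2-pro-mistral-7b',               // max_tokens capped at 1000
  'deepseek-coder-6.7b-instruct-awq': '@hf/thebloke/deepseek-coder-6.7b-instruct-awq', // AWQ builds route to @hf/thebloke
  'openchat-3.5-0106': '@cf/openchat/openchat-3.5-0106',                               // max_tokens 2000
};
```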

View File

@@ -1,29 +1,20 @@
import { OpenAI } from "openai";
import {
_NotCustomized,
castToSnapshot,
getSnapshot,
ISimpleType,
ModelPropertiesDeclarationToProperties,
ModelSnapshotType2,
UnionStringArray,
} from "mobx-state-tree";
import Message from "../models/Message.ts";
import ChatSdk from "../lib/chat-sdk.ts";
import { BaseChatProvider, CommonProviderParams } from "./chat-stream-provider.ts";
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class FireworksAiChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
apiKey: param.env.FIREWORKS_API_KEY,
baseURL: "https://api.fireworks.ai/inference/v1",
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.fireworks,
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
let modelPrefix = "accounts/fireworks/models/";
if (param.model.toLowerCase().includes("yi-")) {
modelPrefix = "accounts/yi-01-ai/models/";
let modelPrefix = 'accounts/fireworks/models/';
if (param.model.toLowerCase().includes('yi-')) {
modelPrefix = 'accounts/yi-01-ai/models/';
}
return {
@@ -34,12 +25,12 @@ export class FireworksAiChatProvider extends BaseChatProvider {
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === "stop") {
dataCallback({ type: "chat", data: chunk });
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: "chat", data: chunk });
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
@@ -55,9 +46,10 @@ export class FireworksAiChatSdk {
maxTokens: number;
messages: any;
model: any;
env: Env;
env: any;
},
dataCallback: (data) => void,
// TODO: Replace usage of any with an explicit but permissive type
dataCallback: (data: any) => void,
) {
return this.provider.handleStream(
{

View File

@@ -1,12 +1,12 @@
import { OpenAI } from "openai";
import ChatSdk from "../lib/chat-sdk.ts";
import { StreamParams } from "../services/ChatService.ts";
import { BaseChatProvider, CommonProviderParams } from "./chat-stream-provider.ts";
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class GoogleChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: "https://generativelanguage.googleapis.com/v1beta/openai",
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.google,
apiKey: param.env.GEMINI_API_KEY,
});
}
@@ -20,14 +20,14 @@ export class GoogleChatProvider extends BaseChatProvider {
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices?.[0]?.finish_reason === "stop") {
if (chunk.choices?.[0]?.finish_reason === 'stop') {
dataCallback({
type: "chat",
type: 'chat',
data: {
choices: [
{
delta: { content: chunk.choices[0].delta.content || "" },
finish_reason: "stop",
delta: { content: chunk.choices[0].delta.content || '' },
finish_reason: 'stop',
index: chunk.choices[0].index,
},
],
@@ -36,11 +36,11 @@ export class GoogleChatProvider extends BaseChatProvider {
return true;
} else {
dataCallback({
type: "chat",
type: 'chat',
data: {
choices: [
{
delta: { content: chunk.choices?.[0]?.delta?.content || "" },
delta: { content: chunk.choices?.[0]?.delta?.content || '' },
finish_reason: null,
index: chunk.choices?.[0]?.index || 0,
},
@@ -55,10 +55,7 @@ export class GoogleChatProvider extends BaseChatProvider {
export class GoogleChatSdk {
private static provider = new GoogleChatProvider();
static async handleGoogleStream(
param: StreamParams,
dataCallback: (data) => void,
) {
static async handleGoogleStream(param: StreamParams, dataCallback: (data: any) => void) {
return this.provider.handleStream(
{
systemPrompt: param.systemPrompt,

View File

@@ -1,17 +1,19 @@
import { OpenAI } from "openai";
import {
_NotCustomized,
ISimpleType,
ModelPropertiesDeclarationToProperties,
ModelSnapshotType2,
UnionStringArray,
} from "mobx-state-tree";
import { BaseChatProvider, CommonProviderParams } from "./chat-stream-provider.ts";
} from 'mobx-state-tree';
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, CommonProviderParams } from './chat-stream-provider.ts';
export class GroqChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: "https://api.groq.com/openai/v1",
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.groq,
apiKey: param.env.GROQ_API_KEY,
});
}
@@ -29,17 +31,17 @@ export class GroqChatProvider extends BaseChatProvider {
model: param.model,
messages: safeMessages,
stream: true,
...tuningParams
...tuningParams,
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === "stop") {
dataCallback({ type: "chat", data: chunk });
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: "chat", data: chunk });
dataCallback({ type: 'chat', data: chunk });
return false;
}
}

View File

@@ -0,0 +1,8 @@
export * from './claude.ts';
export * from './cerebras.ts';
export * from './cloudflareAi.ts';
export * from './fireworks.ts';
export * from './groq.ts';
export * from './mlx-omni.ts';
export * from './ollama.ts';
export * from './xai.ts';

View File

@@ -0,0 +1,97 @@
import { OpenAI } from 'openai';
import { type ChatCompletionCreateParamsStreaming } from 'openai/resources/chat/completions/completions';
import { Common } from '../utils';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class MlxOmniChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: 'http://localhost:10240',
apiKey: param.env.MLX_API_KEY,
});
}
getStreamParams(
param: CommonProviderParams,
safeMessages: any[],
): ChatCompletionCreateParamsStreaming {
const baseTuningParams = {
temperature: 0.86,
top_p: 0.98,
presence_penalty: 0.1,
frequency_penalty: 0.3,
max_tokens: param.maxTokens as number,
};
const getTuningParams = () => {
return baseTuningParams;
};
let completionRequest: ChatCompletionCreateParamsStreaming = {
model: param.model,
stream: true,
messages: safeMessages,
};
const client = this.getOpenAIClient(param);
const isLocal = client.baseURL.includes('localhost');
if (isLocal) {
completionRequest['messages'] = Common.Utils.normalizeWithBlanks(safeMessages);
completionRequest['stream_options'] = {
include_usage: true,
};
} else {
completionRequest = { ...completionRequest, ...getTuningParams() };
}
return completionRequest;
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
const isLocal = chunk.usage !== undefined;
if (isLocal && chunk.usage) {
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: { content: '' },
logprobs: null,
finish_reason: 'stop',
},
],
},
});
return true; // Break the stream
}
dataCallback({ type: 'chat', data: chunk });
return false; // Continue the stream
}
}
export class MlxOmniChatSdk {
private static provider = new MlxOmniChatProvider();
static async handleMlxOmniStream(ctx: any, dataCallback: (data: any) => any) {
if (!ctx.messages?.length) {
return new Response('No messages provided', { status: 400 });
}
return this.provider.handleStream(
{
systemPrompt: ctx.systemPrompt,
preprocessedContext: ctx.preprocessedContext,
maxTokens: ctx.maxTokens,
messages: Common.Utils.normalizeWithBlanks(ctx.messages),
model: ctx.model,
env: ctx.env,
},
dataCallback,
);
}
}
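The MLX provider above ends the stream on the presence of a `usage` payload rather than a `finish_reason`; a hedged sketch of the two chunk shapes involved (field values are made up for illustration):

```typescript
// Mid-stream chunk: forwarded as-is, streaming continues (processChunk returns false).
const midStreamChunk = {
  choices: [{ delta: { content: 'partial text' }, finish_reason: null }],
};

// Final chunk in local mode: usage is present because stream_options.include_usage was set,
// so processChunk emits a synthetic finish_reason: 'stop' delta and returns true.
const finalLocalChunk = {
  usage: { prompt_tokens: 12, completion_tokens: 34, total_tokens: 46 },
  choices: [],
};
```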

View File

@@ -0,0 +1,75 @@
import { OpenAI } from 'openai';
import type { GenericEnv } from '../types';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class OllamaChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: param.env.OLLAMA_API_ENDPOINT ?? ProviderRepository.OPENAI_COMPAT_ENDPOINTS.ollama,
apiKey: param.env.OLLAMA_API_KEY,
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
const tuningParams = {
temperature: 0.75,
};
const getTuningParams = () => {
return tuningParams;
};
return {
model: param.model,
messages: safeMessages,
stream: true,
...getTuningParams(),
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
export class OllamaChatSdk {
private static provider = new OllamaChatProvider();
static async handleOllamaStream(
ctx: {
openai: OpenAI;
systemPrompt: any;
preprocessedContext: any;
maxTokens: unknown | number | undefined;
messages: any;
model: any;
env: GenericEnv;
},
dataCallback: (data: any) => any,
) {
if (!ctx.messages?.length) {
return new Response('No messages provided', { status: 400 });
}
return this.provider.handleStream(
{
systemPrompt: ctx.systemPrompt,
preprocessedContext: ctx.preprocessedContext,
maxTokens: ctx.maxTokens,
messages: ctx.messages,
model: ctx.model,
env: ctx.env,
},
dataCallback,
);
}
}

View File

@@ -1,16 +1,21 @@
import { OpenAI } from "openai";
import { Utils } from "../lib/utils.ts";
import { ChatCompletionCreateParamsStreaming } from "openai/resources/chat/completions/completions";
import { BaseChatProvider, CommonProviderParams } from "./chat-stream-provider.ts";
import { OpenAI } from 'openai';
import type { ChatCompletionCreateParamsStreaming } from 'openai/resources/chat/completions/completions';
import { Common } from '../utils';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class OpenAiChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return param.openai as OpenAI;
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): ChatCompletionCreateParamsStreaming {
getStreamParams(
param: CommonProviderParams,
safeMessages: any[],
): ChatCompletionCreateParamsStreaming {
const isO1 = () => {
if (param.model === "o1-preview" || param.model === "o1-mini") {
if (param.model === 'o1-preview' || param.model === 'o1-mini') {
return true;
}
};
@@ -27,8 +32,8 @@ export class OpenAiChatProvider extends BaseChatProvider {
const getTuningParams = () => {
if (isO1()) {
tuningParams["temperature"] = 1;
tuningParams["max_completion_tokens"] = (param.maxTokens as number) + 10000;
tuningParams['temperature'] = 1;
tuningParams['max_completion_tokens'] = (param.maxTokens as number) + 10000;
return tuningParams;
}
return gpt4oTuningParams;
@@ -37,19 +42,19 @@ export class OpenAiChatProvider extends BaseChatProvider {
let completionRequest: ChatCompletionCreateParamsStreaming = {
model: param.model,
stream: true,
messages: safeMessages
messages: safeMessages,
};
const client = this.getOpenAIClient(param);
const isLocal = client.baseURL.includes("localhost");
const isLocal = client.baseURL.includes('localhost');
if(isLocal) {
completionRequest["messages"] = Utils.normalizeWithBlanks(safeMessages);
completionRequest["stream_options"] = {
include_usage: true
if (isLocal) {
completionRequest['messages'] = Common.Utils.normalizeWithBlanks(safeMessages);
completionRequest['stream_options'] = {
include_usage: true,
};
} else {
completionRequest = {...completionRequest, ...getTuningParams()};
completionRequest = { ...completionRequest, ...getTuningParams() };
}
return completionRequest;
@@ -60,13 +65,13 @@ export class OpenAiChatProvider extends BaseChatProvider {
if (isLocal && chunk.usage) {
dataCallback({
type: "chat",
type: 'chat',
data: {
choices: [
{
delta: { content: "" },
delta: { content: '' },
logprobs: null,
finish_reason: "stop",
finish_reason: 'stop',
},
],
},
@@ -74,7 +79,7 @@ export class OpenAiChatProvider extends BaseChatProvider {
return true; // Break the stream
}
dataCallback({ type: "chat", data: chunk });
dataCallback({ type: 'chat', data: chunk });
return false; // Continue the stream
}
}
@@ -95,7 +100,7 @@ export class OpenAiChatSdk {
dataCallback: (data: any) => any,
) {
if (!ctx.messages?.length) {
return new Response("No messages provided", { status: 400 });
return new Response('No messages provided', { status: 400 });
}
return this.provider.handleStream(

View File

@@ -0,0 +1,75 @@
import { OpenAI } from 'openai';
import type { GenericEnv, GenericStreamData } from '../types';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class XaiChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: 'https://api.x.ai/v1',
apiKey: param.env.XAI_API_KEY,
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
const tuningParams = {
temperature: 0.75,
};
const getTuningParams = () => {
return tuningParams;
};
return {
model: param.model,
messages: safeMessages,
stream: true,
...getTuningParams(),
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
export class XaiChatSdk {
private static provider = new XaiChatProvider();
static async handleXaiStream(
ctx: {
openai: OpenAI;
systemPrompt: any;
preprocessedContext: any;
maxTokens: unknown | number | undefined;
messages: any;
model: any;
env: GenericEnv;
},
dataCallback: (data: GenericStreamData) => any,
) {
if (!ctx.messages?.length) {
return new Response('No messages provided', { status: 400 });
}
return this.provider.handleStream(
{
systemPrompt: ctx.systemPrompt,
preprocessedContext: ctx.preprocessedContext,
maxTokens: ctx.maxTokens,
messages: ctx.messages,
model: ctx.model,
env: ctx.env,
disableWebhookGeneration: ctx.disableWebhookGeneration,
},
dataCallback,
);
}
}

View File

@@ -0,0 +1 @@
export * from './types.ts';

View File

@@ -0,0 +1,5 @@
{
"name": "@open-gsio/types",
"type": "module",
"module": "index.ts"
}

View File

@@ -0,0 +1,29 @@
import { ProviderRepository } from '../providers/_ProviderRepository.ts';
export type GenericEnv = Record<string, any>;
export type GenericStreamData = any;
export type ModelMeta = {
id: any;
} & Record<string, any>;
export type SupportedProvider = keyof typeof ProviderRepository.OPENAI_COMPAT_ENDPOINTS & string;
export type Provider = { name: SupportedProvider; key: string; endpoint: string };
export type Providers = Provider[];
export type ChatRequestBody = {
messages: any[];
model: string;
conversationId: string;
};
export interface BuildAssistantPromptParams {
maxTokens: any;
}
export interface PreprocessParams {
messages: any[];
}

View File

@@ -22,15 +22,9 @@ interface StreamResponse {
};
}
const handleStreamData = (
controller: ReadableStreamDefaultController,
encoder: TextEncoder,
) => {
return (
data: StreamResponse,
transformFn?: (data: StreamResponse) => StreamResponse,
) => {
if (!data?.type || data.type !== "chat") {
const handleStreamData = (controller: ReadableStreamDefaultController, encoder: TextEncoder) => {
return (data: StreamResponse, transformFn?: (data: StreamResponse) => StreamResponse) => {
if (!data?.type || data.type !== 'chat') {
return;
}
@@ -39,17 +33,14 @@ const handleStreamData = (
if (transformFn) {
transformedData = transformFn(data);
} else {
if (
data.data.type === "content_block_start" &&
data.data.content_block?.type === "text"
) {
if (data.data.type === 'content_block_start' && data.data.content_block?.type === 'text') {
transformedData = {
type: "chat",
type: 'chat',
data: {
choices: [
{
delta: {
content: data.data.content_block.text || "",
content: data.data.content_block.text || '',
},
logprobs: null,
finish_reason: null,
@@ -59,7 +50,7 @@ const handleStreamData = (
};
} else if (data.data.delta?.text) {
transformedData = {
type: "chat",
type: 'chat',
data: {
choices: [
{
@@ -74,7 +65,7 @@ const handleStreamData = (
};
} else if (data.data.choices?.[0]?.delta?.content) {
transformedData = {
type: "chat",
type: 'chat',
data: {
choices: [
{
@@ -95,9 +86,7 @@ const handleStreamData = (
}
}
controller.enqueue(
encoder.encode(`data: ${JSON.stringify(transformedData)}\n\n`),
);
controller.enqueue(encoder.encode(`data: ${JSON.stringify(transformedData)}\n\n`));
};
};
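For context on the transform above, a sketch of one Anthropic-style event passing through `handleStreamData` and the OpenAI-style SSE frame it becomes; the field values are illustrative, while the shapes follow the branches shown in the diff:

```typescript
// Input as matched by the content_block_start branch above.
const incoming = {
  type: 'chat',
  data: { type: 'content_block_start', content_block: { type: 'text', text: 'Hi' } },
};

// Frame enqueued on the ReadableStream (a single SSE "data:" line, double-newline terminated):
// data: {"type":"chat","data":{"choices":[{"delta":{"content":"Hi"},"logprobs":null,"finish_reason":null}]}}
```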

View File

@@ -0,0 +1,3 @@
import * as Common from './utils.ts';
export { Common };

View File

@@ -1,20 +1,19 @@
import handleStreamData from './handleStreamData.ts';
export class Utils {
static getSeason(date: string): string {
const hemispheres = {
Northern: ["Winter", "Spring", "Summer", "Autumn"],
Southern: ["Summer", "Autumn", "Winter", "Spring"],
Northern: ['Winter', 'Spring', 'Summer', 'Autumn'],
Southern: ['Summer', 'Autumn', 'Winter', 'Spring'],
};
const d = new Date(date);
const month = d.getMonth();
const day = d.getDate();
const hemisphere = "Northern";
const hemisphere = 'Northern';
if (month < 2 || (month === 2 && day <= 20) || month === 11)
return hemispheres[hemisphere][0];
if (month < 5 || (month === 5 && day <= 21))
return hemispheres[hemisphere][1];
if (month < 8 || (month === 8 && day <= 22))
return hemispheres[hemisphere][2];
if (month < 2 || (month === 2 && day <= 20) || month === 11) return hemispheres[hemisphere][0];
if (month < 5 || (month === 5 && day <= 21)) return hemispheres[hemisphere][1];
if (month < 8 || (month === 8 && day <= 22)) return hemispheres[hemisphere][2];
return hemispheres[hemisphere][3];
}
static getTimezone(timezone) {
@@ -30,18 +29,16 @@ export class Utils {
static isAssetUrl(url) {
const { pathname } = new URL(url);
return pathname.startsWith("/assets/");
return pathname.startsWith('/assets/');
}
static selectEquitably({ a, b, c, d }, itemCount = 9) {
const sources = [a, b, c, d];
const result = {};
let combinedItems = [];
let combinedItems: any[] = [];
sources.forEach((source, index) => {
combinedItems.push(
...Object.keys(source).map((key) => ({ source: index, key })),
);
combinedItems.push(...Object.keys(source).map(key => ({ source: index, key })));
});
combinedItems = combinedItems.sort(() => Math.random() - 0.5);
@@ -60,37 +57,37 @@ export class Utils {
return result;
}
static normalizeWithBlanks<T extends Normalize.ChatMessage>(msgs: T[]): T[] {
static normalizeWithBlanks<T extends NormalizeChatMessage>(msgs: T[]): T[] {
const out: T[] = [];
// In local mode first turn expected to be user.
let expected: Normalize.Role = "user";
let expected: NormalizeRole = 'user';
for (const m of msgs) {
while (m.role !== expected) {
// Insert blanks to match expected sequence user/assistant/user...
out.push(Normalize.makeBlank(expected) as T);
expected = expected === "user" ? "assistant" : "user";
out.push(makeNormalizeBlank(expected) as T);
expected = expected === 'user' ? 'assistant' : 'user';
}
out.push(m);
expected = expected === "user" ? "assistant" : "user";
expected = expected === 'user' ? 'assistant' : 'user';
}
return out;
}
static handleStreamData = handleStreamData;
}
module Normalize {
export type Role = "user" | "assistant";
// Normalize module exports
export type NormalizeRole = 'user' | 'assistant';
export interface ChatMessage extends Record<any, any> {
role: Role;
}
export const makeBlank = (role: Role): ChatMessage => ({
role,
content: ""
});
export interface NormalizeChatMessage extends Record<any, any> {
role: NormalizeRole;
}
export const makeNormalizeBlank = (role: NormalizeRole): NormalizeChatMessage => ({
role,
content: '',
});
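A small usage sketch of `normalizeWithBlanks` above, showing the blank assistant turn inserted when two user messages arrive back to back (message objects trimmed to `role`/`content`):

```typescript
import { Utils } from './utils.ts';

type Msg = { role: 'user' | 'assistant'; content: string };

const input: Msg[] = [
  { role: 'user', content: 'first question' },
  { role: 'user', content: 'follow-up sent before any reply' },
];

console.log(Utils.normalizeWithBlanks(input));
// [
//   { role: 'user', content: 'first question' },
//   { role: 'assistant', content: '' },               // blank turn restores user/assistant alternation
//   { role: 'user', content: 'follow-up sent before any reply' },
// ]
```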

View File

@@ -1,88 +0,0 @@
const SUPPORTED_MODELS_GROUPS = {
openai: [
// "o1-preview",
// "o1-mini",
// "gpt-4o",
// "gpt-3.5-turbo"
],
groq: [
// "mixtral-8x7b-32768",
// "deepseek-r1-distill-llama-70b",
"meta-llama/llama-4-scout-17b-16e-instruct",
"gemma2-9b-it",
"mistral-saba-24b",
// "qwen-2.5-32b",
"llama-3.3-70b-versatile",
// "llama-3.3-70b-versatile"
// "llama-3.1-70b-versatile",
// "llama-3.3-70b-versatile"
],
cerebras: ["llama-3.3-70b"],
claude: [
// "claude-3-5-sonnet-20241022",
// "claude-3-opus-20240229"
],
fireworks: [
// "llama-v3p1-405b-instruct",
// "llama-v3p1-70b-instruct",
// "llama-v3p2-90b-vision-instruct",
// "mixtral-8x22b-instruct",
// "mythomax-l2-13b",
// "yi-large"
],
google: [
// "gemini-2.0-flash-exp",
// "gemini-1.5-flash",
// "gemini-exp-1206",
// "gemini-1.5-pro"
],
xai: [
// "grok-beta",
// "grok-2",
// "grok-2-1212",
// "grok-2-latest",
// "grok-beta"
],
cloudflareAI: [
"llama-3.2-3b-instruct", // max_tokens
"llama-3-8b-instruct", // max_tokens
"llama-3.1-8b-instruct-fast", // max_tokens
"deepseek-math-7b-instruct",
"deepseek-coder-6.7b-instruct-awq",
"hermes-2-pro-mistral-7b",
"openhermes-2.5-mistral-7b-awq",
"mistral-7b-instruct-v0.2",
"neural-chat-7b-v3-1-awq",
"openchat-3.5-0106",
// "gemma-7b-it",
],
};
export type SupportedModel =
| keyof typeof SUPPORTED_MODELS_GROUPS
| (typeof SUPPORTED_MODELS_GROUPS)[keyof typeof SUPPORTED_MODELS_GROUPS][number];
export type ModelFamily = keyof typeof SUPPORTED_MODELS_GROUPS;
function getModelFamily(model: string): ModelFamily | undefined {
return Object.keys(SUPPORTED_MODELS_GROUPS)
.filter((family) => {
return SUPPORTED_MODELS_GROUPS[
family as keyof typeof SUPPORTED_MODELS_GROUPS
].includes(model.trim());
})
.at(0) as ModelFamily | undefined;
}
const SUPPORTED_MODELS = [
// ...SUPPORTED_MODELS_GROUPS.xai,
// ...SUPPORTED_MODELS_GROUPS.claude,
// ...SUPPORTED_MODELS_GROUPS.google,
...SUPPORTED_MODELS_GROUPS.groq,
// ...SUPPORTED_MODELS_GROUPS.fireworks,
// ...SUPPORTED_MODELS_GROUPS.openai,
// ...SUPPORTED_MODELS_GROUPS.cerebras,
// ...SUPPORTED_MODELS_GROUPS.cloudflareAI,
];
export { SUPPORTED_MODELS, SUPPORTED_MODELS_GROUPS, getModelFamily };

View File

@@ -0,0 +1,9 @@
{
"extends": "../../tsconfig.json",
"compilerOptions": {
"outDir": "dist",
"rootDir": "."
},
"include": ["*.ts"],
"exclude": ["node_modules"]
}

View File

@@ -5,18 +5,26 @@
"dev": "bun vite dev",
"build": "bun vite build",
"tests": "vitest run",
"tests:coverage": "vitest run --coverage.enabled=true"
"tests:coverage": "vitest run --coverage.enabled=true",
"generate:sitemap": "bun ./scripts/generate_sitemap.js open-gsio.seemueller.workers.dev",
"generate:robotstxt": "bun ./scripts/generate_robots_txt.js open-gsio.seemueller.workers.dev",
"generate:fonts": "cp -r ../../node_modules/katex/dist/fonts public/static"
},
"dependencies": {
"exports": {
"./server/index.ts": {
"import": "./server/index.ts",
"types": "./server/index.ts"
}
},
"devDependencies": {
"@open-gsio/env": "workspace:*",
"@open-gsio/scripts": "workspace:*",
"@anthropic-ai/sdk": "^0.32.1",
"@chakra-ui/react": "^2.10.6",
"@cloudflare/workers-types": "^4.20241205.0",
"@emotion/react": "^11.13.5",
"@emotion/styled": "^11.13.5",
"@testing-library/jest-dom": "^6.4.2",
"@testing-library/react": "^14.2.1",
"@testing-library/react": "^16.3.0",
"@testing-library/user-event": "^14.5.2",
"@types/marked": "^6.0.0",
"@vitejs/plugin-react": "^4.3.4",
@@ -38,16 +46,18 @@
"mobx-state-tree": "^6.0.1",
"moo": "^0.5.2",
"qrcode.react": "^4.1.0",
"react": "^18.3.1",
"react-dom": "^18.3.1",
"react": "^19.1.0",
"react-dom": "^19.1.0",
"react-icons": "^5.4.0",
"react-streaming": "^0.3.44",
"react-textarea-autosize": "^8.5.5",
"shiki": "^1.24.0",
"typescript": "^5.7.2",
"vike": "0.4.193",
"vite": "^6.3.5",
"vike": "^0.4.235",
"vite": "^7.0.0",
"vite-plugin-pwa": "^1.0.0",
"vitest": "^3.1.4"
"vitest": "^3.1.4",
"bun": "^1.2.17",
"@types/bun": "^1.2.17"
}
}

View File

@@ -15,30 +15,29 @@
};
function s() {
var i = [
g(m(4)) + "=" + g(m(6)),
"ga=" + t.ga_tid,
"dt=" + r(e.title),
"de=" + r(e.characterSet || e.charset),
"dr=" + r(e.referrer),
"ul=" + (n.language || n.browserLanguage || n.userLanguage),
"sd=" + a.colorDepth + "-bit",
"sr=" + a.width + "x" + a.height,
"vp=" +
g(m(4)) + '=' + g(m(6)),
'ga=' + t.ga_tid,
'dt=' + r(e.title),
'de=' + r(e.characterSet || e.charset),
'dr=' + r(e.referrer),
'ul=' + (n.language || n.browserLanguage || n.userLanguage),
'sd=' + a.colorDepth + '-bit',
'sr=' + a.width + 'x' + a.height,
'vp=' +
o(e.documentElement.clientWidth, t.innerWidth || 0) +
"x" +
'x' +
o(e.documentElement.clientHeight, t.innerHeight || 0),
"plt=" + c(d.loadEventStart - d.navigationStart || 0),
"dns=" + c(d.domainLookupEnd - d.domainLookupStart || 0),
"pdt=" + c(d.responseEnd - d.responseStart || 0),
"rrt=" + c(d.redirectEnd - d.redirectStart || 0),
"tcp=" + c(d.connectEnd - d.connectStart || 0),
"srt=" + c(d.responseStart - d.requestStart || 0),
"dit=" + c(d.domInteractive - d.domLoading || 0),
"clt=" + c(d.domContentLoadedEventStart - d.navigationStart || 0),
"z=" + Date.now(),
'plt=' + c(d.loadEventStart - d.navigationStart || 0),
'dns=' + c(d.domainLookupEnd - d.domainLookupStart || 0),
'pdt=' + c(d.responseEnd - d.responseStart || 0),
'rrt=' + c(d.redirectEnd - d.redirectStart || 0),
'tcp=' + c(d.connectEnd - d.connectStart || 0),
'srt=' + c(d.responseStart - d.requestStart || 0),
'dit=' + c(d.domInteractive - d.domLoading || 0),
'clt=' + c(d.domContentLoadedEventStart - d.navigationStart || 0),
'z=' + Date.now(),
];
(t.__ga_img = new Image()), (t.__ga_img.src = t.ga_api + "?" + i.join("&"));
((t.__ga_img = new Image()), (t.__ga_img.src = t.ga_api + '?' + i.join('&')));
}
(t.cfga = s),
"complete" === e.readyState ? s() : t.addEventListener("load", s);
((t.cfga = s), 'complete' === e.readyState ? s() : t.addEventListener('load', s));
})(window, document, navigator);

View File

@@ -1,17 +1,17 @@
#!/usr/bin/env bun
/* eslint-env node */
import fs from "fs";
import {parseArgs} from "util";
import fs from 'fs';
import { parseArgs } from 'util';
const {positionals} = parseArgs({
const { positionals } = parseArgs({
args: Bun.argv,
options: {},
strict: true,
allowPositionals: true,
});
const currentDate = new Date().toISOString().split("T")[0];
const currentDate = new Date().toISOString().split('T')[0];
const host = positionals[2];
@@ -25,12 +25,12 @@ Disallow: /assets
Sitemap: https://${host}/sitemap.xml
`;
const robotsTxtPath = "./public/robots.txt";
const robotsTxtPath = './public/robots.txt';
fs.writeFile(robotsTxtPath, robotsTxtTemplate, (err) => {
fs.writeFile(robotsTxtPath, robotsTxtTemplate, err => {
if (err) {
console.error("Error writing robots.txt:", err);
console.error('Error writing robots.txt:', err);
process.exit(1);
}
console.log("robots.txt created successfully:", currentDate);
console.log('robots.txt created successfully:', currentDate);
});

View File

@@ -1,17 +1,16 @@
#!/usr/bin/env bun
import fs from "fs";
import {parseArgs} from "util";
import fs from 'fs';
import { parseArgs } from 'util';
const {positionals} = parseArgs({
const { positionals } = parseArgs({
args: Bun.argv,
options: {},
strict: true,
allowPositionals: true,
});
const currentDate = new Date().toISOString().split("T")[0];
const currentDate = new Date().toISOString().split('T')[0];
const host = positionals[2];
@@ -30,12 +29,12 @@ const sitemapTemplate = `<?xml version="1.0" encoding="UTF-8"?>
</url>
</urlset>`;
const sitemapPath = "./public/sitemap.xml";
const sitemapPath = './public/sitemap.xml';
fs.writeFile(sitemapPath, sitemapTemplate, (err) => {
fs.writeFile(sitemapPath, sitemapTemplate, err => {
if (err) {
console.error("Error writing sitemap file:", err);
console.error('Error writing sitemap file:', err);
process.exit(1);
}
console.log("Sitemap updated successfully with current date:", currentDate);
console.log('Sitemap updated successfully with current date:', currentDate);
});

View File

@@ -0,0 +1,20 @@
import { renderPage } from 'vike/server';
// This is what makes SSR possible. It is consumed by @open-gsio/server
export { handleSsr };
async function handleSsr(url: string, headers: Headers) {
const pageContextInit = {
urlOriginal: url,
headersOriginal: headers,
fetch: (...args: Parameters<typeof fetch>) => fetch(...args),
};
const pageContext = await renderPage(pageContextInit);
const { httpResponse } = pageContext;
const stream = httpResponse.getReadableWebStream();
return new Response(stream, {
headers: httpResponse.headers,
status: httpResponse.statusCode,
});
}
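A hedged sketch of how a server could consume `handleSsr`; the `@open-gsio/client/server/index.ts` specifier assumes the exports entry added to the client `package.json` above, and the asset/API short-circuit is illustrative rather than taken from the repository:

```typescript
import { handleSsr } from '@open-gsio/client/server/index.ts';

// Hypothetical fetch-style entry point: anything that is not a static asset or API call is SSR-rendered.
async function handleRequest(request: Request): Promise<Response> {
  const url = new URL(request.url);
  if (url.pathname.startsWith('/assets/') || url.pathname.startsWith('/api/')) {
    return new Response('handled by other services', { status: 404 });
  }
  return handleSsr(request.url, request.headers);
}
```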

View File

@@ -1,7 +1,8 @@
import React from "react";
import { IconButton } from "@chakra-ui/react";
import { LucideHammer } from "lucide-react";
import { toolbarButtonZIndex } from "./toolbar/Toolbar";
import { IconButton } from '@chakra-ui/react';
import { LucideHammer } from 'lucide-react';
import React from 'react';
import { toolbarButtonZIndex } from './toolbar/Toolbar';
export default function BuiltWithButton() {
return (
@@ -12,12 +13,12 @@ export default function BuiltWithButton() {
bg="transparent"
stroke="text.accent"
color="text.accent"
onClick={() => alert("Built by Geoff Seemueller")}
onClick={() => alert('Built by GSIO')}
_hover={{
bg: "transparent",
bg: 'transparent',
svg: {
stroke: "accent.secondary",
transition: "stroke 0.3s ease-in-out",
stroke: 'accent.secondary',
transition: 'stroke 0.3s ease-in-out',
},
}}
zIndex={toolbarButtonZIndex}

View File

@@ -1,10 +1,12 @@
import { getColorThemes } from "../layout/theme/color-themes";
import { Center, IconButton, VStack } from "@chakra-ui/react";
import userOptionsStore from "../stores/UserOptionsStore";
import { Circle } from "lucide-react";
import { toolbarButtonZIndex } from "./toolbar/Toolbar";
import React from "react";
import { useIsMobile } from "./contexts/MobileContext";
import { Center, IconButton, VStack } from '@chakra-ui/react';
import { Circle } from 'lucide-react';
import React from 'react';
import { getColorThemes } from '../layout/theme/color-themes';
import userOptionsStore from '../stores/UserOptionsStore';
import { useIsMobile } from './contexts/MobileContext';
import { toolbarButtonZIndex } from './toolbar/Toolbar';
export function ThemeSelectionOptions() {
const children = [];
@@ -24,11 +26,11 @@ export function ThemeSelectionOptions() {
size={!isMobile ? 16 : 20}
stroke="transparent"
style={{
background: `conic-gradient(${theme.colors.background.primary.startsWith("#") ? theme.colors.background.primary : theme.colors.background.secondary} 0 50%, ${theme.colors.text.secondary} 50% 100%)`,
borderRadius: "50%",
boxShadow: "0 0 0.5px 0.25px #fff",
cursor: "pointer",
transition: "background 0.2s",
background: `conic-gradient(${theme.colors.background.primary.startsWith('#') ? theme.colors.background.primary : theme.colors.background.secondary} 0 50%, ${theme.colors.text.secondary} 50% 100%)`,
borderRadius: '50%',
boxShadow: '0 0 0.5px 0.25px #fff',
cursor: 'pointer',
transition: 'background 0.2s',
}}
/>
}
@@ -38,7 +40,7 @@ export function ThemeSelectionOptions() {
color="transparent"
_hover={{
svg: {
transition: "stroke 0.3s ease-in-out", // Smooth transition effect
transition: 'stroke 0.3s ease-in-out', // Smooth transition effect
},
}}
zIndex={toolbarButtonZIndex}
@@ -47,7 +49,7 @@ export function ThemeSelectionOptions() {
}
return (
<VStack align={!isMobile ? "end" : "start"} p={1.2}>
<VStack align={!isMobile ? 'end' : 'start'} p={1.2}>
<Center>{children}</Center>
</VStack>
);

View File

@@ -1,11 +1,9 @@
import { motion } from "framer-motion";
import { Box, Center, VStack } from "@chakra-ui/react";
import {
welcome_home_text,
welcome_home_tip,
} from "../static-data/welcome_home_text";
import {renderMarkdown} from "./markdown/MarkdownComponent";
import { Box, Center, VStack } from '@chakra-ui/react';
import { motion } from 'framer-motion';
import { welcome_home_text, welcome_home_tip } from '../static-data/welcome_home_text';
import { renderMarkdown } from './markdown/MarkdownComponent';
function WelcomeHomeMessage({ visible }) {
const containerVariants = {
@@ -45,33 +43,19 @@ function WelcomeHomeMessage({ visible }) {
<Center>
<VStack spacing={8} align="center" maxW="400px">
{/* Welcome Message */}
<Box
fontSize="sm"
fontStyle="italic"
textAlign="center"
color="text.secondary"
mt={4}
>
<Box fontSize="sm" fontStyle="italic" textAlign="center" color="text.secondary" mt={4}>
<motion.div
variants={containerVariants}
initial="hidden"
animate={visible ? "visible" : "hidden"}
animate={visible ? 'visible' : 'hidden'}
>
<Box userSelect={"none"}>
<motion.div variants={textVariants}>
{renderMarkdown(welcome_home_text)}
</motion.div>
<Box userSelect={'none'}>
<motion.div variants={textVariants}>{renderMarkdown(welcome_home_text)}</motion.div>
</Box>
</motion.div>
</Box>
<motion.div variants={textVariants}>
<Box
fontSize="sm"
fontStyle="italic"
textAlign="center"
color="text.secondary"
mt={1}
>
<Box fontSize="sm" fontStyle="italic" textAlign="center" color="text.secondary" mt={1}>
{renderMarkdown(welcome_home_tip)}
</Box>
</motion.div>

View File

@@ -1,37 +1,38 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { render, screen, fireEvent } from '@testing-library/react';
import { ThemeSelectionOptions } from '../ThemeSelection';
import { describe, it, expect, vi, beforeEach } from 'vitest';
import userOptionsStore from '../../stores/UserOptionsStore';
import * as MobileContext from '../contexts/MobileContext';
import { ThemeSelectionOptions } from '../ThemeSelection';
// Mock dependencies
vi.mock('../../layout/theme/color-themes', () => ({
getColorThemes: () => [
{
name: 'light',
colors: {
{
name: 'light',
colors: {
background: { primary: '#ffffff', secondary: '#f0f0f0' },
text: { secondary: '#333333' }
}
text: { secondary: '#333333' },
},
},
{
name: 'dark',
colors: {
{
name: 'dark',
colors: {
background: { primary: '#121212', secondary: '#1e1e1e' },
text: { secondary: '#e0e0e0' }
}
}
]
text: { secondary: '#e0e0e0' },
},
},
],
}));
vi.mock('../../stores/UserOptionsStore', () => ({
default: {
selectTheme: vi.fn()
}
selectTheme: vi.fn(),
},
}));
vi.mock('../toolbar/Toolbar', () => ({
toolbarButtonZIndex: 100
toolbarButtonZIndex: 100,
}));
describe('ThemeSelectionOptions', () => {
@@ -42,20 +43,20 @@ describe('ThemeSelectionOptions', () => {
it('renders theme options for desktop view', () => {
// Mock useIsMobile to return false (desktop view)
vi.spyOn(MobileContext, 'useIsMobile').mockReturnValue(false);
render(<ThemeSelectionOptions />);
// Should render 2 theme buttons (from our mock)
const buttons = screen.getAllByRole("button")
const buttons = screen.getAllByRole('button');
expect(buttons).toHaveLength(2);
});
it('renders theme options for mobile view', () => {
// Mock useIsMobile to return true (mobile view)
vi.spyOn(MobileContext, 'useIsMobile').mockReturnValue(true);
render(<ThemeSelectionOptions />);
// Should still render 2 theme buttons
const buttons = screen.getAllByRole('button');
expect(buttons).toHaveLength(2);
@@ -63,16 +64,16 @@ describe('ThemeSelectionOptions', () => {
it('calls selectTheme when a theme button is clicked', () => {
vi.spyOn(MobileContext, 'useIsMobile').mockReturnValue(false);
render(<ThemeSelectionOptions />);
const buttons = screen.getAllByRole('button');
fireEvent.click(buttons[0]); // Click the first theme button (light)
// Verify that selectTheme was called with the correct theme name
expect(userOptionsStore.selectTheme).toHaveBeenCalledWith('light');
fireEvent.click(buttons[1]); // Click the second theme button (dark)
expect(userOptionsStore.selectTheme).toHaveBeenCalledWith('dark');
});
});
});

View File

@@ -1,22 +1,23 @@
import { describe, it, expect } from 'vitest';
import { render, screen } from '@testing-library/react';
import WelcomeHomeMessage from '../WelcomeHome';
import { describe, it, expect } from 'vitest';
import { welcome_home_text, welcome_home_tip } from '../../static-data/welcome_home_text';
import { renderMarkdown } from '../markdown/MarkdownComponent';
import WelcomeHomeMessage from '../WelcomeHome';
// Mock the renderMarkdown function
vi.mock('../markdown/MarkdownComponent', () => ({
renderMarkdown: vi.fn((text) => `Rendered: ${text}`),
renderMarkdown: vi.fn(text => `Rendered: ${text}`),
}));
describe('WelcomeHomeMessage', () => {
it('renders correctly when visible', () => {
render(<WelcomeHomeMessage visible={true} />);
// Check if the rendered markdown content is in the document
expect(screen.getByText(`Rendered: ${welcome_home_text}`)).toBeInTheDocument();
expect(screen.getByText(`Rendered: ${welcome_home_tip}`)).toBeInTheDocument();
// Verify that renderMarkdown was called with the correct arguments
expect(renderMarkdown).toHaveBeenCalledWith(welcome_home_text);
expect(renderMarkdown).toHaveBeenCalledWith(welcome_home_tip);
@@ -24,17 +25,17 @@ describe('WelcomeHomeMessage', () => {
it('applies animation variants based on visible prop', () => {
const { rerender } = render(<WelcomeHomeMessage visible={true} />);
// When visible is true, the component should have the visible animation state
// Since we've mocked framer-motion, we can't directly test the animation state
// But we can verify that the component renders the content
expect(screen.getByText(`Rendered: ${welcome_home_text}`)).toBeInTheDocument();
// Re-render with visible=false
rerender(<WelcomeHomeMessage visible={false} />);
// Content should still be in the document even when not visible
// (since we've mocked the animations)
expect(screen.getByText(`Rendered: ${welcome_home_text}`)).toBeInTheDocument();
});
});
});

View File

@@ -1,14 +1,14 @@
import React from "react";
import { Grid, GridItem, Image, Text } from "@chakra-ui/react";
import { Grid, GridItem, Image, Text } from '@chakra-ui/react';
import React from 'react';
const fontSize = "md";
const fontSize = 'md';
function AboutComponent() {
return (
<Grid
templateColumns="1fr"
gap={4}
maxW={["100%", "100%", "100%"]}
maxW={['100%', '100%', '100%']}
mx="auto"
className="about-container"
>
@@ -17,22 +17,22 @@ function AboutComponent() {
src="/me.png"
alt="Geoff Seemueller"
borderRadius="full"
boxSize={["120px", "150px"]}
boxSize={['120px', '150px']}
objectFit="cover"
/>
</GridItem>
<GridItem
colSpan={1}
maxW={["100%", "100%", "container.md"]}
maxW={['100%', '100%', 'container.md']}
justifySelf="center"
minH={"100%"}
minH={'100%'}
>
<Grid templateColumns="1fr" gap={4} overflowY={"auto"}>
<Grid templateColumns="1fr" gap={4} overflowY={'auto'}>
<GridItem>
<Text fontSize={fontSize}>
If you're interested in collaborating on innovative projects that
push technological boundaries and create real value, I'd be keen
to connect and explore potential opportunities.
If you're interested in collaborating on innovative projects that push technological
boundaries and create real value, I'd be keen to connect and explore potential
opportunities.
</Text>
</GridItem>
</Grid>

View File

@@ -1,30 +1,26 @@
import React, { useEffect, useRef, useState } from "react";
import { observer } from "mobx-react-lite";
import { Box, Grid, GridItem } from "@chakra-ui/react";
import ChatMessages from "./messages/ChatMessages";
import ChatInput from "./input/ChatInput";
import chatStore from "../../stores/ClientChatStore";
import menuState from "../../stores/AppMenuStore";
import WelcomeHome from "../WelcomeHome";
import { Box, Grid, GridItem } from '@chakra-ui/react';
import { observer } from 'mobx-react-lite';
import React, { useEffect, useRef, useState } from 'react';
import menuState from '../../stores/AppMenuStore';
import chatStore from '../../stores/ClientChatStore';
import WelcomeHome from '../WelcomeHome';
import ChatInput from './input/ChatInput';
import ChatMessages from './messages/ChatMessages';
const Chat = observer(({ height, width }) => {
const scrollRef = useRef();
const [isAndroid, setIsAndroid] = useState(false);
useEffect(() => {
if (typeof window !== "undefined") {
if (typeof window !== 'undefined') {
setIsAndroid(/android/i.test(window.navigator.userAgent));
}
}, []);
return (
<Grid
templateRows="1fr auto"
templateColumns="1fr"
height={height}
width={width}
gap={0}
>
<Grid templateRows="1fr auto" templateColumns="1fr" height={height} width={width} gap={0}>
<GridItem alignSelf="center" hidden={!(chatStore.items.length < 1)}>
<WelcomeHome visible={chatStore.items.length < 1} />
</GridItem>
@@ -35,32 +31,17 @@ const Chat = observer(({ height, width }) => {
maxH="100%"
ref={scrollRef}
// If there are attachments, use "100px". Otherwise, use "128px" on Android, "73px" elsewhere.
pb={
isAndroid
? "128px"
: "73px"
}
pb={isAndroid ? '128px' : '73px'}
alignSelf="flex-end"
>
<ChatMessages scrollRef={scrollRef} />
</GridItem>
<GridItem
position="relative"
bg="background.primary"
zIndex={1000}
width="100%"
>
<Box
w="100%"
display="flex"
justifyContent="center"
mx="auto"
hidden={menuState.isOpen}
>
<GridItem position="relative" bg="background.primary" zIndex={1000} width="100%">
<Box w="100%" display="flex" justifyContent="center" mx="auto" hidden={menuState.isOpen}>
<ChatInput
input={chatStore.input}
setInput={(value) => chatStore.setInput(value)}
setInput={value => chatStore.setInput(value)}
handleSendMessage={chatStore.sendMessage}
isLoading={chatStore.isLoading}
/>

View File

@@ -1,16 +1,17 @@
import React from "react";
import { observer } from "mobx-react-lite";
import clientChatStore from "../../stores/ClientChatStore";
import { observer } from 'mobx-react-lite';
import React from 'react';
import clientChatStore from '../../stores/ClientChatStore';
export const IntermediateStepsComponent = observer(({ hidden }) => {
return (
<div hidden={hidden}>
{clientChatStore.intermediateSteps.map((step, index) => {
switch (step.kind) {
case "web-search": {
case 'web-search': {
return <WebSearchResult key={index} data={step.data} />;
}
case "tool-result":
case 'tool-result':
return <ToolResult key={index} data={step.data} />;
default:
return <GenericStep key={index} data={step.data} />;
@@ -45,7 +46,7 @@ export const GenericStep = ({ data }) => {
return (
<div className="generic-step">
<h3>Generic Step</h3>
<p>{data.description || "No additional information provided."}</p>
<p>{data.description || 'No additional information provided.'}</p>
</div>
);
};

View File

@@ -1,5 +1,3 @@
import React, { useRef } from "react";
import { observer } from "mobx-react-lite";
import {
Box,
Divider,
@@ -11,8 +9,10 @@ import {
Portal,
Text,
useDisclosure,
} from "@chakra-ui/react";
import { ChevronRight } from "lucide-react";
} from '@chakra-ui/react';
import { ChevronRight } from 'lucide-react';
import { observer } from 'mobx-react-lite';
import React, { useRef } from 'react';
const FlyoutSubMenu: React.FC<{
title: string;
@@ -23,15 +23,7 @@ const FlyoutSubMenu: React.FC<{
parentIsOpen: boolean;
setMenuState?: (state) => void;
}> = observer(
({
title,
flyoutMenuOptions,
onClose,
handleSelect,
isSelected,
parentIsOpen,
setMenuState,
}) => {
({ title, flyoutMenuOptions, onClose, handleSelect, isSelected, parentIsOpen, setMenuState }) => {
const { isOpen, onOpen, onClose: onSubMenuClose } = useDisclosure();
const menuRef = new useRef();
@@ -41,9 +33,9 @@ const FlyoutSubMenu: React.FC<{
placement="right-start"
isOpen={isOpen && parentIsOpen}
closeOnBlur={true}
lazyBehavior={"keepMounted"}
lazyBehavior={'keepMounted'}
isLazy={true}
onClose={(e) => {
onClose={e => {
onSubMenuClose();
}}
closeOnSelect={false}
@@ -54,12 +46,12 @@ const FlyoutSubMenu: React.FC<{
ref={menuRef}
bg="background.tertiary"
color="text.primary"
_hover={{ bg: "rgba(0, 0, 0, 0.05)" }}
_focus={{ bg: "rgba(0, 0, 0, 0.1)" }}
_hover={{ bg: 'rgba(0, 0, 0, 0.05)' }}
_focus={{ bg: 'rgba(0, 0, 0, 0.1)' }}
>
<HStack width={"100%"} justifyContent={"space-between"}>
<HStack width={'100%'} justifyContent={'space-between'}>
<Text>{title}</Text>
<ChevronRight size={"1rem"} />
<ChevronRight size={'1rem'} />
</HStack>
</MenuButton>
<Portal>
@@ -67,7 +59,7 @@ const FlyoutSubMenu: React.FC<{
key={title}
maxHeight={56}
overflowY="scroll"
visibility={"visible"}
visibility={'visible'}
minWidth="180px"
bg="background.tertiary"
boxShadow="lg"
@@ -77,43 +69,35 @@ const FlyoutSubMenu: React.FC<{
left="100%"
bottom={-10}
sx={{
"::-webkit-scrollbar": {
width: "8px",
'::-webkit-scrollbar': {
width: '8px',
},
"::-webkit-scrollbar-thumb": {
background: "background.primary",
borderRadius: "4px",
'::-webkit-scrollbar-thumb': {
background: 'background.primary',
borderRadius: '4px',
},
"::-webkit-scrollbar-track": {
background: "background.tertiary",
'::-webkit-scrollbar-track': {
background: 'background.tertiary',
},
}}
>
{flyoutMenuOptions.map((item, index) => (
<Box key={"itemflybox" + index}>
<Box key={'itemflybox' + index}>
<MenuItem
key={"itemfly" + index}
key={'itemfly' + index}
onClick={() => {
onSubMenuClose();
onClose();
handleSelect(item);
}}
bg={
isSelected(item)
? "background.secondary"
: "background.tertiary"
}
_hover={{ bg: "rgba(0, 0, 0, 0.05)" }}
_focus={{ bg: "rgba(0, 0, 0, 0.1)" }}
bg={isSelected(item) ? 'background.secondary' : 'background.tertiary'}
_hover={{ bg: 'rgba(0, 0, 0, 0.05)' }}
_focus={{ bg: 'rgba(0, 0, 0, 0.1)' }}
>
{item.name}
</MenuItem>
{index < flyoutMenuOptions.length - 1 && (
<Divider
key={item.name + "-divider"}
color="text.tertiary"
w={"100%"}
/>
<Divider key={item.name + '-divider'} color="text.tertiary" w={'100%'} />
)}
</Box>
))}


@@ -1,4 +1,3 @@
import React, { useCallback, useEffect, useRef, useState } from "react";
import {
Box,
Button,
@@ -12,204 +11,180 @@ import {
Text,
useDisclosure,
useOutsideClick,
} from "@chakra-ui/react";
import { observer } from "mobx-react-lite";
import { ChevronDown, Copy, RefreshCcw, Settings } from "lucide-react";
import ClientChatStore from "../../../stores/ClientChatStore";
import clientChatStore from "../../../stores/ClientChatStore";
import FlyoutSubMenu from "./FlyoutSubMenu";
import { useIsMobile } from "../../contexts/MobileContext";
import { useIsMobile as useIsMobileUserAgent } from "../../../hooks/_IsMobileHook";
import { getModelFamily, SUPPORTED_MODELS } from "../lib/SupportedModels";
import { formatConversationMarkdown } from "../lib/exportConversationAsMarkdown";
} from '@chakra-ui/react';
import { ChevronDown, Copy, RefreshCcw, Settings } from 'lucide-react';
import { observer } from 'mobx-react-lite';
import React, { useCallback, useEffect, useRef, useState } from 'react';
import { useIsMobile as useIsMobileUserAgent } from '../../../hooks/_IsMobileHook';
import clientChatStore from '../../../stores/ClientChatStore';
import { useIsMobile } from '../../contexts/MobileContext';
import { formatConversationMarkdown } from '../lib/exportConversationAsMarkdown';
import FlyoutSubMenu from './FlyoutSubMenu';
export const MsM_commonButtonStyles = {
bg: "transparent",
color: "text.primary",
borderRadius: "full",
bg: 'transparent',
color: 'text.primary',
borderRadius: 'full',
padding: 2,
border: "none",
_hover: { bg: "rgba(255, 255, 255, 0.2)" },
_active: { bg: "rgba(255, 255, 255, 0.3)" },
_focus: { boxShadow: "none" },
border: 'none',
_hover: { bg: 'rgba(255, 255, 255, 0.2)' },
_active: { bg: 'rgba(255, 255, 255, 0.3)' },
_focus: { boxShadow: 'none' },
};
const InputMenu: React.FC<{ isDisabled?: boolean }> = observer(
({ isDisabled }) => {
const isMobile = useIsMobile();
const isMobileUserAgent = useIsMobileUserAgent();
const {
isOpen,
onOpen,
onClose,
onToggle,
getDisclosureProps,
getButtonProps,
} = useDisclosure();
const InputMenu: React.FC<{ isDisabled?: boolean }> = observer(({ isDisabled }) => {
const isMobile = useIsMobile();
const isMobileUserAgent = useIsMobileUserAgent();
const { isOpen, onOpen, onClose, onToggle, getDisclosureProps, getButtonProps } = useDisclosure();
const [controlledOpen, setControlledOpen] = useState<boolean>(false);
const [controlledOpen, setControlledOpen] = useState<boolean>(false);
const [supportedModels, setSupportedModels] = useState<any[]>([]);
useEffect(() => {
setControlledOpen(isOpen);
}, [isOpen]);
useEffect(() => {
setControlledOpen(isOpen);
}, [isOpen]);
useEffect(() => {
fetch('/api/models')
.then(response => response.json())
.then(models => {
setSupportedModels(models);
})
.catch(err => {
console.error('Could not fetch models: ', err);
});
}, []);
const getSupportedModels = async () => {
// Check if fetch is available (browser environment)
if (typeof fetch !== 'undefined') {
try {
return await (await fetch("/api/models")).json();
} catch (error) {
console.error("Error fetching models:", error);
return [];
}
} else {
// In test environment or where fetch is not available
console.log("Fetch not available, using default models");
return [];
}
}
const handleClose = useCallback(() => {
onClose();
}, [isOpen]);
useEffect(() => {
getSupportedModels().then((supportedModels) => {
// Check if setSupportedModels method exists before calling it
if (clientChatStore.setSupportedModels) {
clientChatStore.setSupportedModels(supportedModels);
} else {
console.log("setSupportedModels method not available in this environment");
}
});
}, []);
const handleCopyConversation = useCallback(() => {
navigator.clipboard
.writeText(formatConversationMarkdown(clientChatStore.items))
.then(() => {
window.alert('Conversation copied to clipboard. \n\nPaste it somewhere safe!');
onClose();
})
.catch(err => {
console.error('Could not copy text to clipboard: ', err);
window.alert('Failed to copy conversation. Please try again.');
});
}, [onClose]);
async function selectModelFn({ name, value }) {
clientChatStore.setModel(value);
}
const handleClose = useCallback(() => {
onClose();
}, [isOpen]);
function isSelectedModelFn({ name, value }) {
return clientChatStore.model === value;
}
const handleCopyConversation = useCallback(() => {
navigator.clipboard
.writeText(formatConversationMarkdown(clientChatStore.items))
.then(() => {
window.alert(
"Conversation copied to clipboard. \n\nPaste it somewhere safe!",
);
onClose();
})
.catch((err) => {
console.error("Could not copy text to clipboard: ", err);
window.alert("Failed to copy conversation. Please try again.");
});
}, [onClose]);
const menuRef = useRef();
const [menuState, setMenuState] = useState();
async function selectModelFn({ name, value }) {
clientChatStore.setModel(value);
}
useOutsideClick({
enabled: !isMobile && isOpen,
ref: menuRef,
handler: () => {
handleClose();
},
});
function isSelectedModelFn({ name, value }) {
return clientChatStore.model === value;
}
const menuRef = useRef();
const [menuState, setMenuState] = useState();
useOutsideClick({
enabled: !isMobile && isOpen,
ref: menuRef,
handler: () => {
handleClose();
},
});
return (
<Menu
isOpen={controlledOpen}
onClose={onClose}
onOpen={onOpen}
autoSelect={false}
closeOnSelect={false}
closeOnBlur={isOpen && !isMobileUserAgent}
isLazy={true}
lazyBehavior={"unmount"}
>
{isMobile ? (
<MenuButton
as={IconButton}
bg="text.accent"
icon={<Settings size={20} />}
isDisabled={isDisabled}
aria-label="Settings"
_hover={{ bg: "rgba(255, 255, 255, 0.2)" }}
_focus={{ boxShadow: "none" }}
{...MsM_commonButtonStyles}
/>
) : (
<MenuButton
as={Button}
rightIcon={<ChevronDown size={16} />}
isDisabled={isDisabled}
variant="ghost"
display="flex"
justifyContent="space-between"
alignItems="center"
minW="auto"
{...MsM_commonButtonStyles}
>
<Text noOfLines={1} maxW="100px" fontSize="sm">
{clientChatStore.model}
</Text>
</MenuButton>
)}
<MenuList
bg="background.tertiary"
border="none"
borderRadius="md"
boxShadow="lg"
minW={"10rem"}
ref={menuRef}
return (
<Menu
isOpen={controlledOpen}
onClose={onClose}
onOpen={onOpen}
autoSelect={false}
closeOnSelect={false}
closeOnBlur={isOpen && !isMobileUserAgent}
isLazy={true}
lazyBehavior={'unmount'}
>
{isMobile ? (
<MenuButton
as={IconButton}
bg="text.accent"
icon={<Settings size={20} />}
isDisabled={isDisabled}
aria-label="Settings"
_hover={{ bg: 'rgba(255, 255, 255, 0.2)' }}
_focus={{ boxShadow: 'none' }}
{...MsM_commonButtonStyles}
/>
) : (
<MenuButton
as={Button}
rightIcon={<ChevronDown size={16} />}
isDisabled={isDisabled}
variant="ghost"
display="flex"
justifyContent="space-between"
alignItems="center"
minW="auto"
{...MsM_commonButtonStyles}
>
<FlyoutSubMenu
title="Text Models"
flyoutMenuOptions={clientChatStore.supportedModels.map((m) => ({ name: m, value: m }))}
onClose={onClose}
parentIsOpen={isOpen}
setMenuState={setMenuState}
handleSelect={selectModelFn}
isSelected={isSelectedModelFn}
/>
<Divider color="text.tertiary" />
{/*Export conversation button*/}
<MenuItem
bg="background.tertiary"
color="text.primary"
onClick={handleCopyConversation}
_hover={{ bg: "rgba(0, 0, 0, 0.05)" }}
_focus={{ bg: "rgba(0, 0, 0, 0.1)" }}
>
<Flex align="center">
<Copy size="16px" style={{ marginRight: "8px" }} />
<Box>Export</Box>
</Flex>
</MenuItem>
{/*New conversation button*/}
<MenuItem
bg="background.tertiary"
color="text.primary"
onClick={() => {
clientChatStore.setActiveConversation("conversation:new");
onClose();
}}
_hover={{ bg: "rgba(0, 0, 0, 0.05)" }}
_focus={{ bg: "rgba(0, 0, 0, 0.1)" }}
>
<Flex align="center">
<RefreshCcw size="16px" style={{ marginRight: "8px" }} />
<Box>New</Box>
</Flex>
</MenuItem>
</MenuList>
</Menu>
);
},
);
<Text noOfLines={1} maxW="100px" fontSize="sm">
{clientChatStore.model}
</Text>
</MenuButton>
)}
<MenuList
bg="background.tertiary"
border="none"
borderRadius="md"
boxShadow="lg"
minW={'10rem'}
ref={menuRef}
>
<FlyoutSubMenu
title="Text Models"
flyoutMenuOptions={supportedModels.map(modelData => ({
name: modelData.id.split('/').pop() || modelData.id,
value: modelData.id,
}))}
onClose={onClose}
parentIsOpen={isOpen}
setMenuState={setMenuState}
handleSelect={selectModelFn}
isSelected={isSelectedModelFn}
/>
<Divider color="text.tertiary" />
{/*Export conversation button*/}
<MenuItem
bg="background.tertiary"
color="text.primary"
onClick={handleCopyConversation}
_hover={{ bg: 'rgba(0, 0, 0, 0.05)' }}
_focus={{ bg: 'rgba(0, 0, 0, 0.1)' }}
>
<Flex align="center">
<Copy size="16px" style={{ marginRight: '8px' }} />
<Box>Export</Box>
</Flex>
</MenuItem>
{/*New conversation button*/}
<MenuItem
bg="background.tertiary"
color="text.primary"
onClick={() => {
clientChatStore.setActiveConversation('conversation:new');
onClose();
}}
_hover={{ bg: 'rgba(0, 0, 0, 0.05)' }}
_focus={{ bg: 'rgba(0, 0, 0, 0.1)' }}
>
<Flex align="center">
<RefreshCcw size="16px" style={{ marginRight: '8px' }} />
<Box>New</Box>
</Flex>
</MenuItem>
</MenuList>
</Menu>
);
});
export default InputMenu;
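
The menu above now populates its model list from /api/models and derives a short display name from each id. A condensed sketch of that flow; the { id: string } response shape is inferred from the .id usage in the mapping and is an assumption rather than a documented contract:

interface ModelInfo {
  id: string; // assumed response shape, inferred from modelData.id above
}

async function loadModelOptions(): Promise<{ name: string; value: string }[]> {
  try {
    const response = await fetch('/api/models');
    const models: ModelInfo[] = await response.json();
    // 'meta-llama/llama-4-scout-17b-16e-instruct' -> 'llama-4-scout-17b-16e-instruct'
    return models.map(m => ({ name: m.id.split('/').pop() || m.id, value: m.id }));
  } catch (err) {
    console.error('Could not fetch models: ', err);
    return [];
  }
}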


@@ -1,34 +1,28 @@
import React, { useEffect, useRef, useState } from "react";
import {
Box,
Button,
Grid,
GridItem,
useBreakpointValue,
} from "@chakra-ui/react";
import { observer } from "mobx-react-lite";
import chatStore from "../../../stores/ClientChatStore";
import InputMenu from "../input-menu/InputMenu";
import InputTextarea from "./ChatInputTextArea";
import SendButton from "./ChatInputSendButton";
import { useMaxWidth } from "../../../hooks/useMaxWidth";
import userOptionsStore from "../../../stores/UserOptionsStore";
import { Box, Button, Grid, GridItem, useBreakpointValue } from '@chakra-ui/react';
import { observer } from 'mobx-react-lite';
import React, { useEffect, useRef, useState } from 'react';
import { useMaxWidth } from '../../../hooks/useMaxWidth';
import chatStore from '../../../stores/ClientChatStore';
import userOptionsStore from '../../../stores/UserOptionsStore';
import InputMenu from '../input-menu/InputMenu';
import SendButton from './ChatInputSendButton';
import InputTextarea from './ChatInputTextArea';
const ChatInput = observer(() => {
const inputRef = useRef<HTMLTextAreaElement>(null);
const containerRef = useRef<HTMLDivElement>(null);
const maxWidth = useMaxWidth();
const [inputValue, setInputValue] = useState<string>("");
const [inputValue, setInputValue] = useState<string>('');
const [containerHeight, setContainerHeight] = useState(56);
const [containerBorderRadius, setContainerBorderRadius] = useState(9999);
const [shouldFollow, setShouldFollow] = useState<boolean>(
userOptionsStore.followModeEnabled,
);
const [shouldFollow, setShouldFollow] = useState<boolean>(userOptionsStore.followModeEnabled);
const [couldFollow, setCouldFollow] = useState<boolean>(chatStore.isLoading);
const [inputWidth, setInputWidth] = useState<string>("50%");
const [inputWidth, setInputWidth] = useState<string>('50%');
useEffect(() => {
setShouldFollow(chatStore.isLoading && userOptionsStore.followModeEnabled);
@@ -42,8 +36,8 @@ const ChatInput = observer(() => {
useEffect(() => {
if (containerRef.current) {
const observer = new ResizeObserver((entries) => {
for (let entry of entries) {
const observer = new ResizeObserver(entries => {
for (const entry of entries) {
const newHeight = entry.target.clientHeight;
setContainerHeight(newHeight);
@@ -63,20 +57,20 @@ const ChatInput = observer(() => {
};
const handleKeyDown = (e: React.KeyboardEvent<HTMLTextAreaElement>) => {
if (e.key === "Enter" && !e.shiftKey) {
if (e.key === 'Enter' && !e.shiftKey) {
e.preventDefault();
chatStore.sendMessage();
}
};
const inputMaxWidth = useBreakpointValue(
{ base: "50rem", lg: "50rem", md: "80%", sm: "100vw" },
{ base: '50rem', lg: '50rem', md: '80%', sm: '100vw' },
{ ssr: true },
);
const inputMinWidth = useBreakpointValue({ lg: "40rem" }, { ssr: true });
const inputMinWidth = useBreakpointValue({ lg: '40rem' }, { ssr: true });
useEffect(() => {
setInputWidth("100%");
setInputWidth('100%');
}, [inputMaxWidth, inputMinWidth]);
return (
@@ -105,12 +99,12 @@ const ChatInput = observer(() => {
size="sm"
variant="ghost"
colorScheme="blue"
onClick={(_) => {
onClick={_ => {
userOptionsStore.toggleFollowMode();
}}
isDisabled={!chatStore.isLoading}
>
{shouldFollow ? "Disable Follow Mode" : "Enable Follow Mode"}
{shouldFollow ? 'Disable Follow Mode' : 'Enable Follow Mode'}
</Button>
</Box>
)}
@@ -123,7 +117,7 @@ const ChatInput = observer(() => {
gap={2}
alignItems="center"
style={{
transition: "border-radius 0.2s ease",
transition: 'border-radius 0.2s ease',
}}
>
<GridItem>

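The ResizeObserver hunk above is truncated by the diff, so the effect's teardown is not visible. A self-contained sketch of the same pattern as a hook, with the disconnect cleanup such an effect normally needs; useElementHeight is a hypothetical helper, not a component from the repo:

import { useEffect, useState, type RefObject } from 'react';

// Hypothetical hook: track an element's rendered height with ResizeObserver.
function useElementHeight(ref: RefObject<HTMLElement>, initial = 56): number {
  const [height, setHeight] = useState(initial);
  useEffect(() => {
    const el = ref.current;
    if (!el) return;
    const observer = new ResizeObserver(entries => {
      for (const entry of entries) {
        setHeight(entry.target.clientHeight);
      }
    });
    observer.observe(el);
    // Cleanup is assumed here; the repo's effect body is cut off by the hunk.
    return () => observer.disconnect();
  }, [ref]);
  return height;
}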

@@ -1,9 +1,9 @@
import React from "react";
import { Button } from "@chakra-ui/react";
import clientChatStore from "../../../stores/ClientChatStore";
import { CirclePause, Send } from "lucide-react";
import { Button } from '@chakra-ui/react';
import { motion } from 'framer-motion';
import { CirclePause, Send } from 'lucide-react';
import React from 'react';
import { motion } from "framer-motion";
import clientChatStore from '../../../stores/ClientChatStore';
interface SendButtonProps {
isLoading: boolean;
@@ -13,25 +13,20 @@ interface SendButtonProps {
}
const SendButton: React.FC<SendButtonProps> = ({ onClick }) => {
const isDisabled =
clientChatStore.input.trim().length === 0 && !clientChatStore.isLoading;
const isDisabled = clientChatStore.input.trim().length === 0 && !clientChatStore.isLoading;
return (
<Button
onClick={(e) =>
clientChatStore.isLoading
? clientChatStore.stopIncomingMessage()
: onClick(e)
onClick={e =>
clientChatStore.isLoading ? clientChatStore.stopIncomingMessage() : onClick(e)
}
bg="transparent"
color={
clientChatStore.input.trim().length <= 1 ? "brand.700" : "text.primary"
}
color={clientChatStore.input.trim().length <= 1 ? 'brand.700' : 'text.primary'}
borderRadius="full"
p={2}
isDisabled={isDisabled}
_hover={{ bg: !isDisabled ? "rgba(255, 255, 255, 0.2)" : "inherit" }}
_active={{ bg: !isDisabled ? "rgba(255, 255, 255, 0.3)" : "inherit" }}
_focus={{ boxShadow: "none" }}
_hover={{ bg: !isDisabled ? 'rgba(255, 255, 255, 0.2)' : 'inherit' }}
_active={{ bg: !isDisabled ? 'rgba(255, 255, 255, 0.3)' : 'inherit' }}
_focus={{ boxShadow: 'none' }}
>
{clientChatStore.isLoading ? <MySpinner /> : <Send size={20} />}
</Button>
@@ -45,10 +40,10 @@ const MySpinner = ({ onClick }) => (
exit={{ opacity: 0, scale: 0.9 }}
transition={{
duration: 0.4,
ease: "easeInOut",
ease: 'easeInOut',
}}
>
<CirclePause color={"#F0F0F0"} size={24} onClick={onClick} />
<CirclePause color={'#F0F0F0'} size={24} onClick={onClick} />
</motion.div>
);


@@ -1,7 +1,7 @@
import React, {useEffect, useRef, useState} from "react";
import {observer} from "mobx-react-lite";
import {Box, chakra, InputGroup,} from "@chakra-ui/react";
import AutoResize from "react-textarea-autosize";
import { Box, chakra, InputGroup } from '@chakra-ui/react';
import { observer } from 'mobx-react-lite';
import React, { useEffect, useRef, useState } from 'react';
import AutoResize from 'react-textarea-autosize';
const AutoResizeTextArea = chakra(AutoResize);
@@ -15,10 +15,7 @@ interface InputTextAreaProps {
const InputTextArea: React.FC<InputTextAreaProps> = observer(
({ inputRef, value, onChange, onKeyDown, isLoading }) => {
const [heightConstraint, setHeightConstraint] = useState<
number | undefined
>(10);
const [heightConstraint, setHeightConstraint] = useState<number | undefined>(10);
useEffect(() => {
if (value.length > 10) {
@@ -34,7 +31,6 @@ const InputTextArea: React.FC<InputTextAreaProps> = observer(
display="flex"
flexDirection="column"
>
{/* Input Area */}
<InputGroup position="relative">
<AutoResizeTextArea
@@ -43,7 +39,7 @@ const InputTextArea: React.FC<InputTextAreaProps> = observer(
value={value}
height={heightConstraint}
autoFocus
onChange={(e) => onChange(e.target.value)}
onChange={e => onChange(e.target.value)}
onKeyDown={onKeyDown}
p={2}
pr="8px"
@@ -53,19 +49,19 @@ const InputTextArea: React.FC<InputTextAreaProps> = observer(
borderRadius="20px"
border="none"
placeholder="Free my mind..."
_placeholder={{ color: "gray.400" }}
_placeholder={{ color: 'gray.400' }}
_focus={{
outline: "none",
outline: 'none',
}}
disabled={isLoading}
minRows={1}
maxRows={12}
style={{
touchAction: "none",
resize: "none",
overflowY: "auto",
width: "100%",
transition: "height 0.2s ease-in-out",
touchAction: 'none',
resize: 'none',
overflowY: 'auto',
width: '100%',
transition: 'height 0.2s ease-in-out',
}}
/>
</InputGroup>


@@ -1,9 +1,10 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { render, screen, fireEvent } from '@testing-library/react';
import React from 'react';
import ChatInput from '../ChatInput';
import userOptionsStore from '../../../../stores/UserOptionsStore';
import { describe, it, expect, vi, beforeEach } from 'vitest';
import chatStore from '../../../../stores/ClientChatStore';
import userOptionsStore from '../../../../stores/UserOptionsStore';
import ChatInput from '../ChatInput';
// Mock browser APIs
class MockResizeObserver {
@@ -85,7 +86,7 @@ vi.mock('./ChatInputTextArea', () => ({
aria-label="Chat input"
ref={inputRef}
value={value}
onChange={(e) => onChange(e.target.value)}
onChange={e => onChange(e.target.value)}
onKeyDown={onKeyDown}
disabled={isLoading}
/>


@@ -8,16 +8,16 @@ const SUPPORTED_MODELS_GROUPS = {
groq: [
// "mixtral-8x7b-32768",
// "deepseek-r1-distill-llama-70b",
"meta-llama/llama-4-scout-17b-16e-instruct",
"gemma2-9b-it",
"mistral-saba-24b",
'meta-llama/llama-4-scout-17b-16e-instruct',
'gemma2-9b-it',
'mistral-saba-24b',
// "qwen-2.5-32b",
"llama-3.3-70b-versatile",
'llama-3.3-70b-versatile',
// "llama-3.3-70b-versatile"
// "llama-3.1-70b-versatile",
// "llama-3.3-70b-versatile"
],
cerebras: ["llama-3.3-70b"],
cerebras: ['llama-3.3-70b'],
claude: [
// "claude-3-5-sonnet-20241022",
// "claude-3-opus-20240229"
@@ -44,34 +44,34 @@ const SUPPORTED_MODELS_GROUPS = {
// "grok-beta"
],
cloudflareAI: [
"llama-3.2-3b-instruct", // max_tokens
"llama-3-8b-instruct", // max_tokens
"llama-3.1-8b-instruct-fast", // max_tokens
"deepseek-math-7b-instruct",
"deepseek-coder-6.7b-instruct-awq",
"hermes-2-pro-mistral-7b",
"openhermes-2.5-mistral-7b-awq",
"mistral-7b-instruct-v0.2",
"neural-chat-7b-v3-1-awq",
"openchat-3.5-0106",
'llama-3.2-3b-instruct', // max_tokens
'llama-3-8b-instruct', // max_tokens
'llama-3.1-8b-instruct-fast', // max_tokens
'deepseek-math-7b-instruct',
'deepseek-coder-6.7b-instruct-awq',
'hermes-2-pro-mistral-7b',
'openhermes-2.5-mistral-7b-awq',
'mistral-7b-instruct-v0.2',
'neural-chat-7b-v3-1-awq',
'openchat-3.5-0106',
// "gemma-7b-it",
],
};
export type SupportedModel =
| keyof typeof SUPPORTED_MODELS_GROUPS
| (typeof SUPPORTED_MODELS_GROUPS)[keyof typeof SUPPORTED_MODELS_GROUPS][number];
| keyof typeof SUPPORTED_MODELS_GROUPS
| (typeof SUPPORTED_MODELS_GROUPS)[keyof typeof SUPPORTED_MODELS_GROUPS][number];
export type ModelFamily = keyof typeof SUPPORTED_MODELS_GROUPS;
function getModelFamily(model: string): ModelFamily | undefined {
return Object.keys(SUPPORTED_MODELS_GROUPS)
.filter((family) => {
return SUPPORTED_MODELS_GROUPS[
family as keyof typeof SUPPORTED_MODELS_GROUPS
].includes(model.trim());
})
.at(0) as ModelFamily | undefined;
.filter(family => {
return SUPPORTED_MODELS_GROUPS[family as keyof typeof SUPPORTED_MODELS_GROUPS].includes(
model.trim(),
);
})
.at(0) as ModelFamily | undefined;
}
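
getModelFamily returns the first group whose list contains the trimmed model id, or undefined when nothing matches. For example, given the groups visible above:

getModelFamily('llama-3.3-70b-versatile'); // 'groq'
getModelFamily(' llama-3.3-70b ');         // 'cerebras' (the id is trimmed before lookup)
getModelFamily('not-a-real-model');        // undefined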
const SUPPORTED_MODELS = [


@@ -1,30 +1,30 @@
import DOMPurify from "isomorphic-dompurify";
import DOMPurify from 'isomorphic-dompurify';
function domPurify(dirty: string) {
return DOMPurify.sanitize(dirty, {
USE_PROFILES: { html: true },
ALLOWED_TAGS: [
"b",
"i",
"u",
"a",
"p",
"span",
"div",
"table",
"thead",
"tbody",
"tr",
"td",
"th",
"ul",
"ol",
"li",
"code",
"pre",
'b',
'i',
'u',
'a',
'p',
'span',
'div',
'table',
'thead',
'tbody',
'tr',
'td',
'th',
'ul',
'ol',
'li',
'code',
'pre',
],
ALLOWED_ATTR: ["href", "src", "alt", "title", "class", "style"],
FORBID_TAGS: ["script", "iframe"],
ALLOWED_ATTR: ['href', 'src', 'alt', 'title', 'class', 'style'],
FORBID_TAGS: ['script', 'iframe'],
KEEP_CONTENT: true,
SAFE_FOR_TEMPLATES: true,
});
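
In rough terms, the profile above keeps ordinary formatting and table markup while dropping anything outside the allow-lists. Indicative behavior, assuming DOMPurify's default serialization (exact output can vary by version):

domPurify('<p>hello <b>world</b></p>');
// -> '<p>hello <b>world</b></p>'

domPurify('<p onclick="steal()">hi</p><script>alert(1)</script>');
// -> '<p>hi</p>' (onclick is not in ALLOWED_ATTR; script is in FORBID_TAGS)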


@@ -1,18 +1,17 @@
// Function to generate a Markdown representation of the current conversation
import { type IMessage } from "../../../stores/ClientChatStore";
import { Instance } from "mobx-state-tree";
import { type Instance } from 'mobx-state-tree';
export function formatConversationMarkdown(
messages: Instance<typeof IMessage>[],
): string {
import { type IMessage } from '../../../stores/ClientChatStore';
export function formatConversationMarkdown(messages: Instance<typeof IMessage>[]): string {
return messages
.map((message) => {
if (message.role === "user") {
.map(message => {
if (message.role === 'user') {
return `**You**: ${message.content}`;
} else if (message.role === "assistant") {
} else if (message.role === 'assistant') {
return `**Geoff's AI**: ${message.content}`;
}
return "";
return '';
})
.join("\n\n");
.join('\n\n');
}
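
For reference, the exported conversation alternates **You** and **Geoff's AI** labels joined by blank lines. The objects below are plain literals standing in for the mobx-state-tree message instances, which is an assumption made only for illustration:

const sample = [
  { role: 'user', content: 'What is MobX?' },
  { role: 'assistant', content: 'A state management library.' },
] as unknown as Instance<typeof IMessage>[];

formatConversationMarkdown(sample);
// "**You**: What is MobX?\n\n**Geoff's AI**: A state management library."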


@@ -1,6 +1,6 @@
import React from "react";
import React from 'react';
import MessageMarkdownRenderer from "./MessageMarkdownRenderer";
import MessageMarkdownRenderer from './MessageMarkdownRenderer';
const ChatMessageContent = ({ content }) => {
return <MessageMarkdownRenderer markdown={content} />;


@@ -1,9 +1,11 @@
import React from "react";
import {Box, Grid, GridItem} from "@chakra-ui/react";
import MessageBubble from "./MessageBubble";
import {observer} from "mobx-react-lite";
import chatStore from "../../../stores/ClientChatStore";
import {useIsMobile} from "../../contexts/MobileContext";
import { Box, Grid, GridItem } from '@chakra-ui/react';
import { observer } from 'mobx-react-lite';
import React from 'react';
import chatStore from '../../../stores/ClientChatStore';
import { useIsMobile } from '../../contexts/MobileContext';
import MessageBubble from './MessageBubble';
interface ChatMessagesProps {
scrollRef: React.RefObject<HTMLDivElement>;
@@ -13,11 +15,7 @@ const ChatMessages: React.FC<ChatMessagesProps> = observer(({ scrollRef }) => {
const isMobile = useIsMobile();
return (
<Box
pt={isMobile ? 24 : undefined}
overflowY={"scroll"}
overflowX={"hidden"}
>
<Box pt={isMobile ? 24 : undefined} overflowY={'scroll'} overflowX={'hidden'}>
<Grid
fontFamily="Arial, sans-serif"
templateColumns="1fr"


@@ -1,43 +1,43 @@
import React, { useEffect, useRef, useState } from "react";
import { Box, Flex, Text } from "@chakra-ui/react";
import MessageRenderer from "./ChatMessageContent";
import { observer } from "mobx-react-lite";
import MessageEditor from "./MessageEditorComponent";
import UserMessageTools from "./UserMessageTools";
import clientChatStore from "../../../stores/ClientChatStore";
import UserOptionsStore from "../../../stores/UserOptionsStore";
import MotionBox from "./MotionBox";
import { Box, Flex, Text } from '@chakra-ui/react';
import { observer } from 'mobx-react-lite';
import React, { useEffect, useRef, useState } from 'react';
import clientChatStore from '../../../stores/ClientChatStore';
import UserOptionsStore from '../../../stores/UserOptionsStore';
import MessageRenderer from './ChatMessageContent';
import MessageEditor from './MessageEditorComponent';
import MotionBox from './MotionBox';
import UserMessageTools from './UserMessageTools';
const LoadingDots = () => {
return (
<Flex>
{[0, 1, 2].map((i) => (
<MotionBox
key={i}
width="8px"
height="8px"
borderRadius="50%"
backgroundColor="text.primary"
margin="0 4px"
animate={{
scale: [1, 1.2, 1],
opacity: [0.5, 1, 0.5],
}}
transition={{
duration: 1,
repeat: Infinity,
delay: i * 0.2,
}}
/>
))}
</Flex>
<Flex>
{[0, 1, 2].map(i => (
<MotionBox
key={i}
width="8px"
height="8px"
borderRadius="50%"
backgroundColor="text.primary"
margin="0 4px"
animate={{
scale: [1, 1.2, 1],
opacity: [0.5, 1, 0.5],
}}
transition={{
duration: 1,
repeat: Infinity,
delay: i * 0.2,
}}
/>
))}
</Flex>
);
}
};
function renderMessage(msg: any) {
if (msg.role === "user") {
if (msg.role === 'user') {
return (
<Text as="p" fontSize="sm" lineHeight="short" color="text.primary">
{msg.content}
@@ -50,8 +50,8 @@ function renderMessage(msg: any) {
const MessageBubble = observer(({ msg, scrollRef }) => {
const [isEditing, setIsEditing] = useState(false);
const [isHovered, setIsHovered] = useState(false);
const isUser = msg.role === "user";
const senderName = isUser ? "You" : "Geoff's AI";
const isUser = msg.role === 'user';
const senderName = isUser ? 'You' : "Geoff's AI";
const isLoading = !msg.content || !(msg.content.trim().length > 0);
const messageRef = useRef();
@@ -64,10 +64,15 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
};
useEffect(() => {
if (clientChatStore.items.length > 0 && clientChatStore.isLoading && UserOptionsStore.followModeEnabled) { // Refine condition
if (
clientChatStore.items.length > 0 &&
clientChatStore.isLoading &&
UserOptionsStore.followModeEnabled
) {
// Refine condition
scrollRef.current?.scrollTo({
top: scrollRef.current.scrollHeight,
behavior: "auto",
behavior: 'auto',
});
}
});
@@ -75,7 +80,7 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
return (
<Flex
flexDirection="column"
alignItems={isUser ? "flex-end" : "flex-start"}
alignItems={isUser ? 'flex-end' : 'flex-start'}
role="listitem"
flex={0}
aria-label={`Message from ${senderName}`}
@@ -85,19 +90,19 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
<Text
fontSize="xs"
color="text.tertiary"
textAlign={isUser ? "right" : "left"}
alignSelf={isUser ? "flex-end" : "flex-start"}
textAlign={isUser ? 'right' : 'left'}
alignSelf={isUser ? 'flex-end' : 'flex-start'}
mb={1}
>
{senderName}
</Text>
<MotionBox
minW={{ base: "99%", sm: "99%", lg: isUser ? "55%" : "60%" }}
maxW={{ base: "99%", sm: "99%", lg: isUser ? "65%" : "65%" }}
minW={{ base: '99%', sm: '99%', lg: isUser ? '55%' : '60%' }}
maxW={{ base: '99%', sm: '99%', lg: isUser ? '65%' : '65%' }}
p={3}
borderRadius="1.5em"
bg={isUser ? "#0A84FF" : "#3A3A3C"}
bg={isUser ? '#0A84FF' : '#3A3A3C'}
color="text.primary"
textAlign="left"
boxShadow="0 2px 4px rgba(0, 0, 0, 0.1)"
@@ -115,10 +120,10 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
whiteSpace="pre-wrap"
ref={messageRef}
sx={{
"pre, code": {
maxWidth: "100%",
whiteSpace: "pre-wrap",
overflowX: "auto",
'pre, code': {
maxWidth: '100%',
whiteSpace: 'pre-wrap',
overflowX: 'auto',
},
}}
>
@@ -139,9 +144,7 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
justifyContent="center"
alignItems="center"
>
{isHovered && !isEditing && (
<UserMessageTools message={msg} onEdit={handleEdit} />
)}
{isHovered && !isEditing && <UserMessageTools message={msg} onEdit={handleEdit} />}
</Box>
)}
</Flex>


@@ -1,10 +1,11 @@
import React, { KeyboardEvent, useEffect } from "react";
import { Box, Flex, IconButton, Textarea } from "@chakra-ui/react";
import { Check, X } from "lucide-react";
import { observer } from "mobx-react-lite";
import { Instance } from "mobx-state-tree";
import Message from "../../../models/Message";
import messageEditorStore from "../../../stores/MessageEditorStore";
import { Box, Flex, IconButton, Textarea } from '@chakra-ui/react';
import { Check, X } from 'lucide-react';
import { observer } from 'mobx-react-lite';
import { type Instance } from 'mobx-state-tree';
import React, { type KeyboardEvent, useEffect } from 'react';
import Message from '../../../models/Message';
import messageEditorStore from '../../../stores/MessageEditorStore';
interface MessageEditorProps {
message: Instance<typeof Message>;
@@ -30,15 +31,13 @@ const MessageEditor = observer(({ message, onCancel }: MessageEditorProps) => {
onCancel();
};
const handleKeyDown = (e: KeyboardEvent<HTMLTextAreaElement>) => {
if (e.key === "Enter" && (e.metaKey || e.ctrlKey)) {
if (e.key === 'Enter' && (e.metaKey || e.ctrlKey)) {
e.preventDefault();
handleSave();
}
if (e.key === "Escape") {
if (e.key === 'Escape') {
e.preventDefault();
handleCancel();
}
@@ -48,14 +47,14 @@ const MessageEditor = observer(({ message, onCancel }: MessageEditorProps) => {
<Box width="100%">
<Textarea
value={messageEditorStore.editedContent}
onChange={(e) => messageEditorStore.setEditedContent(e.target.value)}
onChange={e => messageEditorStore.setEditedContent(e.target.value)}
onKeyDown={handleKeyDown}
minHeight="100px"
bg="transparent"
border="1px solid"
borderColor="whiteAlpha.300"
_hover={{ borderColor: "whiteAlpha.400" }}
_focus={{ borderColor: "brand.100", boxShadow: "none" }}
_hover={{ borderColor: 'whiteAlpha.400' }}
_focus={{ borderColor: 'brand.100', boxShadow: 'none' }}
resize="vertical"
color="text.primary"
/>
@@ -66,7 +65,7 @@ const MessageEditor = observer(({ message, onCancel }: MessageEditorProps) => {
onClick={handleCancel}
size="sm"
variant="ghost"
color={"accent.danger"}
color={'accent.danger'}
/>
<IconButton
aria-label="Save edit"
@@ -74,7 +73,7 @@ const MessageEditor = observer(({ message, onCancel }: MessageEditorProps) => {
onClick={handleSave}
size="sm"
variant="ghost"
color={"accent.confirm"}
color={'accent.confirm'}
/>
</Flex>
</Box>


@@ -1,5 +1,3 @@
import React from "react";
import {
Box,
Code,
@@ -17,13 +15,15 @@ import {
Thead,
Tr,
useColorModeValue,
} from "@chakra-ui/react";
import { marked } from "marked";
import CodeBlock from "../../code/CodeBlock";
import ImageWithFallback from "../../markdown/ImageWithFallback";
import markedKatex from "marked-katex-extension";
import katex from "katex";
import domPurify from "../lib/domPurify";
} from '@chakra-ui/react';
import katex from 'katex';
import { marked } from 'marked';
import markedKatex from 'marked-katex-extension';
import React from 'react';
import CodeBlock from '../../code/CodeBlock';
import ImageWithFallback from '../../markdown/ImageWithFallback';
import domPurify from '../lib/domPurify';
try {
if (localStorage) {
@@ -34,11 +34,13 @@ try {
throwOnError: false,
strict: true,
colorIsTextColor: true,
errorColor: "red",
errorColor: 'red',
}),
);
}
} catch (_) {}
} catch (_) {
// Silently ignore errors in marked setup - fallback to default behavior
}
const MemoizedCodeBlock = React.memo(CodeBlock);
@@ -49,32 +51,29 @@ const MemoizedCodeBlock = React.memo(CodeBlock);
const getHeadingProps = (depth: number) => {
switch (depth) {
case 1:
return { as: "h1", size: "xl", mt: 4, mb: 2 };
return { as: 'h1', size: 'xl', mt: 4, mb: 2 };
case 2:
return { as: "h2", size: "lg", mt: 3, mb: 2 };
return { as: 'h2', size: 'lg', mt: 3, mb: 2 };
case 3:
return { as: "h3", size: "md", mt: 2, mb: 1 };
return { as: 'h3', size: 'md', mt: 2, mb: 1 };
case 4:
return { as: "h4", size: "sm", mt: 2, mb: 1 };
return { as: 'h4', size: 'sm', mt: 2, mb: 1 };
case 5:
return { as: "h5", size: "sm", mt: 2, mb: 1 };
return { as: 'h5', size: 'sm', mt: 2, mb: 1 };
case 6:
return { as: "h6", size: "xs", mt: 2, mb: 1 };
return { as: 'h6', size: 'xs', mt: 2, mb: 1 };
default:
return { as: `h${depth}`, size: "md", mt: 2, mb: 1 };
return { as: `h${depth}`, size: 'md', mt: 2, mb: 1 };
}
};
interface TableToken extends marked.Tokens.Table {
align: Array<"center" | "left" | "right" | null>;
align: Array<'center' | 'left' | 'right' | null>;
header: (string | marked.Tokens.TableCell)[];
rows: (string | marked.Tokens.TableCell)[][];
}
const CustomHeading: React.FC<{ text: string; depth: number }> = ({
text,
depth,
}) => {
const CustomHeading: React.FC<{ text: string; depth: number }> = ({ text, depth }) => {
const headingProps = getHeadingProps(depth);
return (
<Heading {...headingProps} wordBreak="break-word" maxWidth="100%">
@@ -83,9 +82,7 @@ const CustomHeading: React.FC<{ text: string; depth: number }> = ({
);
};
const CustomParagraph: React.FC<{ children: React.ReactNode }> = ({
children,
}) => {
const CustomParagraph: React.FC<{ children: React.ReactNode }> = ({ children }) => {
return (
<Text
as="p"
@@ -100,9 +97,7 @@ const CustomParagraph: React.FC<{ children: React.ReactNode }> = ({
);
};
const CustomBlockquote: React.FC<{ children: React.ReactNode }> = ({
children,
}) => {
const CustomBlockquote: React.FC<{ children: React.ReactNode }> = ({ children }) => {
return (
<Box
as="blockquote"
@@ -120,16 +115,9 @@ const CustomBlockquote: React.FC<{ children: React.ReactNode }> = ({
);
};
const CustomCodeBlock: React.FC<{ code: string; language?: string }> = ({
code,
language,
}) => {
const CustomCodeBlock: React.FC<{ code: string; language?: string }> = ({ code, language }) => {
return (
<MemoizedCodeBlock
language={language}
code={code}
onRenderComplete={() => Promise.resolve()}
/>
<MemoizedCodeBlock language={language} code={code} onRenderComplete={() => Promise.resolve()} />
);
};
@@ -141,10 +129,10 @@ const CustomList: React.FC<{
children: React.ReactNode;
}> = ({ ordered, start, children }) => {
const commonStyles = {
fontSize: "sm",
wordBreak: "break-word" as const,
maxWidth: "100%" as const,
stylePosition: "outside" as const,
fontSize: 'sm',
wordBreak: 'break-word' as const,
maxWidth: '100%' as const,
stylePosition: 'outside' as const,
mb: 2,
pl: 4,
};
@@ -166,16 +154,13 @@ const CustomListItem: React.FC<{
return <ListItem mb={1}>{children}</ListItem>;
};
const CustomKatex: React.FC<{ math: string; displayMode: boolean }> = ({
math,
displayMode,
}) => {
const CustomKatex: React.FC<{ math: string; displayMode: boolean }> = ({ math, displayMode }) => {
const renderedMath = katex.renderToString(math, { displayMode });
return (
<Box
as="span"
display={displayMode ? "block" : "inline"}
display={displayMode ? 'block' : 'inline'}
p={displayMode ? 4 : 1}
my={displayMode ? 4 : 0}
borderRadius="md"
@@ -188,23 +173,17 @@ const CustomKatex: React.FC<{ math: string; displayMode: boolean }> = ({
const CustomTable: React.FC<{
header: React.ReactNode[];
align: Array<"center" | "left" | "right" | null>;
align: Array<'center' | 'left' | 'right' | null>;
rows: React.ReactNode[][];
}> = ({ header, align, rows }) => {
return (
<Table
variant="simple"
size="sm"
my={4}
borderRadius="md"
overflow="hidden"
>
<Table variant="simple" size="sm" my={4} borderRadius="md" overflow="hidden">
<Thead bg="background.secondary">
<Tr>
{header.map((cell, i) => (
<Th
key={i}
textAlign={align[i] || "left"}
textAlign={align[i] || 'left'}
fontWeight="bold"
p={2}
minW={16}
@@ -219,12 +198,7 @@ const CustomTable: React.FC<{
{rows.map((row, rIndex) => (
<Tr key={rIndex}>
{row.map((cell, cIndex) => (
<Td
key={cIndex}
textAlign={align[cIndex] || "left"}
p={2}
wordBreak="break-word"
>
<Td key={cIndex} textAlign={align[cIndex] || 'left'} p={2} wordBreak="break-word">
{cell}
</Td>
))}
@@ -241,13 +215,7 @@ const CustomHtmlBlock: React.FC<{ content: string }> = ({ content }) => {
const CustomText: React.FC<{ text: React.ReactNode }> = ({ text }) => {
return (
<Text
fontSize="sm"
lineHeight="short"
wordBreak="break-word"
maxWidth="100%"
as="span"
>
<Text fontSize="sm" lineHeight="short" wordBreak="break-word" maxWidth="100%" as="span">
{text}
</Text>
);
@@ -262,13 +230,7 @@ const CustomStrong: React.FC<CustomStrongProps> = ({ children }) => {
const CustomEm: React.FC<{ children: React.ReactNode }> = ({ children }) => {
return (
<Text
as="em"
fontStyle="italic"
lineHeight="short"
wordBreak="break-word"
display="inline"
>
<Text as="em" fontStyle="italic" lineHeight="short" wordBreak="break-word" display="inline">
{children}
</Text>
);
@@ -289,7 +251,7 @@ const CustomDel: React.FC<{ text: string }> = ({ text }) => {
};
const CustomCodeSpan: React.FC<{ code: string }> = ({ code }) => {
const bg = useColorModeValue("gray.100", "gray.800");
const bg = useColorModeValue('gray.100', 'gray.800');
return (
<Code
fontSize="sm"
@@ -312,13 +274,13 @@ const CustomMath: React.FC<{ math: string; displayMode?: boolean }> = ({
return (
<Box
as="span"
display={displayMode ? "block" : "inline"}
display={displayMode ? 'block' : 'inline'}
p={displayMode ? 4 : 1}
my={displayMode ? 4 : 0}
borderRadius="md"
overflow="auto"
maxWidth="100%"
className={`math ${displayMode ? "math-display" : "math-inline"}`}
className={`math ${displayMode ? 'math-display' : 'math-inline'}`}
>
{math}
</Box>
@@ -336,8 +298,8 @@ const CustomLink: React.FC<{
title={title}
isExternal
sx={{
"& span": {
color: "text.link",
'& span': {
color: 'text.link',
},
}}
maxWidth="100%"
@@ -379,46 +341,34 @@ function parseTokens(tokens: marked.Token[]): JSX.Element[] {
tokens.forEach((token, i) => {
switch (token.type) {
case "heading":
output.push(
<CustomHeading key={i} text={token.text} depth={token.depth} />,
);
case 'heading':
output.push(<CustomHeading key={i} text={token.text} depth={token.depth} />);
break;
case "paragraph": {
const parsedContent = token.tokens
? parseTokens(token.tokens)
: token.text;
case 'paragraph': {
const parsedContent = token.tokens ? parseTokens(token.tokens) : token.text;
if (blockquoteContent.length > 0) {
blockquoteContent.push(
<CustomParagraph key={i}>{parsedContent}</CustomParagraph>,
);
blockquoteContent.push(<CustomParagraph key={i}>{parsedContent}</CustomParagraph>);
} else {
output.push(
<CustomParagraph key={i}>{parsedContent}</CustomParagraph>,
);
output.push(<CustomParagraph key={i}>{parsedContent}</CustomParagraph>);
}
break;
}
case "br":
case 'br':
output.push(<br key={i} />);
break;
case "escape": {
case 'escape': {
break;
}
case "blockquote_start":
case 'blockquote_start':
blockquoteContent = [];
break;
case "blockquote_end":
output.push(
<CustomBlockquote key={i}>
{parseTokens(blockquoteContent)}
</CustomBlockquote>,
);
case 'blockquote_end':
output.push(<CustomBlockquote key={i}>{parseTokens(blockquoteContent)}</CustomBlockquote>);
blockquoteContent = [];
break;
case "blockquote": {
case 'blockquote': {
output.push(
<CustomBlockquote key={i}>
{token.tokens ? parseTokens(token.tokens) : null}
@@ -426,44 +376,30 @@ function parseTokens(tokens: marked.Token[]): JSX.Element[] {
);
break;
}
case "math":
output.push(
<CustomMath key={i} math={(token as any).value} displayMode={true} />,
);
case 'math':
output.push(<CustomMath key={i} math={(token as any).value} displayMode={true} />);
break;
case "inlineMath":
output.push(
<CustomMath
key={i}
math={(token as any).value}
displayMode={false}
/>,
);
case 'inlineMath':
output.push(<CustomMath key={i} math={(token as any).value} displayMode={false} />);
break;
case "inlineKatex":
case "blockKatex": {
case 'inlineKatex':
case 'blockKatex': {
const katexToken = token as any;
output.push(
<CustomKatex
key={i}
math={katexToken.text}
displayMode={katexToken.displayMode}
/>,
<CustomKatex key={i} math={katexToken.text} displayMode={katexToken.displayMode} />,
);
break;
}
case "code":
output.push(
<CustomCodeBlock key={i} code={token.text} language={token.lang} />,
);
case 'code':
output.push(<CustomCodeBlock key={i} code={token.text} language={token.lang} />);
break;
case "hr":
case 'hr':
output.push(<CustomHr key={i} />);
break;
case "list": {
case 'list': {
const { ordered, start, items } = token;
const listItems = items.map((listItem, idx) => {
const nestedContent = parseTokens(listItem.tokens);
@@ -477,53 +413,43 @@ function parseTokens(tokens: marked.Token[]): JSX.Element[] {
);
break;
}
case "table": {
case 'table': {
const tableToken = token as TableToken;
output.push(
<CustomTable
key={i}
header={tableToken.header.map((cell) =>
typeof cell === "string" ? cell : parseTokens(cell.tokens || []),
header={tableToken.header.map(cell =>
typeof cell === 'string' ? cell : parseTokens(cell.tokens || []),
)}
align={tableToken.align}
rows={tableToken.rows.map((row) =>
row.map((cell) =>
typeof cell === "string"
? cell
: parseTokens(cell.tokens || []),
),
rows={tableToken.rows.map(row =>
row.map(cell => (typeof cell === 'string' ? cell : parseTokens(cell.tokens || []))),
)}
/>,
);
break;
}
case "html":
case 'html':
output.push(<CustomHtmlBlock key={i} content={token.text} />);
break;
case "def":
case "space":
case 'def':
case 'space':
break;
case "strong":
output.push(
<CustomStrong key={i}>
{parseTokens(token.tokens || [])}
</CustomStrong>,
);
case 'strong':
output.push(<CustomStrong key={i}>{parseTokens(token.tokens || [])}</CustomStrong>);
break;
case "em":
case 'em':
output.push(
<CustomEm key={i}>
{token.tokens ? parseTokens(token.tokens) : token.text}
</CustomEm>,
<CustomEm key={i}>{token.tokens ? parseTokens(token.tokens) : token.text}</CustomEm>,
);
break;
case "codespan":
case 'codespan':
output.push(<CustomCodeSpan key={i} code={token.text} />);
break;
case "link":
case 'link':
output.push(
<CustomLink key={i} href={token.href} title={token.title}>
{token.tokens ? parseTokens(token.tokens) : token.text}
@@ -531,33 +457,24 @@ function parseTokens(tokens: marked.Token[]): JSX.Element[] {
);
break;
case "image":
case 'image':
output.push(
<CustomImage
key={i}
href={token.href}
title={token.title}
text={token.text}
/>,
<CustomImage key={i} href={token.href} title={token.title} text={token.text} />,
);
break;
case "text": {
const parsedContent = token.tokens
? parseTokens(token.tokens)
: token.text;
case 'text': {
const parsedContent = token.tokens ? parseTokens(token.tokens) : token.text;
if (blockquoteContent.length > 0) {
blockquoteContent.push(
<React.Fragment key={i}>{parsedContent}</React.Fragment>,
);
blockquoteContent.push(<React.Fragment key={i}>{parsedContent}</React.Fragment>);
} else {
output.push(<CustomText key={i} text={parsedContent} />);
}
break;
}
default:
console.warn("Unhandled token type:", token.type, token);
console.warn('Unhandled token type:', token.type, token);
}
});
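
parseTokens consumes a marked token stream; the body of renderMessageMarkdown itself falls outside these hunks, so the wiring below is a sketch of how it plausibly looks rather than the repo's exact code:

import { marked } from 'marked';

// Sketch: lex the raw markdown into tokens and hand them to parseTokens.
export function renderMessageMarkdownSketch(markdown: string): JSX.Element[] {
  const tokens = marked.lexer(markdown); // heading, paragraph, code, table, html, ...
  return parseTokens(tokens);
}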


@@ -1,13 +1,12 @@
import React from "react";
import {renderMessageMarkdown} from "./MessageMarkdown";
import React from 'react';
import { renderMessageMarkdown } from './MessageMarkdown';
interface CustomMarkdownRendererProps {
markdown: string;
}
const MessageMarkdownRenderer: React.FC<CustomMarkdownRendererProps> = ({
markdown,
}) => {
const MessageMarkdownRenderer: React.FC<CustomMarkdownRendererProps> = ({ markdown }) => {
return <div>{renderMessageMarkdown(markdown)}</div>;
};


@@ -1,4 +1,4 @@
import {motion} from "framer-motion";
import {Box} from "@chakra-ui/react";
import { Box } from '@chakra-ui/react';
import { motion } from 'framer-motion';
export default motion(Box);
export default motion(Box);


@@ -1,6 +1,6 @@
import { observer } from "mobx-react-lite";
import { IconButton } from "@chakra-ui/react";
import { Edit2Icon } from "lucide-react";
import { IconButton } from '@chakra-ui/react';
import { Edit2Icon } from 'lucide-react';
import { observer } from 'mobx-react-lite';
const UserMessageTools = observer(({ disabled = false, message, onEdit }) => (
<IconButton
@@ -8,26 +8,26 @@ const UserMessageTools = observer(({ disabled = false, message, onEdit }) => (
color="text.primary"
aria-label="Edit message"
title="Edit message"
icon={<Edit2Icon size={"1em"} />}
icon={<Edit2Icon size={'1em'} />}
onClick={() => onEdit(message)}
_active={{
bg: "transparent",
bg: 'transparent',
svg: {
stroke: "brand.100",
transition: "stroke 0.3s ease-in-out",
stroke: 'brand.100',
transition: 'stroke 0.3s ease-in-out',
},
}}
_hover={{
bg: "transparent",
bg: 'transparent',
svg: {
stroke: "accent.secondary",
transition: "stroke 0.3s ease-in-out",
stroke: 'accent.secondary',
transition: 'stroke 0.3s ease-in-out',
},
}}
variant="ghost"
size="sm"
isDisabled={disabled}
_focus={{ boxShadow: "none" }}
_focus={{ boxShadow: 'none' }}
/>
));


@@ -1,14 +1,15 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { render, screen, fireEvent, waitFor } from '@testing-library/react';
import React from 'react';
import { describe, it, expect, vi, beforeEach } from 'vitest';
import messageEditorStore from '../../../../stores/MessageEditorStore';
import MessageBubble from '../MessageBubble';
import messageEditorStore from "../../../../stores/MessageEditorStore";
// Mock browser APIs
class MockResizeObserver {
observe() {}
unobserve() {}
disconnect() {}
observe() {}
unobserve() {}
disconnect() {}
}
// Add ResizeObserver to the global object
@@ -16,140 +17,140 @@ global.ResizeObserver = MockResizeObserver;
// Mock the Message model
vi.mock('../../../../models/Message', () => ({
default: {
// This is needed for the Instance<typeof Message> type
}
default: {
// This is needed for the Instance<typeof Message> type
},
}));
// Mock the stores
vi.mock('../../../../stores/ClientChatStore', () => ({
default: {
items: [],
isLoading: false,
editMessage: vi.fn().mockReturnValue(true)
}
default: {
items: [],
isLoading: false,
editMessage: vi.fn().mockReturnValue(true),
},
}));
vi.mock('../../../../stores/UserOptionsStore', () => ({
default: {
followModeEnabled: false,
setFollowModeEnabled: vi.fn()
}
default: {
followModeEnabled: false,
setFollowModeEnabled: vi.fn(),
},
}));
// Mock the MessageEditorStore
vi.mock('../../../../stores/MessageEditorStore', () => ({
default: {
editedContent: 'Test message',
setEditedContent: vi.fn(),
setMessage: vi.fn(),
onCancel: vi.fn(),
handleSave: vi.fn().mockImplementation(function() {
// Use the mocked messageEditorStore from the import
messageEditorStore.onCancel();
return Promise.resolve();
})
}
default: {
editedContent: 'Test message',
setEditedContent: vi.fn(),
setMessage: vi.fn(),
onCancel: vi.fn(),
handleSave: vi.fn().mockImplementation(function () {
// Use the mocked messageEditorStore from the import
messageEditorStore.onCancel();
return Promise.resolve();
}),
},
}));
// Mock the MessageRenderer component
vi.mock('../ChatMessageContent', () => ({
default: ({ content }) => <div data-testid="message-content">{content}</div>
default: ({ content }) => <div data-testid="message-content">{content}</div>,
}));
// Mock the UserMessageTools component
vi.mock('../UserMessageTools', () => ({
default: ({ message, onEdit }) => (
<button data-testid="edit-button" onClick={() => onEdit(message)}>
Edit
</button>
)
default: ({ message, onEdit }) => (
<button data-testid="edit-button" onClick={() => onEdit(message)}>
Edit
</button>
),
}));
vi.mock("../MotionBox", async (importOriginal) => {
const actual = await importOriginal()
vi.mock('../MotionBox', async importOriginal => {
const actual = await importOriginal();
return { default: {
...actual.default,
div: (props: any) => React.createElement('div', props, props.children),
motion: (props: any) => React.createElement('div', props, props.children),
}
}
return {
default: {
...actual.default,
div: (props: any) => React.createElement('div', props, props.children),
motion: (props: any) => React.createElement('div', props, props.children),
},
};
});
describe('MessageBubble', () => {
const mockScrollRef = { current: { scrollTo: vi.fn() } };
const mockUserMessage = {
role: 'user',
content: 'Test message'
};
const mockAssistantMessage = {
role: 'assistant',
content: 'Assistant response'
};
const mockScrollRef = { current: { scrollTo: vi.fn() } };
const mockUserMessage = {
role: 'user',
content: 'Test message',
};
const mockAssistantMessage = {
role: 'assistant',
content: 'Assistant response',
};
beforeEach(() => {
vi.clearAllMocks();
beforeEach(() => {
vi.clearAllMocks();
});
it('should render user message correctly', () => {
render(<MessageBubble msg={mockUserMessage} scrollRef={mockScrollRef} />);
expect(screen.getByText('You')).toBeInTheDocument();
expect(screen.getByText('Test message')).toBeInTheDocument();
});
it('should render assistant message correctly', () => {
render(<MessageBubble msg={mockAssistantMessage} scrollRef={mockScrollRef} />);
expect(screen.getByText("Geoff's AI")).toBeInTheDocument();
expect(screen.getByTestId('message-content')).toHaveTextContent('Assistant response');
});
it('should show edit button on hover for user messages', async () => {
render(<MessageBubble msg={mockUserMessage} scrollRef={mockScrollRef} />);
// Simulate hover
fireEvent.mouseEnter(screen.getByRole('listitem'));
expect(screen.getByTestId('edit-button')).toBeInTheDocument();
});
it('should show editor when edit button is clicked', () => {
render(<MessageBubble msg={mockUserMessage} scrollRef={mockScrollRef} />);
// Simulate hover and click edit
fireEvent.mouseEnter(screen.getByRole('listitem'));
fireEvent.click(screen.getByTestId('edit-button'));
// Check if the textarea is rendered (part of MessageEditor)
expect(screen.getByRole('textbox')).toBeInTheDocument();
});
it('should hide editor after message is edited and saved', async () => {
render(<MessageBubble msg={mockUserMessage} scrollRef={mockScrollRef} />);
// Show the editor
fireEvent.mouseEnter(screen.getByRole('listitem'));
fireEvent.click(screen.getByTestId('edit-button'));
// Verify editor is shown
expect(screen.getByRole('textbox')).toBeInTheDocument();
// Find and click the save button
const saveButton = screen.getByLabelText('Save edit');
fireEvent.click(saveButton);
// Wait for the editor to disappear
await waitFor(() => {
// Check that the editor is no longer visible
expect(screen.queryByRole('textbox')).not.toBeInTheDocument();
// And the message content is visible again
expect(screen.getByText('Test message')).toBeInTheDocument();
});
it('should render user message correctly', () => {
render(<MessageBubble msg={mockUserMessage} scrollRef={mockScrollRef} />);
expect(screen.getByText('You')).toBeInTheDocument();
expect(screen.getByText('Test message')).toBeInTheDocument();
});
it('should render assistant message correctly', () => {
render(<MessageBubble msg={mockAssistantMessage} scrollRef={mockScrollRef} />);
expect(screen.getByText("Geoff's AI")).toBeInTheDocument();
expect(screen.getByTestId('message-content')).toHaveTextContent('Assistant response');
});
it('should show edit button on hover for user messages', async () => {
render(<MessageBubble msg={mockUserMessage} scrollRef={mockScrollRef} />);
// Simulate hover
fireEvent.mouseEnter(screen.getByRole('listitem'));
expect(screen.getByTestId('edit-button')).toBeInTheDocument();
});
it('should show editor when edit button is clicked', () => {
render(<MessageBubble msg={mockUserMessage} scrollRef={mockScrollRef} />);
// Simulate hover and click edit
fireEvent.mouseEnter(screen.getByRole('listitem'));
fireEvent.click(screen.getByTestId('edit-button'));
// Check if the textarea is rendered (part of MessageEditor)
expect(screen.getByRole('textbox')).toBeInTheDocument();
});
it('should hide editor after message is edited and saved', async () => {
render(<MessageBubble msg={mockUserMessage} scrollRef={mockScrollRef} />);
// Show the editor
fireEvent.mouseEnter(screen.getByRole('listitem'));
fireEvent.click(screen.getByTestId('edit-button'));
// Verify editor is shown
expect(screen.getByRole('textbox')).toBeInTheDocument();
// Find and click the save button
const saveButton = screen.getByLabelText('Save edit');
fireEvent.click(saveButton);
// Wait for the editor to disappear
await waitFor(() => {
// Check that the editor is no longer visible
expect(screen.queryByRole('textbox')).not.toBeInTheDocument();
// And the message content is visible again
expect(screen.getByText('Test message')).toBeInTheDocument();
});
// Verify that handleSave was called
expect(messageEditorStore.handleSave).toHaveBeenCalled();
});
// Verify that handleSave was called
expect(messageEditorStore.handleSave).toHaveBeenCalled();
});
});


@@ -1,27 +1,27 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { render, screen, fireEvent } from '@testing-library/react';
import React from 'react';
import MessageEditor from '../MessageEditorComponent';
import { describe, it, expect, vi, beforeEach } from 'vitest';
// Import the mocked stores
import clientChatStore from '../../../../stores/ClientChatStore';
import messageEditorStore from '../../../../stores/MessageEditorStore';
import MessageEditor from '../MessageEditorComponent';
// Mock the Message model
vi.mock('../../../../models/Message', () => {
return {
default: {
// This is needed for the Instance<typeof Message> type
}
},
};
});
// Mock fetch globally
globalThis.fetch = vi.fn(() =>
Promise.resolve({
ok: true,
json: () => Promise.resolve({})
})
Promise.resolve({
ok: true,
json: () => Promise.resolve({}),
}),
);
// Mock the ClientChatStore
@@ -31,14 +31,14 @@ vi.mock('../../../../stores/ClientChatStore', () => {
removeAfter: vi.fn(),
sendMessage: vi.fn(),
setIsLoading: vi.fn(),
editMessage: vi.fn().mockReturnValue(true)
editMessage: vi.fn().mockReturnValue(true),
};
// Add the mockUserMessage to the items array
mockStore.items.indexOf = vi.fn().mockReturnValue(0);
return {
default: mockStore
default: mockStore,
};
});
@@ -48,25 +48,25 @@ vi.mock('../../../../stores/MessageEditorStore', () => {
editedContent: 'Test message', // Set initial value to match the test expectation
message: null,
setEditedContent: vi.fn(),
setMessage: vi.fn((message) => {
setMessage: vi.fn(message => {
mockStore.message = message;
mockStore.editedContent = message.content;
}),
onCancel: vi.fn(),
handleSave: vi.fn()
handleSave: vi.fn(),
};
return {
default: mockStore
default: mockStore,
};
});
describe('MessageEditor', () => {
// Create a message object with a setContent method
const mockUserMessage = {
content: 'Test message',
const mockUserMessage = {
content: 'Test message',
role: 'user',
setContent: vi.fn()
setContent: vi.fn(),
};
const mockOnCancel = vi.fn();
@@ -93,7 +93,7 @@ describe('MessageEditor', () => {
});
it('should call handleSave when save button is clicked', () => {
render(<MessageEditor message={mockUserMessage} onCancel={mockOnCancel}/>);
render(<MessageEditor message={mockUserMessage} onCancel={mockOnCancel} />);
const saveButton = screen.getByLabelText('Save edit');
fireEvent.click(saveButton);


@@ -1,5 +1,6 @@
import React, { useState, useEffect, useCallback, useMemo } from "react";
import { buildCodeHighlighter } from "./CodeHighlighter";
import React, { useState, useEffect, useCallback } from 'react';
import { buildCodeHighlighter } from './CodeHighlighter';
interface CodeBlockProps {
language: string;
@@ -9,23 +10,19 @@ interface CodeBlockProps {
const highlighter = buildCodeHighlighter();
const CodeBlock: React.FC<CodeBlockProps> = ({
language,
code,
onRenderComplete,
}) => {
const [html, setHtml] = useState<string>("");
const CodeBlock: React.FC<CodeBlockProps> = ({ language, code, onRenderComplete }) => {
const [html, setHtml] = useState<string>('');
const [loading, setLoading] = useState<boolean>(true);
const highlightCode = useCallback(async () => {
try {
const highlighted = (await highlighter).codeToHtml(code, {
lang: language,
theme: "github-dark",
theme: 'github-dark',
});
setHtml(highlighted);
} catch (error) {
console.error("Error highlighting code:", error);
console.error('Error highlighting code:', error);
setHtml(`<pre>${code}</pre>`);
} finally {
setLoading(false);
@@ -41,9 +38,9 @@ const CodeBlock: React.FC<CodeBlockProps> = ({
return (
<div
style={{
backgroundColor: "#24292e",
padding: "10px",
borderRadius: "1.5em",
backgroundColor: '#24292e',
padding: '10px',
borderRadius: '1.5em',
}}
>
Loading code...
@@ -55,12 +52,12 @@ const CodeBlock: React.FC<CodeBlockProps> = ({
<div
dangerouslySetInnerHTML={{ __html: html }}
style={{
transition: "none",
transition: 'none',
padding: 20,
backgroundColor: "#24292e",
overflowX: "auto",
borderRadius: ".37em",
fontSize: ".75rem",
backgroundColor: '#24292e',
overflowX: 'auto',
borderRadius: '.37em',
fontSize: '.75rem',
}}
/>
);


@@ -1,5 +1,6 @@
import { createHighlighterCore } from "shiki";
import { createHighlighterCore } from 'shiki';
/* eslint-disable import/no-unresolved */
export async function buildCodeHighlighter() {
const [
githubDark,
@@ -23,26 +24,26 @@ export async function buildCodeHighlighter() {
zig,
wasm,
] = await Promise.all([
import("shiki/themes/github-dark.mjs"),
import("shiki/langs/html.mjs"),
import("shiki/langs/javascript.mjs"),
import("shiki/langs/jsx.mjs"),
import("shiki/langs/typescript.mjs"),
import("shiki/langs/tsx.mjs"),
import("shiki/langs/go.mjs"),
import("shiki/langs/rust.mjs"),
import("shiki/langs/python.mjs"),
import("shiki/langs/java.mjs"),
import("shiki/langs/kotlin.mjs"),
import("shiki/langs/shell.mjs"),
import("shiki/langs/sql.mjs"),
import("shiki/langs/yaml.mjs"),
import("shiki/langs/toml.mjs"),
import("shiki/langs/markdown.mjs"),
import("shiki/langs/json.mjs"),
import("shiki/langs/xml.mjs"),
import("shiki/langs/zig.mjs"),
import("shiki/wasm"),
import('shiki/themes/github-dark.mjs'),
import('shiki/langs/html.mjs'),
import('shiki/langs/javascript.mjs'),
import('shiki/langs/jsx.mjs'),
import('shiki/langs/typescript.mjs'),
import('shiki/langs/tsx.mjs'),
import('shiki/langs/go.mjs'),
import('shiki/langs/rust.mjs'),
import('shiki/langs/python.mjs'),
import('shiki/langs/java.mjs'),
import('shiki/langs/kotlin.mjs'),
import('shiki/langs/shell.mjs'),
import('shiki/langs/sql.mjs'),
import('shiki/langs/yaml.mjs'),
import('shiki/langs/toml.mjs'),
import('shiki/langs/markdown.mjs'),
import('shiki/langs/json.mjs'),
import('shiki/langs/xml.mjs'),
import('shiki/langs/zig.mjs'),
import('shiki/wasm'),
]);
// Create the highlighter instance with the loaded themes and languages
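The call that this comment refers to lies outside the hunks shown. Purely as an illustration of shiki's documented createHighlighterCore options shape ({ themes, langs, loadWasm }), and with the language-module variables elided above represented by a placeholder, the wiring would look roughly like:

// Illustrative sketch, not the repository's actual code:
return createHighlighterCore({
  themes: [githubDark],
  langs: [/* ...the language modules loaded above..., */ zig],
  loadWasm: wasm, // assumption: the module loaded from 'shiki/wasm' above is accepted here
});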


@@ -1,4 +1,3 @@
import React from "react";
import {
Alert,
AlertIcon,
@@ -9,40 +8,41 @@ import {
Link,
List,
ListItem,
} from "@chakra-ui/react";
import { MarkdownEditor } from "./MarkdownEditor";
import { Fragment, useState } from "react";
} from '@chakra-ui/react';
import React, { Fragment, useState } from 'react';
import { MarkdownEditor } from './MarkdownEditor';
function ConnectComponent() {
const [formData, setFormData] = useState({
markdown: "",
email: "",
firstname: "",
lastname: "",
markdown: '',
email: '',
firstname: '',
lastname: '',
});
const [isSubmitted, setIsSubmitted] = useState(false);
const [isError, setIsError] = useState(false);
const [validationError, setValidationError] = useState("");
const [validationError, setValidationError] = useState('');
const handleChange = (field: string) => (value: string) => {
setFormData((prev) => ({ ...prev, [field]: value }));
setFormData(prev => ({ ...prev, [field]: value }));
setIsSubmitted(false);
setValidationError("");
setValidationError('');
};
const handleSubmitButton = async () => {
setValidationError("");
setValidationError('');
if (!formData.email || !formData.firstname || !formData.markdown) {
setValidationError("Please fill in all required fields.");
setValidationError('Please fill in all required fields.');
return;
}
try {
const response = await fetch("/api/contact", {
method: "POST",
const response = await fetch('/api/contact', {
method: 'POST',
headers: {
"Content-Type": "application/json",
'Content-Type': 'application/json',
},
body: JSON.stringify(formData),
});
@@ -51,10 +51,10 @@ function ConnectComponent() {
setIsSubmitted(true);
setIsError(false);
setFormData({
markdown: "",
email: "",
firstname: "",
lastname: "",
markdown: '',
email: '',
firstname: '',
lastname: '',
});
} else {
setIsError(true);
@@ -68,7 +68,7 @@ function ConnectComponent() {
<Fragment>
<List color="text.primary" mb={4}>
<ListItem>
Email:{" "}
Email:{' '}
<Link href="mailto:geoff@seemueller.io" color="teal.500">
geoff@seemueller.io
</Link>
@@ -79,14 +79,14 @@ function ConnectComponent() {
<Input
placeholder="First name *"
value={formData.firstname}
onChange={(e) => handleChange("firstname")(e.target.value)}
onChange={e => handleChange('firstname')(e.target.value)}
color="text.primary"
borderColor="text.primary"
/>
<Input
placeholder="Last name *"
value={formData.lastname}
onChange={(e) => handleChange("lastname")(e.target.value)}
onChange={e => handleChange('lastname')(e.target.value)}
color="text.primary"
borderColor="text.primary"
// bg="text.primary"
@@ -95,13 +95,13 @@ function ConnectComponent() {
<Input
placeholder="Email *"
value={formData.email}
onChange={(e) => handleChange("email")(e.target.value)}
onChange={e => handleChange('email')(e.target.value)}
mb={4}
borderColor="text.primary"
color="text.primary"
/>
<MarkdownEditor
onChange={handleChange("markdown")}
onChange={handleChange('markdown')}
markdown={formData.markdown}
placeholder="Your Message..."
/>
@@ -116,47 +116,32 @@ function ConnectComponent() {
mb={4}
float="right"
_hover={{
bg: "",
transform: "scale(1.05)",
bg: '',
transform: 'scale(1.05)',
}}
_active={{
bg: "gray.800",
transform: "scale(1)",
bg: 'gray.800',
transform: 'scale(1)',
}}
>
SEND
</Button>
<Box mt={12}>
{isSubmitted && (
<Alert
status="success"
borderRadius="md"
color="text.primary"
bg="green.500"
>
<Alert status="success" borderRadius="md" color="text.primary" bg="green.500">
<AlertIcon />
Message sent successfully!
</Alert>
)}
{isError && (
<Alert
status="error"
borderRadius="md"
color="text.primary"
bg="red.500"
>
<Alert status="error" borderRadius="md" color="text.primary" bg="red.500">
<AlertIcon />
There was an error sending your message. Please try again.
</Alert>
)}
{validationError && (
<Alert
status="warning"
borderRadius="md"
color="background.primary"
bg="yellow.500"
>
<Alert status="warning" borderRadius="md" color="background.primary" bg="yellow.500">
<AlertIcon />
{validationError}
</Alert>

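For reference, the body POSTed to /api/contact is the formData object above; written as a type (the interface name is illustrative, not part of the diff):

// Illustrative payload shape for POST /api/contact:
interface ContactRequest {
  markdown: string;
  email: string;
  firstname: string;
  lastname: string;
}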

@@ -1,5 +1,5 @@
import React from "react";
import { Box, Textarea } from "@chakra-ui/react";
import { Box, Textarea } from '@chakra-ui/react';
import React from 'react';
export const MarkdownEditor = (props: {
placeholder: string;
@@ -8,15 +8,15 @@ export const MarkdownEditor = (props: {
}) => {
return (
<Box>
<link rel="stylesheet" href="/packages/client/public" media="print" onLoad="this.media='all'" />
<Textarea
value={props.markdown}
placeholder={props.placeholder}
onChange={(e) => props.onChange(e.target.value)}
onChange={e => props.onChange(e.target.value)}
width="100%"
minHeight="150px"
height="100%"
resize="none"
borderColor="text.accent"
/>
</Box>
);


@@ -1,13 +1,9 @@
import {
ChakraProvider,
cookieStorageManagerSSR,
localStorageManager,
} from "@chakra-ui/react";
import { ChakraProvider, cookieStorageManagerSSR, localStorageManager } from '@chakra-ui/react';
export function Chakra({ cookies, children, theme }) {
const colorModeManager =
typeof cookies === "string"
? cookieStorageManagerSSR("color_state", cookies)
typeof cookies === 'string'
? cookieStorageManagerSSR('color_state', cookies)
: localStorageManager;
return (


@@ -1,5 +1,5 @@
import React, { createContext, useContext, useState, useEffect } from "react";
import { useMediaQuery } from "@chakra-ui/react";
import { useMediaQuery } from '@chakra-ui/react';
import React, { createContext, useContext, useState, useEffect } from 'react';
// Create the context to provide mobile state
const MobileContext = createContext(false);
@@ -7,25 +7,20 @@ const MobileContext = createContext(false);
// Create a provider component to wrap your app
export const MobileProvider = ({ children }: { children: React.ReactNode }) => {
const [isMobile, setIsMobile] = useState(false);
const [isFallbackMobile] = useMediaQuery("(max-width: 768px)");
const [isFallbackMobile] = useMediaQuery('(max-width: 768px)');
useEffect(() => {
const userAgent = navigator.userAgent || navigator.vendor || window.opera;
const mobile =
/android|webos|iphone|ipad|ipod|blackberry|iemobile|opera mini/i.test(
userAgent.toLowerCase(),
);
const mobile = /android|webos|iphone|ipad|ipod|blackberry|iemobile|opera mini/i.test(
userAgent.toLowerCase(),
);
setIsMobile(mobile);
}, []);
// Provide the combined mobile state globally
const mobileState = isMobile || isFallbackMobile;
return (
<MobileContext.Provider value={mobileState}>
{children}
</MobileContext.Provider>
);
return <MobileContext.Provider value={mobileState}>{children}</MobileContext.Provider>;
};
// Custom hook to use the mobile context in any component

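The hook referenced by the comment above falls outside the hunks shown; purely as an illustration (the real name and body are not in this diff), such a hook just reads the context:

// Illustrative only – not the repository's actual hook:
export const useIsMobile = () => useContext(MobileContext);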

@@ -1,5 +1,5 @@
import React from "react";
import { Badge, Box, Flex, Heading, Image, Text } from "@chakra-ui/react";
import { Badge, Box, Flex, Heading, Image, Text } from '@chakra-ui/react';
import React from 'react';
function DemoCard({ icon, title, description, imageUrl, badge, onClick }) {
return (
@@ -9,15 +9,15 @@ function DemoCard({ icon, title, description, imageUrl, badge, onClick }) {
overflowY="hidden"
boxShadow="md"
transition="transform 0.2s"
_hover={{ transform: "scale(1.05)", cursor: "pointer" }}
_hover={{ transform: 'scale(1.05)', cursor: 'pointer' }}
color="text.primary"
onClick={onClick}
display="flex"
flexDirection="column"
minW={"12rem"}
maxW={"18rem"}
minH={"35rem"}
maxH={"20rem"}
minW={'12rem'}
maxW={'18rem'}
minH={'35rem'}
maxH={'20rem'}
>
{imageUrl && (
<Image
@@ -42,7 +42,7 @@ function DemoCard({ icon, title, description, imageUrl, badge, onClick }) {
</Flex>
{badge && (
<Box p={2}>
<Badge colorScheme={"teal"}>{badge}</Badge>
<Badge colorScheme={'teal'}>{badge}</Badge>
</Box>
)}
</Box>


@@ -1,16 +1,12 @@
import React from "react";
import { SimpleGrid } from "@chakra-ui/react";
import { Rocket, Shield } from "lucide-react";
import DemoCard from "./DemoCard";
import { SimpleGrid } from '@chakra-ui/react';
import { Rocket, Shield } from 'lucide-react';
import React from 'react';
import DemoCard from './DemoCard';
function DemoComponent() {
return (
<SimpleGrid
columns={{ base: 1, sm: 1, lg: 2 }}
spacing={"7%"}
minH={"min-content"}
h={"100vh"}
>
<SimpleGrid columns={{ base: 1, sm: 1, lg: 2 }} spacing={'7%'} minH={'min-content'} h={'100vh'}>
<DemoCard
icon={<Rocket size={24} color="teal" />}
title="toak"
@@ -18,7 +14,7 @@ function DemoComponent() {
imageUrl="/code-tokenizer-md.jpg"
badge="npm"
onClick={() => {
window.open("https://github.com/seemueller-io/toak");
window.open('https://github.com/seemueller-io/toak');
}}
/>
<DemoCard
@@ -28,7 +24,7 @@ function DemoComponent() {
imageUrl="/rehoboam.png"
badge="APP"
onClick={() => {
window.open("https://rehoboam.seemueller.io");
window.open('https://rehoboam.seemueller.io');
}}
/>
</SimpleGrid>


@@ -1,4 +1,3 @@
import React from "react";
import {
Box,
Button,
@@ -14,9 +13,11 @@ import {
Textarea,
useToast,
VStack,
} from "@chakra-ui/react";
import { observer } from "mobx-react-lite";
import feedbackState from "../../stores/ClientFeedbackStore";
} from '@chakra-ui/react';
import { observer } from 'mobx-react-lite';
import React from 'react';
import feedbackState from '../../stores/ClientFeedbackStore';
const FeedbackModal = observer(({ isOpen, onClose, zIndex }) => {
const toast = useToast();
@@ -26,9 +27,9 @@ const FeedbackModal = observer(({ isOpen, onClose, zIndex }) => {
if (success) {
toast({
title: "Feedback Submitted",
description: "Thank you for your feedback!",
status: "success",
title: 'Feedback Submitted',
description: 'Thank you for your feedback!',
status: 'success',
duration: 3000,
isClosable: true,
});
@@ -40,9 +41,9 @@ const FeedbackModal = observer(({ isOpen, onClose, zIndex }) => {
}
toast({
title: "Submission Failed",
title: 'Submission Failed',
description: feedbackState.error,
status: "error",
status: 'error',
duration: 3000,
isClosable: true,
});
@@ -78,7 +79,7 @@ const FeedbackModal = observer(({ isOpen, onClose, zIndex }) => {
<Textarea
placeholder="Type your feedback here..."
value={feedbackState.input}
onChange={(e) => feedbackState.setInput(e.target.value)}
onChange={e => feedbackState.setInput(e.target.value)}
bg="gray.700"
color="white"
minHeight="120px"
@@ -89,7 +90,7 @@ const FeedbackModal = observer(({ isOpen, onClose, zIndex }) => {
bottom="2"
right="2"
fontSize="xs"
color={charactersRemaining < 50 ? "orange.300" : "gray.400"}
color={charactersRemaining < 50 ? 'orange.300' : 'gray.400'}
>
{charactersRemaining} characters remaining
</Text>


@@ -1,13 +1,14 @@
import React from "react";
import { Box } from "@chakra-ui/react";
import { Box } from '@chakra-ui/react';
import React from 'react';
const TealDogecoinIcon = (props) => (
const TealDogecoinIcon = props => (
<Box
as="svg"
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 24 24"
stroke={'currentColor'}
fill="currentColor"
boxSize={props.boxSize || "1em"}
boxSize={props.boxSize || '1em'}
{...props}
>
<path
@@ -26,8 +27,8 @@ const TealDogecoinIcon = (props) => (
fill-rule="evenodd"
clip-rule="evenodd"
d="M17.5001 9.54053C16.4888 6.92891 13.9888 6.44507 13.9888 6.44507H6.85606L6.88358 9.10523H8.30406V15.0454H6.85596V17.7007H13.7913C15.4628 17.7007 16.8026 16.0211 16.8026 16.0211C18.9482 12.9758 17.5 9.54053 17.5 9.54053H17.5001ZM13.8285 14.2314C13.8285 14.2314 13.2845 15.0163 12.6927 15.0163H11.5087L11.4806 9.11173H13.0001C13.0001 9.11173 13.7041 9.25894 14.1959 10.6521C14.1959 10.6521 14.848 12.6468 13.8285 14.2314Z"
fill="white"
fill-opacity="0.8"
// fill="white"
// fill-opacity="0.8"
/>
</Box>
);


@@ -1,18 +1,14 @@
import React from "react";
import { Box, VStack } from "@chakra-ui/react";
import {renderMarkdown} from "../markdown/MarkdownComponent";
import { Box, VStack } from '@chakra-ui/react';
import React from 'react';
import { renderMarkdown } from '../markdown/MarkdownComponent';
function LegalDoc({ text }) {
return (
<Box maxWidth="800px" margin="0 auto">
<VStack spacing={6} align="stretch">
<Box
color="text.primary"
wordBreak="break-word"
whiteSpace="pre-wrap"
spacing={4}
>
{renderMarkdown(text)}
<Box color="text.primary" wordBreak="break-word" whiteSpace="pre-wrap" spacing={4}>
{renderMarkdown(text)}
</Box>
</VStack>
</Box>


@@ -1,21 +1,16 @@
import React, { useState, useEffect } from "react";
import { Image, Box, Spinner, Text, Flex } from "@chakra-ui/react";
import { keyframes } from "@emotion/react";
import { Image, Box, Spinner, Text, Flex } from '@chakra-ui/react';
import { keyframes } from '@emotion/react';
import React, { useState, useEffect } from 'react';
const shimmer = keyframes`
0% { background-position: -100% 0; }
100% { background-position: 100% 0; }
`;
const ImageWithFallback = ({
alt,
src,
fallbackSrc = "/fallback.png",
...props
}) => {
const ImageWithFallback = ({ alt, src, fallbackSrc = '/fallback.png', ...props }) => {
const [isLoading, setIsLoading] = useState(true);
const [scrollPosition, setScrollPosition] = useState(0);
const isSlowLoadingSource = src.includes("text2image.seemueller.io");
const isSlowLoadingSource = src.includes('text2image.seemueller.io');
const handleImageLoad = () => setIsLoading(false);
const handleImageError = () => {
@@ -33,24 +28,17 @@ const ImageWithFallback = ({
setScrollPosition(scrolled);
};
window.addEventListener("scroll", handleScroll);
window.addEventListener('scroll', handleScroll);
return () => {
window.removeEventListener("scroll", handleScroll);
window.removeEventListener('scroll', handleScroll);
};
}, []);
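// Parallax: shift the image by 20% of the scroll distance for a subtle depth effect.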
const parallaxOffset = scrollPosition * 0.2;
return (
<Box
position="relative"
w="full"
maxW="full"
borderRadius="md"
my={2}
overflow="hidden"
>
<Box position="relative" w="full" maxW="full" borderRadius="md" my={2} overflow="hidden">
{isLoading && isSlowLoadingSource && (
<Flex
align="center"
@@ -76,7 +64,7 @@ const ImageWithFallback = ({
fallbackSrc={fallbackSrc}
onLoad={handleImageLoad}
onError={handleImageError}
display={isLoading ? "none" : "block"}
display={isLoading ? 'none' : 'block'}
transform={`translateY(${parallaxOffset}px)`}
transition="transform 0.1s ease-out"
{...props}


@@ -1,576 +1,487 @@
import React from "react";
import {
Box,
Code,
Divider,
Heading,
Link,
List,
ListItem,
OrderedList,
Table,
Tbody,
Td,
Text,
Th,
Thead,
Tr,
useColorModeValue,
} from "@chakra-ui/react";
import {marked} from "marked";
Box,
Code,
Divider,
Heading,
Link,
List,
ListItem,
OrderedList,
Table,
Tbody,
Td,
Text,
Th,
Thead,
Tr,
useColorModeValue,
} from '@chakra-ui/react';
import katex from 'katex';
import { marked } from 'marked';
import markedKatex from 'marked-katex-extension';
import React from 'react';
import markedKatex from "marked-katex-extension";
import katex from "katex";
import CodeBlock from "../code/CodeBlock";
import ImageWithFallback from "./ImageWithFallback";
import CodeBlock from '../code/CodeBlock';
import ImageWithFallback from './ImageWithFallback';
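// Register the KaTeX extension only when localStorage is available (i.e., in the browser);
// any setup failure is swallowed so markdown still renders without math support.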
try {
if (localStorage) {
marked.use(
markedKatex({
nonStandard: false,
displayMode: true,
throwOnError: false,
strict: true,
colorIsTextColor: true,
errorColor: "red",
}),
);
}
if (localStorage) {
marked.use(
markedKatex({
nonStandard: false,
displayMode: true,
throwOnError: false,
strict: true,
colorIsTextColor: true,
errorColor: 'red',
}),
);
}
} catch (_) {
// Silently ignore errors in marked setup - fallback to default behavior
}
const MemoizedCodeBlock = React.memo(CodeBlock);
const getHeadingProps = (depth: number) => {
switch (depth) {
case 1:
return {as: "h1", size: "xl", mt: 4, mb: 2};
case 2:
return {as: "h2", size: "lg", mt: 3, mb: 2};
case 3:
return {as: "h3", size: "md", mt: 2, mb: 1};
case 4:
return {as: "h4", size: "sm", mt: 2, mb: 1};
case 5:
return {as: "h5", size: "sm", mt: 2, mb: 1};
case 6:
return {as: "h6", size: "xs", mt: 2, mb: 1};
default:
return {as: `h${depth}`, size: "md", mt: 2, mb: 1};
}
switch (depth) {
case 1:
return { as: 'h1', size: 'xl', mt: 4, mb: 2 };
case 2:
return { as: 'h2', size: 'lg', mt: 3, mb: 2 };
case 3:
return { as: 'h3', size: 'md', mt: 2, mb: 1 };
case 4:
return { as: 'h4', size: 'sm', mt: 2, mb: 1 };
case 5:
return { as: 'h5', size: 'sm', mt: 2, mb: 1 };
case 6:
return { as: 'h6', size: 'xs', mt: 2, mb: 1 };
default:
return { as: `h${depth}`, size: 'md', mt: 2, mb: 1 };
}
};
interface TableToken extends marked.Tokens.Table {
align: Array<"center" | "left" | "right" | null>;
header: (string | marked.Tokens.TableCell)[];
rows: (string | marked.Tokens.TableCell)[][];
align: Array<'center' | 'left' | 'right' | null>;
header: (string | marked.Tokens.TableCell)[];
rows: (string | marked.Tokens.TableCell)[][];
}
const CustomHeading: React.FC<{ text: string; depth: number }> = ({
text,
depth,
}) => {
const headingProps = getHeadingProps(depth);
return (
<Heading
{...headingProps}
wordBreak="break-word"
maxWidth="100%"
color="text.accent"
>
{text}
</Heading>
);
const CustomHeading: React.FC<{ text: string; depth: number }> = ({ text, depth }) => {
const headingProps = getHeadingProps(depth);
return (
<Heading {...headingProps} wordBreak="break-word" maxWidth="100%" color="text.accent">
{text}
</Heading>
);
};
const CustomParagraph: React.FC<{ children: React.ReactNode }> = ({
children,
}) => {
return (
<Text
as="p"
fontSize="sm"
lineHeight="short"
wordBreak="break-word"
maxWidth="100%"
>
{children}
</Text>
);
const CustomParagraph: React.FC<{ children: React.ReactNode }> = ({ children }) => {
return (
<Text as="p" fontSize="sm" lineHeight="short" wordBreak="break-word" maxWidth="100%">
{children}
</Text>
);
};
const CustomBlockquote: React.FC<{ children: React.ReactNode }> = ({
children,
}) => {
return (
<Box
as="blockquote"
borderLeft="4px solid"
borderColor="gray.200"
fontStyle="italic"
color="gray.600"
pl={4}
maxWidth="100%"
wordBreak="break-word"
mb={2}
>
{children}
</Box>
);
const CustomBlockquote: React.FC<{ children: React.ReactNode }> = ({ children }) => {
return (
<Box
as="blockquote"
borderLeft="4px solid"
borderColor="gray.200"
fontStyle="italic"
color="gray.600"
pl={4}
maxWidth="100%"
wordBreak="break-word"
mb={2}
>
{children}
</Box>
);
};
const CustomCodeBlock: React.FC<{ code: string; language?: string }> = ({
code,
language,
}) => {
return (
<MemoizedCodeBlock
language={language}
code={code}
onRenderComplete={() => Promise.resolve()}
/>
);
const CustomCodeBlock: React.FC<{ code: string; language?: string }> = ({ code, language }) => {
return (
<MemoizedCodeBlock language={language} code={code} onRenderComplete={() => Promise.resolve()} />
);
};
const CustomHr: React.FC = () => <Divider my={4}/>;
const CustomHr: React.FC = () => <Divider my={4} />;
const CustomList: React.FC<{
ordered?: boolean;
start?: number;
children: React.ReactNode;
}> = ({ordered, start, children}) => {
const commonStyles = {
fontSize: "sm",
wordBreak: "break-word" as const,
maxWidth: "100%" as const,
stylePosition: "outside" as const,
mb: 2,
pl: 4,
};
ordered?: boolean;
start?: number;
children: React.ReactNode;
}> = ({ ordered, start, children }) => {
const commonStyles = {
fontSize: 'sm',
wordBreak: 'break-word' as const,
maxWidth: '100%' as const,
stylePosition: 'outside' as const,
mb: 2,
pl: 4,
};
return ordered ? (
<OrderedList start={start} {...commonStyles}>
{children}
</OrderedList>
) : (
<List styleType="disc" {...commonStyles}>
{children}
</List>
);
return ordered ? (
<OrderedList start={start} {...commonStyles}>
{children}
</OrderedList>
) : (
<List styleType="disc" {...commonStyles}>
{children}
</List>
);
};
const CustomListItem: React.FC<{
children: React.ReactNode;
}> = ({children}) => {
return <ListItem mb={1}>{children}</ListItem>;
children: React.ReactNode;
}> = ({ children }) => {
return <ListItem mb={1}>{children}</ListItem>;
};
const CustomKatex: React.FC<{ math: string; displayMode: boolean }> = ({
math,
displayMode,
}) => {
const renderedMath = katex.renderToString(math, {displayMode});
const CustomKatex: React.FC<{ math: string; displayMode: boolean }> = ({ math, displayMode }) => {
const renderedMath = katex.renderToString(math, { displayMode });
return (
<Box
as="span"
display={displayMode ? "block" : "inline"}
// bg={bg}
p={displayMode ? 4 : 1}
my={displayMode ? 4 : 0}
borderRadius="md"
overflow="auto"
maxWidth="100%"
dangerouslySetInnerHTML={{__html: renderedMath}}
/>
);
return (
<Box
as="span"
display={displayMode ? 'block' : 'inline'}
// bg={bg}
p={displayMode ? 4 : 1}
my={displayMode ? 4 : 0}
borderRadius="md"
overflow="auto"
maxWidth="100%"
dangerouslySetInnerHTML={{ __html: renderedMath }}
/>
);
};
const CustomTable: React.FC<{
header: React.ReactNode[];
align: Array<"center" | "left" | "right" | null>;
rows: React.ReactNode[][];
}> = ({header, align, rows}) => {
return (
<Table
variant="simple"
size="sm"
my={4}
borderRadius="md"
overflow="hidden"
>
<Thead bg="background.secondary">
<Tr>
{header.map((cell, i) => (
<Th
key={i}
textAlign={align[i] || "left"}
fontWeight="bold"
p={2}
minW={16}
wordBreak="break-word"
>
{cell}
</Th>
))}
</Tr>
</Thead>
<Tbody>
{rows.map((row, rIndex) => (
<Tr key={rIndex}>
{row.map((cell, cIndex) => (
<Td
key={cIndex}
textAlign={align[cIndex] || "left"}
p={2}
wordBreak="break-word"
>
{cell}
</Td>
))}
</Tr>
))}
</Tbody>
</Table>
);
header: React.ReactNode[];
align: Array<'center' | 'left' | 'right' | null>;
rows: React.ReactNode[][];
}> = ({ header, align, rows }) => {
return (
<Table variant="simple" size="sm" my={4} borderRadius="md" overflow="hidden">
<Thead bg="background.secondary">
<Tr>
{header.map((cell, i) => (
<Th
key={i}
textAlign={align[i] || 'left'}
fontWeight="bold"
p={2}
minW={16}
wordBreak="break-word"
>
{cell}
</Th>
))}
</Tr>
</Thead>
<Tbody>
{rows.map((row, rIndex) => (
<Tr key={rIndex}>
{row.map((cell, cIndex) => (
<Td key={cIndex} textAlign={align[cIndex] || 'left'} p={2} wordBreak="break-word">
{cell}
</Td>
))}
</Tr>
))}
</Tbody>
</Table>
);
};
const CustomHtmlBlock: React.FC<{ content: string }> = ({content}) => {
return <Box as="span" display="inline" dangerouslySetInnerHTML={{__html: content}} mb={2}/>;
const CustomHtmlBlock: React.FC<{ content: string }> = ({ content }) => {
return <Box as="span" display="inline" dangerouslySetInnerHTML={{ __html: content }} mb={2} />;
};
const CustomText: React.FC<{ text: React.ReactNode }> = ({text}) => {
return (
<Text
fontSize="sm"
lineHeight="short"
color="text.accent"
wordBreak="break-word"
maxWidth="100%"
as="span"
>
{text}
</Text>
);
const CustomText: React.FC<{ text: React.ReactNode }> = ({ text }) => {
return (
<Text
fontSize="sm"
lineHeight="short"
color="text.accent"
wordBreak="break-word"
maxWidth="100%"
as="span"
>
{text}
</Text>
);
};
interface CustomStrongProps {
children: React.ReactNode;
children: React.ReactNode;
}
const CustomStrong: React.FC<CustomStrongProps> = ({children}) => {
return <Text as="strong">{children}</Text>;
const CustomStrong: React.FC<CustomStrongProps> = ({ children }) => {
return <Text as="strong">{children}</Text>;
};
const CustomEm: React.FC<{ children: React.ReactNode }> = ({children}) => {
return (
<Text
as="em"
fontStyle="italic"
lineHeight="short"
wordBreak="break-word"
display="inline"
>
{children}
</Text>
);
const CustomEm: React.FC<{ children: React.ReactNode }> = ({ children }) => {
return (
<Text as="em" fontStyle="italic" lineHeight="short" wordBreak="break-word" display="inline">
{children}
</Text>
);
};
const CustomDel: React.FC<{ text: string }> = ({text}) => {
return (
<Text
as="del"
textDecoration="line-through"
lineHeight="short"
wordBreak="break-word"
display="inline"
>
{text}
</Text>
);
const CustomDel: React.FC<{ text: string }> = ({ text }) => {
return (
<Text
as="del"
textDecoration="line-through"
lineHeight="short"
wordBreak="break-word"
display="inline"
>
{text}
</Text>
);
};
const CustomCodeSpan: React.FC<{ code: string }> = ({code}) => {
const bg = useColorModeValue("gray.100", "gray.800");
return (
<Code
fontSize="sm"
bg={bg}
overflowX="clip"
borderRadius="md"
wordBreak="break-word"
maxWidth="100%"
p={0.5}
>
{code}
</Code>
);
const CustomCodeSpan: React.FC<{ code: string }> = ({ code }) => {
const bg = useColorModeValue('gray.100', 'gray.800');
return (
<Code
fontSize="sm"
bg={bg}
overflowX="clip"
borderRadius="md"
wordBreak="break-word"
maxWidth="100%"
p={0.5}
>
{code}
</Code>
);
};
const CustomMath: React.FC<{ math: string; displayMode?: boolean }> = ({
math,
displayMode = false,
}) => {
return (
<Box
as="span"
display={displayMode ? "block" : "inline"}
p={displayMode ? 4 : 1}
my={displayMode ? 4 : 0}
borderRadius="md"
overflow="auto"
maxWidth="100%"
className={`math ${displayMode ? "math-display" : "math-inline"}`}
>
{math}
</Box>
);
math,
displayMode = false,
}) => {
return (
<Box
as="span"
display={displayMode ? 'block' : 'inline'}
p={displayMode ? 4 : 1}
my={displayMode ? 4 : 0}
borderRadius="md"
overflow="auto"
maxWidth="100%"
className={`math ${displayMode ? 'math-display' : 'math-inline'}`}
>
{math}
</Box>
);
};
const CustomLink: React.FC<{
href: string;
title?: string;
children: React.ReactNode;
}> = ({href, title, children, ...props}) => {
return (
<Link
href={href}
title={title}
isExternal
sx={{
"& span": {
color: "text.link",
},
}}
maxWidth="100%"
color="teal.500"
wordBreak="break-word"
{...props}
>
{children}
</Link>
);
href: string;
title?: string;
children: React.ReactNode;
}> = ({ href, title, children, ...props }) => {
return (
<Link
href={href}
title={title}
isExternal
sx={{
'& span': {
color: 'text.link',
},
}}
maxWidth="100%"
color="teal.500"
wordBreak="break-word"
{...props}
>
{children}
</Link>
);
};
const CustomImage: React.FC<{ href: string; text: string; title?: string }> = ({
href,
text,
title,
}) => {
return (
<ImageWithFallback
src={href}
alt={text}
title={title}
maxW="100%"
width="auto"
height="auto"
my={2}
/>
);
href,
text,
title,
}) => {
return (
<ImageWithFallback
src={href}
alt={text}
title={title}
maxW="100%"
width="auto"
height="auto"
my={2}
/>
);
};
function parseTokens(tokens: marked.Token[]): JSX.Element[] {
const output: JSX.Element[] = [];
let blockquoteContent: JSX.Element[] = [];
const output: JSX.Element[] = [];
let blockquoteContent: JSX.Element[] = [];
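// Tokens emitted between blockquote_start and blockquote_end are buffered here and
// wrapped in a CustomBlockquote when the closing token arrives.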
tokens.forEach((token, i) => {
switch (token.type) {
case "heading":
output.push(
<CustomHeading key={i} text={token.text} depth={token.depth}/>,
);
break;
tokens.forEach((token, i) => {
switch (token.type) {
case 'heading':
output.push(<CustomHeading key={i} text={token.text} depth={token.depth} />);
break;
case "paragraph": {
const parsedContent = token.tokens
? parseTokens(token.tokens)
: token.text;
if (blockquoteContent.length > 0) {
blockquoteContent.push(
<CustomParagraph key={i}>{parsedContent}</CustomParagraph>,
);
} else {
output.push(
<CustomParagraph key={i}>{parsedContent}</CustomParagraph>,
);
}
break;
}
case "br":
output.push(<br key={i}/>);
break;
case "escape": {
break;
}
case "blockquote_start":
blockquoteContent = [];
break;
case "blockquote_end":
output.push(
<CustomBlockquote key={i}>
{parseTokens(blockquoteContent)}
</CustomBlockquote>,
);
blockquoteContent = [];
break;
case "blockquote": {
output.push(
<CustomBlockquote key={i}>
{token.tokens ? parseTokens(token.tokens) : null}
</CustomBlockquote>,
);
break;
}
case "math":
output.push(
<CustomMath key={i} math={(token as any).value} displayMode={true}/>,
);
break;
case "inlineMath":
output.push(
<CustomMath
key={i}
math={(token as any).value}
displayMode={false}
/>,
);
break;
case "inlineKatex":
case "blockKatex": {
const katexToken = token as any;
output.push(
<CustomKatex
key={i}
math={katexToken.text}
displayMode={katexToken.displayMode}
/>,
);
break;
}
case "code":
output.push(
<CustomCodeBlock key={i} code={token.text} language={token.lang}/>,
);
break;
case "hr":
output.push(<CustomHr key={i}/>);
break;
case "list": {
const {ordered, start, items} = token;
const listItems = items.map((listItem, idx) => {
const nestedContent = parseTokens(listItem.tokens);
return <CustomListItem key={idx}>{nestedContent}</CustomListItem>;
});
output.push(
<CustomList key={i} ordered={ordered} start={start}>
{listItems}
</CustomList>,
);
break;
}
case "table": {
const tableToken = token as TableToken;
output.push(
<CustomTable
key={i}
header={tableToken.header.map((cell) =>
typeof cell === "string" ? cell : parseTokens(cell.tokens || []),
)}
align={tableToken.align}
rows={tableToken.rows.map((row) =>
row.map((cell) =>
typeof cell === "string"
? cell
: parseTokens(cell.tokens || []),
),
)}
/>,
);
break;
}
case "html":
output.push(<CustomHtmlBlock key={i} content={token.text}/>);
break;
case "def":
case "space":
break;
case "strong":
output.push(
<CustomStrong key={i}>
{parseTokens(token.tokens || [])}
</CustomStrong>,
);
break;
case "em":
output.push(
<CustomEm key={i}>
{token.tokens ? parseTokens(token.tokens) : token.text}
</CustomEm>,
);
break;
case "codespan":
output.push(<CustomCodeSpan key={i} code={token.text}/>);
break;
case "link":
output.push(
<CustomLink key={i} href={token.href} title={token.title}>
{token.tokens ? parseTokens(token.tokens) : token.text}
</CustomLink>,
);
break;
case "image":
output.push(
<CustomImage
key={i}
href={token.href}
title={token.title}
text={token.text}
/>,
);
break;
case "text": {
const parsedContent = token.tokens
? parseTokens(token.tokens)
: token.text;
if (blockquoteContent.length > 0) {
blockquoteContent.push(
<React.Fragment key={i}>{parsedContent}</React.Fragment>,
);
} else {
output.push(<CustomText key={i} text={parsedContent}/>);
}
break;
}
default:
console.warn("Unhandled token type:", token.type, token);
case 'paragraph': {
const parsedContent = token.tokens ? parseTokens(token.tokens) : token.text;
if (blockquoteContent.length > 0) {
blockquoteContent.push(<CustomParagraph key={i}>{parsedContent}</CustomParagraph>);
} else {
output.push(<CustomParagraph key={i}>{parsedContent}</CustomParagraph>);
}
});
break;
}
case 'br':
output.push(<br key={i} />);
break;
case 'escape': {
break;
}
case 'blockquote_start':
blockquoteContent = [];
break;
return output;
case 'blockquote_end':
output.push(<CustomBlockquote key={i}>{parseTokens(blockquoteContent)}</CustomBlockquote>);
blockquoteContent = [];
break;
case 'blockquote': {
output.push(
<CustomBlockquote key={i}>
{token.tokens ? parseTokens(token.tokens) : null}
</CustomBlockquote>,
);
break;
}
case 'math':
output.push(<CustomMath key={i} math={(token as any).value} displayMode={true} />);
break;
case 'inlineMath':
output.push(<CustomMath key={i} math={(token as any).value} displayMode={false} />);
break;
case 'inlineKatex':
case 'blockKatex': {
const katexToken = token as any;
output.push(
<CustomKatex key={i} math={katexToken.text} displayMode={katexToken.displayMode} />,
);
break;
}
case 'code':
output.push(<CustomCodeBlock key={i} code={token.text} language={token.lang} />);
break;
case 'hr':
output.push(<CustomHr key={i} />);
break;
case 'list': {
const { ordered, start, items } = token;
const listItems = items.map((listItem, idx) => {
const nestedContent = parseTokens(listItem.tokens);
return <CustomListItem key={idx}>{nestedContent}</CustomListItem>;
});
output.push(
<CustomList key={i} ordered={ordered} start={start}>
{listItems}
</CustomList>,
);
break;
}
case 'table': {
const tableToken = token as TableToken;
output.push(
<CustomTable
key={i}
header={tableToken.header.map(cell =>
typeof cell === 'string' ? cell : parseTokens(cell.tokens || []),
)}
align={tableToken.align}
rows={tableToken.rows.map(row =>
row.map(cell => (typeof cell === 'string' ? cell : parseTokens(cell.tokens || []))),
)}
/>,
);
break;
}
case 'html':
output.push(<CustomHtmlBlock key={i} content={token.text} />);
break;
case 'def':
case 'space':
break;
case 'strong':
output.push(<CustomStrong key={i}>{parseTokens(token.tokens || [])}</CustomStrong>);
break;
case 'em':
output.push(
<CustomEm key={i}>{token.tokens ? parseTokens(token.tokens) : token.text}</CustomEm>,
);
break;
case 'codespan':
output.push(<CustomCodeSpan key={i} code={token.text} />);
break;
case 'link':
output.push(
<CustomLink key={i} href={token.href} title={token.title}>
{token.tokens ? parseTokens(token.tokens) : token.text}
</CustomLink>,
);
break;
case 'image':
output.push(
<CustomImage key={i} href={token.href} title={token.title} text={token.text} />,
);
break;
case 'text': {
const parsedContent = token.tokens ? parseTokens(token.tokens) : token.text;
if (blockquoteContent.length > 0) {
blockquoteContent.push(<React.Fragment key={i}>{parsedContent}</React.Fragment>);
} else {
output.push(<CustomText key={i} text={parsedContent} />);
}
break;
}
default:
console.warn('Unhandled token type:', token.type, token);
}
});
return output;
}
export function renderMarkdown(markdown: string): JSX.Element[] {
marked.setOptions({
breaks: true,
gfm: true,
silent: false,
async: true,
});
marked.setOptions({
breaks: true,
gfm: true,
silent: false,
async: true,
});
const tokens = marked.lexer(markdown);
return parseTokens(tokens);
const tokens = marked.lexer(markdown);
return parseTokens(tokens);
}

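As the LegalDoc component earlier in this diff already shows, the exported renderMarkdown is consumed by interpolating its returned elements into JSX; a minimal sketch (component name assumed):

// Illustrative only:
const MarkdownView: React.FC<{ source: string }> = ({ source }) => (
  <Box wordBreak="break-word">{renderMarkdown(source)}</Box>
);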

@@ -1,62 +0,0 @@
import React, { useCallback, useMemo } from "react";
import { Box, Flex, useMediaQuery } from "@chakra-ui/react";
import { resumeData } from "../../static-data/resume_data";
import SectionContent from "./SectionContent";
import SectionButton from "./SectionButton";
const sections = ["professionalSummary", "skills", "experience", "education"];
export default function ResumeComponent() {
const [activeSection, setActiveSection] = React.useState(
"professionalSummary",
);
const [isMobile] = useMediaQuery("(max-width: 1243px)");
const handleSectionClick = useCallback((section) => {
setActiveSection(section);
}, []);
const capitalizeFirstLetter = useCallback((word) => {
return word.charAt(0).toUpperCase() + word.slice(1).toLowerCase();
}, []);
const sectionButtons = useMemo(
() =>
sections.map((section) => (
<SectionButton
key={section}
onClick={() => handleSectionClick(section)}
activeSection={activeSection}
section={section}
mobile={isMobile}
callbackfn={capitalizeFirstLetter}
/>
)),
[activeSection, isMobile, handleSectionClick, capitalizeFirstLetter],
);
return (
<Box p={"unset"}>
<Flex
direction={isMobile ? "column" : "row"}
mb={8}
wrap="nowrap"
gap={isMobile ? 2 : 4}
minWidth="0"
>
{sectionButtons}
</Flex>
<Box
bg="background.secondary"
color="text.primary"
borderRadius="md"
boxShadow="md"
borderWidth={1}
borderColor="brand.300"
minHeight="300px"
>
<SectionContent activeSection={activeSection} resumeData={resumeData} />
</Box>
</Box>
);
}


@@ -1,32 +0,0 @@
import React from "react";
import { Button } from "@chakra-ui/react";
import { ChevronRight } from "lucide-react";
function SectionButton(props: {
onClick: () => void;
activeSection: string;
section: string;
mobile: boolean;
callbackfn: (word) => string;
}) {
return (
<Button
mt={1}
onClick={props.onClick}
variant={props.activeSection === props.section ? "solid" : "outline"}
colorScheme="brand"
rightIcon={<ChevronRight size={16} />}
size="md"
width={props.mobile ? "100%" : "auto"}
>
{props.section
.replace(/([A-Z])/g, " $1")
.trim()
.split(" ")
.map(props.callbackfn)
.join(" ")}
</Button>
);
}
export default SectionButton;


@@ -1,98 +0,0 @@
import React from "react";
import {
Box,
Grid,
GridItem,
Heading,
ListItem,
Text,
UnorderedList,
VStack,
} from "@chakra-ui/react";
const fontSize = "md";
const ProfessionalSummary = ({ professionalSummary }) => (
<Box>
<Grid
templateColumns="1fr"
gap={4}
maxW={["100%", "100%", "100%"]}
mx="auto"
className="about-container"
>
<GridItem
colSpan={1}
maxW={["100%", "100%", "container.md"]}
justifySelf="center"
minH={"100%"}
>
<Grid templateColumns="1fr" gap={4} overflowY={"auto"}>
<GridItem>
<Text fontSize="md">{professionalSummary}</Text>
</GridItem>
</Grid>
</GridItem>
</Grid>
</Box>
);
const Skills = ({ skills }) => (
<VStack align={"baseline"} spacing={6} mb={4}>
<UnorderedList spacing={2} mb={0}>
<Box>
{skills?.map((skill, index) => (
<ListItem p={1} key={index}>
{skill}
</ListItem>
))}
</Box>
</UnorderedList>
</VStack>
);
const Experience = ({ experience }) => (
<VStack align="start" spacing={6} mb={4}>
{experience?.map((job, index) => (
<Box key={index} width="100%">
<Heading as="h3" size="md" mb={2}>
{job.title}
</Heading>
<Text fontWeight="bold">{job.company}</Text>
<Text color="gray.500" mb={2}>
{job.timeline}
</Text>
<Text>{job.description}</Text>
</Box>
))}
</VStack>
);
const Education = ({ education }) => (
<UnorderedList spacing={2} mb={4}>
{education?.map((edu, index) => <ListItem key={index}>{edu}</ListItem>)}
</UnorderedList>
);
const SectionContent = ({ activeSection, resumeData }) => {
const components = {
professionalSummary: ProfessionalSummary,
skills: Skills,
experience: Experience,
education: Education,
};
const ActiveComponent = components[activeSection];
return (
<Box p={4} minHeight="300px" width="100%">
{ActiveComponent ? (
<ActiveComponent {...resumeData} />
) : (
<Text>Select a section to view details.</Text>
)}
</Box>
);
};
export default SectionContent;


@@ -1,64 +0,0 @@
// ServicesComponent.js
import React, { useCallback, useMemo } from "react";
import { Box, Flex, useMediaQuery } from "@chakra-ui/react";
import { servicesData } from "../../static-data/services_data";
import SectionButton from "../resume/SectionButton";
import ServicesSectionContent from "./ServicesComponentSection";
const sections = ["servicesOverview", "offerings"];
export default function ServicesComponent() {
const [activeSection, setActiveSection] = React.useState("servicesOverview");
const [isMobile] = useMediaQuery("(max-width: 1243px)");
const handleSectionClick = useCallback((section) => {
setActiveSection(section);
}, []);
const capitalizeFirstLetter = useCallback((word) => {
return word.charAt(0).toUpperCase() + word.slice(1).toLowerCase();
}, []);
const sectionButtons = useMemo(
() =>
sections.map((section) => (
<SectionButton
key={section}
onClick={() => handleSectionClick(section)}
activeSection={activeSection}
section={section}
mobile={isMobile}
callbackfn={capitalizeFirstLetter}
/>
)),
[activeSection, isMobile, handleSectionClick, capitalizeFirstLetter],
);
return (
<Box p={"unset"}>
<Flex
direction={isMobile ? "column" : "row"}
mb={8}
wrap="nowrap"
gap={isMobile ? 2 : 4}
minWidth="0" // Ensures flex items can shrink if needed
>
{sectionButtons}
</Flex>
<Box
bg="background.secondary"
color="text.primary"
borderRadius="md"
boxShadow="md"
borderWidth={1}
borderColor="brand.300"
minHeight="300px"
>
<ServicesSectionContent
activeSection={activeSection}
data={servicesData}
/>
</Box>
</Box>
);
}


@@ -1,40 +0,0 @@
import React from "react";
import { Box, Heading, Text, VStack } from "@chakra-ui/react";
const ServicesOverview = ({ servicesOverview }) => (
<Text fontSize="md">{servicesOverview}</Text>
);
const Offerings = ({ offerings }) => (
<VStack align="start" spacing={6} mb={4}>
{offerings.map((service, index) => (
<Box key={index}>
<Heading as="h3" size="md" mb={2}>
{service.title}
</Heading>
<Text mb={4}>{service.description}</Text>
</Box>
))}
</VStack>
);
const ServicesSectionContent = ({ activeSection, data }) => {
const components = {
servicesOverview: ServicesOverview,
offerings: Offerings,
};
const ActiveComponent = components[activeSection];
return (
<Box p={4} minHeight="300px" width="100%">
{ActiveComponent ? (
<ActiveComponent {...data} />
) : (
<Text>Select a section to view details.</Text>
)}
</Box>
);
};
export default ServicesSectionContent;


@@ -1,13 +1,14 @@
import React from "react";
import { IconButton } from "@chakra-ui/react";
import { Github } from "lucide-react";
import { toolbarButtonZIndex } from "./Toolbar";
import { IconButton } from '@chakra-ui/react';
import { Github } from 'lucide-react';
import React from 'react';
import { toolbarButtonZIndex } from './Toolbar';
export default function GithubButton() {
return (
<IconButton
as="a"
href="https://github.com/geoffsee"
href="https://github.com/geoffsee/open-gsio"
target="_blank"
aria-label="GitHub"
icon={<Github />}
@@ -16,10 +17,10 @@ export default function GithubButton() {
stroke="text.accent"
color="text.accent"
_hover={{
bg: "transparent",
bg: 'transparent',
svg: {
stroke: "accent.secondary",
transition: "stroke 0.3s ease-in-out",
stroke: 'accent.secondary',
transition: 'stroke 0.3s ease-in-out',
},
}}
title="GitHub"


@@ -1,8 +1,9 @@
import React from "react";
import { IconButton, useDisclosure } from "@chakra-ui/react";
import { LucideHeart } from "lucide-react";
import { toolbarButtonZIndex } from "./Toolbar";
import SupportThisSiteModal from "./SupportThisSiteModal";
import { IconButton, useDisclosure } from '@chakra-ui/react';
import { LucideHeart } from 'lucide-react';
import React from 'react';
import SupportThisSiteModal from './SupportThisSiteModal';
import { toolbarButtonZIndex } from './Toolbar';
export default function SupportThisSiteButton() {
const { isOpen, onOpen, onClose } = useDisclosure();
@@ -18,10 +19,10 @@ export default function SupportThisSiteButton() {
stroke="text.accent"
bg="transparent"
_hover={{
bg: "transparent",
bg: 'transparent',
svg: {
stroke: "accent.danger",
transition: "stroke 0.3s ease-in-out",
stroke: 'accent.danger',
transition: 'stroke 0.3s ease-in-out',
},
}}
title="Support"
@@ -29,9 +30,9 @@ export default function SupportThisSiteButton() {
zIndex={toolbarButtonZIndex}
sx={{
svg: {
stroke: "text.accent",
strokeWidth: "2px",
transition: "stroke 0.2s ease-in-out",
stroke: 'text.accent',
strokeWidth: '2px',
transition: 'stroke 0.2s ease-in-out',
},
}}
/>


@@ -1,4 +1,3 @@
import React from "react";
import {
Box,
Button,
@@ -19,26 +18,26 @@ import {
useClipboard,
useToast,
VStack,
} from "@chakra-ui/react";
import { QRCodeCanvas } from "qrcode.react";
import { FaBitcoin, FaEthereum } from "react-icons/fa";
import { observer } from "mobx-react-lite";
import clientTransactionStore from "../../stores/ClientTransactionStore";
import DogecoinIcon from "../icons/DogecoinIcon";
} from '@chakra-ui/react';
import { observer } from 'mobx-react-lite';
import { QRCodeCanvas } from 'qrcode.react';
import React from 'react';
import { FaBitcoin, FaEthereum } from 'react-icons/fa';
import clientTransactionStore from '../../stores/ClientTransactionStore';
import DogecoinIcon from '../icons/DogecoinIcon';
const SupportThisSiteModal = observer(({ isOpen, onClose, zIndex }) => {
const { hasCopied, onCopy } = useClipboard(
clientTransactionStore.depositAddress || "",
);
const { hasCopied, onCopy } = useClipboard(clientTransactionStore.depositAddress || '');
const toast = useToast();
const handleCopy = () => {
if (clientTransactionStore.depositAddress) {
onCopy();
toast({
title: "Address Copied!",
description: "Thank you for your support!",
status: "success",
title: 'Address Copied!',
description: 'Thank you for your support!',
status: 'success',
duration: 3000,
isClosable: true,
});
@@ -49,17 +48,17 @@ const SupportThisSiteModal = observer(({ isOpen, onClose, zIndex }) => {
try {
await clientTransactionStore.prepareTransaction();
toast({
title: "Success",
title: 'Success',
description: `Use your wallet app (Coinbase, etc.) to send the selected asset to the provided address.`,
status: "success",
status: 'success',
duration: 6000,
isClosable: true,
});
} catch (error) {
toast({
title: "Transaction Failed",
description: "There was an issue preparing your transaction.",
status: "error",
title: 'Transaction Failed',
description: 'There was an issue preparing your transaction.',
status: 'error',
duration: 3000,
isClosable: true,
});
@@ -68,29 +67,23 @@ const SupportThisSiteModal = observer(({ isOpen, onClose, zIndex }) => {
const donationMethods = [
{
name: "Ethereum",
name: 'Ethereum',
icon: FaEthereum,
},
{
name: "Bitcoin",
name: 'Bitcoin',
icon: FaBitcoin,
},
{
name: "Dogecoin",
name: 'Dogecoin',
icon: DogecoinIcon,
},
];
return (
<Modal
isOpen={isOpen}
onClose={onClose}
size="md"
motionPreset="slideInBottom"
zIndex={zIndex}
>
<ModalOverlay />
<ModalContent bg="gray.800" color="text.primary">
<Modal isOpen={isOpen} onClose={onClose} size="md" motionPreset="slideInBottom" zIndex={zIndex}>
<ModalOverlay bg="bg.primary" backdropFilter="blur(10px) hue-rotate(90deg)" />
<ModalContent bg="bg.primary" color="text.primary">
<ModalHeader textAlign="center" mb={2}>
Support
</ModalHeader>
@@ -103,85 +96,75 @@ const SupportThisSiteModal = observer(({ isOpen, onClose, zIndex }) => {
<Tabs
align="center"
variant="soft-rounded"
colorScheme="teal"
// colorScheme="teal"
isFitted
>
<TabList mb={2} w={"20%"}>
{donationMethods.map((method) => (
<TabList mb={2} w={'20%'}>
{donationMethods.map(method => (
<Tab
p={4}
key={method.name}
color={'text.primary'}
bg={
clientTransactionStore.selectedMethod === method.name
? 'bg.primary'
: 'bg.secondary'
}
onClick={() => {
clientTransactionStore.setSelectedMethod(method.name);
}}
>
<Box p={1} w={"fit-content"}>
<method.icon />{" "}
<Box p={1} w={'fit-content'}>
<method.icon />{' '}
</Box>
{method.name}
</Tab>
))}
</TabList>
<TabPanels>
{donationMethods.map((method) => (
{donationMethods.map(method => (
<TabPanel key={method.name}>
{!clientTransactionStore.userConfirmed ? (
<VStack spacing={4}>
<Text>Enter your information:</Text>
<Input
placeholder="Your name"
value={
clientTransactionStore.donerId as string | undefined
}
onChange={(e) =>
clientTransactionStore.setDonerId(e.target.value)
}
value={clientTransactionStore.donerId as string | undefined}
onChange={e => clientTransactionStore.setDonerId(e.target.value)}
type="text"
bg="gray.700"
color="white"
bg="bg.secondary"
color="text.primary"
w="100%"
/>
<Text>Enter the amount you wish to donate:</Text>
<Input
placeholder="Enter amount"
value={
clientTransactionStore.amount as number | undefined
}
onChange={(e) =>
clientTransactionStore.setAmount(e.target.value)
}
value={clientTransactionStore.amount as number | undefined}
onChange={e => clientTransactionStore.setAmount(e.target.value)}
type="number"
bg="gray.700"
color="white"
bg="bg.secondary"
// color="white"
w="100%"
/>
<Button
onClick={handleConfirmAmount}
size="md"
colorScheme="teal"
// colorScheme="teal"
>
Confirm Amount
</Button>
</VStack>
) : (
<>
<Box
bg="white"
p={2}
borderRadius="lg"
mb={4}
w={"min-content"}
>
<Box bg="white" p={2} borderRadius="lg" mb={4} w={'min-content'}>
<QRCodeCanvas
value={
clientTransactionStore.depositAddress as string
}
value={clientTransactionStore.depositAddress as string}
size={180}
/>
</Box>
<Box
bg="gray.700"
bg="bg.secondary"
p={4}
borderRadius="md"
wordBreak="unset"
@@ -196,10 +179,10 @@ const SupportThisSiteModal = observer(({ isOpen, onClose, zIndex }) => {
<Button
onClick={handleCopy}
size="md"
colorScheme="teal"
// colorScheme="teal"
mb={4}
>
{hasCopied ? "Address Copied!" : "Copy Address"}
{hasCopied ? 'Address Copied!' : 'Copy Address'}
</Button>
<Text fontSize="md" fontWeight="bold">
Transaction ID: {clientTransactionStore.txId}
@@ -213,7 +196,7 @@ const SupportThisSiteModal = observer(({ isOpen, onClose, zIndex }) => {
</VStack>
</ModalBody>
<ModalFooter>
<Button variant="outline" mr={3} onClick={onClose} colorScheme="gray">
<Button variant="outline" mr={3} onClick={onClose}>
Close
</Button>
</ModalFooter>


@@ -1,8 +1,10 @@
import React from "react";
import { Flex } from "@chakra-ui/react";
import SupportThisSiteButton from "./SupportThisSiteButton";
import GithubButton from "./GithubButton";
import BuiltWithButton from "../BuiltWithButton";
import { Flex } from '@chakra-ui/react';
import React from 'react';
import BuiltWithButton from '../BuiltWithButton';
import GithubButton from './GithubButton';
import SupportThisSiteButton from './SupportThisSiteButton';
const toolbarButtonZIndex = 901;
@@ -11,8 +13,8 @@ export { toolbarButtonZIndex };
function ToolBar({ isMobile }) {
return (
<Flex
direction={isMobile ? "row" : "column"}
alignItems={isMobile ? "flex-start" : "flex-end"}
direction={isMobile ? 'row' : 'column'}
alignItems={isMobile ? 'flex-start' : 'flex-end'}
pb={4}
>
<SupportThisSiteButton />

Some files were not shown because too many files have changed in this diff.