75 Commits
2.0 ... main

Author SHA1 Message Date
Geoff Seemueller
03c83b0a2e Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-08-16 10:22:17 -04:00
geoffsee
ae6a6e4064 Refactor model filtering logic into reusable basicFilters function. 2025-07-31 10:10:35 -04:00
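The refactor above centralizes model filtering in a shared predicate. A minimal sketch of what such a helper might look like (the real `basicFilters` signature is not shown in this log; the `supportsText` field is a hypothetical stand-in):

```typescript
// Hypothetical sketch: a reusable filter over provider model lists.
// `supportsText` is an assumed field, not confirmed by this log.
type Model = { id: string; supportsText?: boolean };

function basicFilters(models: Model[]): Model[] {
  // Keep only models usable for text chat.
  return models.filter(m => m.supportsText !== false);
}
```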
geoffsee
67483d08db Update model path handling logic for FireworksAI and refine supported model filtering. 2025-07-27 12:30:47 -04:00
geoffsee
53268b528d Update hero label for home route in renderer routes. 2025-07-27 09:32:46 -04:00
geoffsee
f9d5fc8282 Remove unused submodules and related scripts 2025-07-27 09:00:25 -04:00
geoffsee
ce9bc4db07 "Swap default states for mapActive and aiActive in LandingComponent" 2025-07-17 14:11:15 -04:00
geoffsee
bd71bfcad3 - Remove unused BevyScene and related dependencies.
- Refactor `InstallButton` and relocate it to `install/`.
- Update `Toolbar` imports to reflect the new `InstallButton` structure.
- Introduce `handleInstall` functionality for PWA installation prompt handling.
2025-07-17 14:04:47 -04:00
Geoff Seemueller
4edee1e191 Potential fix for code scanning alert no. 5: Shell command built from environment values
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-07-17 13:47:50 -04:00
geoffsee
734f48d4a7 remove webhost in assistant prompt 2025-07-17 13:47:50 -04:00
geoffsee
66363cdf39 set ai as the default landing 2025-07-17 13:47:50 -04:00
geoffsee
36f8fcee87 Integrate PWA service worker registration using virtual:pwa-register. 2025-07-17 13:47:50 -04:00
geoffsee
f055cd39fe Update InputMenu to use clientChatStore.reset() instead of setActiveConversation when closing. 2025-07-17 13:47:50 -04:00
geoffsee
0183503425 Refactored layout components and styling: removed unused imports, adjusted positioning and padding for consistency. 2025-07-17 13:47:50 -04:00
geoffsee
a7ad06093a simplify landing page for my peace 2025-07-17 13:47:50 -04:00
geoffsee
c26d2467f4 sweet lander 2025-07-17 13:47:50 -04:00
geoffsee
818e0e672a chat + maps + ai + tools 2025-07-17 13:47:50 -04:00
geoffsee
48655474e3 mirror error handling behavior in cloudflare worker 2025-07-17 13:47:50 -04:00
geoffsee
ffabfd4ce5 add top level error handler to the router 2025-07-17 13:47:50 -04:00
geoffsee
fa5b7466bc Optimize WASM handling and integrate service worker caching.
Removed unused pointer events in BevyScene, updated Vite config with Workbox for service worker caching, and adjusted file paths in generate-bevy-bundle.js. Added WASM size optimization for smaller, more efficient builds, skipping optimization for files below 30MB.
2025-07-17 13:47:50 -04:00
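The 30 MB gate described above could be sketched as follows; only the threshold comes from the commit message, while the function name and file check are illustrative assumptions:

```typescript
// Hypothetical sketch of a size gate before running a wasm-opt pass.
// The 30 MB threshold is from the commit message; everything else is illustrative.
import { statSync } from 'node:fs';

const SKIP_BELOW_BYTES = 30 * 1024 * 1024; // 30 MB

function shouldOptimize(wasmPath: string): boolean {
  // Small bundles are left as-is; only large ones get the optimization pass.
  return statSync(wasmPath).size >= SKIP_BELOW_BYTES;
}
```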
geoffsee
6cc5e038a7 Add visible prop to toggle components and simplify conditional rendering 2025-07-17 13:47:50 -04:00
geoffsee
e72198628c Add "Install App" button to the toolbar using react-use-pwa-install library 2025-07-17 13:47:50 -04:00
geoffsee
c0428094c8 **Integrate PWA asset generator and update favicon and manifest configuration** 2025-07-17 13:47:50 -04:00
geoffsee
3901337163 - Refactor BevyScene to replace script injection with dynamic import.
- Update `NavItem` to provide fallback route for invalid `path`.
- Temporarily stub metric API endpoints with placeholders.
2025-07-17 13:47:50 -04:00
geoffsee
0ff8b5c03e * Introduced BevyScene React component in landing-component for rendering a 3D cockpit visualization.
* Included WebAssembly asset `yachtpit.js` for cockpit functionality.
* Added Bevy MIT license file.
* Implemented a service worker to cache assets locally instead of fetching them remotely.
* Added collapsible functionality to **Tweakbox** and included the `@chakra-ui/icons` dependency.
* Applied the `hidden` prop to the Tweakbox Heading for better accessibility.
* Refactored **Particles** component for improved performance, clarity, and maintainability.

  * Introduced helper functions for particle creation and count management.
  * Added responsive resizing with particle repositioning.
  * Optimized animation updates, including velocity adjustments for speed changes.
  * Ensured canvas size and particle state are cleanly managed on component unmount.
2025-07-17 13:47:50 -04:00
geoffsee
858282929c Refactor chat-stream-provider to simplify tool structure. Optimize WeatherTool implementation with enriched function schema. 2025-07-17 13:47:50 -04:00
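An "enriched function schema" in the OpenAI tool-calling format typically adds per-parameter descriptions and constraints. A hypothetical shape (the actual WeatherTool definition is not shown in this log):

```typescript
// Hypothetical sketch of an enriched tool schema; names and fields are illustrative.
const weatherTool = {
  type: 'function' as const,
  function: {
    name: 'get_weather',
    description: 'Look up the current weather for a location.',
    parameters: {
      type: 'object',
      properties: {
        location: { type: 'string', description: 'City and region, e.g. "Boston, MA"' },
        unit: { type: 'string', enum: ['celsius', 'fahrenheit'] },
      },
      required: ['location'],
    },
  },
};
```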
geoffsee
06b6a68b9b Enable tool-based message generation in chat-stream-provider and add BasicValueTool and WeatherTool.
Updated dependencies to latest versions in `bun.lock`. Modified development script in `package.json` to include watch mode.
2025-07-17 13:47:50 -04:00
dependabot[bot]
de968bcfbd Bump dotenv from 16.6.1 to 17.0.0
Bumps [dotenv](https://github.com/motdotla/dotenv) from 16.6.1 to 17.0.0.
- [Changelog](https://github.com/motdotla/dotenv/blob/master/CHANGELOG.md)
- [Commits](https://github.com/motdotla/dotenv/compare/v16.6.1...v17.0.0)

---
updated-dependencies:
- dependency-name: dotenv
  dependency-version: 17.0.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-01 11:00:15 -04:00
dependabot[bot]
6e8d9f3534 Bump react-streaming from 0.3.50 to 0.4.2
Bumps [react-streaming](https://github.com/brillout/react-streaming) from 0.3.50 to 0.4.2.
- [Changelog](https://github.com/brillout/react-streaming/blob/main/CHANGELOG.md)
- [Commits](https://github.com/brillout/react-streaming/compare/v0.3.50...v0.4.2)

---
updated-dependencies:
- dependency-name: react-streaming
  dependency-version: 0.4.2
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-01 10:59:30 -04:00
geoffsee
57ad9df087 fix wrangler config schema ref 2025-06-26 14:21:11 -04:00
geoffsee
610cb711a4 fix eslint version to 8 2025-06-26 12:40:54 -04:00
geoffsee
8cba09e67b - Add cache refresh mechanism for providers in ChatService
- Implemented tests to validate caching logic based on provider changes
- Enhanced caching logic to include a provider signature for more precise cache validation
2025-06-25 19:12:14 -04:00
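One way to read "provider signature": derive a stable key from the active provider set and refresh the cache only when that key changes. A sketch under that assumption (names are illustrative, not the actual ChatService API):

```typescript
// Hypothetical sketch of signature-based cache invalidation; not the real ChatService code.
function providerSignature(providers: string[]): string {
  // Order-insensitive key for the active provider set.
  return [...providers].sort().join('|');
}

let cachedSignature: string | null = null;
let cachedModels: string[] = [];

function getModels(providers: string[], fetchModels: () => string[]): string[] {
  const sig = providerSignature(providers);
  if (sig !== cachedSignature) {
    // Provider set changed (or first call): refresh the cache.
    cachedModels = fetchModels();
    cachedSignature = sig;
  }
  return cachedModels;
}
```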
geoffsee
c8e6da2d15 Add Docker support with Dockerfile and docker-compose.yml, update build scripts and README for containerized deployment.
- Updated server `Bun.build` configuration: adjusted `outdir`, added `format` as `esm`, and set `@open-gsio/client` to external.
- Expanded README with Docker instructions.
- Added new package `@open-gsio/analytics-worker`.
- Upgraded dependencies (`vite`, `typescript`, `bun`) and locked `pnpm` version in `package.json`.
2025-06-25 18:13:52 -04:00
geoffsee
1dab5aaa14 Bun server handles static assets and api 2025-06-25 16:46:46 -04:00
geoffsee
a295c208e9 Update React, React-DOM, and related dependencies to latest versions. 2025-06-25 16:30:42 -04:00
dependabot[bot]
713f0ffe8b Bump @anthropic-ai/sdk from 0.32.1 to 0.54.0
Bumps [@anthropic-ai/sdk](https://github.com/anthropics/anthropic-sdk-typescript) from 0.32.1 to 0.54.0.
- [Release notes](https://github.com/anthropics/anthropic-sdk-typescript/releases)
- [Changelog](https://github.com/anthropics/anthropic-sdk-typescript/blob/main/CHANGELOG.md)
- [Commits](https://github.com/anthropics/anthropic-sdk-typescript/compare/sdk-v0.32.1...sdk-v0.54.0)

---
updated-dependencies:
- dependency-name: "@anthropic-ai/sdk"
  dependency-version: 0.54.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-25 16:22:59 -04:00
dependabot[bot]
a793bfe8e0 Bump react-dom from 18.3.1 to 19.1.0
Bumps [react-dom](https://github.com/facebook/react/tree/HEAD/packages/react-dom) from 18.3.1 to 19.1.0.
- [Release notes](https://github.com/facebook/react/releases)
- [Changelog](https://github.com/facebook/react/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/react/commits/v19.1.0/packages/react-dom)

---
updated-dependencies:
- dependency-name: react-dom
  dependency-version: 19.1.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-25 16:22:51 -04:00
dependabot[bot]
d594929998 Bump @testing-library/react from 14.3.1 to 16.3.0
Bumps [@testing-library/react](https://github.com/testing-library/react-testing-library) from 14.3.1 to 16.3.0.
- [Release notes](https://github.com/testing-library/react-testing-library/releases)
- [Changelog](https://github.com/testing-library/react-testing-library/blob/main/CHANGELOG.md)
- [Commits](https://github.com/testing-library/react-testing-library/compare/v14.3.1...v16.3.0)

---
updated-dependencies:
- dependency-name: "@testing-library/react"
  dependency-version: 16.3.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-25 16:22:36 -04:00
geoffsee
6d9bf79ba3 Update tests to use updated HUMAN/ASSISTANT format instead of **Human**/**Assistant**. 2025-06-25 16:16:23 -04:00
geoffsee
6b5928de7f Update AssetService SSR handling tests: refine mocks and add edge cases 2025-06-25 16:12:59 -04:00
geoffsee
f9249f3496 - Refactored to introduce handleSsr function in @open-gsio/client/server/index.ts for streamlined SSR handling.
- Replaced inline SSR logic in `AssetService.ts` with `handleSsr` import.
- Enhanced `build:client` script to ensure server directory creation.
- Updated dependencies and devDependencies across multiple packages for compatibility improvements.
2025-06-25 16:03:13 -04:00
geoffsee
93bec55585 Add bun wrangler tail log script and filter non-text models 2025-06-25 14:32:54 -04:00
geoffsee
8cdb6b8c94 - Refine assistant output formatting by removing bold headers and adjusting response template.
- Update `package.json` across multiple packages to include missing newline and add package manager metadata.
- Minor README formatting fixes to remove unnecessary trailing spaces.
2025-06-25 14:15:01 -04:00
geoffsee
48bedb8c74 fix nonexistent suite 2025-06-25 14:00:16 -04:00
geoffsee
068d8614e0 tests updated with new import 2025-06-25 14:00:16 -04:00
geoffsee
554096abb2 wip 2025-06-25 14:00:16 -04:00
geoffsee
21d6c8604e github button targets repo 2025-06-24 20:56:08 -04:00
geoffsee
de3173a8f8 add missing files to last commit 2025-06-24 20:46:36 -04:00
geoffsee
c6e09644e2 **Refactor:** Restructure server package to streamline imports and improve file organization
- Moved `providers`, `services`, `models`, `lib`, and related files to `src` directory within `server` package.
- Adjusted imports across the codebase to reflect the new paths.
- Renamed several `.ts` files for consistency.
- Introduced an `index.ts` in the `ai/providers` package to export all providers.

This improves maintainability and aligns with the project's updated directory structure.
2025-06-24 20:46:15 -04:00
geoffsee
0b8d67fc69 remove package manager spec 2025-06-24 17:36:39 -04:00
geoffsee
f76301d620 run format 2025-06-24 17:32:59 -04:00
geoffsee
02c3253343 adds eslint 2025-06-24 17:32:59 -04:00
geoffsee
9698fc6f3b Refactor project: remove unused code, clean up logs, streamline error handling, update TypeScript configs, and enhance message streaming.
- Deployed
2025-06-24 16:28:25 -04:00
geoffsee
004ec580d3 Remove unused ResumeComponent, ServicesComponent, and related sections. Update theming for SupportThisSiteModal, adjust DogecoinIcon, and refine Cloudflare worker references. 2025-06-24 15:51:39 -04:00
geoffsee
bdbc8de6d5 **Remove dead links and redundant comments; improve styling and clarity across multiple files**
- Removed outdated links and unused properties in Sidebar and Welcome Home Text files.
- Dropped extraneous comments and consolidated imports in server files for streamlined code.
- Enhanced MarkdownEditor visuals with a colorful border for better user experience.
2025-06-24 15:23:34 -04:00
geoffsee
a367812fe7 update prompts and ollama endpoint 2025-06-24 15:12:12 -04:00
geoffsee
22bf2f1c2f Fix provider endpoints 2025-06-24 15:01:43 -04:00
geoffsee
02ede2b0f6 Refactor ServerCoordinator and project structure for clearer durable objects organization and module imports. 2025-06-18 15:53:17 -04:00
geoffsee
afc46fe2c3 fix tests 2025-06-18 15:02:29 -04:00
geoffsee
b7f02eb4fb fix mlx omni provider 2025-06-18 14:33:07 -04:00
geoffsee
f1d7f52dbd fixes model initialization for mlx 2025-06-18 13:30:38 -04:00
geoffsee
38b364caeb fix local inference config 2025-06-18 12:38:38 -04:00
geoffsee
3d16bd94b4 **Refactor imports and improve type annotations**
- Adjusted import statements across the codebase to align with consistent use of `type`.
- Unified usage of `EventSource` initialization.
- Introduced `RootDeps` type for shared dependencies.
- Commented out unused VitePWA configuration.
- Updated proxy target URLs in Vite configuration.
2025-06-18 12:34:16 -04:00
geoffsee
7454c9b54b fix build 2025-06-18 10:41:39 -04:00
geoffsee
0c999e0400 fixes tests 2025-06-09 23:18:52 -04:00
geoffsee
362f50bf85 remove faulty test execution pattern 2025-06-09 23:18:52 -04:00
geoffsee
9e79c488ee correct README 2025-06-09 23:18:52 -04:00
geoffsee
370c3e5717 adjust README and local inference configuration script 2025-06-09 23:18:52 -04:00
geoffsee
f29bb6779c improves interoperability of model providers; local and remote providers can be used together seamlessly 2025-06-09 23:18:52 -04:00
Geoff Seemueller
ad7dc5c0a6 Update README.md
improve semantics

Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-05 14:04:08 -04:00
Geoff Seemueller
059e7d3218 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-04 20:19:12 -04:00
geoffsee
6be0316e75 add some missing to last 2025-06-04 20:09:39 -04:00
geoffsee
5bd1e2f77f add Acknowledgments section to README 2025-06-04 20:05:02 -04:00
geoffsee
03aae4d8db fix static fileserver 2025-06-04 19:00:10 -04:00
geoffsee
5d7a7b740a fix package script for server:dev 2025-06-04 18:52:39 -04:00
geoffsee
31d734d4f6 fix incorrect constructor usage 2025-06-04 18:50:59 -04:00
282 changed files with 8652 additions and 6473 deletions

3
.dockerignore Normal file

@@ -0,0 +1,3 @@
/.wrangler/
/.open-gsio/
/node_modules/

41
.eslintignore Normal file

@@ -0,0 +1,41 @@
# Dependencies
node_modules
.pnp
.pnp.js
# Build outputs
dist
build
out
.next
.nuxt
.cache
# Test coverage
coverage
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local
# Editor directories and files
.idea
.vscode
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
# TypeScript
*.d.ts

49
.eslintrc.cjs Normal file

@@ -0,0 +1,49 @@
module.exports = {
root: true,
parser: '@typescript-eslint/parser',
parserOptions: {
ecmaVersion: 2021,
sourceType: 'module',
project: './tsconfig.json',
},
env: {
browser: true,
node: true,
es6: true,
},
globals: {
Bun: 'readonly',
},
plugins: ['@typescript-eslint', 'import', 'prettier'],
extends: [
'eslint:recommended',
'plugin:@typescript-eslint/recommended',
'plugin:import/errors',
'plugin:import/warnings',
'plugin:import/typescript',
'prettier',
],
rules: {
'prettier/prettier': 'error',
'@typescript-eslint/explicit-module-boundary-types': 'off',
'@typescript-eslint/no-explicit-any': 'warn',
'@typescript-eslint/no-unused-vars': ['warn', { argsIgnorePattern: '^_' }],
'import/order': [
'error',
{
'newlines-between': 'always',
alphabetize: { order: 'asc', caseInsensitive: true },
groups: ['builtin', 'external', 'internal', 'parent', 'sibling', 'index'],
},
],
},
settings: {
'import/resolver': {
node: {
extensions: ['.js', '.jsx', '.ts', '.tsx'],
moduleDirectory: ['node_modules', 'packages/*/node_modules'],
},
},
},
ignorePatterns: ['node_modules', 'dist', 'build', '*.d.ts', '*.min.js'],
};

17
.gitignore vendored

@@ -7,10 +7,21 @@
**/.idea/
**/html/
**/.env
packages/client/public/static/fonts/*
**/secrets.json
**/.dev.vars
packages/client/public/sitemap.xml
packages/client/public/robots.txt
wrangler.dev.jsonc
/packages/client/public/static/fonts/
/packages/client/public/robots.txt
/packages/client/public/sitemap.xml
/packages/client/public/yachtpit.html
/packages/client/public/yachtpit.js
/packages/client/public/yachtpit_bg.wasm
/packages/client/public/assets/
/packages/client/public/apple-touch-icon-180x180.png
/packages/client/public/icon.ico
/packages/client/public/maskable-icon-512x512.png
/packages/client/public/pwa-64x64.png
/packages/client/public/pwa-192x192.png
/packages/client/public/pwa-512x512.png
packages/client/public/yachtpit_bg*

47
.prettierignore Normal file

@@ -0,0 +1,47 @@
# Dependencies
node_modules
.pnp
.pnp.js
# Build outputs
dist
build
out
.next
.nuxt
.cache
# Test coverage
coverage
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local
# Editor directories and files
.idea
.vscode
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
# Package files
package-lock.json
yarn.lock
pnpm-lock.yaml
bun.lock
# Generated files
CHANGELOG.md

19
.prettierrc.cjs Normal file

@@ -0,0 +1,19 @@
module.exports = {
semi: true,
singleQuote: true,
trailingComma: 'all',
printWidth: 100,
tabWidth: 2,
useTabs: false,
bracketSpacing: true,
arrowParens: 'avoid',
endOfLine: 'lf',
overrides: [
{
files: '*.{json,yml,yaml,md}',
options: {
tabWidth: 2,
},
},
],
};

51
Dockerfile Normal file

@@ -0,0 +1,51 @@
FROM oven/bun:latest
WORKDIR /app
# Copy package files first for better caching
COPY package.json bun.lock ./
# Create directory structure for all packages
RUN mkdir -p packages/ai packages/ai/src/types packages/client packages/coordinators packages/env packages/router packages/schema packages/scripts packages/server packages/services packages/cloudflare-workers/analytics packages/cloudflare-workers/open-gsio
# Copy package.json files for all packages
COPY packages/ai/package.json ./packages/ai/
COPY packages/ai/src/types/package.json ./packages/ai/src/types/
COPY packages/client/package.json ./packages/client/
COPY packages/coordinators/package.json ./packages/coordinators/
COPY packages/env/package.json ./packages/env/
COPY packages/router/package.json ./packages/router/
COPY packages/schema/package.json ./packages/schema/
COPY packages/scripts/package.json ./packages/scripts/
COPY packages/server/package.json ./packages/server/
COPY packages/services/package.json ./packages/services/
COPY packages/cloudflare-workers/analytics/package.json ./packages/cloudflare-workers/analytics/
COPY packages/cloudflare-workers/open-gsio/package.json ./packages/cloudflare-workers/open-gsio/
# Install dependencies
RUN bun install
# Copy the rest of the application
COPY . .
# Create .env file if it doesn't exist
RUN touch ./packages/server/.env
# Build client and server
RUN bun build:client && bun build:server
# Ensure the client directories exist
RUN mkdir -p ./client/public ./client/dist/client
# Copy client files to the expected locations
RUN cp -r ./packages/client/public/* ./client/public/ || true
RUN cp -r ./packages/client/dist/* ./client/dist/ || true
EXPOSE 3003
# Verify server.js exists
RUN test -f ./packages/server/dist/server.js || (echo "Error: server.js not found" && exit 1)
CMD ["bun", "./packages/server/dist/server.js"]


@@ -1,60 +1,60 @@
Legacy Development History
---
## Legacy Development History
The source code of open-gsio was drawn from the source code of my personal website. That commit history was contaminated early on with secrets. `open-gsio` is a refinement of those sources. A total of 367 commits were submitted to the main branch of the upstream source repository between August 2024 and May 2025.
#### **May 2025**
* Added **seemueller.ai** link to UI sidebar.
* Global config/markdown guide cleanup; patched a critical forgotten bug.
- Added **seemueller.ai** link to UI sidebar.
- Global config/markdown guide cleanup; patched a critical forgotten bug.
#### **Apr 2025**
* **CI/CD overhaul**: auto-deploy to dev & staging, Bun adoption as package manager, streamlined blocklist workflow (now auto-updates via VPN blocker).
* New 404 error page; multiple robots.txt and editor-resize fixes; removed dead/duplicate code.
- **CI/CD overhaul**: auto-deploy to dev & staging, Bun adoption as package manager, streamlined blocklist workflow (now auto-updates via VPN blocker).
- New 404 error page; multiple robots.txt and editor-resize fixes; removed dead/duplicate code.
#### **Mar 2025**
* Introduced **model-specific `max_tokens`** handling and plugged in **Cloudflare AI models** for testing.
* Bundle size minimised (re-enabled minifier, smaller vendor set).
- Introduced **model-specific `max_tokens`** handling and plugged in **Cloudflare AI models** for testing.
- Bundle size minimised (re-enabled minifier, smaller vendor set).
#### **Feb 2025**
* **Full theme system** (runtime switching, Centauri theme, server-saved prefs).
* Tightened MobX typing for messages; responsive breakpoints & input scaling repaired.
* Dropped legacy document API; general folder restructure.
- **Full theme system** (runtime switching, Centauri theme, server-saved prefs).
- Tightened MobX typing for messages; responsive breakpoints & input scaling repaired.
- Dropped legacy document API; general folder restructure.
#### **Jan 2025**
* **Rate-limit middleware**, larger KV/R2 storage quota.
* Switched default model → *llama-v3p1-70b-instruct*; pluggable model handlers.
* Added **KaTeX fonts** & **Marked.js** for rich math/markdown.
* Fireworks key rotation; deprecated Google models removed.
- **Rate-limit middleware**, larger KV/R2 storage quota.
- Switched default model → _llama-v3p1-70b-instruct_; pluggable model handlers.
- Added **KaTeX fonts** & **Marked.js** for rich math/markdown.
- Fireworks key rotation; deprecated Google models removed.
#### **Dec 2024**
* Major package upgrades; **CodeHighlighter** now supports HTML/JSX/TS(X)/Zig.
* Refactored streaming + markdown renderer; Android-specific padding fixes.
* Reset default chat model to **gpt-4o**; welcome message & richer search-intent logic.
- Major package upgrades; **CodeHighlighter** now supports HTML/JSX/TS(X)/Zig.
- Refactored streaming + markdown renderer; Android-specific padding fixes.
- Reset default chat model to **gpt-4o**; welcome message & richer search-intent logic.
#### **Nov 2024**
* **Fireworks API** + agent server; first-class support for **Anthropic** & **GROQ** models (incl. attachments).
* **VPN blocker** shipped with CIDR validation and dedicated GitHub Action.
* Live search buffering, feedback modal, smarter context preprocessing.
- **Fireworks API** + agent server; first-class support for **Anthropic** & **GROQ** models (incl. attachments).
- **VPN blocker** shipped with CIDR validation and dedicated GitHub Action.
- Live search buffering, feedback modal, smarter context preprocessing.
#### **Oct 2024**
* Rolled out **image generation** + picker for image models.
* Deployed **ETH payment processor** & deposit-address flow.
* Introduced few-shot prompting library; analytics worker refactor; Halloween prompt.
* Extensive mobile-UX polish and bundling/worker config updates.
- Rolled out **image generation** + picker for image models.
- Deployed **ETH payment processor** & deposit-address flow.
- Introduced few-shot prompting library; analytics worker refactor; Halloween prompt.
- Extensive mobile-UX polish and bundling/worker config updates.
#### **Sep 2024**
* End-to-end **math rendering** (KaTeX) and **GitHub-flavoured markdown**.
* Migrated chat state to **MobX**; launched analytics service & metrics worker.
* Switched build minifier to **esbuild**; tokenizer limits enforced; gradient sidebar & cookie-consent manager added.
- End-to-end **math rendering** (KaTeX) and **GitHub-flavoured markdown**.
- Migrated chat state to **MobX**; launched analytics service & metrics worker.
- Switched build minifier to **esbuild**; tokenizer limits enforced; gradient sidebar & cookie-consent manager added.
#### **Aug 2024**
* **Initial MVP**: iMessage-style chat UI, websocket prototype, Google Analytics, Cloudflare bindings, base worker-site scaffold.
- **Initial MVP**: iMessage-style chat UI, websocket prototype, Google Analytics, Cloudflare bindings, base worker-site scaffold.

141
README.md

@@ -1,21 +1,20 @@
# open-gsio
> Rewrite in-progress.
[![Tests](https://github.com/geoffsee/open-gsio/actions/workflows/test.yml/badge.svg)](https://github.com/geoffsee/open-gsio/actions/workflows/test.yml)
[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)
</br>
<p align="center">
<img src="https://github.com/user-attachments/assets/620d2517-e7be-4bb0-b2b7-3aa0cba37ef0" width="250" />
</p>
> **Note**: This project is currently under active development. The styling is a work in progress and some functionality
> may be broken. Tests are being actively ported and stability will improve over time. Thank you for your patience!
This is a full-stack Conversational AI. It runs on Cloudflare or Bun.
This is a full-stack Conversational AI.
## Table of Contents
- [Stack](#stack)
- [Installation](#installation)
- [Deployment](#deployment)
- [Docker](#docker)
- [Local Inference](#local-inference)
- [mlx-omni-server (default)](#mlx-omni-server)
- [Adding models](#adding-models-for-local-inference-apple-silicon)
@@ -23,20 +22,9 @@ This is a full-stack Conversational AI. It runs on Cloudflare or Bun.
- [Adding models](#adding-models-for-local-inference-ollama)
- [Testing](#testing)
- [Troubleshooting](#troubleshooting)
- [History](#history)
- [Acknowledgments](#acknowledgments)
- [License](#license)
## Stack
* [TypeScript](https://www.typescriptlang.org/)
* [Vike](https://vike.dev/)
* [React](https://react.dev/)
* [Cloudflare Workers](https://developers.cloudflare.com/workers/)
* [itty-router](https://github.com/kwhitley/itty-router)
* [MobX-State-Tree](https://mobx-state-tree.js.org/)
* [OpenAI SDK](https://github.com/openai/openai-node)
* [Vitest](https://vitest.dev/)
## Installation
1. `bun i && bun test:all`
@@ -46,19 +34,75 @@ This is a full-stack Conversational AI. It runs on Cloudflare or Bun.
> Note: it should be possible to use pnpm in place of bun.
## Deployment
1. Setup KV_STORAGE binding in `packages/server/wrangler.jsonc`
1. [Add keys in secrets.json](https://console.groq.com/keys)
1. Run `bun run deploy && bun run deploy:secrets && bun run deploy`
> Note: Subsequent deployments should omit `bun run deploy:secrets`
## Docker
You can run the server using Docker. The image is large but will be slimmed down in future commits.
### Building the Docker Image
```bash
docker compose build
# OR
docker build -t open-gsio .
```
### Running the Docker Container
```bash
docker run -p 3003:3003 \
-e GROQ_API_KEY=your_groq_api_key \
-e FIREWORKS_API_KEY=your_fireworks_api_key \
open-gsio
```
You can omit any environment variables that you don't need. The server will be available at http://localhost:3003.
### Using Docker Compose
A `docker-compose.yml` file is provided in the repository. You can edit it to add your API keys:
```yaml
version: '3'
services:
open-gsio:
build: .
ports:
- "3003:3003"
environment:
- GROQ_API_KEY=your_groq_api_key
- FIREWORKS_API_KEY=your_fireworks_api_key
# Other environment variables are included in the file
restart: unless-stopped
```
Then run:
```bash
docker compose up
```
Or to run in detached mode:
```bash
docker compose up -d
```
## Local Inference
> Local inference is achieved by overriding the `OPENAI_API_KEY` and `OPENAI_API_ENDPOINT` environment variables. See below.
> Local inference is supported for Ollama and mlx-omni-server. OpenAI-compatible servers can be used by overriding `OPENAI_API_KEY` and `OPENAI_API_ENDPOINT`.
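A minimal sketch of such an override, assuming an OpenAI-compatible server listening locally (the port and key value are illustrative; `openai:local:configure` handles this in this repo):

```shell
# Sketch only: point the client at a local OpenAI-compatible server.
# Substitute your server's actual address; many local servers ignore the key value.
export OPENAI_API_KEY="sk-local-placeholder"
export OPENAI_API_ENDPOINT="http://localhost:11434/v1"
echo "$OPENAI_API_ENDPOINT"
```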
### mlx-omni-server
(default) (Apple Silicon Only) - Use Ollama for other platforms.
~~~bash
(default) (Apple Silicon Only)
```bash
# (prereq) install mlx-omni-server
brew tap seemueller-io/tap
brew install seemueller-io/tap/mlx-omni-server
@@ -66,10 +110,11 @@ brew install seemueller-io/tap/mlx-omni-server
bun run openai:local mlx-omni-server # Start mlx-omni-server
bun run openai:local:configure # Configure connection
bun run server:dev # Restart server
~~~
```
#### Adding models for local inference (Apple Silicon)
~~~bash
```bash
# ensure mlx-omni-server is running
# See https://huggingface.co/mlx-community for available models
@@ -81,21 +126,22 @@ curl http://localhost:10240/v1/chat/completions \
\"model\": \"$MODEL_TO_ADD\",
\"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]
}"
~~~
```
### Ollama
~~~bash
```bash
bun run openai:local ollama # Start ollama server
bun run openai:local:configure # Configure connection
bun run server:dev # Restart server
~~~
```
#### Adding models for local inference (ollama)
~~~bash
```bash
# See https://ollama.com/library for available models
# use the ollama web UI at http://localhost:8080
~~~
```
## Testing
@@ -103,20 +149,46 @@ Tests are located in `__tests__` directories next to the code they test. Testing
> `bun test:all` will run all tests
## Troubleshooting
1. `bun clean`
1. `bun i`
1. `bun server:dev`
1. `bun client:dev`
1. Submit an issue
History
---
## History
A high-level overview of the development history of the parent repository, [geoff-seemueller-io](https://geoff.seemueller.io), is provided in [LEGACY.md](./LEGACY.md).
## Acknowledgments
I would like to express gratitude to the following projects, libraries, and individuals that have contributed to making open-gsio possible:
- [TypeScript](https://www.typescriptlang.org/) - Primary programming language
- [React](https://react.dev/) - UI library for building the frontend
- [Vike](https://vike.dev/) - Framework for server-side rendering and routing
- [Cloudflare Workers](https://developers.cloudflare.com/workers/) - Serverless execution environment
- [Bun](https://bun.sh/) - JavaScript runtime and toolkit
- [Marked.js](https://github.com/markedjs/marked) - Markdown Rendering
- [Shiki](https://github.com/shikijs/shiki) - Syntax Highlighting
- [itty-router](https://github.com/kwhitley/itty-router) - Lightweight router for serverless environments
- [MobX-State-Tree](https://mobx-state-tree.js.org/) - State management solution
- [OpenAI SDK](https://github.com/openai/openai-node) - Client for AI model integration
- [Vitest](https://vitest.dev/) - Testing framework
- [OpenAI](https://github.com/openai)
- [Groq](https://console.groq.com/) - Fast inference API
- [Anthropic](https://www.anthropic.com/) - Creator of Claude models
- [Fireworks](https://fireworks.ai/) - AI inference platform
- [XAI](https://x.ai/) - Creator of Grok models
- [Cerebras](https://www.cerebras.net/) - AI compute and models
- [(madroidmaq) MLX Omni Server](https://github.com/madroidmaq/mlx-omni-server) - Open-source high-performance inference for Apple Silicon
- [MLX](https://github.com/ml-explore/mlx) - An array framework for Apple silicon
- [Ollama](https://github.com/ollama/ollama) - Versatile solution for self-hosting models
## License
~~~text
```text
MIT License
Copyright (c) 2025 Geoff Seemueller
@@ -138,7 +210,4 @@ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
~~~
```

1092
bun.lock

File diff suppressed because it is too large.

BIN
bun.lockb

Binary file not shown.

13
docker-compose.yml Normal file

@@ -0,0 +1,13 @@
version: '3'
services:
open-gsio:
image: open-gsio:latest
build:
pull: false
context: .
dockerfile: Dockerfile
ports:
- "3003:3003"
env_file:
- ./packages/server/.env
restart: unless-stopped


@@ -12,19 +12,37 @@
"clean": "packages/scripts/cleanup.sh",
"test:all": "bun run --filter='*' tests",
"client:dev": "(cd packages/client && bun run dev)",
"server:dev": "(cd packages/server && bun run dev)",
"build": "(cd packages/cloudflare-workers && bun run deploy:dry-run)",
"deploy": "(cd packages/cloudflare-workers && bun run deploy)",
"server:dev": "bun build:client && (cd packages/server && bun run dev)",
"build": "(cd packages/cloudflare-workers/open-gsio && bun run deploy:dry-run)",
"build:client": "(cd packages/client && bun run vite build)",
"build:server": "bun --filter=@open-gsio/server run build",
"deploy": "(cd packages/cloudflare-workers/open-gsio && bun run deploy)",
"deploy:secrets": "wrangler secret bulk secrets.json -c packages/cloudflare-workers/open-gsio/wrangler.jsonc",
"openai:local:mlx": "packages/scripts/start_inference_server.sh mlx-omni-server",
"openai:local:ollama": "packages/scripts/start_inference_server.sh ollama",
"openai:local:configure": "packages/scripts/configure_local_inference.sh"
"openai:local:configure": "packages/scripts/configure_local_inference.sh",
"lint": "eslint . --ext .js,.jsx,.ts,.tsx",
"lint:fix": "eslint . --ext .js,.jsx,.ts,.tsx --fix",
"format": "prettier --write \"**/*.{js,jsx,ts,tsx,json,md}\"",
"format:check": "prettier --check \"**/*.{js,jsx,ts,tsx,json,md}\"",
"log": "(cd packages/cloudflare-workers/open-gsio && bun wrangler tail)"
},
"devDependencies": {
"@types/bun": "latest"
"@types/bun": "^1.2.17",
"@typescript-eslint/eslint-plugin": "^8.35.0",
"@typescript-eslint/parser": "^8.35.0",
"eslint": "^8",
"eslint-config-prettier": "^10.1.5",
"eslint-plugin-import": "^2.32.0",
"eslint-plugin-prettier": "^5.5.1",
"happy-dom": "^18.0.1",
"prettier": "^3.6.1"
},
"peerDependencies": {
"typescript": "^5"
"typescript": "^5.8.3"
},
"dependencies": {
"@chakra-ui/icons": "^2.2.4"
},
"packageManager": "pnpm@10.10.0+sha512.d615db246fe70f25dcfea6d8d73dee782ce23e2245e3c4f6f888249fb568149318637dca73c2c5c8ef2a4ca0d5657fb9567188bfab47f566d1ee6ce987815c39"
}


@@ -1 +0,0 @@
export * from "./supported-models.ts";


@@ -1,4 +1,48 @@
{
"name": "@open-gsio/ai",
"module": "index.ts"
"type": "module",
"module": "src/index.ts",
"exports": {
".": {
"import": "./src/index.ts",
"types": "./src/index.ts"
},
"./chat-sdk/chat-sdk.ts": {
"import": "./src/chat-sdk/chat-sdk.ts",
"types": "./src/chat-sdk/chat-sdk.ts"
},
"./providers/_ProviderRepository.ts": {
"import": "./src/providers/_ProviderRepository.ts",
"types": "./src/providers/_ProviderRepository.ts"
},
"./providers/google.ts": {
"import": "./src/providers/google.ts",
"types": "./src/providers/google.ts"
},
"./providers/openai.ts": {
"import": "./src/providers/openai.ts",
"types": "./src/providers/openai.ts"
},
"./src": {
"import": "./src/index.ts",
"types": "./src/index.ts"
},
"./utils": {
"import": "./src/utils/index.ts",
"types": "./src/utils/index.ts"
}
},
"scripts": {
"tests": "vitest run",
"tests:coverage": "vitest run --coverage.enabled=true"
},
"devDependencies": {
"@open-gsio/env": "workspace:*",
"@open-gsio/schema": "workspace:*",
"@anthropic-ai/sdk": "^0.55.0",
"openai": "^5.0.1",
"wrangler": "^4.18.0",
"vitest": "^3.1.4",
"vite": "^6.3.5"
}
}


@@ -0,0 +1,154 @@
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { AssistantSdk } from '../assistant-sdk';
import { Utils } from '../utils/utils.ts';
// Mock dependencies
vi.mock('../utils/utils.ts', () => ({
Utils: {
selectEquitably: vi.fn(),
getCurrentDate: vi.fn(),
},
}));
vi.mock('../prompts/few_shots', () => ({
default: {
a: 'A1',
question1: 'answer1',
question2: 'answer2',
question3: 'answer3',
},
}));
describe('AssistantSdk', () => {
beforeEach(() => {
vi.useFakeTimers();
vi.setSystemTime(new Date('2023-01-01T12:30:45Z'));
// Reset mocks
vi.resetAllMocks();
});
afterEach(() => {
vi.useRealTimers();
});
describe('getAssistantPrompt', () => {
it('should return a prompt with default values when minimal params are provided', () => {
// Mock dependencies
Utils.selectEquitably.mockReturnValue({
question1: 'answer1',
question2: 'answer2',
});
Utils.getCurrentDate.mockReturnValue('2023-01-01T12:30:45Z');
const prompt = AssistantSdk.getAssistantPrompt({});
expect(prompt).toContain('# Assistant Knowledge');
expect(prompt).toContain('### Date: ');
expect(prompt).toContain('### User Location: ');
expect(prompt).toContain('### Timezone: ');
});
it('should include maxTokens when provided', () => {
// Mock dependencies
Utils.selectEquitably.mockReturnValue({
question1: 'answer1',
question2: 'answer2',
});
Utils.getCurrentDate.mockReturnValue('2023-01-01T12:30:45Z');
const prompt = AssistantSdk.getAssistantPrompt({ maxTokens: 1000 });
expect(prompt).toContain('Max Response Length: 1000 tokens (maximum)');
});
it('should use provided userTimezone and userLocation', () => {
// Mock dependencies
Utils.selectEquitably.mockReturnValue({
question1: 'answer1',
question2: 'answer2',
});
Utils.getCurrentDate.mockReturnValue('2023-01-01T12:30:45Z');
const prompt = AssistantSdk.getAssistantPrompt({
userTimezone: 'America/New_York',
userLocation: 'New York, USA',
});
expect(prompt).toContain('### User Location: New York, USA');
expect(prompt).toContain('### Timezone: America/New_York');
});
it('should use current date when Utils.getCurrentDate is not available', () => {
// Mock dependencies
Utils.selectEquitably.mockReturnValue({
question1: 'answer1',
question2: 'answer2',
});
// @ts-expect-error - is supposed to break
Utils.getCurrentDate.mockReturnValue(undefined);
const prompt = AssistantSdk.getAssistantPrompt({});
// Instead of checking for a specific date, just verify that a date is included
expect(prompt).toMatch(/### Date: \d{4}-\d{2}-\d{2} \d{1,2}:\d{2} \d{1,2}s/);
});
it('should use few_shots directly when Utils.selectEquitably is not available', () => {
// @ts-expect-error - is supposed to break
Utils.selectEquitably.mockReturnValue(undefined);
Utils.getCurrentDate.mockReturnValue('2023-01-01T12:30:45Z');
const prompt = AssistantSdk.getAssistantPrompt({});
// The prompt should still contain examples
expect(prompt).toContain('#### Example 1');
// Instead of checking for specific content, just verify that examples are included
expect(prompt).toMatch(/HUMAN: .+\nASSISTANT: .+/);
});
});
describe('useFewshots', () => {
it('should format fewshots correctly', () => {
const fewshots = {
'What is the capital of France?': 'Paris is the capital of France.',
'How do I make pasta?': 'Boil water, add pasta, cook until al dente.',
};
const result = AssistantSdk.useFewshots(fewshots);
expect(result).toContain('#### Example 1');
expect(result).toContain('HUMAN: What is the capital of France?');
expect(result).toContain('ASSISTANT: Paris is the capital of France.');
expect(result).toContain('#### Example 2');
expect(result).toContain('HUMAN: How do I make pasta?');
expect(result).toContain('ASSISTANT: Boil water, add pasta, cook until al dente.');
});
it('should respect the limit parameter', () => {
const fewshots = {
Q1: 'A1',
Q2: 'A2',
Q3: 'A3',
Q4: 'A4',
Q5: 'A5',
Q6: 'A6',
};
const result = AssistantSdk.useFewshots(fewshots, 3);
expect(result).toContain('#### Example 1');
expect(result).toContain('HUMAN: Q1');
expect(result).toContain('ASSISTANT: A1');
expect(result).toContain('#### Example 2');
expect(result).toContain('HUMAN: Q2');
expect(result).toContain('ASSISTANT: A2');
expect(result).toContain('#### Example 3');
expect(result).toContain('HUMAN: Q3');
expect(result).toContain('ASSISTANT: A3');
expect(result).not.toContain('#### Example 4');
expect(result).not.toContain('HUMAN: Q4');
});
});
});


@@ -1,24 +1,29 @@
import { Schema } from '@open-gsio/schema';
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { ChatSdk } from '../chat-sdk.ts';
import { AssistantSdk } from '../assistant-sdk.ts';
import Message from '../../models/Message.ts';
import { getModelFamily } from '@open-gsio/ai/supported-models';
import { AssistantSdk } from '../assistant-sdk';
import { ChatSdk } from '../chat-sdk';
import { ProviderRepository } from '../providers/_ProviderRepository.ts';
// Mock dependencies
vi.mock('../assistant-sdk', () => ({
AssistantSdk: {
getAssistantPrompt: vi.fn()
}
getAssistantPrompt: vi.fn(),
},
}));
vi.mock('../../models/Message', () => ({
default: {
create: vi.fn((message) => message)
}
vi.mock('@open-gsio/schema', () => ({
Schema: {
Message: {
create: vi.fn(message => message),
},
},
}));
vi.mock('@open-gsio/ai/supported-models', () => ({
getModelFamily: vi.fn()
vi.mock('../providers/_ProviderRepository', () => ({
ProviderRepository: {
getModelFamily: vi.fn().mockResolvedValue('openai'),
},
}));
describe('ChatSdk', () => {
@@ -33,13 +38,13 @@ describe('ChatSdk', () => {
const result = await ChatSdk.preprocess({ messages });
expect(Message.create).toHaveBeenCalledWith({
expect(Schema.Message.create).toHaveBeenCalledWith({
role: 'assistant',
content: ''
content: '',
});
expect(result).toEqual({
role: 'assistant',
content: ''
content: '',
});
});
});
@@ -47,7 +52,7 @@ describe('ChatSdk', () => {
describe('handleChatRequest', () => {
it('should return a 400 response if no messages are provided', async () => {
const request = {
json: vi.fn().mockResolvedValue({ messages: [] })
json: vi.fn().mockResolvedValue({ messages: [] }),
};
const ctx = {
openai: {},
@@ -56,9 +61,9 @@ describe('ChatSdk', () => {
env: {
SERVER_COORDINATOR: {
idFromName: vi.fn(),
get: vi.fn()
}
}
get: vi.fn(),
},
},
};
const response = await ChatSdk.handleChatRequest(request as any, ctx as any);
@@ -70,7 +75,7 @@ describe('ChatSdk', () => {
it('should save stream data and return a response with streamUrl', async () => {
const streamId = 'test-uuid';
vi.stubGlobal('crypto', {
randomUUID: vi.fn().mockReturnValue(streamId)
randomUUID: vi.fn().mockReturnValue(streamId),
});
const messages = [{ role: 'user', content: 'Hello' }];
@@ -78,12 +83,12 @@ describe('ChatSdk', () => {
const conversationId = 'conv-123';
const request = {
json: vi.fn().mockResolvedValue({ messages, model, conversationId })
json: vi.fn().mockResolvedValue({ messages, model, conversationId }),
};
const saveStreamData = vi.fn();
const durableObject = {
saveStreamData
saveStreamData,
};
const ctx = {
@@ -93,9 +98,9 @@ describe('ChatSdk', () => {
env: {
SERVER_COORDINATOR: {
idFromName: vi.fn().mockReturnValue('object-id'),
get: vi.fn().mockReturnValue(durableObject)
}
}
get: vi.fn().mockReturnValue(durableObject),
},
},
};
const response = await ChatSdk.handleChatRequest(request as any, ctx as any);
@@ -103,12 +108,9 @@ describe('ChatSdk', () => {
expect(ctx.env.SERVER_COORDINATOR.idFromName).toHaveBeenCalledWith('stream-index');
expect(ctx.env.SERVER_COORDINATOR.get).toHaveBeenCalledWith('object-id');
expect(saveStreamData).toHaveBeenCalledWith(
streamId,
expect.stringContaining(model)
);
expect(saveStreamData).toHaveBeenCalledWith(streamId, expect.stringContaining(model));
expect(responseBody).toEqual({
streamUrl: `/api/streams/${streamId}`
streamUrl: `/api/streams/${streamId}`,
});
});
});
@@ -118,7 +120,7 @@ describe('ChatSdk', () => {
const messages = [{ role: 'user', content: 'Hello' }];
const dynamicMaxTokens = vi.fn().mockResolvedValue(500);
const durableObject = {
dynamicMaxTokens
dynamicMaxTokens,
};
const ctx = {
@@ -126,9 +128,9 @@ describe('ChatSdk', () => {
env: {
SERVER_COORDINATOR: {
idFromName: vi.fn().mockReturnValue('object-id'),
get: vi.fn().mockReturnValue(durableObject)
}
}
get: vi.fn().mockReturnValue(durableObject),
},
},
};
await ChatSdk.calculateMaxTokens(messages, ctx as any);
@@ -148,90 +150,88 @@ describe('ChatSdk', () => {
expect(AssistantSdk.getAssistantPrompt).toHaveBeenCalledWith({
maxTokens: 1000,
userTimezone: 'UTC',
userLocation: 'USA/unknown'
userLocation: 'USA/unknown',
});
expect(result).toBe('Assistant prompt');
});
});
describe('buildMessageChain', () => {
it('should build a message chain with system role for most models', () => {
vi.mocked(getModelFamily).mockReturnValue('openai');
// TODO: Fix this test
it('should build a message chain with system role for most models', async () => {
ProviderRepository.getModelFamily.mockResolvedValue('openai');
const messages = [
{ role: 'user', content: 'Hello' }
];
const messages = [{ role: 'user', content: 'Hello' }];
const opts = {
systemPrompt: 'System prompt',
assistantPrompt: 'Assistant prompt',
toolResults: { role: 'tool', content: 'Tool result' },
model: 'gpt-4'
model: 'gpt-4',
env: {},
};
const result = ChatSdk.buildMessageChain(messages, opts as any);
const result = await ChatSdk.buildMessageChain(messages, opts as any);
expect(getModelFamily).toHaveBeenCalledWith('gpt-4');
expect(Message.create).toHaveBeenCalledTimes(3);
expect(Message.create).toHaveBeenNthCalledWith(1, {
expect(ProviderRepository.getModelFamily).toHaveBeenCalledWith('gpt-4', {});
expect(Schema.Message.create).toHaveBeenCalledTimes(3);
expect(Schema.Message.create).toHaveBeenNthCalledWith(1, {
role: 'system',
content: 'System prompt'
content: 'System prompt',
});
expect(Message.create).toHaveBeenNthCalledWith(2, {
expect(Schema.Message.create).toHaveBeenNthCalledWith(2, {
role: 'assistant',
content: 'Assistant prompt'
content: 'Assistant prompt',
});
expect(Message.create).toHaveBeenNthCalledWith(3, {
expect(Schema.Message.create).toHaveBeenNthCalledWith(3, {
role: 'user',
content: 'Hello'
content: 'Hello',
});
});
it('should build a message chain with assistant role for o1, gemma, claude, or google models', async () => {
ProviderRepository.getModelFamily.mockResolvedValue('claude');
it('should build a message chain with assistant role for o1, gemma, claude, or google models', () => {
vi.mocked(getModelFamily).mockReturnValue('claude');
const messages = [
{ role: 'user', content: 'Hello' }
];
const messages = [{ role: 'user', content: 'Hello' }];
const opts = {
systemPrompt: 'System prompt',
assistantPrompt: 'Assistant prompt',
toolResults: { role: 'tool', content: 'Tool result' },
model: 'claude-3'
model: 'claude-3',
env: {},
};
const result = ChatSdk.buildMessageChain(messages, opts as any);
const result = await ChatSdk.buildMessageChain(messages, opts as any);
expect(getModelFamily).toHaveBeenCalledWith('claude-3');
expect(Message.create).toHaveBeenCalledTimes(3);
expect(Message.create).toHaveBeenNthCalledWith(1, {
expect(ProviderRepository.getModelFamily).toHaveBeenCalledWith('claude-3', {});
expect(Schema.Message.create).toHaveBeenCalledTimes(3);
expect(Schema.Message.create).toHaveBeenNthCalledWith(1, {
role: 'assistant',
content: 'System prompt'
content: 'System prompt',
});
});
it('should filter out messages with empty content', () => {
vi.mocked(getModelFamily).mockReturnValue('openai');
it('should filter out messages with empty content', async () => {
ProviderRepository.getModelFamily.mockResolvedValue('openai');
const messages = [
{ role: 'user', content: 'Hello' },
{ role: 'user', content: '' },
{ role: 'user', content: ' ' },
{ role: 'user', content: 'World' }
{ role: 'user', content: 'World' },
];
const opts = {
systemPrompt: 'System prompt',
assistantPrompt: 'Assistant prompt',
toolResults: { role: 'tool', content: 'Tool result' },
model: 'gpt-4'
model: 'gpt-4',
env: {},
};
const result = ChatSdk.buildMessageChain(messages, opts as any);
const result = await ChatSdk.buildMessageChain(messages, opts as any);
// 2 system/assistant messages + 2 user messages (Hello and World)
expect(Message.create).toHaveBeenCalledTimes(4);
expect(Schema.Message.create).toHaveBeenCalledTimes(4);
});
});
});


@@ -1,5 +1,6 @@
import { describe, it, expect } from 'vitest';
import { Utils } from '../utils.ts';
import { Utils } from '../utils/utils.ts';
describe('Debug Utils.getSeason', () => {
it('should print out the actual seasons for different dates', () => {


@@ -1,13 +1,14 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import handleStreamData from '../handleStreamData.ts';
import handleStreamData from '../utils/handleStreamData.ts';
describe('handleStreamData', () => {
// Setup mocks
const mockController = {
enqueue: vi.fn()
enqueue: vi.fn(),
};
const mockEncoder = {
encode: vi.fn((str) => str)
encode: vi.fn(str => str),
};
beforeEach(() => {
@@ -41,9 +42,9 @@ describe('handleStreamData', () => {
type: 'content_block_start',
content_block: {
type: 'text',
text: 'Hello world'
}
}
text: 'Hello world',
},
},
};
handler(data);
@@ -51,6 +52,7 @@ describe('handleStreamData', () => {
expect(mockController.enqueue).toHaveBeenCalledTimes(1);
expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('Hello world'));
// @ts-expect-error - mock
const encodedData = mockEncoder.encode.mock.calls[0][0];
const parsedData = JSON.parse(encodedData.split('data: ')[1]);
@@ -65,9 +67,9 @@ describe('handleStreamData', () => {
type: 'chat',
data: {
delta: {
text: 'Hello world'
}
}
text: 'Hello world',
},
},
};
handler(data);
@@ -75,6 +77,7 @@ describe('handleStreamData', () => {
expect(mockController.enqueue).toHaveBeenCalledTimes(1);
expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('Hello world'));
// @ts-expect-error - mock
const encodedData = mockEncoder.encode.mock.calls[0][0];
const parsedData = JSON.parse(encodedData.split('data: ')[1]);
@@ -92,13 +95,13 @@ describe('handleStreamData', () => {
{
index: 0,
delta: {
content: 'Hello world'
content: 'Hello world',
},
logprobs: null,
finish_reason: null
}
]
}
finish_reason: null,
},
],
},
};
handler(data);
@@ -106,6 +109,7 @@ describe('handleStreamData', () => {
expect(mockController.enqueue).toHaveBeenCalledTimes(1);
expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('Hello world'));
// @ts-expect-error - mock
const encodedData = mockEncoder.encode.mock.calls[0][0];
const parsedData = JSON.parse(encodedData.split('data: ')[1]);
@@ -125,16 +129,18 @@ describe('handleStreamData', () => {
index: 0,
delta: {},
logprobs: null,
finish_reason: 'stop'
}
]
}
finish_reason: 'stop',
},
],
},
};
handler(data);
handler(data as any);
expect(mockController.enqueue).toHaveBeenCalledTimes(1);
expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('"finish_reason":"stop"'));
expect(mockEncoder.encode).toHaveBeenCalledWith(
expect.stringContaining('"finish_reason":"stop"'),
);
});
it('should return early for unrecognized data format', () => {
@@ -144,11 +150,11 @@ describe('handleStreamData', () => {
type: 'chat',
data: {
// No recognized properties
unrecognized: 'property'
}
unrecognized: 'property',
},
};
handler(data);
handler(data as any);
expect(mockController.enqueue).not.toHaveBeenCalled();
expect(mockEncoder.encode).not.toHaveBeenCalled();
@@ -160,8 +166,8 @@ describe('handleStreamData', () => {
const data = {
type: 'chat',
data: {
original: 'data'
}
original: 'data',
},
};
const transformFn = vi.fn().mockReturnValue({
@@ -170,16 +176,16 @@ describe('handleStreamData', () => {
choices: [
{
delta: {
content: 'Transformed content'
content: 'Transformed content',
},
logprobs: null,
finish_reason: null
}
]
}
finish_reason: null,
},
],
},
});
handler(data, transformFn);
handler(data as any, transformFn);
expect(transformFn).toHaveBeenCalledWith(data);
expect(mockController.enqueue).toHaveBeenCalledTimes(1);
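The framing these assertions round-trip (`encodedData.split('data: ')[1]`) is a server-sent-events chunk. A minimal sketch of that framing, inferred from the tests rather than taken from `handleStreamData` itself:

```typescript
// Frame a payload the way the tests parse it back:
// 'data: ' prefix, JSON body, blank-line terminator.
function sseFrame(payload: unknown): string {
  return `data: ${JSON.stringify(payload)}\n\n`;
}

const frame = sseFrame({ choices: [{ delta: { content: 'Hello world' } }] });
```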


@@ -1,5 +1,6 @@
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { Utils } from '../utils.ts';
import { Utils } from '../utils/utils.ts';
describe('Utils', () => {
describe('getSeason', () => {
@@ -42,10 +43,11 @@ describe('Utils', () => {
beforeEach(() => {
// Mock Intl.DateTimeFormat
// @ts-expect-error - mock
global.Intl.DateTimeFormat = vi.fn().mockReturnValue({
resolvedOptions: vi.fn().mockReturnValue({
timeZone: 'America/New_York'
})
timeZone: 'America/New_York',
}),
});
});
@@ -102,10 +104,10 @@ describe('Utils', () => {
it('should select items equitably from multiple sources', () => {
const sources = {
a: { 'key1': 'value1', 'key2': 'value2' },
b: { 'key3': 'value3', 'key4': 'value4' },
c: { 'key5': 'value5', 'key6': 'value6' },
d: { 'key7': 'value7', 'key8': 'value8' }
a: { key1: 'value1', key2: 'value2' },
b: { key3: 'value3', key4: 'value4' },
c: { key5: 'value5', key6: 'value6' },
d: { key7: 'value7', key8: 'value8' },
};
const result = Utils.selectEquitably(sources, 4);
@@ -117,10 +119,10 @@ describe('Utils', () => {
it('should handle itemCount greater than available items', () => {
const sources = {
a: { 'key1': 'value1' },
b: { 'key2': 'value2' },
a: { key1: 'value1' },
b: { key2: 'value2' },
c: {},
d: {}
d: {},
};
const result = Utils.selectEquitably(sources, 5);
@@ -135,7 +137,7 @@ describe('Utils', () => {
a: {},
b: {},
c: {},
d: {}
d: {},
};
const result = Utils.selectEquitably(sources, 5);
@@ -148,10 +150,10 @@ describe('Utils', () => {
it('should insert blank messages to maintain user/assistant alternation', () => {
const messages = [
{ role: 'user', content: 'Hello' },
{ role: 'user', content: 'How are you?' }
{ role: 'user', content: 'How are you?' },
];
const result = Utils.normalizeWithBlanks(messages);
const result = Utils.normalizeWithBlanks(messages as any[]);
expect(result.length).toBe(3);
expect(result[0]).toEqual({ role: 'user', content: 'Hello' });
@@ -160,11 +162,9 @@ describe('Utils', () => {
});
it('should insert blank user message if first message is assistant', () => {
const messages = [
{ role: 'assistant', content: 'Hello, how can I help?' }
];
const messages = [{ role: 'assistant', content: 'Hello, how can I help?' }];
const result = Utils.normalizeWithBlanks(messages);
const result = Utils.normalizeWithBlanks(messages as any[]);
expect(result.length).toBe(2);
expect(result[0]).toEqual({ role: 'user', content: '' });
@@ -183,10 +183,10 @@ describe('Utils', () => {
const messages = [
{ role: 'user', content: 'Hello' },
{ role: 'assistant', content: 'Hi there' },
{ role: 'user', content: 'How are you?' }
{ role: 'user', content: 'How are you?' },
];
const result = Utils.normalizeWithBlanks(messages);
const result = Utils.normalizeWithBlanks(messages as any[]);
expect(result.length).toBe(3);
expect(result).toEqual(messages);
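The behavior pinned down by these assertions can be sketched as a pure function. This is an inferred re-implementation for illustration, not the package's actual `normalizeWithBlanks`:

```typescript
type Msg = { role: 'user' | 'assistant'; content: string };

// Insert blank opposite-role messages so roles strictly alternate,
// starting from 'user' (behavior inferred from the test assertions).
function normalizeWithBlanks(messages: Msg[]): Msg[] {
  const out: Msg[] = [];
  let expected: Msg['role'] = 'user';
  for (const m of messages) {
    if (m.role !== expected) {
      out.push({ role: expected, content: '' }); // filler to restore alternation
      expected = expected === 'user' ? 'assistant' : 'user';
    }
    out.push(m);
    expected = expected === 'user' ? 'assistant' : 'user';
  }
  return out;
}
```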


@@ -0,0 +1,57 @@
import Prompts from '../prompts';
import { Common } from '../utils';
export class AssistantSdk {
static getAssistantPrompt(params: {
maxTokens?: number;
userTimezone?: string;
userLocation?: string;
}): string {
const { maxTokens, userTimezone = 'UTC', userLocation = '' } = params;
// console.log('[DEBUG_LOG] few_shots:', JSON.stringify(few_shots));
let selectedFewshots = Common.Utils.selectEquitably?.(Prompts.FewShots);
// console.log('[DEBUG_LOG] selectedFewshots after Utils.selectEquitably:', JSON.stringify(selectedFewshots));
if (!selectedFewshots) {
selectedFewshots = Prompts.FewShots;
// console.log('[DEBUG_LOG] selectedFewshots after fallback:', JSON.stringify(selectedFewshots));
}
const sdkDate = new Date().toISOString();
const [currentDate] = sdkDate.includes('T') ? sdkDate.split('T') : [sdkDate];
const now = new Date();
const formattedMinutes = String(now.getMinutes()).padStart(2, '0');
const currentTime = `${now.getHours()}:${formattedMinutes} ${now.getSeconds()}s`;
return `# Assistant Knowledge
## Assistant Name
### open-gsio
## Current Context
### Date: ${currentDate} ${currentTime}
${maxTokens ? `### Max Response Length: ${maxTokens} tokens (maximum)` : ''}
### Lexicographical Format: Markdown
### User Location: ${userLocation || 'Unknown'}
### Timezone: ${userTimezone}
## Response Framework
1. Use knowledge provided in the current context as the primary source of truth.
2. Format all responses in Markdown.
3. Attribute external sources with footnotes.
4. Do not bold headers.
## Examples
#### Example 0
HUMAN: What is this?
ASSISTANT: This is a conversational AI system.
---
${AssistantSdk.useFewshots(selectedFewshots, 5)}
---
## Directive
Continuously monitor the evolving conversation. Dynamically adapt each response.`;
}
static useFewshots(fewshots: Record<string, string>, limit = 5): string {
return Object.entries(fewshots)
.slice(0, limit)
.map(([q, a], i) => {
return `#### Example ${i + 1}\nHUMAN: ${q}\nASSISTANT: ${a}`;
})
.join('\n---\n');
}
}
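The few-shot formatting is self-contained enough to exercise standalone; the sketch below copies the `useFewshots` body from the listing so its output shape can be inspected:

```typescript
// Standalone copy of the useFewshots body above, for inspection.
function useFewshots(fewshots: Record<string, string>, limit = 5): string {
  return Object.entries(fewshots)
    .slice(0, limit)
    .map(([q, a], i) => `#### Example ${i + 1}\nHUMAN: ${q}\nASSISTANT: ${a}`)
    .join('\n---\n');
}

const demo = useFewshots({
  'What is 2+2?': '4',
  'Capital of France?': 'Paris',
});
```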


@@ -0,0 +1,3 @@
import { AssistantSdk } from './assistant-sdk.ts';
export { AssistantSdk };


@@ -0,0 +1,138 @@
import { Schema } from '@open-gsio/schema';
import type { Instance } from 'mobx-state-tree';
import { OpenAI } from 'openai';
import type Message from '../../../schema/src/models/Message.ts';
import { AssistantSdk } from '../assistant-sdk';
import { ProviderRepository } from '../providers/_ProviderRepository.ts';
import type {
BuildAssistantPromptParams,
ChatRequestBody,
GenericEnv,
PreprocessParams,
} from '../types';
export class ChatSdk {
static async preprocess(params: PreprocessParams) {
// a slot to provide additional context
return Schema.Message.create({
role: 'assistant',
content: '',
});
}
static async handleChatRequest(
request: Request,
ctx: {
openai: OpenAI;
systemPrompt: any;
maxTokens: any;
env: GenericEnv;
},
) {
const streamId = crypto.randomUUID();
const { messages, model, conversationId } = (await request.json()) as ChatRequestBody;
if (!messages?.length) {
return new Response('No messages provided', { status: 400 });
}
const preprocessedContext = await ChatSdk.preprocess({
messages,
});
// console.log(ctx.env)
// console.log(ctx.env.SERVER_COORDINATOR);
const objectId = ctx.env.SERVER_COORDINATOR.idFromName('stream-index');
const durableObject = ctx.env.SERVER_COORDINATOR.get(objectId);
await durableObject.saveStreamData(
streamId,
JSON.stringify({
messages,
model,
conversationId,
timestamp: Date.now(),
systemPrompt: ctx.systemPrompt,
preprocessedContext,
}),
);
return new Response(
JSON.stringify({
streamUrl: `/api/streams/${streamId}`,
}),
{
headers: {
'Content-Type': 'application/json',
},
},
);
}
static async calculateMaxTokens(
messages: any[],
ctx: Record<string, any> & {
env: GenericEnv;
maxTokens: number;
},
) {
const objectId = ctx.env.SERVER_COORDINATOR.idFromName('dynamic-token-counter');
const durableObject = ctx.env.SERVER_COORDINATOR.get(objectId);
return durableObject.dynamicMaxTokens(messages, ctx.maxTokens);
}
static buildAssistantPrompt(params: BuildAssistantPromptParams) {
const { maxTokens } = params;
return AssistantSdk.getAssistantPrompt({
maxTokens,
userTimezone: 'UTC',
userLocation: 'USA/unknown',
});
}
static async buildMessageChain(
messages: any[],
opts: {
systemPrompt: any;
assistantPrompt: string;
toolResults: Instance<typeof Message>;
model: any;
env: GenericEnv;
},
) {
const modelFamily = await ProviderRepository.getModelFamily(opts.model, opts.env);
const messagesToSend = [];
messagesToSend.push(
Schema.Message.create({
role:
opts.model.includes('o1') ||
opts.model.includes('gemma') ||
modelFamily === 'claude' ||
modelFamily === 'google'
? 'assistant'
: 'system',
content: opts.systemPrompt.trim(),
}),
);
messagesToSend.push(
Schema.Message.create({
role: 'assistant',
content: opts.assistantPrompt.trim(),
}),
);
messagesToSend.push(
...messages
.filter((message: any) => message.content?.trim())
.map((message: any) => Schema.Message.create(message)),
);
return messagesToSend;
}
}
export default ChatSdk;
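The role-selection branch in `buildMessageChain` can be isolated as a small pure function. A sketch (the function name is illustrative, not part of the package):

```typescript
// Decide which role carries the system prompt, mirroring the ternary above:
// o1/gemma models and the claude/google families get it as an 'assistant'
// message instead of a 'system' one.
function systemRoleFor(model: string, modelFamily: string): 'assistant' | 'system' {
  return model.includes('o1') ||
    model.includes('gemma') ||
    modelFamily === 'claude' ||
    modelFamily === 'google'
    ? 'assistant'
    : 'system';
}
```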


@@ -0,0 +1 @@
export * from './chat-sdk.ts';

packages/ai/src/index.ts (new file, 2 lines)

@@ -0,0 +1,2 @@
export * from './providers';
export * from './chat-sdk';


@@ -1,50 +1,49 @@
/* eslint-disable no-useless-escape */
const a = {
"Can you search for the latest quantum computing breakthroughs?":
"I don't have web access.",
'Can you search for the latest quantum computing breakthroughs?': "I don't have web access.",
};
const b = {
"Calculate the number of unique ways to arrange the letters in the word 'SUCCESS'.": `
Alright, let's tackle 'SUCCESS' and see just how unique it can get!
1. Step 1: Count the Letters
'Calculate the number of unique ways to arrange the letters in the word SUCCESS.': `Alright, let's tackle 'SUCCESS' and see just how unique it can get!
### Step 1: Count the Letters
The word 'SUCCESS' has 7 letters, but it's no free-for-all; there are repeats!
- S: 3 times
- U: 1 time
- C: 2 times
- E: 1 time
Erdős would call this a combinatorial party with some guests arriving multiple times.
2. Step 2: The Formula for Permutations with Repetition
### Step 2: The Formula for Permutations with Repetition
When letters repeat, we divide to avoid overcounting. The magic formula is:
$$
\frac{n!}{k_1! \cdot k_2! \cdot \ldots \cdot k_m!}
$$
Here, (n) is the total number of letters, and (k_i) are the frequencies of each unique letter.
3. Step 3: Plug in the Values
### Step 3: Plug in the Values
Let's compute:
$$
\frac{7!}{3! \cdot 1! \cdot 2! \cdot 1!} = 420
$$
4. Step 4: Reveal the Result
There are **420 unique arrangements** of 'SUCCESS'. Mathematically satisfying, isn't it? It's as if every version of success gets its own unique permutation!`,
"Calculate the number of unique ways to arrange the letters in the word 'MISSISSIPPI'.": `Ready for some permutation wizardry? Let's decode 'MISSISSIPPI'!
1. Step 1: Count the Letters
### Step 4: Reveal the Result
There are 420 unique arrangements of 'SUCCESS'. Mathematically satisfying, isn't it? It's as if every version of success gets its own unique permutation!`,
'Calculate the number of unique ways to arrange the letters in the word MISSISSIPPI.': `Ready for some permutation wizardry? Let's decode 'MISSISSIPPI'!
### Step 1: Count the Letters
'MISSISSIPPI' has 11 letters. But like a social network with cliques, some letters appear in multiples:
- M: 1
- I: 4
- S: 4
- P: 2
2. Step 2: Apply the Permutation Formula for Repetition
### Step 2: Apply the Permutation Formula for Repetition
The formula for such a case is:
$$
\frac{n!}{k_1! \cdot k_2! \cdot \ldots \cdot k_m!}
$$
3. Step 3: Plug in the Values
### Step 3: Plug in the Values
Here's where we let the math do its magic:
$$
\frac{11!}{1! \cdot 4! \cdot 4! \cdot 2!}
$$
4. Step 4: Simplify Step-by-Step
### Step 4: Simplify Step-by-Step
Let's break it down:
- (11! = 39,916,800)
- (4! = 24)
@@ -53,19 +52,17 @@ Lets break it down:
$$
\frac{39,916,800}{1 \cdot 24 \cdot 24 \cdot 2} = 34,650
$$
5. Step 5: The Grand Finale
There are **34,650 unique arrangements** of 'MISSISSIPPI'. Erdős would marvel at the sheer elegance of these numbers; much like the way rivers meander through the land, permutations flow through possibilities.`,
### Step 5: The Grand Finale
There are 34,650 unique arrangements of 'MISSISSIPPI'.`,
};
const c = {
// c remains unchanged as it contains image generation prompts
"Render a Greek statue with warm marble tones and realistic proportions.":
'Render a Greek statue with warm marble tones and realistic proportions.':
"I don't have the ability to generate images right now. ",
};
const d = {
"Demonstrate all outputs.": `
# Comprehensive Markdown Showcase
'Demonstrate markdown formatted text.': `
## Headers
~~~markdown
# Large Header
@@ -73,80 +70,48 @@ const d = {
### Small Header
~~~
## Ordered Lists
~~~markdown
1. First Item
2. Second Item
1. Subitem 1
2. Subitem 2
3. Third Item
~~~
## Unordered Lists
~~~markdown
- First Item
- Second Item
- Subitem 1
- Subitem 2
~~~
## Links
~~~markdown
[Visit OpenAI](https://openai.com/)
~~~
## Images
~~~markdown
![Example Image](example.jpg)
~~~
![Example Image](example.jpg)
## Inline Code
~~~markdown
\`console.log('Hello, Markdown!')\`
~~~
## Code Blocks
\`\`\`markdown
~~~javascript
console.log(marked.parse('A Description List:\\n'
+ ': Topic 1 : Description 1\\n'
+ ': **Topic 2** : *Description 2*'));
+ ': Topic 2 : Description 2'));
~~~
\`\`\`
## Tables
~~~markdown
| Name | Value |
|---------|-------|
| Item A | 10 |
| Item B | 20 |
~~~
## Blockquotes
~~~markdown
> Markdown makes writing beautiful.
> - Markdown Fan
~~~
## Horizontal Rule
~~~markdown
---
~~~
## Font: Bold and Italic
~~~markdown
**Bold Text**
*Italic Text*
~~~
## Font: Strikethrough
~~~markdown
~~Struck-through text~~
~~~
---
## Math: Inline
This is block level katex:
## Math
~~~markdown
$$
c = \\\\pm\\\\sqrt{a^2 + b^2}
$$
~~~
## Math: Block
This is inline katex
~~~markdown
$c = \\\\pm\\\\sqrt{a^2 + b^2}$
~~~
`,
$$`,
};
export default { a, b, c, d };
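The two combinatorics answers baked into `b` are easy to verify mechanically; a quick multinomial check (a sanity check, not part of the package):

```typescript
// n! / (k1! * k2! * ... * km!): permutations of a multiset,
// where counts[] holds the frequency of each distinct letter.
function factorial(n: number): number {
  return n <= 1 ? 1 : n * factorial(n - 1);
}

function multisetPermutations(counts: number[]): number {
  const n = counts.reduce((a, b) => a + b, 0);
  return counts.reduce((acc, k) => acc / factorial(k), factorial(n));
}

// 'SUCCESS' = S*3, U*1, C*2, E*1; 'MISSISSIPPI' = M*1, I*4, S*4, P*2
```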


@@ -0,0 +1,5 @@
import few_shots from './few_shots.ts';
export default {
FewShots: few_shots,
};


@@ -0,0 +1,96 @@
import type { GenericEnv, ModelMeta, Providers, SupportedProvider } from '../types';
export class ProviderRepository {
#providers: Providers = [];
#env: GenericEnv;
constructor(env: GenericEnv) {
this.#env = env;
this.setProviders(env);
}
static OPENAI_COMPAT_ENDPOINTS = {
xai: 'https://api.x.ai/v1',
groq: 'https://api.groq.com/openai/v1',
google: 'https://generativelanguage.googleapis.com/v1beta/openai',
fireworks: 'https://api.fireworks.ai/inference/v1',
cohere: 'https://api.cohere.ai/compatibility/v1',
cloudflare: 'https://api.cloudflare.com/client/v4/accounts/{CLOUDFLARE_ACCOUNT_ID}/ai/v1',
claude: 'https://api.anthropic.com/v1',
openai: 'https://api.openai.com/v1',
cerebras: 'https://api.cerebras.ai/v1',
ollama: 'http://localhost:11434/v1',
mlx: 'http://localhost:10240/v1',
};
static async getModelFamily(model: any, env: GenericEnv) {
const allModels = await env.KV_STORAGE.get('supportedModels');
const models = JSON.parse(allModels ?? '[]');
const modelData = models.filter((m: ModelMeta) => m.id === model);
return modelData[0].provider;
}
static async getModelMeta(meta: any, env: GenericEnv) {
const allModels = await env.KV_STORAGE.get('supportedModels');
const models = JSON.parse(allModels ?? '[]');
return models.filter((m: ModelMeta) => m.id === meta.model).pop();
}
getProviders(): { name: string; key: string; endpoint: string }[] {
return this.#providers;
}
setProviders(env: GenericEnv) {
const indicies = {
providerName: 0,
providerValue: 1,
};
const valueDelimiter = '_';
const envKeys = Object.keys(env);
for (let i = 0; i < envKeys.length; i++) {
if (envKeys.at(i)?.endsWith('KEY')) {
const detectedProvider = envKeys
.at(i)
?.split(valueDelimiter)
.at(indicies.providerName)
?.toLowerCase();
const detectedProviderValue = env[envKeys.at(i) as string];
if (detectedProviderValue) {
switch (detectedProvider) {
case 'anthropic':
this.#providers.push({
name: 'claude',
key: env.ANTHROPIC_API_KEY,
endpoint: ProviderRepository.OPENAI_COMPAT_ENDPOINTS['claude'],
});
break;
case 'gemini':
this.#providers.push({
name: 'google',
key: env.GEMINI_API_KEY,
endpoint: ProviderRepository.OPENAI_COMPAT_ENDPOINTS['google'],
});
break;
case 'cloudflare':
this.#providers.push({
name: 'cloudflare',
key: env.CLOUDFLARE_API_KEY,
endpoint: ProviderRepository.OPENAI_COMPAT_ENDPOINTS[detectedProvider].replace(
'{CLOUDFLARE_ACCOUNT_ID}',
env.CLOUDFLARE_ACCOUNT_ID,
),
});
break;
default:
this.#providers.push({
name: detectedProvider as SupportedProvider,
key: env[envKeys[i] as string],
endpoint:
ProviderRepository.OPENAI_COMPAT_ENDPOINTS[detectedProvider as SupportedProvider],
});
}
}
}
}
}
}
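The env-scanning convention above treats any environment variable ending in `KEY` as a provider credential, with the prefix before the first `_` naming the provider. A simplified standalone sketch of that convention (hypothetical `detectProviders` helper and a trimmed endpoint map, not the repository class itself):

```typescript
// Only a subset of the endpoint map, for illustration.
const endpoints: Record<string, string> = {
  openai: 'https://api.openai.com/v1',
  groq: 'https://api.groq.com/openai/v1',
};

// Derive provider entries from *_API_KEY-style env vars: a var qualifies
// only if its name ends in KEY and it has a value.
function detectProviders(env: Record<string, string | undefined>) {
  return Object.keys(env)
    .filter((k) => k.endsWith('KEY') && env[k])
    .map((k) => {
      const name = k.split('_')[0]!.toLowerCase();
      return { name, key: env[k]!, endpoint: endpoints[name] };
    });
}

// GROQ_API_KEY is unset and UNRELATED doesn't end in KEY, so only openai is detected.
const providers = detectProviders({
  OPENAI_API_KEY: 'sk-test',
  GROQ_API_KEY: undefined,
  UNRELATED: 'x',
});
console.log(providers.map((p) => p.name)); // [ 'openai' ]
```

The real class layers special cases on top of this (e.g. `ANTHROPIC_*` maps to the `claude` name, and the Cloudflare endpoint interpolates the account id).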


@@ -1,6 +1,11 @@
import { OpenAI } from 'openai';
import { describe, it, expect, vi } from 'vitest';
import {
BaseChatProvider,
CommonProviderParams,
ChatStreamProvider,
} from '../chat-stream-provider.ts';
// Create a concrete implementation of BaseChatProvider for testing
class TestChatProvider extends BaseChatProvider {
buildAssistantPrompt: vi.fn().mockReturnValue('Assistant prompt'),
buildMessageChain: vi.fn().mockReturnValue([
{ role: 'system', content: 'System prompt' },
{ role: 'user', content: 'User message' },
]),
},
}));
describe('ChatStreamProvider', () => {
it('should define the required interface', () => {
// Verify the interface has the required method
const mockProvider: ChatStreamProvider = {
handleStream: vi.fn(),
};
expect(mockProvider.handleStream).toBeDefined();


@@ -1,10 +1,12 @@
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class CerebrasChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.cerebras,
apiKey: param.env.CEREBRAS_API_KEY,
});
}
return {
model: param.model,
messages: safeMessages,
stream: true,
// ...tuningParams
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
param: {
openai: OpenAI;
systemPrompt: any;
preprocessedContext: any;
maxTokens: unknown | number | undefined;
messages: any;
model: string;
env: GenericEnv;
},
dataCallback: (data: any) => void,
) {
return this.provider.handleStream(
{


@@ -0,0 +1,281 @@
import { OpenAI } from 'openai';
import ChatSdk from '../chat-sdk/chat-sdk.ts';
import { getWeather, WeatherTool } from '../tools/weather.ts';
import { yachtpitAi, YachtpitTools } from '../tools/yachtpit.ts';
import type { GenericEnv } from '../types';
export interface CommonProviderParams {
openai?: OpenAI; // Optional for providers that use a custom client.
systemPrompt: any;
preprocessedContext: any;
maxTokens: number | unknown | undefined;
messages: any;
model: string;
env: GenericEnv;
disableWebhookGeneration?: boolean;
// Additional fields can be added as needed
}
export interface ChatStreamProvider {
handleStream(param: CommonProviderParams, dataCallback: (data: any) => void): Promise<any>;
}
export abstract class BaseChatProvider implements ChatStreamProvider {
abstract getOpenAIClient(param: CommonProviderParams): OpenAI;
abstract getStreamParams(param: CommonProviderParams, safeMessages: any[]): any;
abstract processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean>;
async handleStream(param: CommonProviderParams, dataCallback: (data: any) => void) {
const assistantPrompt = ChatSdk.buildAssistantPrompt({ maxTokens: param.maxTokens });
const safeMessages = await ChatSdk.buildMessageChain(param.messages, {
systemPrompt: param.systemPrompt,
model: param.model,
assistantPrompt,
toolResults: param.preprocessedContext,
env: param.env,
});
const client = this.getOpenAIClient(param);
const tools = [WeatherTool, YachtpitTools];
const callFunction = async (name: string, args: any) => {
if (name === 'get_weather') {
return getWeather(args.latitude, args.longitude);
}
if (name === 'ship_control') {
return yachtpitAi({ action: args.action, value: args.value });
}
};
// Main conversation loop - handle tool calls properly
let conversationComplete = false;
let toolCallIterations = 0;
const maxToolCallIterations = 5; // Prevent infinite loops
let toolsExecuted = false; // Track if we've executed tools
while (!conversationComplete && toolCallIterations < maxToolCallIterations) {
const streamParams = this.getStreamParams(param, safeMessages);
// Only provide tools on the first call, after that force text response
const currentTools = toolsExecuted ? undefined : tools;
const stream = await client.chat.completions.create({ ...streamParams, tools: currentTools });
let assistantMessage = '';
const toolCalls: any[] = [];
for await (const chunk of stream as unknown as AsyncIterable<any>) {
// console.log('chunk', chunk);
// Handle tool calls
if (chunk.choices[0]?.delta?.tool_calls) {
const deltaToolCalls = chunk.choices[0].delta.tool_calls;
for (const deltaToolCall of deltaToolCalls) {
if (deltaToolCall.index !== undefined) {
// Initialize or get existing tool call
if (!toolCalls[deltaToolCall.index]) {
toolCalls[deltaToolCall.index] = {
id: deltaToolCall.id || '',
type: deltaToolCall.type || 'function',
function: {
name: deltaToolCall.function?.name || '',
arguments: deltaToolCall.function?.arguments || '',
},
};
} else {
// Append to existing tool call
if (deltaToolCall.function?.arguments) {
toolCalls[deltaToolCall.index].function.arguments +=
deltaToolCall.function.arguments;
}
if (deltaToolCall.function?.name) {
toolCalls[deltaToolCall.index].function.name += deltaToolCall.function.name;
}
if (deltaToolCall.id) {
toolCalls[deltaToolCall.index].id += deltaToolCall.id;
}
}
}
}
}
// Handle regular content
if (chunk.choices[0]?.delta?.content) {
assistantMessage += chunk.choices[0].delta.content;
}
// Check if stream is finished
if (chunk.choices[0]?.finish_reason) {
if (chunk.choices[0].finish_reason === 'tool_calls' && toolCalls.length > 0) {
// Increment tool call iterations counter
toolCallIterations++;
console.log(`Tool call iteration ${toolCallIterations}/${maxToolCallIterations}`);
// Execute tool calls and add results to conversation
console.log('Executing tool calls:', toolCalls);
// Send feedback to user about tool invocation
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: {
content: `\n\n🔧 Invoking ${toolCalls.length} tool${toolCalls.length > 1 ? 's' : ''}...\n`,
},
},
],
},
});
// Add assistant message with tool calls to conversation
safeMessages.push({
role: 'assistant',
content: assistantMessage || null,
tool_calls: toolCalls,
});
// Execute each tool call and add results
for (const toolCall of toolCalls) {
if (toolCall.type === 'function') {
const name = toolCall.function.name;
console.log(`Calling function: ${name}`);
// Send feedback about specific tool being called
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: {
content: `📞 Calling ${name}...`,
},
},
],
},
});
try {
const args = JSON.parse(toolCall.function.arguments);
console.log(`Function arguments:`, args);
const result = await callFunction(name, args);
console.log(`Function result:`, result);
// Send feedback about tool completion
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: {
content: `\n`,
},
},
],
},
});
// Add tool result to conversation
safeMessages.push({
role: 'tool',
tool_call_id: toolCall.id,
content: result?.toString() || '',
});
} catch (error) {
console.error(`Error executing tool ${name}:`, error);
// Send feedback about tool error
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: {
content: ` ❌ Error\n`,
},
},
],
},
});
safeMessages.push({
role: 'tool',
tool_call_id: toolCall.id,
content: `Error: ${error.message}`,
});
}
}
}
// Mark that tools have been executed to prevent repeated calls
toolsExecuted = true;
// Send feedback that tool execution is complete
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: {
content: `\n🎯 Tool execution complete. Generating response...\n\n`,
},
},
],
},
});
// Continue conversation with tool results
break;
} else {
// Regular completion - send final response
conversationComplete = true;
}
}
// Process chunk normally for non-tool-call responses
if (!chunk.choices[0]?.delta?.tool_calls) {
console.log('after-tool-call-chunk', chunk);
const shouldBreak = await this.processChunk(chunk, dataCallback);
if (shouldBreak) {
conversationComplete = true;
break;
}
}
}
}
// Handle case where we hit maximum tool call iterations
if (toolCallIterations >= maxToolCallIterations && !conversationComplete) {
console.log('Maximum tool call iterations reached, forcing completion');
// Send a message indicating we've hit the limit and provide available information
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: {
content:
'\n\n⚠ Maximum tool execution limit reached. Based on the available information, I can provide the following response:\n\n',
},
},
],
},
});
// Make one final call without tools to get a response based on the tool results
const finalStreamParams = this.getStreamParams(param, safeMessages);
const finalStream = await client.chat.completions.create({
...finalStreamParams,
tools: undefined, // Remove tools to force a text response
});
for await (const chunk of finalStream as unknown as AsyncIterable<any>) {
const shouldBreak = await this.processChunk(chunk, dataCallback);
if (shouldBreak) break;
}
}
}
}
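The streaming loop above reassembles complete tool calls from per-chunk deltas keyed by `index`, concatenating argument fragments as they arrive. That accumulation step can be sketched in isolation (simplified delta shape and a hypothetical `mergeDelta` helper, not the SDK's full types):

```typescript
interface ToolCall {
  id: string;
  type: string;
  function: { name: string; arguments: string };
}

// Merge one streamed delta into the accumulated list: the first delta at an
// index initializes the call; later deltas append string fragments.
function mergeDelta(calls: ToolCall[], delta: any): void {
  if (delta.index === undefined) return;
  const existing = calls[delta.index];
  if (!existing) {
    calls[delta.index] = {
      id: delta.id ?? '',
      type: delta.type ?? 'function',
      function: {
        name: delta.function?.name ?? '',
        arguments: delta.function?.arguments ?? '',
      },
    };
  } else {
    if (delta.function?.arguments) existing.function.arguments += delta.function.arguments;
    if (delta.function?.name) existing.function.name += delta.function.name;
    if (delta.id) existing.id += delta.id;
  }
}

// Arguments arrive split across chunks and are only parseable once complete.
const calls: ToolCall[] = [];
mergeDelta(calls, { index: 0, id: 'call_1', function: { name: 'get_weather', arguments: '{"latitude":' } });
mergeDelta(calls, { index: 0, function: { arguments: '51.5}' } });
console.log(calls[0].function.arguments); // {"latitude":51.5}
```

This is why the loop only `JSON.parse`s `toolCall.function.arguments` after the stream reports `finish_reason === 'tool_calls'`.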


@@ -1,14 +1,17 @@
import Anthropic from '@anthropic-ai/sdk';
import type {
_NotCustomized,
ISimpleType,
ModelPropertiesDeclarationToProperties,
ModelSnapshotType2,
UnionStringArray,
} from 'mobx-state-tree';
import { OpenAI } from 'openai';
import ChatSdk from '../chat-sdk/chat-sdk.ts';
import type { GenericEnv, GenericStreamData } from '../types';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class ClaudeChatProvider extends BaseChatProvider {
private anthropic: Anthropic | null = null;
stream: true,
model: param.model,
messages: safeMessages,
...claudeTuningParams,
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.type === 'message_stop') {
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: { content: '' },
logprobs: null,
finish_reason: 'stop',
},
],
},
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
// Override the base handleStream method to use Anthropic client instead of OpenAI
async handleStream(param: CommonProviderParams, dataCallback: (data: any) => void) {
const assistantPrompt = ChatSdk.buildAssistantPrompt({ maxTokens: param.maxTokens });
const safeMessages = await ChatSdk.buildMessageChain(param.messages, {
systemPrompt: param.systemPrompt,
model: param.model,
assistantPrompt,
toolResults: param.preprocessedContext,
env: param.env,
});
const streamParams = this.getStreamParams(param, safeMessages);
if (!this.anthropic) {
throw new Error('Anthropic client not initialized');
}
const stream = await this.anthropic.messages.create(streamParams);
for await (const chunk of stream as unknown as AsyncIterable<any>) {
const shouldBreak = await this.processChunk(chunk, dataCallback);
if (shouldBreak) break;
}
maxTokens: unknown | number | undefined;
messages: any;
model: string;
env: GenericEnv;
},
dataCallback: (data: GenericStreamData) => void,
) {
return this.provider.handleStream(
{


@@ -0,0 +1,142 @@
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class CloudflareAiChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
apiKey: param.env.CLOUDFLARE_API_KEY,
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.cloudflare.replace(
'{CLOUDFLARE_ACCOUNT_ID}',
param.env.CLOUDFLARE_ACCOUNT_ID,
),
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
const generationParams: Record<string, any> = {
model: this.getModelWithPrefix(param.model),
messages: safeMessages,
stream: true,
};
// Set max_tokens based on model
if (this.getModelPrefix(param.model) === '@cf/meta') {
generationParams['max_tokens'] = 4096;
}
if (this.getModelPrefix(param.model) === '@hf/mistral') {
generationParams['max_tokens'] = 4096;
}
if (param.model.toLowerCase().includes('hermes-2-pro-mistral-7b')) {
generationParams['max_tokens'] = 1000;
}
if (param.model.toLowerCase().includes('openhermes-2.5-mistral-7b-awq')) {
generationParams['max_tokens'] = 1000;
}
if (param.model.toLowerCase().includes('deepseek-coder-6.7b-instruct-awq')) {
generationParams['max_tokens'] = 590;
}
if (param.model.toLowerCase().includes('deepseek-math-7b-instruct')) {
generationParams['max_tokens'] = 512;
}
if (param.model.toLowerCase().includes('neural-chat-7b-v3-1-awq')) {
generationParams['max_tokens'] = 590;
}
if (param.model.toLowerCase().includes('openchat-3.5-0106')) {
generationParams['max_tokens'] = 2000;
}
return generationParams;
}
private getModelPrefix(model: string): string {
let modelPrefix = `@cf/meta`;
if (model.toLowerCase().includes('llama')) {
modelPrefix = `@cf/meta`;
}
if (model.toLowerCase().includes('hermes-2-pro-mistral-7b')) {
modelPrefix = `@hf/nousresearch`;
}
if (model.toLowerCase().includes('mistral-7b-instruct')) {
modelPrefix = `@hf/mistral`;
}
if (model.toLowerCase().includes('gemma')) {
modelPrefix = `@cf/google`;
}
if (model.toLowerCase().includes('deepseek')) {
modelPrefix = `@cf/deepseek-ai`;
}
if (model.toLowerCase().includes('openchat-3.5-0106')) {
modelPrefix = `@cf/openchat`;
}
const isNeuralChat = model.toLowerCase().includes('neural-chat-7b-v3-1-awq');
if (
isNeuralChat ||
model.toLowerCase().includes('openhermes-2.5-mistral-7b-awq') ||
model.toLowerCase().includes('zephyr-7b-beta-awq') ||
model.toLowerCase().includes('deepseek-coder-6.7b-instruct-awq')
) {
modelPrefix = `@hf/thebloke`;
}
return modelPrefix;
}
private getModelWithPrefix(model: string): string {
return `${this.getModelPrefix(model)}/${model}`;
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
export class CloudflareAISdk {
private static provider = new CloudflareAiChatProvider();
static async handleCloudflareAIStream(
param: {
openai: OpenAI;
systemPrompt: any;
preprocessedContext: any;
maxTokens: unknown | number | undefined;
messages: any;
model: string;
env: Env;
},
dataCallback: (data: any) => void,
) {
return this.provider.handleStream(
{
systemPrompt: param.systemPrompt,
preprocessedContext: param.preprocessedContext,
maxTokens: param.maxTokens,
messages: param.messages,
model: param.model,
env: param.env,
},
dataCallback,
);
}
}
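`getModelPrefix` above routes a bare model name into Cloudflare's namespaced catalog by substring matching, falling back to `@cf/meta`. A condensed sketch of the same idea covering only a few of the rules (hypothetical helper, not the class method):

```typescript
// Map a bare model name to its Cloudflare Workers AI namespace by substring.
// Simplified: the real method has more rules, including an @hf/thebloke group.
function cloudflarePrefix(model: string): string {
  const m = model.toLowerCase();
  if (m.includes('gemma')) return '@cf/google';
  if (m.includes('deepseek')) return '@cf/deepseek-ai';
  if (m.includes('mistral-7b-instruct')) return '@hf/mistral';
  return '@cf/meta'; // default, e.g. llama-family models
}

console.log(cloudflarePrefix('llama-3-8b-instruct')); // @cf/meta
console.log(cloudflarePrefix('gemma-7b-it')); // @cf/google
```

The full model identifier sent to the API is then `${prefix}/${model}`, mirroring `getModelWithPrefix`.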


@@ -0,0 +1,77 @@
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class FireworksAiChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
apiKey: param.env.FIREWORKS_API_KEY,
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.fireworks,
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
let modelPrefix = 'accounts/fireworks/models/';
if (param.model.toLowerCase().includes('yi-')) {
modelPrefix = 'accounts/yi-01-ai/models/';
} else if (param.model.toLowerCase().includes('/perplexity/')) {
modelPrefix = 'accounts/perplexity/models/';
} else if (param.model.toLowerCase().includes('/sentientfoundation/')) {
modelPrefix = 'accounts/sentientfoundation/models/';
} else if (param.model.toLowerCase().includes('/sentientfoundation-serverless/')) {
modelPrefix = 'accounts/sentientfoundation-serverless/models/';
} else if (param.model.toLowerCase().includes('/instacart/')) {
modelPrefix = 'accounts/instacart/models/';
}
const finalModelIdentifier = param.model.includes(modelPrefix)
? param.model
: `${modelPrefix}${param.model}`;
console.log('using fireworks model', finalModelIdentifier);
return {
model: finalModelIdentifier,
messages: safeMessages,
stream: true,
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
export class FireworksAiChatSdk {
private static provider = new FireworksAiChatProvider();
static async handleFireworksStream(
param: {
openai: OpenAI;
systemPrompt: any;
preprocessedContext: any;
maxTokens: number;
messages: any;
model: any;
env: any;
},
// TODO: Replace usage of any with an explicit but permissive type
dataCallback: (data: any) => void,
) {
return this.provider.handleStream(
{
systemPrompt: param.systemPrompt,
preprocessedContext: param.preprocessedContext,
maxTokens: param.maxTokens,
messages: param.messages,
model: param.model,
env: param.env,
},
dataCallback,
);
}
}
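The model-path handling above reduces to one rule: prepend the owning account's namespace unless the id already carries it. A reduced sketch (hypothetical helper covering only the default and `yi-` cases):

```typescript
// Resolve a Fireworks model id to its full accounts/... path.
// Simplified: the real method also recognizes perplexity, sentientfoundation,
// and instacart account namespaces.
function fireworksModelPath(model: string): string {
  let prefix = 'accounts/fireworks/models/';
  if (model.toLowerCase().includes('yi-')) {
    prefix = 'accounts/yi-01-ai/models/';
  }
  // Idempotent: an already-prefixed id passes through unchanged.
  return model.includes(prefix) ? model : `${prefix}${model}`;
}

console.log(fireworksModelPath('llama-v3p1-8b-instruct'));
// accounts/fireworks/models/llama-v3p1-8b-instruct
```

The idempotence check matters because callers sometimes pass fully qualified ids straight from the model catalog.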


@@ -1,12 +1,12 @@
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class GoogleChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.google,
apiKey: param.env.GEMINI_API_KEY,
});
}
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices?.[0]?.finish_reason === 'stop') {
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: { content: chunk.choices[0].delta.content || '' },
finish_reason: 'stop',
index: chunk.choices[0].index,
},
],
return true;
} else {
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: { content: chunk.choices?.[0]?.delta?.content || '' },
finish_reason: null,
index: chunk.choices?.[0]?.index || 0,
},
export class GoogleChatSdk {
private static provider = new GoogleChatProvider();
static async handleGoogleStream(
static async handleGoogleStream(param: StreamParams, dataCallback: (data: any) => void) {
return this.provider.handleStream(
{
systemPrompt: param.systemPrompt,


@@ -1,17 +1,19 @@
import {
_NotCustomized,
ISimpleType,
ModelPropertiesDeclarationToProperties,
ModelSnapshotType2,
UnionStringArray,
} from 'mobx-state-tree';
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, CommonProviderParams } from './chat-stream-provider.ts';
export class GroqChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.groq,
apiKey: param.env.GROQ_API_KEY,
});
}
model: param.model,
messages: safeMessages,
stream: true,
...tuningParams,
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}


@@ -0,0 +1,8 @@
export * from './claude.ts';
export * from './cerebras.ts';
export * from './cloudflareAi.ts';
export * from './fireworks.ts';
export * from './groq.ts';
export * from './mlx-omni.ts';
export * from './ollama.ts';
export * from './xai.ts';


@@ -0,0 +1,97 @@
import { OpenAI } from 'openai';
import { type ChatCompletionCreateParamsStreaming } from 'openai/resources/chat/completions/completions';
import { Common } from '../utils';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class MlxOmniChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: 'http://localhost:10240',
apiKey: param.env.MLX_API_KEY,
});
}
getStreamParams(
param: CommonProviderParams,
safeMessages: any[],
): ChatCompletionCreateParamsStreaming {
const baseTuningParams = {
temperature: 0.86,
top_p: 0.98,
presence_penalty: 0.1,
frequency_penalty: 0.3,
max_tokens: param.maxTokens as number,
};
const getTuningParams = () => {
return baseTuningParams;
};
let completionRequest: ChatCompletionCreateParamsStreaming = {
model: param.model,
stream: true,
messages: safeMessages,
};
const client = this.getOpenAIClient(param);
const isLocal = client.baseURL.includes('localhost');
if (isLocal) {
completionRequest['messages'] = Common.Utils.normalizeWithBlanks(safeMessages);
completionRequest['stream_options'] = {
include_usage: true,
};
} else {
completionRequest = { ...completionRequest, ...getTuningParams() };
}
return completionRequest;
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
const isLocal = chunk.usage !== undefined;
if (isLocal && chunk.usage) {
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: { content: '' },
logprobs: null,
finish_reason: 'stop',
},
],
},
});
return true; // Break the stream
}
dataCallback({ type: 'chat', data: chunk });
return false; // Continue the stream
}
}
export class MlxOmniChatSdk {
private static provider = new MlxOmniChatProvider();
static async handleMlxOmniStream(ctx: any, dataCallback: (data: any) => any) {
if (!ctx.messages?.length) {
return new Response('No messages provided', { status: 400 });
}
return this.provider.handleStream(
{
systemPrompt: ctx.systemPrompt,
preprocessedContext: ctx.preprocessedContext,
maxTokens: ctx.maxTokens,
messages: Common.Utils.normalizeWithBlanks(ctx.messages),
model: ctx.model,
env: ctx.env,
},
dataCallback,
);
}
}


@@ -0,0 +1,75 @@
import { OpenAI } from 'openai';
import type { GenericEnv } from '../types';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class OllamaChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: param.env.OLLAMA_API_ENDPOINT ?? ProviderRepository.OPENAI_COMPAT_ENDPOINTS.ollama,
apiKey: param.env.OLLAMA_API_KEY,
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
const tuningParams = {
temperature: 0.75,
};
const getTuningParams = () => {
return tuningParams;
};
return {
model: param.model,
messages: safeMessages,
stream: true,
...getTuningParams(),
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
export class OllamaChatSdk {
private static provider = new OllamaChatProvider();
static async handleOllamaStream(
ctx: {
openai: OpenAI;
systemPrompt: any;
preprocessedContext: any;
maxTokens: unknown | number | undefined;
messages: any;
model: any;
env: GenericEnv;
},
dataCallback: (data: any) => any,
) {
if (!ctx.messages?.length) {
return new Response('No messages provided', { status: 400 });
}
return this.provider.handleStream(
{
systemPrompt: ctx.systemPrompt,
preprocessedContext: ctx.preprocessedContext,
maxTokens: ctx.maxTokens,
messages: ctx.messages,
model: ctx.model,
env: ctx.env,
},
dataCallback,
);
}
}


@@ -1,16 +1,21 @@
import { OpenAI } from 'openai';
import type { ChatCompletionCreateParamsStreaming } from 'openai/resources/chat/completions/completions';
import { Common } from '../utils';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class OpenAiChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return param.openai as OpenAI;
}
getStreamParams(
param: CommonProviderParams,
safeMessages: any[],
): ChatCompletionCreateParamsStreaming {
const isO1 = () => {
if (param.model === 'o1-preview' || param.model === 'o1-mini') {
return true;
}
};
const getTuningParams = () => {
if (isO1()) {
tuningParams['temperature'] = 1;
tuningParams['max_completion_tokens'] = (param.maxTokens as number) + 10000;
return tuningParams;
}
return gpt4oTuningParams;
let completionRequest: ChatCompletionCreateParamsStreaming = {
model: param.model,
stream: true,
messages: safeMessages,
};
const client = this.getOpenAIClient(param);
const isLocal = client.baseURL.includes('localhost');
if (isLocal) {
completionRequest['messages'] = Common.Utils.normalizeWithBlanks(safeMessages);
completionRequest['stream_options'] = {
include_usage: true,
};
} else {
completionRequest = { ...completionRequest, ...getTuningParams() };
if (isLocal && chunk.usage) {
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: { content: '' },
logprobs: null,
finish_reason: 'stop',
},
],
},
return true; // Break the stream
}
dataCallback({ type: 'chat', data: chunk });
return false; // Continue the stream
}
}
dataCallback: (data: any) => any,
) {
if (!ctx.messages?.length) {
return new Response('No messages provided', { status: 400 });
}
return this.provider.handleStream(


@@ -0,0 +1,75 @@
import { OpenAI } from 'openai';
import type { GenericEnv, GenericStreamData } from '../types';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class XaiChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: 'https://api.x.ai/v1',
apiKey: param.env.XAI_API_KEY,
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
const tuningParams = {
temperature: 0.75,
};
const getTuningParams = () => {
return tuningParams;
};
return {
model: param.model,
messages: safeMessages,
stream: true,
...getTuningParams(),
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
export class XaiChatSdk {
private static provider = new XaiChatProvider();
static async handleXaiStream(
ctx: {
openai: OpenAI;
systemPrompt: any;
preprocessedContext: any;
maxTokens: number | undefined;
messages: any;
model: any;
env: GenericEnv;
disableWebhookGeneration?: boolean;
},
dataCallback: (data: GenericStreamData) => any,
) {
if (!ctx.messages?.length) {
return new Response('No messages provided', { status: 400 });
}
return this.provider.handleStream(
{
systemPrompt: ctx.systemPrompt,
preprocessedContext: ctx.preprocessedContext,
maxTokens: ctx.maxTokens,
messages: ctx.messages,
model: ctx.model,
env: ctx.env,
disableWebhookGeneration: ctx.disableWebhookGeneration,
},
dataCallback,
);
}
}


@@ -0,0 +1,21 @@
// tools/basicValue.ts
export interface BasicValueResult {
value: string;
}
export const BasicValueTool = {
name: 'basicValue',
type: 'function',
description: 'Returns a basic value (timestamp-based) for testing',
parameters: {
type: 'object',
properties: {},
required: [],
},
function: async (): Promise<BasicValueResult> => {
// generate something obviously basic
const basic = `tool-called-${Date.now()}`;
console.log('[BasicValueTool] returning:', basic);
return { value: basic };
},
};


@@ -0,0 +1,25 @@
export async function getWeather(latitude: number, longitude: number) {
const response = await fetch(
`https://api.open-meteo.com/v1/forecast?latitude=${latitude}&longitude=${longitude}&current=temperature_2m,wind_speed_10m&hourly=temperature_2m,relative_humidity_2m,wind_speed_10m`,
);
const data = await response.json();
return data.current.temperature_2m;
}
export const WeatherTool = {
type: 'function',
function: {
name: 'get_weather',
description: 'Get current temperature for provided coordinates in celsius.',
parameters: {
type: 'object',
properties: {
latitude: { type: 'number' },
longitude: { type: 'number' },
},
required: ['latitude', 'longitude'],
additionalProperties: false,
},
strict: true,
},
};
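`WeatherTool` follows the OpenAI function-calling convention: the model returns the tool's arguments as a JSON string, which the host must parse and validate before invoking `getWeather`. A minimal sketch of that dispatch step — the `ToolCall` shape and `parseWeatherArgs` helper are illustrative, not part of the codebase:

```typescript
// Sketch: validating a model tool call before dispatching to get_weather.
// In the OpenAI convention, arguments arrive as a JSON string.
interface ToolCall {
  function: { name: string; arguments: string };
}

function parseWeatherArgs(call: ToolCall): { latitude: number; longitude: number } {
  const args = JSON.parse(call.function.arguments);
  // Mirror the schema: both coordinates are required numbers.
  if (typeof args.latitude !== 'number' || typeof args.longitude !== 'number') {
    throw new Error('latitude and longitude are required numbers');
  }
  return args;
}

const call: ToolCall = {
  function: { name: 'get_weather', arguments: '{"latitude":42.35,"longitude":-70.88}' },
};
console.log(parseWeatherArgs(call)); // { latitude: 42.35, longitude: -70.88 }
```

Validating on the host side preserves the guarantees of `strict: true` even against providers that ignore the schema.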


@@ -0,0 +1,68 @@
export interface ShipControlResult {
message: string;
status: 'success' | 'error';
data?: any;
}
/**
* A mock interface for controlling a ship.
*/
export const YachtpitTools = {
type: 'function',
description: 'Interface for controlling a ship: set speed, change heading, report status, etc.',
/**
* Mock implementation of a ship control command.
*/
function: {
name: 'ship_control',
parameters: {
type: 'object',
properties: {
action: {
type: 'string',
enum: ['set_speed', 'change_heading', 'report_status', 'stop'],
description: 'Action to perform on the ship.',
},
value: {
type: 'number',
description:
'Numeric value for the action, such as speed (knots) or heading (degrees). Only required for set_speed and change_heading.',
},
},
required: ['action'],
additionalProperties: false,
},
},
};
export function yachtpitAi(args: { action: string; value?: number }): ShipControlResult {
switch (args.action) {
case 'set_speed':
if (typeof args.value !== 'number') {
return { status: 'error', message: 'Missing speed value.' };
}
return { status: 'success', message: `Speed set to ${args.value} knots.` };
case 'change_heading':
if (typeof args.value !== 'number') {
return { status: 'error', message: 'Missing heading value.' };
}
return { status: 'success', message: `Heading changed to ${args.value} degrees.` };
case 'report_status':
// Return a simulated ship status
return {
status: 'success',
message: 'Ship status reported.',
data: {
speed: 12,
heading: 87,
engine: 'nominal',
position: { lat: 42.35, lon: -70.88 },
},
};
case 'stop':
return { status: 'success', message: 'Ship stopped.' };
default:
return { status: 'error', message: 'Invalid action.' };
}
}
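Because `yachtpitAi` is a synchronous, side-effect-free dispatcher, it can be exercised directly. A trimmed, self-contained copy of its logic (only the `set_speed` and `stop` branches, for brevity):

```typescript
// Trimmed synchronous copy of the mock ship controller above.
type ShipControlResult = {
  message: string;
  status: 'success' | 'error';
  data?: any;
};

function yachtpitAi(args: { action: string; value?: number }): ShipControlResult {
  switch (args.action) {
    case 'set_speed':
      // Guard: set_speed requires a numeric value.
      if (typeof args.value !== 'number') {
        return { status: 'error', message: 'Missing speed value.' };
      }
      return { status: 'success', message: `Speed set to ${args.value} knots.` };
    case 'stop':
      return { status: 'success', message: 'Ship stopped.' };
    default:
      return { status: 'error', message: 'Invalid action.' };
  }
}

console.log(yachtpitAi({ action: 'set_speed', value: 10 }).status); // success
console.log(yachtpitAi({ action: 'set_speed' }).status); // error
```

The guard on `args.value` matters because the JSON Schema marks only `action` as required, so `value` can legitimately be absent.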


@@ -0,0 +1 @@
export * from './types.ts';


@@ -0,0 +1,5 @@
{
"name": "@open-gsio/types",
"type": "module",
"module": "index.ts"
}


@@ -0,0 +1,29 @@
import { ProviderRepository } from '../providers/_ProviderRepository.ts';
export type GenericEnv = Record<string, any>;
export type GenericStreamData = any;
export type ModelMeta = {
id: any;
} & Record<string, any>;
export type SupportedProvider = keyof typeof ProviderRepository.OPENAI_COMPAT_ENDPOINTS & string;
export type Provider = { name: SupportedProvider; key: string; endpoint: string };
export type Providers = Provider[];
export type ChatRequestBody = {
messages: any[];
model: string;
conversationId: string;
};
export interface BuildAssistantPromptParams {
maxTokens: any;
}
export interface PreprocessParams {
messages: any[];
}


@@ -22,15 +22,9 @@ interface StreamResponse {
};
}
const handleStreamData = (
controller: ReadableStreamDefaultController,
encoder: TextEncoder,
) => {
return (
data: StreamResponse,
transformFn?: (data: StreamResponse) => StreamResponse,
) => {
if (!data?.type || data.type !== "chat") {
const handleStreamData = (controller: ReadableStreamDefaultController, encoder: TextEncoder) => {
return (data: StreamResponse, transformFn?: (data: StreamResponse) => StreamResponse) => {
if (!data?.type || data.type !== 'chat') {
return;
}
@@ -39,17 +33,14 @@ const handleStreamData = (
if (transformFn) {
transformedData = transformFn(data);
} else {
if (
data.data.type === "content_block_start" &&
data.data.content_block?.type === "text"
) {
if (data.data.type === 'content_block_start' && data.data.content_block?.type === 'text') {
transformedData = {
type: "chat",
type: 'chat',
data: {
choices: [
{
delta: {
content: data.data.content_block.text || "",
content: data.data.content_block.text || '',
},
logprobs: null,
finish_reason: null,
@@ -59,7 +50,7 @@ const handleStreamData = (
};
} else if (data.data.delta?.text) {
transformedData = {
type: "chat",
type: 'chat',
data: {
choices: [
{
@@ -74,7 +65,7 @@ const handleStreamData = (
};
} else if (data.data.choices?.[0]?.delta?.content) {
transformedData = {
type: "chat",
type: 'chat',
data: {
choices: [
{
@@ -95,9 +86,7 @@ const handleStreamData = (
}
}
controller.enqueue(
encoder.encode(`data: ${JSON.stringify(transformedData)}\n\n`),
);
controller.enqueue(encoder.encode(`data: ${JSON.stringify(transformedData)}\n\n`));
};
};
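Whatever provider shape arrives, `handleStreamData` always emits one wire format: a server-sent-events frame of the form `data: <json>\n\n`. The framing step in isolation, as a small sketch:

```typescript
// Minimal sketch of the SSE framing used above: each normalized chat
// chunk is serialized into a `data: <json>\n\n` frame for the stream.
const encoder = new TextEncoder();

function frame(data: { type: string; data: unknown }): Uint8Array {
  return encoder.encode(`data: ${JSON.stringify(data)}\n\n`);
}

const bytes = frame({ type: 'chat', data: { choices: [{ delta: { content: 'hi' } }] } });
console.log(new TextDecoder().decode(bytes).startsWith('data: ')); // true
```

The trailing blank line (`\n\n`) is what terminates each SSE event, which is why the enqueue call above always appends it.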


@@ -0,0 +1,3 @@
import * as Common from './utils.ts';
export { Common };


@@ -1,20 +1,19 @@
import handleStreamData from './handleStreamData.ts';
export class Utils {
static getSeason(date: string): string {
const hemispheres = {
Northern: ["Winter", "Spring", "Summer", "Autumn"],
Southern: ["Summer", "Autumn", "Winter", "Spring"],
Northern: ['Winter', 'Spring', 'Summer', 'Autumn'],
Southern: ['Summer', 'Autumn', 'Winter', 'Spring'],
};
const d = new Date(date);
const month = d.getMonth();
const day = d.getDate();
const hemisphere = "Northern";
const hemisphere = 'Northern';
if (month < 2 || (month === 2 && day <= 20) || month === 11)
return hemispheres[hemisphere][0];
if (month < 5 || (month === 5 && day <= 21))
return hemispheres[hemisphere][1];
if (month < 8 || (month === 8 && day <= 22))
return hemispheres[hemisphere][2];
if (month < 2 || (month === 2 && day <= 20) || month === 11) return hemispheres[hemisphere][0];
if (month < 5 || (month === 5 && day <= 21)) return hemispheres[hemisphere][1];
if (month < 8 || (month === 8 && day <= 22)) return hemispheres[hemisphere][2];
return hemispheres[hemisphere][3];
}
static getTimezone(timezone) {
@@ -30,18 +29,16 @@ export class Utils {
static isAssetUrl(url) {
const { pathname } = new URL(url);
return pathname.startsWith("/assets/");
return pathname.startsWith('/assets/');
}
static selectEquitably({ a, b, c, d }, itemCount = 9) {
const sources = [a, b, c, d];
const result = {};
let combinedItems = [];
let combinedItems: any[] = [];
sources.forEach((source, index) => {
combinedItems.push(
...Object.keys(source).map((key) => ({ source: index, key })),
);
combinedItems.push(...Object.keys(source).map(key => ({ source: index, key })));
});
combinedItems = combinedItems.sort(() => Math.random() - 0.5);
@@ -60,37 +57,37 @@ export class Utils {
return result;
}
static normalizeWithBlanks<T extends Normalize.ChatMessage>(msgs: T[]): T[] {
static normalizeWithBlanks<T extends NormalizeChatMessage>(msgs: T[]): T[] {
const out: T[] = [];
// In local mode first turn expected to be user.
let expected: Normalize.Role = "user";
let expected: NormalizeRole = 'user';
for (const m of msgs) {
while (m.role !== expected) {
// Insert blanks to match expected sequence user/assistant/user...
out.push(Normalize.makeBlank(expected) as T);
expected = expected === "user" ? "assistant" : "user";
out.push(makeNormalizeBlank(expected) as T);
expected = expected === 'user' ? 'assistant' : 'user';
}
out.push(m);
expected = expected === "user" ? "assistant" : "user";
expected = expected === 'user' ? 'assistant' : 'user';
}
return out;
}
static handleStreamData = handleStreamData;
}
module Normalize {
export type Role = "user" | "assistant";
// Normalize module exports
export type NormalizeRole = 'user' | 'assistant';
export interface ChatMessage extends Record<any, any> {
role: Role;
export interface NormalizeChatMessage extends Record<any, any> {
role: NormalizeRole;
}
export const makeBlank = (role: Role): ChatMessage => ({
export const makeNormalizeBlank = (role: NormalizeRole): NormalizeChatMessage => ({
role,
content: ""
content: '',
});
}
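The alternating-role invariant that `normalizeWithBlanks` enforces — blank turns inserted so the history strictly alternates user/assistant — is easiest to see with a standalone sketch (a trimmed copy of the logic above, assuming plain `{ role, content }` messages):

```typescript
type Role = 'user' | 'assistant';
interface ChatMessage { role: Role; content: string }

const makeBlank = (role: Role): ChatMessage => ({ role, content: '' });

// Insert blank turns so the sequence strictly alternates user/assistant/...
function normalizeWithBlanks(msgs: ChatMessage[]): ChatMessage[] {
  const out: ChatMessage[] = [];
  let expected: Role = 'user'; // local mode expects the first turn to be user
  for (const m of msgs) {
    while (m.role !== expected) {
      out.push(makeBlank(expected));
      expected = expected === 'user' ? 'assistant' : 'user';
    }
    out.push(m);
    expected = expected === 'user' ? 'assistant' : 'user';
  }
  return out;
}

// Two consecutive user turns get a blank assistant turn between them.
const fixed = normalizeWithBlanks([
  { role: 'user', content: 'hi' },
  { role: 'user', content: 'still there?' },
]);
console.log(fixed.map(m => m.role)); // [ 'user', 'assistant', 'user' ]
```

This is why the local-model branch in `OpenAiChatProvider` runs messages through this helper: some local chat templates reject histories with two same-role turns in a row.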


@@ -1,88 +0,0 @@
const SUPPORTED_MODELS_GROUPS = {
openai: [
// "o1-preview",
// "o1-mini",
// "gpt-4o",
// "gpt-3.5-turbo"
],
groq: [
// "mixtral-8x7b-32768",
// "deepseek-r1-distill-llama-70b",
"meta-llama/llama-4-scout-17b-16e-instruct",
"gemma2-9b-it",
"mistral-saba-24b",
// "qwen-2.5-32b",
"llama-3.3-70b-versatile",
// "llama-3.3-70b-versatile"
// "llama-3.1-70b-versatile",
// "llama-3.3-70b-versatile"
],
cerebras: ["llama-3.3-70b"],
claude: [
// "claude-3-5-sonnet-20241022",
// "claude-3-opus-20240229"
],
fireworks: [
// "llama-v3p1-405b-instruct",
// "llama-v3p1-70b-instruct",
// "llama-v3p2-90b-vision-instruct",
// "mixtral-8x22b-instruct",
// "mythomax-l2-13b",
// "yi-large"
],
google: [
// "gemini-2.0-flash-exp",
// "gemini-1.5-flash",
// "gemini-exp-1206",
// "gemini-1.5-pro"
],
xai: [
// "grok-beta",
// "grok-2",
// "grok-2-1212",
// "grok-2-latest",
// "grok-beta"
],
cloudflareAI: [
"llama-3.2-3b-instruct", // max_tokens
"llama-3-8b-instruct", // max_tokens
"llama-3.1-8b-instruct-fast", // max_tokens
"deepseek-math-7b-instruct",
"deepseek-coder-6.7b-instruct-awq",
"hermes-2-pro-mistral-7b",
"openhermes-2.5-mistral-7b-awq",
"mistral-7b-instruct-v0.2",
"neural-chat-7b-v3-1-awq",
"openchat-3.5-0106",
// "gemma-7b-it",
],
};
export type SupportedModel =
| keyof typeof SUPPORTED_MODELS_GROUPS
| (typeof SUPPORTED_MODELS_GROUPS)[keyof typeof SUPPORTED_MODELS_GROUPS][number];
export type ModelFamily = keyof typeof SUPPORTED_MODELS_GROUPS;
function getModelFamily(model: string): ModelFamily | undefined {
return Object.keys(SUPPORTED_MODELS_GROUPS)
.filter((family) => {
return SUPPORTED_MODELS_GROUPS[
family as keyof typeof SUPPORTED_MODELS_GROUPS
].includes(model.trim());
})
.at(0) as ModelFamily | undefined;
}
const SUPPORTED_MODELS = [
// ...SUPPORTED_MODELS_GROUPS.xai,
// ...SUPPORTED_MODELS_GROUPS.claude,
// ...SUPPORTED_MODELS_GROUPS.google,
...SUPPORTED_MODELS_GROUPS.groq,
// ...SUPPORTED_MODELS_GROUPS.fireworks,
// ...SUPPORTED_MODELS_GROUPS.openai,
// ...SUPPORTED_MODELS_GROUPS.cerebras,
// ...SUPPORTED_MODELS_GROUPS.cloudflareAI,
];
export { SUPPORTED_MODELS, SUPPORTED_MODELS_GROUPS, getModelFamily };


@@ -0,0 +1,9 @@
{
"extends": "../../tsconfig.json",
"compilerOptions": {
"outDir": "dist",
"rootDir": "."
},
"include": ["*.ts"],
"exclude": ["node_modules"]
}


@@ -5,23 +5,36 @@
"dev": "bun vite dev",
"build": "bun vite build",
"tests": "vitest run",
"tests:coverage": "vitest run --coverage.enabled=true"
"tests:coverage": "vitest run --coverage.enabled=true",
"generate:sitemap": "bun ./scripts/generate_sitemap.js open-gsio.seemueller.workers.dev",
"generate:robotstxt": "bun ./scripts/generate_robots_txt.js open-gsio.seemueller.workers.dev",
"generate:fonts": "cp -r ../../node_modules/katex/dist/fonts public/static",
"generate:pwa:assets": "test ! -f public/pwa-64x64.png && pwa-assets-generator --preset minimal-2023 public/logo.png || echo 'PWA assets already exist'"
},
"dependencies": {
"@open-gsio/env": "workspace:*",
"@open-gsio/scripts": "workspace:*",
"@anthropic-ai/sdk": "^0.32.1",
"exports": {
"./server/index.ts": {
"import": "./server/index.ts",
"types": "./server/index.ts"
}
},
"devDependencies": {
"@chakra-ui/icons": "^2.2.4",
"@chakra-ui/react": "^2.10.6",
"@cloudflare/workers-types": "^4.20241205.0",
"@emotion/react": "^11.13.5",
"@emotion/styled": "^11.13.5",
"@open-gsio/env": "workspace:*",
"@open-gsio/scripts": "workspace:*",
"@testing-library/jest-dom": "^6.4.2",
"@testing-library/react": "^14.2.1",
"@testing-library/react": "^16.3.0",
"@testing-library/user-event": "^14.5.2",
"@types/bun": "^1.2.17",
"@types/marked": "^6.0.0",
"@vite-pwa/assets-generator": "^1.0.0",
"@vitejs/plugin-react": "^4.3.4",
"@vitest/coverage-v8": "^3.1.4",
"@vitest/ui": "^3.1.4",
"bun": "^1.2.17",
"chokidar": "^4.0.1",
"framer-motion": "^11.13.1",
"isomorphic-dompurify": "^2.19.0",
@@ -29,6 +42,7 @@
"jsdom": "^24.0.0",
"katex": "^0.16.20",
"lucide-react": "^0.436.0",
"mapbox-gl": "^3.13.0",
"marked": "^15.0.4",
"marked-extended-latex": "^1.1.0",
"marked-footnote": "^1.2.4",
@@ -36,18 +50,19 @@
"mobx": "^6.13.5",
"mobx-react-lite": "^4.0.7",
"mobx-state-tree": "^6.0.1",
"moo": "^0.5.2",
"qrcode.react": "^4.1.0",
"react": "^18.3.1",
"react-dom": "^18.3.1",
"react": "^19.1.0",
"react-dom": "^19.1.0",
"react-icons": "^5.4.0",
"react-streaming": "^0.3.44",
"react-map-gl": "^8.0.4",
"react-streaming": "^0.4.2",
"react-textarea-autosize": "^8.5.5",
"shiki": "^1.24.0",
"tslog": "^4.9.3",
"typescript": "^5.7.2",
"vike": "0.4.193",
"vite": "^6.3.5",
"vite-plugin-pwa": "^1.0.0",
"vike": "^0.4.235",
"vite": "^7.0.0",
"vite-plugin-pwa": "^1.0.1",
"vitest": "^3.1.4"
}
}


@@ -15,30 +15,29 @@
};
function s() {
var i = [
g(m(4)) + "=" + g(m(6)),
"ga=" + t.ga_tid,
"dt=" + r(e.title),
"de=" + r(e.characterSet || e.charset),
"dr=" + r(e.referrer),
"ul=" + (n.language || n.browserLanguage || n.userLanguage),
"sd=" + a.colorDepth + "-bit",
"sr=" + a.width + "x" + a.height,
"vp=" +
g(m(4)) + '=' + g(m(6)),
'ga=' + t.ga_tid,
'dt=' + r(e.title),
'de=' + r(e.characterSet || e.charset),
'dr=' + r(e.referrer),
'ul=' + (n.language || n.browserLanguage || n.userLanguage),
'sd=' + a.colorDepth + '-bit',
'sr=' + a.width + 'x' + a.height,
'vp=' +
o(e.documentElement.clientWidth, t.innerWidth || 0) +
"x" +
'x' +
o(e.documentElement.clientHeight, t.innerHeight || 0),
"plt=" + c(d.loadEventStart - d.navigationStart || 0),
"dns=" + c(d.domainLookupEnd - d.domainLookupStart || 0),
"pdt=" + c(d.responseEnd - d.responseStart || 0),
"rrt=" + c(d.redirectEnd - d.redirectStart || 0),
"tcp=" + c(d.connectEnd - d.connectStart || 0),
"srt=" + c(d.responseStart - d.requestStart || 0),
"dit=" + c(d.domInteractive - d.domLoading || 0),
"clt=" + c(d.domContentLoadedEventStart - d.navigationStart || 0),
"z=" + Date.now(),
'plt=' + c(d.loadEventStart - d.navigationStart || 0),
'dns=' + c(d.domainLookupEnd - d.domainLookupStart || 0),
'pdt=' + c(d.responseEnd - d.responseStart || 0),
'rrt=' + c(d.redirectEnd - d.redirectStart || 0),
'tcp=' + c(d.connectEnd - d.connectStart || 0),
'srt=' + c(d.responseStart - d.requestStart || 0),
'dit=' + c(d.domInteractive - d.domLoading || 0),
'clt=' + c(d.domContentLoadedEventStart - d.navigationStart || 0),
'z=' + Date.now(),
];
(t.__ga_img = new Image()), (t.__ga_img.src = t.ga_api + "?" + i.join("&"));
((t.__ga_img = new Image()), (t.__ga_img.src = t.ga_api + '?' + i.join('&')));
}
(t.cfga = s),
"complete" === e.readyState ? s() : t.addEventListener("load", s);
((t.cfga = s), 'complete' === e.readyState ? s() : t.addEventListener('load', s));
})(window, document, navigator);


@@ -1,19 +0,0 @@
{
"name": "",
"short_name": "",
"icons": [
{
"src": "/android-chrome-192x192.png",
"sizes": "192x192",
"type": "image/png"
},
{
"src": "/android-chrome-512x512.png",
"sizes": "512x512",
"type": "image/png"
}
],
"theme_color": "#fffff0",
"background_color": "#000000",
"display": "standalone"
}


@@ -1,8 +1,8 @@
#!/usr/bin/env bun
/* eslint-env node */
import fs from "fs";
import {parseArgs} from "util";
import fs from 'fs';
import { parseArgs } from 'util';
const { positionals } = parseArgs({
args: Bun.argv,
@@ -11,7 +11,7 @@ const {positionals} = parseArgs({
allowPositionals: true,
});
const currentDate = new Date().toISOString().split("T")[0];
const currentDate = new Date().toISOString().split('T')[0];
const host = positionals[2];
@@ -25,12 +25,12 @@ Disallow: /assets
Sitemap: https://${host}/sitemap.xml
`;
const robotsTxtPath = "./public/robots.txt";
const robotsTxtPath = './public/robots.txt';
fs.writeFile(robotsTxtPath, robotsTxtTemplate, (err) => {
fs.writeFile(robotsTxtPath, robotsTxtTemplate, err => {
if (err) {
console.error("Error writing robots.txt:", err);
console.error('Error writing robots.txt:', err);
process.exit(1);
}
console.log("robots.txt created successfully:", currentDate);
console.log('robots.txt created successfully:', currentDate);
});


@@ -1,8 +1,7 @@
#!/usr/bin/env bun
import fs from "fs";
import {parseArgs} from "util";
import fs from 'fs';
import { parseArgs } from 'util';
const { positionals } = parseArgs({
args: Bun.argv,
@@ -11,7 +10,7 @@ const {positionals} = parseArgs({
allowPositionals: true,
});
const currentDate = new Date().toISOString().split("T")[0];
const currentDate = new Date().toISOString().split('T')[0];
const host = positionals[2];
@@ -30,12 +29,12 @@ const sitemapTemplate = `<?xml version="1.0" encoding="UTF-8"?>
</url>
</urlset>`;
const sitemapPath = "./public/sitemap.xml";
const sitemapPath = './public/sitemap.xml';
fs.writeFile(sitemapPath, sitemapTemplate, (err) => {
fs.writeFile(sitemapPath, sitemapTemplate, err => {
if (err) {
console.error("Error writing sitemap file:", err);
console.error('Error writing sitemap file:', err);
process.exit(1);
}
console.log("Sitemap updated successfully with current date:", currentDate);
console.log('Sitemap updated successfully with current date:', currentDate);
});


@@ -0,0 +1,20 @@
import { renderPage } from 'vike/server';
// This is what makes SSR possible. It is consumed by @open-gsio/server
export { handleSsr };
async function handleSsr(url: string, headers: Headers) {
const pageContextInit = {
urlOriginal: url,
headersOriginal: headers,
fetch: (...args: Parameters<typeof fetch>) => fetch(...args),
};
const pageContext = await renderPage(pageContextInit);
const { httpResponse } = pageContext;
const stream = httpResponse.getReadableWebStream();
return new Response(stream, {
headers: httpResponse.headers,
status: httpResponse.statusCode,
});
}


@@ -1,7 +1,8 @@
import React from "react";
import { IconButton } from "@chakra-ui/react";
import { LucideHammer } from "lucide-react";
import { toolbarButtonZIndex } from "./toolbar/Toolbar";
import { IconButton } from '@chakra-ui/react';
import { LucideHammer } from 'lucide-react';
import React from 'react';
import { toolbarButtonZIndex } from './toolbar/Toolbar';
export default function BuiltWithButton() {
return (
@@ -12,12 +13,12 @@ export default function BuiltWithButton() {
bg="transparent"
stroke="text.accent"
color="text.accent"
onClick={() => alert("Built by Geoff Seemueller")}
onClick={() => alert('Built by GSIO')}
_hover={{
bg: "transparent",
bg: 'transparent',
svg: {
stroke: "accent.secondary",
transition: "stroke 0.3s ease-in-out",
stroke: 'accent.secondary',
transition: 'stroke 0.3s ease-in-out',
},
}}
zIndex={toolbarButtonZIndex}


@@ -1,10 +1,12 @@
import { getColorThemes } from "../layout/theme/color-themes";
import { Center, IconButton, VStack } from "@chakra-ui/react";
import userOptionsStore from "../stores/UserOptionsStore";
import { Circle } from "lucide-react";
import { toolbarButtonZIndex } from "./toolbar/Toolbar";
import React from "react";
import { useIsMobile } from "./contexts/MobileContext";
import { Center, IconButton, VStack } from '@chakra-ui/react';
import { Circle } from 'lucide-react';
import React from 'react';
import { getColorThemes } from '../layout/theme/color-themes';
import userOptionsStore from '../stores/UserOptionsStore';
import { useIsMobile } from './contexts/MobileContext';
import { toolbarButtonZIndex } from './toolbar/Toolbar';
export function ThemeSelectionOptions() {
const children = [];
@@ -24,11 +26,11 @@ export function ThemeSelectionOptions() {
size={!isMobile ? 16 : 20}
stroke="transparent"
style={{
background: `conic-gradient(${theme.colors.background.primary.startsWith("#") ? theme.colors.background.primary : theme.colors.background.secondary} 0 50%, ${theme.colors.text.secondary} 50% 100%)`,
borderRadius: "50%",
boxShadow: "0 0 0.5px 0.25px #fff",
cursor: "pointer",
transition: "background 0.2s",
background: `conic-gradient(${theme.colors.background.primary.startsWith('#') ? theme.colors.background.primary : theme.colors.background.secondary} 0 50%, ${theme.colors.text.secondary} 50% 100%)`,
borderRadius: '50%',
boxShadow: '0 0 0.5px 0.25px #fff',
cursor: 'pointer',
transition: 'background 0.2s',
}}
/>
}
@@ -38,7 +40,7 @@ export function ThemeSelectionOptions() {
color="transparent"
_hover={{
svg: {
transition: "stroke 0.3s ease-in-out", // Smooth transition effect
transition: 'stroke 0.3s ease-in-out', // Smooth transition effect
},
}}
zIndex={toolbarButtonZIndex}
@@ -47,7 +49,7 @@ export function ThemeSelectionOptions() {
}
return (
<VStack align={!isMobile ? "end" : "start"} p={1.2}>
<VStack align={!isMobile ? 'end' : 'start'} p={1.2}>
<Center>{children}</Center>
</VStack>
);


@@ -1,11 +1,9 @@
import { motion } from "framer-motion";
import { Box, Center, VStack } from "@chakra-ui/react";
import {
welcome_home_text,
welcome_home_tip,
} from "../static-data/welcome_home_text";
import {renderMarkdown} from "./markdown/MarkdownComponent";
import { Box, Center, VStack } from '@chakra-ui/react';
import { motion } from 'framer-motion';
import { welcome_home_text, welcome_home_tip } from '../static-data/welcome_home_text';
import { renderMarkdown } from './markdown/MarkdownComponent';
function WelcomeHomeMessage({ visible }) {
const containerVariants = {
@@ -45,33 +43,19 @@ function WelcomeHomeMessage({ visible }) {
<Center>
<VStack spacing={8} align="center" maxW="400px">
{/* Welcome Message */}
<Box
fontSize="sm"
fontStyle="italic"
textAlign="center"
color="text.secondary"
mt={4}
>
<Box fontSize="sm" fontStyle="italic" textAlign="center" color="text.secondary" mt={4}>
<motion.div
variants={containerVariants}
initial="hidden"
animate={visible ? "visible" : "hidden"}
animate={visible ? 'visible' : 'hidden'}
>
<Box userSelect={"none"}>
<motion.div variants={textVariants}>
{renderMarkdown(welcome_home_text)}
</motion.div>
<Box userSelect={'none'}>
<motion.div variants={textVariants}>{renderMarkdown(welcome_home_text)}</motion.div>
</Box>
</motion.div>
</Box>
<motion.div variants={textVariants}>
<Box
fontSize="sm"
fontStyle="italic"
textAlign="center"
color="text.secondary"
mt={1}
>
<Box fontSize="sm" fontStyle="italic" textAlign="center" color="text.secondary" mt={1}>
{renderMarkdown(welcome_home_tip)}
</Box>
</motion.div>


@@ -1,8 +1,9 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { render, screen, fireEvent } from '@testing-library/react';
import { ThemeSelectionOptions } from '../ThemeSelection';
import { describe, it, expect, vi, beforeEach } from 'vitest';
import userOptionsStore from '../../stores/UserOptionsStore';
import * as MobileContext from '../contexts/MobileContext';
import { ThemeSelectionOptions } from '../ThemeSelection';
// Mock dependencies
vi.mock('../../layout/theme/color-themes', () => ({
@@ -11,27 +12,27 @@ vi.mock('../../layout/theme/color-themes', () => ({
name: 'light',
colors: {
background: { primary: '#ffffff', secondary: '#f0f0f0' },
text: { secondary: '#333333' }
}
text: { secondary: '#333333' },
},
},
{
name: 'dark',
colors: {
background: { primary: '#121212', secondary: '#1e1e1e' },
text: { secondary: '#e0e0e0' }
}
}
]
text: { secondary: '#e0e0e0' },
},
},
],
}));
vi.mock('../../stores/UserOptionsStore', () => ({
default: {
selectTheme: vi.fn()
}
selectTheme: vi.fn(),
},
}));
vi.mock('../toolbar/Toolbar', () => ({
toolbarButtonZIndex: 100
toolbarButtonZIndex: 100,
}));
describe('ThemeSelectionOptions', () => {
@@ -46,7 +47,7 @@ describe('ThemeSelectionOptions', () => {
render(<ThemeSelectionOptions />);
// Should render 2 theme buttons (from our mock)
const buttons = screen.getAllByRole("button")
const buttons = screen.getAllByRole('button');
expect(buttons).toHaveLength(2);
});


@@ -1,12 +1,13 @@
import { describe, it, expect } from 'vitest';
import { render, screen } from '@testing-library/react';
import WelcomeHomeMessage from '../WelcomeHome';
import { describe, it, expect } from 'vitest';
import { welcome_home_text, welcome_home_tip } from '../../static-data/welcome_home_text';
import { renderMarkdown } from '../markdown/MarkdownComponent';
import WelcomeHomeMessage from '../WelcomeHome';
// Mock the renderMarkdown function
vi.mock('../markdown/MarkdownComponent', () => ({
renderMarkdown: vi.fn((text) => `Rendered: ${text}`),
renderMarkdown: vi.fn(text => `Rendered: ${text}`),
}));
describe('WelcomeHomeMessage', () => {


@@ -1,14 +1,14 @@
import React from "react";
import { Grid, GridItem, Image, Text } from "@chakra-ui/react";
import { Grid, GridItem, Image, Text } from '@chakra-ui/react';
import React from 'react';
const fontSize = "md";
const fontSize = 'md';
function AboutComponent() {
return (
<Grid
templateColumns="1fr"
gap={4}
maxW={["100%", "100%", "100%"]}
maxW={['100%', '100%', '100%']}
mx="auto"
className="about-container"
>
@@ -17,22 +17,22 @@ function AboutComponent() {
src="/me.png"
alt="Geoff Seemueller"
borderRadius="full"
boxSize={["120px", "150px"]}
boxSize={['120px', '150px']}
objectFit="cover"
/>
</GridItem>
<GridItem
colSpan={1}
maxW={["100%", "100%", "container.md"]}
maxW={['100%', '100%', 'container.md']}
justifySelf="center"
minH={"100%"}
minH={'100%'}
>
<Grid templateColumns="1fr" gap={4} overflowY={"auto"}>
<Grid templateColumns="1fr" gap={4} overflowY={'auto'}>
<GridItem>
<Text fontSize={fontSize}>
If you're interested in collaborating on innovative projects that
push technological boundaries and create real value, I'd be keen
to connect and explore potential opportunities.
If you're interested in collaborating on innovative projects that push technological
boundaries and create real value, I'd be keen to connect and explore potential
opportunities.
</Text>
</GridItem>
</Grid>


@@ -1,30 +1,26 @@
import React, { useEffect, useRef, useState } from "react";
import { observer } from "mobx-react-lite";
import { Box, Grid, GridItem } from "@chakra-ui/react";
import ChatMessages from "./messages/ChatMessages";
import ChatInput from "./input/ChatInput";
import chatStore from "../../stores/ClientChatStore";
import menuState from "../../stores/AppMenuStore";
import WelcomeHome from "../WelcomeHome";
import { Box, Grid, GridItem } from '@chakra-ui/react';
import { observer } from 'mobx-react-lite';
import React, { useEffect, useRef, useState } from 'react';
import menuState from '../../stores/AppMenuStore';
import chatStore from '../../stores/ClientChatStore';
import WelcomeHome from '../WelcomeHome';
import ChatInput from './input/ChatInput';
import ChatMessages from './messages/ChatMessages';
const Chat = observer(({ height, width }) => {
const scrollRef = useRef();
const [isAndroid, setIsAndroid] = useState(false);
useEffect(() => {
if (typeof window !== "undefined") {
if (typeof window !== 'undefined') {
setIsAndroid(/android/i.test(window.navigator.userAgent));
}
}, []);
return (
<Grid
templateRows="1fr auto"
templateColumns="1fr"
height={height}
width={width}
gap={0}
>
<Grid templateRows="1fr auto" templateColumns="1fr" height={height} width={width} gap={0}>
<GridItem alignSelf="center" hidden={!(chatStore.items.length < 1)}>
<WelcomeHome visible={chatStore.items.length < 1} />
</GridItem>
@@ -32,35 +28,20 @@ const Chat = observer(({ height, width }) => {
<GridItem
overflow="auto"
width="100%"
maxH="100%"
maxH="100vh"
ref={scrollRef}
// If there are attachments, use "100px". Otherwise, use "128px" on Android, "73px" elsewhere.
pb={
isAndroid
? "128px"
: "73px"
}
pb={isAndroid ? '128px' : '73px'}
alignSelf="flex-end"
>
<ChatMessages scrollRef={scrollRef} />
</GridItem>
<GridItem
position="relative"
bg="background.primary"
zIndex={1000}
width="100%"
>
<Box
w="100%"
display="flex"
justifyContent="center"
mx="auto"
hidden={menuState.isOpen}
>
<GridItem position="relative" bg="background.primary" zIndex={1000} width="100%">
<Box w="100%" display="flex" justifyContent="center" mx="auto" hidden={menuState.isOpen}>
<ChatInput
input={chatStore.input}
setInput={(value) => chatStore.setInput(value)}
setInput={value => chatStore.setInput(value)}
handleSendMessage={chatStore.sendMessage}
isLoading={chatStore.isLoading}
/>


@@ -1,16 +1,17 @@
import React from "react";
import { observer } from "mobx-react-lite";
import clientChatStore from "../../stores/ClientChatStore";
import { observer } from 'mobx-react-lite';
import React from 'react';
import clientChatStore from '../../stores/ClientChatStore';
export const IntermediateStepsComponent = observer(({ hidden }) => {
return (
<div hidden={hidden}>
{clientChatStore.intermediateSteps.map((step, index) => {
switch (step.kind) {
case "web-search": {
case 'web-search': {
return <WebSearchResult key={index} data={step.data} />;
}
case "tool-result":
case 'tool-result':
return <ToolResult key={index} data={step.data} />;
default:
return <GenericStep key={index} data={step.data} />;
@@ -45,7 +46,7 @@ export const GenericStep = ({ data }) => {
return (
<div className="generic-step">
<h3>Generic Step</h3>
<p>{data.description || "No additional information provided."}</p>
<p>{data.description || 'No additional information provided.'}</p>
</div>
);
};


@@ -1,5 +1,3 @@
import React, { useRef } from "react";
import { observer } from "mobx-react-lite";
import {
Box,
Divider,
@@ -11,8 +9,10 @@ import {
Portal,
Text,
useDisclosure,
} from "@chakra-ui/react";
import { ChevronRight } from "lucide-react";
} from '@chakra-ui/react';
import { ChevronRight } from 'lucide-react';
import { observer } from 'mobx-react-lite';
import React, { useRef } from 'react';
const FlyoutSubMenu: React.FC<{
title: string;
@@ -23,15 +23,7 @@ const FlyoutSubMenu: React.FC<{
parentIsOpen: boolean;
setMenuState?: (state) => void;
}> = observer(
({
title,
flyoutMenuOptions,
onClose,
handleSelect,
isSelected,
parentIsOpen,
setMenuState,
}) => {
({ title, flyoutMenuOptions, onClose, handleSelect, isSelected, parentIsOpen, setMenuState }) => {
const { isOpen, onOpen, onClose: onSubMenuClose } = useDisclosure();
    const menuRef = useRef();
@@ -41,9 +33,9 @@ const FlyoutSubMenu: React.FC<{
placement="right-start"
isOpen={isOpen && parentIsOpen}
closeOnBlur={true}
lazyBehavior={"keepMounted"}
lazyBehavior={'keepMounted'}
isLazy={true}
onClose={(e) => {
onClose={e => {
onSubMenuClose();
}}
closeOnSelect={false}
@@ -54,12 +46,12 @@ const FlyoutSubMenu: React.FC<{
ref={menuRef}
bg="background.tertiary"
color="text.primary"
_hover={{ bg: "rgba(0, 0, 0, 0.05)" }}
_focus={{ bg: "rgba(0, 0, 0, 0.1)" }}
_hover={{ bg: 'rgba(0, 0, 0, 0.05)' }}
_focus={{ bg: 'rgba(0, 0, 0, 0.1)' }}
>
<HStack width={"100%"} justifyContent={"space-between"}>
<HStack width={'100%'} justifyContent={'space-between'}>
<Text>{title}</Text>
<ChevronRight size={"1rem"} />
<ChevronRight size={'1rem'} />
</HStack>
</MenuButton>
<Portal>
@@ -67,7 +59,7 @@ const FlyoutSubMenu: React.FC<{
key={title}
maxHeight={56}
overflowY="scroll"
visibility={"visible"}
visibility={'visible'}
minWidth="180px"
bg="background.tertiary"
boxShadow="lg"
@@ -77,43 +69,35 @@ const FlyoutSubMenu: React.FC<{
left="100%"
bottom={-10}
sx={{
"::-webkit-scrollbar": {
width: "8px",
'::-webkit-scrollbar': {
width: '8px',
},
"::-webkit-scrollbar-thumb": {
background: "background.primary",
borderRadius: "4px",
'::-webkit-scrollbar-thumb': {
background: 'background.primary',
borderRadius: '4px',
},
"::-webkit-scrollbar-track": {
background: "background.tertiary",
'::-webkit-scrollbar-track': {
background: 'background.tertiary',
},
}}
>
{flyoutMenuOptions.map((item, index) => (
<Box key={"itemflybox" + index}>
<Box key={'itemflybox' + index}>
<MenuItem
key={"itemfly" + index}
key={'itemfly' + index}
onClick={() => {
onSubMenuClose();
onClose();
handleSelect(item);
}}
bg={
isSelected(item)
? "background.secondary"
: "background.tertiary"
}
_hover={{ bg: "rgba(0, 0, 0, 0.05)" }}
_focus={{ bg: "rgba(0, 0, 0, 0.1)" }}
bg={isSelected(item) ? 'background.secondary' : 'background.tertiary'}
_hover={{ bg: 'rgba(0, 0, 0, 0.05)' }}
_focus={{ bg: 'rgba(0, 0, 0, 0.1)' }}
>
{item.name}
</MenuItem>
{index < flyoutMenuOptions.length - 1 && (
<Divider
key={item.name + "-divider"}
color="text.tertiary"
w={"100%"}
/>
<Divider key={item.name + '-divider'} color="text.tertiary" w={'100%'} />
)}
</Box>
))}


@@ -1,4 +1,3 @@
import React, { useCallback, useEffect, useRef, useState } from "react";
import {
Box,
Button,
@@ -12,76 +11,52 @@ import {
Text,
useDisclosure,
useOutsideClick,
} from "@chakra-ui/react";
import { observer } from "mobx-react-lite";
import { ChevronDown, Copy, RefreshCcw, Settings } from "lucide-react";
import ClientChatStore from "../../../stores/ClientChatStore";
import clientChatStore from "../../../stores/ClientChatStore";
import FlyoutSubMenu from "./FlyoutSubMenu";
import { useIsMobile } from "../../contexts/MobileContext";
import { useIsMobile as useIsMobileUserAgent } from "../../../hooks/_IsMobileHook";
import { getModelFamily, SUPPORTED_MODELS } from "../lib/SupportedModels";
import { formatConversationMarkdown } from "../lib/exportConversationAsMarkdown";
} from '@chakra-ui/react';
import { ChevronDown, Copy, RefreshCcw, Settings } from 'lucide-react';
import { observer } from 'mobx-react-lite';
import React, { useCallback, useEffect, useRef, useState } from 'react';
import { useIsMobile as useIsMobileUserAgent } from '../../../hooks/_IsMobileHook';
import clientChatStore from '../../../stores/ClientChatStore';
import { useIsMobile } from '../../contexts/MobileContext';
import { formatConversationMarkdown } from '../lib/exportConversationAsMarkdown';
import FlyoutSubMenu from './FlyoutSubMenu';
export const MsM_commonButtonStyles = {
bg: "transparent",
color: "text.primary",
borderRadius: "full",
bg: 'transparent',
color: 'text.primary',
borderRadius: 'full',
padding: 2,
border: "none",
_hover: { bg: "rgba(255, 255, 255, 0.2)" },
_active: { bg: "rgba(255, 255, 255, 0.3)" },
_focus: { boxShadow: "none" },
border: 'none',
_hover: { bg: 'rgba(255, 255, 255, 0.2)' },
_active: { bg: 'rgba(255, 255, 255, 0.3)' },
_focus: { boxShadow: 'none' },
};
const InputMenu: React.FC<{ isDisabled?: boolean }> = observer(
({ isDisabled }) => {
const InputMenu: React.FC<{ isDisabled?: boolean }> = observer(({ isDisabled }) => {
const isMobile = useIsMobile();
const isMobileUserAgent = useIsMobileUserAgent();
const {
isOpen,
onOpen,
onClose,
onToggle,
getDisclosureProps,
getButtonProps,
} = useDisclosure();
const { isOpen, onOpen, onClose, onToggle, getDisclosureProps, getButtonProps } = useDisclosure();
const [controlledOpen, setControlledOpen] = useState<boolean>(false);
const [supportedModels, setSupportedModels] = useState<any[]>([]);
useEffect(() => {
setControlledOpen(isOpen);
}, [isOpen]);
const getSupportedModels = async () => {
// Check if fetch is available (browser environment)
if (typeof fetch !== 'undefined') {
try {
return await (await fetch("/api/models")).json();
} catch (error) {
console.error("Error fetching models:", error);
return [];
}
} else {
// In test environment or where fetch is not available
console.log("Fetch not available, using default models");
return [];
}
}
useEffect(() => {
getSupportedModels().then((supportedModels) => {
// Check if setSupportedModels method exists before calling it
if (clientChatStore.setSupportedModels) {
clientChatStore.setSupportedModels(supportedModels);
} else {
console.log("setSupportedModels method not available in this environment");
}
fetch('/api/models')
.then(response => response.json())
.then(models => {
setSupportedModels(models);
})
.catch(err => {
console.error('Could not fetch models: ', err);
});
}, []);
const handleClose = useCallback(() => {
onClose();
}, [isOpen]);
@@ -90,14 +65,12 @@ const InputMenu: React.FC<{ isDisabled?: boolean }> = observer(
navigator.clipboard
.writeText(formatConversationMarkdown(clientChatStore.items))
.then(() => {
window.alert(
"Conversation copied to clipboard. \n\nPaste it somewhere safe!",
);
window.alert('Conversation copied to clipboard. \n\nPaste it somewhere safe!');
onClose();
})
.catch((err) => {
console.error("Could not copy text to clipboard: ", err);
window.alert("Failed to copy conversation. Please try again.");
.catch(err => {
console.error('Could not copy text to clipboard: ', err);
window.alert('Failed to copy conversation. Please try again.');
});
}, [onClose]);
@@ -129,7 +102,7 @@ const InputMenu: React.FC<{ isDisabled?: boolean }> = observer(
closeOnSelect={false}
closeOnBlur={isOpen && !isMobileUserAgent}
isLazy={true}
lazyBehavior={"unmount"}
lazyBehavior={'unmount'}
>
{isMobile ? (
<MenuButton
@@ -138,8 +111,8 @@ const InputMenu: React.FC<{ isDisabled?: boolean }> = observer(
icon={<Settings size={20} />}
isDisabled={isDisabled}
aria-label="Settings"
_hover={{ bg: "rgba(255, 255, 255, 0.2)" }}
_focus={{ boxShadow: "none" }}
_hover={{ bg: 'rgba(255, 255, 255, 0.2)' }}
_focus={{ boxShadow: 'none' }}
{...MsM_commonButtonStyles}
/>
) : (
@@ -164,12 +137,15 @@ const InputMenu: React.FC<{ isDisabled?: boolean }> = observer(
border="none"
borderRadius="md"
boxShadow="lg"
minW={"10rem"}
minW={'10rem'}
ref={menuRef}
>
<FlyoutSubMenu
title="Text Models"
flyoutMenuOptions={clientChatStore.supportedModels.map((m) => ({ name: m, value: m }))}
flyoutMenuOptions={supportedModels.map(modelData => ({
name: modelData.id.split('/').pop() || modelData.id,
value: modelData.id,
}))}
onClose={onClose}
parentIsOpen={isOpen}
setMenuState={setMenuState}
@@ -182,11 +158,11 @@ const InputMenu: React.FC<{ isDisabled?: boolean }> = observer(
bg="background.tertiary"
color="text.primary"
onClick={handleCopyConversation}
_hover={{ bg: "rgba(0, 0, 0, 0.05)" }}
_focus={{ bg: "rgba(0, 0, 0, 0.1)" }}
_hover={{ bg: 'rgba(0, 0, 0, 0.05)' }}
_focus={{ bg: 'rgba(0, 0, 0, 0.1)' }}
>
<Flex align="center">
<Copy size="16px" style={{ marginRight: "8px" }} />
<Copy size="16px" style={{ marginRight: '8px' }} />
<Box>Export</Box>
</Flex>
</MenuItem>
@@ -195,21 +171,20 @@ const InputMenu: React.FC<{ isDisabled?: boolean }> = observer(
bg="background.tertiary"
color="text.primary"
onClick={() => {
clientChatStore.setActiveConversation("conversation:new");
clientChatStore.reset();
onClose();
}}
_hover={{ bg: "rgba(0, 0, 0, 0.05)" }}
_focus={{ bg: "rgba(0, 0, 0, 0.1)" }}
_hover={{ bg: 'rgba(0, 0, 0, 0.05)' }}
_focus={{ bg: 'rgba(0, 0, 0, 0.1)' }}
>
<Flex align="center">
<RefreshCcw size="16px" style={{ marginRight: "8px" }} />
<RefreshCcw size="16px" style={{ marginRight: '8px' }} />
<Box>New</Box>
</Flex>
</MenuItem>
</MenuList>
</Menu>
);
},
);
});
export default InputMenu;
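The new `flyoutMenuOptions` mapping derives a short display name from provider-prefixed model ids while keeping the full id as the value. A standalone sketch of that mapping (`toMenuOption` and `ModelInfo` are hypothetical names introduced here for illustration):

```typescript
// Mirrors the inline mapping above: "meta-llama/llama-4-scout" yields the
// name "llama-4-scout"; an unprefixed id falls through unchanged.
type ModelInfo = { id: string };

function toMenuOption(modelData: ModelInfo): { name: string; value: string } {
  return {
    name: modelData.id.split('/').pop() || modelData.id,
    value: modelData.id,
  };
}

console.log(toMenuOption({ id: 'meta-llama/llama-4-scout-17b-16e-instruct' }));
```

The `|| modelData.id` fallback only matters for the empty string, since `split('/')` always yields at least one element.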


@@ -1,34 +1,28 @@
import React, { useEffect, useRef, useState } from "react";
import {
Box,
Button,
Grid,
GridItem,
useBreakpointValue,
} from "@chakra-ui/react";
import { observer } from "mobx-react-lite";
import chatStore from "../../../stores/ClientChatStore";
import InputMenu from "../input-menu/InputMenu";
import InputTextarea from "./ChatInputTextArea";
import SendButton from "./ChatInputSendButton";
import { useMaxWidth } from "../../../hooks/useMaxWidth";
import userOptionsStore from "../../../stores/UserOptionsStore";
import { Box, Button, Grid, GridItem, useBreakpointValue } from '@chakra-ui/react';
import { observer } from 'mobx-react-lite';
import React, { useEffect, useRef, useState } from 'react';
import { useMaxWidth } from '../../../hooks/useMaxWidth';
import chatStore from '../../../stores/ClientChatStore';
import userOptionsStore from '../../../stores/UserOptionsStore';
import InputMenu from '../input-menu/InputMenu';
import SendButton from './ChatInputSendButton';
import InputTextarea from './ChatInputTextArea';
const ChatInput = observer(() => {
const inputRef = useRef<HTMLTextAreaElement>(null);
const containerRef = useRef<HTMLDivElement>(null);
const maxWidth = useMaxWidth();
const [inputValue, setInputValue] = useState<string>("");
const [inputValue, setInputValue] = useState<string>('');
const [containerHeight, setContainerHeight] = useState(56);
const [containerBorderRadius, setContainerBorderRadius] = useState(9999);
const [shouldFollow, setShouldFollow] = useState<boolean>(
userOptionsStore.followModeEnabled,
);
const [shouldFollow, setShouldFollow] = useState<boolean>(userOptionsStore.followModeEnabled);
const [couldFollow, setCouldFollow] = useState<boolean>(chatStore.isLoading);
const [inputWidth, setInputWidth] = useState<string>("50%");
const [inputWidth, setInputWidth] = useState<string>('40%');
useEffect(() => {
setShouldFollow(chatStore.isLoading && userOptionsStore.followModeEnabled);
@@ -42,8 +36,8 @@ const ChatInput = observer(() => {
useEffect(() => {
if (containerRef.current) {
const observer = new ResizeObserver((entries) => {
for (let entry of entries) {
const observer = new ResizeObserver(entries => {
for (const entry of entries) {
const newHeight = entry.target.clientHeight;
setContainerHeight(newHeight);
@@ -63,27 +57,25 @@ const ChatInput = observer(() => {
};
const handleKeyDown = (e: React.KeyboardEvent<HTMLTextAreaElement>) => {
if (e.key === "Enter" && !e.shiftKey) {
if (e.key === 'Enter' && !e.shiftKey) {
e.preventDefault();
chatStore.sendMessage();
}
};
const inputMaxWidth = useBreakpointValue(
{ base: "50rem", lg: "50rem", md: "80%", sm: "100vw" },
{ base: '30rem', lg: '50rem', md: '80%', sm: '100vw' },
{ ssr: true },
);
const inputMinWidth = useBreakpointValue({ lg: "40rem" }, { ssr: true });
const inputMinWidth = useBreakpointValue({ lg: '40rem', md: '30rem' }, { ssr: true });
useEffect(() => {
setInputWidth("100%");
setInputWidth('100%');
}, [inputMaxWidth, inputMinWidth]);
return (
<Box
width={inputWidth}
maxW={inputMaxWidth}
minWidth={inputMinWidth}
width={inputMinWidth}
mx="auto"
p={2}
pl={2}
@@ -105,12 +97,12 @@ const ChatInput = observer(() => {
size="sm"
variant="ghost"
colorScheme="blue"
onClick={(_) => {
onClick={_ => {
userOptionsStore.toggleFollowMode();
}}
isDisabled={!chatStore.isLoading}
>
{shouldFollow ? "Disable Follow Mode" : "Enable Follow Mode"}
{shouldFollow ? 'Disable Follow Mode' : 'Enable Follow Mode'}
</Button>
</Box>
)}
@@ -123,7 +115,7 @@ const ChatInput = observer(() => {
gap={2}
alignItems="center"
style={{
transition: "border-radius 0.2s ease",
transition: 'border-radius 0.2s ease',
}}
>
<GridItem>


@@ -1,9 +1,9 @@
import React from "react";
import { Button } from "@chakra-ui/react";
import clientChatStore from "../../../stores/ClientChatStore";
import { CirclePause, Send } from "lucide-react";
import { Button } from '@chakra-ui/react';
import { motion } from 'framer-motion';
import { CirclePause, Send } from 'lucide-react';
import React from 'react';
import { motion } from "framer-motion";
import clientChatStore from '../../../stores/ClientChatStore';
interface SendButtonProps {
isLoading: boolean;
@@ -13,25 +13,20 @@ interface SendButtonProps {
}
const SendButton: React.FC<SendButtonProps> = ({ onClick }) => {
const isDisabled =
clientChatStore.input.trim().length === 0 && !clientChatStore.isLoading;
const isDisabled = clientChatStore.input.trim().length === 0 && !clientChatStore.isLoading;
return (
<Button
onClick={(e) =>
clientChatStore.isLoading
? clientChatStore.stopIncomingMessage()
: onClick(e)
onClick={e =>
clientChatStore.isLoading ? clientChatStore.stopIncomingMessage() : onClick(e)
}
bg="transparent"
color={
clientChatStore.input.trim().length <= 1 ? "brand.700" : "text.primary"
}
color={clientChatStore.input.trim().length <= 1 ? 'brand.700' : 'text.primary'}
borderRadius="full"
p={2}
isDisabled={isDisabled}
_hover={{ bg: !isDisabled ? "rgba(255, 255, 255, 0.2)" : "inherit" }}
_active={{ bg: !isDisabled ? "rgba(255, 255, 255, 0.3)" : "inherit" }}
_focus={{ boxShadow: "none" }}
_hover={{ bg: !isDisabled ? 'rgba(255, 255, 255, 0.2)' : 'inherit' }}
_active={{ bg: !isDisabled ? 'rgba(255, 255, 255, 0.3)' : 'inherit' }}
_focus={{ boxShadow: 'none' }}
>
{clientChatStore.isLoading ? <MySpinner /> : <Send size={20} />}
</Button>
@@ -45,10 +40,10 @@ const MySpinner = ({ onClick }) => (
exit={{ opacity: 0, scale: 0.9 }}
transition={{
duration: 0.4,
ease: "easeInOut",
ease: 'easeInOut',
}}
>
<CirclePause color={"#F0F0F0"} size={24} onClick={onClick} />
<CirclePause color={'#F0F0F0'} size={24} onClick={onClick} />
</motion.div>
);


@@ -1,7 +1,7 @@
import React, {useEffect, useRef, useState} from "react";
import {observer} from "mobx-react-lite";
import {Box, chakra, InputGroup,} from "@chakra-ui/react";
import AutoResize from "react-textarea-autosize";
import { Box, chakra, InputGroup, useBreakpointValue } from '@chakra-ui/react';
import { observer } from 'mobx-react-lite';
import React, { useEffect, useRef, useState } from 'react';
import AutoResize from 'react-textarea-autosize';
const AutoResizeTextArea = chakra(AutoResize);
@@ -15,14 +15,11 @@ interface InputTextAreaProps {
const InputTextArea: React.FC<InputTextAreaProps> = observer(
({ inputRef, value, onChange, onKeyDown, isLoading }) => {
const [heightConstraint, setHeightConstraint] = useState<
number | undefined
>(10);
const [heightConstraint, setHeightConstraint] = useState<number | undefined>(10);
useEffect(() => {
if (value.length > 10) {
setHeightConstraint();
setHeightConstraint(parseInt(value));
}
}, [value]);
@@ -34,7 +31,6 @@ const InputTextArea: React.FC<InputTextAreaProps> = observer(
display="flex"
flexDirection="column"
>
{/* Input Area */}
<InputGroup position="relative">
<AutoResizeTextArea
@@ -42,8 +38,9 @@ const InputTextArea: React.FC<InputTextAreaProps> = observer(
ref={inputRef}
value={value}
height={heightConstraint}
maxH={heightConstraint}
autoFocus
onChange={(e) => onChange(e.target.value)}
onChange={e => onChange(e.target.value)}
onKeyDown={onKeyDown}
p={2}
pr="8px"
@@ -53,19 +50,25 @@ const InputTextArea: React.FC<InputTextAreaProps> = observer(
borderRadius="20px"
border="none"
placeholder="Free my mind..."
_placeholder={{ color: "gray.400" }}
_placeholder={{
color: 'gray.400',
textWrap: 'nowrap',
textOverflow: 'ellipsis',
overflow: 'hidden',
width: '90%',
}}
_focus={{
outline: "none",
outline: 'none',
}}
disabled={isLoading}
minRows={1}
maxRows={12}
style={{
touchAction: "none",
resize: "none",
overflowY: "auto",
width: "100%",
transition: "height 0.2s ease-in-out",
touchAction: 'none',
resize: 'none',
overflowY: 'auto',
width: '100%',
transition: 'height 0.2s ease-in-out',
}}
/>
</InputGroup>


@@ -1,9 +1,10 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { render, screen, fireEvent } from '@testing-library/react';
import React from 'react';
import ChatInput from '../ChatInput';
import userOptionsStore from '../../../../stores/UserOptionsStore';
import { describe, it, expect, vi, beforeEach } from 'vitest';
import chatStore from '../../../../stores/ClientChatStore';
import userOptionsStore from '../../../../stores/UserOptionsStore';
import ChatInput from '../ChatInput';
// Mock browser APIs
class MockResizeObserver {
@@ -85,7 +86,7 @@ vi.mock('./ChatInputTextArea', () => ({
aria-label="Chat input"
ref={inputRef}
value={value}
onChange={(e) => onChange(e.target.value)}
onChange={e => onChange(e.target.value)}
onKeyDown={onKeyDown}
disabled={isLoading}
/>


@@ -8,16 +8,16 @@ const SUPPORTED_MODELS_GROUPS = {
groq: [
// "mixtral-8x7b-32768",
// "deepseek-r1-distill-llama-70b",
"meta-llama/llama-4-scout-17b-16e-instruct",
"gemma2-9b-it",
"mistral-saba-24b",
'meta-llama/llama-4-scout-17b-16e-instruct',
'gemma2-9b-it',
'mistral-saba-24b',
// "qwen-2.5-32b",
"llama-3.3-70b-versatile",
'llama-3.3-70b-versatile',
// "llama-3.3-70b-versatile"
// "llama-3.1-70b-versatile",
// "llama-3.3-70b-versatile"
],
cerebras: ["llama-3.3-70b"],
cerebras: ['llama-3.3-70b'],
claude: [
// "claude-3-5-sonnet-20241022",
// "claude-3-opus-20240229"
@@ -44,16 +44,16 @@ const SUPPORTED_MODELS_GROUPS = {
// "grok-beta"
],
cloudflareAI: [
"llama-3.2-3b-instruct", // max_tokens
"llama-3-8b-instruct", // max_tokens
"llama-3.1-8b-instruct-fast", // max_tokens
"deepseek-math-7b-instruct",
"deepseek-coder-6.7b-instruct-awq",
"hermes-2-pro-mistral-7b",
"openhermes-2.5-mistral-7b-awq",
"mistral-7b-instruct-v0.2",
"neural-chat-7b-v3-1-awq",
"openchat-3.5-0106",
'llama-3.2-3b-instruct', // max_tokens
'llama-3-8b-instruct', // max_tokens
'llama-3.1-8b-instruct-fast', // max_tokens
'deepseek-math-7b-instruct',
'deepseek-coder-6.7b-instruct-awq',
'hermes-2-pro-mistral-7b',
'openhermes-2.5-mistral-7b-awq',
'mistral-7b-instruct-v0.2',
'neural-chat-7b-v3-1-awq',
'openchat-3.5-0106',
// "gemma-7b-it",
],
};
@@ -66,10 +66,10 @@ export type ModelFamily = keyof typeof SUPPORTED_MODELS_GROUPS;
function getModelFamily(model: string): ModelFamily | undefined {
return Object.keys(SUPPORTED_MODELS_GROUPS)
.filter((family) => {
return SUPPORTED_MODELS_GROUPS[
family as keyof typeof SUPPORTED_MODELS_GROUPS
].includes(model.trim());
.filter(family => {
return SUPPORTED_MODELS_GROUPS[family as keyof typeof SUPPORTED_MODELS_GROUPS].includes(
model.trim(),
);
})
.at(0) as ModelFamily | undefined;
}
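The reformatted `getModelFamily` can be exercised in isolation. This sketch uses a trimmed-down groups table (two families only; the real file lists many more models per family):

```typescript
// Self-contained version of the family lookup above: find the first group
// whose model list contains the trimmed model name.
const SUPPORTED_MODELS_GROUPS = {
  groq: ['llama-3.3-70b-versatile', 'gemma2-9b-it'],
  cerebras: ['llama-3.3-70b'],
} as const;

type ModelFamily = keyof typeof SUPPORTED_MODELS_GROUPS;

function getModelFamily(model: string): ModelFamily | undefined {
  return (Object.keys(SUPPORTED_MODELS_GROUPS) as ModelFamily[]).filter(family =>
    (SUPPORTED_MODELS_GROUPS[family] as readonly string[]).includes(model.trim()),
  )[0];
}

console.log(getModelFamily('  llama-3.3-70b  '));
```

`model.trim()` means surrounding whitespace never defeats the lookup, and an unknown model yields `undefined` because filtering an empty match list and indexing `[0]` returns nothing.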


@@ -1,30 +1,30 @@
import DOMPurify from "isomorphic-dompurify";
import DOMPurify from 'isomorphic-dompurify';
function domPurify(dirty: string) {
return DOMPurify.sanitize(dirty, {
USE_PROFILES: { html: true },
ALLOWED_TAGS: [
"b",
"i",
"u",
"a",
"p",
"span",
"div",
"table",
"thead",
"tbody",
"tr",
"td",
"th",
"ul",
"ol",
"li",
"code",
"pre",
'b',
'i',
'u',
'a',
'p',
'span',
'div',
'table',
'thead',
'tbody',
'tr',
'td',
'th',
'ul',
'ol',
'li',
'code',
'pre',
],
ALLOWED_ATTR: ["href", "src", "alt", "title", "class", "style"],
FORBID_TAGS: ["script", "iframe"],
ALLOWED_ATTR: ['href', 'src', 'alt', 'title', 'class', 'style'],
FORBID_TAGS: ['script', 'iframe'],
KEEP_CONTENT: true,
SAFE_FOR_TEMPLATES: true,
});


@@ -1,18 +1,17 @@
// Function to generate a Markdown representation of the current conversation
import { type IMessage } from "../../../stores/ClientChatStore";
import { Instance } from "mobx-state-tree";
import { type Instance } from 'mobx-state-tree';
export function formatConversationMarkdown(
messages: Instance<typeof IMessage>[],
): string {
import { type IMessage } from '../../../stores/ClientChatStore';
export function formatConversationMarkdown(messages: Instance<typeof IMessage>[]): string {
return messages
.map((message) => {
if (message.role === "user") {
.map(message => {
if (message.role === 'user') {
return `**You**: ${message.content}`;
} else if (message.role === "assistant") {
return `**Geoff's AI**: ${message.content}`;
} else if (message.role === 'assistant') {
return `**open-gsio**: ${message.content}`;
}
return "";
return '';
})
.join("\n\n");
.join('\n\n');
}
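The exporter above is easy to verify standalone. In this sketch `ChatMessage` is a hypothetical stand-in for `Instance<typeof IMessage>`, keeping only the two fields the formatter reads:

```typescript
// Standalone copy of formatConversationMarkdown: label each turn, drop
// unknown roles, and join turns with a blank line.
type ChatMessage = { role: string; content: string };

function formatConversationMarkdown(messages: ChatMessage[]): string {
  return messages
    .map(message => {
      if (message.role === 'user') {
        return `**You**: ${message.content}`;
      } else if (message.role === 'assistant') {
        return `**open-gsio**: ${message.content}`;
      }
      return '';
    })
    .join('\n\n');
}

console.log(
  formatConversationMarkdown([
    { role: 'user', content: 'Hi' },
    { role: 'assistant', content: 'Hello!' },
  ]),
);
```

Note that messages with any other role (e.g. `system`) still contribute an empty entry and therefore extra blank lines to the joined output.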


@@ -1,6 +1,6 @@
import React from "react";
import React from 'react';
import MessageMarkdownRenderer from "./MessageMarkdownRenderer";
import MessageMarkdownRenderer from './MessageMarkdownRenderer';
const ChatMessageContent = ({ content }) => {
return <MessageMarkdownRenderer markdown={content} />;


@@ -1,9 +1,11 @@
import React from "react";
import {Box, Grid, GridItem} from "@chakra-ui/react";
import MessageBubble from "./MessageBubble";
import {observer} from "mobx-react-lite";
import chatStore from "../../../stores/ClientChatStore";
import {useIsMobile} from "../../contexts/MobileContext";
import { Box, Grid, GridItem } from '@chakra-ui/react';
import { observer } from 'mobx-react-lite';
import React from 'react';
import chatStore from '../../../stores/ClientChatStore';
import { useIsMobile } from '../../contexts/MobileContext';
import MessageBubble from './MessageBubble';
interface ChatMessagesProps {
scrollRef: React.RefObject<HTMLDivElement>;
@@ -13,11 +15,7 @@ const ChatMessages: React.FC<ChatMessagesProps> = observer(({ scrollRef }) => {
const isMobile = useIsMobile();
return (
<Box
pt={isMobile ? 24 : undefined}
overflowY={"scroll"}
overflowX={"hidden"}
>
<Box pt={isMobile ? 24 : undefined} overflowY={'scroll'} overflowX={'hidden'}>
<Grid
fontFamily="Arial, sans-serif"
templateColumns="1fr"


@@ -1,19 +1,19 @@
import React, { useEffect, useRef, useState } from "react";
import { Box, Flex, Text } from "@chakra-ui/react";
import MessageRenderer from "./ChatMessageContent";
import { observer } from "mobx-react-lite";
import MessageEditor from "./MessageEditorComponent";
import UserMessageTools from "./UserMessageTools";
import clientChatStore from "../../../stores/ClientChatStore";
import UserOptionsStore from "../../../stores/UserOptionsStore";
import MotionBox from "./MotionBox";
import { Box, Flex, Text } from '@chakra-ui/react';
import { observer } from 'mobx-react-lite';
import React, { useEffect, useRef, useState } from 'react';
import clientChatStore from '../../../stores/ClientChatStore';
import UserOptionsStore from '../../../stores/UserOptionsStore';
import MessageRenderer from './ChatMessageContent';
import MessageEditor from './MessageEditorComponent';
import MotionBox from './MotionBox';
import UserMessageTools from './UserMessageTools';
const LoadingDots = () => {
return (
<Flex>
{[0, 1, 2].map((i) => (
{[0, 1, 2].map(i => (
<MotionBox
key={i}
width="8px"
@@ -34,10 +34,10 @@ const LoadingDots = () => {
))}
</Flex>
);
}
};
function renderMessage(msg: any) {
if (msg.role === "user") {
if (msg.role === 'user') {
return (
<Text as="p" fontSize="sm" lineHeight="short" color="text.primary">
{msg.content}
@@ -50,8 +50,8 @@ function renderMessage(msg: any) {
const MessageBubble = observer(({ msg, scrollRef }) => {
const [isEditing, setIsEditing] = useState(false);
const [isHovered, setIsHovered] = useState(false);
const isUser = msg.role === "user";
const senderName = isUser ? "You" : "Geoff's AI";
const isUser = msg.role === 'user';
const senderName = isUser ? 'You' : 'open-gsio';
const isLoading = !msg.content || !(msg.content.trim().length > 0);
const messageRef = useRef();
@@ -64,10 +64,15 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
};
useEffect(() => {
if (clientChatStore.items.length > 0 && clientChatStore.isLoading && UserOptionsStore.followModeEnabled) { // Refine condition
if (
clientChatStore.items.length > 0 &&
clientChatStore.isLoading &&
UserOptionsStore.followModeEnabled
) {
// Refine condition
scrollRef.current?.scrollTo({
top: scrollRef.current.scrollHeight,
behavior: "auto",
behavior: 'auto',
});
}
});
@@ -75,7 +80,7 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
return (
<Flex
flexDirection="column"
alignItems={isUser ? "flex-end" : "flex-start"}
alignItems={isUser ? 'flex-end' : 'flex-start'}
role="listitem"
flex={0}
aria-label={`Message from ${senderName}`}
@@ -85,19 +90,19 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
<Text
fontSize="xs"
color="text.tertiary"
textAlign={isUser ? "right" : "left"}
alignSelf={isUser ? "flex-end" : "flex-start"}
textAlign={isUser ? 'right' : 'left'}
alignSelf={isUser ? 'flex-end' : 'flex-start'}
mb={1}
>
{senderName}
</Text>
<MotionBox
minW={{ base: "99%", sm: "99%", lg: isUser ? "55%" : "60%" }}
maxW={{ base: "99%", sm: "99%", lg: isUser ? "65%" : "65%" }}
minW={{ base: '99%', sm: '99%', lg: isUser ? '55%' : '60%' }}
maxW={{ base: '99%', sm: '99%', lg: isUser ? '65%' : '65%' }}
p={3}
borderRadius="1.5em"
bg={isUser ? "#0A84FF" : "#3A3A3C"}
bg={isUser ? '#0A84FF' : '#3A3A3C'}
color="text.primary"
textAlign="left"
boxShadow="0 2px 4px rgba(0, 0, 0, 0.1)"
@@ -115,10 +120,10 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
whiteSpace="pre-wrap"
ref={messageRef}
sx={{
"pre, code": {
maxWidth: "100%",
whiteSpace: "pre-wrap",
overflowX: "auto",
'pre, code': {
maxWidth: '100%',
whiteSpace: 'pre-wrap',
overflowX: 'auto',
},
}}
>
@@ -139,9 +144,7 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
justifyContent="center"
alignItems="center"
>
{isHovered && !isEditing && (
<UserMessageTools message={msg} onEdit={handleEdit} />
)}
{isHovered && !isEditing && <UserMessageTools message={msg} onEdit={handleEdit} />}
</Box>
)}
</Flex>


@@ -1,10 +1,11 @@
import React, { KeyboardEvent, useEffect } from "react";
import { Box, Flex, IconButton, Textarea } from "@chakra-ui/react";
import { Check, X } from "lucide-react";
import { observer } from "mobx-react-lite";
import { Instance } from "mobx-state-tree";
import Message from "../../../models/Message";
import messageEditorStore from "../../../stores/MessageEditorStore";
import { Box, Flex, IconButton, Textarea } from '@chakra-ui/react';
import { Check, X } from 'lucide-react';
import { observer } from 'mobx-react-lite';
import { type Instance } from 'mobx-state-tree';
import React, { type KeyboardEvent, useEffect } from 'react';
import Message from '../../../models/Message';
import messageEditorStore from '../../../stores/MessageEditorStore';
interface MessageEditorProps {
message: Instance<typeof Message>;
@@ -30,15 +31,13 @@ const MessageEditor = observer(({ message, onCancel }: MessageEditorProps) => {
onCancel();
};
const handleKeyDown = (e: KeyboardEvent<HTMLTextAreaElement>) => {
if (e.key === "Enter" && (e.metaKey || e.ctrlKey)) {
if (e.key === 'Enter' && (e.metaKey || e.ctrlKey)) {
e.preventDefault();
handleSave();
}
if (e.key === "Escape") {
if (e.key === 'Escape') {
e.preventDefault();
handleCancel();
}
@@ -48,14 +47,14 @@ const MessageEditor = observer(({ message, onCancel }: MessageEditorProps) => {
<Box width="100%">
<Textarea
value={messageEditorStore.editedContent}
onChange={(e) => messageEditorStore.setEditedContent(e.target.value)}
onChange={e => messageEditorStore.setEditedContent(e.target.value)}
onKeyDown={handleKeyDown}
minHeight="100px"
bg="transparent"
border="1px solid"
borderColor="whiteAlpha.300"
_hover={{ borderColor: "whiteAlpha.400" }}
_focus={{ borderColor: "brand.100", boxShadow: "none" }}
_hover={{ borderColor: 'whiteAlpha.400' }}
_focus={{ borderColor: 'brand.100', boxShadow: 'none' }}
resize="vertical"
color="text.primary"
/>
@@ -66,7 +65,7 @@ const MessageEditor = observer(({ message, onCancel }: MessageEditorProps) => {
onClick={handleCancel}
size="sm"
variant="ghost"
color={"accent.danger"}
color={'accent.danger'}
/>
<IconButton
aria-label="Save edit"
@@ -74,7 +73,7 @@ const MessageEditor = observer(({ message, onCancel }: MessageEditorProps) => {
onClick={handleSave}
size="sm"
variant="ghost"
color={"accent.confirm"}
color={'accent.confirm'}
/>
</Flex>
</Box>

View File

@@ -1,5 +1,3 @@
import React from "react";
import {
Box,
Code,
@@ -17,13 +15,15 @@ import {
Thead,
Tr,
useColorModeValue,
} from "@chakra-ui/react";
import { marked } from "marked";
import CodeBlock from "../../code/CodeBlock";
import ImageWithFallback from "../../markdown/ImageWithFallback";
import markedKatex from "marked-katex-extension";
import katex from "katex";
import domPurify from "../lib/domPurify";
} from '@chakra-ui/react';
import katex from 'katex';
import { marked } from 'marked';
import markedKatex from 'marked-katex-extension';
import React from 'react';
import CodeBlock from '../../code/CodeBlock';
import ImageWithFallback from '../../markdown/ImageWithFallback';
import domPurify from '../lib/domPurify';
try {
if (localStorage) {
@@ -34,11 +34,13 @@ try {
throwOnError: false,
strict: true,
colorIsTextColor: true,
errorColor: "red",
errorColor: 'red',
}),
);
}
} catch (_) {}
} catch (_) {
// Silently ignore errors in marked setup - fallback to default behavior
}
const MemoizedCodeBlock = React.memo(CodeBlock);
@@ -49,32 +51,29 @@ const MemoizedCodeBlock = React.memo(CodeBlock);
const getHeadingProps = (depth: number) => {
switch (depth) {
case 1:
return { as: "h1", size: "xl", mt: 4, mb: 2 };
return { as: 'h1', size: 'xl', mt: 4, mb: 2 };
case 2:
return { as: "h2", size: "lg", mt: 3, mb: 2 };
return { as: 'h2', size: 'lg', mt: 3, mb: 2 };
case 3:
return { as: "h3", size: "md", mt: 2, mb: 1 };
return { as: 'h3', size: 'md', mt: 2, mb: 1 };
case 4:
return { as: "h4", size: "sm", mt: 2, mb: 1 };
return { as: 'h4', size: 'sm', mt: 2, mb: 1 };
case 5:
return { as: "h5", size: "sm", mt: 2, mb: 1 };
return { as: 'h5', size: 'sm', mt: 2, mb: 1 };
case 6:
return { as: "h6", size: "xs", mt: 2, mb: 1 };
return { as: 'h6', size: 'xs', mt: 2, mb: 1 };
default:
return { as: `h${depth}`, size: "md", mt: 2, mb: 1 };
return { as: `h${depth}`, size: 'md', mt: 2, mb: 1 };
}
};
interface TableToken extends marked.Tokens.Table {
align: Array<"center" | "left" | "right" | null>;
align: Array<'center' | 'left' | 'right' | null>;
header: (string | marked.Tokens.TableCell)[];
rows: (string | marked.Tokens.TableCell)[][];
}
const CustomHeading: React.FC<{ text: string; depth: number }> = ({
text,
depth,
}) => {
const CustomHeading: React.FC<{ text: string; depth: number }> = ({ text, depth }) => {
const headingProps = getHeadingProps(depth);
return (
<Heading {...headingProps} wordBreak="break-word" maxWidth="100%">
@@ -83,9 +82,7 @@ const CustomHeading: React.FC<{ text: string; depth: number }> = ({
);
};
const CustomParagraph: React.FC<{ children: React.ReactNode }> = ({
children,
}) => {
const CustomParagraph: React.FC<{ children: React.ReactNode }> = ({ children }) => {
return (
<Text
as="p"
@@ -100,9 +97,7 @@ const CustomParagraph: React.FC<{ children: React.ReactNode }> = ({
);
};
const CustomBlockquote: React.FC<{ children: React.ReactNode }> = ({
children,
}) => {
const CustomBlockquote: React.FC<{ children: React.ReactNode }> = ({ children }) => {
return (
<Box
as="blockquote"
@@ -120,16 +115,9 @@ const CustomBlockquote: React.FC<{ children: React.ReactNode }> = ({
);
};
-const CustomCodeBlock: React.FC<{ code: string; language?: string }> = ({
-code,
-language,
-}) => {
+const CustomCodeBlock: React.FC<{ code: string; language?: string }> = ({ code, language }) => {
return (
-<MemoizedCodeBlock
-language={language}
-code={code}
-onRenderComplete={() => Promise.resolve()}
-/>
+<MemoizedCodeBlock language={language} code={code} onRenderComplete={() => Promise.resolve()} />
);
};
@@ -141,10 +129,10 @@ const CustomList: React.FC<{
children: React.ReactNode;
}> = ({ ordered, start, children }) => {
const commonStyles = {
-fontSize: "sm",
-wordBreak: "break-word" as const,
-maxWidth: "100%" as const,
-stylePosition: "outside" as const,
+fontSize: 'sm',
+wordBreak: 'break-word' as const,
+maxWidth: '100%' as const,
+stylePosition: 'outside' as const,
mb: 2,
pl: 4,
};
@@ -166,16 +154,13 @@ const CustomListItem: React.FC<{
return <ListItem mb={1}>{children}</ListItem>;
};
-const CustomKatex: React.FC<{ math: string; displayMode: boolean }> = ({
-math,
-displayMode,
-}) => {
+const CustomKatex: React.FC<{ math: string; displayMode: boolean }> = ({ math, displayMode }) => {
const renderedMath = katex.renderToString(math, { displayMode });
return (
<Box
as="span"
-display={displayMode ? "block" : "inline"}
+display={displayMode ? 'block' : 'inline'}
p={displayMode ? 4 : 1}
my={displayMode ? 4 : 0}
borderRadius="md"
@@ -188,23 +173,17 @@ const CustomKatex: React.FC<{ math: string; displayMode: boolean }> = ({
const CustomTable: React.FC<{
header: React.ReactNode[];
-align: Array<"center" | "left" | "right" | null>;
+align: Array<'center' | 'left' | 'right' | null>;
rows: React.ReactNode[][];
}> = ({ header, align, rows }) => {
return (
-<Table
-variant="simple"
-size="sm"
-my={4}
-borderRadius="md"
-overflow="hidden"
->
+<Table variant="simple" size="sm" my={4} borderRadius="md" overflow="hidden">
<Thead bg="background.secondary">
<Tr>
{header.map((cell, i) => (
<Th
key={i}
-textAlign={align[i] || "left"}
+textAlign={align[i] || 'left'}
fontWeight="bold"
p={2}
minW={16}
@@ -219,12 +198,7 @@ const CustomTable: React.FC<{
{rows.map((row, rIndex) => (
<Tr key={rIndex}>
{row.map((cell, cIndex) => (
-<Td
-key={cIndex}
-textAlign={align[cIndex] || "left"}
-p={2}
-wordBreak="break-word"
->
+<Td key={cIndex} textAlign={align[cIndex] || 'left'} p={2} wordBreak="break-word">
{cell}
</Td>
))}
@@ -241,13 +215,7 @@ const CustomHtmlBlock: React.FC<{ content: string }> = ({ content }) => {
const CustomText: React.FC<{ text: React.ReactNode }> = ({ text }) => {
return (
-<Text
-fontSize="sm"
-lineHeight="short"
-wordBreak="break-word"
-maxWidth="100%"
-as="span"
->
+<Text fontSize="sm" lineHeight="short" wordBreak="break-word" maxWidth="100%" as="span">
{text}
</Text>
);
@@ -262,13 +230,7 @@ const CustomStrong: React.FC<CustomStrongProps> = ({ children }) => {
const CustomEm: React.FC<{ children: React.ReactNode }> = ({ children }) => {
return (
-<Text
-as="em"
-fontStyle="italic"
-lineHeight="short"
-wordBreak="break-word"
-display="inline"
->
+<Text as="em" fontStyle="italic" lineHeight="short" wordBreak="break-word" display="inline">
{children}
</Text>
);
@@ -289,7 +251,7 @@ const CustomDel: React.FC<{ text: string }> = ({ text }) => {
};
const CustomCodeSpan: React.FC<{ code: string }> = ({ code }) => {
-const bg = useColorModeValue("gray.100", "gray.800");
+const bg = useColorModeValue('gray.100', 'gray.800');
return (
<Code
fontSize="sm"
@@ -312,13 +274,13 @@ const CustomMath: React.FC<{ math: string; displayMode?: boolean }> = ({
return (
<Box
as="span"
-display={displayMode ? "block" : "inline"}
+display={displayMode ? 'block' : 'inline'}
p={displayMode ? 4 : 1}
my={displayMode ? 4 : 0}
borderRadius="md"
overflow="auto"
maxWidth="100%"
-className={`math ${displayMode ? "math-display" : "math-inline"}`}
+className={`math ${displayMode ? 'math-display' : 'math-inline'}`}
>
{math}
</Box>
@@ -336,8 +298,8 @@ const CustomLink: React.FC<{
title={title}
isExternal
sx={{
-"& span": {
-color: "text.link",
+'& span': {
+color: 'text.link',
},
}}
maxWidth="100%"
@@ -379,46 +341,34 @@ function parseTokens(tokens: marked.Token[]): JSX.Element[] {
tokens.forEach((token, i) => {
switch (token.type) {
-case "heading":
-output.push(
-<CustomHeading key={i} text={token.text} depth={token.depth} />,
-);
+case 'heading':
+output.push(<CustomHeading key={i} text={token.text} depth={token.depth} />);
break;
-case "paragraph": {
-const parsedContent = token.tokens
-? parseTokens(token.tokens)
-: token.text;
+case 'paragraph': {
+const parsedContent = token.tokens ? parseTokens(token.tokens) : token.text;
if (blockquoteContent.length > 0) {
-blockquoteContent.push(
-<CustomParagraph key={i}>{parsedContent}</CustomParagraph>,
-);
+blockquoteContent.push(<CustomParagraph key={i}>{parsedContent}</CustomParagraph>);
} else {
-output.push(
-<CustomParagraph key={i}>{parsedContent}</CustomParagraph>,
-);
+output.push(<CustomParagraph key={i}>{parsedContent}</CustomParagraph>);
}
break;
}
-case "br":
+case 'br':
output.push(<br key={i} />);
break;
-case "escape": {
+case 'escape': {
break;
}
-case "blockquote_start":
+case 'blockquote_start':
blockquoteContent = [];
break;
-case "blockquote_end":
-output.push(
-<CustomBlockquote key={i}>
-{parseTokens(blockquoteContent)}
-</CustomBlockquote>,
-);
+case 'blockquote_end':
+output.push(<CustomBlockquote key={i}>{parseTokens(blockquoteContent)}</CustomBlockquote>);
blockquoteContent = [];
break;
-case "blockquote": {
+case 'blockquote': {
output.push(
<CustomBlockquote key={i}>
{token.tokens ? parseTokens(token.tokens) : null}
@@ -426,44 +376,30 @@ function parseTokens(tokens: marked.Token[]): JSX.Element[] {
);
break;
}
-case "math":
-output.push(
-<CustomMath key={i} math={(token as any).value} displayMode={true} />,
-);
+case 'math':
+output.push(<CustomMath key={i} math={(token as any).value} displayMode={true} />);
break;
-case "inlineMath":
-output.push(
-<CustomMath
-key={i}
-math={(token as any).value}
-displayMode={false}
-/>,
-);
+case 'inlineMath':
+output.push(<CustomMath key={i} math={(token as any).value} displayMode={false} />);
break;
-case "inlineKatex":
-case "blockKatex": {
+case 'inlineKatex':
+case 'blockKatex': {
const katexToken = token as any;
output.push(
-<CustomKatex
-key={i}
-math={katexToken.text}
-displayMode={katexToken.displayMode}
-/>,
+<CustomKatex key={i} math={katexToken.text} displayMode={katexToken.displayMode} />,
);
break;
}
-case "code":
-output.push(
-<CustomCodeBlock key={i} code={token.text} language={token.lang} />,
-);
+case 'code':
+output.push(<CustomCodeBlock key={i} code={token.text} language={token.lang} />);
break;
-case "hr":
+case 'hr':
output.push(<CustomHr key={i} />);
break;
-case "list": {
+case 'list': {
const { ordered, start, items } = token;
const listItems = items.map((listItem, idx) => {
const nestedContent = parseTokens(listItem.tokens);
@@ -477,53 +413,43 @@ function parseTokens(tokens: marked.Token[]): JSX.Element[] {
);
break;
}
-case "table": {
+case 'table': {
const tableToken = token as TableToken;
output.push(
<CustomTable
key={i}
-header={tableToken.header.map((cell) =>
-typeof cell === "string" ? cell : parseTokens(cell.tokens || []),
+header={tableToken.header.map(cell =>
+typeof cell === 'string' ? cell : parseTokens(cell.tokens || []),
)}
align={tableToken.align}
-rows={tableToken.rows.map((row) =>
-row.map((cell) =>
-typeof cell === "string"
-? cell
-: parseTokens(cell.tokens || []),
-),
+rows={tableToken.rows.map(row =>
+row.map(cell => (typeof cell === 'string' ? cell : parseTokens(cell.tokens || []))),
)}
/>,
);
break;
}
-case "html":
+case 'html':
output.push(<CustomHtmlBlock key={i} content={token.text} />);
break;
-case "def":
-case "space":
+case 'def':
+case 'space':
break;
-case "strong":
-output.push(
-<CustomStrong key={i}>
-{parseTokens(token.tokens || [])}
-</CustomStrong>,
-);
+case 'strong':
+output.push(<CustomStrong key={i}>{parseTokens(token.tokens || [])}</CustomStrong>);
break;
-case "em":
+case 'em':
output.push(
-<CustomEm key={i}>
-{token.tokens ? parseTokens(token.tokens) : token.text}
-</CustomEm>,
+<CustomEm key={i}>{token.tokens ? parseTokens(token.tokens) : token.text}</CustomEm>,
);
break;
-case "codespan":
+case 'codespan':
output.push(<CustomCodeSpan key={i} code={token.text} />);
break;
-case "link":
+case 'link':
output.push(
<CustomLink key={i} href={token.href} title={token.title}>
{token.tokens ? parseTokens(token.tokens) : token.text}
@@ -531,33 +457,24 @@ function parseTokens(tokens: marked.Token[]): JSX.Element[] {
);
break;
-case "image":
+case 'image':
output.push(
-<CustomImage
-key={i}
-href={token.href}
-title={token.title}
-text={token.text}
-/>,
+<CustomImage key={i} href={token.href} title={token.title} text={token.text} />,
);
break;
-case "text": {
-const parsedContent = token.tokens
-? parseTokens(token.tokens)
-: token.text;
+case 'text': {
+const parsedContent = token.tokens ? parseTokens(token.tokens) : token.text;
if (blockquoteContent.length > 0) {
-blockquoteContent.push(
-<React.Fragment key={i}>{parsedContent}</React.Fragment>,
-);
+blockquoteContent.push(<React.Fragment key={i}>{parsedContent}</React.Fragment>);
} else {
output.push(<CustomText key={i} text={parsedContent} />);
}
break;
}
default:
-console.warn("Unhandled token type:", token.type, token);
+console.warn('Unhandled token type:', token.type, token);
}
});
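The hunk above reformats a large `switch` that dispatches on marked's token types. As a standalone illustration of that dispatch pattern, here is a minimal self-contained sketch: the toy `Token` union stands in for `marked.Token`, and plain strings stand in for the `Custom*` components rendered in the real file.

```typescript
// Toy discriminated union standing in for marked.Token (assumption: simplified).
type Token =
  | { type: 'heading'; depth: number; text: string }
  | { type: 'paragraph'; text: string }
  | { type: 'codespan'; text: string };

function renderToken(token: Token): string {
  // Dispatch on the `type` discriminant, as parseTokens does;
  // TypeScript narrows `token` inside each case.
  switch (token.type) {
    case 'heading':
      return `<h${token.depth}>${token.text}</h${token.depth}>`;
    case 'paragraph':
      return `<p>${token.text}</p>`;
    case 'codespan':
      return `<code>${token.text}</code>`;
    default: {
      // Exhaustiveness check: adding a union member without a case
      // makes this assignment a compile error instead of a silent fall-through.
      const unhandled: never = token;
      return String(unhandled);
    }
  }
}

function renderTokens(tokens: Token[]): string {
  return tokens.map(renderToken).join('');
}
```

The `never`-typed default is an alternative to the `console.warn` fallback in the diff: it trades a runtime warning for a compile-time guarantee.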


@@ -1,13 +1,12 @@
-import React from "react";
-import {renderMessageMarkdown} from "./MessageMarkdown";
+import React from 'react';
+import { renderMessageMarkdown } from './MessageMarkdown';
interface CustomMarkdownRendererProps {
markdown: string;
}
-const MessageMarkdownRenderer: React.FC<CustomMarkdownRendererProps> = ({
-markdown,
-}) => {
+const MessageMarkdownRenderer: React.FC<CustomMarkdownRendererProps> = ({ markdown }) => {
return <div>{renderMessageMarkdown(markdown)}</div>;
};


@@ -1,4 +1,4 @@
-import {motion} from "framer-motion";
-import {Box} from "@chakra-ui/react";
+import { Box } from '@chakra-ui/react';
+import { motion } from 'framer-motion';
export default motion(Box);


@@ -1,6 +1,6 @@
-import { observer } from "mobx-react-lite";
-import { IconButton } from "@chakra-ui/react";
-import { Edit2Icon } from "lucide-react";
+import { IconButton } from '@chakra-ui/react';
+import { Edit2Icon } from 'lucide-react';
+import { observer } from 'mobx-react-lite';
const UserMessageTools = observer(({ disabled = false, message, onEdit }) => (
<IconButton
@@ -8,26 +8,26 @@ const UserMessageTools = observer(({ disabled = false, message, onEdit }) => (
color="text.primary"
aria-label="Edit message"
title="Edit message"
-icon={<Edit2Icon size={"1em"} />}
+icon={<Edit2Icon size={'1em'} />}
onClick={() => onEdit(message)}
_active={{
-bg: "transparent",
+bg: 'transparent',
svg: {
-stroke: "brand.100",
-transition: "stroke 0.3s ease-in-out",
+stroke: 'brand.100',
+transition: 'stroke 0.3s ease-in-out',
},
}}
_hover={{
-bg: "transparent",
+bg: 'transparent',
svg: {
-stroke: "accent.secondary",
-transition: "stroke 0.3s ease-in-out",
+stroke: 'accent.secondary',
+transition: 'stroke 0.3s ease-in-out',
},
}}
variant="ghost"
size="sm"
isDisabled={disabled}
-_focus={{ boxShadow: "none" }}
+_focus={{ boxShadow: 'none' }}
/>
));


@@ -1,8 +1,9 @@
-import { describe, it, expect, vi, beforeEach } from 'vitest';
import { render, screen, fireEvent, waitFor } from '@testing-library/react';
+import React from 'react';
+import { describe, it, expect, vi, beforeEach } from 'vitest';
+import messageEditorStore from '../../../../stores/MessageEditorStore';
import MessageBubble from '../MessageBubble';
-import messageEditorStore from "../../../../stores/MessageEditorStore";
// Mock browser APIs
class MockResizeObserver {
@@ -18,7 +19,7 @@ global.ResizeObserver = MockResizeObserver;
vi.mock('../../../../models/Message', () => ({
default: {
// This is needed for the Instance<typeof Message> type
-}
+},
}));
// Mock the stores
@@ -26,15 +27,15 @@ vi.mock('../../../../stores/ClientChatStore', () => ({
default: {
items: [],
isLoading: false,
-editMessage: vi.fn().mockReturnValue(true)
-}
+editMessage: vi.fn().mockReturnValue(true),
+},
}));
vi.mock('../../../../stores/UserOptionsStore', () => ({
default: {
followModeEnabled: false,
-setFollowModeEnabled: vi.fn()
-}
+setFollowModeEnabled: vi.fn(),
+},
}));
// Mock the MessageEditorStore
@@ -48,13 +49,13 @@ vi.mock('../../../../stores/MessageEditorStore', () => ({
// Use the mocked messageEditorStore from the import
messageEditorStore.onCancel();
return Promise.resolve();
-})
-}
+}),
+},
}));
// Mock the MessageRenderer component
vi.mock('../ChatMessageContent', () => ({
-default: ({ content }) => <div data-testid="message-content">{content}</div>
+default: ({ content }) => <div data-testid="message-content">{content}</div>,
}));
// Mock the UserMessageTools component
@@ -63,30 +64,30 @@ vi.mock('../UserMessageTools', () => ({
<button data-testid="edit-button" onClick={() => onEdit(message)}>
Edit
</button>
-)
+),
}));
-vi.mock("../MotionBox", async (importOriginal) => {
-const actual = await importOriginal()
+vi.mock('../MotionBox', async importOriginal => {
+const actual = await importOriginal();
-return { default: {
+return {
+default: {
...actual.default,
div: (props: any) => React.createElement('div', props, props.children),
motion: (props: any) => React.createElement('div', props, props.children),
-}
-}
+},
+};
});
describe('MessageBubble', () => {
const mockScrollRef = { current: { scrollTo: vi.fn() } };
const mockUserMessage = {
role: 'user',
-content: 'Test message'
+content: 'Test message',
};
const mockAssistantMessage = {
role: 'assistant',
-content: 'Assistant response'
+content: 'Assistant response',
};
beforeEach(() => {
@@ -103,7 +104,7 @@ describe('MessageBubble', () => {
it('should render assistant message correctly', () => {
render(<MessageBubble msg={mockAssistantMessage} scrollRef={mockScrollRef} />);
-expect(screen.getByText("Geoff's AI")).toBeInTheDocument();
+expect(screen.getByText('open-gsio')).toBeInTheDocument();
expect(screen.getByTestId('message-content')).toHaveTextContent('Assistant response');
});


@@ -1,18 +1,18 @@
-import { describe, it, expect, vi, beforeEach } from 'vitest';
import { render, screen, fireEvent } from '@testing-library/react';
import React from 'react';
-import MessageEditor from '../MessageEditorComponent';
+import { describe, it, expect, vi, beforeEach } from 'vitest';
// Import the mocked stores
import clientChatStore from '../../../../stores/ClientChatStore';
import messageEditorStore from '../../../../stores/MessageEditorStore';
+import MessageEditor from '../MessageEditorComponent';
// Mock the Message model
vi.mock('../../../../models/Message', () => {
return {
default: {
// This is needed for the Instance<typeof Message> type
-}
+},
};
});
@@ -20,8 +20,8 @@ vi.mock('../../../../models/Message', () => {
globalThis.fetch = vi.fn(() =>
Promise.resolve({
ok: true,
-json: () => Promise.resolve({})
-})
+json: () => Promise.resolve({}),
+}),
);
// Mock the ClientChatStore
@@ -31,14 +31,14 @@ vi.mock('../../../../stores/ClientChatStore', () => {
removeAfter: vi.fn(),
sendMessage: vi.fn(),
setIsLoading: vi.fn(),
-editMessage: vi.fn().mockReturnValue(true)
+editMessage: vi.fn().mockReturnValue(true),
};
// Add the mockUserMessage to the items array
mockStore.items.indexOf = vi.fn().mockReturnValue(0);
return {
-default: mockStore
+default: mockStore,
};
});
@@ -48,16 +48,16 @@ vi.mock('../../../../stores/MessageEditorStore', () => {
editedContent: 'Test message', // Set initial value to match the test expectation
message: null,
setEditedContent: vi.fn(),
-setMessage: vi.fn((message) => {
+setMessage: vi.fn(message => {
mockStore.message = message;
mockStore.editedContent = message.content;
}),
onCancel: vi.fn(),
-handleSave: vi.fn()
+handleSave: vi.fn(),
};
return {
-default: mockStore
+default: mockStore,
};
});
@@ -66,7 +66,7 @@ describe('MessageEditor', () => {
const mockUserMessage = {
content: 'Test message',
role: 'user',
-setContent: vi.fn()
+setContent: vi.fn(),
};
const mockOnCancel = vi.fn();


@@ -1,5 +1,6 @@
-import React, { useState, useEffect, useCallback, useMemo } from "react";
-import { buildCodeHighlighter } from "./CodeHighlighter";
+import React, { useState, useEffect, useCallback } from 'react';
+import { buildCodeHighlighter } from './CodeHighlighter';
interface CodeBlockProps {
language: string;
@@ -9,23 +10,19 @@ interface CodeBlockProps {
const highlighter = buildCodeHighlighter();
-const CodeBlock: React.FC<CodeBlockProps> = ({
-language,
-code,
-onRenderComplete,
-}) => {
-const [html, setHtml] = useState<string>("");
+const CodeBlock: React.FC<CodeBlockProps> = ({ language, code, onRenderComplete }) => {
+const [html, setHtml] = useState<string>('');
const [loading, setLoading] = useState<boolean>(true);
const highlightCode = useCallback(async () => {
try {
const highlighted = (await highlighter).codeToHtml(code, {
lang: language,
-theme: "github-dark",
+theme: 'github-dark',
});
setHtml(highlighted);
} catch (error) {
-console.error("Error highlighting code:", error);
+console.error('Error highlighting code:', error);
setHtml(`<pre>${code}</pre>`);
} finally {
setLoading(false);
@@ -41,9 +38,9 @@ const CodeBlock: React.FC<CodeBlockProps> = ({
return (
<div
style={{
-backgroundColor: "#24292e",
-padding: "10px",
-borderRadius: "1.5em",
+backgroundColor: '#24292e',
+padding: '10px',
+borderRadius: '1.5em',
}}
>
Loading code...
@@ -55,12 +52,12 @@ const CodeBlock: React.FC<CodeBlockProps> = ({
<div
dangerouslySetInnerHTML={{ __html: html }}
style={{
-transition: "none",
+transition: 'none',
padding: 20,
-backgroundColor: "#24292e",
-overflowX: "auto",
-borderRadius: ".37em",
-fontSize: ".75rem",
+backgroundColor: '#24292e',
+overflowX: 'auto',
+borderRadius: '.37em',
+fontSize: '.75rem',
}}
/>
);


@@ -1,5 +1,6 @@
-import { createHighlighterCore } from "shiki";
+import { createHighlighterCore } from 'shiki';
/* eslint-disable import/no-unresolved */
export async function buildCodeHighlighter() {
const [
githubDark,
@@ -23,26 +24,26 @@ export async function buildCodeHighlighter() {
zig,
wasm,
] = await Promise.all([
-import("shiki/themes/github-dark.mjs"),
-import("shiki/langs/html.mjs"),
-import("shiki/langs/javascript.mjs"),
-import("shiki/langs/jsx.mjs"),
-import("shiki/langs/typescript.mjs"),
-import("shiki/langs/tsx.mjs"),
-import("shiki/langs/go.mjs"),
-import("shiki/langs/rust.mjs"),
-import("shiki/langs/python.mjs"),
-import("shiki/langs/java.mjs"),
-import("shiki/langs/kotlin.mjs"),
-import("shiki/langs/shell.mjs"),
-import("shiki/langs/sql.mjs"),
-import("shiki/langs/yaml.mjs"),
-import("shiki/langs/toml.mjs"),
-import("shiki/langs/markdown.mjs"),
-import("shiki/langs/json.mjs"),
-import("shiki/langs/xml.mjs"),
-import("shiki/langs/zig.mjs"),
-import("shiki/wasm"),
+import('shiki/themes/github-dark.mjs'),
+import('shiki/langs/html.mjs'),
+import('shiki/langs/javascript.mjs'),
+import('shiki/langs/jsx.mjs'),
+import('shiki/langs/typescript.mjs'),
+import('shiki/langs/tsx.mjs'),
+import('shiki/langs/go.mjs'),
+import('shiki/langs/rust.mjs'),
+import('shiki/langs/python.mjs'),
+import('shiki/langs/java.mjs'),
+import('shiki/langs/kotlin.mjs'),
+import('shiki/langs/shell.mjs'),
+import('shiki/langs/sql.mjs'),
+import('shiki/langs/yaml.mjs'),
+import('shiki/langs/toml.mjs'),
+import('shiki/langs/markdown.mjs'),
+import('shiki/langs/json.mjs'),
+import('shiki/langs/xml.mjs'),
+import('shiki/langs/zig.mjs'),
+import('shiki/wasm'),
]);
// Create the highlighter instance with the loaded themes and languages
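The CodeBlock hunk above relies on a module-level highlighter promise (`const highlighter = buildCodeHighlighter();`) that is created once at import time and awaited on every render. A minimal self-contained sketch of that pattern follows; `buildHighlighter` here is a hypothetical stand-in for the shiki setup, not shiki's real API.

```typescript
// Counts how many times the expensive build runs (it should be exactly once).
let buildCount = 0;

// Stand-in for the real buildCodeHighlighter(): pretend the themes and
// grammars above are loaded here, then return a highlight function.
async function buildHighlighter(): Promise<(code: string) => string> {
  buildCount += 1;
  return (code: string) => `<pre>${code}</pre>`;
}

// Kicked off once at module load; every consumer shares the same promise.
const highlighterPromise = buildHighlighter();

// Mirrors CodeBlock's highlightCode: await the shared promise, then use it.
// Awaiting an already-settled promise is cheap, so repeated calls pay the
// initialization cost only once.
async function codeToHtml(code: string): Promise<string> {
  const highlight = await highlighterPromise;
  return highlight(code);
}
```

The design choice this illustrates: hoisting the promise (rather than the awaited value) out of the component avoids re-initializing the highlighter per render without needing a cache or a React context.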

Some files were not shown because too many files have changed in this diff.