75 Commits
2.0 ... main

Author SHA1 Message Date
Geoff Seemueller
03c83b0a2e Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-08-16 10:22:17 -04:00
geoffsee
ae6a6e4064 Refactor model filtering logic into reusable basicFilters function. 2025-07-31 10:10:35 -04:00
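The `basicFilters` refactor described in the commit above could look roughly like this; the `Model` shape, field names, and filtering rule are illustrative assumptions, not the repository's actual types:

```typescript
// Hypothetical sketch of a reusable basicFilters helper.
// The Model shape and the supportsText field are assumptions.
interface Model {
  id: string;
  supportsText: boolean;
}

// Collect the shared filtering rules in one place so every
// provider listing can reuse them instead of duplicating checks.
function basicFilters(models: Model[]): Model[] {
  return models.filter(m => m.supportsText);
}

const filtered = basicFilters([
  { id: 'llama-3', supportsText: true },
  { id: 'sdxl', supportsText: false },
]);
// filtered now holds only the text-capable model
```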
geoffsee
67483d08db Update model path handling logic for FireworksAI and refine supported model filtering. 2025-07-27 12:30:47 -04:00
geoffsee
53268b528d Update hero label for home route in renderer routes. 2025-07-27 09:32:46 -04:00
geoffsee
f9d5fc8282 Remove unused submodules and related scripts 2025-07-27 09:00:25 -04:00
geoffsee
ce9bc4db07 "Swap default states for mapActive and aiActive in LandingComponent" 2025-07-17 14:11:15 -04:00
geoffsee
bd71bfcad3 - Remove unused BevyScene and related dependencies.
- Refactor `InstallButton` and relocate it to `install/`.
- Update `Toolbar` imports to reflect the new `InstallButton` structure.
- Introduce `handleInstall` functionality for PWA installation prompt handling.
2025-07-17 14:04:47 -04:00
Geoff Seemueller
4edee1e191 Potential fix for code scanning alert no. 5: Shell command built from environment values
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-07-17 13:47:50 -04:00
geoffsee
734f48d4a7 remove webhost in assistant prompt 2025-07-17 13:47:50 -04:00
geoffsee
66363cdf39 set ai as the default landing 2025-07-17 13:47:50 -04:00
geoffsee
36f8fcee87 Integrate PWA service worker registration using virtual:pwa-register. 2025-07-17 13:47:50 -04:00
geoffsee
f055cd39fe Update InputMenu to use clientChatStore.reset() instead of setActiveConversation when closing. 2025-07-17 13:47:50 -04:00
geoffsee
0183503425 Refactored layout components and styling: removed unused imports, adjusted positioning and padding for consistency. 2025-07-17 13:47:50 -04:00
geoffsee
a7ad06093a simplify landing page for my peace 2025-07-17 13:47:50 -04:00
geoffsee
c26d2467f4 sweet lander 2025-07-17 13:47:50 -04:00
geoffsee
818e0e672a chat + maps + ai + tools 2025-07-17 13:47:50 -04:00
geoffsee
48655474e3 mirror error handling behavior in cloudflare worker 2025-07-17 13:47:50 -04:00
geoffsee
ffabfd4ce5 add top level error handler to the router 2025-07-17 13:47:50 -04:00
geoffsee
fa5b7466bc Optimize WASM handling and integrate service worker caching.
Removed unused pointer events in BevyScene, updated Vite config with Workbox for service worker caching, and adjusted file paths in generate-bevy-bundle.js. Added WASM size optimization to ensure smaller and efficient builds, skipping optimization for files below 30MB.
2025-07-17 13:47:50 -04:00
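The size gate the commit above describes (skipping optimization for files below 30MB) could be sketched as follows; only the 30MB figure comes from the commit message, while the constant and function names are assumptions:

```typescript
// Hypothetical sketch of the WASM size gate described above.
// Names are illustrative; only the 30 MB threshold is from the commit.
const WASM_OPT_THRESHOLD_BYTES = 30 * 1024 * 1024;

function shouldOptimizeWasm(sizeBytes: number): boolean {
  // Small bundles are served as-is: optimizing them costs build
  // time for little size benefit.
  return sizeBytes >= WASM_OPT_THRESHOLD_BYTES;
}
```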
geoffsee
6cc5e038a7 Add visible prop to toggle components and simplify conditional rendering 2025-07-17 13:47:50 -04:00
geoffsee
e72198628c Add "Install App" button to the toolbar using react-use-pwa-install library 2025-07-17 13:47:50 -04:00
geoffsee
c0428094c8 **Integrate PWA asset generator and update favicon and manifest configuration** 2025-07-17 13:47:50 -04:00
geoffsee
3901337163 - Refactor BevyScene to replace script injection with dynamic import.
- Update `NavItem` to provide fallback route for invalid `path`.
- Temporarily stub metric API endpoints with placeholders.
2025-07-17 13:47:50 -04:00
geoffsee
0ff8b5c03e * Introduced BevyScene React component in landing-component for rendering a 3D cockpit visualization.
* Included WebAssembly asset `yachtpit.js` for cockpit functionality.
* Added Bevy MIT license file.
* Implemented a service worker to cache assets locally instead of fetching them remotely.
* Added collapsible functionality to **Tweakbox** and included the `@chakra-ui/icons` dependency.
* Applied the `hidden` prop to the Tweakbox Heading for better accessibility.
* Refactored **Particles** component for improved performance, clarity, and maintainability.

  * Introduced helper functions for particle creation and count management.
  * Added responsive resizing with particle repositioning.
  * Optimized animation updates, including velocity adjustments for speed changes.
  * Ensured canvas size and particle state are cleanly managed on component unmount.
2025-07-17 13:47:50 -04:00
geoffsee
858282929c Refactor chat-stream-provider to simplify tool structure. Optimize WeatherTool implementation with enriched function schema. 2025-07-17 13:47:50 -04:00
geoffsee
06b6a68b9b Enable tool-based message generation in chat-stream-provider and add BasicValueTool and WeatherTool.
Updated dependencies to latest versions in `bun.lock`. Modified development script in `package.json` to include watch mode.
2025-07-17 13:47:50 -04:00
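A WeatherTool with an "enriched function schema", as the two commits above describe, might be declared along these lines; the fields follow the common OpenAI-style function-calling format, and the specific properties are assumptions rather than the project's actual code:

```typescript
// Hypothetical OpenAI-style tool declaration for a WeatherTool.
// The schema shape is the widely used function-calling format;
// the concrete parameters are illustrative assumptions.
const weatherTool = {
  type: 'function',
  function: {
    name: 'get_weather',
    description: 'Look up current weather conditions for a location.',
    parameters: {
      type: 'object',
      properties: {
        location: { type: 'string', description: 'City or place name' },
        unit: { type: 'string', enum: ['celsius', 'fahrenheit'] },
      },
      required: ['location'],
    },
  },
} as const;
```

An enriched schema like this gives the model enough context (descriptions, enums, required fields) to call the tool with well-formed arguments.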
dependabot[bot]
de968bcfbd Bump dotenv from 16.6.1 to 17.0.0
Bumps [dotenv](https://github.com/motdotla/dotenv) from 16.6.1 to 17.0.0.
- [Changelog](https://github.com/motdotla/dotenv/blob/master/CHANGELOG.md)
- [Commits](https://github.com/motdotla/dotenv/compare/v16.6.1...v17.0.0)

---
updated-dependencies:
- dependency-name: dotenv
  dependency-version: 17.0.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-01 11:00:15 -04:00
dependabot[bot]
6e8d9f3534 Bump react-streaming from 0.3.50 to 0.4.2
Bumps [react-streaming](https://github.com/brillout/react-streaming) from 0.3.50 to 0.4.2.
- [Changelog](https://github.com/brillout/react-streaming/blob/main/CHANGELOG.md)
- [Commits](https://github.com/brillout/react-streaming/compare/v0.3.50...v0.4.2)

---
updated-dependencies:
- dependency-name: react-streaming
  dependency-version: 0.4.2
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-01 10:59:30 -04:00
geoffsee
57ad9df087 fix wrangler config schema ref 2025-06-26 14:21:11 -04:00
geoffsee
610cb711a4 fix eslint version to 8 2025-06-26 12:40:54 -04:00
geoffsee
8cba09e67b - Add cache refresh mechanism for providers in ChatService
- Implemented tests to validate caching logic based on provider changes
- Enhanced caching logic to include a provider signature for more precise cache validation
2025-06-25 19:12:14 -04:00
geoffsee
c8e6da2d15 Add Docker support with Dockerfile and docker-compose.yml, update build scripts and README for containerized deployment.
- Updated server `Bun.build` configuration: adjusted `outdir`, added `format` as `esm`, and set `@open-gsio/client` to external.
- Expanded README with Docker instructions.
- Added new package `@open-gsio/analytics-worker`.
- Upgraded dependencies (`vite`, `typescript`, `bun`) and locked `pnpm` version in `package.json`.
2025-06-25 18:13:52 -04:00
geoffsee
1dab5aaa14 Bun server handles static assets and api 2025-06-25 16:46:46 -04:00
geoffsee
a295c208e9 Update React, React-DOM, and related dependencies to latest versions. 2025-06-25 16:30:42 -04:00
dependabot[bot]
713f0ffe8b Bump @anthropic-ai/sdk from 0.32.1 to 0.54.0
Bumps [@anthropic-ai/sdk](https://github.com/anthropics/anthropic-sdk-typescript) from 0.32.1 to 0.54.0.
- [Release notes](https://github.com/anthropics/anthropic-sdk-typescript/releases)
- [Changelog](https://github.com/anthropics/anthropic-sdk-typescript/blob/main/CHANGELOG.md)
- [Commits](https://github.com/anthropics/anthropic-sdk-typescript/compare/sdk-v0.32.1...sdk-v0.54.0)

---
updated-dependencies:
- dependency-name: "@anthropic-ai/sdk"
  dependency-version: 0.54.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-25 16:22:59 -04:00
dependabot[bot]
a793bfe8e0 Bump react-dom from 18.3.1 to 19.1.0
Bumps [react-dom](https://github.com/facebook/react/tree/HEAD/packages/react-dom) from 18.3.1 to 19.1.0.
- [Release notes](https://github.com/facebook/react/releases)
- [Changelog](https://github.com/facebook/react/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/react/commits/v19.1.0/packages/react-dom)

---
updated-dependencies:
- dependency-name: react-dom
  dependency-version: 19.1.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-25 16:22:51 -04:00
dependabot[bot]
d594929998 Bump @testing-library/react from 14.3.1 to 16.3.0
Bumps [@testing-library/react](https://github.com/testing-library/react-testing-library) from 14.3.1 to 16.3.0.
- [Release notes](https://github.com/testing-library/react-testing-library/releases)
- [Changelog](https://github.com/testing-library/react-testing-library/blob/main/CHANGELOG.md)
- [Commits](https://github.com/testing-library/react-testing-library/compare/v14.3.1...v16.3.0)

---
updated-dependencies:
- dependency-name: "@testing-library/react"
  dependency-version: 16.3.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-25 16:22:36 -04:00
geoffsee
6d9bf79ba3 Update tests to use updated HUMAN/ASSISTANT format instead of **Human**/**Assistant**. 2025-06-25 16:16:23 -04:00
geoffsee
6b5928de7f Update AssetService SSR handling tests: refine mocks and add edge cases 2025-06-25 16:12:59 -04:00
geoffsee
f9249f3496 - Refactored to introduce handleSsr function in @open-gsio/client/server/index.ts for streamlined SSR handling.
- Replaced inline SSR logic in `AssetService.ts` with `handleSsr` import.
- Enhanced `build:client` script to ensure server directory creation.
- Updated dependencies and devDependencies across multiple packages for compatibility improvements.
2025-06-25 16:03:13 -04:00
geoffsee
93bec55585 Add bun wrangler tail log script and filter non-text models 2025-06-25 14:32:54 -04:00
geoffsee
8cdb6b8c94 - Refine assistant output formatting by removing bold headers and adjusting response template.
- Update `package.json` across multiple packages to include missing newline and add package manager metadata.
- Minor README formatting fixes to remove unnecessary trailing spaces.
2025-06-25 14:15:01 -04:00
geoffsee
48bedb8c74 fix nonexistent suite 2025-06-25 14:00:16 -04:00
geoffsee
068d8614e0 tests updated with new import 2025-06-25 14:00:16 -04:00
geoffsee
554096abb2 wip 2025-06-25 14:00:16 -04:00
geoffsee
21d6c8604e github button targets repo 2025-06-24 20:56:08 -04:00
geoffsee
de3173a8f8 add missing files to last commit 2025-06-24 20:46:36 -04:00
geoffsee
c6e09644e2 **Refactor:** Restructure server package to streamline imports and improve file organization
- Moved `providers`, `services`, `models`, `lib`, and related files to `src` directory within `server` package.
- Adjusted imports across the codebase to reflect the new paths.
- Renamed several `.ts` files for consistency.
- Introduced an `index.ts` in the `ai/providers` package to export all providers.

This improves maintainability and aligns with the project's updated directory structure.
2025-06-24 20:46:15 -04:00
geoffsee
0b8d67fc69 remove package manager spec 2025-06-24 17:36:39 -04:00
geoffsee
f76301d620 run format 2025-06-24 17:32:59 -04:00
geoffsee
02c3253343 adds eslint 2025-06-24 17:32:59 -04:00
geoffsee
9698fc6f3b Refactor project: remove unused code, clean up logs, streamline error handling, update TypeScript configs, and enhance message streaming.
- Deployed
2025-06-24 16:28:25 -04:00
geoffsee
004ec580d3 Remove unused ResumeComponent, ServicesComponent, and related sections. Update theming for SupportThisSiteModal, adjust DogecoinIcon, and refine Cloudflare worker references. 2025-06-24 15:51:39 -04:00
geoffsee
bdbc8de6d5 **Remove dead links and redundant comments; improve styling and clarity across multiple files**
- Removed outdated links and unused properties in Sidebar and Welcome Home Text files.
- Dropped extraneous comments and consolidated imports in server files for streamlined code.
- Enhanced MarkdownEditor visuals with a colorful border for better user experience.
2025-06-24 15:23:34 -04:00
geoffsee
a367812fe7 update prompts and ollama endpoint 2025-06-24 15:12:12 -04:00
geoffsee
22bf2f1c2f Fix provider endpoints 2025-06-24 15:01:43 -04:00
geoffsee
02ede2b0f6 Refactor ServerCoordinator and project structure for clearer durable objects organization and module imports. 2025-06-18 15:53:17 -04:00
geoffsee
afc46fe2c3 fix tests 2025-06-18 15:02:29 -04:00
geoffsee
b7f02eb4fb fix mlx omni provider 2025-06-18 14:33:07 -04:00
geoffsee
f1d7f52dbd fixes model initialization for mlx 2025-06-18 13:30:38 -04:00
geoffsee
38b364caeb fix local inference config 2025-06-18 12:38:38 -04:00
geoffsee
3d16bd94b4 **Refactor imports and improve type annotations**
- Adjusted import statements across the codebase to align with consistent use of `type`.
- Unified usage of `EventSource` initialization.
- Introduced `RootDeps` type for shared dependencies.
- Commented out unused VitePWA configuration.
- Updated proxy target URLs in Vite configuration.
2025-06-18 12:34:16 -04:00
geoffsee
7454c9b54b fix build 2025-06-18 10:41:39 -04:00
geoffsee
0c999e0400 fixes tests 2025-06-09 23:18:52 -04:00
geoffsee
362f50bf85 remove faulty test execution pattern 2025-06-09 23:18:52 -04:00
geoffsee
9e79c488ee correct README 2025-06-09 23:18:52 -04:00
geoffsee
370c3e5717 adjust README and local inference configuration script 2025-06-09 23:18:52 -04:00
geoffsee
f29bb6779c improves interoperability of model providers, local and remote providers can be used together seamlessly 2025-06-09 23:18:52 -04:00
Geoff Seemueller
ad7dc5c0a6 Update README.md
improve semantics

Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-05 14:04:08 -04:00
Geoff Seemueller
059e7d3218 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-04 20:19:12 -04:00
geoffsee
6be0316e75 add some missing to last 2025-06-04 20:09:39 -04:00
geoffsee
5bd1e2f77f add Acknowledgments section to README 2025-06-04 20:05:02 -04:00
geoffsee
03aae4d8db fix static fileserver 2025-06-04 19:00:10 -04:00
geoffsee
5d7a7b740a fix package script for server:dev 2025-06-04 18:52:39 -04:00
geoffsee
31d734d4f6 fix incorrect constructor usage 2025-06-04 18:50:59 -04:00
282 changed files with 8652 additions and 6473 deletions

.dockerignore (new file)

@@ -0,0 +1,3 @@
/.wrangler/
/.open-gsio/
/node_modules/

.eslintignore (new file)

@@ -0,0 +1,41 @@
# Dependencies
node_modules
.pnp
.pnp.js

# Build outputs
dist
build
out
.next
.nuxt
.cache

# Test coverage
coverage

# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local

# Editor directories and files
.idea
.vscode
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?

# TypeScript
*.d.ts

.eslintrc.cjs (new file)

@@ -0,0 +1,49 @@
module.exports = {
  root: true,
  parser: '@typescript-eslint/parser',
  parserOptions: {
    ecmaVersion: 2021,
    sourceType: 'module',
    project: './tsconfig.json',
  },
  env: {
    browser: true,
    node: true,
    es6: true,
  },
  globals: {
    Bun: 'readonly',
  },
  plugins: ['@typescript-eslint', 'import', 'prettier'],
  extends: [
    'eslint:recommended',
    'plugin:@typescript-eslint/recommended',
    'plugin:import/errors',
    'plugin:import/warnings',
    'plugin:import/typescript',
    'prettier',
  ],
  rules: {
    'prettier/prettier': 'error',
    '@typescript-eslint/explicit-module-boundary-types': 'off',
    '@typescript-eslint/no-explicit-any': 'warn',
    '@typescript-eslint/no-unused-vars': ['warn', { argsIgnorePattern: '^_' }],
    'import/order': [
      'error',
      {
        'newlines-between': 'always',
        alphabetize: { order: 'asc', caseInsensitive: true },
        groups: ['builtin', 'external', 'internal', 'parent', 'sibling', 'index'],
      },
    ],
  },
  settings: {
    'import/resolver': {
      node: {
        extensions: ['.js', '.jsx', '.ts', '.tsx'],
        moduleDirectory: ['node_modules', 'packages/*/node_modules'],
      },
    },
  },
  ignorePatterns: ['node_modules', 'dist', 'build', '*.d.ts', '*.min.js'],
};

.gitignore

@@ -7,10 +7,21 @@
 **/.idea/
 **/html/
 **/.env
-packages/client/public/static/fonts/*
 **/secrets.json
 **/.dev.vars
-packages/client/public/sitemap.xml
-packages/client/public/robots.txt
-wrangler.dev.jsonc
+wrangler.dev.jsonc
+/packages/client/public/static/fonts/
+/packages/client/public/robots.txt
+/packages/client/public/sitemap.xml
+/packages/client/public/yachtpit.html
+/packages/client/public/yachtpit.js
+/packages/client/public/yachtpit_bg.wasm
+/packages/client/public/assets/
+/packages/client/public/apple-touch-icon-180x180.png
+/packages/client/public/icon.ico
+/packages/client/public/maskable-icon-512x512.png
+/packages/client/public/pwa-64x64.png
+/packages/client/public/pwa-192x192.png
+/packages/client/public/pwa-512x512.png
+packages/client/public/yachtpit_bg*

.prettierignore (new file)

@@ -0,0 +1,47 @@
# Dependencies
node_modules
.pnp
.pnp.js

# Build outputs
dist
build
out
.next
.nuxt
.cache

# Test coverage
coverage

# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local

# Editor directories and files
.idea
.vscode
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?

# Package files
package-lock.json
yarn.lock
pnpm-lock.yaml
bun.lock

# Generated files
CHANGELOG.md

.prettierrc.cjs (new file)

@@ -0,0 +1,19 @@
module.exports = {
  semi: true,
  singleQuote: true,
  trailingComma: 'all',
  printWidth: 100,
  tabWidth: 2,
  useTabs: false,
  bracketSpacing: true,
  arrowParens: 'avoid',
  endOfLine: 'lf',
  overrides: [
    {
      files: '*.{json,yml,yaml,md}',
      options: {
        tabWidth: 2,
      },
    },
  ],
};

Dockerfile (new file)

@@ -0,0 +1,51 @@
FROM oven/bun:latest

WORKDIR /app

# Copy package files first for better caching
COPY package.json bun.lock ./

# Create directory structure for all packages
RUN mkdir -p packages/ai packages/ai/src/types packages/client packages/coordinators packages/env packages/router packages/schema packages/scripts packages/server packages/services packages/cloudflare-workers/analytics packages/cloudflare-workers/open-gsio

# Copy package.json files for all packages
COPY packages/ai/package.json ./packages/ai/
COPY packages/ai/src/types/package.json ./packages/ai/src/types/
COPY packages/client/package.json ./packages/client/
COPY packages/coordinators/package.json ./packages/coordinators/
COPY packages/env/package.json ./packages/env/
COPY packages/router/package.json ./packages/router/
COPY packages/schema/package.json ./packages/schema/
COPY packages/scripts/package.json ./packages/scripts/
COPY packages/server/package.json ./packages/server/
COPY packages/services/package.json ./packages/services/
COPY packages/cloudflare-workers/analytics/package.json ./packages/cloudflare-workers/analytics/
COPY packages/cloudflare-workers/open-gsio/package.json ./packages/cloudflare-workers/open-gsio/

# Install dependencies
RUN bun install

# Copy the rest of the application
COPY . .

# Create .env file if it doesn't exist
RUN touch ./packages/server/.env

# Build client and server
RUN bun build:client && bun build:server

# Ensure the client directories exist
RUN mkdir -p ./client/public ./client/dist/client

# Copy client files to the expected locations
RUN cp -r ./packages/client/public/* ./client/public/ || true
RUN cp -r ./packages/client/dist/* ./client/dist/ || true

EXPOSE 3003

# Verify server.js exists
RUN test -f ./packages/server/dist/server.js || (echo "Error: server.js not found" && exit 1)

CMD ["bun", "./packages/server/dist/server.js"]

LEGACY.md

@@ -1,60 +1,60 @@
-Legacy Development History
----
+## Legacy Development History
 The source code of open-gsio was drawn from the source code of my personal website. That commit history was contaminated early on with secrets. `open-gsio` is a refinement of those sources. A total of 367 commits were submitted to the main branch of the upstream source repository between August 2024 and May 2025.
 #### **May 2025**
-* Added **seemueller.ai** link to UI sidebar.
-* Global config/markdown guide cleanup; patched a critical forgotten bug.
+- Added **seemueller.ai** link to UI sidebar.
+- Global config/markdown guide cleanup; patched a critical forgotten bug.
 #### **Apr 2025**
-* **CI/CD overhaul**: auto-deploy to dev & staging, Bun adoption as package manager, streamlined blocklist workflow (now auto-updates via VPN blocker).
-* New 404 error page; multiple robots.txt and editor-resize fixes; removed dead/duplicate code.
+- **CI/CD overhaul**: auto-deploy to dev & staging, Bun adoption as package manager, streamlined blocklist workflow (now auto-updates via VPN blocker).
+- New 404 error page; multiple robots.txt and editor-resize fixes; removed dead/duplicate code.
 #### **Mar 2025**
-* Introduced **model-specific `max_tokens`** handling and plugged in **Cloudflare AI models** for testing.
-* Bundle size minimised (re-enabled minifier, smaller vendor set).
+- Introduced **model-specific `max_tokens`** handling and plugged in **Cloudflare AI models** for testing.
+- Bundle size minimised (re-enabled minifier, smaller vendor set).
 #### **Feb 2025**
-* **Full theme system** (runtime switching, Centauri theme, server-saved prefs).
-* Tightened MobX typing for messages; responsive breakpoints & input scaling repaired.
-* Dropped legacy document API; general folder restructure.
+- **Full theme system** (runtime switching, Centauri theme, server-saved prefs).
+- Tightened MobX typing for messages; responsive breakpoints & input scaling repaired.
+- Dropped legacy document API; general folder restructure.
 #### **Jan 2025**
-* **Rate-limit middleware**, larger KV/R2 storage quota.
-* Switched default model → *llama-v3p1-70b-instruct*; pluggable model handlers.
-* Added **KaTeX fonts** & **Marked.js** for rich math/markdown.
-* Fireworks key rotation; deprecated Google models removed.
+- **Rate-limit middleware**, larger KV/R2 storage quota.
+- Switched default model → _llama-v3p1-70b-instruct_; pluggable model handlers.
+- Added **KaTeX fonts** & **Marked.js** for rich math/markdown.
+- Fireworks key rotation; deprecated Google models removed.
 #### **Dec 2024**
-* Major package upgrades; **CodeHighlighter** now supports HTML/JSX/TS(X)/Zig.
-* Refactored streaming + markdown renderer; Android-specific padding fixes.
-* Reset default chat model to **gpt-4o**; welcome message & richer search-intent logic.
+- Major package upgrades; **CodeHighlighter** now supports HTML/JSX/TS(X)/Zig.
+- Refactored streaming + markdown renderer; Android-specific padding fixes.
+- Reset default chat model to **gpt-4o**; welcome message & richer search-intent logic.
 #### **Nov 2024**
-* **Fireworks API** + agent server; first-class support for **Anthropic** & **GROQ** models (incl. attachments).
-* **VPN blocker** shipped with CIDR validation and dedicated GitHub Action.
-* Live search buffering, feedback modal, smarter context preprocessing.
+- **Fireworks API** + agent server; first-class support for **Anthropic** & **GROQ** models (incl. attachments).
+- **VPN blocker** shipped with CIDR validation and dedicated GitHub Action.
+- Live search buffering, feedback modal, smarter context preprocessing.
 #### **Oct 2024**
-* Rolled out **image generation** + picker for image models.
-* Deployed **ETH payment processor** & deposit-address flow.
-* Introduced few-shot prompting library; analytics worker refactor; Halloween prompt.
-* Extensive mobile-UX polish and bundling/worker config updates.
+- Rolled out **image generation** + picker for image models.
+- Deployed **ETH payment processor** & deposit-address flow.
+- Introduced few-shot prompting library; analytics worker refactor; Halloween prompt.
+- Extensive mobile-UX polish and bundling/worker config updates.
 #### **Sep 2024**
-* End-to-end **math rendering** (KaTeX) and **GitHub-flavoured markdown**.
-* Migrated chat state to **MobX**; launched analytics service & metrics worker.
-* Switched build minifier to **esbuild**; tokenizer limits enforced; gradient sidebar & cookie-consent manager added.
+- End-to-end **math rendering** (KaTeX) and **GitHub-flavoured markdown**.
+- Migrated chat state to **MobX**; launched analytics service & metrics worker.
+- Switched build minifier to **esbuild**; tokenizer limits enforced; gradient sidebar & cookie-consent manager added.
 #### **Aug 2024**
-* **Initial MVP**: iMessage-style chat UI, websocket prototype, Google Analytics, Cloudflare bindings, base worker-site scaffold.
+- **Initial MVP**: iMessage-style chat UI, websocket prototype, Google Analytics, Cloudflare bindings, base worker-site scaffold.

README.md

@@ -1,42 +1,30 @@
# open-gsio # open-gsio
> Rewrite in-progress.
[![Tests](https://github.com/geoffsee/open-gsio/actions/workflows/test.yml/badge.svg)](https://github.com/geoffsee/open-gsio/actions/workflows/test.yml) [![Tests](https://github.com/geoffsee/open-gsio/actions/workflows/test.yml/badge.svg)](https://github.com/geoffsee/open-gsio/actions/workflows/test.yml)
[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT) [![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)
</br> </br>
<p align="center"> <p align="center">
<img src="https://github.com/user-attachments/assets/620d2517-e7be-4bb0-b2b7-3aa0cba37ef0" width="250" /> <img src="https://github.com/user-attachments/assets/620d2517-e7be-4bb0-b2b7-3aa0cba37ef0" width="250" />
</p> </p>
> **Note**: This project is currently under active development. The styling is a work in progress and some functionality This is a full-stack Conversational AI.
> may be broken. Tests are being actively ported and stability will improve over time. Thank you for your patience!
This is a full-stack Conversational AI. It runs on Cloudflare or Bun.
## Table of Contents ## Table of Contents
- [Stack](#stack)
- [Installation](#installation) - [Installation](#installation)
- [Deployment](#deployment) - [Deployment](#deployment)
- [Docker](#docker)
- [Local Inference](#local-inference) - [Local Inference](#local-inference)
- [mlx-omni-server (default)](#mlx-omni-server) - [mlx-omni-server (default)](#mlx-omni-server)
- [Adding models](#adding-models-for-local-inference-apple-silicon) - [Adding models](#adding-models-for-local-inference-apple-silicon)
- [Ollama](#ollama) - [Ollama](#ollama)
- [Adding models](#adding-models-for-local-inference-ollama) - [Adding models](#adding-models-for-local-inference-ollama)
- [Testing](#testing) - [Testing](#testing)
- [Troubleshooting](#troubleshooting) - [Troubleshooting](#troubleshooting)
- [History](#history) - [Acknowledgments](#acknowledgments)
- [License](#license) - [License](#license)
## Stack
* [TypeScript](https://www.typescriptlang.org/)
* [Vike](https://vike.dev/)
* [React](https://react.dev/)
* [Cloudflare Workers](https://developers.cloudflare.com/workers/)
* [ittyrouter](https://github.com/kwhitley/itty-router)
* [MobXStateTree](https://mobx-state-tree.js.org/)
* [OpenAI SDK](https://github.com/openai/openai-node)
* [Vitest](https://vitest.dev/)
## Installation ## Installation
1. `bun i && bun test:all` 1. `bun i && bun test:all`
@@ -46,30 +34,87 @@ This is a full-stack Conversational AI. It runs on Cloudflare or Bun.
> Note: it should be possible to use pnpm in place of bun. > Note: it should be possible to use pnpm in place of bun.
## Deployment ## Deployment
1. Setup KV_STORAGE binding in `packages/server/wrangler.jsonc` 1. Setup KV_STORAGE binding in `packages/server/wrangler.jsonc`
1. [Add keys in secrets.json](https://console.groq.com/keys) 1. [Add keys in secrets.json](https://console.groq.com/keys)
1. Run `bun run deploy && bun run deploy:secrets && bun run deploy` 1. Run `bun run deploy && bun run deploy:secrets && bun run deploy`
> Note: Subsequent deployments should omit `bun run deploy:secrets` > Note: Subsequent deployments should omit `bun run deploy:secrets`
## Docker
You can run the server using Docker. The image is large but will be slimmed down in future commits.
### Building the Docker Image
```bash
docker compose build
# OR
docker build -t open-gsio .
```
### Running the Docker Container
```bash
docker run -p 3003:3003 \
-e GROQ_API_KEY=your_groq_api_key \
-e FIREWORKS_API_KEY=your_fireworks_api_key \
open-gsio
```
You can omit any environment variables that you don't need. The server will be available at http://localhost:3003.
### Using Docker Compose
A `docker-compose.yml` file is provided in the repository. You can edit it to add your API keys:
```yaml
version: '3'
services:
open-gsio:
build: .
ports:
- "3003:3003"
environment:
- GROQ_API_KEY=your_groq_api_key
- FIREWORKS_API_KEY=your_fireworks_api_key
# Other environment variables are included in the file
restart: unless-stopped
```
Then run:
```bash
docker compose up
```
Or to run in detached mode:
```bash
docker compose up -d
```
## Local Inference ## Local Inference
> Local inference is achieved by overriding the `OPENAI_API_KEY` and `OPENAI_API_ENDPOINT` environment variables. See below.
> Local inference is supported for Ollama and mlx-omni-server. OpenAI compatible servers can be used by overriding OPENAI_API_KEY and OPENAI_API_ENDPOINT.
### mlx-omni-server ### mlx-omni-server
(default) (Apple Silicon Only) - Use Ollama for other platforms.
~~~bash (default) (Apple Silicon Only)
```bash
# (prereq) install mlx-omni-server # (prereq) install mlx-omni-server
brew tap seemueller-io/tap brew tap seemueller-io/tap
brew install seemueller-io/tap/mlx-omni-server brew install seemueller-io/tap/mlx-omni-server
bun run openai:local mlx-omni-server # Start mlx-omni-server bun run openai:local mlx-omni-server # Start mlx-omni-server
bun run openai:local:configure # Configure connection bun run openai:local:configure # Configure connection
bun run server:dev # Restart server bun run server:dev # Restart server
~~~ ```
#### Adding models for local inference (Apple Silicon) #### Adding models for local inference (Apple Silicon)
~~~bash ```bash
# ensure mlx-omni-server is running # ensure mlx-omni-server is running
# See https://huggingface.co/mlx-community for available models # See https://huggingface.co/mlx-community for available models
@@ -81,21 +126,22 @@ curl http://localhost:10240/v1/chat/completions \
\"model\": \"$MODEL_TO_ADD\", \"model\": \"$MODEL_TO_ADD\",
\"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}] \"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]
}" }"
~~~ ```
### Ollama ### Ollama
~~~bash
```bash
bun run openai:local ollama # Start ollama server bun run openai:local ollama # Start ollama server
bun run openai:local:configure # Configure connection bun run openai:local:configure # Configure connection
bun run server:dev # Restart server bun run server:dev # Restart server
~~~ ```
#### Adding models for local inference (ollama) #### Adding models for local inference (ollama)
~~~bash ```bash
# See https://ollama.com/library for available models # See https://ollama.com/library for available models
use the ollama web ui @ http://localhost:8080 use the ollama web ui @ http://localhost:8080
~~~ ```
## Testing ## Testing
@@ -103,20 +149,46 @@ Tests are located in `__tests__` directories next to the code they test. Testing
> `bun test:all` will run all tests > `bun test:all` will run all tests
## Troubleshooting ## Troubleshooting
1. `bun clean` 1. `bun clean`
1. `bun i` 1. `bun i`
1. `bun server:dev` 1. `bun server:dev`
1. `bun client:dev` 1. `bun client:dev`
1. Submit an issue 1. Submit an issue
## History
History
---
A high-level overview for the development history of the parent repository, [geoff-seemueller-io](https://geoff.seemueller.io), is provided in [LEGACY.md](./LEGACY.md). A high-level overview for the development history of the parent repository, [geoff-seemueller-io](https://geoff.seemueller.io), is provided in [LEGACY.md](./LEGACY.md).
## Acknowledgments
I would like to express gratitude to the following projects, libraries, and individuals that have contributed to making open-gsio possible:
- [TypeScript](https://www.typescriptlang.org/) - Primary programming language
- [React](https://react.dev/) - UI library for building the frontend
- [Vike](https://vike.dev/) - Framework for server-side rendering and routing
- [Cloudflare Workers](https://developers.cloudflare.com/workers/) - Serverless execution environment
- [Bun](https://bun.sh/) - JavaScript runtime and toolkit
- [Marked.js](https://github.com/markedjs/marked) - Markdown Rendering
- [Shiki](https://github.com/shikijs/shiki) - Syntax Highlighting
- [itty-router](https://github.com/kwhitley/itty-router) - Lightweight router for serverless environments
- [MobX-State-Tree](https://mobx-state-tree.js.org/) - State management solution
- [OpenAI SDK](https://github.com/openai/openai-node) - Client for AI model integration
- [Vitest](https://vitest.dev/) - Testing framework
- [OpenAI](https://github.com/openai)
- [Groq](https://console.groq.com/) - Fast inference API
- [Anthropic](https://www.anthropic.com/) - Creator of Claude models
- [Fireworks](https://fireworks.ai/) - AI inference platform
- [XAI](https://x.ai/) - Creator of Grok models
- [Cerebras](https://www.cerebras.net/) - AI compute and models
- [(madroidmaq) MLX Omni Server](https://github.com/madroidmaq/mlx-omni-server) - Open-source high-performance inference for Apple Silicon
- [MLX](https://github.com/ml-explore/mlx) - An array framework for Apple silicon
- [Ollama](https://github.com/ollama/ollama) - Versatile solution for self-hosting models
## License

```text
MIT License

Copyright (c) 2025 Geoff Seemueller

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```

#### bun.lock

File diff suppressed because it is too large.

#### bun.lockb

Binary file not shown.

#### docker-compose.yml (new file)

```yaml
version: '3'
services:
  open-gsio:
    image: open-gsio:latest
    build:
      pull: false
      context: .
      dockerfile: Dockerfile
    ports:
      - "3003:3003"
    env_file:
      - ./packages/server/.env
    restart: unless-stopped
```

#### package.json

```diff
@@ -12,19 +12,37 @@
     "clean": "packages/scripts/cleanup.sh",
     "test:all": "bun run --filter='*' tests",
     "client:dev": "(cd packages/client && bun run dev)",
-    "server:dev": "(cd packages/server && bun run dev)",
+    "server:dev": "bun build:client && (cd packages/server && bun run dev)",
-    "build": "(cd packages/cloudflare-workers && bun run deploy:dry-run)",
+    "build": "(cd packages/cloudflare-workers/open-gsio && bun run deploy:dry-run)",
+    "build:client": "(cd packages/client && bun run vite build)",
+    "build:server": "bun --filter=@open-gsio/server run build",
-    "deploy": "(cd packages/cloudflare-workers && bun run deploy)",
+    "deploy": "(cd packages/cloudflare-workers/open-gsio && bun run deploy)",
     "deploy:secrets": "wrangler secret bulk secrets.json -c packages/cloudflare-workers/open-gsio/wrangler.jsonc",
     "openai:local:mlx": "packages/scripts/start_inference_server.sh mlx-omni-server",
     "openai:local:ollama": "packages/scripts/start_inference_server.sh ollama",
-    "openai:local:configure": "packages/scripts/configure_local_inference.sh"
+    "openai:local:configure": "packages/scripts/configure_local_inference.sh",
+    "lint": "eslint . --ext .js,.jsx,.ts,.tsx",
+    "lint:fix": "eslint . --ext .js,.jsx,.ts,.tsx --fix",
+    "format": "prettier --write \"**/*.{js,jsx,ts,tsx,json,md}\"",
+    "format:check": "prettier --check \"**/*.{js,jsx,ts,tsx,json,md}\"",
+    "log": "(cd packages/cloudflare-workers/open-gsio && bun wrangler tail)"
   },
   "devDependencies": {
-    "@types/bun": "latest"
+    "@types/bun": "^1.2.17",
+    "@typescript-eslint/eslint-plugin": "^8.35.0",
+    "@typescript-eslint/parser": "^8.35.0",
+    "eslint": "^8",
+    "eslint-config-prettier": "^10.1.5",
+    "eslint-plugin-import": "^2.32.0",
+    "eslint-plugin-prettier": "^5.5.1",
+    "happy-dom": "^18.0.1",
+    "prettier": "^3.6.1"
   },
   "peerDependencies": {
-    "typescript": "^5"
+    "typescript": "^5.8.3"
   },
+  "dependencies": {
+    "@chakra-ui/icons": "^2.2.4"
+  },
   "packageManager": "pnpm@10.10.0+sha512.d615db246fe70f25dcfea6d8d73dee782ce23e2245e3c4f6f888249fb568149318637dca73c2c5c8ef2a4ca0d5657fb9567188bfab47f566d1ee6ce987815c39"
 }
```

#### Deleted file

```typescript
export * from "./supported-models.ts";
```

#### @open-gsio/ai package.json (updated)

```json
{
  "name": "@open-gsio/ai",
  "type": "module",
  "module": "src/index.ts",
  "exports": {
    ".": {
      "import": "./src/index.ts",
      "types": "./src/index.ts"
    },
    "./chat-sdk/chat-sdk.ts": {
      "import": "./src/chat-sdk/chat-sdk.ts",
      "types": "./src/chat-sdk/chat-sdk.ts"
    },
    "./providers/_ProviderRepository.ts": {
      "import": "./src/providers/_ProviderRepository.ts",
      "types": "./src/providers/_ProviderRepository.ts"
    },
    "./providers/google.ts": {
      "import": "./src/providers/google.ts",
      "types": "./src/providers/google.ts"
    },
    "./providers/openai.ts": {
      "import": "./src/providers/openai.ts",
      "types": "./src/providers/openai.ts"
    },
    "./src": {
      "import": "./src/index.ts",
      "types": "./src/index.ts"
    },
    "./utils": {
      "import": "./src/utils/index.ts",
      "types": "./src/utils/index.ts"
    }
  },
  "scripts": {
    "tests": "vitest run",
    "tests:coverage": "vitest run --coverage.enabled=true"
  },
  "devDependencies": {
    "@open-gsio/env": "workspace:*",
    "@open-gsio/schema": "workspace:*",
    "@anthropic-ai/sdk": "^0.55.0",
    "openai": "^5.0.1",
    "wrangler": "^4.18.0",
    "vitest": "^3.1.4",
    "vite": "^6.3.5"
  }
}
```

#### AssistantSdk tests (new file)

```typescript
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';

import { AssistantSdk } from '../assistant-sdk';
import { Utils } from '../utils/utils.ts';

// Mock dependencies
vi.mock('../utils/utils.ts', () => ({
  Utils: {
    selectEquitably: vi.fn(),
    getCurrentDate: vi.fn(),
  },
}));

vi.mock('../prompts/few_shots', () => ({
  default: {
    a: 'A1',
    question1: 'answer1',
    question2: 'answer2',
    question3: 'answer3',
  },
}));

describe('AssistantSdk', () => {
  beforeEach(() => {
    vi.useFakeTimers();
    vi.setSystemTime(new Date('2023-01-01T12:30:45Z'));
    // Reset mocks
    vi.resetAllMocks();
  });

  afterEach(() => {
    vi.useRealTimers();
  });

  describe('getAssistantPrompt', () => {
    it('should return a prompt with default values when minimal params are provided', () => {
      Utils.selectEquitably.mockReturnValue({
        question1: 'answer1',
        question2: 'answer2',
      });
      Utils.getCurrentDate.mockReturnValue('2023-01-01T12:30:45Z');

      const prompt = AssistantSdk.getAssistantPrompt({});

      expect(prompt).toContain('# Assistant Knowledge');
      expect(prompt).toContain('### Date: ');
      expect(prompt).toContain('### User Location: ');
      expect(prompt).toContain('### Timezone: ');
    });

    it('should include maxTokens when provided', () => {
      Utils.selectEquitably.mockReturnValue({
        question1: 'answer1',
        question2: 'answer2',
      });
      Utils.getCurrentDate.mockReturnValue('2023-01-01T12:30:45Z');

      const prompt = AssistantSdk.getAssistantPrompt({ maxTokens: 1000 });

      expect(prompt).toContain('Max Response Length: 1000 tokens (maximum)');
    });

    it('should use provided userTimezone and userLocation', () => {
      Utils.selectEquitably.mockReturnValue({
        question1: 'answer1',
        question2: 'answer2',
      });
      Utils.getCurrentDate.mockReturnValue('2023-01-01T12:30:45Z');

      const prompt = AssistantSdk.getAssistantPrompt({
        userTimezone: 'America/New_York',
        userLocation: 'New York, USA',
      });

      expect(prompt).toContain('### User Location: New York, USA');
      expect(prompt).toContain('### Timezone: America/New_York');
    });

    it('should use current date when Utils.getCurrentDate is not available', () => {
      Utils.selectEquitably.mockReturnValue({
        question1: 'answer1',
        question2: 'answer2',
      });
      // @ts-expect-error - is supposed to break
      Utils.getCurrentDate.mockReturnValue(undefined);

      const prompt = AssistantSdk.getAssistantPrompt({});

      // Instead of checking for a specific date, just verify that a date is included
      expect(prompt).toMatch(/### Date: \d{4}-\d{2}-\d{2} \d{1,2}:\d{2} \d{1,2}s/);
    });

    it('should use few_shots directly when Utils.selectEquitably is not available', () => {
      // @ts-expect-error - is supposed to break
      Utils.selectEquitably.mockReturnValue(undefined);
      Utils.getCurrentDate.mockReturnValue('2023-01-01T12:30:45Z');

      const prompt = AssistantSdk.getAssistantPrompt({});

      // The prompt should still contain examples
      expect(prompt).toContain('#### Example 1');
      // Instead of checking for specific content, just verify that examples are included
      expect(prompt).toMatch(/HUMAN: .+\nASSISTANT: .+/);
    });
  });

  describe('useFewshots', () => {
    it('should format fewshots correctly', () => {
      const fewshots = {
        'What is the capital of France?': 'Paris is the capital of France.',
        'How do I make pasta?': 'Boil water, add pasta, cook until al dente.',
      };

      const result = AssistantSdk.useFewshots(fewshots);

      expect(result).toContain('#### Example 1');
      expect(result).toContain('HUMAN: What is the capital of France?');
      expect(result).toContain('ASSISTANT: Paris is the capital of France.');
      expect(result).toContain('#### Example 2');
      expect(result).toContain('HUMAN: How do I make pasta?');
      expect(result).toContain('ASSISTANT: Boil water, add pasta, cook until al dente.');
    });

    it('should respect the limit parameter', () => {
      const fewshots = {
        Q1: 'A1',
        Q2: 'A2',
        Q3: 'A3',
        Q4: 'A4',
        Q5: 'A5',
        Q6: 'A6',
      };

      const result = AssistantSdk.useFewshots(fewshots, 3);

      expect(result).toContain('#### Example 1');
      expect(result).toContain('HUMAN: Q1');
      expect(result).toContain('ASSISTANT: A1');
      expect(result).toContain('#### Example 2');
      expect(result).toContain('HUMAN: Q2');
      expect(result).toContain('ASSISTANT: A2');
      expect(result).toContain('#### Example 3');
      expect(result).toContain('HUMAN: Q3');
      expect(result).toContain('ASSISTANT: A3');
      expect(result).not.toContain('#### Example 4');
      expect(result).not.toContain('HUMAN: Q4');
    });
  });
});
```
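The `useFewshots` assertions above pin down a formatting contract: numbered `#### Example N` headings, `HUMAN:`/`ASSISTANT:` pairs, and truncation at `limit`. The following is a hypothetical re-implementation inferred from those assertions, not the actual `AssistantSdk` source:

```typescript
// Sketch of the formatting the useFewshots tests assert (hypothetical
// re-implementation): numbered examples, HUMAN/ASSISTANT pairs, cut at `limit`.
function useFewshots(fewshots: Record<string, string>, limit = 5): string {
  return Object.entries(fewshots)
    .slice(0, limit)
    .map(([q, a], i) => `#### Example ${i + 1}\nHUMAN: ${q}\nASSISTANT: ${a}`)
    .join('\n');
}

const out = useFewshots({ Q1: 'A1', Q2: 'A2', Q3: 'A3', Q4: 'A4' }, 3);
console.log(out.includes('#### Example 3')); // true
console.log(out.includes('HUMAN: Q4')); // false
```

Any sketch with this shape would satisfy the `toContain`/`not.toContain` expectations in the limit test above.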

#### ChatSdk tests (updated)

```typescript
import { Schema } from '@open-gsio/schema';
import { describe, it, expect, vi, beforeEach } from 'vitest';

import { AssistantSdk } from '../assistant-sdk';
import { ChatSdk } from '../chat-sdk';
import { ProviderRepository } from '../providers/_ProviderRepository.ts';

// Mock dependencies
vi.mock('../assistant-sdk', () => ({
  AssistantSdk: {
    getAssistantPrompt: vi.fn(),
  },
}));

vi.mock('@open-gsio/schema', () => ({
  Schema: {
    Message: {
      create: vi.fn(message => message),
    },
  },
}));

vi.mock('../providers/_ProviderRepository', () => ({
  ProviderRepository: {
    getModelFamily: vi.fn().mockResolvedValue('openai'),
  },
}));

describe('ChatSdk', () => {
  // …

  describe('preprocess', () => {
    it('should return an assistant message with empty content', async () => {
      const messages = [{ role: 'user', content: 'Hello' }];
      const result = await ChatSdk.preprocess({ messages });

      expect(Schema.Message.create).toHaveBeenCalledWith({
        role: 'assistant',
        content: '',
      });
      expect(result).toEqual({
        role: 'assistant',
        content: '',
      });
    });
  });

  describe('handleChatRequest', () => {
    it('should return a 400 response if no messages are provided', async () => {
      const request = {
        json: vi.fn().mockResolvedValue({ messages: [] }),
      };
      const ctx = {
        openai: {},
        env: {
          SERVER_COORDINATOR: {
            idFromName: vi.fn(),
            get: vi.fn(),
          },
        },
      };

      const response = await ChatSdk.handleChatRequest(request as any, ctx as any);

      expect(response.status).toBe(400);
      expect(await response.text()).toBe('No messages provided');
    });

    it('should save stream data and return a response with streamUrl', async () => {
      const streamId = 'test-uuid';
      vi.stubGlobal('crypto', {
        randomUUID: vi.fn().mockReturnValue(streamId),
      });

      const messages = [{ role: 'user', content: 'Hello' }];
      const model = 'gpt-4';
      const conversationId = 'conv-123';
      const request = {
        json: vi.fn().mockResolvedValue({ messages, model, conversationId }),
      };
      const saveStreamData = vi.fn();
      const durableObject = {
        saveStreamData,
      };
      const ctx = {
        openai: {},
        systemPrompt: 'System prompt',
        env: {
          SERVER_COORDINATOR: {
            idFromName: vi.fn().mockReturnValue('object-id'),
            get: vi.fn().mockReturnValue(durableObject),
          },
        },
      };

      const response = await ChatSdk.handleChatRequest(request as any, ctx as any);
      const responseBody = await response.json();

      expect(ctx.env.SERVER_COORDINATOR.idFromName).toHaveBeenCalledWith('stream-index');
      expect(ctx.env.SERVER_COORDINATOR.get).toHaveBeenCalledWith('object-id');
      expect(saveStreamData).toHaveBeenCalledWith(streamId, expect.stringContaining(model));
      expect(responseBody).toEqual({
        streamUrl: `/api/streams/${streamId}`,
      });
    });
  });

  // … (calculateMaxTokens test, surrounding describe/it lines elided in the diff)
      const messages = [{ role: 'user', content: 'Hello' }];
      const dynamicMaxTokens = vi.fn().mockResolvedValue(500);
      const durableObject = {
        dynamicMaxTokens,
      };
      const ctx = {
        maxTokens: 1000,
        env: {
          SERVER_COORDINATOR: {
            idFromName: vi.fn().mockReturnValue('object-id'),
            get: vi.fn().mockReturnValue(durableObject),
          },
        },
      };

      await ChatSdk.calculateMaxTokens(messages, ctx as any);

      expect(ctx.env.SERVER_COORDINATOR.idFromName).toHaveBeenCalledWith('dynamic-token-counter');
      expect(ctx.env.SERVER_COORDINATOR.get).toHaveBeenCalledWith('object-id');
      expect(dynamicMaxTokens).toHaveBeenCalledWith(messages, 1000);
  // …

  describe('buildAssistantPrompt', () => {
    it('should call AssistantSdk.getAssistantPrompt with the correct parameters', () => {
      vi.mocked(AssistantSdk.getAssistantPrompt).mockReturnValue('Assistant prompt');

      const result = ChatSdk.buildAssistantPrompt({ maxTokens: 1000 });

      expect(AssistantSdk.getAssistantPrompt).toHaveBeenCalledWith({
        maxTokens: 1000,
        userTimezone: 'UTC',
        userLocation: 'USA/unknown',
      });
      expect(result).toBe('Assistant prompt');
    });
  });

  describe('buildMessageChain', () => {
    // TODO: Fix this test
    it('should build a message chain with system role for most models', async () => {
      ProviderRepository.getModelFamily.mockResolvedValue('openai');

      const messages = [{ role: 'user', content: 'Hello' }];
      const opts = {
        systemPrompt: 'System prompt',
        assistantPrompt: 'Assistant prompt',
        toolResults: { role: 'tool', content: 'Tool result' },
        model: 'gpt-4',
        env: {},
      };

      const result = await ChatSdk.buildMessageChain(messages, opts as any);

      expect(ProviderRepository.getModelFamily).toHaveBeenCalledWith('gpt-4', {});
      expect(Schema.Message.create).toHaveBeenCalledTimes(3);
      expect(Schema.Message.create).toHaveBeenNthCalledWith(1, {
        role: 'system',
        content: 'System prompt',
      });
      expect(Schema.Message.create).toHaveBeenNthCalledWith(2, {
        role: 'assistant',
        content: 'Assistant prompt',
      });
      expect(Schema.Message.create).toHaveBeenNthCalledWith(3, {
        role: 'user',
        content: 'Hello',
      });
    });

    it('should build a message chain with assistant role for o1, gemma, claude, or google models', async () => {
      ProviderRepository.getModelFamily.mockResolvedValue('claude');

      const messages = [{ role: 'user', content: 'Hello' }];
      const opts = {
        systemPrompt: 'System prompt',
        assistantPrompt: 'Assistant prompt',
        toolResults: { role: 'tool', content: 'Tool result' },
        model: 'claude-3',
        env: {},
      };

      const result = await ChatSdk.buildMessageChain(messages, opts as any);

      expect(ProviderRepository.getModelFamily).toHaveBeenCalledWith('claude-3', {});
      expect(Schema.Message.create).toHaveBeenCalledTimes(3);
      expect(Schema.Message.create).toHaveBeenNthCalledWith(1, {
        role: 'assistant',
        content: 'System prompt',
      });
    });

    it('should filter out messages with empty content', async () => {
      ProviderRepository.getModelFamily.mockResolvedValue('openai');

      const messages = [
        { role: 'user', content: 'Hello' },
        { role: 'user', content: '' },
        { role: 'user', content: '   ' },
        { role: 'user', content: 'World' },
      ];
      const opts = {
        systemPrompt: 'System prompt',
        assistantPrompt: 'Assistant prompt',
        toolResults: { role: 'tool', content: 'Tool result' },
        model: 'gpt-4',
        env: {},
      };

      const result = await ChatSdk.buildMessageChain(messages, opts as any);

      // 2 system/assistant messages + 2 user messages (Hello and World)
      expect(Schema.Message.create).toHaveBeenCalledTimes(4);
    });
  });
});
```
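The last test above depends on whitespace-only messages being dropped before the chain is built. A standalone sketch of that filter, assuming `buildMessageChain` trims on `content` (the real implementation may differ):

```typescript
// Hypothetical version of the empty-content filter the test exercises:
// keep only messages whose content is non-empty after trimming.
type ChatMessage = { role: string; content: string };

function dropEmptyMessages(messages: ChatMessage[]): ChatMessage[] {
  return messages.filter(m => m.content.trim().length > 0);
}

const kept = dropEmptyMessages([
  { role: 'user', content: 'Hello' },
  { role: 'user', content: '' },
  { role: 'user', content: '   ' },
  { role: 'user', content: 'World' },
]);
console.log(kept.length); // 2
```

With the two prompt messages prepended, that leaves the four `Schema.Message.create` calls the test asserts.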

#### Utils.getSeason debug test (updated imports)

```typescript
import { describe, it, expect } from 'vitest';

import { Utils } from '../utils/utils.ts';

describe('Debug Utils.getSeason', () => {
  it('should print out the actual seasons for different dates', () => {
    // …
```

#### handleStreamData tests (updated)

```typescript
import { describe, it, expect, vi, beforeEach } from 'vitest';

import handleStreamData from '../utils/handleStreamData.ts';

describe('handleStreamData', () => {
  // Setup mocks
  const mockController = {
    enqueue: vi.fn(),
  };
  const mockEncoder = {
    encode: vi.fn(str => str),
  };

  beforeEach(() => {
    // …
  });

  it('should return early if data type is not "chat"', () => {
    const handler = handleStreamData(mockController as any, mockEncoder as any);

    handler({ type: 'not-chat', data: {} });

    expect(mockController.enqueue).not.toHaveBeenCalled();
    expect(mockEncoder.encode).not.toHaveBeenCalled();
  });

  it('should return early if data is undefined', () => {
    const handler = handleStreamData(mockController as any, mockEncoder as any);

    handler(undefined as any);

    expect(mockController.enqueue).not.toHaveBeenCalled();
    expect(mockEncoder.encode).not.toHaveBeenCalled();
  });

  it('should handle content_block_start type data', () => {
    const handler = handleStreamData(mockController as any, mockEncoder as any);
    const data = {
      type: 'chat',
      data: {
        type: 'content_block_start',
        content_block: {
          type: 'text',
          text: 'Hello world',
        },
      },
    };

    handler(data);

    expect(mockController.enqueue).toHaveBeenCalledTimes(1);
    expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('Hello world'));

    // @ts-expect-error - mock
    const encodedData = mockEncoder.encode.mock.calls[0][0];
    const parsedData = JSON.parse(encodedData.split('data: ')[1]);
    expect(parsedData.type).toBe('chat');
    expect(parsedData.data.choices[0].delta.content).toBe('Hello world');
  });

  it('should handle delta.text type data', () => {
    const handler = handleStreamData(mockController as any, mockEncoder as any);
    const data = {
      type: 'chat',
      data: {
        delta: {
          text: 'Hello world',
        },
      },
    };

    handler(data);

    expect(mockController.enqueue).toHaveBeenCalledTimes(1);
    expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('Hello world'));

    // @ts-expect-error - mock
    const encodedData = mockEncoder.encode.mock.calls[0][0];
    const parsedData = JSON.parse(encodedData.split('data: ')[1]);
    expect(parsedData.type).toBe('chat');
    expect(parsedData.data.choices[0].delta.content).toBe('Hello world');
  });

  it('should handle choices[0].delta.content type data', () => {
    const handler = handleStreamData(mockController as any, mockEncoder as any);
    const data = {
      type: 'chat',
      data: {
        choices: [
          {
            index: 0,
            delta: {
              content: 'Hello world',
            },
            logprobs: null,
            finish_reason: null,
          },
        ],
      },
    };

    handler(data);

    expect(mockController.enqueue).toHaveBeenCalledTimes(1);
    expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('Hello world'));

    // @ts-expect-error - mock
    const encodedData = mockEncoder.encode.mock.calls[0][0];
    const parsedData = JSON.parse(encodedData.split('data: ')[1]);
    expect(parsedData.type).toBe('chat');
    expect(parsedData.data.choices[0].delta.content).toBe('Hello world');
    expect(parsedData.data.choices[0].finish_reason).toBe(null);
  });

  it('should pass through data with choices but no delta.content', () => {
    const handler = handleStreamData(mockController as any, mockEncoder as any);
    const data = {
      type: 'chat',
      data: {
        choices: [
          {
            index: 0,
            delta: {},
            logprobs: null,
            finish_reason: 'stop',
          },
        ],
      },
    };

    handler(data as any);

    expect(mockController.enqueue).toHaveBeenCalledTimes(1);
    expect(mockEncoder.encode).toHaveBeenCalledWith(
      expect.stringContaining('"finish_reason":"stop"'),
    );
  });

  it('should return early for unrecognized data format', () => {
    const handler = handleStreamData(mockController as any, mockEncoder as any);
    const data = {
      type: 'chat',
      data: {
        // No recognized properties
        unrecognized: 'property',
      },
    };

    handler(data as any);

    expect(mockController.enqueue).not.toHaveBeenCalled();
    expect(mockEncoder.encode).not.toHaveBeenCalled();
  });

  it('should use custom transform function if provided', () => {
    const handler = handleStreamData(mockController as any, mockEncoder as any);
    const data = {
      type: 'chat',
      data: {
        original: 'data',
      },
    };
    const transformFn = vi.fn().mockReturnValue({
      type: 'chat',
      data: {
        choices: [
          {
            delta: {
              content: 'Transformed content',
            },
            logprobs: null,
            finish_reason: null,
          },
        ],
      },
    });

    handler(data as any, transformFn);

    expect(transformFn).toHaveBeenCalledWith(data);
    expect(mockController.enqueue).toHaveBeenCalledTimes(1);
    expect(mockEncoder.encode).toHaveBeenCalledWith(expect.stringContaining('Transformed content'));
  });
});
```


@@ -1,5 +1,6 @@
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';

import { Utils } from '../utils/utils.ts';

describe('Utils', () => {
  describe('getSeason', () => {
@@ -42,10 +43,11 @@ describe('Utils', () => {
    beforeEach(() => {
      // Mock Intl.DateTimeFormat
      // @ts-expect-error - mock
      global.Intl.DateTimeFormat = vi.fn().mockReturnValue({
        resolvedOptions: vi.fn().mockReturnValue({
          timeZone: 'America/New_York',
        }),
      });
    });
@@ -102,10 +104,10 @@ describe('Utils', () => {
    it('should select items equitably from multiple sources', () => {
      const sources = {
        a: { key1: 'value1', key2: 'value2' },
        b: { key3: 'value3', key4: 'value4' },
        c: { key5: 'value5', key6: 'value6' },
        d: { key7: 'value7', key8: 'value8' },
      };
      const result = Utils.selectEquitably(sources, 4);
@@ -117,10 +119,10 @@ describe('Utils', () => {
    it('should handle itemCount greater than available items', () => {
      const sources = {
        a: { key1: 'value1' },
        b: { key2: 'value2' },
        c: {},
        d: {},
      };
      const result = Utils.selectEquitably(sources, 5);
@@ -135,7 +137,7 @@ describe('Utils', () => {
        a: {},
        b: {},
        c: {},
        d: {},
      };
      const result = Utils.selectEquitably(sources, 5);
@@ -148,10 +150,10 @@ describe('Utils', () => {
    it('should insert blank messages to maintain user/assistant alternation', () => {
      const messages = [
        { role: 'user', content: 'Hello' },
        { role: 'user', content: 'How are you?' },
      ];
      const result = Utils.normalizeWithBlanks(messages as any[]);
      expect(result.length).toBe(3);
      expect(result[0]).toEqual({ role: 'user', content: 'Hello' });
@@ -160,11 +162,9 @@ describe('Utils', () => {
    });

    it('should insert blank user message if first message is assistant', () => {
      const messages = [{ role: 'assistant', content: 'Hello, how can I help?' }];
      const result = Utils.normalizeWithBlanks(messages as any[]);
      expect(result.length).toBe(2);
      expect(result[0]).toEqual({ role: 'user', content: '' });
@@ -183,10 +183,10 @@ describe('Utils', () => {
      const messages = [
        { role: 'user', content: 'Hello' },
        { role: 'assistant', content: 'Hi there' },
        { role: 'user', content: 'How are you?' },
      ];
      const result = Utils.normalizeWithBlanks(messages as any[]);
      expect(result.length).toBe(3);
      expect(result).toEqual(messages);


@@ -0,0 +1,57 @@
import Prompts from '../prompts';
import { Common } from '../utils';
export class AssistantSdk {
static getAssistantPrompt(params: {
maxTokens?: number;
userTimezone?: string;
userLocation?: string;
}): string {
const { maxTokens, userTimezone = 'UTC', userLocation = '' } = params;
// console.log('[DEBUG_LOG] few_shots:', JSON.stringify(few_shots));
let selectedFewshots = Common.Utils.selectEquitably?.(Prompts.FewShots);
// console.log('[DEBUG_LOG] selectedFewshots after Utils.selectEquitably:', JSON.stringify(selectedFewshots));
if (!selectedFewshots) {
selectedFewshots = Prompts.FewShots;
// console.log('[DEBUG_LOG] selectedFewshots after fallback:', JSON.stringify(selectedFewshots));
}
const sdkDate = new Date().toISOString();
const [currentDate] = sdkDate.includes('T') ? sdkDate.split('T') : [sdkDate];
const now = new Date();
const formattedMinutes = String(now.getMinutes()).padStart(2, '0');
const currentTime = `${now.getHours()}:${formattedMinutes} ${now.getSeconds()}s`;
return `# Assistant Knowledge
## Assistant Name
### open-gsio
## Current Context
### Date: ${currentDate} ${currentTime}
${maxTokens ? `### Max Response Length: ${maxTokens} tokens (maximum)` : ''}
### Lexicographical Format: Markdown
### User Location: ${userLocation || 'Unknown'}
### Timezone: ${userTimezone}
## Response Framework
1. Use knowledge provided in the current context as the primary source of truth.
2. Format all responses in Markdown.
3. Attribute external sources with footnotes.
4. Do not bold headers.
## Examples
#### Example 0
HUMAN: What is this?
ASSISTANT: This is a conversational AI system.
---
${AssistantSdk.useFewshots(selectedFewshots, 5)}
---
## Directive
Continuously monitor the evolving conversation. Dynamically adapt each response.`;
}
static useFewshots(fewshots: Record<string, string>, limit = 5): string {
return Object.entries(fewshots)
.slice(0, limit)
.map(([q, a], i) => {
return `#### Example ${i + 1}\nHUMAN: ${q}\nASSISTANT: ${a}`;
})
.join('\n---\n');
}
}
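For reference, the `useFewshots` helper above can be exercised on its own. The sketch below copies its logic into a standalone function (the sample Q/A pairs are hypothetical, not the real `Prompts.FewShots` content):

```typescript
// Standalone copy of AssistantSdk.useFewshots for illustration:
// renders up to `limit` Q/A pairs as numbered Markdown examples,
// separated by horizontal rules.
function useFewshots(fewshots: Record<string, string>, limit = 5): string {
  return Object.entries(fewshots)
    .slice(0, limit)
    .map(([q, a], i) => `#### Example ${i + 1}\nHUMAN: ${q}\nASSISTANT: ${a}`)
    .join('\n---\n');
}

// Hypothetical few-shot pairs for demonstration only.
const rendered = useFewshots({
  'What is this?': 'This is a conversational AI system.',
  'Can you browse the web?': "I don't have web access.",
});
console.log(rendered);
```

Note the numbering starts at `Example 1`, which is why the hardcoded prompt above reserves `Example 0` for itself.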


@@ -0,0 +1,3 @@
import { AssistantSdk } from './assistant-sdk.ts';
export { AssistantSdk };


@@ -0,0 +1,138 @@
import { Schema } from '@open-gsio/schema';
import type { Instance } from 'mobx-state-tree';
import { OpenAI } from 'openai';
import type Message from '../../../schema/src/models/Message.ts';
import { AssistantSdk } from '../assistant-sdk';
import { ProviderRepository } from '../providers/_ProviderRepository.ts';
import type {
BuildAssistantPromptParams,
ChatRequestBody,
GenericEnv,
PreprocessParams,
} from '../types';
export class ChatSdk {
static async preprocess(params: PreprocessParams) {
    // a slot to provide additional context
return Schema.Message.create({
role: 'assistant',
content: '',
});
}
static async handleChatRequest(
request: Request,
ctx: {
openai: OpenAI;
systemPrompt: any;
maxTokens: any;
env: GenericEnv;
},
) {
const streamId = crypto.randomUUID();
const { messages, model, conversationId } = (await request.json()) as ChatRequestBody;
if (!messages?.length) {
return new Response('No messages provided', { status: 400 });
}
const preprocessedContext = await ChatSdk.preprocess({
messages,
});
// console.log(ctx.env)
// console.log(ctx.env.SERVER_COORDINATOR);
const objectId = ctx.env.SERVER_COORDINATOR.idFromName('stream-index');
const durableObject = ctx.env.SERVER_COORDINATOR.get(objectId);
await durableObject.saveStreamData(
streamId,
JSON.stringify({
messages,
model,
conversationId,
timestamp: Date.now(),
systemPrompt: ctx.systemPrompt,
preprocessedContext,
}),
);
return new Response(
JSON.stringify({
streamUrl: `/api/streams/${streamId}`,
}),
{
headers: {
'Content-Type': 'application/json',
},
},
);
}
static async calculateMaxTokens(
messages: any[],
ctx: Record<string, any> & {
env: GenericEnv;
maxTokens: number;
},
) {
const objectId = ctx.env.SERVER_COORDINATOR.idFromName('dynamic-token-counter');
const durableObject = ctx.env.SERVER_COORDINATOR.get(objectId);
return durableObject.dynamicMaxTokens(messages, ctx.maxTokens);
}
static buildAssistantPrompt(params: BuildAssistantPromptParams) {
const { maxTokens } = params;
return AssistantSdk.getAssistantPrompt({
maxTokens,
userTimezone: 'UTC',
userLocation: 'USA/unknown',
});
}
static async buildMessageChain(
messages: any[],
opts: {
systemPrompt: any;
assistantPrompt: string;
toolResults: Instance<typeof Message>;
model: any;
env: GenericEnv;
},
) {
const modelFamily = await ProviderRepository.getModelFamily(opts.model, opts.env);
const messagesToSend = [];
messagesToSend.push(
Schema.Message.create({
role:
opts.model.includes('o1') ||
opts.model.includes('gemma') ||
modelFamily === 'claude' ||
modelFamily === 'google'
? 'assistant'
: 'system',
content: opts.systemPrompt.trim(),
}),
);
messagesToSend.push(
Schema.Message.create({
role: 'assistant',
content: opts.assistantPrompt.trim(),
}),
);
messagesToSend.push(
...messages
.filter((message: any) => message.content?.trim())
.map((message: any) => Schema.Message.create(message)),
);
return messagesToSend;
}
}
export default ChatSdk;
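The role selection inside `buildMessageChain` is the subtle part: some model families reject a leading `system` message, so the system prompt is downgraded to an `assistant` message for them. A minimal sketch of just that rule, extracted from the code above:

```typescript
// Sketch of buildMessageChain's role-selection rule: o1/gemma models and
// the claude/google families get the system prompt as an 'assistant'
// message; everything else keeps the standard 'system' role.
function systemRoleFor(model: string, modelFamily: string): 'assistant' | 'system' {
  return model.includes('o1') ||
    model.includes('gemma') ||
    modelFamily === 'claude' ||
    modelFamily === 'google'
    ? 'assistant'
    : 'system';
}

// Model names here are illustrative, not a supported-model list.
console.log(systemRoleFor('gpt-4o', 'openai')); // 'system'
console.log(systemRoleFor('claude-3-5-sonnet', 'claude')); // 'assistant'
```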


@@ -0,0 +1 @@
export * from './chat-sdk.ts';

packages/ai/src/index.ts

@@ -0,0 +1,2 @@
export * from './providers';
export * from './chat-sdk';


@@ -1,50 +1,49 @@
/* eslint-disable no-useless-escape */
const a = {
  'Can you search for the latest quantum computing breakthroughs?': "I don't have web access.",
};
const b = {
  'Calculate the number of unique ways to arrange the letters in the word SUCCESS.': `Alright, let's tackle 'SUCCESS' and see just how unique it can get!
### Step 1: Count the Letters
The word 'SUCCESS' has 7 letters, but it's no free-for-all; there are repeats!
- S: 3 times
- U: 1 time
- C: 2 times
- E: 1 time
### Step 2: The Formula for Permutations with Repetition
When letters repeat, we divide to avoid overcounting. The magic formula is:
$$
\frac{n!}{k_1! \cdot k_2! \cdot \ldots \cdot k_m!}
$$
Here, (n) is the total number of letters, and (k_i) are the frequencies of each unique letter.
### Step 3: Plug in the Values
Let's compute:
$$
\frac{7!}{3! \cdot 1! \cdot 2! \cdot 1!} = 420
$$
### Step 4: Reveal the Result
There are 420 unique arrangements of 'SUCCESS'. Mathematically satisfying, isn't it? It's as if every version of success gets its own unique permutation!`,

  'Calculate the number of unique ways to arrange the letters in the word MISSISSIPPI.': `Ready for some permutation wizardry? Let's decode 'MISSISSIPPI'!
### Step 1: Count the Letters
'MISSISSIPPI' has 11 letters. But like a social network with cliques, some letters appear in multiples:
- M: 1
- I: 4
- S: 4
- P: 2
### Step 2: Apply the Permutation Formula for Repetition
The formula for such a case is:
$$
\frac{n!}{k_1! \cdot k_2! \cdot \ldots \cdot k_m!}
$$
### Step 3: Plug in the Values
Here's where we let the math do its magic:
$$
\frac{11!}{1! \cdot 4! \cdot 4! \cdot 2!}
$$
### Step 4: Simplify Step-by-Step
Let's break it down:
- (11! = 39,916,800)
- (4! = 24)
@@ -53,19 +52,17 @@ Let's break it down:
$$
\frac{39,916,800}{1 \cdot 24 \cdot 24 \cdot 2} = 34,650
$$
### Step 5: The Grand Finale
There are 34,650 unique arrangements of 'MISSISSIPPI'.`,
};
const c = {
  'Render a Greek statue with warm marble tones and realistic proportions.':
    "I don't have the ability to generate images right now. ",
};
const d = {
  'Demonstrate markdown formatted text.': `
## Headers
~~~markdown
# Large Header
@@ -73,80 +70,48 @@ const d = {
### Small Header
~~~
## Ordered Lists
1. First Item
2. Second Item
   1. Subitem 1
   2. Subitem 2
3. Third Item
## Unordered Lists
- First Item
- Second Item
  - Subitem 1
  - Subitem 2
## Links
[Visit OpenAI](https://openai.com/)
## Images
![Example Image](example.jpg)
## Inline Code
\`console.log('Hello, Markdown!')\`
## Code Blocks
~~~javascript
console.log(marked.parse('A Description List:\\n'
  + ': Topic 1 : Description 1\\n'
  + ': Topic 2 : Description 2'));
~~~
## Tables
| Name   | Value |
|--------|-------|
| Item A | 10    |
| Item B | 20    |
## Blockquotes
> Markdown makes writing beautiful.
> - Markdown Fan
## Horizontal Rule
---
## Font: Bold and Italic
**Bold Text**
*Italic Text*
## Font: Strikethrough
~~Struck-through text~~
## Math
~~~markdown
$$
c = \\\\pm\\\\sqrt{a^2 + b^2}
$$`,
};
export default { a, b, c, d };


@@ -0,0 +1,5 @@
import few_shots from './few_shots.ts';
export default {
FewShots: few_shots,
};


@@ -0,0 +1,96 @@
import type { GenericEnv, ModelMeta, Providers, SupportedProvider } from '../types';
export class ProviderRepository {
#providers: Providers = [];
#env: GenericEnv;
constructor(env: GenericEnv) {
this.#env = env;
this.setProviders(env);
}
static OPENAI_COMPAT_ENDPOINTS = {
xai: 'https://api.x.ai/v1',
groq: 'https://api.groq.com/openai/v1',
google: 'https://generativelanguage.googleapis.com/v1beta/openai',
fireworks: 'https://api.fireworks.ai/inference/v1',
cohere: 'https://api.cohere.ai/compatibility/v1',
cloudflare: 'https://api.cloudflare.com/client/v4/accounts/{CLOUDFLARE_ACCOUNT_ID}/ai/v1',
claude: 'https://api.anthropic.com/v1',
openai: 'https://api.openai.com/v1',
cerebras: 'https://api.cerebras.com/v1',
ollama: 'http://localhost:11434/v1',
mlx: 'http://localhost:10240/v1',
};
static async getModelFamily(model: any, env: GenericEnv) {
const allModels = await env.KV_STORAGE.get('supportedModels');
const models = JSON.parse(allModels);
const modelData = models.filter((m: ModelMeta) => m.id === model);
return modelData[0].provider;
}
static async getModelMeta(meta: any, env: GenericEnv) {
const allModels = await env.KV_STORAGE.get('supportedModels');
const models = JSON.parse(allModels);
return models.filter((m: ModelMeta) => m.id === meta.model).pop();
}
getProviders(): { name: string; key: string; endpoint: string }[] {
return this.#providers;
}
setProviders(env: GenericEnv) {
const indicies = {
providerName: 0,
providerValue: 1,
};
const valueDelimiter = '_';
const envKeys = Object.keys(env);
for (let i = 0; i < envKeys.length; i++) {
if (envKeys.at(i)?.endsWith('KEY')) {
const detectedProvider = envKeys
.at(i)
?.split(valueDelimiter)
.at(indicies.providerName)
?.toLowerCase();
const detectedProviderValue = env[envKeys.at(i) as string];
if (detectedProviderValue) {
switch (detectedProvider) {
case 'anthropic':
this.#providers.push({
name: 'claude',
key: env.ANTHROPIC_API_KEY,
endpoint: ProviderRepository.OPENAI_COMPAT_ENDPOINTS['claude'],
});
break;
case 'gemini':
this.#providers.push({
name: 'google',
key: env.GEMINI_API_KEY,
endpoint: ProviderRepository.OPENAI_COMPAT_ENDPOINTS['google'],
});
break;
case 'cloudflare':
this.#providers.push({
name: 'cloudflare',
key: env.CLOUDFLARE_API_KEY,
endpoint: ProviderRepository.OPENAI_COMPAT_ENDPOINTS[detectedProvider].replace(
'{CLOUDFLARE_ACCOUNT_ID}',
env.CLOUDFLARE_ACCOUNT_ID,
),
});
break;
default:
this.#providers.push({
name: detectedProvider as SupportedProvider,
key: env[envKeys[i] as string],
endpoint:
ProviderRepository.OPENAI_COMPAT_ENDPOINTS[detectedProvider as SupportedProvider],
});
}
}
}
}
}
}
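The convention `setProviders` relies on is worth spelling out: any environment variable ending in `KEY` is treated as `<PROVIDER>_..._KEY`, and the first underscore-delimited segment names the provider. A minimal sketch of just that detection step:

```typescript
// Sketch of the env-key convention used by ProviderRepository.setProviders:
// keys like GROQ_API_KEY yield provider 'groq'; anything not ending in
// 'KEY' is ignored.
function detectProvider(envKey: string): string | undefined {
  if (!envKey.endsWith('KEY')) return undefined;
  return envKey.split('_')[0]?.toLowerCase();
}

console.log(detectProvider('GROQ_API_KEY')); // 'groq'
console.log(detectProvider('ANTHROPIC_API_KEY')); // 'anthropic'
console.log(detectProvider('CLOUDFLARE_ACCOUNT_ID')); // undefined
```

Note the switch statement above then remaps 'anthropic' to 'claude' and 'gemini' to 'google' so the detected name lines up with the `OPENAI_COMPAT_ENDPOINTS` keys.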


@@ -1,6 +1,11 @@
import { OpenAI } from 'openai';
import { describe, it, expect, vi } from 'vitest';

import {
  BaseChatProvider,
  CommonProviderParams,
  ChatStreamProvider,
} from '../chat-stream-provider.ts';

// Create a concrete implementation of BaseChatProvider for testing
class TestChatProvider extends BaseChatProvider {
@@ -29,16 +34,16 @@ vi.mock('../../lib/chat-sdk', () => ({
    buildAssistantPrompt: vi.fn().mockReturnValue('Assistant prompt'),
    buildMessageChain: vi.fn().mockReturnValue([
      { role: 'system', content: 'System prompt' },
      { role: 'user', content: 'User message' },
    ]),
  },
}));

describe('ChatStreamProvider', () => {
  it('should define the required interface', () => {
    // Verify the interface has the required method
    const mockProvider: ChatStreamProvider = {
      handleStream: vi.fn(),
    };
    expect(mockProvider.handleStream).toBeDefined();


@@ -1,17 +1,19 @@
import { OpenAI } from 'openai';

import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';

export class CerebrasChatProvider extends BaseChatProvider {
  getOpenAIClient(param: CommonProviderParams): OpenAI {
    return new OpenAI({
      baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.cerebras,
      apiKey: param.env.CEREBRAS_API_KEY,
    });
  }

  getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
    // models provided by cerebras do not follow standard tune params
    // they must be individually configured
    // const tuningParams = {
    //   temperature: 0.86,
    //   top_p: 0.98,
@@ -23,18 +25,18 @@ export class CerebrasChatProvider extends BaseChatProvider {
    return {
      model: param.model,
      messages: safeMessages,
      stream: true,
      // ...tuningParams
    };
  }

  async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
    if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
      dataCallback({ type: 'chat', data: chunk });
      return true;
    }
    dataCallback({ type: 'chat', data: chunk });
    return false;
  }
}
@@ -46,14 +48,13 @@ export class CerebrasSdk {
    param: {
      openai: OpenAI;
      systemPrompt: any;
      preprocessedContext: any;
      maxTokens: unknown | number | undefined;
      messages: any;
      model: string;
      env: GenericEnv;
    },
    dataCallback: (data: any) => void,
  ) {
    return this.provider.handleStream(
      {


@@ -0,0 +1,281 @@
import { OpenAI } from 'openai';
import ChatSdk from '../chat-sdk/chat-sdk.ts';
import { getWeather, WeatherTool } from '../tools/weather.ts';
import { yachtpitAi, YachtpitTools } from '../tools/yachtpit.ts';
import type { GenericEnv } from '../types';
export interface CommonProviderParams {
openai?: OpenAI; // Optional for providers that use a custom client.
systemPrompt: any;
preprocessedContext: any;
maxTokens: number | unknown | undefined;
messages: any;
model: string;
env: GenericEnv;
disableWebhookGeneration?: boolean;
// Additional fields can be added as needed
}
export interface ChatStreamProvider {
handleStream(param: CommonProviderParams, dataCallback: (data: any) => void): Promise<any>;
}
export abstract class BaseChatProvider implements ChatStreamProvider {
abstract getOpenAIClient(param: CommonProviderParams): OpenAI;
abstract getStreamParams(param: CommonProviderParams, safeMessages: any[]): any;
abstract processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean>;
async handleStream(param: CommonProviderParams, dataCallback: (data: any) => void) {
const assistantPrompt = ChatSdk.buildAssistantPrompt({ maxTokens: param.maxTokens });
const safeMessages = await ChatSdk.buildMessageChain(param.messages, {
systemPrompt: param.systemPrompt,
model: param.model,
assistantPrompt,
toolResults: param.preprocessedContext,
env: param.env,
});
const client = this.getOpenAIClient(param);
const tools = [WeatherTool, YachtpitTools];
const callFunction = async (name, args) => {
if (name === 'get_weather') {
return getWeather(args.latitude, args.longitude);
}
if (name === 'ship_control') {
return yachtpitAi({ action: args.action, value: args.value });
}
};
// Main conversation loop - handle tool calls properly
let conversationComplete = false;
let toolCallIterations = 0;
const maxToolCallIterations = 5; // Prevent infinite loops
let toolsExecuted = false; // Track if we've executed tools
while (!conversationComplete && toolCallIterations < maxToolCallIterations) {
const streamParams = this.getStreamParams(param, safeMessages);
// Only provide tools on the first call, after that force text response
const currentTools = toolsExecuted ? undefined : tools;
const stream = await client.chat.completions.create({ ...streamParams, tools: currentTools });
let assistantMessage = '';
const toolCalls: any[] = [];
for await (const chunk of stream as unknown as AsyncIterable<any>) {
// console.log('chunk', chunk);
// Handle tool calls
if (chunk.choices[0]?.delta?.tool_calls) {
const deltaToolCalls = chunk.choices[0].delta.tool_calls;
for (const deltaToolCall of deltaToolCalls) {
if (deltaToolCall.index !== undefined) {
// Initialize or get existing tool call
if (!toolCalls[deltaToolCall.index]) {
toolCalls[deltaToolCall.index] = {
id: deltaToolCall.id || '',
type: deltaToolCall.type || 'function',
function: {
name: deltaToolCall.function?.name || '',
arguments: deltaToolCall.function?.arguments || '',
},
};
} else {
// Append to existing tool call
if (deltaToolCall.function?.arguments) {
toolCalls[deltaToolCall.index].function.arguments +=
deltaToolCall.function.arguments;
}
if (deltaToolCall.function?.name) {
toolCalls[deltaToolCall.index].function.name += deltaToolCall.function.name;
}
if (deltaToolCall.id) {
toolCalls[deltaToolCall.index].id += deltaToolCall.id;
}
}
}
}
}
// Handle regular content
if (chunk.choices[0]?.delta?.content) {
assistantMessage += chunk.choices[0].delta.content;
}
// Check if stream is finished
if (chunk.choices[0]?.finish_reason) {
if (chunk.choices[0].finish_reason === 'tool_calls' && toolCalls.length > 0) {
// Increment tool call iterations counter
toolCallIterations++;
console.log(`Tool call iteration ${toolCallIterations}/${maxToolCallIterations}`);
// Execute tool calls and add results to conversation
console.log('Executing tool calls:', toolCalls);
// Send feedback to user about tool invocation
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: {
content: `\n\n🔧 Invoking ${toolCalls.length} tool${toolCalls.length > 1 ? 's' : ''}...\n`,
},
},
],
},
});
// Add assistant message with tool calls to conversation
safeMessages.push({
role: 'assistant',
content: assistantMessage || null,
tool_calls: toolCalls,
});
// Execute each tool call and add results
for (const toolCall of toolCalls) {
if (toolCall.type === 'function') {
const name = toolCall.function.name;
console.log(`Calling function: ${name}`);
// Send feedback about specific tool being called
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: {
content: `📞 Calling ${name}...`,
},
},
],
},
});
try {
const args = JSON.parse(toolCall.function.arguments);
console.log(`Function arguments:`, args);
const result = await callFunction(name, args);
console.log(`Function result:`, result);
// Send feedback about tool completion
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: {
content: `\n`,
},
},
],
},
});
// Add tool result to conversation
safeMessages.push({
role: 'tool',
tool_call_id: toolCall.id,
content: result?.toString() || '',
});
} catch (error) {
console.error(`Error executing tool ${name}:`, error);
// Send feedback about tool error
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: {
content: ` ❌ Error\n`,
},
},
],
},
});
safeMessages.push({
role: 'tool',
tool_call_id: toolCall.id,
content: `Error: ${error.message}`,
});
}
}
}
// Mark that tools have been executed to prevent repeated calls
toolsExecuted = true;
// Send feedback that tool execution is complete
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: {
content: `\n🎯 Tool execution complete. Generating response...\n\n`,
},
},
],
},
});
// Continue conversation with tool results
break;
} else {
// Regular completion - send final response
conversationComplete = true;
}
}
// Process chunk normally for non-tool-call responses
if (!chunk.choices[0]?.delta?.tool_calls) {
console.log('after-tool-call-chunk', chunk);
const shouldBreak = await this.processChunk(chunk, dataCallback);
if (shouldBreak) {
conversationComplete = true;
break;
}
}
}
}
// Handle case where we hit maximum tool call iterations
if (toolCallIterations >= maxToolCallIterations && !conversationComplete) {
console.log('Maximum tool call iterations reached, forcing completion');
// Send a message indicating we've hit the limit and provide available information
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: {
content:
'\n\n⚠ Maximum tool execution limit reached. Based on the available information, I can provide the following response:\n\n',
},
},
],
},
});
// Make one final call without tools to get a response based on the tool results
const finalStreamParams = this.getStreamParams(param, safeMessages);
const finalStream = await client.chat.completions.create({
...finalStreamParams,
tools: undefined, // Remove tools to force a text response
});
for await (const chunk of finalStream as unknown as AsyncIterable<any>) {
const shouldBreak = await this.processChunk(chunk, dataCallback);
if (shouldBreak) break;
}
}
}
}
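The trickiest part of the stream loop above is reassembling tool calls: streamed `tool_calls` deltas arrive as fragments keyed by `index`, with `arguments` split across chunks. A standalone sketch of that accumulation logic (the chunk shapes mirror the OpenAI streaming format the loop consumes):

```typescript
// Sketch of handleStream's tool-call delta accumulation: fragments are
// merged by index, concatenating partial id/name/arguments strings.
interface ToolCall {
  id: string;
  type: string;
  function: { name: string; arguments: string };
}

interface ToolCallDelta {
  index: number;
  id?: string;
  type?: string;
  function?: { name?: string; arguments?: string };
}

function mergeDelta(toolCalls: ToolCall[], delta: ToolCallDelta): void {
  const existing = toolCalls[delta.index];
  if (!existing) {
    // First fragment for this index: initialize the tool call.
    toolCalls[delta.index] = {
      id: delta.id ?? '',
      type: delta.type ?? 'function',
      function: {
        name: delta.function?.name ?? '',
        arguments: delta.function?.arguments ?? '',
      },
    };
  } else {
    // Later fragments: append to the partially built call.
    if (delta.function?.arguments) existing.function.arguments += delta.function.arguments;
    if (delta.function?.name) existing.function.name += delta.function.name;
    if (delta.id) existing.id += delta.id;
  }
}

// Two chunks carrying one tool call with split JSON arguments.
const calls: ToolCall[] = [];
mergeDelta(calls, { index: 0, id: 'call_1', function: { name: 'get_weather', arguments: '{"lat' } });
mergeDelta(calls, { index: 0, function: { arguments: 'itude":42}' } });
console.log(calls[0].function.arguments); // '{"latitude":42}'
```

Only once a chunk reports `finish_reason: 'tool_calls'` are the accumulated arguments parsed as JSON and executed, which is why the loop buffers rather than parsing each fragment.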


@@ -1,14 +1,17 @@
import Anthropic from "@anthropic-ai/sdk"; import Anthropic from '@anthropic-ai/sdk';
import {OpenAI} from "openai"; import type {
import {
_NotCustomized, _NotCustomized,
ISimpleType, ISimpleType,
ModelPropertiesDeclarationToProperties, ModelPropertiesDeclarationToProperties,
ModelSnapshotType2, ModelSnapshotType2,
UnionStringArray, UnionStringArray,
} from "mobx-state-tree"; } from 'mobx-state-tree';
import ChatSdk from "../lib/chat-sdk.ts"; import { OpenAI } from 'openai';
import {BaseChatProvider, CommonProviderParams} from "./chat-stream-provider.ts";
import ChatSdk from '../chat-sdk/chat-sdk.ts';
import type { GenericEnv, GenericStreamData } from '../types';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class ClaudeChatProvider extends BaseChatProvider { export class ClaudeChatProvider extends BaseChatProvider {
private anthropic: Anthropic | null = null; private anthropic: Anthropic | null = null;
@@ -33,20 +36,20 @@ export class ClaudeChatProvider extends BaseChatProvider {
stream: true, stream: true,
model: param.model, model: param.model,
messages: safeMessages, messages: safeMessages,
...claudeTuningParams ...claudeTuningParams,
}; };
} }
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> { async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.type === "message_stop") { if (chunk.type === 'message_stop') {
dataCallback({ dataCallback({
type: "chat", type: 'chat',
data: { data: {
choices: [ choices: [
{ {
delta: { content: "" }, delta: { content: '' },
logprobs: null, logprobs: null,
finish_reason: "stop", finish_reason: 'stop',
}, },
], ],
}, },
@@ -54,32 +57,30 @@ export class ClaudeChatProvider extends BaseChatProvider {
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
// Override the base handleStream method to use Anthropic client instead of OpenAI
async handleStream(param: CommonProviderParams, dataCallback: (data: any) => void) {
const assistantPrompt = ChatSdk.buildAssistantPrompt({ maxTokens: param.maxTokens });
const safeMessages = await ChatSdk.buildMessageChain(param.messages, {
systemPrompt: param.systemPrompt,
model: param.model,
assistantPrompt,
toolResults: param.preprocessedContext,
env: param.env,
});
const streamParams = this.getStreamParams(param, safeMessages);
if (!this.anthropic) {
throw new Error('Anthropic client not initialized');
}
const stream = await this.anthropic.messages.create(streamParams);
for await (const chunk of stream as unknown as AsyncIterable<any>) {
const shouldBreak = await this.processChunk(chunk, dataCallback);
if (shouldBreak) break;
}
@@ -104,9 +105,9 @@ export class ClaudeChatSdk {
maxTokens: unknown | number | undefined;
messages: any;
model: string;
env: GenericEnv;
},
dataCallback: (data: GenericStreamData) => void,
) {
return this.provider.handleStream(
{

View File

@@ -0,0 +1,142 @@
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class CloudflareAiChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
apiKey: param.env.CLOUDFLARE_API_KEY,
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.cloudflare.replace(
'{CLOUDFLARE_ACCOUNT_ID}',
param.env.CLOUDFLARE_ACCOUNT_ID,
),
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
const generationParams: Record<string, any> = {
model: this.getModelWithPrefix(param.model),
messages: safeMessages,
stream: true,
};
// Set max_tokens based on model
if (this.getModelPrefix(param.model) === '@cf/meta') {
generationParams['max_tokens'] = 4096;
}
if (this.getModelPrefix(param.model) === '@hf/mistral') {
generationParams['max_tokens'] = 4096;
}
if (param.model.toLowerCase().includes('hermes-2-pro-mistral-7b')) {
generationParams['max_tokens'] = 1000;
}
if (param.model.toLowerCase().includes('openhermes-2.5-mistral-7b-awq')) {
generationParams['max_tokens'] = 1000;
}
if (param.model.toLowerCase().includes('deepseek-coder-6.7b-instruct-awq')) {
generationParams['max_tokens'] = 590;
}
if (param.model.toLowerCase().includes('deepseek-math-7b-instruct')) {
generationParams['max_tokens'] = 512;
}
if (param.model.toLowerCase().includes('neural-chat-7b-v3-1-awq')) {
generationParams['max_tokens'] = 590;
}
if (param.model.toLowerCase().includes('openchat-3.5-0106')) {
generationParams['max_tokens'] = 2000;
}
return generationParams;
}
private getModelPrefix(model: string): string {
let modelPrefix = `@cf/meta`;
if (model.toLowerCase().includes('llama')) {
modelPrefix = `@cf/meta`;
}
if (model.toLowerCase().includes('hermes-2-pro-mistral-7b')) {
modelPrefix = `@hf/nousresearch`;
}
if (model.toLowerCase().includes('mistral-7b-instruct')) {
modelPrefix = `@hf/mistral`;
}
if (model.toLowerCase().includes('gemma')) {
modelPrefix = `@cf/google`;
}
if (model.toLowerCase().includes('deepseek')) {
modelPrefix = `@cf/deepseek-ai`;
}
if (model.toLowerCase().includes('openchat-3.5-0106')) {
modelPrefix = `@cf/openchat`;
}
const isNeuralChat = model.toLowerCase().includes('neural-chat-7b-v3-1-awq');
if (
isNeuralChat ||
model.toLowerCase().includes('openhermes-2.5-mistral-7b-awq') ||
model.toLowerCase().includes('zephyr-7b-beta-awq') ||
model.toLowerCase().includes('deepseek-coder-6.7b-instruct-awq')
) {
modelPrefix = `@hf/thebloke`;
}
return modelPrefix;
}
private getModelWithPrefix(model: string): string {
return `${this.getModelPrefix(model)}/${model}`;
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
export class CloudflareAISdk {
private static provider = new CloudflareAiChatProvider();
static async handleCloudflareAIStream(
param: {
openai: OpenAI;
systemPrompt: any;
preprocessedContext: any;
maxTokens: unknown | number | undefined;
messages: any;
model: string;
env: Env;
},
dataCallback: (data: any) => void,
) {
return this.provider.handleStream(
{
systemPrompt: param.systemPrompt,
preprocessedContext: param.preprocessedContext,
maxTokens: param.maxTokens,
messages: param.messages,
model: param.model,
env: param.env,
},
dataCallback,
);
}
}
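The substring routing in `getModelPrefix` above is order-sensitive: later matches override earlier ones (e.g. `deepseek-coder-6.7b-instruct-awq` first hits the `deepseek` rule, then the `@hf/thebloke` rule wins), with `@cf/meta` as the fallback. The same behavior can be sketched as an ordered lookup table (hypothetical `guessModelPrefix` helper, not part of the repository):

```typescript
// Hypothetical sketch of the substring-based prefix routing above.
// Rules are applied in order; later matches override earlier ones.
const PREFIX_RULES: Array<[needle: string, prefix: string]> = [
  ['llama', '@cf/meta'],
  ['hermes-2-pro-mistral-7b', '@hf/nousresearch'],
  ['mistral-7b-instruct', '@hf/mistral'],
  ['gemma', '@cf/google'],
  ['deepseek', '@cf/deepseek-ai'],
  ['openchat-3.5-0106', '@cf/openchat'],
  ['neural-chat-7b-v3-1-awq', '@hf/thebloke'],
  ['openhermes-2.5-mistral-7b-awq', '@hf/thebloke'],
  ['zephyr-7b-beta-awq', '@hf/thebloke'],
  ['deepseek-coder-6.7b-instruct-awq', '@hf/thebloke'],
];

function guessModelPrefix(model: string): string {
  const m = model.toLowerCase();
  let prefix = '@cf/meta'; // default family
  for (const [needle, p] of PREFIX_RULES) {
    if (m.includes(needle)) prefix = p;
  }
  return prefix;
}
```

Encoding the order in a table makes the override semantics explicit, where the chain of `if` statements hides it.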

View File

@@ -0,0 +1,77 @@
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class FireworksAiChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
apiKey: param.env.FIREWORKS_API_KEY,
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.fireworks,
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
let modelPrefix = 'accounts/fireworks/models/';
if (param.model.toLowerCase().includes('yi-')) {
modelPrefix = 'accounts/yi-01-ai/models/';
} else if (param.model.toLowerCase().includes('/perplexity/')) {
modelPrefix = 'accounts/perplexity/models/';
} else if (param.model.toLowerCase().includes('/sentientfoundation/')) {
modelPrefix = 'accounts/sentientfoundation/models/';
} else if (param.model.toLowerCase().includes('/sentientfoundation-serverless/')) {
modelPrefix = 'accounts/sentientfoundation-serverless/models/';
} else if (param.model.toLowerCase().includes('/instacart/')) {
modelPrefix = 'accounts/instacart/models/';
}
const finalModelIdentifier = param.model.includes(modelPrefix)
? param.model
: `${modelPrefix}${param.model}`;
console.log('using fireworks model', finalModelIdentifier);
return {
model: finalModelIdentifier,
messages: safeMessages,
stream: true,
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
export class FireworksAiChatSdk {
private static provider = new FireworksAiChatProvider();
static async handleFireworksStream(
param: {
openai: OpenAI;
systemPrompt: any;
preprocessedContext: any;
maxTokens: number;
messages: any;
model: any;
env: any;
},
// TODO: Replace usage of any with an explicit but permissive type
dataCallback: (data: any) => void,
) {
return this.provider.handleStream(
{
systemPrompt: param.systemPrompt,
preprocessedContext: param.preprocessedContext,
maxTokens: param.maxTokens,
messages: param.messages,
model: param.model,
env: param.env,
},
dataCallback,
);
}
}
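The identifier construction above avoids double-prefixing: it only prepends the account prefix when the model string does not already contain it, so fully qualified identifiers pass through unchanged. A minimal sketch of that check (hypothetical helper name):

```typescript
// Hypothetical sketch: qualify a Fireworks model id with an account prefix,
// leaving already-qualified identifiers untouched.
function qualifyFireworksModel(
  model: string,
  prefix = 'accounts/fireworks/models/',
): string {
  return model.includes(prefix) ? model : `${prefix}${model}`;
}
```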

View File

@@ -1,12 +1,12 @@
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class GoogleChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.google,
apiKey: param.env.GEMINI_API_KEY,
});
}
@@ -20,14 +20,14 @@ export class GoogleChatProvider extends BaseChatProvider {
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices?.[0]?.finish_reason === 'stop') {
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: { content: chunk.choices[0].delta.content || '' },
finish_reason: 'stop',
index: chunk.choices[0].index,
},
],
@@ -36,11 +36,11 @@ export class GoogleChatProvider extends BaseChatProvider {
return true;
} else {
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: { content: chunk.choices?.[0]?.delta?.content || '' },
finish_reason: null,
index: chunk.choices?.[0]?.index || 0,
},
@@ -55,10 +55,7 @@ export class GoogleChatProvider extends BaseChatProvider {
export class GoogleChatSdk {
private static provider = new GoogleChatProvider();
static async handleGoogleStream(param: StreamParams, dataCallback: (data: any) => void) {
return this.provider.handleStream(
{
systemPrompt: param.systemPrompt,

View File

@@ -1,17 +1,19 @@
import {
_NotCustomized,
ISimpleType,
ModelPropertiesDeclarationToProperties,
ModelSnapshotType2,
UnionStringArray,
} from 'mobx-state-tree';
import { OpenAI } from 'openai';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, CommonProviderParams } from './chat-stream-provider.ts';
export class GroqChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: ProviderRepository.OPENAI_COMPAT_ENDPOINTS.groq,
apiKey: param.env.GROQ_API_KEY,
});
}
@@ -29,17 +31,17 @@ export class GroqChatProvider extends BaseChatProvider {
model: param.model,
messages: safeMessages,
stream: true,
...tuningParams,
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}

View File

@@ -0,0 +1,8 @@
export * from './claude.ts';
export * from './cerebras.ts';
export * from './cloudflareAi.ts';
export * from './fireworks.ts';
export * from './groq.ts';
export * from './mlx-omni.ts';
export * from './ollama.ts';
export * from './xai.ts';

View File

@@ -0,0 +1,97 @@
import { OpenAI } from 'openai';
import { type ChatCompletionCreateParamsStreaming } from 'openai/resources/chat/completions/completions';
import { Common } from '../utils';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class MlxOmniChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: 'http://localhost:10240',
apiKey: param.env.MLX_API_KEY,
});
}
getStreamParams(
param: CommonProviderParams,
safeMessages: any[],
): ChatCompletionCreateParamsStreaming {
const baseTuningParams = {
temperature: 0.86,
top_p: 0.98,
presence_penalty: 0.1,
frequency_penalty: 0.3,
max_tokens: param.maxTokens as number,
};
const getTuningParams = () => {
return baseTuningParams;
};
let completionRequest: ChatCompletionCreateParamsStreaming = {
model: param.model,
stream: true,
messages: safeMessages,
};
const client = this.getOpenAIClient(param);
const isLocal = client.baseURL.includes('localhost');
if (isLocal) {
completionRequest['messages'] = Common.Utils.normalizeWithBlanks(safeMessages);
completionRequest['stream_options'] = {
include_usage: true,
};
} else {
completionRequest = { ...completionRequest, ...getTuningParams() };
}
return completionRequest;
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
const isLocal = chunk.usage !== undefined;
if (isLocal && chunk.usage) {
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: { content: '' },
logprobs: null,
finish_reason: 'stop',
},
],
},
});
return true; // Break the stream
}
dataCallback({ type: 'chat', data: chunk });
return false; // Continue the stream
}
}
export class MlxOmniChatSdk {
private static provider = new MlxOmniChatProvider();
static async handleMlxOmniStream(ctx: any, dataCallback: (data: any) => any) {
if (!ctx.messages?.length) {
return new Response('No messages provided', { status: 400 });
}
return this.provider.handleStream(
{
systemPrompt: ctx.systemPrompt,
preprocessedContext: ctx.preprocessedContext,
maxTokens: ctx.maxTokens,
messages: Common.Utils.normalizeWithBlanks(ctx.messages),
model: ctx.model,
env: ctx.env,
},
dataCallback,
);
}
}

View File

@@ -0,0 +1,75 @@
import { OpenAI } from 'openai';
import type { GenericEnv } from '../types';
import { ProviderRepository } from './_ProviderRepository.ts';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class OllamaChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: param.env.OLLAMA_API_ENDPOINT ?? ProviderRepository.OPENAI_COMPAT_ENDPOINTS.ollama,
apiKey: param.env.OLLAMA_API_KEY,
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
const tuningParams = {
temperature: 0.75,
};
const getTuningParams = () => {
return tuningParams;
};
return {
model: param.model,
messages: safeMessages,
stream: true,
...getTuningParams(),
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
export class OllamaChatSdk {
private static provider = new OllamaChatProvider();
static async handleOllamaStream(
ctx: {
openai: OpenAI;
systemPrompt: any;
preprocessedContext: any;
maxTokens: unknown | number | undefined;
messages: any;
model: any;
env: GenericEnv;
},
dataCallback: (data: any) => any,
) {
if (!ctx.messages?.length) {
return new Response('No messages provided', { status: 400 });
}
return this.provider.handleStream(
{
systemPrompt: ctx.systemPrompt,
preprocessedContext: ctx.preprocessedContext,
maxTokens: ctx.maxTokens,
messages: ctx.messages,
model: ctx.model,
env: ctx.env,
},
dataCallback,
);
}
}

View File

@@ -1,16 +1,21 @@
import { OpenAI } from 'openai';
import type { ChatCompletionCreateParamsStreaming } from 'openai/resources/chat/completions/completions';
import { Common } from '../utils';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class OpenAiChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return param.openai as OpenAI;
}
getStreamParams(
param: CommonProviderParams,
safeMessages: any[],
): ChatCompletionCreateParamsStreaming {
const isO1 = () => {
if (param.model === 'o1-preview' || param.model === 'o1-mini') {
return true;
}
};
@@ -27,8 +32,8 @@ export class OpenAiChatProvider extends BaseChatProvider {
const getTuningParams = () => {
if (isO1()) {
tuningParams['temperature'] = 1;
tuningParams['max_completion_tokens'] = (param.maxTokens as number) + 10000;
return tuningParams;
}
return gpt4oTuningParams;
@@ -37,19 +42,19 @@ export class OpenAiChatProvider extends BaseChatProvider {
let completionRequest: ChatCompletionCreateParamsStreaming = {
model: param.model,
stream: true,
messages: safeMessages,
};
const client = this.getOpenAIClient(param);
const isLocal = client.baseURL.includes('localhost');
if (isLocal) {
completionRequest['messages'] = Common.Utils.normalizeWithBlanks(safeMessages);
completionRequest['stream_options'] = {
include_usage: true,
};
} else {
completionRequest = { ...completionRequest, ...getTuningParams() };
}
return completionRequest;
@@ -60,13 +65,13 @@ export class OpenAiChatProvider extends BaseChatProvider {
if (isLocal && chunk.usage) {
dataCallback({
type: 'chat',
data: {
choices: [
{
delta: { content: '' },
logprobs: null,
finish_reason: 'stop',
},
],
},
@@ -74,7 +79,7 @@ export class OpenAiChatProvider extends BaseChatProvider {
return true; // Break the stream
}
dataCallback({ type: 'chat', data: chunk });
return false; // Continue the stream
}
}
@@ -95,7 +100,7 @@ export class OpenAiChatSdk {
dataCallback: (data: any) => any,
) {
if (!ctx.messages?.length) {
return new Response('No messages provided', { status: 400 });
}
return this.provider.handleStream(

View File

@@ -0,0 +1,75 @@
import { OpenAI } from 'openai';
import type { GenericEnv, GenericStreamData } from '../types';
import { BaseChatProvider, type CommonProviderParams } from './chat-stream-provider.ts';
export class XaiChatProvider extends BaseChatProvider {
getOpenAIClient(param: CommonProviderParams): OpenAI {
return new OpenAI({
baseURL: 'https://api.x.ai/v1',
apiKey: param.env.XAI_API_KEY,
});
}
getStreamParams(param: CommonProviderParams, safeMessages: any[]): any {
const tuningParams = {
temperature: 0.75,
};
const getTuningParams = () => {
return tuningParams;
};
return {
model: param.model,
messages: safeMessages,
stream: true,
...getTuningParams(),
};
}
async processChunk(chunk: any, dataCallback: (data: any) => void): Promise<boolean> {
if (chunk.choices && chunk.choices[0]?.finish_reason === 'stop') {
dataCallback({ type: 'chat', data: chunk });
return true;
}
dataCallback({ type: 'chat', data: chunk });
return false;
}
}
export class XaiChatSdk {
private static provider = new XaiChatProvider();
static async handleXaiStream(
ctx: {
openai: OpenAI;
systemPrompt: any;
preprocessedContext: any;
maxTokens: unknown | number | undefined;
messages: any;
model: any;
env: GenericEnv;
disableWebhookGeneration?: boolean;
},
dataCallback: (data: GenericStreamData) => any,
) {
if (!ctx.messages?.length) {
return new Response('No messages provided', { status: 400 });
}
return this.provider.handleStream(
{
systemPrompt: ctx.systemPrompt,
preprocessedContext: ctx.preprocessedContext,
maxTokens: ctx.maxTokens,
messages: ctx.messages,
model: ctx.model,
env: ctx.env,
disableWebhookGeneration: ctx.disableWebhookGeneration,
},
dataCallback,
);
}
}

View File

@@ -0,0 +1,21 @@
// tools/basicValue.ts
export interface BasicValueResult {
value: string;
}
export const BasicValueTool = {
name: 'basicValue',
type: 'function',
description: 'Returns a basic value (timestamp-based) for testing',
parameters: {
type: 'object',
properties: {},
required: [],
},
function: async (): Promise<BasicValueResult> => {
// generate something obviously basic
const basic = `tool-called-${Date.now()}`;
console.log('[BasicValueTool] returning:', basic);
return { value: basic };
},
};

View File

@@ -0,0 +1,25 @@
export async function getWeather(latitude: number, longitude: number) {
const response = await fetch(
`https://api.open-meteo.com/v1/forecast?latitude=${latitude}&longitude=${longitude}&current=temperature_2m,wind_speed_10m&hourly=temperature_2m,relative_humidity_2m,wind_speed_10m`,
);
const data = await response.json();
return data.current.temperature_2m;
}
export const WeatherTool = {
type: 'function',
function: {
name: 'get_weather',
description: 'Get current temperature for provided coordinates in celsius.',
parameters: {
type: 'object',
properties: {
latitude: { type: 'number' },
longitude: { type: 'number' },
},
required: ['latitude', 'longitude'],
additionalProperties: false,
},
strict: true,
},
};
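Because `WeatherTool` declares `required: ['latitude', 'longitude']` with `additionalProperties: false`, a host can cheaply reject malformed tool calls before invoking `getWeather`. A minimal sketch of that pre-flight check (hypothetical helper, not a full JSON Schema validator):

```typescript
// Hypothetical sketch: verify every required parameter is present in the
// parsed tool-call arguments before dispatching to the tool function.
function hasRequiredArgs(
  args: Record<string, unknown>,
  required: string[],
): boolean {
  return required.every(key => args[key] !== undefined);
}
```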

View File

@@ -0,0 +1,68 @@
export interface ShipControlResult {
message: string;
status: 'success' | 'error';
data?: any;
}
/**
* A mock interface for controlling a ship.
*/
export const YachtpitTools = {
type: 'function',
description: 'Interface for controlling a ship: set speed, change heading, report status, etc.',
/**
* Mock implementation of a ship control command.
*/
function: {
name: 'ship_control',
parameters: {
type: 'object',
properties: {
action: {
type: 'string',
enum: ['set_speed', 'change_heading', 'report_status', 'stop'],
description: 'Action to perform on the ship.',
},
value: {
type: 'number',
description:
'Numeric value for the action, such as speed (knots) or heading (degrees). Only required for set_speed and change_heading.',
},
},
required: ['action'],
additionalProperties: false,
},
},
};
export function yachtpitAi(args: { action: string; value?: number }): ShipControlResult {
switch (args.action) {
case 'set_speed':
if (typeof args.value !== 'number') {
return { status: 'error', message: 'Missing speed value.' };
}
return { status: 'success', message: `Speed set to ${args.value} knots.` };
case 'change_heading':
if (typeof args.value !== 'number') {
return { status: 'error', message: 'Missing heading value.' };
}
return { status: 'success', message: `Heading changed to ${args.value} degrees.` };
case 'report_status':
// Return a simulated ship status
return {
status: 'success',
message: 'Ship status reported.',
data: {
speed: 12,
heading: 87,
engine: 'nominal',
position: { lat: 42.35, lon: -70.88 },
},
};
case 'stop':
return { status: 'success', message: 'Ship stopped.' };
default:
return { status: 'error', message: 'Invalid action.' };
}
}
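Every branch of `yachtpitAi` returns synchronously, so callers can treat the dispatcher as a plain function and branch on `status`. A trimmed, self-contained sketch of the pattern (hypothetical `shipControl` name, reduced to two actions for brevity):

```typescript
// Trimmed, hypothetical copy of the ship-control dispatch pattern above.
type ShipControlResult = {
  message: string;
  status: 'success' | 'error';
  data?: any;
};

function shipControl(args: { action: string; value?: number }): ShipControlResult {
  switch (args.action) {
    case 'set_speed':
      // Numeric value is mandatory for this action.
      if (typeof args.value !== 'number') {
        return { status: 'error', message: 'Missing speed value.' };
      }
      return { status: 'success', message: `Speed set to ${args.value} knots.` };
    case 'stop':
      return { status: 'success', message: 'Ship stopped.' };
    default:
      return { status: 'error', message: 'Invalid action.' };
  }
}
```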

View File

@@ -0,0 +1 @@
export * from './types.ts';

View File

@@ -0,0 +1,5 @@
{
"name": "@open-gsio/types",
"type": "module",
"module": "index.ts"
}

View File

@@ -0,0 +1,29 @@
import { ProviderRepository } from '../providers/_ProviderRepository.ts';
export type GenericEnv = Record<string, any>;
export type GenericStreamData = any;
export type ModelMeta = {
id: any;
} & Record<string, any>;
export type SupportedProvider = keyof typeof ProviderRepository.OPENAI_COMPAT_ENDPOINTS & string;
export type Provider = { name: SupportedProvider; key: string; endpoint: string };
export type Providers = Provider[];
export type ChatRequestBody = {
messages: any[];
model: string;
conversationId: string;
};
export interface BuildAssistantPromptParams {
maxTokens: any;
}
export interface PreprocessParams {
messages: any[];
}

View File

@@ -22,15 +22,9 @@ interface StreamResponse {
};
}
const handleStreamData = (controller: ReadableStreamDefaultController, encoder: TextEncoder) => {
return (data: StreamResponse, transformFn?: (data: StreamResponse) => StreamResponse) => {
if (!data?.type || data.type !== 'chat') {
return;
}
@@ -39,17 +33,14 @@ const handleStreamData = (
if (transformFn) {
transformedData = transformFn(data);
} else {
if (data.data.type === 'content_block_start' && data.data.content_block?.type === 'text') {
transformedData = {
type: 'chat',
data: {
choices: [
{
delta: {
content: data.data.content_block.text || '',
},
logprobs: null,
finish_reason: null,
@@ -59,7 +50,7 @@ const handleStreamData = (
};
} else if (data.data.delta?.text) {
transformedData = {
type: 'chat',
data: {
choices: [
{
@@ -74,7 +65,7 @@ const handleStreamData = (
};
} else if (data.data.choices?.[0]?.delta?.content) {
transformedData = {
type: 'chat',
data: {
choices: [
{
@@ -95,9 +86,7 @@ const handleStreamData = (
}
}
controller.enqueue(encoder.encode(`data: ${JSON.stringify(transformedData)}\n\n`));
};
};
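The enqueued bytes follow Server-Sent Events framing: a `data: ` prefix, the JSON payload, and a blank line terminating the event. That encoding step can be isolated as a small helper (hypothetical name, sketch only):

```typescript
// Hypothetical sketch of the SSE framing used above:
// one `data:` line per event, terminated by a blank line.
function encodeSseEvent(payload: unknown): Uint8Array {
  const encoder = new TextEncoder();
  return encoder.encode(`data: ${JSON.stringify(payload)}\n\n`);
}
```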

View File

@@ -0,0 +1,3 @@
import * as Common from './utils.ts';
export { Common };

View File

@@ -1,20 +1,19 @@
import handleStreamData from './handleStreamData.ts';
export class Utils {
static getSeason(date: string): string {
const hemispheres = {
Northern: ['Winter', 'Spring', 'Summer', 'Autumn'],
Southern: ['Summer', 'Autumn', 'Winter', 'Spring'],
};
const d = new Date(date);
const month = d.getMonth();
const day = d.getDate();
const hemisphere = 'Northern';
if (month < 2 || (month === 2 && day <= 20) || month === 11) return hemispheres[hemisphere][0];
if (month < 5 || (month === 5 && day <= 21)) return hemispheres[hemisphere][1];
if (month < 8 || (month === 8 && day <= 22)) return hemispheres[hemisphere][2];
return hemispheres[hemisphere][3];
}
static getTimezone(timezone) {
@@ -30,18 +29,16 @@ export class Utils {
static isAssetUrl(url) {
const { pathname } = new URL(url);
return pathname.startsWith('/assets/');
}
static selectEquitably({ a, b, c, d }, itemCount = 9) {
const sources = [a, b, c, d];
const result = {};
let combinedItems: any[] = [];
sources.forEach((source, index) => {
combinedItems.push(...Object.keys(source).map(key => ({ source: index, key })));
});
combinedItems = combinedItems.sort(() => Math.random() - 0.5);
@@ -60,37 +57,37 @@ export class Utils {
return result;
}
static normalizeWithBlanks<T extends NormalizeChatMessage>(msgs: T[]): T[] {
const out: T[] = [];
// In local mode first turn expected to be user.
let expected: NormalizeRole = 'user';
for (const m of msgs) {
while (m.role !== expected) {
// Insert blanks to match expected sequence user/assistant/user...
out.push(makeNormalizeBlank(expected) as T);
expected = expected === 'user' ? 'assistant' : 'user';
}
out.push(m);
expected = expected === 'user' ? 'assistant' : 'user';
}
return out;
}
static handleStreamData = handleStreamData;
}
// Normalize module exports
export type NormalizeRole = 'user' | 'assistant';
export interface NormalizeChatMessage extends Record<any, any> {
role: NormalizeRole;
}
export const makeNormalizeBlank = (role: NormalizeRole): NormalizeChatMessage => ({
role,
content: '',
});
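`normalizeWithBlanks` enforces the strict user/assistant alternation that local OpenAI-compatible servers expect, inserting empty turns until the next message's role matches. A self-contained sketch of the same algorithm with concrete types:

```typescript
type Role = 'user' | 'assistant';
interface Msg {
  role: Role;
  content: string;
}

// Insert blank turns so roles strictly alternate, starting with 'user'.
function normalize(msgs: Msg[]): Msg[] {
  const out: Msg[] = [];
  let expected: Role = 'user';
  for (const m of msgs) {
    while (m.role !== expected) {
      out.push({ role: expected, content: '' }); // padding turn
      expected = expected === 'user' ? 'assistant' : 'user';
    }
    out.push(m);
    expected = expected === 'user' ? 'assistant' : 'user';
  }
  return out;
}
```

For example, a conversation that opens with an assistant message gains a blank leading user turn, and two consecutive user messages get a blank assistant turn between them.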

View File

@@ -1,88 +0,0 @@
const SUPPORTED_MODELS_GROUPS = {
openai: [
// "o1-preview",
// "o1-mini",
// "gpt-4o",
// "gpt-3.5-turbo"
],
groq: [
// "mixtral-8x7b-32768",
// "deepseek-r1-distill-llama-70b",
"meta-llama/llama-4-scout-17b-16e-instruct",
"gemma2-9b-it",
"mistral-saba-24b",
// "qwen-2.5-32b",
"llama-3.3-70b-versatile",
// "llama-3.3-70b-versatile"
// "llama-3.1-70b-versatile",
// "llama-3.3-70b-versatile"
],
cerebras: ["llama-3.3-70b"],
claude: [
// "claude-3-5-sonnet-20241022",
// "claude-3-opus-20240229"
],
fireworks: [
// "llama-v3p1-405b-instruct",
// "llama-v3p1-70b-instruct",
// "llama-v3p2-90b-vision-instruct",
// "mixtral-8x22b-instruct",
// "mythomax-l2-13b",
// "yi-large"
],
google: [
// "gemini-2.0-flash-exp",
// "gemini-1.5-flash",
// "gemini-exp-1206",
// "gemini-1.5-pro"
],
xai: [
// "grok-beta",
// "grok-2",
// "grok-2-1212",
// "grok-2-latest",
// "grok-beta"
],
cloudflareAI: [
"llama-3.2-3b-instruct", // max_tokens
"llama-3-8b-instruct", // max_tokens
"llama-3.1-8b-instruct-fast", // max_tokens
"deepseek-math-7b-instruct",
"deepseek-coder-6.7b-instruct-awq",
"hermes-2-pro-mistral-7b",
"openhermes-2.5-mistral-7b-awq",
"mistral-7b-instruct-v0.2",
"neural-chat-7b-v3-1-awq",
"openchat-3.5-0106",
// "gemma-7b-it",
],
};
export type SupportedModel =
| keyof typeof SUPPORTED_MODELS_GROUPS
| (typeof SUPPORTED_MODELS_GROUPS)[keyof typeof SUPPORTED_MODELS_GROUPS][number];
export type ModelFamily = keyof typeof SUPPORTED_MODELS_GROUPS;
function getModelFamily(model: string): ModelFamily | undefined {
return Object.keys(SUPPORTED_MODELS_GROUPS)
.filter((family) => {
return SUPPORTED_MODELS_GROUPS[
family as keyof typeof SUPPORTED_MODELS_GROUPS
].includes(model.trim());
})
.at(0) as ModelFamily | undefined;
}
const SUPPORTED_MODELS = [
// ...SUPPORTED_MODELS_GROUPS.xai,
// ...SUPPORTED_MODELS_GROUPS.claude,
// ...SUPPORTED_MODELS_GROUPS.google,
...SUPPORTED_MODELS_GROUPS.groq,
// ...SUPPORTED_MODELS_GROUPS.fireworks,
// ...SUPPORTED_MODELS_GROUPS.openai,
// ...SUPPORTED_MODELS_GROUPS.cerebras,
// ...SUPPORTED_MODELS_GROUPS.cloudflareAI,
];
export { SUPPORTED_MODELS, SUPPORTED_MODELS_GROUPS, getModelFamily };
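The `getModelFamily` helper in the deleted module above is a reverse lookup from a model id to its provider group. A trimmed-down sketch of the same lookup (the group table is abbreviated to two families for illustration):

```typescript
const SUPPORTED_MODELS_GROUPS = {
  groq: ['gemma2-9b-it', 'llama-3.3-70b-versatile'],
  cerebras: ['llama-3.3-70b'],
} as const;

type ModelFamily = keyof typeof SUPPORTED_MODELS_GROUPS;

function getModelFamily(model: string): ModelFamily | undefined {
  // Scan each family's model list; the first match wins. The trim() makes
  // the lookup tolerant of stray whitespace in the incoming model id.
  return (Object.keys(SUPPORTED_MODELS_GROUPS) as ModelFamily[]).find(family =>
    (SUPPORTED_MODELS_GROUPS[family] as readonly string[]).includes(model.trim()),
  );
}

console.log(getModelFamily(' gemma2-9b-it ')); // → 'groq'
console.log(getModelFamily('gpt-4o')); // → undefined
```

The original uses `filter(...).at(0)` where this sketch uses `find`; the behavior is the same, `find` just stops at the first hit.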


@@ -0,0 +1,9 @@
{
"extends": "../../tsconfig.json",
"compilerOptions": {
"outDir": "dist",
"rootDir": "."
},
"include": ["*.ts"],
"exclude": ["node_modules"]
}


@@ -5,23 +5,36 @@
   "dev": "bun vite dev",
   "build": "bun vite build",
   "tests": "vitest run",
-  "tests:coverage": "vitest run --coverage.enabled=true"
+  "tests:coverage": "vitest run --coverage.enabled=true",
+  "generate:sitemap": "bun ./scripts/generate_sitemap.js open-gsio.seemueller.workers.dev",
+  "generate:robotstxt": "bun ./scripts/generate_robots_txt.js open-gsio.seemueller.workers.dev",
+  "generate:fonts": "cp -r ../../node_modules/katex/dist/fonts public/static",
+  "generate:pwa:assets": "test ! -f public/pwa-64x64.png && pwa-assets-generator --preset minimal-2023 public/logo.png || echo 'PWA assets already exist'"
 },
-"dependencies": {
-  "@open-gsio/env": "workspace:*",
-  "@open-gsio/scripts": "workspace:*",
-  "@anthropic-ai/sdk": "^0.32.1",
+"exports": {
+  "./server/index.ts": {
+    "import": "./server/index.ts",
+    "types": "./server/index.ts"
+  }
+},
+"devDependencies": {
+  "@chakra-ui/icons": "^2.2.4",
   "@chakra-ui/react": "^2.10.6",
   "@cloudflare/workers-types": "^4.20241205.0",
   "@emotion/react": "^11.13.5",
   "@emotion/styled": "^11.13.5",
+  "@open-gsio/env": "workspace:*",
+  "@open-gsio/scripts": "workspace:*",
   "@testing-library/jest-dom": "^6.4.2",
-  "@testing-library/react": "^14.2.1",
+  "@testing-library/react": "^16.3.0",
   "@testing-library/user-event": "^14.5.2",
+  "@types/bun": "^1.2.17",
   "@types/marked": "^6.0.0",
+  "@vite-pwa/assets-generator": "^1.0.0",
   "@vitejs/plugin-react": "^4.3.4",
   "@vitest/coverage-v8": "^3.1.4",
   "@vitest/ui": "^3.1.4",
+  "bun": "^1.2.17",
   "chokidar": "^4.0.1",
   "framer-motion": "^11.13.1",
   "isomorphic-dompurify": "^2.19.0",
@@ -29,6 +42,7 @@
   "jsdom": "^24.0.0",
   "katex": "^0.16.20",
   "lucide-react": "^0.436.0",
+  "mapbox-gl": "^3.13.0",
   "marked": "^15.0.4",
   "marked-extended-latex": "^1.1.0",
   "marked-footnote": "^1.2.4",
@@ -36,18 +50,19 @@
   "mobx": "^6.13.5",
   "mobx-react-lite": "^4.0.7",
   "mobx-state-tree": "^6.0.1",
+  "moo": "^0.5.2",
   "qrcode.react": "^4.1.0",
-  "react": "^18.3.1",
-  "react-dom": "^18.3.1",
+  "react": "^19.1.0",
+  "react-dom": "^19.1.0",
   "react-icons": "^5.4.0",
+  "react-map-gl": "^8.0.4",
-  "react-streaming": "^0.3.44",
+  "react-streaming": "^0.4.2",
   "react-textarea-autosize": "^8.5.5",
   "shiki": "^1.24.0",
+  "tslog": "^4.9.3",
   "typescript": "^5.7.2",
-  "vike": "0.4.193",
-  "vite": "^6.3.5",
-  "vite-plugin-pwa": "^1.0.0",
+  "vike": "^0.4.235",
+  "vite": "^7.0.0",
+  "vite-plugin-pwa": "^1.0.1",
   "vitest": "^3.1.4"
 }
 }
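The new `generate:pwa:assets` script guards the generator behind a `test ! -f` check, so repeated runs are cheap no-ops once the first output file exists. The same idempotence check sketched in TypeScript (the marker file name comes from the script above; `shouldGeneratePwaAssets` is a hypothetical helper, not part of the repo):

```typescript
import fs from 'node:fs';

// Mirrors `test ! -f public/pwa-64x64.png && pwa-assets-generator ... || echo ...`:
// only generate when the first expected output file is missing.
function shouldGeneratePwaAssets(marker = 'public/pwa-64x64.png'): boolean {
  return !fs.existsSync(marker);
}

if (shouldGeneratePwaAssets()) {
  console.log('would run pwa-assets-generator');
} else {
  console.log('PWA assets already exist');
}
```

Keying the guard on a single marker file keeps the check trivial; the trade-off is that a partially generated asset set will not be regenerated as long as the marker exists.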


@@ -15,30 +15,29 @@
 };
 function s() {
   var i = [
-    g(m(4)) + "=" + g(m(6)),
-    "ga=" + t.ga_tid,
-    "dt=" + r(e.title),
-    "de=" + r(e.characterSet || e.charset),
-    "dr=" + r(e.referrer),
-    "ul=" + (n.language || n.browserLanguage || n.userLanguage),
-    "sd=" + a.colorDepth + "-bit",
-    "sr=" + a.width + "x" + a.height,
-    "vp=" +
-      o(e.documentElement.clientWidth, t.innerWidth || 0) +
-      "x" +
-      o(e.documentElement.clientHeight, t.innerHeight || 0),
-    "plt=" + c(d.loadEventStart - d.navigationStart || 0),
-    "dns=" + c(d.domainLookupEnd - d.domainLookupStart || 0),
-    "pdt=" + c(d.responseEnd - d.responseStart || 0),
-    "rrt=" + c(d.redirectEnd - d.redirectStart || 0),
-    "tcp=" + c(d.connectEnd - d.connectStart || 0),
-    "srt=" + c(d.responseStart - d.requestStart || 0),
-    "dit=" + c(d.domInteractive - d.domLoading || 0),
-    "clt=" + c(d.domContentLoadedEventStart - d.navigationStart || 0),
-    "z=" + Date.now(),
+    g(m(4)) + '=' + g(m(6)),
+    'ga=' + t.ga_tid,
+    'dt=' + r(e.title),
+    'de=' + r(e.characterSet || e.charset),
+    'dr=' + r(e.referrer),
+    'ul=' + (n.language || n.browserLanguage || n.userLanguage),
+    'sd=' + a.colorDepth + '-bit',
+    'sr=' + a.width + 'x' + a.height,
+    'vp=' +
+      o(e.documentElement.clientWidth, t.innerWidth || 0) +
+      'x' +
+      o(e.documentElement.clientHeight, t.innerHeight || 0),
+    'plt=' + c(d.loadEventStart - d.navigationStart || 0),
+    'dns=' + c(d.domainLookupEnd - d.domainLookupStart || 0),
+    'pdt=' + c(d.responseEnd - d.responseStart || 0),
+    'rrt=' + c(d.redirectEnd - d.redirectStart || 0),
+    'tcp=' + c(d.connectEnd - d.connectStart || 0),
+    'srt=' + c(d.responseStart - d.requestStart || 0),
+    'dit=' + c(d.domInteractive - d.domLoading || 0),
+    'clt=' + c(d.domContentLoadedEventStart - d.navigationStart || 0),
+    'z=' + Date.now(),
   ];
-  (t.__ga_img = new Image()), (t.__ga_img.src = t.ga_api + "?" + i.join("&"));
+  ((t.__ga_img = new Image()), (t.__ga_img.src = t.ga_api + '?' + i.join('&')));
 }
-(t.cfga = s),
-  "complete" === e.readyState ? s() : t.addEventListener("load", s);
+((t.cfga = s), 'complete' === e.readyState ? s() : t.addEventListener('load', s));
 })(window, document, navigator);
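The minified analytics snippet above collects its metrics into an array of `key=value` strings, joins them into a query string, and delivers the beacon by assigning the URL to an `Image` src. The serialization step, unminified as a sketch (the function name and endpoint are illustrative, not from the snippet):

```typescript
// Build the beacon URL the same way the snippet does: key=value pairs
// joined with '&' and appended to the collector endpoint.
function buildBeaconUrl(api: string, params: Record<string, string | number>): string {
  const qs = Object.entries(params)
    .map(([k, v]) => `${k}=${encodeURIComponent(String(v))}`)
    .join('&');
  return `${api}?${qs}`;
}

// In a browser the request is fired without fetch/XHR:
//   new Image().src = buildBeaconUrl(...);
console.log(buildBeaconUrl('https://example.com/collect', { ga: 'UA-1', sd: '24-bit' }));
// → 'https://example.com/collect?ga=UA-1&sd=24-bit'
```

The image-beacon trick predates `navigator.sendBeacon` and works everywhere an `<img>` loads, which is why lightweight analytics scripts like this one still use it.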


@@ -1,19 +0,0 @@
{
"name": "",
"short_name": "",
"icons": [
{
"src": "/android-chrome-192x192.png",
"sizes": "192x192",
"type": "image/png"
},
{
"src": "/android-chrome-512x512.png",
"sizes": "512x512",
"type": "image/png"
}
],
"theme_color": "#fffff0",
"background_color": "#000000",
"display": "standalone"
}


@@ -1,17 +1,17 @@
 #!/usr/bin/env bun
+/* eslint-env node */
-import fs from "fs";
-import {parseArgs} from "util";
+import fs from 'fs';
+import { parseArgs } from 'util';

-const {positionals} = parseArgs({
+const { positionals } = parseArgs({
   args: Bun.argv,
   options: {},
   strict: true,
   allowPositionals: true,
 });

-const currentDate = new Date().toISOString().split("T")[0];
+const currentDate = new Date().toISOString().split('T')[0];
 const host = positionals[2];
@@ -25,12 +25,12 @@ Disallow: /assets
 Sitemap: https://${host}/sitemap.xml
 `;

-const robotsTxtPath = "./public/robots.txt";
+const robotsTxtPath = './public/robots.txt';

-fs.writeFile(robotsTxtPath, robotsTxtTemplate, (err) => {
+fs.writeFile(robotsTxtPath, robotsTxtTemplate, err => {
   if (err) {
-    console.error("Error writing robots.txt:", err);
+    console.error('Error writing robots.txt:', err);
     process.exit(1);
   }
-  console.log("robots.txt created successfully:", currentDate);
+  console.log('robots.txt created successfully:', currentDate);
 });
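Both generator scripts read the host from `positionals[2]`, which deserves a note: because they pass `args: Bun.argv`, `parseArgs` receives the full argv, so index 0 is the runtime binary and index 1 is the script path, leaving the first user-supplied argument at index 2. (When `args` is omitted, Node and Bun strip those first two entries automatically.) A sketch with a hard-coded argv (the host value is illustrative):

```typescript
import { parseArgs } from 'node:util';

// Stand-in for Bun.argv: [runtime, script, ...user args].
const argv = ['bun', './scripts/generate_robots_txt.js', 'example.workers.dev'];

const { positionals } = parseArgs({
  args: argv,
  options: {},
  strict: true,
  allowPositionals: true,
});

console.log(positionals[2]); // → 'example.workers.dev'
```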


@@ -1,17 +1,16 @@
 #!/usr/bin/env bun
-import fs from "fs";
-import {parseArgs} from "util";
+import fs from 'fs';
+import { parseArgs } from 'util';

-const {positionals} = parseArgs({
+const { positionals } = parseArgs({
   args: Bun.argv,
   options: {},
   strict: true,
   allowPositionals: true,
 });

-const currentDate = new Date().toISOString().split("T")[0];
+const currentDate = new Date().toISOString().split('T')[0];
 const host = positionals[2];
@@ -30,12 +29,12 @@ const sitemapTemplate = `<?xml version="1.0" encoding="UTF-8"?>
   </url>
 </urlset>`;

-const sitemapPath = "./public/sitemap.xml";
+const sitemapPath = './public/sitemap.xml';

-fs.writeFile(sitemapPath, sitemapTemplate, (err) => {
+fs.writeFile(sitemapPath, sitemapTemplate, err => {
   if (err) {
-    console.error("Error writing sitemap file:", err);
+    console.error('Error writing sitemap file:', err);
     process.exit(1);
   }
-  console.log("Sitemap updated successfully with current date:", currentDate);
+  console.log('Sitemap updated successfully with current date:', currentDate);
 });


@@ -0,0 +1,20 @@
import { renderPage } from 'vike/server';
// This is what makes SSR possible. It is consumed by @open-gsio/server
export { handleSsr };
async function handleSsr(url: string, headers: Headers) {
const pageContextInit = {
urlOriginal: url,
headersOriginal: headers,
fetch: (...args: Parameters<typeof fetch>) => fetch(...args),
};
const pageContext = await renderPage(pageContextInit);
const { httpResponse } = pageContext;
const stream = httpResponse.getReadableWebStream();
return new Response(stream, {
headers: httpResponse.headers,
status: httpResponse.statusCode,
});
}


@@ -1,7 +1,8 @@
-import React from "react";
-import { IconButton } from "@chakra-ui/react";
-import { LucideHammer } from "lucide-react";
-import { toolbarButtonZIndex } from "./toolbar/Toolbar";
+import { IconButton } from '@chakra-ui/react';
+import { LucideHammer } from 'lucide-react';
+import React from 'react';
+
+import { toolbarButtonZIndex } from './toolbar/Toolbar';

 export default function BuiltWithButton() {
   return (
@@ -12,12 +13,12 @@ export default function BuiltWithButton() {
       bg="transparent"
       stroke="text.accent"
       color="text.accent"
-      onClick={() => alert("Built by Geoff Seemueller")}
+      onClick={() => alert('Built by GSIO')}
       _hover={{
-        bg: "transparent",
+        bg: 'transparent',
         svg: {
-          stroke: "accent.secondary",
-          transition: "stroke 0.3s ease-in-out",
+          stroke: 'accent.secondary',
+          transition: 'stroke 0.3s ease-in-out',
         },
       }}
       zIndex={toolbarButtonZIndex}


@@ -1,10 +1,12 @@
-import { getColorThemes } from "../layout/theme/color-themes";
-import { Center, IconButton, VStack } from "@chakra-ui/react";
-import userOptionsStore from "../stores/UserOptionsStore";
-import { Circle } from "lucide-react";
-import { toolbarButtonZIndex } from "./toolbar/Toolbar";
-import React from "react";
-import { useIsMobile } from "./contexts/MobileContext";
+import { Center, IconButton, VStack } from '@chakra-ui/react';
+import { Circle } from 'lucide-react';
+import React from 'react';
+
+import { getColorThemes } from '../layout/theme/color-themes';
+import userOptionsStore from '../stores/UserOptionsStore';
+
+import { useIsMobile } from './contexts/MobileContext';
+import { toolbarButtonZIndex } from './toolbar/Toolbar';

 export function ThemeSelectionOptions() {
   const children = [];
@@ -24,11 +26,11 @@ export function ThemeSelectionOptions() {
         size={!isMobile ? 16 : 20}
         stroke="transparent"
         style={{
-          background: `conic-gradient(${theme.colors.background.primary.startsWith("#") ? theme.colors.background.primary : theme.colors.background.secondary} 0 50%, ${theme.colors.text.secondary} 50% 100%)`,
-          borderRadius: "50%",
-          boxShadow: "0 0 0.5px 0.25px #fff",
-          cursor: "pointer",
-          transition: "background 0.2s",
+          background: `conic-gradient(${theme.colors.background.primary.startsWith('#') ? theme.colors.background.primary : theme.colors.background.secondary} 0 50%, ${theme.colors.text.secondary} 50% 100%)`,
+          borderRadius: '50%',
+          boxShadow: '0 0 0.5px 0.25px #fff',
+          cursor: 'pointer',
+          transition: 'background 0.2s',
         }}
       />
 }
@@ -38,7 +40,7 @@ export function ThemeSelectionOptions() {
       color="transparent"
       _hover={{
         svg: {
-          transition: "stroke 0.3s ease-in-out", // Smooth transition effect
+          transition: 'stroke 0.3s ease-in-out', // Smooth transition effect
         },
       }}
       zIndex={toolbarButtonZIndex}
@@ -47,7 +49,7 @@ export function ThemeSelectionOptions() {
   }
   return (
-    <VStack align={!isMobile ? "end" : "start"} p={1.2}>
+    <VStack align={!isMobile ? 'end' : 'start'} p={1.2}>
       <Center>{children}</Center>
     </VStack>
   );


@@ -1,11 +1,9 @@
-import { motion } from "framer-motion";
-import { Box, Center, VStack } from "@chakra-ui/react";
-import {
-  welcome_home_text,
-  welcome_home_tip,
-} from "../static-data/welcome_home_text";
-import {renderMarkdown} from "./markdown/MarkdownComponent";
+import { Box, Center, VStack } from '@chakra-ui/react';
+import { motion } from 'framer-motion';
+
+import { welcome_home_text, welcome_home_tip } from '../static-data/welcome_home_text';
+
+import { renderMarkdown } from './markdown/MarkdownComponent';

 function WelcomeHomeMessage({ visible }) {
   const containerVariants = {
@@ -45,33 +43,19 @@ function WelcomeHomeMessage({ visible }) {
     <Center>
       <VStack spacing={8} align="center" maxW="400px">
         {/* Welcome Message */}
-        <Box
-          fontSize="sm"
-          fontStyle="italic"
-          textAlign="center"
-          color="text.secondary"
-          mt={4}
-        >
+        <Box fontSize="sm" fontStyle="italic" textAlign="center" color="text.secondary" mt={4}>
           <motion.div
             variants={containerVariants}
             initial="hidden"
-            animate={visible ? "visible" : "hidden"}
+            animate={visible ? 'visible' : 'hidden'}
           >
-            <Box userSelect={"none"}>
-              <motion.div variants={textVariants}>
-                {renderMarkdown(welcome_home_text)}
-              </motion.div>
+            <Box userSelect={'none'}>
+              <motion.div variants={textVariants}>{renderMarkdown(welcome_home_text)}</motion.div>
             </Box>
           </motion.div>
         </Box>
         <motion.div variants={textVariants}>
-          <Box
-            fontSize="sm"
-            fontStyle="italic"
-            textAlign="center"
-            color="text.secondary"
-            mt={1}
-          >
+          <Box fontSize="sm" fontStyle="italic" textAlign="center" color="text.secondary" mt={1}>
             {renderMarkdown(welcome_home_tip)}
           </Box>
         </motion.div>


@@ -1,37 +1,38 @@
-import { describe, it, expect, vi, beforeEach } from 'vitest';
 import { render, screen, fireEvent } from '@testing-library/react';
-import { ThemeSelectionOptions } from '../ThemeSelection';
+import { describe, it, expect, vi, beforeEach } from 'vitest';
+
 import userOptionsStore from '../../stores/UserOptionsStore';
 import * as MobileContext from '../contexts/MobileContext';
+import { ThemeSelectionOptions } from '../ThemeSelection';

 // Mock dependencies
 vi.mock('../../layout/theme/color-themes', () => ({
   getColorThemes: () => [
     {
       name: 'light',
       colors: {
         background: { primary: '#ffffff', secondary: '#f0f0f0' },
-        text: { secondary: '#333333' }
-      }
+        text: { secondary: '#333333' },
+      },
     },
     {
       name: 'dark',
       colors: {
         background: { primary: '#121212', secondary: '#1e1e1e' },
-        text: { secondary: '#e0e0e0' }
-      }
-    }
-  ]
+        text: { secondary: '#e0e0e0' },
+      },
+    },
+  ],
 }));

 vi.mock('../../stores/UserOptionsStore', () => ({
   default: {
-    selectTheme: vi.fn()
-  }
+    selectTheme: vi.fn(),
+  },
 }));

 vi.mock('../toolbar/Toolbar', () => ({
-  toolbarButtonZIndex: 100
+  toolbarButtonZIndex: 100,
 }));

 describe('ThemeSelectionOptions', () => {
@@ -42,20 +43,20 @@ describe('ThemeSelectionOptions', () => {
   it('renders theme options for desktop view', () => {
     // Mock useIsMobile to return false (desktop view)
     vi.spyOn(MobileContext, 'useIsMobile').mockReturnValue(false);

     render(<ThemeSelectionOptions />);

     // Should render 2 theme buttons (from our mock)
-    const buttons = screen.getAllByRole("button")
+    const buttons = screen.getAllByRole('button');
     expect(buttons).toHaveLength(2);
   });

   it('renders theme options for mobile view', () => {
     // Mock useIsMobile to return true (mobile view)
     vi.spyOn(MobileContext, 'useIsMobile').mockReturnValue(true);

     render(<ThemeSelectionOptions />);

     // Should still render 2 theme buttons
     const buttons = screen.getAllByRole('button');
     expect(buttons).toHaveLength(2);
@@ -63,16 +64,16 @@ describe('ThemeSelectionOptions', () => {
   it('calls selectTheme when a theme button is clicked', () => {
     vi.spyOn(MobileContext, 'useIsMobile').mockReturnValue(false);

     render(<ThemeSelectionOptions />);

     const buttons = screen.getAllByRole('button');
     fireEvent.click(buttons[0]); // Click the first theme button (light)

     // Verify that selectTheme was called with the correct theme name
     expect(userOptionsStore.selectTheme).toHaveBeenCalledWith('light');

     fireEvent.click(buttons[1]); // Click the second theme button (dark)
     expect(userOptionsStore.selectTheme).toHaveBeenCalledWith('dark');
   });
 });


@@ -1,22 +1,23 @@
-import { describe, it, expect } from 'vitest';
 import { render, screen } from '@testing-library/react';
-import WelcomeHomeMessage from '../WelcomeHome';
+import { describe, it, expect } from 'vitest';
+
 import { welcome_home_text, welcome_home_tip } from '../../static-data/welcome_home_text';
 import { renderMarkdown } from '../markdown/MarkdownComponent';
+import WelcomeHomeMessage from '../WelcomeHome';

 // Mock the renderMarkdown function
 vi.mock('../markdown/MarkdownComponent', () => ({
-  renderMarkdown: vi.fn((text) => `Rendered: ${text}`),
+  renderMarkdown: vi.fn(text => `Rendered: ${text}`),
 }));

 describe('WelcomeHomeMessage', () => {
   it('renders correctly when visible', () => {
     render(<WelcomeHomeMessage visible={true} />);

     // Check if the rendered markdown content is in the document
     expect(screen.getByText(`Rendered: ${welcome_home_text}`)).toBeInTheDocument();
     expect(screen.getByText(`Rendered: ${welcome_home_tip}`)).toBeInTheDocument();

     // Verify that renderMarkdown was called with the correct arguments
     expect(renderMarkdown).toHaveBeenCalledWith(welcome_home_text);
     expect(renderMarkdown).toHaveBeenCalledWith(welcome_home_tip);
@@ -24,17 +25,17 @@ describe('WelcomeHomeMessage', () => {
   it('applies animation variants based on visible prop', () => {
     const { rerender } = render(<WelcomeHomeMessage visible={true} />);

     // When visible is true, the component should have the visible animation state
     // Since we've mocked framer-motion, we can't directly test the animation state
     // But we can verify that the component renders the content
     expect(screen.getByText(`Rendered: ${welcome_home_text}`)).toBeInTheDocument();

     // Re-render with visible=false
     rerender(<WelcomeHomeMessage visible={false} />);

     // Content should still be in the document even when not visible
     // (since we've mocked the animations)
     expect(screen.getByText(`Rendered: ${welcome_home_text}`)).toBeInTheDocument();
   });
 });


@@ -1,14 +1,14 @@
-import React from "react";
-import { Grid, GridItem, Image, Text } from "@chakra-ui/react";
+import { Grid, GridItem, Image, Text } from '@chakra-ui/react';
+import React from 'react';

-const fontSize = "md";
+const fontSize = 'md';

 function AboutComponent() {
   return (
     <Grid
       templateColumns="1fr"
       gap={4}
-      maxW={["100%", "100%", "100%"]}
+      maxW={['100%', '100%', '100%']}
       mx="auto"
       className="about-container"
     >
@@ -17,22 +17,22 @@ function AboutComponent() {
           src="/me.png"
           alt="Geoff Seemueller"
           borderRadius="full"
-          boxSize={["120px", "150px"]}
+          boxSize={['120px', '150px']}
           objectFit="cover"
         />
       </GridItem>
       <GridItem
         colSpan={1}
-        maxW={["100%", "100%", "container.md"]}
+        maxW={['100%', '100%', 'container.md']}
         justifySelf="center"
-        minH={"100%"}
+        minH={'100%'}
       >
-        <Grid templateColumns="1fr" gap={4} overflowY={"auto"}>
+        <Grid templateColumns="1fr" gap={4} overflowY={'auto'}>
           <GridItem>
             <Text fontSize={fontSize}>
-              If you're interested in collaborating on innovative projects that
-              push technological boundaries and create real value, I'd be keen
-              to connect and explore potential opportunities.
+              If you're interested in collaborating on innovative projects that push technological
+              boundaries and create real value, I'd be keen to connect and explore potential
+              opportunities.
             </Text>
           </GridItem>
         </Grid>


@@ -1,30 +1,26 @@
-import React, { useEffect, useRef, useState } from "react";
-import { observer } from "mobx-react-lite";
-import { Box, Grid, GridItem } from "@chakra-ui/react";
-import ChatMessages from "./messages/ChatMessages";
-import ChatInput from "./input/ChatInput";
-import chatStore from "../../stores/ClientChatStore";
-import menuState from "../../stores/AppMenuStore";
-import WelcomeHome from "../WelcomeHome";
+import { Box, Grid, GridItem } from '@chakra-ui/react';
+import { observer } from 'mobx-react-lite';
+import React, { useEffect, useRef, useState } from 'react';
+
+import menuState from '../../stores/AppMenuStore';
+import chatStore from '../../stores/ClientChatStore';
+import WelcomeHome from '../WelcomeHome';
+
+import ChatInput from './input/ChatInput';
+import ChatMessages from './messages/ChatMessages';

 const Chat = observer(({ height, width }) => {
   const scrollRef = useRef();
   const [isAndroid, setIsAndroid] = useState(false);

   useEffect(() => {
-    if (typeof window !== "undefined") {
+    if (typeof window !== 'undefined') {
       setIsAndroid(/android/i.test(window.navigator.userAgent));
     }
   }, []);

   return (
-    <Grid
-      templateRows="1fr auto"
-      templateColumns="1fr"
-      height={height}
-      width={width}
-      gap={0}
-    >
+    <Grid templateRows="1fr auto" templateColumns="1fr" height={height} width={width} gap={0}>
       <GridItem alignSelf="center" hidden={!(chatStore.items.length < 1)}>
         <WelcomeHome visible={chatStore.items.length < 1} />
       </GridItem>
@@ -32,35 +28,20 @@ const Chat = observer(({ height, width }) => {
       <GridItem
         overflow="auto"
         width="100%"
-        maxH="100%"
+        maxH="100vh"
         ref={scrollRef}
         // If there are attachments, use "100px". Otherwise, use "128px" on Android, "73px" elsewhere.
-        pb={
-          isAndroid
-            ? "128px"
-            : "73px"
-        }
+        pb={isAndroid ? '128px' : '73px'}
         alignSelf="flex-end"
       >
         <ChatMessages scrollRef={scrollRef} />
       </GridItem>
-      <GridItem
-        position="relative"
-        bg="background.primary"
-        zIndex={1000}
-        width="100%"
-      >
-        <Box
-          w="100%"
-          display="flex"
-          justifyContent="center"
-          mx="auto"
-          hidden={menuState.isOpen}
-        >
+      <GridItem position="relative" bg="background.primary" zIndex={1000} width="100%">
+        <Box w="100%" display="flex" justifyContent="center" mx="auto" hidden={menuState.isOpen}>
           <ChatInput
             input={chatStore.input}
-            setInput={(value) => chatStore.setInput(value)}
+            setInput={value => chatStore.setInput(value)}
             handleSendMessage={chatStore.sendMessage}
             isLoading={chatStore.isLoading}
           />


@@ -1,16 +1,17 @@
-import React from "react";
-import { observer } from "mobx-react-lite";
-import clientChatStore from "../../stores/ClientChatStore";
+import { observer } from 'mobx-react-lite';
+import React from 'react';
+
+import clientChatStore from '../../stores/ClientChatStore';

 export const IntermediateStepsComponent = observer(({ hidden }) => {
   return (
     <div hidden={hidden}>
       {clientChatStore.intermediateSteps.map((step, index) => {
         switch (step.kind) {
-          case "web-search": {
+          case 'web-search': {
             return <WebSearchResult key={index} data={step.data} />;
           }
-          case "tool-result":
+          case 'tool-result':
             return <ToolResult key={index} data={step.data} />;
           default:
             return <GenericStep key={index} data={step.data} />;
@@ -45,7 +46,7 @@ export const GenericStep = ({ data }) => {
   return (
     <div className="generic-step">
       <h3>Generic Step</h3>
-      <p>{data.description || "No additional information provided."}</p>
+      <p>{data.description || 'No additional information provided.'}</p>
     </div>
   );
 };


@@ -1,5 +1,3 @@
-import React, { useRef } from "react";
-import { observer } from "mobx-react-lite";
 import {
   Box,
   Divider,
@@ -11,8 +9,10 @@ import {
   Portal,
   Text,
   useDisclosure,
-} from "@chakra-ui/react";
-import { ChevronRight } from "lucide-react";
+} from '@chakra-ui/react';
+import { ChevronRight } from 'lucide-react';
+import { observer } from 'mobx-react-lite';
+import React, { useRef } from 'react';
 
 const FlyoutSubMenu: React.FC<{
   title: string;
@@ -23,15 +23,7 @@ const FlyoutSubMenu: React.FC<{
   parentIsOpen: boolean;
   setMenuState?: (state) => void;
 }> = observer(
-  ({
-    title,
-    flyoutMenuOptions,
-    onClose,
-    handleSelect,
-    isSelected,
-    parentIsOpen,
-    setMenuState,
-  }) => {
+  ({ title, flyoutMenuOptions, onClose, handleSelect, isSelected, parentIsOpen, setMenuState }) => {
     const { isOpen, onOpen, onClose: onSubMenuClose } = useDisclosure();
     const menuRef = useRef();
@@ -41,9 +33,9 @@ const FlyoutSubMenu: React.FC<{
         placement="right-start"
         isOpen={isOpen && parentIsOpen}
         closeOnBlur={true}
-        lazyBehavior={"keepMounted"}
+        lazyBehavior={'keepMounted'}
         isLazy={true}
-        onClose={(e) => {
+        onClose={e => {
           onSubMenuClose();
         }}
         closeOnSelect={false}
@@ -54,12 +46,12 @@ const FlyoutSubMenu: React.FC<{
           ref={menuRef}
           bg="background.tertiary"
           color="text.primary"
-          _hover={{ bg: "rgba(0, 0, 0, 0.05)" }}
-          _focus={{ bg: "rgba(0, 0, 0, 0.1)" }}
+          _hover={{ bg: 'rgba(0, 0, 0, 0.05)' }}
+          _focus={{ bg: 'rgba(0, 0, 0, 0.1)' }}
         >
-          <HStack width={"100%"} justifyContent={"space-between"}>
+          <HStack width={'100%'} justifyContent={'space-between'}>
             <Text>{title}</Text>
-            <ChevronRight size={"1rem"} />
+            <ChevronRight size={'1rem'} />
           </HStack>
         </MenuButton>
         <Portal>
@@ -67,7 +59,7 @@ const FlyoutSubMenu: React.FC<{
             key={title}
             maxHeight={56}
             overflowY="scroll"
-            visibility={"visible"}
+            visibility={'visible'}
             minWidth="180px"
             bg="background.tertiary"
             boxShadow="lg"
@@ -77,43 +69,35 @@ const FlyoutSubMenu: React.FC<{
             left="100%"
             bottom={-10}
             sx={{
-              "::-webkit-scrollbar": {
-                width: "8px",
+              '::-webkit-scrollbar': {
+                width: '8px',
               },
-              "::-webkit-scrollbar-thumb": {
-                background: "background.primary",
-                borderRadius: "4px",
+              '::-webkit-scrollbar-thumb': {
+                background: 'background.primary',
+                borderRadius: '4px',
               },
-              "::-webkit-scrollbar-track": {
-                background: "background.tertiary",
+              '::-webkit-scrollbar-track': {
+                background: 'background.tertiary',
               },
             }}
           >
             {flyoutMenuOptions.map((item, index) => (
-              <Box key={"itemflybox" + index}>
+              <Box key={'itemflybox' + index}>
                 <MenuItem
-                  key={"itemfly" + index}
+                  key={'itemfly' + index}
                   onClick={() => {
                     onSubMenuClose();
                     onClose();
                     handleSelect(item);
                   }}
-                  bg={
-                    isSelected(item)
-                      ? "background.secondary"
-                      : "background.tertiary"
-                  }
-                  _hover={{ bg: "rgba(0, 0, 0, 0.05)" }}
-                  _focus={{ bg: "rgba(0, 0, 0, 0.1)" }}
+                  bg={isSelected(item) ? 'background.secondary' : 'background.tertiary'}
+                  _hover={{ bg: 'rgba(0, 0, 0, 0.05)' }}
+                  _focus={{ bg: 'rgba(0, 0, 0, 0.1)' }}
                 >
                   {item.name}
                 </MenuItem>
                 {index < flyoutMenuOptions.length - 1 && (
-                  <Divider
-                    key={item.name + "-divider"}
-                    color="text.tertiary"
-                    w={"100%"}
-                  />
+                  <Divider key={item.name + '-divider'} color="text.tertiary" w={'100%'} />
                 )}
               </Box>
             ))}

View File

@@ -1,4 +1,3 @@
-import React, { useCallback, useEffect, useRef, useState } from "react";
 import {
   Box,
   Button,
@@ -12,204 +11,180 @@ import {
   Text,
   useDisclosure,
   useOutsideClick,
-} from "@chakra-ui/react";
-import { observer } from "mobx-react-lite";
-import { ChevronDown, Copy, RefreshCcw, Settings } from "lucide-react";
-import ClientChatStore from "../../../stores/ClientChatStore";
-import clientChatStore from "../../../stores/ClientChatStore";
-import FlyoutSubMenu from "./FlyoutSubMenu";
-import { useIsMobile } from "../../contexts/MobileContext";
-import { useIsMobile as useIsMobileUserAgent } from "../../../hooks/_IsMobileHook";
-import { getModelFamily, SUPPORTED_MODELS } from "../lib/SupportedModels";
-import { formatConversationMarkdown } from "../lib/exportConversationAsMarkdown";
+} from '@chakra-ui/react';
+import { ChevronDown, Copy, RefreshCcw, Settings } from 'lucide-react';
+import { observer } from 'mobx-react-lite';
+import React, { useCallback, useEffect, useRef, useState } from 'react';
+
+import { useIsMobile as useIsMobileUserAgent } from '../../../hooks/_IsMobileHook';
+import clientChatStore from '../../../stores/ClientChatStore';
+import { useIsMobile } from '../../contexts/MobileContext';
+import { formatConversationMarkdown } from '../lib/exportConversationAsMarkdown';
+import FlyoutSubMenu from './FlyoutSubMenu';
 
 export const MsM_commonButtonStyles = {
-  bg: "transparent",
-  color: "text.primary",
-  borderRadius: "full",
+  bg: 'transparent',
+  color: 'text.primary',
+  borderRadius: 'full',
   padding: 2,
-  border: "none",
-  _hover: { bg: "rgba(255, 255, 255, 0.2)" },
-  _active: { bg: "rgba(255, 255, 255, 0.3)" },
-  _focus: { boxShadow: "none" },
+  border: 'none',
+  _hover: { bg: 'rgba(255, 255, 255, 0.2)' },
+  _active: { bg: 'rgba(255, 255, 255, 0.3)' },
+  _focus: { boxShadow: 'none' },
 };
 
-const InputMenu: React.FC<{ isDisabled?: boolean }> = observer(
-  ({ isDisabled }) => {
-    const isMobile = useIsMobile();
-    const isMobileUserAgent = useIsMobileUserAgent();
-    const {
-      isOpen,
-      onOpen,
-      onClose,
-      onToggle,
-      getDisclosureProps,
-      getButtonProps,
-    } = useDisclosure();
-    const [controlledOpen, setControlledOpen] = useState<boolean>(false);
+const InputMenu: React.FC<{ isDisabled?: boolean }> = observer(({ isDisabled }) => {
+  const isMobile = useIsMobile();
+  const isMobileUserAgent = useIsMobileUserAgent();
+  const { isOpen, onOpen, onClose, onToggle, getDisclosureProps, getButtonProps } = useDisclosure();
+  const [controlledOpen, setControlledOpen] = useState<boolean>(false);
+  const [supportedModels, setSupportedModels] = useState<any[]>([]);
 
-    useEffect(() => {
-      setControlledOpen(isOpen);
-    }, [isOpen]);
+  useEffect(() => {
+    setControlledOpen(isOpen);
+  }, [isOpen]);
 
-    const getSupportedModels = async () => {
-      // Check if fetch is available (browser environment)
-      if (typeof fetch !== 'undefined') {
-        try {
-          return await (await fetch("/api/models")).json();
-        } catch (error) {
-          console.error("Error fetching models:", error);
-          return [];
-        }
-      } else {
-        // In test environment or where fetch is not available
-        console.log("Fetch not available, using default models");
-        return [];
-      }
-    }
-
-    useEffect(() => {
-      getSupportedModels().then((supportedModels) => {
-        // Check if setSupportedModels method exists before calling it
-        if (clientChatStore.setSupportedModels) {
-          clientChatStore.setSupportedModels(supportedModels);
-        } else {
-          console.log("setSupportedModels method not available in this environment");
-        }
-      });
-    }, []);
+  useEffect(() => {
+    fetch('/api/models')
+      .then(response => response.json())
+      .then(models => {
+        setSupportedModels(models);
+      })
+      .catch(err => {
+        console.error('Could not fetch models: ', err);
+      });
+  }, []);
 
-    const handleClose = useCallback(() => {
-      onClose();
-    }, [isOpen]);
+  const handleClose = useCallback(() => {
+    onClose();
+  }, [isOpen]);
 
-    const handleCopyConversation = useCallback(() => {
-      navigator.clipboard
-        .writeText(formatConversationMarkdown(clientChatStore.items))
-        .then(() => {
-          window.alert(
-            "Conversation copied to clipboard. \n\nPaste it somewhere safe!",
-          );
-          onClose();
-        })
-        .catch((err) => {
-          console.error("Could not copy text to clipboard: ", err);
-          window.alert("Failed to copy conversation. Please try again.");
-        });
-    }, [onClose]);
+  const handleCopyConversation = useCallback(() => {
+    navigator.clipboard
+      .writeText(formatConversationMarkdown(clientChatStore.items))
+      .then(() => {
+        window.alert('Conversation copied to clipboard. \n\nPaste it somewhere safe!');
+        onClose();
+      })
+      .catch(err => {
+        console.error('Could not copy text to clipboard: ', err);
+        window.alert('Failed to copy conversation. Please try again.');
+      });
+  }, [onClose]);
 
-    async function selectModelFn({ name, value }) {
-      clientChatStore.setModel(value);
-    }
+  async function selectModelFn({ name, value }) {
+    clientChatStore.setModel(value);
+  }
 
-    function isSelectedModelFn({ name, value }) {
-      return clientChatStore.model === value;
-    }
+  function isSelectedModelFn({ name, value }) {
+    return clientChatStore.model === value;
+  }
 
-    const menuRef = useRef();
-    const [menuState, setMenuState] = useState();
+  const menuRef = useRef();
+  const [menuState, setMenuState] = useState();
 
-    useOutsideClick({
-      enabled: !isMobile && isOpen,
-      ref: menuRef,
-      handler: () => {
-        handleClose();
-      },
-    });
+  useOutsideClick({
+    enabled: !isMobile && isOpen,
+    ref: menuRef,
+    handler: () => {
+      handleClose();
+    },
+  });
 
-    return (
-      <Menu
-        isOpen={controlledOpen}
-        onClose={onClose}
-        onOpen={onOpen}
-        autoSelect={false}
-        closeOnSelect={false}
-        closeOnBlur={isOpen && !isMobileUserAgent}
-        isLazy={true}
-        lazyBehavior={"unmount"}
-      >
-        {isMobile ? (
-          <MenuButton
-            as={IconButton}
-            bg="text.accent"
-            icon={<Settings size={20} />}
-            isDisabled={isDisabled}
-            aria-label="Settings"
-            _hover={{ bg: "rgba(255, 255, 255, 0.2)" }}
-            _focus={{ boxShadow: "none" }}
-            {...MsM_commonButtonStyles}
-          />
-        ) : (
-          <MenuButton
-            as={Button}
-            rightIcon={<ChevronDown size={16} />}
-            isDisabled={isDisabled}
-            variant="ghost"
-            display="flex"
-            justifyContent="space-between"
-            alignItems="center"
-            minW="auto"
-            {...MsM_commonButtonStyles}
-          >
-            <Text noOfLines={1} maxW="100px" fontSize="sm">
-              {clientChatStore.model}
-            </Text>
-          </MenuButton>
-        )}
-        <MenuList
-          bg="background.tertiary"
-          border="none"
-          borderRadius="md"
-          boxShadow="lg"
-          minW={"10rem"}
-          ref={menuRef}
-        >
-          <FlyoutSubMenu
-            title="Text Models"
-            flyoutMenuOptions={clientChatStore.supportedModels.map((m) => ({ name: m, value: m }))}
-            onClose={onClose}
-            parentIsOpen={isOpen}
-            setMenuState={setMenuState}
-            handleSelect={selectModelFn}
-            isSelected={isSelectedModelFn}
-          />
-          <Divider color="text.tertiary" />
-          {/*Export conversation button*/}
-          <MenuItem
-            bg="background.tertiary"
-            color="text.primary"
-            onClick={handleCopyConversation}
-            _hover={{ bg: "rgba(0, 0, 0, 0.05)" }}
-            _focus={{ bg: "rgba(0, 0, 0, 0.1)" }}
-          >
-            <Flex align="center">
-              <Copy size="16px" style={{ marginRight: "8px" }} />
-              <Box>Export</Box>
-            </Flex>
-          </MenuItem>
-          {/*New conversation button*/}
-          <MenuItem
-            bg="background.tertiary"
-            color="text.primary"
-            onClick={() => {
-              clientChatStore.setActiveConversation("conversation:new");
-              onClose();
-            }}
-            _hover={{ bg: "rgba(0, 0, 0, 0.05)" }}
-            _focus={{ bg: "rgba(0, 0, 0, 0.1)" }}
-          >
-            <Flex align="center">
-              <RefreshCcw size="16px" style={{ marginRight: "8px" }} />
-              <Box>New</Box>
-            </Flex>
-          </MenuItem>
-        </MenuList>
-      </Menu>
-    );
-  },
-);
+  return (
+    <Menu
+      isOpen={controlledOpen}
+      onClose={onClose}
+      onOpen={onOpen}
+      autoSelect={false}
+      closeOnSelect={false}
+      closeOnBlur={isOpen && !isMobileUserAgent}
+      isLazy={true}
+      lazyBehavior={'unmount'}
+    >
+      {isMobile ? (
+        <MenuButton
+          as={IconButton}
+          bg="text.accent"
+          icon={<Settings size={20} />}
+          isDisabled={isDisabled}
+          aria-label="Settings"
+          _hover={{ bg: 'rgba(255, 255, 255, 0.2)' }}
+          _focus={{ boxShadow: 'none' }}
+          {...MsM_commonButtonStyles}
+        />
+      ) : (
+        <MenuButton
+          as={Button}
+          rightIcon={<ChevronDown size={16} />}
+          isDisabled={isDisabled}
+          variant="ghost"
+          display="flex"
+          justifyContent="space-between"
+          alignItems="center"
+          minW="auto"
+          {...MsM_commonButtonStyles}
+        >
+          <Text noOfLines={1} maxW="100px" fontSize="sm">
+            {clientChatStore.model}
+          </Text>
+        </MenuButton>
+      )}
+      <MenuList
+        bg="background.tertiary"
+        border="none"
+        borderRadius="md"
+        boxShadow="lg"
+        minW={'10rem'}
+        ref={menuRef}
+      >
+        <FlyoutSubMenu
+          title="Text Models"
+          flyoutMenuOptions={supportedModels.map(modelData => ({
+            name: modelData.id.split('/').pop() || modelData.id,
+            value: modelData.id,
+          }))}
+          onClose={onClose}
+          parentIsOpen={isOpen}
+          setMenuState={setMenuState}
+          handleSelect={selectModelFn}
+          isSelected={isSelectedModelFn}
+        />
+        <Divider color="text.tertiary" />
+        {/*Export conversation button*/}
+        <MenuItem
+          bg="background.tertiary"
+          color="text.primary"
+          onClick={handleCopyConversation}
+          _hover={{ bg: 'rgba(0, 0, 0, 0.05)' }}
+          _focus={{ bg: 'rgba(0, 0, 0, 0.1)' }}
+        >
+          <Flex align="center">
+            <Copy size="16px" style={{ marginRight: '8px' }} />
+            <Box>Export</Box>
+          </Flex>
+        </MenuItem>
+        {/*New conversation button*/}
+        <MenuItem
+          bg="background.tertiary"
+          color="text.primary"
+          onClick={() => {
+            clientChatStore.reset();
+            onClose();
+          }}
+          _hover={{ bg: 'rgba(0, 0, 0, 0.05)' }}
+          _focus={{ bg: 'rgba(0, 0, 0, 0.1)' }}
+        >
+          <Flex align="center">
+            <RefreshCcw size="16px" style={{ marginRight: '8px' }} />
+            <Box>New</Box>
+          </Flex>
+        </MenuItem>
+      </MenuList>
+    </Menu>
+  );
+});
 
 export default InputMenu;
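The new `flyoutMenuOptions` mapping derives a short display label from each model ID by taking the segment after the last slash. As a standalone sketch of that transform (the record shape with an `id` field is taken from the diff; the helper name is mine):

```javascript
// Build menu options from model records as returned by the /api/models endpoint.
// Assumes each record carries an `id` such as "meta-llama/llama-4-scout-17b-16e-instruct".
function toMenuOptions(models) {
  return models.map(modelData => ({
    // Short label: the part after the last "/", falling back to the full id.
    name: modelData.id.split('/').pop() || modelData.id,
    value: modelData.id,
  }));
}

const opts = toMenuOptions([
  { id: 'meta-llama/llama-4-scout-17b-16e-instruct' },
  { id: 'gemma2-9b-it' },
]);
// opts[0].name === 'llama-4-scout-17b-16e-instruct'; opts[1].name === 'gemma2-9b-it'
```

The `|| modelData.id` fallback covers IDs with no slash (and the degenerate case of an ID ending in `/`, where `pop()` returns an empty string).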

View File

@@ -1,34 +1,28 @@
-import React, { useEffect, useRef, useState } from "react";
-import {
-  Box,
-  Button,
-  Grid,
-  GridItem,
-  useBreakpointValue,
-} from "@chakra-ui/react";
-import { observer } from "mobx-react-lite";
-import chatStore from "../../../stores/ClientChatStore";
-import InputMenu from "../input-menu/InputMenu";
-import InputTextarea from "./ChatInputTextArea";
-import SendButton from "./ChatInputSendButton";
-import { useMaxWidth } from "../../../hooks/useMaxWidth";
-import userOptionsStore from "../../../stores/UserOptionsStore";
+import { Box, Button, Grid, GridItem, useBreakpointValue } from '@chakra-ui/react';
+import { observer } from 'mobx-react-lite';
+import React, { useEffect, useRef, useState } from 'react';
+
+import { useMaxWidth } from '../../../hooks/useMaxWidth';
+import chatStore from '../../../stores/ClientChatStore';
+import userOptionsStore from '../../../stores/UserOptionsStore';
+import InputMenu from '../input-menu/InputMenu';
+
+import SendButton from './ChatInputSendButton';
+import InputTextarea from './ChatInputTextArea';
 
 const ChatInput = observer(() => {
   const inputRef = useRef<HTMLTextAreaElement>(null);
   const containerRef = useRef<HTMLDivElement>(null);
   const maxWidth = useMaxWidth();
-  const [inputValue, setInputValue] = useState<string>("");
+  const [inputValue, setInputValue] = useState<string>('');
   const [containerHeight, setContainerHeight] = useState(56);
   const [containerBorderRadius, setContainerBorderRadius] = useState(9999);
-  const [shouldFollow, setShouldFollow] = useState<boolean>(
-    userOptionsStore.followModeEnabled,
-  );
+  const [shouldFollow, setShouldFollow] = useState<boolean>(userOptionsStore.followModeEnabled);
   const [couldFollow, setCouldFollow] = useState<boolean>(chatStore.isLoading);
-  const [inputWidth, setInputWidth] = useState<string>("50%");
+  const [inputWidth, setInputWidth] = useState<string>('40%');
 
   useEffect(() => {
     setShouldFollow(chatStore.isLoading && userOptionsStore.followModeEnabled);
@@ -42,8 +36,8 @@ const ChatInput = observer(() => {
   useEffect(() => {
     if (containerRef.current) {
-      const observer = new ResizeObserver((entries) => {
-        for (let entry of entries) {
+      const observer = new ResizeObserver(entries => {
+        for (const entry of entries) {
           const newHeight = entry.target.clientHeight;
           setContainerHeight(newHeight);
@@ -63,27 +57,25 @@ const ChatInput = observer(() => {
   };
 
   const handleKeyDown = (e: React.KeyboardEvent<HTMLTextAreaElement>) => {
-    if (e.key === "Enter" && !e.shiftKey) {
+    if (e.key === 'Enter' && !e.shiftKey) {
       e.preventDefault();
       chatStore.sendMessage();
     }
   };
 
   const inputMaxWidth = useBreakpointValue(
-    { base: "50rem", lg: "50rem", md: "80%", sm: "100vw" },
+    { base: '30rem', lg: '50rem', md: '80%', sm: '100vw' },
     { ssr: true },
   );
-  const inputMinWidth = useBreakpointValue({ lg: "40rem" }, { ssr: true });
+  const inputMinWidth = useBreakpointValue({ lg: '40rem', md: '30rem' }, { ssr: true });
 
   useEffect(() => {
-    setInputWidth("100%");
+    setInputWidth('100%');
   }, [inputMaxWidth, inputMinWidth]);
 
   return (
     <Box
-      width={inputWidth}
-      maxW={inputMaxWidth}
-      minWidth={inputMinWidth}
+      width={inputMinWidth}
       mx="auto"
       p={2}
       pl={2}
@@ -105,12 +97,12 @@ const ChatInput = observer(() => {
             size="sm"
             variant="ghost"
             colorScheme="blue"
-            onClick={(_) => {
+            onClick={_ => {
               userOptionsStore.toggleFollowMode();
             }}
             isDisabled={!chatStore.isLoading}
           >
-            {shouldFollow ? "Disable Follow Mode" : "Enable Follow Mode"}
+            {shouldFollow ? 'Disable Follow Mode' : 'Enable Follow Mode'}
           </Button>
         </Box>
       )}
@@ -123,7 +115,7 @@ const ChatInput = observer(() => {
         gap={2}
         alignItems="center"
         style={{
-          transition: "border-radius 0.2s ease",
+          transition: 'border-radius 0.2s ease',
         }}
       >
         <GridItem>
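The `handleKeyDown` handler above follows the usual chat convention: plain Enter submits, Shift+Enter inserts a newline. The decision can be isolated as a pure predicate (the function name is mine, not from the codebase):

```javascript
// True when a keystroke should submit the message instead of inserting a newline.
function shouldSendMessage(event) {
  return event.key === 'Enter' && !event.shiftKey;
}

// shouldSendMessage({ key: 'Enter', shiftKey: false }) → true
// shouldSendMessage({ key: 'Enter', shiftKey: true })  → false
```

Factoring the condition out this way also makes it trivial to unit-test without rendering the component.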

View File

@@ -1,9 +1,9 @@
-import React from "react";
-import { Button } from "@chakra-ui/react";
-import clientChatStore from "../../../stores/ClientChatStore";
-import { CirclePause, Send } from "lucide-react";
-import { motion } from "framer-motion";
+import { Button } from '@chakra-ui/react';
+import { motion } from 'framer-motion';
+import { CirclePause, Send } from 'lucide-react';
+import React from 'react';
+
+import clientChatStore from '../../../stores/ClientChatStore';
 
 interface SendButtonProps {
   isLoading: boolean;
@@ -13,25 +13,20 @@ interface SendButtonProps {
 }
 
 const SendButton: React.FC<SendButtonProps> = ({ onClick }) => {
-  const isDisabled =
-    clientChatStore.input.trim().length === 0 && !clientChatStore.isLoading;
+  const isDisabled = clientChatStore.input.trim().length === 0 && !clientChatStore.isLoading;
 
   return (
     <Button
-      onClick={(e) =>
-        clientChatStore.isLoading
-          ? clientChatStore.stopIncomingMessage()
-          : onClick(e)
+      onClick={e =>
+        clientChatStore.isLoading ? clientChatStore.stopIncomingMessage() : onClick(e)
       }
       bg="transparent"
-      color={
-        clientChatStore.input.trim().length <= 1 ? "brand.700" : "text.primary"
-      }
+      color={clientChatStore.input.trim().length <= 1 ? 'brand.700' : 'text.primary'}
       borderRadius="full"
       p={2}
       isDisabled={isDisabled}
-      _hover={{ bg: !isDisabled ? "rgba(255, 255, 255, 0.2)" : "inherit" }}
-      _active={{ bg: !isDisabled ? "rgba(255, 255, 255, 0.3)" : "inherit" }}
-      _focus={{ boxShadow: "none" }}
+      _hover={{ bg: !isDisabled ? 'rgba(255, 255, 255, 0.2)' : 'inherit' }}
+      _active={{ bg: !isDisabled ? 'rgba(255, 255, 255, 0.3)' : 'inherit' }}
+      _focus={{ boxShadow: 'none' }}
     >
       {clientChatStore.isLoading ? <MySpinner /> : <Send size={20} />}
     </Button>
@@ -45,10 +40,10 @@ const MySpinner = ({ onClick }) => (
     exit={{ opacity: 0, scale: 0.9 }}
     transition={{
       duration: 0.4,
-      ease: "easeInOut",
+      ease: 'easeInOut',
     }}
   >
-    <CirclePause color={"#F0F0F0"} size={24} onClick={onClick} />
+    <CirclePause color={'#F0F0F0'} size={24} onClick={onClick} />
   </motion.div>
 );
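The send button's `isDisabled` expression encodes a dual-purpose control: while a response is streaming, the button becomes a stop button and must stay enabled even with an empty input. Extracted as a pure function (a sketch; the name is mine):

```javascript
// Disabled only when there is nothing to send AND nothing to stop.
function sendButtonDisabled(input, isLoading) {
  return input.trim().length === 0 && !isLoading;
}

// sendButtonDisabled('', false)  → true  (idle, empty input)
// sendButtonDisabled('', true)   → false (streaming: acts as a stop button)
// sendButtonDisabled('hi', false) → false (text ready to send)
```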

View File

@@ -1,7 +1,7 @@
-import React, {useEffect, useRef, useState} from "react";
-import {observer} from "mobx-react-lite";
-import {Box, chakra, InputGroup,} from "@chakra-ui/react";
-import AutoResize from "react-textarea-autosize";
+import { Box, chakra, InputGroup, useBreakpointValue } from '@chakra-ui/react';
+import { observer } from 'mobx-react-lite';
+import React, { useEffect, useRef, useState } from 'react';
+import AutoResize from 'react-textarea-autosize';
 
 const AutoResizeTextArea = chakra(AutoResize);
@@ -15,14 +15,11 @@ interface InputTextAreaProps {
 const InputTextArea: React.FC<InputTextAreaProps> = observer(
   ({ inputRef, value, onChange, onKeyDown, isLoading }) => {
-    const [heightConstraint, setHeightConstraint] = useState<
-      number | undefined
-    >(10);
+    const [heightConstraint, setHeightConstraint] = useState<number | undefined>(10);
 
     useEffect(() => {
       if (value.length > 10) {
-        setHeightConstraint();
+        setHeightConstraint(parseInt(value));
       }
     }, [value]);
@@ -34,7 +31,6 @@ const InputTextArea: React.FC<InputTextAreaProps> = observer(
         display="flex"
         flexDirection="column"
       >
-        {/* Input Area */}
         <InputGroup position="relative">
           <AutoResizeTextArea
@@ -42,8 +38,9 @@ const InputTextArea: React.FC<InputTextAreaProps> = observer(
             ref={inputRef}
             value={value}
             height={heightConstraint}
+            maxH={heightConstraint}
             autoFocus
-            onChange={(e) => onChange(e.target.value)}
+            onChange={e => onChange(e.target.value)}
             onKeyDown={onKeyDown}
             p={2}
             pr="8px"
@@ -53,19 +50,25 @@ const InputTextArea: React.FC<InputTextAreaProps> = observer(
             borderRadius="20px"
             border="none"
             placeholder="Free my mind..."
-            _placeholder={{ color: "gray.400" }}
+            _placeholder={{
+              color: 'gray.400',
+              textWrap: 'nowrap',
+              textOverflow: 'ellipsis',
+              overflow: 'hidden',
+              width: '90%',
+            }}
             _focus={{
-              outline: "none",
+              outline: 'none',
             }}
             disabled={isLoading}
             minRows={1}
             maxRows={12}
             style={{
-              touchAction: "none",
-              resize: "none",
-              overflowY: "auto",
-              width: "100%",
-              transition: "height 0.2s ease-in-out",
+              touchAction: 'none',
+              resize: 'none',
+              overflowY: 'auto',
+              width: '100%',
+              transition: 'height 0.2s ease-in-out',
             }}
           />
         </InputGroup>

View File

@@ -1,9 +1,10 @@
-import { describe, it, expect, vi, beforeEach } from 'vitest';
 import { render, screen, fireEvent } from '@testing-library/react';
 import React from 'react';
-import ChatInput from '../ChatInput';
-import userOptionsStore from '../../../../stores/UserOptionsStore';
+import { describe, it, expect, vi, beforeEach } from 'vitest';
+
 import chatStore from '../../../../stores/ClientChatStore';
+import userOptionsStore from '../../../../stores/UserOptionsStore';
+import ChatInput from '../ChatInput';
 
 // Mock browser APIs
 class MockResizeObserver {
@@ -85,7 +86,7 @@ vi.mock('./ChatInputTextArea', () => ({
       aria-label="Chat input"
       ref={inputRef}
       value={value}
-      onChange={(e) => onChange(e.target.value)}
+      onChange={e => onChange(e.target.value)}
       onKeyDown={onKeyDown}
       disabled={isLoading}
     />

View File

@@ -8,16 +8,16 @@ const SUPPORTED_MODELS_GROUPS = {
   groq: [
     // "mixtral-8x7b-32768",
     // "deepseek-r1-distill-llama-70b",
-    "meta-llama/llama-4-scout-17b-16e-instruct",
-    "gemma2-9b-it",
-    "mistral-saba-24b",
+    'meta-llama/llama-4-scout-17b-16e-instruct',
+    'gemma2-9b-it',
+    'mistral-saba-24b',
     // "qwen-2.5-32b",
-    "llama-3.3-70b-versatile",
+    'llama-3.3-70b-versatile',
     // "llama-3.3-70b-versatile"
     // "llama-3.1-70b-versatile",
     // "llama-3.3-70b-versatile"
   ],
-  cerebras: ["llama-3.3-70b"],
+  cerebras: ['llama-3.3-70b'],
   claude: [
     // "claude-3-5-sonnet-20241022",
     // "claude-3-opus-20240229"
@@ -44,34 +44,34 @@ const SUPPORTED_MODELS_GROUPS = {
     // "grok-beta"
   ],
   cloudflareAI: [
-    "llama-3.2-3b-instruct", // max_tokens
-    "llama-3-8b-instruct", // max_tokens
-    "llama-3.1-8b-instruct-fast", // max_tokens
-    "deepseek-math-7b-instruct",
-    "deepseek-coder-6.7b-instruct-awq",
-    "hermes-2-pro-mistral-7b",
-    "openhermes-2.5-mistral-7b-awq",
-    "mistral-7b-instruct-v0.2",
-    "neural-chat-7b-v3-1-awq",
-    "openchat-3.5-0106",
+    'llama-3.2-3b-instruct', // max_tokens
+    'llama-3-8b-instruct', // max_tokens
+    'llama-3.1-8b-instruct-fast', // max_tokens
+    'deepseek-math-7b-instruct',
+    'deepseek-coder-6.7b-instruct-awq',
+    'hermes-2-pro-mistral-7b',
+    'openhermes-2.5-mistral-7b-awq',
+    'mistral-7b-instruct-v0.2',
+    'neural-chat-7b-v3-1-awq',
+    'openchat-3.5-0106',
     // "gemma-7b-it",
   ],
 };
 
 export type SupportedModel =
   | keyof typeof SUPPORTED_MODELS_GROUPS
   | (typeof SUPPORTED_MODELS_GROUPS)[keyof typeof SUPPORTED_MODELS_GROUPS][number];
 
 export type ModelFamily = keyof typeof SUPPORTED_MODELS_GROUPS;
 
 function getModelFamily(model: string): ModelFamily | undefined {
   return Object.keys(SUPPORTED_MODELS_GROUPS)
-    .filter((family) => {
-      return SUPPORTED_MODELS_GROUPS[
-        family as keyof typeof SUPPORTED_MODELS_GROUPS
-      ].includes(model.trim());
+    .filter(family => {
+      return SUPPORTED_MODELS_GROUPS[family as keyof typeof SUPPORTED_MODELS_GROUPS].includes(
+        model.trim(),
+      );
     })
     .at(0) as ModelFamily | undefined;
 }
 
 const SUPPORTED_MODELS = [
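`getModelFamily` reverse-maps a model ID to its provider group by scanning group membership and taking the first match. A minimal standalone sketch of the same lookup, with an abridged group table standing in for `SUPPORTED_MODELS_GROUPS`:

```javascript
// Abridged copy of the provider-group table, for illustration only.
const GROUPS = {
  groq: ['gemma2-9b-it', 'mistral-saba-24b'],
  cerebras: ['llama-3.3-70b'],
};

// Returns the first group whose model list contains the (trimmed) id,
// or undefined when no group matches.
function getModelFamily(model) {
  return Object.keys(GROUPS)
    .filter(family => GROUPS[family].includes(model.trim()))
    .at(0);
}

// getModelFamily(' gemma2-9b-it ') → 'groq'
// getModelFamily('unknown-model')  → undefined
```

Note that `Array.prototype.at(0)` on an empty array yields `undefined`, which is what gives the function its optional return type in the real code.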

View File

@@ -1,30 +1,30 @@
-import DOMPurify from "isomorphic-dompurify";
+import DOMPurify from 'isomorphic-dompurify';
 
 function domPurify(dirty: string) {
   return DOMPurify.sanitize(dirty, {
     USE_PROFILES: { html: true },
     ALLOWED_TAGS: [
-      "b",
-      "i",
-      "u",
-      "a",
-      "p",
-      "span",
-      "div",
-      "table",
-      "thead",
-      "tbody",
-      "tr",
-      "td",
-      "th",
-      "ul",
-      "ol",
-      "li",
-      "code",
-      "pre",
+      'b',
+      'i',
+      'u',
+      'a',
+      'p',
+      'span',
+      'div',
+      'table',
+      'thead',
+      'tbody',
+      'tr',
+      'td',
+      'th',
+      'ul',
+      'ol',
+      'li',
+      'code',
+      'pre',
     ],
-    ALLOWED_ATTR: ["href", "src", "alt", "title", "class", "style"],
-    FORBID_TAGS: ["script", "iframe"],
+    ALLOWED_ATTR: ['href', 'src', 'alt', 'title', 'class', 'style'],
+    FORBID_TAGS: ['script', 'iframe'],
     KEEP_CONTENT: true,
     SAFE_FOR_TEMPLATES: true,
   });
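The sanitizer config above is allowlist-based: a tag survives only if it appears in `ALLOWED_TAGS`, and `FORBID_TAGS` acts as an explicit denylist on top (DOMPurify applies these checks internally). A toy sketch of that precedence, not the library's actual implementation:

```javascript
// Hypothetical illustration of allowlist-plus-denylist precedence.
const ALLOWED_TAGS = ['b', 'i', 'u', 'a', 'p', 'code', 'pre'];
const FORBID_TAGS = ['script', 'iframe'];

function tagIsAllowed(tag) {
  // Forbidden tags lose even if someone later adds them to the allowlist.
  return ALLOWED_TAGS.includes(tag) && !FORBID_TAGS.includes(tag);
}
```

Listing `script`/`iframe` in `FORBID_TAGS` is belt-and-suspenders here, since they are not in `ALLOWED_TAGS` anyway; it guards against the allowlist being widened later.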

View File

@@ -1,18 +1,17 @@
 // Function to generate a Markdown representation of the current conversation
-import { type IMessage } from "../../../stores/ClientChatStore";
-import { Instance } from "mobx-state-tree";
+import { type Instance } from 'mobx-state-tree';
+
+import { type IMessage } from '../../../stores/ClientChatStore';
 
-export function formatConversationMarkdown(
-  messages: Instance<typeof IMessage>[],
-): string {
+export function formatConversationMarkdown(messages: Instance<typeof IMessage>[]): string {
   return messages
-    .map((message) => {
-      if (message.role === "user") {
+    .map(message => {
+      if (message.role === 'user') {
         return `**You**: ${message.content}`;
-      } else if (message.role === "assistant") {
-        return `**Geoff's AI**: ${message.content}`;
+      } else if (message.role === 'assistant') {
+        return `**open-gsio**: ${message.content}`;
       }
-      return "";
+      return '';
     })
-    .join("\n\n");
+    .join('\n\n');
 }
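The export helper above is a pure transform over the message list, so its behavior is easy to pin down outside the store. A framework-free sketch, with plain objects standing in for the mobx-state-tree message instances:

```javascript
// Render a conversation as Markdown: user turns as **You**,
// assistant turns as **open-gsio**, anything else dropped.
function formatConversationMarkdown(messages) {
  return messages
    .map(message => {
      if (message.role === 'user') {
        return `**You**: ${message.content}`;
      } else if (message.role === 'assistant') {
        return `**open-gsio**: ${message.content}`;
      }
      return ''; // e.g. system messages are excluded from the export
    })
    .join('\n\n');
}

// formatConversationMarkdown([{ role: 'user', content: 'hi' }])
// → '**You**: hi'
```

One quirk worth noting: excluded roles still contribute an empty string to the `join`, so a conversation containing system messages will have extra blank separators in the output.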

View File

@@ -1,6 +1,6 @@
-import React from "react";
-import MessageMarkdownRenderer from "./MessageMarkdownRenderer";
+import React from 'react';
+import MessageMarkdownRenderer from './MessageMarkdownRenderer';
 
 const ChatMessageContent = ({ content }) => {
   return <MessageMarkdownRenderer markdown={content} />;

View File

@@ -1,9 +1,11 @@
-import React from "react";
-import {Box, Grid, GridItem} from "@chakra-ui/react";
-import MessageBubble from "./MessageBubble";
-import {observer} from "mobx-react-lite";
-import chatStore from "../../../stores/ClientChatStore";
-import {useIsMobile} from "../../contexts/MobileContext";
+import { Box, Grid, GridItem } from '@chakra-ui/react';
+import { observer } from 'mobx-react-lite';
+import React from 'react';
+
+import chatStore from '../../../stores/ClientChatStore';
+import { useIsMobile } from '../../contexts/MobileContext';
+import MessageBubble from './MessageBubble';
 
 interface ChatMessagesProps {
   scrollRef: React.RefObject<HTMLDivElement>;
@@ -13,11 +15,7 @@ const ChatMessages: React.FC<ChatMessagesProps> = observer(({ scrollRef }) => {
   const isMobile = useIsMobile();
 
   return (
-    <Box
-      pt={isMobile ? 24 : undefined}
-      overflowY={"scroll"}
-      overflowX={"hidden"}
-    >
+    <Box pt={isMobile ? 24 : undefined} overflowY={'scroll'} overflowX={'hidden'}>
       <Grid
         fontFamily="Arial, sans-serif"
         templateColumns="1fr"

View File

@@ -1,43 +1,43 @@
import React, { useEffect, useRef, useState } from "react"; import { Box, Flex, Text } from '@chakra-ui/react';
import { Box, Flex, Text } from "@chakra-ui/react"; import { observer } from 'mobx-react-lite';
import MessageRenderer from "./ChatMessageContent"; import React, { useEffect, useRef, useState } from 'react';
import { observer } from "mobx-react-lite";
import MessageEditor from "./MessageEditorComponent";
import UserMessageTools from "./UserMessageTools";
import clientChatStore from "../../../stores/ClientChatStore";
import UserOptionsStore from "../../../stores/UserOptionsStore";
import MotionBox from "./MotionBox";
import clientChatStore from '../../../stores/ClientChatStore';
import UserOptionsStore from '../../../stores/UserOptionsStore';
import MessageRenderer from './ChatMessageContent';
import MessageEditor from './MessageEditorComponent';
import MotionBox from './MotionBox';
import UserMessageTools from './UserMessageTools';
const LoadingDots = () => { const LoadingDots = () => {
return ( return (
<Flex> <Flex>
{[0, 1, 2].map((i) => ( {[0, 1, 2].map(i => (
<MotionBox <MotionBox
key={i} key={i}
width="8px" width="8px"
height="8px" height="8px"
borderRadius="50%" borderRadius="50%"
backgroundColor="text.primary" backgroundColor="text.primary"
margin="0 4px" margin="0 4px"
animate={{ animate={{
scale: [1, 1.2, 1], scale: [1, 1.2, 1],
opacity: [0.5, 1, 0.5], opacity: [0.5, 1, 0.5],
}} }}
transition={{ transition={{
duration: 1, duration: 1,
repeat: Infinity, repeat: Infinity,
delay: i * 0.2, delay: i * 0.2,
}} }}
/> />
))} ))}
</Flex> </Flex>
); );
} };
function renderMessage(msg: any) { function renderMessage(msg: any) {
if (msg.role === "user") { if (msg.role === 'user') {
return ( return (
<Text as="p" fontSize="sm" lineHeight="short" color="text.primary"> <Text as="p" fontSize="sm" lineHeight="short" color="text.primary">
{msg.content} {msg.content}
@@ -50,8 +50,8 @@ function renderMessage(msg: any) {
const MessageBubble = observer(({ msg, scrollRef }) => { const MessageBubble = observer(({ msg, scrollRef }) => {
const [isEditing, setIsEditing] = useState(false); const [isEditing, setIsEditing] = useState(false);
const [isHovered, setIsHovered] = useState(false); const [isHovered, setIsHovered] = useState(false);
const isUser = msg.role === "user"; const isUser = msg.role === 'user';
const senderName = isUser ? "You" : "Geoff's AI"; const senderName = isUser ? 'You' : 'open-gsio';
const isLoading = !msg.content || !(msg.content.trim().length > 0); const isLoading = !msg.content || !(msg.content.trim().length > 0);
const messageRef = useRef(); const messageRef = useRef();
@@ -64,10 +64,15 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
}; };
useEffect(() => { useEffect(() => {
if (clientChatStore.items.length > 0 && clientChatStore.isLoading && UserOptionsStore.followModeEnabled) { // Refine condition if (
clientChatStore.items.length > 0 &&
clientChatStore.isLoading &&
UserOptionsStore.followModeEnabled
) {
// Refine condition
scrollRef.current?.scrollTo({ scrollRef.current?.scrollTo({
top: scrollRef.current.scrollHeight, top: scrollRef.current.scrollHeight,
behavior: "auto", behavior: 'auto',
}); });
} }
}); });
@@ -75,7 +80,7 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
  return (
    <Flex
      flexDirection="column"
      alignItems={isUser ? 'flex-end' : 'flex-start'}
      role="listitem"
      flex={0}
      aria-label={`Message from ${senderName}`}
@@ -85,19 +90,19 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
      <Text
        fontSize="xs"
        color="text.tertiary"
        textAlign={isUser ? 'right' : 'left'}
        alignSelf={isUser ? 'flex-end' : 'flex-start'}
        mb={1}
      >
        {senderName}
      </Text>
      <MotionBox
        minW={{ base: '99%', sm: '99%', lg: isUser ? '55%' : '60%' }}
        maxW={{ base: '99%', sm: '99%', lg: isUser ? '65%' : '65%' }}
        p={3}
        borderRadius="1.5em"
        bg={isUser ? '#0A84FF' : '#3A3A3C'}
        color="text.primary"
        textAlign="left"
        boxShadow="0 2px 4px rgba(0, 0, 0, 0.1)"
@@ -115,10 +120,10 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
        whiteSpace="pre-wrap"
        ref={messageRef}
        sx={{
          'pre, code': {
            maxWidth: '100%',
            whiteSpace: 'pre-wrap',
            overflowX: 'auto',
          },
        }}
      >
@@ -139,9 +144,7 @@ const MessageBubble = observer(({ msg, scrollRef }) => {
          justifyContent="center"
          alignItems="center"
        >
          {isHovered && !isEditing && <UserMessageTools message={msg} onEdit={handleEdit} />}
        </Box>
      )}
    </Flex>


@@ -1,10 +1,11 @@
import { Box, Flex, IconButton, Textarea } from '@chakra-ui/react';
import { Check, X } from 'lucide-react';
import { observer } from 'mobx-react-lite';
import { type Instance } from 'mobx-state-tree';
import React, { type KeyboardEvent, useEffect } from 'react';

import Message from '../../../models/Message';
import messageEditorStore from '../../../stores/MessageEditorStore';

interface MessageEditorProps {
  message: Instance<typeof Message>;
@@ -30,15 +31,13 @@ const MessageEditor = observer(({ message, onCancel }: MessageEditorProps) => {
    onCancel();
  };

  const handleKeyDown = (e: KeyboardEvent<HTMLTextAreaElement>) => {
    if (e.key === 'Enter' && (e.metaKey || e.ctrlKey)) {
      e.preventDefault();
      handleSave();
    }
    if (e.key === 'Escape') {
      e.preventDefault();
      handleCancel();
    }
@@ -48,14 +47,14 @@ const MessageEditor = observer(({ message, onCancel }: MessageEditorProps) => {
    <Box width="100%">
      <Textarea
        value={messageEditorStore.editedContent}
        onChange={e => messageEditorStore.setEditedContent(e.target.value)}
        onKeyDown={handleKeyDown}
        minHeight="100px"
        bg="transparent"
        border="1px solid"
        borderColor="whiteAlpha.300"
        _hover={{ borderColor: 'whiteAlpha.400' }}
        _focus={{ borderColor: 'brand.100', boxShadow: 'none' }}
        resize="vertical"
        color="text.primary"
      />
@@ -66,7 +65,7 @@ const MessageEditor = observer(({ message, onCancel }: MessageEditorProps) => {
          onClick={handleCancel}
          size="sm"
          variant="ghost"
          color={'accent.danger'}
        />
        <IconButton
          aria-label="Save edit"
@@ -74,7 +73,7 @@ const MessageEditor = observer(({ message, onCancel }: MessageEditorProps) => {
          onClick={handleSave}
          size="sm"
          variant="ghost"
          color={'accent.confirm'}
        />
      </Flex>
    </Box>


@@ -1,5 +1,3 @@
import {
  Box,
  Code,
@@ -17,13 +15,15 @@ import {
  Thead,
  Tr,
  useColorModeValue,
} from '@chakra-ui/react';
import katex from 'katex';
import { marked } from 'marked';
import markedKatex from 'marked-katex-extension';
import React from 'react';

import CodeBlock from '../../code/CodeBlock';
import ImageWithFallback from '../../markdown/ImageWithFallback';
import domPurify from '../lib/domPurify';

try {
  if (localStorage) {
@@ -34,11 +34,13 @@ try {
        throwOnError: false,
        strict: true,
        colorIsTextColor: true,
        errorColor: 'red',
      }),
    );
  }
} catch (_) {
  // Silently ignore errors in marked setup - fallback to default behavior
}

const MemoizedCodeBlock = React.memo(CodeBlock);
@@ -49,32 +51,29 @@ const MemoizedCodeBlock = React.memo(CodeBlock);
const getHeadingProps = (depth: number) => {
  switch (depth) {
    case 1:
      return { as: 'h1', size: 'xl', mt: 4, mb: 2 };
    case 2:
      return { as: 'h2', size: 'lg', mt: 3, mb: 2 };
    case 3:
      return { as: 'h3', size: 'md', mt: 2, mb: 1 };
    case 4:
      return { as: 'h4', size: 'sm', mt: 2, mb: 1 };
    case 5:
      return { as: 'h5', size: 'sm', mt: 2, mb: 1 };
    case 6:
      return { as: 'h6', size: 'xs', mt: 2, mb: 1 };
    default:
      return { as: `h${depth}`, size: 'md', mt: 2, mb: 1 };
  }
};

interface TableToken extends marked.Tokens.Table {
  align: Array<'center' | 'left' | 'right' | null>;
  header: (string | marked.Tokens.TableCell)[];
  rows: (string | marked.Tokens.TableCell)[][];
}

const CustomHeading: React.FC<{ text: string; depth: number }> = ({ text, depth }) => {
  const headingProps = getHeadingProps(depth);
  return (
    <Heading {...headingProps} wordBreak="break-word" maxWidth="100%">
@@ -83,9 +82,7 @@ const CustomHeading: React.FC<{ text: string; depth: number }> = ({
  );
};

const CustomParagraph: React.FC<{ children: React.ReactNode }> = ({ children }) => {
  return (
    <Text
      as="p"
@@ -100,9 +97,7 @@ const CustomParagraph: React.FC<{ children: React.ReactNode }> = ({
  );
};

const CustomBlockquote: React.FC<{ children: React.ReactNode }> = ({ children }) => {
  return (
    <Box
      as="blockquote"
@@ -120,16 +115,9 @@ const CustomBlockquote: React.FC<{ children: React.ReactNode }> = ({
  );
};

const CustomCodeBlock: React.FC<{ code: string; language?: string }> = ({ code, language }) => {
  return (
    <MemoizedCodeBlock language={language} code={code} onRenderComplete={() => Promise.resolve()} />
  );
};
@@ -141,10 +129,10 @@ const CustomList: React.FC<{
  children: React.ReactNode;
}> = ({ ordered, start, children }) => {
  const commonStyles = {
    fontSize: 'sm',
    wordBreak: 'break-word' as const,
    maxWidth: '100%' as const,
    stylePosition: 'outside' as const,
    mb: 2,
    pl: 4,
  };
@@ -166,16 +154,13 @@ const CustomListItem: React.FC<{
  return <ListItem mb={1}>{children}</ListItem>;
};

const CustomKatex: React.FC<{ math: string; displayMode: boolean }> = ({ math, displayMode }) => {
  const renderedMath = katex.renderToString(math, { displayMode });
  return (
    <Box
      as="span"
      display={displayMode ? 'block' : 'inline'}
      p={displayMode ? 4 : 1}
      my={displayMode ? 4 : 0}
      borderRadius="md"
@@ -188,23 +173,17 @@ const CustomKatex: React.FC<{ math: string; displayMode: boolean }> = ({
const CustomTable: React.FC<{
  header: React.ReactNode[];
  align: Array<'center' | 'left' | 'right' | null>;
  rows: React.ReactNode[][];
}> = ({ header, align, rows }) => {
  return (
    <Table variant="simple" size="sm" my={4} borderRadius="md" overflow="hidden">
      <Thead bg="background.secondary">
        <Tr>
          {header.map((cell, i) => (
            <Th
              key={i}
              textAlign={align[i] || 'left'}
              fontWeight="bold"
              p={2}
              minW={16}
@@ -219,12 +198,7 @@ const CustomTable: React.FC<{
        {rows.map((row, rIndex) => (
          <Tr key={rIndex}>
            {row.map((cell, cIndex) => (
              <Td key={cIndex} textAlign={align[cIndex] || 'left'} p={2} wordBreak="break-word">
                {cell}
              </Td>
            ))}
@@ -241,13 +215,7 @@ const CustomHtmlBlock: React.FC<{ content: string }> = ({ content }) => {
const CustomText: React.FC<{ text: React.ReactNode }> = ({ text }) => {
  return (
    <Text fontSize="sm" lineHeight="short" wordBreak="break-word" maxWidth="100%" as="span">
      {text}
    </Text>
  );
@@ -262,13 +230,7 @@ const CustomStrong: React.FC<CustomStrongProps> = ({ children }) => {
const CustomEm: React.FC<{ children: React.ReactNode }> = ({ children }) => {
  return (
    <Text as="em" fontStyle="italic" lineHeight="short" wordBreak="break-word" display="inline">
      {children}
    </Text>
  );
@@ -289,7 +251,7 @@ const CustomDel: React.FC<{ text: string }> = ({ text }) => {
};

const CustomCodeSpan: React.FC<{ code: string }> = ({ code }) => {
  const bg = useColorModeValue('gray.100', 'gray.800');
  return (
    <Code
      fontSize="sm"
@@ -312,13 +274,13 @@ const CustomMath: React.FC<{ math: string; displayMode?: boolean }> = ({
  return (
    <Box
      as="span"
      display={displayMode ? 'block' : 'inline'}
      p={displayMode ? 4 : 1}
      my={displayMode ? 4 : 0}
      borderRadius="md"
      overflow="auto"
      maxWidth="100%"
      className={`math ${displayMode ? 'math-display' : 'math-inline'}`}
    >
      {math}
    </Box>
@@ -336,8 +298,8 @@ const CustomLink: React.FC<{
      title={title}
      isExternal
      sx={{
        '& span': {
          color: 'text.link',
        },
      }}
      maxWidth="100%"
@@ -379,46 +341,34 @@ function parseTokens(tokens: marked.Token[]): JSX.Element[] {
  tokens.forEach((token, i) => {
    switch (token.type) {
      case 'heading':
        output.push(<CustomHeading key={i} text={token.text} depth={token.depth} />);
        break;
      case 'paragraph': {
        const parsedContent = token.tokens ? parseTokens(token.tokens) : token.text;
        if (blockquoteContent.length > 0) {
          blockquoteContent.push(<CustomParagraph key={i}>{parsedContent}</CustomParagraph>);
        } else {
          output.push(<CustomParagraph key={i}>{parsedContent}</CustomParagraph>);
        }
        break;
      }
      case 'br':
        output.push(<br key={i} />);
        break;
      case 'escape': {
        break;
      }
      case 'blockquote_start':
        blockquoteContent = [];
        break;
      case 'blockquote_end':
        output.push(<CustomBlockquote key={i}>{parseTokens(blockquoteContent)}</CustomBlockquote>);
        blockquoteContent = [];
        break;
      case 'blockquote': {
        output.push(
          <CustomBlockquote key={i}>
            {token.tokens ? parseTokens(token.tokens) : null}
@@ -426,44 +376,30 @@ function parseTokens(tokens: marked.Token[]): JSX.Element[] {
        );
        break;
      }
      case 'math':
        output.push(<CustomMath key={i} math={(token as any).value} displayMode={true} />);
        break;
      case 'inlineMath':
        output.push(<CustomMath key={i} math={(token as any).value} displayMode={false} />);
        break;
      case 'inlineKatex':
      case 'blockKatex': {
        const katexToken = token as any;
        output.push(
          <CustomKatex key={i} math={katexToken.text} displayMode={katexToken.displayMode} />,
        );
        break;
      }
      case 'code':
        output.push(<CustomCodeBlock key={i} code={token.text} language={token.lang} />);
        break;
      case 'hr':
        output.push(<CustomHr key={i} />);
        break;
      case 'list': {
        const { ordered, start, items } = token;
        const listItems = items.map((listItem, idx) => {
          const nestedContent = parseTokens(listItem.tokens);
@@ -477,53 +413,43 @@ function parseTokens(tokens: marked.Token[]): JSX.Element[] {
        );
        break;
      }
      case 'table': {
        const tableToken = token as TableToken;
        output.push(
          <CustomTable
            key={i}
            header={tableToken.header.map(cell =>
              typeof cell === 'string' ? cell : parseTokens(cell.tokens || []),
            )}
            align={tableToken.align}
            rows={tableToken.rows.map(row =>
              row.map(cell => (typeof cell === 'string' ? cell : parseTokens(cell.tokens || []))),
            )}
          />,
        );
        break;
      }
      case 'html':
        output.push(<CustomHtmlBlock key={i} content={token.text} />);
        break;
      case 'def':
      case 'space':
        break;
      case 'strong':
        output.push(<CustomStrong key={i}>{parseTokens(token.tokens || [])}</CustomStrong>);
        break;
      case 'em':
        output.push(
          <CustomEm key={i}>{token.tokens ? parseTokens(token.tokens) : token.text}</CustomEm>,
        );
        break;
      case 'codespan':
        output.push(<CustomCodeSpan key={i} code={token.text} />);
        break;
      case 'link':
        output.push(
          <CustomLink key={i} href={token.href} title={token.title}>
            {token.tokens ? parseTokens(token.tokens) : token.text}
@@ -531,33 +457,24 @@ function parseTokens(tokens: marked.Token[]): JSX.Element[] {
        );
        break;
      case 'image':
        output.push(
          <CustomImage key={i} href={token.href} title={token.title} text={token.text} />,
        );
        break;
      case 'text': {
        const parsedContent = token.tokens ? parseTokens(token.tokens) : token.text;
        if (blockquoteContent.length > 0) {
          blockquoteContent.push(<React.Fragment key={i}>{parsedContent}</React.Fragment>);
        } else {
          output.push(<CustomText key={i} text={parsedContent} />);
        }
        break;
      }
      default:
        console.warn('Unhandled token type:', token.type, token);
    }
  });
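The renderer above walks marked's token tree and maps each token type to a component, recursing into `token.tokens` for nested content. The same dispatch shape can be shown without JSX or Chakra: the sketch below uses hand-built tokens and plain HTML strings purely for illustration (it is not code from this repo, and real `marked.lexer` output carries more fields than the minimal `Token` interface assumed here).

```typescript
// Minimal sketch of the parseTokens dispatch pattern: hand-built tokens in,
// plain strings out. The Token shape is a simplified stand-in for marked's.
interface Token {
  type: string;
  text: string;
  depth?: number;
}

function renderTokens(tokens: Token[]): string[] {
  const output: string[] = [];
  for (const token of tokens) {
    switch (token.type) {
      case 'heading':
        output.push(`<h${token.depth}>${token.text}</h${token.depth}>`);
        break;
      case 'paragraph':
        output.push(`<p>${token.text}</p>`);
        break;
      case 'codespan':
        output.push(`<code>${token.text}</code>`);
        break;
      default:
        // Mirrors the component's console.warn fallback for unknown types.
        console.warn('Unhandled token type:', token.type);
    }
  }
  return output;
}

const html = renderTokens([
  { type: 'heading', depth: 2, text: 'Usage' },
  { type: 'paragraph', text: 'Install the package first.' },
]);
// html[0] === '<h2>Usage</h2>'
```

The component version does exactly this, except each branch pushes a keyed React element and recursive branches call `parseTokens` again instead of concatenating strings.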


@@ -1,13 +1,12 @@
import React from 'react';

import { renderMessageMarkdown } from './MessageMarkdown';

interface CustomMarkdownRendererProps {
  markdown: string;
}

const MessageMarkdownRenderer: React.FC<CustomMarkdownRendererProps> = ({ markdown }) => {
  return <div>{renderMessageMarkdown(markdown)}</div>;
};


@@ -1,4 +1,4 @@
import { Box } from '@chakra-ui/react';
import { motion } from 'framer-motion';

export default motion(Box);


@@ -1,6 +1,6 @@
import { IconButton } from '@chakra-ui/react';
import { Edit2Icon } from 'lucide-react';
import { observer } from 'mobx-react-lite';

const UserMessageTools = observer(({ disabled = false, message, onEdit }) => (
  <IconButton
@@ -8,26 +8,26 @@ const UserMessageTools = observer(({ disabled = false, message, onEdit }) => (
    color="text.primary"
    aria-label="Edit message"
    title="Edit message"
    icon={<Edit2Icon size={'1em'} />}
    onClick={() => onEdit(message)}
    _active={{
      bg: 'transparent',
      svg: {
        stroke: 'brand.100',
        transition: 'stroke 0.3s ease-in-out',
      },
    }}
    _hover={{
      bg: 'transparent',
      svg: {
        stroke: 'accent.secondary',
        transition: 'stroke 0.3s ease-in-out',
      },
    }}
    variant="ghost"
    size="sm"
    isDisabled={disabled}
    _focus={{ boxShadow: 'none' }}
  />
));


@@ -1,14 +1,15 @@
import { render, screen, fireEvent, waitFor } from '@testing-library/react';
import React from 'react';
import { describe, it, expect, vi, beforeEach } from 'vitest';

import messageEditorStore from '../../../../stores/MessageEditorStore';
import MessageBubble from '../MessageBubble';

// Mock browser APIs
class MockResizeObserver {
  observe() {}
  unobserve() {}
  disconnect() {}
}

// Add ResizeObserver to the global object
@@ -16,140 +17,140 @@ global.ResizeObserver = MockResizeObserver;
// Mock the Message model
vi.mock('../../../../models/Message', () => ({
  default: {
    // This is needed for the Instance<typeof Message> type
  },
}));

// Mock the stores
vi.mock('../../../../stores/ClientChatStore', () => ({
  default: {
    items: [],
    isLoading: false,
    editMessage: vi.fn().mockReturnValue(true),
  },
}));

vi.mock('../../../../stores/UserOptionsStore', () => ({
  default: {
    followModeEnabled: false,
    setFollowModeEnabled: vi.fn(),
  },
}));

// Mock the MessageEditorStore
vi.mock('../../../../stores/MessageEditorStore', () => ({
  default: {
    editedContent: 'Test message',
    setEditedContent: vi.fn(),
    setMessage: vi.fn(),
    onCancel: vi.fn(),
    handleSave: vi.fn().mockImplementation(function () {
      // Use the mocked messageEditorStore from the import
      messageEditorStore.onCancel();
      return Promise.resolve();
    }),
  },
}));

// Mock the MessageRenderer component
vi.mock('../ChatMessageContent', () => ({
  default: ({ content }) => <div data-testid="message-content">{content}</div>,
}));

// Mock the UserMessageTools component
vi.mock('../UserMessageTools', () => ({
  default: ({ message, onEdit }) => (
    <button data-testid="edit-button" onClick={() => onEdit(message)}>
      Edit
    </button>
  ),
}));

vi.mock('../MotionBox', async importOriginal => {
  const actual = await importOriginal();
  return {
    default: {
      ...actual.default,
      div: (props: any) => React.createElement('div', props, props.children),
      motion: (props: any) => React.createElement('div', props, props.children),
    },
  };
});
describe('MessageBubble', () => {
  const mockScrollRef = { current: { scrollTo: vi.fn() } };

  const mockUserMessage = {
    role: 'user',
    content: 'Test message',
  };

  const mockAssistantMessage = {
    role: 'assistant',
    content: 'Assistant response',
  };

  beforeEach(() => {
    vi.clearAllMocks();
  });

  it('should render user message correctly', () => {
    render(<MessageBubble msg={mockUserMessage} scrollRef={mockScrollRef} />);

    expect(screen.getByText('You')).toBeInTheDocument();
    expect(screen.getByText('Test message')).toBeInTheDocument();
  });

  it('should render assistant message correctly', () => {
    render(<MessageBubble msg={mockAssistantMessage} scrollRef={mockScrollRef} />);

    expect(screen.getByText('open-gsio')).toBeInTheDocument();
    expect(screen.getByTestId('message-content')).toHaveTextContent('Assistant response');
  });

  it('should show edit button on hover for user messages', async () => {
    render(<MessageBubble msg={mockUserMessage} scrollRef={mockScrollRef} />);

    // Simulate hover
    fireEvent.mouseEnter(screen.getByRole('listitem'));

    expect(screen.getByTestId('edit-button')).toBeInTheDocument();
  });

  it('should show editor when edit button is clicked', () => {
    render(<MessageBubble msg={mockUserMessage} scrollRef={mockScrollRef} />);

    // Simulate hover and click edit
    fireEvent.mouseEnter(screen.getByRole('listitem'));
    fireEvent.click(screen.getByTestId('edit-button'));

    // Check if the textarea is rendered (part of MessageEditor)
    expect(screen.getByRole('textbox')).toBeInTheDocument();
  });

  it('should hide editor after message is edited and saved', async () => {
    render(<MessageBubble msg={mockUserMessage} scrollRef={mockScrollRef} />);

    // Show the editor
    fireEvent.mouseEnter(screen.getByRole('listitem'));
    fireEvent.click(screen.getByTestId('edit-button'));

    // Verify editor is shown
    expect(screen.getByRole('textbox')).toBeInTheDocument();

    // Find and click the save button
    const saveButton = screen.getByLabelText('Save edit');
    fireEvent.click(saveButton);

    // Wait for the editor to disappear
    await waitFor(() => {
      // Check that the editor is no longer visible
      expect(screen.queryByRole('textbox')).not.toBeInTheDocument();
      // And the message content is visible again
      expect(screen.getByText('Test message')).toBeInTheDocument();
    });

    // Verify that handleSave was called
    expect(messageEditorStore.handleSave).toHaveBeenCalled();
  });
});


@@ -1,27 +1,27 @@
import { render, screen, fireEvent } from '@testing-library/react';
import React from 'react';
import { describe, it, expect, vi, beforeEach } from 'vitest';

// Import the mocked stores
import clientChatStore from '../../../../stores/ClientChatStore';
import messageEditorStore from '../../../../stores/MessageEditorStore';
import MessageEditor from '../MessageEditorComponent';

// Mock the Message model
vi.mock('../../../../models/Message', () => {
  return {
    default: {
      // This is needed for the Instance<typeof Message> type
    },
  };
});

// Mock fetch globally
globalThis.fetch = vi.fn(() =>
  Promise.resolve({
    ok: true,
    json: () => Promise.resolve({}),
  }),
);

// Mock the ClientChatStore
@@ -31,14 +31,14 @@ vi.mock('../../../../stores/ClientChatStore', () => {
removeAfter: vi.fn(), removeAfter: vi.fn(),
sendMessage: vi.fn(), sendMessage: vi.fn(),
setIsLoading: vi.fn(), setIsLoading: vi.fn(),
editMessage: vi.fn().mockReturnValue(true) editMessage: vi.fn().mockReturnValue(true),
}; };
// Add the mockUserMessage to the items array // Add the mockUserMessage to the items array
mockStore.items.indexOf = vi.fn().mockReturnValue(0); mockStore.items.indexOf = vi.fn().mockReturnValue(0);
return { return {
default: mockStore default: mockStore,
}; };
}); });
@@ -48,25 +48,25 @@ vi.mock('../../../../stores/MessageEditorStore', () => {
editedContent: 'Test message', // Set initial value to match the test expectation editedContent: 'Test message', // Set initial value to match the test expectation
message: null, message: null,
setEditedContent: vi.fn(), setEditedContent: vi.fn(),
setMessage: vi.fn((message) => { setMessage: vi.fn(message => {
mockStore.message = message; mockStore.message = message;
mockStore.editedContent = message.content; mockStore.editedContent = message.content;
}), }),
onCancel: vi.fn(), onCancel: vi.fn(),
handleSave: vi.fn() handleSave: vi.fn(),
}; };
return { return {
default: mockStore default: mockStore,
}; };
}); });
describe('MessageEditor', () => { describe('MessageEditor', () => {
// Create a message object with a setContent method // Create a message object with a setContent method
const mockUserMessage = { const mockUserMessage = {
content: 'Test message', content: 'Test message',
role: 'user', role: 'user',
setContent: vi.fn() setContent: vi.fn(),
}; };
const mockOnCancel = vi.fn(); const mockOnCancel = vi.fn();
@@ -93,7 +93,7 @@ describe('MessageEditor', () => {
}); });
it('should call handleSave when save button is clicked', () => { it('should call handleSave when save button is clicked', () => {
render(<MessageEditor message={mockUserMessage} onCancel={mockOnCancel}/>); render(<MessageEditor message={mockUserMessage} onCancel={mockOnCancel} />);
const saveButton = screen.getByLabelText('Save edit'); const saveButton = screen.getByLabelText('Save edit');
fireEvent.click(saveButton); fireEvent.click(saveButton);


@@ -1,5 +1,6 @@
import React, { useState, useEffect, useCallback } from 'react';

import { buildCodeHighlighter } from './CodeHighlighter';

interface CodeBlockProps {
  language: string;
@@ -9,23 +10,19 @@ interface CodeBlockProps {
const highlighter = buildCodeHighlighter();

const CodeBlock: React.FC<CodeBlockProps> = ({ language, code, onRenderComplete }) => {
  const [html, setHtml] = useState<string>('');
  const [loading, setLoading] = useState<boolean>(true);

  const highlightCode = useCallback(async () => {
    try {
      const highlighted = (await highlighter).codeToHtml(code, {
        lang: language,
        theme: 'github-dark',
      });
      setHtml(highlighted);
    } catch (error) {
      console.error('Error highlighting code:', error);
      setHtml(`<pre>${code}</pre>`);
    } finally {
      setLoading(false);
@@ -41,9 +38,9 @@ const CodeBlock: React.FC<CodeBlockProps> = ({
  return (
    <div
      style={{
        backgroundColor: '#24292e',
        padding: '10px',
        borderRadius: '1.5em',
      }}
    >
      Loading code...
@@ -55,12 +52,12 @@ const CodeBlock: React.FC<CodeBlockProps> = ({
    <div
      dangerouslySetInnerHTML={{ __html: html }}
      style={{
        transition: 'none',
        padding: 20,
        backgroundColor: '#24292e',
        overflowX: 'auto',
        borderRadius: '.37em',
        fontSize: '.75rem',
      }}
    />
  );
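Worth noting in the `catch` branch: the fallback string `<pre>${code}</pre>` is injected verbatim through `dangerouslySetInnerHTML`, so markup inside `code` would be interpreted as HTML. A hedged sketch of an escaping helper (a hypothetical `escapeHtml`, not part of this codebase) that could harden the fallback:

```typescript
// Hypothetical helper: escape HTML-significant characters before embedding
// untrusted text in markup rendered via dangerouslySetInnerHTML.
function escapeHtml(raw: string): string {
  return raw
    .replace(/&/g, '&amp;') // must run first so later entities aren't double-escaped
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}
```

The fallback could then pass `escapeHtml(code)` into the `<pre>` template instead of the raw string.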


@@ -1,5 +1,6 @@
import { createHighlighterCore } from 'shiki';

/* eslint-disable import/no-unresolved */
export async function buildCodeHighlighter() {
  const [
    githubDark,
@@ -23,26 +24,26 @@ export async function buildCodeHighlighter() {
    zig,
    wasm,
  ] = await Promise.all([
    import('shiki/themes/github-dark.mjs'),
    import('shiki/langs/html.mjs'),
    import('shiki/langs/javascript.mjs'),
    import('shiki/langs/jsx.mjs'),
    import('shiki/langs/typescript.mjs'),
    import('shiki/langs/tsx.mjs'),
    import('shiki/langs/go.mjs'),
    import('shiki/langs/rust.mjs'),
    import('shiki/langs/python.mjs'),
    import('shiki/langs/java.mjs'),
    import('shiki/langs/kotlin.mjs'),
    import('shiki/langs/shell.mjs'),
    import('shiki/langs/sql.mjs'),
    import('shiki/langs/yaml.mjs'),
    import('shiki/langs/toml.mjs'),
    import('shiki/langs/markdown.mjs'),
    import('shiki/langs/json.mjs'),
    import('shiki/langs/xml.mjs'),
    import('shiki/langs/zig.mjs'),
    import('shiki/wasm'),
  ]);

  // Create the highlighter instance with the loaded themes and languages
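`CodeBlock` starts this highlighter promise at module load (`const highlighter = buildCodeHighlighter()`) and every render awaits the same shared promise. The same sharing can also be done lazily; a hypothetical sketch of that memoization pattern:

```typescript
// Sketch: cache a single in-flight promise so an expensive async factory
// (such as buildCodeHighlighter) runs at most once, but only on first use.
function memoizeAsync<T>(factory: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | null = null;
  return () => {
    if (cached === null) {
      cached = factory(); // first caller kicks off the work
    }
    return cached; // later callers share the same promise
  };
}
```

Lazy creation defers the Shiki WASM download until the first code block actually renders, at the cost of a slightly later first highlight.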

Some files were not shown because too many files have changed in this diff.