From f76301d620c588f5a5894d6b1736d9b765d9d072 Mon Sep 17 00:00:00 2001
From: geoffsee <>
Date: Tue, 24 Jun 2025 17:31:15 -0400
Subject: [PATCH] run format
---
LEGACY.md | 58 +++++------
README.md | 98 ++++++++++---------
package.json | 3 +-
packages/ai/package.json | 2 +-
packages/ai/tsconfig.json | 10 +-
packages/client/public/cfga.min.js | 43 ++++----
packages/client/tsconfig.json | 10 +-
.../cloudflare-workers/open-gsio/README.md | 85 ++++++++--------
.../open-gsio/tsconfig.json | 9 +-
packages/env/env.d.ts | 3 +-
packages/env/package.json | 2 +-
packages/env/tsconfig.json | 11 +--
packages/scripts/README.md | 2 +-
packages/scripts/tsconfig.json | 9 +-
packages/server/README.md | 18 ++--
packages/server/tsconfig.json | 10 +-
packages/server/types.d.ts | 6 +-
17 files changed, 180 insertions(+), 199 deletions(-)
diff --git a/LEGACY.md b/LEGACY.md
index 31c8158..6bac774 100644
--- a/LEGACY.md
+++ b/LEGACY.md
@@ -1,60 +1,60 @@
-Legacy Development History
----
+## Legacy Development History
+
The source code of open-gsio was drawn from the source code of my personal website. That commit history was contaminated early on with secrets. `open-gsio` is a refinement of those sources. A total of 367 commits were submitted to the main branch of the upstream source repository between August 2024 and May 2025.
#### **May 2025**
-* Added **seemueller.ai** link to UI sidebar.
-* Global config/markdown guide clean‑up; patched a critical forgotten bug.
+- Added **seemueller.ai** link to UI sidebar.
+- Global config/markdown guide clean‑up; patched a critical forgotten bug.
#### **Apr 2025**
-* **CI/CD overhaul**: auto‑deploy to dev & staging, Bun adoption as package manager, streamlined block‑list workflow (now auto‑updates via VPN blocker).
-* New 404 error page; multiple robots.txt and editor‑resize fixes; removed dead/duplicate code.
+- **CI/CD overhaul**: auto‑deploy to dev & staging, Bun adoption as package manager, streamlined block‑list workflow (now auto‑updates via VPN blocker).
+- New 404 error page; multiple robots.txt and editor‑resize fixes; removed dead/duplicate code.
#### **Mar 2025**
-* Introduced **model‑specific `max_tokens`** handling and plugged in **Cloudflare AI models** for testing.
-* Bundle size minimised (re‑enabled minifier, smaller vendor set).
+- Introduced **model‑specific `max_tokens`** handling and plugged in **Cloudflare AI models** for testing.
+- Bundle size minimised (re‑enabled minifier, smaller vendor set).
#### **Feb 2025**
-* **Full theme system** (runtime switching, Centauri theme, server‑saved prefs).
-* Tightened MobX typing for messages; responsive break‑points & input scaling repaired.
-* Dropped legacy document API; general folder restructure.
+- **Full theme system** (runtime switching, Centauri theme, server‑saved prefs).
+- Tightened MobX typing for messages; responsive break‑points & input scaling repaired.
+- Dropped legacy document API; general folder restructure.
#### **Jan 2025**
-* **Rate‑limit middleware**, larger KV/R2 storage quota.
-* Switched default model → *llama‑v3p1‑70b‑instruct*; pluggable model handlers.
-* Added **KaTeX fonts** & **Marked.js** for rich math/markdown.
-* Fireworks key rotation; deprecated Google models removed.
+- **Rate‑limit middleware**, larger KV/R2 storage quota.
+- Switched default model → _llama‑v3p1‑70b‑instruct_; pluggable model handlers.
+- Added **KaTeX fonts** & **Marked.js** for rich math/markdown.
+- Fireworks key rotation; deprecated Google models removed.
#### **Dec 2024**
-* Major package upgrades; **CodeHighlighter** now supports HTML/JSX/TS(X)/Zig.
-* Refactored streaming + markdown renderer; Android‑specific padding fixes.
-* Reset default chat model to **gpt‑4o**; welcome message & richer search‑intent logic.
+- Major package upgrades; **CodeHighlighter** now supports HTML/JSX/TS(X)/Zig.
+- Refactored streaming + markdown renderer; Android‑specific padding fixes.
+- Reset default chat model to **gpt‑4o**; welcome message & richer search‑intent logic.
#### **Nov 2024**
-* **Fireworks API** + agent server; first‑class support for **Anthropic** & **GROQ** models (incl. attachments).
-* **VPN blocker** shipped with CIDR validation and dedicated GitHub Action.
-* Live search buffering, feedback modal, smarter context preprocessing.
+- **Fireworks API** + agent server; first‑class support for **Anthropic** & **GROQ** models (incl. attachments).
+- **VPN blocker** shipped with CIDR validation and dedicated GitHub Action.
+- Live search buffering, feedback modal, smarter context preprocessing.
#### **Oct 2024**
-* Rolled out **image generation** + picker for image models.
-* Deployed **ETH payment processor** & deposit‑address flow.
-* Introduced few‑shot prompting library; analytics worker refactor; Halloween prompt.
-* Extensive mobile‑UX polish and bundling/worker config updates.
+- Rolled out **image generation** + picker for image models.
+- Deployed **ETH payment processor** & deposit‑address flow.
+- Introduced few‑shot prompting library; analytics worker refactor; Halloween prompt.
+- Extensive mobile‑UX polish and bundling/worker config updates.
#### **Sep 2024**
-* End‑to‑end **math rendering** (KaTeX) and **GitHub‑flavoured markdown**.
-* Migrated chat state to **MobX**; launched analytics service & metrics worker.
-* Switched build minifier to **esbuild**; tokenizer limits enforced; gradient sidebar & cookie‑consent manager added.
+- End‑to‑end **math rendering** (KaTeX) and **GitHub‑flavoured markdown**.
+- Migrated chat state to **MobX**; launched analytics service & metrics worker.
+- Switched build minifier to **esbuild**; tokenizer limits enforced; gradient sidebar & cookie‑consent manager added.
#### **Aug 2024**
-* **Initial MVP**: iMessage‑style chat UI, websocket prototype, Google Analytics, Cloudflare bindings, base worker‑site scaffold.
+- **Initial MVP**: iMessage‑style chat UI, websocket prototype, Google Analytics, Cloudflare bindings, base worker‑site scaffold.
diff --git a/README.md b/README.md
index 3fd6f24..5a9a58a 100644
--- a/README.md
+++ b/README.md
@@ -1,29 +1,29 @@
# open-gsio
+
[](https://github.com/geoffsee/open-gsio/actions/workflows/test.yml)
[](https://opensource.org/licenses/MIT)
+
-
-This is a full-stack Conversational AI.
+This is a full-stack Conversational AI.
## Table of Contents
- [Installation](#installation)
- [Deployment](#deployment)
- [Local Inference](#local-inference)
- - [mlx-omni-server (default)](#mlx-omni-server)
- - [Adding models](#adding-models-for-local-inference-apple-silicon)
- - [Ollama](#ollama)
- - [Adding models](#adding-models-for-local-inference-ollama)
+ - [mlx-omni-server (default)](#mlx-omni-server)
+ - [Adding models](#adding-models-for-local-inference-apple-silicon)
+ - [Ollama](#ollama)
+ - [Adding models](#adding-models-for-local-inference-ollama)
- [Testing](#testing)
- [Troubleshooting](#troubleshooting)
- [Acknowledgments](#acknowledgments)
- [License](#license)
-
## Installation
1. `bun i && bun test:all`
@@ -33,29 +33,34 @@ This is a full-stack Conversational AI.
> Note: it should be possible to use pnpm in place of bun.
## Deployment
+
1. Setup KV_STORAGE binding in `packages/server/wrangler.jsonc`
-1. [Add keys in secrets.json](https://console.groq.com/keys)
+1. [Add keys in secrets.json](https://console.groq.com/keys)
1. Run `bun run deploy && bun run deploy:secrets && bun run deploy`
> Note: Subsequent deployments should omit `bun run deploy:secrets`
-
+
## Local Inference
-> Local inference is supported for Ollama and mlx-omni-server. OpenAI compatible servers can be used by overriding OPENAI_API_KEY and OPENAI_API_ENDPOINT.
+
+> Local inference is supported for Ollama and mlx-omni-server. OpenAI-compatible servers can be used by overriding OPENAI_API_KEY and OPENAI_API_ENDPOINT.
### mlx-omni-server
+
(default) (Apple Silicon Only)
-~~~bash
+
+```bash
# (prereq) install mlx-omni-server
-brew tap seemueller-io/tap
-brew install seemueller-io/tap/mlx-omni-server
+brew tap seemueller-io/tap
+brew install seemueller-io/tap/mlx-omni-server
bun run openai:local mlx-omni-server # Start mlx-omni-server
bun run openai:local:configure # Configure connection
bun run server:dev # Restart server
-~~~
+```
+
#### Adding models for local inference (Apple Silicon)
-~~~bash
+```bash
# ensure mlx-omni-server is running
# See https://huggingface.co/mlx-community for available models
@@ -67,21 +72,22 @@ curl http://localhost:10240/v1/chat/completions \
\"model\": \"$MODEL_TO_ADD\",
\"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]
}"
-~~~
+```
### Ollama
-~~~bash
+
+```bash
bun run openai:local ollama # Start ollama server
bun run openai:local:configure # Configure connection
bun run server:dev # Restart server
-~~~
+```
+
#### Adding models for local inference (ollama)
-~~~bash
+```bash
# See https://ollama.com/library for available models
use the ollama web ui @ http://localhost:8080
-~~~
-
+```
## Testing
@@ -89,44 +95,44 @@ Tests are located in `__tests__` directories next to the code they test. Testing
> `bun test:all` will run all tests
-
## Troubleshooting
+
1. `bun clean`
1. `bun i`
1. `bun server:dev`
1. `bun client:dev`
-1. Submit an issue
+1. Submit an issue
+
+## History
-History
----
A high-level overview for the development history of the parent repository, [geoff-seemueller-io](https://geoff.seemueller.io), is provided in [LEGACY.md](./LEGACY.md).
## Acknowledgments
I would like to express gratitude to the following projects, libraries, and individuals that have contributed to making open-gsio possible:
- - [TypeScript](https://www.typescriptlang.org/) - Primary programming language
- - [React](https://react.dev/) - UI library for building the frontend
- - [Vike](https://vike.dev/) - Framework for server-side rendering and routing
- - [Cloudflare Workers](https://developers.cloudflare.com/workers/) - Serverless execution environment
- - [Bun](https://bun.sh/) - JavaScript runtime and toolkit
- - [itty-router](https://github.com/kwhitley/itty-router) - Lightweight router for serverless environments
- - [MobX-State-Tree](https://mobx-state-tree.js.org/) - State management solution
- - [OpenAI SDK](https://github.com/openai/openai-node) - Client for AI model integration
- - [Vitest](https://vitest.dev/) - Testing framework
- - [OpenAI](https://github.com/openai)
- - [Groq](https://console.groq.com/) - Fast inference API
- - [Anthropic](https://www.anthropic.com/) - Creator of Claude models
- - [Fireworks](https://fireworks.ai/) - AI inference platform
- - [XAI](https://x.ai/) - Creator of Grok models
- - [Cerebras](https://www.cerebras.net/) - AI compute and models
- - [(madroidmaq) MLX Omni Server](https://github.com/madroidmaq/mlx-omni-server) - Open-source high-performance inference for Apple Silicon
- - [MLX](https://github.com/ml-explore/mlx) - An array framework for Apple silicon
- - [Ollama](https://github.com/ollama/ollama) - Versatile solution for self-hosting models
-
+- [TypeScript](https://www.typescriptlang.org/) - Primary programming language
+- [React](https://react.dev/) - UI library for building the frontend
+- [Vike](https://vike.dev/) - Framework for server-side rendering and routing
+- [Cloudflare Workers](https://developers.cloudflare.com/workers/) - Serverless execution environment
+- [Bun](https://bun.sh/) - JavaScript runtime and toolkit
+- [itty-router](https://github.com/kwhitley/itty-router) - Lightweight router for serverless environments
+- [MobX-State-Tree](https://mobx-state-tree.js.org/) - State management solution
+- [OpenAI SDK](https://github.com/openai/openai-node) - Client for AI model integration
+- [Vitest](https://vitest.dev/) - Testing framework
+- [OpenAI](https://github.com/openai)
+- [Groq](https://console.groq.com/) - Fast inference API
+- [Anthropic](https://www.anthropic.com/) - Creator of Claude models
+- [Fireworks](https://fireworks.ai/) - AI inference platform
+- [XAI](https://x.ai/) - Creator of Grok models
+- [Cerebras](https://www.cerebras.net/) - AI compute and models
+- [(madroidmaq) MLX Omni Server](https://github.com/madroidmaq/mlx-omni-server) - Open-source high-performance inference for Apple Silicon
+- [MLX](https://github.com/ml-explore/mlx) - An array framework for Apple silicon
+- [Ollama](https://github.com/ollama/ollama) - Versatile solution for self-hosting models
## License
-~~~text
+
+```text
MIT License
Copyright (c) 2025 Geoff Seemueller
@@ -148,4 +154,4 @@ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
-~~~
+```
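For context on the README hunks above: the truncated `curl` block posts a chat-completions request to the local mlx-omni-server (`http://localhost:10240`) to pull a model. A minimal sketch of that request body follows; the model name is an illustrative placeholder, not taken from the patch:

```javascript
// Sketch of the request body the README's curl command sends.
// MODEL_TO_ADD is a hypothetical example; see https://huggingface.co/mlx-community
const MODEL_TO_ADD = 'mlx-community/gemma-2-9b-it-4bit';

const body = {
  model: MODEL_TO_ADD,
  messages: [{ role: 'user', content: 'Hello' }],
};

console.log(JSON.stringify(body, null, 2));

// With mlx-omni-server running locally, the README posts this body:
// await fetch('http://localhost:10240/v1/chat/completions', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(body),
// });
```

The fetch call is commented out because it assumes a running local server; only the body construction is shown live.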
diff --git a/package.json b/package.json
index ff74ddb..03f447a 100644
--- a/package.json
+++ b/package.json
@@ -38,5 +38,6 @@
},
"peerDependencies": {
"typescript": "^5"
- }
+ },
+ "packageManager": "pnpm@10.10.0+sha512.d615db246fe70f25dcfea6d8d73dee782ce23e2245e3c4f6f888249fb568149318637dca73c2c5c8ef2a4ca0d5657fb9567188bfab47f566d1ee6ce987815c39"
}
diff --git a/packages/ai/package.json b/packages/ai/package.json
index 27192a5..79971ba 100644
--- a/packages/ai/package.json
+++ b/packages/ai/package.json
@@ -1,4 +1,4 @@
{
"name": "@open-gsio/ai",
"module": "index.ts"
-}
\ No newline at end of file
+}
diff --git a/packages/ai/tsconfig.json b/packages/ai/tsconfig.json
index 0c94e16..79d3191 100644
--- a/packages/ai/tsconfig.json
+++ b/packages/ai/tsconfig.json
@@ -4,10 +4,6 @@
"outDir": "dist",
"rootDir": "."
},
- "include": [
- "*.ts"
- ],
- "exclude": [
- "node_modules"
- ]
-}
\ No newline at end of file
+ "include": ["*.ts"],
+ "exclude": ["node_modules"]
+}
diff --git a/packages/client/public/cfga.min.js b/packages/client/public/cfga.min.js
index 0dfcd0c..ab4ab26 100644
--- a/packages/client/public/cfga.min.js
+++ b/packages/client/public/cfga.min.js
@@ -15,30 +15,29 @@
};
function s() {
var i = [
- g(m(4)) + "=" + g(m(6)),
- "ga=" + t.ga_tid,
- "dt=" + r(e.title),
- "de=" + r(e.characterSet || e.charset),
- "dr=" + r(e.referrer),
- "ul=" + (n.language || n.browserLanguage || n.userLanguage),
- "sd=" + a.colorDepth + "-bit",
- "sr=" + a.width + "x" + a.height,
- "vp=" +
+ g(m(4)) + '=' + g(m(6)),
+ 'ga=' + t.ga_tid,
+ 'dt=' + r(e.title),
+ 'de=' + r(e.characterSet || e.charset),
+ 'dr=' + r(e.referrer),
+ 'ul=' + (n.language || n.browserLanguage || n.userLanguage),
+ 'sd=' + a.colorDepth + '-bit',
+ 'sr=' + a.width + 'x' + a.height,
+ 'vp=' +
o(e.documentElement.clientWidth, t.innerWidth || 0) +
- "x" +
+ 'x' +
o(e.documentElement.clientHeight, t.innerHeight || 0),
- "plt=" + c(d.loadEventStart - d.navigationStart || 0),
- "dns=" + c(d.domainLookupEnd - d.domainLookupStart || 0),
- "pdt=" + c(d.responseEnd - d.responseStart || 0),
- "rrt=" + c(d.redirectEnd - d.redirectStart || 0),
- "tcp=" + c(d.connectEnd - d.connectStart || 0),
- "srt=" + c(d.responseStart - d.requestStart || 0),
- "dit=" + c(d.domInteractive - d.domLoading || 0),
- "clt=" + c(d.domContentLoadedEventStart - d.navigationStart || 0),
- "z=" + Date.now(),
+ 'plt=' + c(d.loadEventStart - d.navigationStart || 0),
+ 'dns=' + c(d.domainLookupEnd - d.domainLookupStart || 0),
+ 'pdt=' + c(d.responseEnd - d.responseStart || 0),
+ 'rrt=' + c(d.redirectEnd - d.redirectStart || 0),
+ 'tcp=' + c(d.connectEnd - d.connectStart || 0),
+ 'srt=' + c(d.responseStart - d.requestStart || 0),
+ 'dit=' + c(d.domInteractive - d.domLoading || 0),
+ 'clt=' + c(d.domContentLoadedEventStart - d.navigationStart || 0),
+ 'z=' + Date.now(),
];
- (t.__ga_img = new Image()), (t.__ga_img.src = t.ga_api + "?" + i.join("&"));
+ ((t.__ga_img = new Image()), (t.__ga_img.src = t.ga_api + '?' + i.join('&')));
}
- (t.cfga = s),
- "complete" === e.readyState ? s() : t.addEventListener("load", s);
+ ((t.cfga = s), 'complete' === e.readyState ? s() : t.addEventListener('load', s));
})(window, document, navigator);
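The `cfga.min.js` hunk above only re-quotes a minified analytics beacon. For readers of this patch, a readable sketch of what that snippet does — all names here are descriptive stand-ins, not the shipped identifiers:

```javascript
// Illustrative sketch (not the shipped code) of how the minified snippet
// assembles its analytics beacon query string from page/navigation data.
function buildBeaconQuery(doc, nav, screenInfo, timing) {
  const enc = encodeURIComponent;
  const params = [
    'dt=' + enc(doc.title), // document title
    'dr=' + enc(doc.referrer), // referrer
    'ul=' + (nav.language || ''), // user language
    'sd=' + screenInfo.colorDepth + '-bit', // screen depth
    'sr=' + screenInfo.width + 'x' + screenInfo.height, // screen resolution
    'plt=' + (timing.loadEventStart - timing.navigationStart || 0), // page load time
    'z=' + Date.now(), // cache buster
  ];
  return params.join('&');
}

// The real script fires the beacon by assigning the URL to an Image src:
//   new Image().src = ga_api + '?' + query;
```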
diff --git a/packages/client/tsconfig.json b/packages/client/tsconfig.json
index 2e1c5d6..a24fa98 100644
--- a/packages/client/tsconfig.json
+++ b/packages/client/tsconfig.json
@@ -8,12 +8,6 @@
"baseUrl": "src",
"noEmit": true
},
- "include": [
- "src/**/*.ts",
- "src/**/*.tsx"
- ],
- "exclude": [
- "node_modules",
- "dist"
- ]
+ "include": ["src/**/*.ts", "src/**/*.tsx"],
+ "exclude": ["node_modules", "dist"]
}
diff --git a/packages/cloudflare-workers/open-gsio/README.md b/packages/cloudflare-workers/open-gsio/README.md
index 856f88c..87cf90f 100644
--- a/packages/cloudflare-workers/open-gsio/README.md
+++ b/packages/cloudflare-workers/open-gsio/README.md
@@ -1,7 +1,9 @@
# open-gsio
+
[](https://github.com/geoffsee/open-gsio/actions/workflows/test.yml)
[](https://opensource.org/licenses/MIT)
+
@@ -15,59 +17,63 @@
- [Installation](#installation)
- [Deployment](#deployment)
- [Local Inference](#local-inference)
- - [mlx-omni-server (default)](#mlx-omni-server)
- - [Adding models](#adding-models-for-local-inference-apple-silicon)
- - [Ollama](#ollama)
- - [Adding models](#adding-models-for-local-inference-ollama)
+ - [mlx-omni-server (default)](#mlx-omni-server)
+ - [Adding models](#adding-models-for-local-inference-apple-silicon)
+ - [Ollama](#ollama)
+ - [Adding models](#adding-models-for-local-inference-ollama)
- [Testing](#testing)
- [Troubleshooting](#troubleshooting)
- [History](#history)
- [License](#license)
## Stack
-* [TypeScript](https://www.typescriptlang.org/)
-* [Vike](https://vike.dev/)
-* [React](https://react.dev/)
-* [Cloudflare Workers](https://developers.cloudflare.com/workers/)
-* [itty‑router](https://github.com/kwhitley/itty-router)
-* [MobX‑State‑Tree](https://mobx-state-tree.js.org/)
-* [OpenAI SDK](https://github.com/openai/openai-node)
-* [Vitest](https://vitest.dev/)
+- [TypeScript](https://www.typescriptlang.org/)
+- [Vike](https://vike.dev/)
+- [React](https://react.dev/)
+- [Cloudflare Workers](https://developers.cloudflare.com/workers/)
+- [itty‑router](https://github.com/kwhitley/itty-router)
+- [MobX‑State‑Tree](https://mobx-state-tree.js.org/)
+- [OpenAI SDK](https://github.com/openai/openai-node)
+- [Vitest](https://vitest.dev/)
## Installation
1. `bun i && bun test`
-1. [Add your own `GROQ_API_KEY` in .dev.vars](https://console.groq.com/keys)
+1. [Add your own `GROQ_API_KEY` in .dev.vars](https://console.groq.com/keys)
1. In isolated shells, run `bun run server:dev` and `bun run client:dev`
-> Note: it should be possible to use pnpm in place of bun.
+> Note: it should be possible to use pnpm in place of bun.
## Deployment
-1. Setup the KV_STORAGE bindings in `wrangler.jsonc`
-1. [Add another `GROQ_API_KEY` in secrets.json](https://console.groq.com/keys)
+
+1. Set up the KV_STORAGE bindings in `wrangler.jsonc`
+1. [Add another `GROQ_API_KEY` in secrets.json](https://console.groq.com/keys)
1. Run `bun run deploy && bun run deploy:secrets && bun run deploy`
> Note: Subsequent deployments should omit `bun run deploy:secrets`
-
## Local Inference
+
> Local inference is achieved by overriding the `OPENAI_API_KEY` and `OPENAI_API_ENDPOINT` environment variables. See below.
### mlx-omni-server
+
(default) (Apple Silicon Only) - Use Ollama for other platforms.
-~~~bash
+
+```bash
# (prereq) install mlx-omni-server
-brew tap seemueller-io/tap
-brew install seemueller-io/tap/mlx-omni-server
+brew tap seemueller-io/tap
+brew install seemueller-io/tap/mlx-omni-server
bun run openai:local mlx-omni-server # Start mlx-omni-server
bun run openai:local:enable # Configure connection
bun run server:dev # Restart server
-~~~
+```
+
#### Adding models for local inference (Apple Silicon)
-~~~bash
+```bash
# ensure mlx-omni-server is running
# See https://huggingface.co/mlx-community for available models
@@ -79,22 +85,23 @@ curl http://localhost:10240/v1/chat/completions \
\"model\": \"$MODEL_TO_ADD\",
\"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]
}"
-~~~
+```
### Ollama
-~~~bash
+
+```bash
bun run openai:local ollama # Start ollama server
bun run openai:local:enable # Configure connection
bun run server:dev # Restart server
-~~~
+```
+
#### Adding models for local inference (ollama)
-~~~bash
+```bash
# See https://ollama.com/library for available models
-MODEL_TO_ADD=gemma3
+MODEL_TO_ADD=gemma3
docker exec -it ollama ollama run ${MODEL_TO_ADD}
-~~~
-
+```
## Testing
@@ -102,20 +109,21 @@ Tests are located in `__tests__` directories next to the code they test. Testing
> `bun run test` will run all tests
-
## Troubleshooting
+
1. `bun run clean`
1. `bun i`
-1. `bun server:dev`
-1. `bun client:dev`
-1. Submit an issue
+1. `bun server:dev`
+1. `bun client:dev`
+1. Submit an issue
-History
----
-A high-level overview for the development history of the parent repository, [geoff-seemueller-io](https://geoff.seemueller.io), is provided in [LEGACY.md](../../LEGACY.md).
+## History
+
+A high-level overview of the development history of the parent repository, [geoff-seemueller-io](https://geoff.seemueller.io), is provided in [LEGACY.md](../../LEGACY.md).
## License
-~~~text
+
+```text
MIT License
Copyright (c) 2025 Geoff Seemueller
@@ -137,5 +145,4 @@ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
-~~~
-
+```
diff --git a/packages/cloudflare-workers/open-gsio/tsconfig.json b/packages/cloudflare-workers/open-gsio/tsconfig.json
index 8bae08e..41259c2 100644
--- a/packages/cloudflare-workers/open-gsio/tsconfig.json
+++ b/packages/cloudflare-workers/open-gsio/tsconfig.json
@@ -8,11 +8,6 @@
"outDir": "dist",
"rootDir": "."
},
- "include": [
- "*.ts"
- ],
- "exclude": [
- "node_modules",
- "*.test.ts"
- ]
+ "include": ["*.ts"],
+ "exclude": ["node_modules", "*.test.ts"]
}
diff --git a/packages/env/env.d.ts b/packages/env/env.d.ts
index b803cd1..44888d9 100644
--- a/packages/env/env.d.ts
+++ b/packages/env/env.d.ts
@@ -4,7 +4,7 @@ interface Env {
EMAIL_SERVICE: any;
// Durable Objects
- SERVER_COORDINATOR: import("packages/server/durable-objects/ServerCoordinator.ts");
+ SERVER_COORDINATOR: import('packages/server/durable-objects/ServerCoordinator.ts');
// Handles serving static assets
ASSETS: Fetcher;
@@ -12,7 +12,6 @@ interface Env {
// KV Bindings
KV_STORAGE: KVNamespace;
-
// Text/Secrets
METRICS_HOST: string;
OPENAI_API_ENDPOINT: string;
diff --git a/packages/env/package.json b/packages/env/package.json
index 3e84857..719a271 100644
--- a/packages/env/package.json
+++ b/packages/env/package.json
@@ -1,4 +1,4 @@
{
"name": "@open-gsio/env",
"module": "env.d.ts"
-}
\ No newline at end of file
+}
diff --git a/packages/env/tsconfig.json b/packages/env/tsconfig.json
index 56865fe..043cb83 100644
--- a/packages/env/tsconfig.json
+++ b/packages/env/tsconfig.json
@@ -4,11 +4,6 @@
"outDir": "dist",
"rootDir": "."
},
- "include": [
- "*.ts",
- "*.d.ts"
- ],
- "exclude": [
- "node_modules"
- ]
-}
\ No newline at end of file
+ "include": ["*.ts", "*.d.ts"],
+ "exclude": ["node_modules"]
+}
diff --git a/packages/scripts/README.md b/packages/scripts/README.md
index a8da82a..667a3ce 100644
--- a/packages/scripts/README.md
+++ b/packages/scripts/README.md
@@ -9,7 +9,7 @@ bun install
To run:
```bash
-bun run
+bun run
```
This project was created using `bun init` in bun v1.2.8. [Bun](https://bun.sh) is a fast all-in-one JavaScript runtime.
diff --git a/packages/scripts/tsconfig.json b/packages/scripts/tsconfig.json
index a0db44f..bc4c513 100644
--- a/packages/scripts/tsconfig.json
+++ b/packages/scripts/tsconfig.json
@@ -6,11 +6,6 @@
"allowJs": true,
"noEmit": false
},
- "include": [
- "*.js",
- "*.ts"
- ],
- "exclude": [
- "node_modules"
- ]
+ "include": ["*.js", "*.ts"],
+ "exclude": ["node_modules"]
}
diff --git a/packages/server/README.md b/packages/server/README.md
index ce0408c..c2bcc76 100644
--- a/packages/server/README.md
+++ b/packages/server/README.md
@@ -6,15 +6,15 @@ This directory contains the server component of open-gsio, a full-stack Conversa
- `__tests__/`: Contains test files for the server components
- `services/`: Contains service modules for different functionalities
- - `AssetService.ts`: Handles static assets and SSR
- - `ChatService.ts`: Manages chat interactions with AI models
- - `ContactService.ts`: Processes contact form submissions
- - `FeedbackService.ts`: Handles user feedback
- - `MetricsService.ts`: Collects and processes metrics
- - `TransactionService.ts`: Manages transactions
+ - `AssetService.ts`: Handles static assets and SSR
+ - `ChatService.ts`: Manages chat interactions with AI models
+ - `ContactService.ts`: Processes contact form submissions
+ - `FeedbackService.ts`: Handles user feedback
+ - `MetricsService.ts`: Collects and processes metrics
+ - `TransactionService.ts`: Manages transactions
- `durable_objects/`: Contains durable object implementations
- - `ServerCoordinator.ts`: Cloudflare Implementation
- - `ServerCoordinatorBun.ts`: Bun Implementation
+ - `ServerCoordinator.ts`: Cloudflare Implementation
+ - `ServerCoordinatorBun.ts`: Bun Implementation
- `api-router.ts`: API Router
- `RequestContext.ts`: Application Context
-- `server.ts`: Main server entry point
\ No newline at end of file
+- `server.ts`: Main server entry point
diff --git a/packages/server/tsconfig.json b/packages/server/tsconfig.json
index 5d785e8..64d87c4 100644
--- a/packages/server/tsconfig.json
+++ b/packages/server/tsconfig.json
@@ -10,12 +10,6 @@
"allowJs": true,
"jsx": "react-jsx"
},
- "include": [
- "**/*.ts",
- "**/*.tsx"
- ],
- "exclude": [
- "node_modules",
- "dist"
- ]
+ "include": ["**/*.ts", "**/*.tsx"],
+ "exclude": ["node_modules", "dist"]
}
diff --git a/packages/server/types.d.ts b/packages/server/types.d.ts
index 27bbe4b..8b3fbd5 100644
--- a/packages/server/types.d.ts
+++ b/packages/server/types.d.ts
@@ -1,5 +1,5 @@
declare global {
- type ExecutionContext = any
- type Env = import("@open-gsio/env")
+ type ExecutionContext = any;
+ type Env = import('@open-gsio/env');
}
-export type ExecutionContext = any
\ No newline at end of file
+export type ExecutionContext = any;