Commit Graph

45 Commits

Author · SHA1 · Message · Date
geoffsee
9e79c488ee correct README 2025-06-09 23:18:52 -04:00
geoffsee
370c3e5717 adjust README and local inference configuration script 2025-06-09 23:18:52 -04:00
Geoff Seemueller
ad7dc5c0a6 Update README.md
improve semantics

Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-05 14:04:08 -04:00
Geoff Seemueller
059e7d3218 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-04 20:19:12 -04:00
geoffsee
6be0316e75 add some missing to last 2025-06-04 20:09:39 -04:00
geoffsee
5bd1e2f77f add Acknowledgments section to README 2025-06-04 20:05:02 -04:00
geoffsee
2884baf000 Add Docker Compose setup for Ollama and Open-WebUI services
- Replaced single Docker command for Ollama with a `docker-compose` setup.
- Updated `start_inference_server.sh` to use `ollama-compose.yml`.
- Updated README with new usage instructions for Ollama web UI access.
2025-06-04 18:45:08 -04:00
geoffsee
497eb22ad8 change semantics
Update README deployment steps and add deploy:secrets script to package.json

update local inference script and README

update lockfile

reconfigure package scripts for development

update test execution

pass server tests

Update README with revised Bun commands and workspace details

remove pnpm package manager designator

create bun server
2025-06-04 18:45:08 -04:00
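
The "create bun server" step in the squashed commit above refers to the repository's Bun-based server workspace. The actual server code is not part of this log; as a point of reference only, a minimal Bun HTTP server looks roughly like the sketch below (the port and routes are illustrative, not the project's real ones):

```ts
// Minimal Bun HTTP server sketch — not the repository's actual server code.
const server = Bun.serve({
  port: 3000, // illustrative port
  fetch(req: Request): Response {
    const url = new URL(req.url);
    if (url.pathname === "/health") {
      // simple liveness endpoint
      return new Response("ok");
    }
    return new Response("open-gsio dev server placeholder", { status: 200 });
  },
});

console.log(`listening on http://localhost:${server.port}`);
```
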
Geoff Seemueller
1055cda2f1 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-04 12:15:12 -04:00
geoffsee
8587cf10d0 remove nbsps in README 2025-06-02 13:52:11 -04:00
geoffsee
a9bbea8c34 hotfix: add default for local-inference 2025-06-02 13:51:05 -04:00
geoffsee
f4a44be89a add note to top of readme to replace project status 2025-06-02 12:50:22 -04:00
geoffsee
9e8b427826 Add scripts and documentation for local inference configuration with Ollama and mlx-omni-server
- Introduced `configure_local_inference.sh` to automatically set `.dev.vars` based on active local inference services.
- Updated `start_inference_server.sh` to handle both Ollama and mlx-omni-server server types.
- Enhanced `package.json` to include new commands for starting and configuring inference servers.
- Refined README to include updated instructions for running and adding models for local inference.
- Minor cleanup in `MessageBubble.tsx`.
2025-06-02 12:50:22 -04:00
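
The `configure_local_inference.sh` script described above writes `.dev.vars` based on whichever local inference service is currently running. The script itself is a shell script and is not reproduced in this log; the TypeScript sketch below only illustrates the general idea, assuming Ollama's default port 11434, a placeholder port for mlx-omni-server, and a hypothetical `LOCAL_INFERENCE_ENDPOINT` key (the real key names are project-specific):

```ts
// Sketch of the configure-local-inference idea; the real script is shell, not TS.
import { writeFileSync } from "node:fs";

// Probe a URL and report whether something is answering there.
async function isUp(url: string): Promise<boolean> {
  try {
    const res = await fetch(url, { signal: AbortSignal.timeout(1000) });
    return res.ok;
  } catch {
    return false;
  }
}

async function main() {
  let endpoint: string | undefined;

  if (await isUp("http://localhost:11434/api/tags")) {
    endpoint = "http://localhost:11434/v1"; // Ollama's OpenAI-compatible API
  } else if (await isUp("http://localhost:10240/v1/models")) {
    endpoint = "http://localhost:10240/v1"; // placeholder mlx-omni-server port
  }

  if (!endpoint) {
    console.error("no local inference server detected");
    return;
  }

  // Hypothetical variable name; the actual .dev.vars keys are project-specific.
  writeFileSync(".dev.vars", `LOCAL_INFERENCE_ENDPOINT=${endpoint}\n`);
  console.log(`.dev.vars configured for ${endpoint}`);
}

main();
```
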
Geoff Seemueller
f2d91e2752 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-02 11:58:26 -04:00
Geoff Seemueller
79db9f4a14 Hyperlink stack items in README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-02 11:35:05 -04:00
Geoff Seemueller
744fb41e21 Add new demo image to README
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-02 11:30:55 -04:00
Geoff Seemueller
a932f20886 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-02 11:22:57 -04:00
geoffsee
1efd7ab2e2 Rewrite README.md for improved clarity, structure, and usability; add development history to LEGACY.md. 2025-06-02 11:21:03 -04:00
Geoff Seemueller
9cb5bb0c5c Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-01 09:27:03 -04:00
Geoff Seemueller
5a7691a9af Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-06-01 00:04:48 -04:00
geoffsee
9e6ef975a9 saves a message 2025-05-31 18:48:55 -04:00
Geoff Seemueller
33baf588b6 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-05-30 10:03:52 -04:00
geoffsee
4fbf120710 update badges 2025-05-29 21:54:52 -04:00
geoffsee
32339f3f18 add test workflow for ci 2025-05-29 21:47:08 -04:00
geoffsee
f07c19dae8 init test suite 2025-05-29 21:32:12 -04:00
geoffsee
cc0da17b5f - Add killport.js script for terminating processes on specific ports
- Introduce `supportedModels` in `ClientChatStore` and update model validation logic
- Enhance OpenAI inferencing with local setup adaptations and improved streaming options
- Modify ChatService to handle local and remote model fetching
- Update input menu to dynamically fetch and display supported models
- Add start_inference_server.sh for initiating local inference server
- Upgrade OpenAI SDK to v5.0.1 and adjust dependencies accordingly
2025-05-29 20:17:34 -04:00
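
The `killport.js` script mentioned above terminates whatever process is listening on a given port. Its actual contents are not shown in this log; the following is a minimal sketch of one way such a script can work, assuming a Unix-like system with `lsof` on the PATH:

```ts
// killport sketch — a hypothetical stand-in for the repository's killport.js.
import { execSync } from "node:child_process";

function killPort(port: number): void {
  try {
    // Ask lsof for the PIDs listening on the given TCP port.
    const out = execSync(`lsof -ti tcp:${port}`, { encoding: "utf8" });
    const pids = out.split("\n").map((s) => s.trim()).filter(Boolean);
    for (const pid of pids) {
      process.kill(Number(pid), "SIGTERM");
      console.log(`sent SIGTERM to ${pid} (port ${port})`);
    }
  } catch {
    // lsof exits non-zero when nothing is listening; treat that as "nothing to do".
    console.log(`no process found on port ${port}`);
  }
}

// Usage: bun killport.ts 3000 5173
for (const arg of process.argv.slice(2)) {
  killPort(Number(arg));
}
```
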
Geoff Seemueller
c9ee7c7690 Update README.md 2025-05-28 23:46:00 -04:00
Geoff Seemueller
922e5983e6 Update README.md 2025-05-28 23:41:56 -04:00
Geoff Seemueller
c17a2b1504 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-05-28 23:19:11 -04:00
geoffsee
887b5fc7f4 Remove redundant build step from Quickstart instructions in README.
`bun run server:dev` automatically builds the client
2025-05-28 22:10:16 -04:00
geoffsee
3f717fab1b - Update default model to meta-llama/llama-4-scout-17b-16e-instruct in ClientChatStore
- Revise deployment steps and docs for `GROQ_API_KEY`
- Enable `workers_dev` in `wrangler.jsonc`
- Adjust hero label to `open-gsio` in routes
- Update `.gitignore` to include sensitive config files
- Add `deploy:secrets` script in `package.json`
2025-05-28 21:33:34 -04:00
Geoff Seemueller
4928b6c2a2 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-05-27 16:21:46 -04:00
geoffsee
46b912ba93 Add session-proxy worker and deploy-all script
Introduce a new `session-proxy` worker with its configuration file. Update deployment scripts to include `deploy:session-proxy` and add a `deploy:all` script for streamlined deployment of all workers. Expand README with deployment instructions and usage of `pnpm` as an alternative to `bun`.
2025-05-27 15:15:45 -04:00
Geoff Seemueller
68f1c67773 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-05-23 16:24:24 -04:00
geoffsee
8bb5015fab correct ooo for dev 2025-05-23 16:15:25 -04:00
geoffsee
7348ab1ccb semantics 2025-05-23 16:14:48 -04:00
geoffsee
a9c4f25ff3 revert pnpm to bun 2025-05-23 16:12:56 -04:00
geoffsee
9c7ad7724b fix build 2025-05-23 16:10:41 -04:00
Geoff Seemueller
d7a346891f Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-05-23 12:34:13 -04:00
Geoff Seemueller
dab97508d3 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-05-23 08:56:04 -04:00
Geoff Seemueller
965559910a Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-05-23 08:55:09 -04:00
Geoff Seemueller
4427ba5296 Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-05-22 23:35:16 -04:00
Geoff Seemueller
77fb288c4d Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-05-22 23:34:16 -04:00
Geoff Seemueller
6ac7f2d65a Update README.md
Signed-off-by: Geoff Seemueller <28698553+geoffsee@users.noreply.github.com>
2025-05-22 23:17:57 -04:00
geoffsee
33679583af init 2025-05-22 23:14:01 -04:00