- Introduce `supportedModels` in `ClientChatStore` and update the model validation logic (see the store sketch after this list)
- Enhance OpenAI inference with local setup adaptations and improved streaming options (client sketch after this list)
- Modify ChatService to handle both local and remote model fetching (sketched after this list)
- Update the input menu to dynamically fetch and display supported models (sketched after this list)
- Add `start_inference_server.sh` for starting a local inference server
- Upgrade OpenAI SDK to v5.0.1 and adjust dependencies accordingly
- Revise deployment steps and docs for `GROQ_API_KEY`
- Enable `workers_dev` in `wrangler.jsonc`
- Adjust hero label to `open-gsio` in routes
- Update `.gitignore` to include sensitive config files
- Add `deploy:secrets` script in `package.json`
- Introduce a new `session-proxy` worker with its configuration file (a minimal sketch follows this list)
- Update deployment scripts to include `deploy:session-proxy` and add a `deploy:all` script for streamlined deployment of all workers
- Expand the README with deployment instructions and document `pnpm` as an alternative to `bun`
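
A minimal sketch of how `supportedModels` and the tightened validation in `ClientChatStore` might fit together. The store shape, field names, and the fallback behavior are assumptions for illustration; the actual store may be built on a state-management library.

```ts
// Hypothetical sketch: a client-side chat store that tracks which models the
// backend reports as supported and validates the selected model against them.
class ClientChatStore {
  model = '';
  supportedModels: string[] = [];

  setSupportedModels(models: string[]) {
    this.supportedModels = models;
    // If the previously selected model is no longer supported, fall back to
    // the first supported model instead of keeping an invalid selection.
    if (!this.isSupported(this.model) && models.length > 0) {
      this.model = models[0];
    }
  }

  isSupported(model: string): boolean {
    return this.supportedModels.includes(model);
  }

  setModel(model: string): boolean {
    if (!this.isSupported(model)) return false;
    this.model = model;
    return true;
  }
}
```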
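One way the local-setup adaptation and streaming options could look with the v5 SDK: point the client at a local OpenAI-compatible server via `baseURL` and stream completions with usage reporting enabled. The endpoint URL, environment flag, and placeholder API key are assumptions, not the project's actual configuration.

```ts
import OpenAI from 'openai';

// Assumed flag and URL: a local OpenAI-compatible server (e.g. one started
// by start_inference_server.sh) typically ignores the API key.
const isLocal = process.env.LOCAL_INFERENCE === 'true';
const client = new OpenAI({
  baseURL: isLocal ? 'http://localhost:8080/v1' : undefined,
  apiKey: isLocal ? 'not-needed' : process.env.OPENAI_API_KEY,
});

async function streamCompletion(model: string, prompt: string) {
  const stream = await client.chat.completions.create({
    model,
    messages: [{ role: 'user', content: prompt }],
    stream: true,
    // stream_options asks the server to append a final chunk with token usage.
    stream_options: { include_usage: true },
  });

  for await (const chunk of stream) {
    const delta = chunk.choices[0]?.delta?.content;
    if (delta) process.stdout.write(delta);
  }
}
```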
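A hedged sketch of the local/remote split in model fetching: against a local server, list whatever the server actually serves; otherwise fall back to a configured remote list. The function name, flag, and the hard-coded remote model IDs are illustrative only.

```ts
import OpenAI from 'openai';

// Illustrative only: the real ChatService wiring may differ.
async function getSupportedModels(client: OpenAI, isLocal: boolean): Promise<string[]> {
  if (isLocal) {
    // A local OpenAI-compatible server reports the models it has loaded.
    const page = await client.models.list();
    return page.data.map((m) => m.id);
  }
  // Remote path: return the models this deployment is configured to allow
  // (placeholder values).
  return ['gpt-4o-mini', 'llama-3.1-8b-instant'];
}
```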
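The input menu could then populate its model picker from a backend route; the `/api/models` path and the store method it calls are assumptions for illustration.

```ts
// Assumed wiring: fetch the supported model list from the worker and push it
// into the store so the menu re-renders with the current options.
async function loadSupportedModels(store: {
  setSupportedModels(models: string[]): void;
}) {
  const res = await fetch('/api/models');
  if (!res.ok) return;
  const models: string[] = await res.json();
  store.setSupportedModels(models);
}
```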
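A minimal sketch of what the new `session-proxy` worker could look like: a Cloudflare Worker that forwards incoming requests to an upstream origin while preserving the method, headers (including session cookies), and body. The `UPSTREAM_ORIGIN` variable and the pass-through behavior are assumptions about the worker's purpose.

```ts
// Hypothetical session-proxy worker: forwards requests to an upstream origin
// configured in this worker's wrangler config.
interface Env {
  UPSTREAM_ORIGIN: string; // assumed var, e.g. "https://open-gsio.example.workers.dev"
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);
    // Rebuild the URL against the upstream origin, keeping path and query.
    const upstream = new URL(url.pathname + url.search, env.UPSTREAM_ORIGIN);
    // Copy method, headers, and body from the original request.
    return fetch(new Request(upstream.toString(), request));
  },
};
```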