open-web-agent-rs
A Rust-based web agent with an embedded OpenAI-compatible inference server (supports Gemma models only).
Project Structure
This project is organized as a Cargo workspace with the following crates:
- agent-server: The main web agent server
- local_inference_engine: An embedded OpenAI-compatible inference server for Gemma models
Setup
- Clone the repository
- Copy the example environment file:
cp .env.example .env
- Install JavaScript dependencies:
bun i
- Start the SearXNG search engine:
docker compose up -d searxng
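To confirm SearXNG came up, you can query its JSON API directly. The host port below (8080, the SearXNG default) is an assumption; the actual mapping is defined in this repository's docker-compose file, so check there if the request fails.

```shell
# Smoke-test the SearXNG container. The port (8080) is an assumption;
# check docker-compose.yml for the real host port mapping.
curl "http://localhost:8080/search?q=rust&format=json"
```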
Running the Project
Local Inference Engine
To run the local inference engine:
cd crates/local_inference_engine
cargo run --release -- --server
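Once the server is running, you can exercise it through its OpenAI-compatible API. The sketch below assumes the standard OpenAI chat-completions route; the port (8080) and model name are placeholders, not values confirmed by this repository, so check the server's startup logs or its `--help` output for the real ones.

```shell
# Minimal smoke test of the OpenAI-compatible endpoint.
# Port and model name are assumptions; verify them against the
# server's startup output before relying on this request.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemma",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```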
Agent Server
To run the agent server:
cargo run -p agent-server
Development Mode
For development with automatic reloading:
bun dev
Building
To build all crates in the workspace:
cargo build
To build a specific crate:
cargo build -p agent-server
# or
cargo build -p local_inference_engine