From 9e79c488ee17ea1a28a91e35222b3a48efdc3016 Mon Sep 17 00:00:00 2001
From: geoffsee <>
Date: Mon, 9 Jun 2025 19:05:51 -0400
Subject: [PATCH] correct README

---
 README.md | 9 ++-------
 1 file changed, 2 insertions(+), 7 deletions(-)

diff --git a/README.md b/README.md
index 53a72df..fc8c5fb 100644
--- a/README.md
+++ b/README.md
@@ -118,21 +118,16 @@ I would like to express gratitude to the following projects, libraries, and indi
   - [MobX-State-Tree](https://mobx-state-tree.js.org/) - State management solution
   - [OpenAI SDK](https://github.com/openai/openai-node) - Client for AI model integration
   - [Vitest](https://vitest.dev/) - Testing framework
-  - [mlx-omni-server](https://github.com/seemueller-io/mlx-omni-server) - Local inference server for Apple Silicon
   - [OpenAI](https://github.com/openai)
   - [Groq](https://console.groq.com/) - Fast inference API
   - [Anthropic](https://www.anthropic.com/) - Creator of Claude models
   - [Fireworks](https://fireworks.ai/) - AI inference platform
   - [XAI](https://x.ai/) - Creator of Grok models
   - [Cerebras](https://www.cerebras.net/) - AI compute and models
-  - [Ollama](https://github.com/ollama/ollama) - Local model running
   - [(madroidmaq) MLX Omni Server](https://github.com/madroidmaq/mlx-omni-server) - Open-source high-performance inference for Apple Silicon
+  - [MLX](https://github.com/ml-explore/mlx) - An array framework for Apple silicon
+  - [Ollama](https://github.com/ollama/ollama) - Versatile solution for self-hosting models
 
-- **Contributors**
-  - All the developers who have contributed code, reported issues, or provided feedback
-
-- **Open Source Community**
-  - The broader open-source community for creating and maintaining the tools and libraries that make this project possible
 
 ## License
 ~~~text