Restart your editor — done. Your AI assistant can now use local Ollama models.
💡 TL;DR: This project runs a local proxy server that gives you free access to GPT-4o, Claude, DeepSeek, Gemini, Grok, Mistral, and Qwen models through the Puter.js SDK, with no paid API keys required!
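If you want to talk to the proxy outside your editor, the snippet below is a minimal sketch of a direct chat request. It assumes the proxy mimics the Ollama chat API on the default port (11434) and exposes a model named "gpt-4o"; check the proxy's startup output for the actual host, port, route, and model names in your setup.

```javascript
// Request body in the Ollama-style chat format (model name is an assumption;
// use any model your proxy actually lists).
const payload = {
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
  stream: false, // ask for a single JSON response instead of a stream
};

// Send the chat request to the assumed local endpoint and return the reply text.
async function chat() {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  const data = await res.json();
  return data.message.content; // Ollama-style response shape (an assumption)
}

// chat().then(console.log); // run this once the proxy is up
```

Any Ollama-compatible client library should work the same way, since they all speak this request shape under the hood.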