- Chat with LLMs directly from within Blender
- Configure the Ollama URL from the addon preferences
- Select from available models in your Ollama installation
- Persistent chat history during your Blender session
- ...
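As a rough illustration of the features above, here is a minimal sketch of how an addon like this might assemble a request for a local Ollama server. The payload shape follows Ollama's documented `/api/chat` endpoint, but the helper name, default URL, and model name are illustrative assumptions, not code from the addon:

```python
import json

# Default Ollama URL; in the addon this would come from the preferences.
OLLAMA_URL = "http://localhost:11434"

def build_chat_request(model, history, prompt):
    """Append the new user prompt to the session's chat history and
    return the JSON body Ollama's /api/chat endpoint expects.
    (Hypothetical helper; names are assumptions.)"""
    messages = list(history) + [{"role": "user", "content": prompt}]
    return {
        "model": model,        # one of the models installed in Ollama
        "messages": messages,  # the persistent per-session history
        "stream": False,       # request a single JSON response
    }

# Example: first message of a fresh session.
body = build_chat_request("llama3", [], "Name three Blender modifiers.")
print(json.dumps(body, indent=2))
```

The actual send would be an HTTP POST of this body to `OLLAMA_URL + "/api/chat"`; keeping the growing `messages` list between calls is what gives the session its chat history.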
It contains a production-grade implementation, including deployment code with CDK and a CI/CD pipeline, testing, observability, and more (see the Features section). Choose the architecture you see fit, ...