How to Use Bolt.new for FREE with Local LLMs (And NO Rate Limits)
Over the last month, we as a community started oTToDev, a fork of Bolt.new that aims to add a bunch of much-needed functionality, like being able to use any LLM you want, including local ones with Ollama. In this video I give some super important tips and tricks for using local LLMs with oTToDev, some of which can…