Open Source LLM

  • How to Use Bolt.new for FREE with Local LLMs (And NO Rate Limits)

    Over the last month, together as a community we started oTToDev, a fork of Bolt.new that aims to add much-needed functionality, such as the ability to use any LLM you want, including local ones with Ollama. In this video I share some important tips and tricks for using local LLMs with oTToDev, some of which can…

    Read More »
  • Does parallel embedding work in Ollama yet?

    What happens when you try to embed multiple chunks at the same time? Does it work? Does it get slower, or speed up? Watch this video to see what happens. You can find the code for every video I make at Then find the folder whose name starts with the date this video was published and a…

    Read More »
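The parallel-embedding question above can be tried out directly. The sketch below fires several embedding requests at a local Ollama server concurrently; the model name and use of the `/api/embeddings` endpoint are assumptions about a typical setup, not the code from the video.

```python
# Minimal sketch (assumed setup): embed many chunks concurrently against a
# local Ollama server using only the standard library.
import json
import urllib.request
from concurrent.futures import ThreadPoolExecutor

OLLAMA_URL = "http://localhost:11434/api/embeddings"  # default Ollama port
MODEL = "nomic-embed-text"  # hypothetical model choice; swap in your own


def embed_one(chunk: str) -> list:
    """POST a single chunk to Ollama's embeddings endpoint."""
    payload = json.dumps({"model": MODEL, "prompt": chunk}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]


def embed_parallel(chunks, embed_fn=embed_one, workers=4):
    """Dispatch chunks to the embed function concurrently, preserving order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(embed_fn, chunks))


if __name__ == "__main__":
    # Requires a running Ollama instance with the model pulled.
    vectors = embed_parallel(["first chunk", "second chunk"])
    print(len(vectors))
```

Whether this actually speeds things up depends on how the server schedules requests: if Ollama serializes embedding calls on one model instance, the threads just queue, which is exactly what the video investigates.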