Ollama v0.7: Run leading vision models locally with the new engine

Hi everyone!

Don’t be surprised when Ollama steps out of your command line and starts greeting you as a native desktop app 🙂

This new app dramatically lowers the barrier to running top open-source models locally. You can chat with LLMs, or drag and drop files and images to interact with multimodal models, all from a simple desktop interface. And best of all, it’s still Ollama, one of the most trusted and well-liked tools among users who care about privacy and data security.
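
For readers who still prefer the programmatic route, the same multimodal capability is exposed through Ollama’s local REST API. Below is a minimal sketch that sends an image to a vision model via the /api/chat endpoint; the model name (llava) and the image path are placeholders, and it assumes an Ollama server running on the default port 11434:

```python
import base64
import json
import urllib.request

# Read a local image and base64-encode it, as the Ollama API expects.
# "photo.jpg" is a placeholder path; substitute your own file.
with open("photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# Chat request against the local Ollama server (default port 11434).
# "llava" is just an example vision model; any multimodal model you
# have pulled (e.g. via `ollama pull llava`) should work here.
payload = {
    "model": "llava",
    "messages": [
        {
            "role": "user",
            "content": "What is in this image?",
            "images": [image_b64],
        }
    ],
    "stream": False,  # return one complete JSON response instead of a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())

print(reply["message"]["content"])
```

Setting "stream" to false keeps the sketch simple by returning one complete JSON response; a real application would more likely stream tokens as they arrive.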

Bringing the Ollama experience to people who are less comfortable with the command line will undoubtedly accelerate the adoption of on-device AI.
