👋 Hey Product Hunt!
I built AUM because I kept running into the same wall:
👉 Great AI ideas, but limited by API costs and vendor lock-in.
We’re seeing it already –
Token prices are climbing. Free tiers are vanishing.
And most builders are stuck using just one or two cloud models.
So I asked:
What if running your own local model was as easy as opening an app?
That’s AUM:
✅ Run 50+ LLMs locally (macOS supported today)
✅ 2-click setup – no terminal or GPU tuning needed
✅ Search your local files and folders with AI
✅ Private testing with your team before committing to any cloud model
This isn’t just about building faster – it’s about owning your AI layer.
Would love to hear your thoughts, your use cases, and any pain points you're hitting with AI APIs right now.
Let’s talk model diversity, privacy, and running AI at the edge.
— Rohan