p/chattyui
Run open-source LLMs locally in the browser using WebGPU
0 reviews · 94 followers
Jakob Hoeg · 9mo ago
ChattyUI - Run open-source LLMs locally in the browser using WebGPU
Open-source, feature-rich Gemini/ChatGPT-like interface for running open-source models (Gemma, Mistral, Llama 3, etc.) locally in the browser using WebGPU. No server-side processing: your data never leaves your PC!
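In-browser inference like this is commonly built on the WebLLM runtime, which downloads model weights to the browser cache and runs them on WebGPU. A minimal sketch of that pattern, assuming the `@mlc-ai/web-llm` package (the model ID is illustrative; ChattyUI's actual internals are not shown here):

```javascript
// Minimal client-side chat sketch using WebLLM (an assumed runtime for
// illustration). Everything runs in the browser over WebGPU; no prompt
// or token is sent to a server. Requires a WebGPU-capable browser.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Downloads and caches the model weights locally, then compiles WebGPU
// kernels. The progress callback reports fetch/compile status.
const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f32_1-MLC", {
  initProgressCallback: (p) => console.log(p.text),
});

// OpenAI-style chat API, served entirely from the local engine.
const reply = await engine.chat.completions.create({
  messages: [{ role: "user", content: "Explain WebGPU in one sentence." }],
});
console.log(reply.choices[0].message.content);
```

The first call is slow because weights are fetched and compiled; subsequent sessions reuse the browser cache, which is what makes a ChatGPT-like UI with zero server-side processing practical.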
9 · 134