ChattyUI

Run open-source LLMs locally in the browser using WebGPU

What is ChattyUI?

Open-source, feature-rich Gemini/ChatGPT-like interface for running open-source models (Gemma, Mistral, Llama 3, etc.) locally in the browser using WebGPU. There is no server-side processing: your data never leaves your PC.
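
In-browser inference of this kind pairs WebGPU with a client-side runtime such as MLC's WebLLM, which downloads quantized weights and runs them on the local GPU. The snippet below is a minimal, illustrative sketch of that pattern using the @mlc-ai/web-llm package, not ChattyUI's own code; the model ID is an example taken from WebLLM's prebuilt model list.

```ts
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // WebGPU is required for in-browser inference; bail out early if it is missing.
  if (!("gpu" in navigator)) {
    throw new Error("WebGPU is not available in this browser.");
  }

  // Download (or load from cache) the model weights and set up the WebGPU kernels.
  // The model ID is illustrative; use any entry from WebLLM's prebuilt model list.
  const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f16_1-MLC", {
    initProgressCallback: (report) => console.log(report.text),
  });

  // OpenAI-style chat completion, executed entirely on the local GPU.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Explain WebGPU in one sentence." }],
  });

  console.log(reply.choices[0].message?.content);
}

main();
```

WebLLM caches downloaded weights in the browser, so after the first run subsequent loads skip the network entirely and prompts are processed only on the local machine.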
