Awesome!!! First, congratulations on the launch. Making LLM apps cost-effective is one of the priorities for the service I am working on, so this should help. Just a quick question: can this be used in tandem with LangSmith?
We use Langfuse to monitor our GPT usage - specifically to track token usage, monitor hallucinations, and trace request history when things go wrong. Great product and an even greater team! Highly recommend trying Langfuse!
So excited to see Langfuse go live. We've been happy users for 4 weeks now, and it offers the most detailed latency tracking and analytics on the market. Highly recommend it for anyone building complex chains or user-facing chat applications, where latency becomes crucial. Congrats, Clemens, Marc, and Max!
This is awesome, congrats on the launch, guys! LLMs have always felt like such a black box to me, and Langfuse seems to address those concerns. This is a huge win for the LLM space in general, and I'm super excited to build on it!
This looks like an absolutely brilliant idea for tracking and staying informed about the performance of LLM applications. I'm in the process of building one and will certainly give it a try to measure my LLM app.
Easy to integrate and very intuitive to use!! Highly recommend it for anyone developing ChatGPT products. Amazing founding team who are super detail-oriented and understand LLM products deeply, and it's cool to see the evolution of the open-source project every week!
We’ve been using Langfuse for 2 months now. It’s easy to integrate and makes it simpler for us to monitor & debug LLM requests during development and beyond.
Stellar work @max_deichmann, @marc_klingen, @clemo_sf! If anything, I dare say it's a bit too good for a launch ;). Stoked to test it out on some projects - godspeed!
Congrats on the launch - love the 2-minute demo. It shows how thoughtful this product is and how much care you've put into it! Congrats to the whole team!