Aqueduct
Taking Data Science to Production
Vikram Sreekanti
Aqueduct — The easiest way to run open source LLMs
Aqueduct's LLM support makes it easy for you to run open-source LLMs on any infrastructure that you use. With a single API call, you can run an LLM on a single prompt or even on a whole dataset!
Replies
Vikram Sreekanti
Hi everyone! LLMs have taken the world by storm, but using them is a pain (or a non-starter) for most people due to concerns around data privacy, IP ownership, and cost. Open-source LLMs like LLaMA, Dolly, and Vicuna have enabled enterprises to think about using LLMs, but they're a pain to operate. At Aqueduct, our goal has been to enable ML teams to use the best technology without the operational nightmare of running ML in the cloud, and we're super excited to share that Aqueduct now allows you to run open-source LLMs with a single API call.

➡️ Aqueduct's Python API allows you to call an LLM with a single line of code (see the first image above). No need to worry about installing drivers, wrangling library dependencies, or debugging configuration parameters.

☁️ Aqueduct is designed to work with any infrastructure you use: you can run your LLMs on a large server or on a Kubernetes cluster. You can even have Aqueduct spin up a cluster for you.

🔁 You can publish your LLM-based workflows to run ad hoc or on a fixed schedule using Aqueduct.

💡 Aqueduct's visibility features extend naturally to LLMs, so you can see what parameters or prompts you used and how performance evolves over time.

We'd love to hear what you think! Check out our open-source project or join our Slack community.

GitHub: https://github.com/aqueducthq/aq...
Slack: https://slack.aqueducthq.com
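To make the "single prompt or whole dataset with one call" idea concrete, here is a minimal, self-contained sketch of that calling pattern. This is not Aqueduct's actual API (the names `make_llm_op` and `echo_llm` are hypothetical stand-ins); it only illustrates how one wrapper can accept either a single prompt or a list of prompts.

```python
from typing import Callable, List, Union

def make_llm_op(generate: Callable[[str], str]) -> Callable:
    """Wrap a text-generation function so the same call works on a
    single prompt (str) or a whole dataset of prompts (list of str).
    Hypothetical helper for illustration only."""
    def op(prompts: Union[str, List[str]]):
        if isinstance(prompts, str):
            return generate(prompts)          # one prompt -> one response
        return [generate(p) for p in prompts]  # dataset -> list of responses
    return op

# Stand-in for a real open-source LLM (e.g. Vicuna running behind Aqueduct).
echo_llm = make_llm_op(lambda p: f"response to: {p}")

print(echo_llm("What is an aqueduct?"))
print(echo_llm(["prompt one", "prompt two"]))
```

In Aqueduct's case, the wrapped operator would also carry the deployment details (which cluster or server runs the model), which is what keeps the user-facing call down to one line.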
Joseph Gonzalez
I am really excited to see how people actually use LLMs to solve real problems. How will you use LLMs?
Timothy Chen
It's amazing to see part of the team that worked on Vicuna and LMSYS also publish a simple API to deploy and run these OSS LLMs in production. Excited to see everyone leverage this!
Suresh Ravoor
While most solutions prescribe a "rip-and-replace, forklift" strategy, Aqueduct is refreshing in its philosophy of empowering and working with your existing best-in-class ML/LLM technology choices.
Olinda Gray
Finally, ML teams can leverage the power of LLMs with a single API call, saving time on installation and configuration headaches. And with the flexibility to run LLMs on any infrastructure, Aqueduct makes ML deployment a breeze. Kudos to the Aqueduct team for simplifying the ML journey!
Ardhra CR
"Aqueduct simplifies the deployment of open-source LLMs, making it easier to leverage their power for natural language processing tasks."
Great job!
Joe Hellerstein
The next generation of AI is in all our hands, not behind hyperscaler moats. This launch lets us run our own LLMs, on-prem or in a secure cloud. Aqueduct makes it easy, using infrastructure you already understand.
Ronje Dawkins
Good stuff.
Will Cros
Easiest way to run LLMs in the cloud!
Varun Sharma
Exciting!