Malte Pietsch on the power of Haystack, and common usage and design patterns in LLM applications.
Subscribe: Apple • Spotify • Overcast • Google • AntennaPod • Podcast Addict • Amazon • RSS.
Malte Pietsch is co-founder & CTO of Deepset, the company behind the popular open source project Haystack, an orchestration framework for LLMs that I’ve been using for the last few months. Haystack lets you focus on crafting your pipeline and computation graph, wiring together components like LLMs, vector databases, and document stores. Its unified, user-friendly interface makes assembling these components straightforward, which lowers the barrier to building complex systems.
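To make the pipeline/computation-graph idea concrete, here is a minimal plain-Python sketch of the pattern such orchestration frameworks use. This is not Haystack's actual API; every name below (`Pipeline`, `add_component`, `connect`, the toy `retriever` and `generator`) is an illustrative stand-in.

```python
# Minimal sketch of the pipeline/computation-graph pattern behind
# orchestration frameworks like Haystack. NOT Haystack's real API;
# all names here are illustrative stand-ins.

class Pipeline:
    def __init__(self):
        self.components = {}   # name -> callable component
        self.edges = []        # (src_name, dst_name) wiring

    def add_component(self, name, component):
        self.components[name] = component

    def connect(self, src, dst):
        self.edges.append((src, dst))

    def run(self, data):
        # Walk the wiring in insertion order, feeding each component's
        # output into the next (a linear graph, for simplicity).
        order = [self.edges[0][0]] + [dst for _, dst in self.edges]
        for name in order:
            data = self.components[name](data)
        return data

# Toy stand-ins for a retriever and an LLM generation step.
def retriever(query):
    docs = {"haystack": ["Haystack orchestrates LLM pipelines."]}
    return {"query": query,
            "documents": docs.get(query.split()[-1].lower(), [])}

def generator(inputs):
    context = " ".join(inputs["documents"])
    return f"Answer based on: {context}"

pipe = Pipeline()
pipe.add_component("retriever", retriever)
pipe.add_component("generator", generator)
pipe.connect("retriever", "generator")
print(pipe.run("what is haystack"))
# → Answer based on: Haystack orchestrates LLM pipelines.
```

The point of the pattern is separation of concerns: each component only sees its inputs and outputs, so swapping a vector database or an LLM means reconnecting one node rather than rewriting the application.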
Interview highlights – key sections from the video version:
- Orchestration in the context of AI and LLM apps
- Deepset Cloud vs. Haystack open source project
- Haystack usage patterns: RAG and other apps
- Retrieval Augmented Generation (RAG)
- Tuning RAG requires experiments at scale
- RAG evaluation metrics
- Hallucination
- Information extraction
- Data quality
- Streaming and (near) real-time
- Vector Databases
- Haystack 2.0
Related content:
- A video version of this conversation is available on our YouTube channel.
- Best Practices in Retrieval Augmented Generation
- Philipp Moritz and Goku Mohandas: Navigating the Nuances of Retrieval Augmented Generation
- Building a Fleet of Custom LLMs
- Ivy: Streamlining AI Model Deployment and Development
- Brian Raymond: ETL for LLMs
- Jerry Liu: An Open Source Data Framework for LLMs
- Emil Eifrem: The Future of Graph Databases
If you enjoyed this episode, please support our work by encouraging your friends and colleagues to subscribe to our newsletter: