Jakub Zavrel on the 100 Most Cited Papers, Top Research Topics, and the Future of Language Models, Multimodal AI, and Beyond.
Subscribe: Apple • Spotify • Stitcher • Google • AntennaPod • Podcast Addict • Amazon • RSS.
Jakub Zavrel is the Founder and CEO of Zeta Alpha, a Neural Discovery Platform that uses cutting-edge Neural Search technology to enhance the way you and your team discover, organize, and share knowledge. Our conversation focuses on the latest developments in artificial intelligence, taking inspiration from Zeta Alpha's recent viral article featuring the 100 most cited AI papers of 2022.
Interview highlights – key sections from the video version:
- The 100 most cited AI papers in 2022
- Top topics in 2022: Language Models, Multimodal Models, Alphafold
- Language Models, “chain of thought”, and reasoning
- The dominance of transformers
- Multimodal Models; Audio data
- Synthetic Data, fine-tuning, and search
- Search interface of the future
- The future of prompt engineering
- Language Models beyond English
- Scaling Laws and the emergence of Custom Large Language Models
Related content:
- A video version of this conversation is available on our YouTube channel.
- Percy Liang: Evaluating Language Models
- Dylan Patel: The Open Source Stack Unleashing a Game-Changing AI Hardware Shift
- Pablo Villalobos: Exhaustion of High-Quality Data Could Slow Down AI Progress in Coming Decades
- Roy Schwartz: Efficient Methods for Natural Language Processing
- Barret Zoph and Liam Fedus: Efficient Scaling of Language Models
- Connor Leahy and Yoav Shoham: Large Language Models
- Mark Chen of OpenAI: How DALL·E works
- Jack Clark: The 2022 AI Index
- Piotr Żelasko: The Unreasonable Effectiveness of Speech Data
- fastdup: Introducing a new free tool for curating image datasets at scale
If you enjoyed this episode, please support our work by encouraging your friends and colleagues to subscribe to our newsletter:
[Image: Large Language Models – key research topics (2019-present); via Zeta Alpha]