Andrew Feldman on How Enterprises Are Building and Deploying Their Own Foundation Models.
Subscribe: Apple • Spotify • Stitcher • Google • AntennaPod • Podcast Addict • Amazon • RSS.
Andrew Feldman is CEO and co-founder of Cerebras, a startup that has released the fastest AI accelerator, built on the largest processor. We discussed Cerebras-GPT, a family of language models with sizes ranging from 111 million to 13 billion parameters that have set new benchmarks for accuracy and compute efficiency. Andrew also explains why custom foundation models are becoming more popular as enterprises seek to avoid sending their data to third parties and to build models tailored to their specific needs. These models are typically smaller than general-purpose models, yet they can be just as accurate and are much cheaper to run.
Interview highlights – key sections from the video version:
- Cerebras-GPT: A Family of Open, Compute-efficient, Large Language Models
- Why companies want to build and deploy their own custom foundation models
- Will “training foundation models from scratch” be viable?
- Open source vs proprietary foundation models
- Foundation models in China
- Evolution of Cerebras hardware and other hardware for AI
- Efficiency and lessons from computer vision
- The software stack that needs to accompany progress in hardware for AI
- Computer vision and multi-modal models

Related content:
- A video version of this conversation is available on our YouTube channel.
- Building LLM-powered Apps: What You Need to Know
- Jonas Andrulis: Building and Deploying Foundation Models for Enterprises
- Percy Liang: Evaluating Language Models
- Hagay Lupesko: Custom Foundation Models
- Jakub Zavrel: Uncovering and Highlighting AI Trends
- Raymond Perrault: 2023 AI Index
- Dylan Patel: The Open Source Stack Unleashing a Game-Changing AI Hardware Shift
- Pablo Villalobos: Exhaustion of High-Quality Data Could Slow Down AI Progress in Coming Decades
- Roy Schwartz: Efficient Methods for Natural Language Processing
If you enjoyed this episode, please support our work by encouraging your friends and colleagues to subscribe to our newsletter: