The Rise of Custom Foundation Models

Andrew Feldman on How Enterprises Are Building and Deploying Their Own Foundation Models.


Subscribe: Apple • Spotify • Stitcher • Google • AntennaPod • Podcast Addict • Amazon • RSS.

Andrew Feldman is CEO and co-founder of Cerebras, a startup that has released the fastest AI accelerator, built on the largest processor ever made. We discussed Cerebras-GPT, a family of language models that set new benchmarks for accuracy and compute efficiency, with sizes ranging from 111 million to 13 billion parameters. Andrew also explains why custom foundation models are gaining popularity as enterprises seek to avoid sending their data to third parties and to build models tailored to their specific needs. These models are typically smaller than general-purpose models, yet they can be just as accurate, and they are much cheaper to run.

Subscribe to the Gradient Flow Newsletter

Interview highlights – key sections from the video version:

Andrew Feldman will be speaking at the AI Conference in San Francisco (Sep 26-27). Use the discount code FriendsofBen18 to save 18% on your registration.



Related content:

If you enjoyed this episode, please support our work by encouraging your friends and colleagues to subscribe to our newsletter: