The state of privacy-preserving machine learning

The Data Exchange Podcast: Morten Dahl on TF Encrypted, federated learning, coopetitive learning, and other privacy tools for ML.


Subscribe: iTunes, Android, Spotify, Stitcher, Google, and RSS.

In this episode of the Data Exchange I speak with Morten Dahl, research scientist at Dropout Labs, a startup building a platform and tools for privacy-preserving machine learning. He is also the creator of TF Encrypted, an open source framework for encrypted machine learning in TensorFlow. The rise of privacy regulations like CCPA and GDPR, combined with the growing importance of ML, has led to strong interest in tools and techniques for privacy-preserving machine learning among researchers and practitioners. Morten brings the unique perspective of someone who understands both security and machine learning: he is a longtime security researcher who has also worked as a data scientist in industry.

Our conversation spanned many topics, including:

  • Morten’s unique background as an experienced security researcher, developer, and data scientist.
  • The current state of TF Encrypted.
  • Federated learning (FL) and secure aggregation for FL.
  • Related techniques that privacy-preserving ML solutions will likely combine, including differential privacy, homomorphic encryption, and RISELab’s stack for coopetitive learning (MC2).
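Secure aggregation, mentioned above, lets a server learn only the *sum* of client model updates, never any individual client's contribution. Below is a minimal sketch of the pairwise-masking idea behind it, assuming integer-encoded updates and pre-agreed pairwise seeds; all names and values are illustrative and not taken from TF Encrypted or any specific library:

```python
import random

MOD = 2**32  # work in a finite ring so each mask fully hides the value it covers

def mask_update(client_id, update, pairwise_seeds):
    """Add pairwise random masks that cancel when the server sums all clients.

    pairwise_seeds maps each other client's id to a seed shared only by the
    two of them (in practice agreed via a key exchange, assumed here).
    """
    masked = list(update)
    for other_id, seed in pairwise_seeds.items():
        rng = random.Random(seed)  # both parties derive the same mask stream
        for k in range(len(masked)):
            m = rng.randrange(MOD)
            # the client with the smaller id adds the mask, the other subtracts,
            # so every mask appears once with + and once with - across clients
            masked[k] = (masked[k] + (m if client_id < other_id else -m)) % MOD
    return masked

# Three clients with toy integer-encoded model updates.
updates = {0: [1, 2, 3], 1: [10, 20, 30], 2: [100, 200, 300]}

# One shared seed per unordered pair of clients (illustrative values).
seeds = {(0, 1): 42, (0, 2): 7, (1, 2): 13}

masked = {
    i: mask_update(i, updates[i],
                   {j: seeds[tuple(sorted((i, j)))] for j in updates if j != i})
    for i in updates
}

# The server only ever sees the masked vectors; the masks cancel in the sum.
total = [sum(m[k] for m in masked.values()) % MOD for k in range(3)]
print(total)  # [111, 222, 333]
```

This captures only the honest-but-curious core of the idea; production protocols (e.g. the one deployed for federated learning on phones) add key agreement, secret-shared mask recovery for dropped-out clients, and integrity checks.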

Our goal in this podcast is to build a community of people interested in data, machine learning, and AI. If you have suggestions for things we should recommend (books, conferences, links) or guests we should book, please fill out the “contact” form on this site.

Learn how data, ML & AI are converging at Strata Data & AI Conference in San Jose. Early Price ends Feb 7.

[Image by Gerd Altmann from Pixabay]