The state of privacy-preserving machine learning

The Data Exchange Podcast: Morten Dahl on TF Encrypted, federated learning, coopetitive learning, and other privacy tools for ML.

Subscribe: iTunes, Android, Spotify, Stitcher, Google, and RSS.

In this episode of the Data Exchange I speak with Morten Dahl, research scientist at Dropout Labs, a startup building a platform and tools for privacy-preserving machine learning. He is also behind TF Encrypted, an open source framework for encrypted machine learning in TensorFlow. The rise of privacy regulations like the CCPA and GDPR, combined with the growing importance of ML, has led to strong interest in tools and techniques for privacy-preserving machine learning among both researchers and practitioners. Morten brings the unique perspective of someone who understands both security and machine learning: he is a longtime security researcher who has also worked as a data scientist in industry.
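To give a flavor of what "encrypted machine learning" means in practice, here is a minimal sketch of additive secret sharing, one of the core primitives behind frameworks like TF Encrypted. This is an illustration only, not TF Encrypted's actual API; the modulus and function names are hypothetical.

```python
import random

# Hypothetical large prime modulus; shares live in the field Z_Q.
Q = 2**61 - 1

def share(secret, n=3):
    """Split an integer into n additive shares modulo Q."""
    shares = [random.randrange(Q) for _ in range(n - 1)]
    # The final share is chosen so that all shares sum to the secret mod Q.
    shares.append((secret - sum(shares)) % Q)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares modulo Q."""
    return sum(shares) % Q

# Each party holds one share; no single share reveals anything about the secret.
assert reconstruct(share(42)) == 42

# Addition is done share-wise: parties locally add their shares of x and y
# to obtain shares of x + y, without ever seeing the other input.
x_shares, y_shares = share(10), share(32)
z_shares = [(a + b) % Q for a, b in zip(x_shares, y_shares)]
assert reconstruct(z_shares) == 42
```

Linear operations compose this way for free; multiplications require extra protocol machinery, which is where much of the engineering in systems like TF Encrypted goes.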

Our conversation spanned many topics, including:

  • Morten’s unique background as an experienced security researcher, developer, and data scientist.
  • The current state of TF Encrypted.
  • Federated learning (FL) and secure aggregation for FL.
  • Related techniques that privacy-preserving ML solutions will need to combine, including differential privacy, homomorphic encryption, and RISELab’s stack for coopetitive learning (MC2).
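Secure aggregation, one of the topics above, lets a federated-learning server sum client updates without seeing any individual update. The sketch below illustrates the pairwise-masking idea under simplifying assumptions (shared randomness instead of key agreement, no dropout handling); the names and modulus are illustrative, not a real protocol implementation.

```python
import random

# Each pair of clients agrees on a random mask; one adds it, the other
# subtracts it, so all masks cancel when the server sums the updates.
Q = 2**32  # illustrative modulus for the masked arithmetic

def masked_updates(updates, rng=random):
    """Return each client's update masked with pairwise-cancelling noise."""
    n = len(updates)
    masked = list(updates)
    for i in range(n):
        for j in range(i + 1, n):
            m = rng.randrange(Q)
            masked[i] = (masked[i] + m) % Q  # client i adds the pair mask
            masked[j] = (masked[j] - m) % Q  # client j subtracts the same mask
    return masked

# The server only ever sees masked values, yet their sum equals the true sum.
updates = [3, 5, 7]
total = sum(masked_updates(updates)) % Q
assert total == sum(updates) % Q
```

Production protocols additionally use key agreement to derive the pairwise masks and secret sharing to recover masks of dropped-out clients.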

Subscribe to our Newsletter:
We also publish a popular newsletter where we share highlights from recent episodes, trends in AI / machine learning / data, and a collection of recommendations.

[Image by Gerd Altmann from Pixabay]