Machine Learning Model Observability

Oren Razon on key aspects of model observability, MLOps, and CI/CD for machine learning.


Subscribe: Apple • Android • Spotify • Stitcher • Google • AntennaPod • RSS.

Oren Razon is CEO and co-founder of Superwise[1], a startup that builds tools to streamline observability for machine learning models. This episode provides a comprehensive overview of tools and best practices for deploying, monitoring, and managing machine learning models in production.

Some of the topics we covered include:

  • The difference between monitoring and observability tools, particularly in the context of machine learning.
  • Key features of a model observability solution.
  • Learning from best practices, tools, and processes used in traditional software engineering (CI/CD; deployment modes like canary and shadow; unit and integration testing).
  • Responsible AI.
  • The impact of broader trends on MLOps and model observability (e.g., multimodal models, reinforcement learning, large/foundation models).

Oren Razon on the Difference between Model Monitoring and Model Observability solutions:

Highlights in the video version:

Related content:


[1] This post and episode are part of a collaboration between Gradient Flow and Superwise. See our statement of editorial independence.

[Image: Key Features of a Model Observability solution, from Superwise, used with permission.]