Databricks announced the launch of Databricks Model Serving, bringing simplified production machine learning (ML) natively to the Databricks Lakehouse Platform. Model Serving removes the complexity of building and maintaining infrastructure for intelligent applications. Organizations can use the Lakehouse Platform to integrate real-time machine learning systems across their business, from personalized recommendations to customer service chatbots, without configuring and managing the underlying infrastructure. Deep integration with the Lakehouse Platform provides data and model lineage, governance, and monitoring throughout the ML lifecycle, from experimentation to training to production. Databricks Model Serving is now generally available on AWS and Azure. Capabilities include:

  • Feature Store: Provides automated online feature lookups to prevent online/offline skew. Define features once during model training, and Databricks automatically retrieves and joins the relevant features at inference time (see the first sketch after this list).
  • MLflow Integration: Connects natively to the MLflow Model Registry, enabling easy deployment of models. Once a registered model is provided, Databricks automatically prepares a production-ready container for model deployment (see the second sketch below).
  • Unified Data Governance: Manage and govern all data and ML assets with Unity Catalog, including those consumed and produced by model serving.
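
A minimal sketch of the Feature Store flow, assuming the `databricks-feature-store` client available in a Databricks notebook: features are defined once in a feature table, joined onto training labels by key, and the lookup metadata is packaged with the logged model so Model Serving can fetch the same features online. The table, column, and model names below are illustrative assumptions, not part of the announcement.

```python
# Hypothetical sketch: train against Feature Store lookups and log the model
# with lookup metadata attached. Table/column/model names are assumptions.
import mlflow
from databricks.feature_store import FeatureStoreClient, FeatureLookup
from sklearn.linear_model import LogisticRegression

fs = FeatureStoreClient()

# `spark` is the ambient SparkSession in a Databricks notebook.
labels_df = spark.createDataFrame([(1, 1), (2, 0)], ["user_id", "clicked"])

# Declare which features to join onto the training labels by key.
feature_lookups = [
    FeatureLookup(
        table_name="ml.recsys.user_features",  # assumed feature table
        lookup_key="user_id",
        feature_names=["avg_session_minutes", "purchases_30d"],
    )
]

# Build the training set; the same lookups are replayed online at serving time.
training_set = fs.create_training_set(
    labels_df,
    feature_lookups=feature_lookups,
    label="clicked",
)
train_pdf = training_set.load_df().toPandas()

model = LogisticRegression().fit(
    train_pdf[["avg_session_minutes", "purchases_30d"]],
    train_pdf["clicked"],
)

# Logging through the Feature Store client records the feature lineage with
# the model so Model Serving can perform the online lookups automatically.
fs.log_model(
    model,
    artifact_path="model",
    flavor=mlflow.sklearn,
    training_set=training_set,
    registered_model_name="click_model",  # assumed registry name
)
```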

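A second sketch of the MLflow integration: a model is logged and registered with MLflow's standard API, then exposed through a serving endpoint. The endpoint name, model name, and workload settings are illustrative assumptions, and the REST path and payload follow the public Databricks serving-endpoints API; verify them against current documentation before relying on them.

```python
# Hypothetical sketch: register a model in the MLflow Model Registry, then
# create a Databricks Model Serving endpoint for it via the REST API.
import os

import mlflow
import requests
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Train a toy model and register it in the MLflow Model Registry.
X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(n_estimators=50).fit(X, y)

with mlflow.start_run():
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="iris_classifier",  # assumed registry name
    )

# Create a serving endpoint for the registered model. The payload shape
# (served_models, workload_size, scale_to_zero_enabled) is an assumption
# based on the public serving-endpoints API.
host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

payload = {
    "name": "iris-endpoint",  # assumed endpoint name
    "config": {
        "served_models": [
            {
                "model_name": "iris_classifier",
                "model_version": "1",
                "workload_size": "Small",
                "scale_to_zero_enabled": True,
            }
        ]
    },
}

resp = requests.post(
    f"{host}/api/2.0/serving-endpoints",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())
```
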
https://www.databricks.com