This article discusses how you can use Incorta's built-in integration with MLflow to manage your machine learning lifecycle.
MLflow is an open-source platform that helps you keep track of your machine learning experiments by logging parameters, metrics, and even the resulting model artifacts (like pickled files or model definitions). This allows you to easily compare different runs, understand what led to better performance, and reproduce past results. The platform also provides a central model registry where you can store, version, and manage your trained models, bringing structure and organization to the often chaotic process of developing and deploying machine learning models.
In Incorta, ML model development is done in Incorta Materialized Views (MVs) or, for business users, in Incorta notebooks. The resulting ML model artifacts are stored in your Incorta tenant storage. Loading a model in an MV lets you integrate ML inference into your regular data pipeline.
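As an illustration only, here is a minimal sketch of what a train-then-score flow can look like, using plain pandas, scikit-learn, and joblib. The data, feature names, and artifact path are hypothetical; inside an actual MV, the input data comes from your Incorta tables and the artifact would live under tenant storage.

```python
# Minimal sketch of model training and inference (hypothetical data and paths).
import pandas as pd
import joblib
from sklearn.linear_model import LogisticRegression

# Hypothetical training data standing in for a table read inside an MV.
df = pd.DataFrame({
    "amount": [120.0, 15.5, 890.0, 42.0],
    "num_items": [3, 1, 12, 2],
    "is_fraud": [0, 0, 1, 0],
})

features = ["amount", "num_items"]
model = LogisticRegression().fit(df[features], df["is_fraud"])

# Persist the model artifact (in Incorta, this would sit in tenant storage).
joblib.dump(model, "/tmp/fraud_model.pkl")

# Later, in an inference MV, load the artifact and score new rows.
loaded = joblib.load("/tmp/fraud_model.pkl")
df["fraud_score"] = loaded.predict_proba(df[features])[:, 1]
```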
You can perform ML model development and inference with or without MLflow. However, when you enable the MLflow integration option from the Incorta Cloud admin console, Incorta runs an MLflow tracking server for you.
Your MV will interact with the MLflow tracking server via the MLflow API, which is a standard API from the open-source MLflow package.
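The following is a minimal sketch of how an MV or notebook might log a run to the tracking server using the standard MLflow API. The tracking URI, experiment name, model name, and parameter values are placeholders, not values provided by Incorta; in practice you would use the tracking server address from the admin console.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Point the MLflow client at the tracking server Incorta runs for you.
# The URI below is a placeholder; use the address shown on the MLflow page.
mlflow.set_tracking_uri("http://<your-mlflow-tracking-server>:5000")
mlflow.set_experiment("churn-model")  # hypothetical experiment name

# Toy data standing in for features prepared in an MV.
X, y = make_classification(n_samples=200, n_features=10, random_state=42)

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params, random_state=42).fit(X, y)

    # Log parameters, metrics, and the model artifact to the tracking server.
    mlflow.log_params(params)
    mlflow.log_metric("train_accuracy", accuracy_score(y, model.predict(X)))
    mlflow.sklearn.log_model(model, artifact_path="model",
                             registered_model_name="churn_classifier")
```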
MLflow tracking includes a metadata database that holds run and model metadata, and cloud storage for the MLflow artifacts themselves. The MLflow UI, which lets you view stored models and compare run performance, is also available.
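Once a model has been logged and registered, a downstream MV can load it back from the tracking server for batch scoring. The sketch below uses MLflow's generic pyfunc loader; the model name, version, and input columns are hypothetical, and the input rows must match the features the model was trained on.

```python
import mlflow
import mlflow.pyfunc
import pandas as pd

# Placeholder tracking URI; use the address from the admin console.
mlflow.set_tracking_uri("http://<your-mlflow-tracking-server>:5000")

# Load a model previously registered in the MLflow model registry
# (hypothetical name and version).
model = mlflow.pyfunc.load_model("models:/churn_classifier/1")

# Score a batch of rows; in an MV this would be the incoming Incorta data,
# with columns matching the model's training features.
batch = pd.DataFrame({"feature_1": [0.1, -1.2], "feature_2": [3.4, 0.0]})
predictions = model.predict(batch)
```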
You can enable MLflow from the Incorta Cloud admin console. Once it is enabled, you can access the MLflow UI page; the login information is emailed to the admin user, and the tracking server address is available on the same page.
Incorta Cloud provides the option of running your Spark job driver processes on a separate machine hosted in Incorta Cloud. When the MLflow option is enabled, Chidori mode is enabled as well, and your Spark driver processes no longer run on the Analytics Service or Loader Service nodes. The change is transparent to Incorta users, as Incorta manages this complexity for you.
Getting Started with Machine Learning in Incorta Guide
Action on Insights: Incorta for Data Scientists
Build Machine Learning Models using Incorta Materialized Views
Incorta third-party AI/ML platform Integration