Logging In and Navigating the UI
Your admin will provide you with the URL for logging in to MLOps. Authentication is done via OpenID Connect.
After logging in, the MLOps home page opens. The home page lists the projects available in your environment. Click a project name to view the project dashboard, which shows a project summary along with the most recent deployments and models in the project. Click the Alerts, Deployments, Models, or Events link within a project to narrow the view to that category of items for the project.
Alerts are available for deployed models. These alerts report the current state of a deployed model and flag drift, anomalies, and residuals in the data. For example, an alert may include messages that some column values are “drifting” (out of normal boundaries).
The Deployments page shows all deployments for a specific project. The information displayed on this page includes:

- Deployed on date
- Alerts for drift, anomalies, and residuals
- Actions link for:
  - More details: Includes a summary of the deployment and a list of alerts
  - Monitoring: Opens a Grafana dashboard with scatterplots, histograms, and heatmaps for each feature in the experiment
  - Show Sample Request: Lets you view and copy a sample request that you can run in a terminal to retrieve scoring results
  - Copy Endpoint URL: Copies the URL of the endpoint running the scoring server. Once requests are sent to this endpoint, the Scoring Latency table on the Monitoring page begins to populate.
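Sending a request to the copied endpoint URL can be sketched as follows. This is a minimal illustration only: the endpoint URL and the column names (`age`, `income`) are hypothetical placeholders, and the exact payload shape for your deployment is the one shown by the Show Sample Request action.

```python
import json
import urllib.request

# Hypothetical endpoint URL -- use the real one from "Copy Endpoint URL".
ENDPOINT_URL = "https://mlops.example.com/model/score"

# Assumed payload shape: column names plus one or more rows of values.
# Confirm the actual shape via the "Show Sample Request" action.
payload = {
    "fields": ["age", "income"],
    "rows": [["42", "55000"]],
}

# Build a POST request carrying the JSON-encoded payload.
request = urllib.request.Request(
    ENDPOINT_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment to score against a live deployment:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response))
```

Each request sent this way counts toward the scoring metrics, so the Scoring Latency table on the Monitoring page starts to populate once the endpoint receives traffic.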
The Models page shows all models that have been exported from Driverless AI into this project. The information displayed on this page includes:

- Scorer used during the Driverless AI experiment
- Actions link for:
  - Deploying to dev and/or production
  - Downloading the experiment Autoreport
A report file (Autoreport) is included in every successful Driverless AI experiment. When the experiment is linked to MLOps, the Autoreport is available from the Models page. This report provides insight into the training data and any detected shifts in distribution, the validation schema selected, model parameter tuning, feature evolution, and the final set of features chosen during the experiment. The report is available in Microsoft Word format.
Click here to download and view a sample experiment report in Word format.
The Events page provides a list of all events that have occurred in the project, such as the project creation and tagging/untagging of deployments.