Using MLOps

Adding a New Project

Projects allow you to group multiple experiments in a single location, for example by user or by experiment type. Adding projects is easy.

  1. Click the Add project button.

  2. Specify a name for the project.

  3. Click Add. In this example, we’re creating a project called Credit card test. Upon completion, the new project will appear in the list of projects in MLOps.

  4. Open the Driverless AI Projects page and refresh the page. The new project will now be visible in Driverless AI.

Add a project

Exporting Experiments from Driverless AI into MLOps

This section assumes that you have experiments in a Driverless AI project to export into MLOps. Refer to the Driverless AI User Guide for information on how to run an experiment.

  1. In Driverless AI, navigate to the Projects page. By default, this page shows all available projects, including the project just created.

  2. Select the new project, and link your experiment to the project. If necessary, create an experiment first, and then link it. (Refer to Linking Experiments.)

  3. Return to MLOps and verify that the default-payment-experiment experiment was exported into the Credit card test Project.

Link experiment

Importing Models from H2O-3 into MLOps

Regression and classification models from H2O-3 can be imported into MLOps.

  1. Select a project and click Add Model.

  2. Specify an H2O-3 MOJO model to import. Drag and drop a file into the window or select one from your machine’s file system.

  3. Enter a name for your model and click OK to confirm and upload the model.

Note: H2O-3 binary models cannot be imported into MLOps.

Import an H2O-3 model

Deploying Models

In MLOps, models can be deployed to Dev and/or Production environments.

  1. Select the project that includes the model you want to deploy. In this example, we will deploy the default-payment-experiment model from the Credit card test Project.

  2. Click the Actions dropdown in the Models table and specify where you want to deploy this model. In this example, we will deploy to Dev.

  3. A confirmation page displays. Click Confirm to continue the deployment or Cancel to return to the Project without deploying.

MLOps will begin the deployment, and the State column will show that the deployment is Launching. The state changes to Healthy upon completion.

Deploying a model

After a model has been deployed, you will be able to view details about the deployment, monitor the deployment, view a sample cURL request, and copy the endpoint URL.

Note: You can undeploy by clicking the Deploy dropdown, de-selecting the current deployment, then confirming the request.

Creating a Champion/Challenger Deployment

MLOps allows you to continuously compare your chosen best model (Champion) to a Challenger model with Champion/Challenger deployments. To set a Challenger model, navigate to the Models section and click Actions > Challenger for … next to the model you want to use as a Challenger. In the pop-up window that appears, select a Champion model and an environment type, then click Deploy.

Champion/Challenger Deployment

Creating an A/B Test Deployment

A/B testing in MLOps allows you to compare the performance of two or more models. When requests are sent to an A/B deployment, they are directed to the selected models at a specified ratio known as the traffic percentage. For example, if the deployment consists of two models and the traffic percentage is set to 50% / 50%, each model will receive half of the total incoming requests.

To create an A/B deployment, navigate to the Models section, select two or more models, and then click A/B Test. In the pop-up window that appears, the upper half lists the models you selected, while the lower half lets you add more models to the A/B deployment if any are still available. You can also remove models from the selection by clicking them. Once you have selected the models you want to use, specify the traffic percentage for each model, then choose an environment type and click Deploy.

Note: The traffic percentage of the last model in the deployment autocompletes so that the overall percentage adds up to 100.
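The effect of the traffic percentage can be illustrated with a short simulation. This is a hypothetical sketch of weighted request routing, not MLOps internals; the model names and request count below are made up:

```python
import random

def route_request(models, weights, rng):
    """Pick a model for one incoming request according to the traffic split."""
    return rng.choices(models, weights=weights, k=1)[0]

# Hypothetical A/B deployment: two models at a 50% / 50% traffic percentage.
models = ["model_a", "model_b"]
weights = [50, 50]

rng = random.Random(42)  # fixed seed so the simulation is repeatable
counts = {m: 0 for m in models}
for _ in range(10_000):
    counts[route_request(models, weights, rng)] += 1

print(counts)  # each model receives roughly half of the 10,000 requests
```

With a 75% / 25% split instead, the weights would be `[75, 25]` and the first model would receive about three quarters of the requests.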

A/B Deployment

Deployment Details

In the Deployments section, click on the Actions > More Details link to view summary information about the deployment along with any alerts. From this page, you can also click on the Model Endpoint link to open the scoring service, and you can view or copy a sample cURL request for scoring.

More deployment details

Monitoring Deployments

Driverless AI + MLOps can monitor models for drift, anomalies, and residuals, providing more traceability and governance of models. The alerts provided on the dashboard can help you determine whether to re-tune or re-train models.

In the Deployments section, click on Actions > Monitoring to open a Grafana dashboard and view metrics for the deployment. (Refer to the Grafana documentation for more information.)

By default, the Monitoring page shows:

  • Alerts

  • Column types

  • Drift detection

  • Scoring latency (Populated only after the scoring service is started. See Copy Endpoint URL for an example.)

  • Scatterplots, histograms, and heatmaps for input columns and output columns

Note: You can also navigate to this page by clicking on an alert.

Monitoring deployments

Custom Business Metrics

In addition to the default panels, you can create custom business metrics for this page. Click the Add Panel button in the upper-right menu to add a new panel, then choose whether to create a new query or a new visualization. (Refer to the Grafana documentation for more information.)

Add panel

Score Model

  1. In the Deployments section, click Actions > Show sample request to view the sample cURL request.

  2. Copy the sample cURL request.

  3. Open a Terminal window and paste the sample cURL request to view the model scoring information.
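The sample cURL request issues an HTTP POST with a JSON payload of column names and rows. The sketch below shows what such a payload might look like; the endpoint URL and column names are placeholders (copy the real values from your own deployment), and the payload shape is an assumption based on a typical scoring request:

```python
import json

# Placeholder endpoint; copy the real one from Actions > Copy endpoint URL.
endpoint = "https://mlops.example.com/<deployment-id>/model/score"

# Hypothetical scoring payload: column names paired with rows of string values.
payload = {
    "fields": ["LIMIT_BAL", "AGE"],            # made-up input columns
    "rows": [["20000", "24"], ["120000", "26"]],
}

body = json.dumps(payload)
print(body)

# Sending it requires a live deployment; from a Terminal it would resemble:
#   curl -X POST -H "Content-Type: application/json" -d '<body>' <endpoint>
```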

Score model

Copy Endpoint URL

  1. In the Deployments section, click Actions > Copy endpoint URL to retrieve the model endpoint.

  2. Paste the URL into a browser to launch the scoring service.

Upon completion, the Scoring Latency graph on the Actions > Monitoring page begins to populate with data.

Scoring Latency

Sharing Projects

MLOps makes it easy to share your projects with other users in your MLOps environment, enabling collaboration among multiple users. This can be done in either Driverless AI or MLOps.

Share Project in Driverless AI

  1. On the Driverless AI Projects page, click the More Actions button beside the project that you want to share, then click Share.

  2. In the User dropdown, select the user that you want to share this project with.

  3. Optionally click the Restrict to dropdown to specify whether this collaborator is restricted to read or write access.

  4. Click Add to share the project.

Upon completion, the user will be added to the list of collaborators. To remove a collaborator, click the Delete button.

Note: You cannot currently edit collaborator privileges. If, for example, a collaborator only has read privileges and you want to grant write privileges, you have to delete and then re-add the collaborator.

Share project in Driverless AI

Share Project in MLOps

  1. Click the Share project button in the upper-right corner of the project you want to share.

  2. In the search field, specify the MLOps user that you want to share this project with.

  3. Click the Share with button to share the project. The new user will then be added to the list of collaborators.

  4. Optionally click the Actions button to restrict the user to reader or writer access, or to remove the user from the list of collaborators.

Share project in MLOps