GoogleCloudPlatform/vertex-ai-spark-ml-serving
Serving Spark ML models using Vertex AI

This repository contains the companion code for Serving Spark ML models using Vertex AI. The code shows you how to serve (run) online predictions from Spark MLlib models using Vertex AI.

The code allows you to build a custom container image for serving predictions with Vertex AI. The custom container uses MLeap to serve a Spark MLlib model that has been exported to an MLeap Bundle (the MLeap serialization format). The MLeap execution engine and serialization format support low-latency inference without a dependency on Spark.

See the MLeap documentation for information on exporting Spark MLlib models to MLeap Bundles.
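As a rough sketch of that export step, the snippet below shows how a fitted PySpark `PipelineModel` is typically serialized to an MLeap Bundle. The function and variable names are illustrative, not taken from this repository, and it assumes the `mleap` PySpark package is installed alongside Spark:

```python
# Sketch: exporting a fitted Spark MLlib pipeline to an MLeap Bundle (.zip).
# Assumes `pip install mleap` in the Spark environment; `model`, `sample_df`,
# and `zip_path` are placeholders supplied by the caller.

def mleap_bundle_uri(zip_path):
    """MLeap's serializeToBundle expects a jar:file: URI to the output .zip."""
    return "jar:file:" + zip_path

def export_to_mleap(model, sample_df, zip_path):
    """Serialize a fitted PipelineModel (and its input schema) to a bundle."""
    # Importing mleap.pyspark patches serializeToBundle onto Spark Transformers.
    import mleap.pyspark  # noqa: F401
    # The DataFrame argument lets MLeap record the schema in the bundle.
    model.serializeToBundle(mleap_bundle_uri(zip_path), sample_df)
```

The resulting `.zip` bundle is what the custom container loads at serving time, so no Spark cluster is needed for inference.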

How to use this example

Use the tutorial to understand how to:

  1. Serve predictions from an example model that is included with the tutorial. The example model was trained on the Iris dataset and then exported from Spark MLlib to an MLeap Bundle.

  2. Configure the custom container image to serve predictions from your own models.
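Once the container is deployed to a Vertex AI endpoint, online prediction requests use the standard `{"instances": [...]}` envelope. A minimal sketch of building such a request body, assuming Iris-style feature names (which are illustrative, not taken from this repository):

```python
# Sketch: building the JSON body for a Vertex AI online prediction request.
# Vertex AI custom containers receive inputs wrapped in an "instances" array;
# the feature names below are placeholders for an Iris-style model.
import json

def prediction_request(rows):
    """Wrap feature dicts in the {"instances": [...]} envelope Vertex AI expects."""
    return json.dumps({"instances": rows})

body = prediction_request([
    {"sepal_length": 5.1, "sepal_width": 3.5,
     "petal_length": 1.4, "petal_width": 0.2},
])
```

The serving container returns a matching `{"predictions": [...]}` response for each instance sent.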

License

Apache License, Version 2.0
