🦉DVC
Open source tool for data, models, & experiment versioning for ML projects. Join our stellar community https://t.co/RTCIKrZlmf for help, support and insights.

Dec 20, 2022, 6 tweets

👨🏻‍💻 Ever tried deploying a model and ended up entangled in scripts?

Here’s how MLEM lets us do “single command deployments” ⚡️:

- Deployment Options
- Setting up the Environment
- Run the command
- Getting Predictions

@Iterativeai @DVCorg
#mlem #aws #deployment

🧵[1/6]

🚀 MLEM gives us a simple and powerful API to deploy to platforms such as:

✅ @awscloud SageMaker
✅ Docker Container
✅ Heroku
✅ Kubernetes

🧵[2/6]

🌱 Setting up the Environment

Setting up the environment varies for each case, but here let's take the example of Heroku.

We can either set “HEROKU_API_KEY” environment variable or use Heroku CLI to run “heroku login”.
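The two options above look like this as a quick sketch (the token value is a placeholder; `heroku login` needs the Heroku CLI installed):

```shell
# Option 1: set the API key as an environment variable
# (replace the placeholder with your actual Heroku API token)
export HEROKU_API_KEY=<your-heroku-api-token>

# Option 2: authenticate interactively via the Heroku CLI
heroku login
```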

🧵[3/6]

🏃‍♂️ Run the command

Run “mlem deployment run heroku app.mlem --model <model> --app_name <app name>”

Voila! All Done ✅
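Filled in with placeholder names (“my-model” and “mlem-demo-app” are just examples, not real artifacts), the command reads:

```shell
# Deploy the saved MLEM model "my-model" to a Heroku app
# named "mlem-demo-app" (both names are placeholders)
mlem deployment run heroku app.mlem \
    --model my-model \
    --app_name mlem-demo-app
```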

🧵[4/6]

📊 Getting Predictions

Our model above is reachable over HTTP: we can open the URL to see the OpenAPI spec there, or send requests to get predictions.

We can also use MLEM’s built-in functionality for this: “mlem deployment apply app.mlem data.csv --json”
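Both routes sketched out below — the app URL, endpoint path, and request payload are illustrative and depend on your deployed app’s name and your model’s signature:

```shell
# Query the deployed model over plain HTTP
# (URL and payload shape are placeholders for illustration)
curl -X POST https://mlem-demo-app.herokuapp.com/predict \
    -H "Content-Type: application/json" \
    -d '{"data": [[5.1, 3.5, 1.4, 0.2]]}'

# Or let MLEM handle the request/response serialization:
mlem deployment apply app.mlem data.csv --json
```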

🧵[5/6]

😎 Go ahead, try it out yourself!

cml.dev

🧵[6/6]
