Deploy Preset Models
Learn how to deploy preset models
Last updated 10th January, 2020.
Deploying models is the main feature of ML Serving. This guide provides step-by-step instructions on how to deploy a preset model.
Inside your ML Serving namespace page there is a dedicated tab for managing models: Models.
You can start the deployment of a new model by clicking the Deploy Model button.
In this guide, we explain how to deploy a preset model, so select Preset model and click Next.
Then choose which preset model to deploy. Select it and click Next.
Next, give your model a name. That name identifies your model among the others in your namespace. Once you have chosen a name, click the Next button.
A model is composed of one or several running instances. These instances are automatically scaled by ML Serving depending on the input load of your model. This step allows you to configure the minimum and maximum number of instances that you want to run.
During the beta phase, the auto-scaling feature is not customizable and we reserve the right to remove unused models.
Each model instance is assigned a CPU & RAM flavor. You can choose the desired flavor from a list of existing ones.
During the beta phase, only one type of instance is available and it is free of charge. Additional flavors will be created to fit specific needs.
The ML Serving sequentially performs the following tasks:

- Building
- Pending

When everything is up and running, the build status shows Deployed and the API status shows Running. The URL where your model can be reached is also displayed, so you can start sending requests to it.