Deploy deep learning workloads to production with Azure Machine Learning
Deploying large-scale models for real-time inferencing is challenging because of their size. Learn which strategies and frameworks you can use to optimize your model's performance during model scoring.
In this module, you'll learn how to:
- Choose the appropriate inference strategy
- Optimize model scoring with ONNX
- Deploy Triton as a managed online endpoint
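As a preview of the ONNX piece, here is a minimal sketch of exporting a trained PyTorch model to ONNX and scoring it with ONNX Runtime. The network, file name, and input shape below are illustrative placeholders, not values from the module.

```python
# Minimal sketch: export a PyTorch model to ONNX and score it with ONNX Runtime.
# The network, file name, and shapes here are illustrative placeholders.
import numpy as np
import torch
import onnxruntime as ort

# A tiny stand-in for a trained deep learning model.
model = torch.nn.Sequential(torch.nn.Linear(10, 2), torch.nn.Softmax(dim=1))
model.eval()

# Export the model graph to the ONNX format.
dummy_input = torch.randn(1, 10)
torch.onnx.export(
    model, dummy_input, "model.onnx",
    input_names=["input"], output_names=["output"],
)

# Load the ONNX model with ONNX Runtime and run inference (model scoring).
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
sample = np.random.randn(1, 10).astype(np.float32)
scores = session.run(None, {"input": sample})[0]
print(scores)
```

ONNX Runtime applies graph-level optimizations when the session is created, which is one way to reduce scoring latency for large models.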
Prerequisites
Before starting this module, you should be familiar with the Azure Machine Learning service and with training machine learning and deep learning models.