Remembering Thoughts

Charge Id - Deploying an ML.Net Model to Azure

In the previous post we built a machine learning model using ML.Net. In this post we will deploy the model to an Azure App Service and make it available via an HTTP API.

Using the output model in zip format, we can include it in our web application as an embedded resource, or simply include the file in the deployment.
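One way to include the file in the deployment is to mark it as content in the project file so it is copied on publish (a minimal sketch; the `model.zip` filename is an assumption):

    <ItemGroup>
      <!-- Copy the trained model zip to the output/publish directory -->
      <Content Include="model.zip" CopyToOutputDirectory="PreserveNewest" />
    </ItemGroup>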

To use the file from a HTTP endpoint:

  1. Include the zip file in your deployment – as an embedded resource, content file, read from blob storage, etc.
  2. Initialise the model as a singleton during application start-up, using a file path or a stream
  3. Call the model using `PredictionModel.Predict(...)`, passing the data from which to predict
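The steps above can be sketched as a singleton predictor service (a minimal sketch; the `IPredict` interface, the `PredictionInput`/`PredictionResult` types, and the `model.zip` path are assumptions, using the legacy ML.Net `PredictionModel` API):

    // Registered once at application start-up, e.g. in Startup.ConfigureServices:
    // services.AddSingleton<IPredict, Predictor>();
    public class Predictor : IPredict
    {
        private readonly PredictionModel<PredictionInput, PredictionResult> _model;

        public Predictor()
        {
            // Load the trained model from the deployed zip file exactly once.
            _model = PredictionModel.ReadAsync<PredictionInput, PredictionResult>("model.zip").Result;
        }

        public Task<PredictionResult> PredictAsync(PredictionInput input)
        {
            // Predict is synchronous in this API; wrap the result for the async controller.
            return Task.FromResult(_model.Predict(input));
        }
    }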

A sample controller is below; it logs the request and any errors using Serilog's LogContext:

    public class PredictionController : Controller
    {
        private readonly IPredict _predictor;

        public PredictionController(IPredict predictor)
        {
            _predictor = predictor;
        }

        [SwaggerResponse(HttpStatusCode.OK, typeof(string))]
        public async Task<IActionResult> Search(PredictionRequest request)
        {
            var requestId = Guid.NewGuid();
            using (LogContext.PushProperty("request", request.ToJson()))
            using (LogContext.PushProperty("requestId", requestId))
            {
                try
                {
                    var result = await _predictor.PredictAsync(request);
                    return Ok(result);
                }
                catch (Exception e)
                {
                    Log.Warning(e, "PredictionController error {request}", request.ToJson());
                    return NoContent();
                }
            }
        }
    }

Hosting our endpoint with Swagger on Azure allows us to test the inputs and see the results.



Here we hosted our model in Azure using an App Service and managed to test it via Swagger.

Hoping to make this a Function App when this issue is resolved.


