Charge Id - Deploying an ML.Net Model to Azure
In the previous post we built a machine learning model using ML.Net; in this post we deploy that model to an Azure App Service and expose it via an HTTP API.
The trained model is exported as a zip file ('vita-model-1.zip'). We can include it in our web application as an embedded resource, or simply ship the file as part of the deployment.
To use the model from an HTTP endpoint:
- Include the zip file in your deployment - as an embedded resource, as content, read from blob storage, etc.
- Initialise the model as a singleton during application start-up, from either a file path or a stream.
- Call the model using PredictionModel.Predict() with the data you want to predict on (a sketch of this wiring follows the list).
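As a rough sketch of that wiring, assuming the legacy (0.x) PredictionModel API mentioned above: the ChargeData, ChargePrediction and Predictor names, the PredictionRequest fields and the request-to-input mapping are illustrative assumptions only - the real input/output schema must match the training pipeline from the previous post.

// Sketch only: ChargeData, ChargePrediction, Predictor and the PredictionRequest fields
// are assumed names - the real schema must match the trained pipeline.
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.ML; // ML.Net 0.x - the exact namespace depends on the package version

public class ChargeData { public string Description { get; set; } }          // model input
public class ChargePrediction { public string PredictedLabel { get; set; } } // model output
public class PredictionRequest { public string Description { get; set; } }

public interface IPredict
{
    Task<ChargePrediction> PredictAsync(PredictionRequest request);
}

public class Predictor : IPredict
{
    private readonly PredictionModel<ChargeData, ChargePrediction> _model;

    public Predictor(PredictionModel<ChargeData, ChargePrediction> model)
    {
        _model = model;
    }

    public Task<ChargePrediction> PredictAsync(PredictionRequest request)
    {
        // Map the incoming HTTP request onto the model's input columns (assumed mapping)
        var input = new ChargeData { Description = request.Description };
        return Task.FromResult(_model.Predict(input));
    }
}

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Load the model zip once and share it for the lifetime of the application;
        // per the steps above, a stream can be used instead of a file path, e.g. when
        // the model ships as an embedded resource.
        services.AddSingleton<IPredict>(_ =>
        {
            var model = PredictionModel
                .ReadAsync<ChargeData, ChargePrediction>("vita-model-1.zip")
                .GetAwaiter().GetResult();
            return new Predictor(model);
        });

        services.AddMvc();
    }
}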
The controller sample below calls the predictor and logs each request to Logz.io via Serilog:
[Produces("application/json")]
[Route("[controller]")]
public class PredictionController : Controller
{
    private readonly IPredict _predictor;

    public PredictionController(IPredict predictor)
    {
        _predictor = predictor;
    }

    [HttpPost("predict")]
    [SwaggerResponse(HttpStatusCode.OK, typeof(string))]
    public async Task<IActionResult> Search(PredictionRequest request)
    {
        Guard.AgainstNull(request);

        var requestId = Guid.NewGuid();

        // Attach the request payload and a correlation id to every log event (Serilog)
        using (LogContext.PushProperty("request", request.ToJson()))
        using (LogContext.PushProperty("requestId", requestId))
        {
            try
            {
                // Delegate to the singleton predictor wrapping the ML.Net model
                var result = await _predictor.PredictAsync(request);
                return Ok(result);
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
                Log.Warning(e, "PredictionController error {request}", request.ToJson());
                return NoContent();
            }
        }
    }
}
Hosting the endpoint on Azure with Swagger enabled lets us submit test inputs and inspect the prediction results.
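Outside of Swagger, the endpoint can be called with any HTTP client. A minimal client sketch is below - the host name and JSON payload shape are assumptions; the route /prediction/predict follows from the controller attributes above.

// Sketch only - the host name and payload shape are assumptions.
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class PredictionClientSample
{
    public static async Task Main()
    {
        using (var client = new HttpClient { BaseAddress = new Uri("https://my-app.azurewebsites.net/") })
        {
            var json = "{\"description\":\"sample charge description\"}";
            var content = new StringContent(json, Encoding.UTF8, "application/json");

            // [Route("[controller]")] + [HttpPost("predict")] => POST /prediction/predict
            var response = await client.PostAsync("prediction/predict", content);
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}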
Conclusion
Here we hosted our model in Azure using an App Service and tested it via Swagger.
The plan is to move this to an Azure Function App once this issue is resolved: https://github.com/dotnet/machinelearning/issues/569
Code
https://github.com/chrismckelt/vita