ML Serving - Serving ML Models

Stable Version: https://badge.fury.io/py/mlserving.svg
Source Code: https://github.com/orlevii/mlserving

mlserving is a framework for developing real-time model-inference services.

It allows you to easily set up an inference endpoint for your ML model.

mlserving emphasizes high performance and allows easy integration with other model servers such as TensorFlow Serving.
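
For example, a minimal endpoint might look like the following sketch. The ServingApp and RESTPredictor names, the pre_process/predict hooks, and the add_inference_handler call are assumptions about the library's API; consult the project documentation for the exact interface.

    from mlserving import ServingApp
    from mlserving.predictors import RESTPredictor

    class MyPredictor(RESTPredictor):
        def pre_process(self, input_data, req):
            # Turn the raw JSON payload into model features
            # (feature names here are hypothetical).
            return [input_data['feature_a'], input_data['feature_b']]

        def predict(self, features, req):
            # Replace with a call to your real model,
            # e.g. self.model.predict(features).
            return {'score': sum(features)}

    app = ServingApp()
    # Hypothetical route; choose whatever path fits your service.
    app.add_inference_handler('/api/v1/predict', MyPredictor())
    app.run()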

Motivation

Data scientists usually struggle with integrating their ML models into production.

mlserving is here to make the development of model servers easy for everyone.