Adlik, an LF AI Foundation Incubation-Stage Project, has released version 0.1.0. We're thrilled to see a release from this community, which has been hard at work over the past few months! Adlik is a toolkit for accelerating deep learning inference. It provides end-to-end support for bringing trained models into production and eases the learning curve across different kinds of inference frameworks. In Adlik, the Model Optimizer and Model Compiler deliver optimized, compiled models for a given hardware environment, and the Serving Engine provides deployment solutions for cloud, edge, and device.
In version 0.1.0, Adlik enhances features, improves usability, and fixes miscellaneous bugs. A few of the release highlights include the following:
- Model Compiler
  - A new framework that is easy to extend and maintain
  - Compilation of models trained in Keras, TensorFlow, and PyTorch for better execution on CPU/GPU
- Model Optimizer
  - Multi-node, multi-GPU training and pruning
  - Configurable filter pruning to produce smaller inference models
  - Small-batch dataset quantization for TF-Lite and TF-TRT
- Inference Engine
  - Management of multiple models and multiple versions
  - HTTP/gRPC interfaces for the inference service
  - Runtime scheduler that supports scheduling of multiple model instances
  - Integration of multiple DL inference runtimes, including TensorFlow Serving, OpenVINO, TensorRT, and TF Lite
  - Integration of dlib to support an ML runtime
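Since the Serving Engine integrates TensorFlow Serving as one of its runtimes, a deployed model can be queried over the standard TensorFlow-Serving-style REST predict endpoint. The sketch below is illustrative only: the model name, version, port, and input shape are assumptions, not values from the release.

```python
import json

def build_predict_request(model_name, inputs, version=None,
                          host="localhost", port=8501):
    """Build the URL and JSON body for a TF-Serving-style REST predict call.

    The endpoint layout follows the TensorFlow Serving REST API:
    POST /v1/models/{model}[/versions/{version}]:predict
    with a JSON body of the form {"instances": [...]}.
    """
    version_part = f"/versions/{version}" if version is not None else ""
    url = f"http://{host}:{port}/v1/models/{model_name}{version_part}:predict"
    body = json.dumps({"instances": inputs})
    return url, body

# Hypothetical example: one flattened 28x28 input for a model named "mnist".
url, body = build_predict_request("mnist", [[0.0] * 784], version=1)
print(url)  # -> http://localhost:8501/v1/models/mnist/versions/1:predict
```

The returned URL and body could then be sent with any HTTP client; omitting `version` lets the serving runtime pick the latest available version of the model, which pairs naturally with the multi-version management listed above.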
This release also contains a Benchmark Test Framework for DL models, which enables standardized benchmarking of model performance in the same hardware environment across the different runtimes supported by Adlik. In this framework, the whole testing pipeline is executed automatically using a containerized solution.
The Adlik team extends a special thank-you to contributors from ZTE, China Mobile, and China Unicom for their extra hard work.
The Adlik Project invites you to adopt or upgrade to version 0.1.0, and welcomes feedback. To learn more about the Adlik 0.1.0 release, check out the full release notes. Want to get involved with Adlik? Be sure to subscribe to the Adlik-Announce and Adlik Technical-Discuss mailing lists to join the community and stay up to date on the latest news.
Congratulations to the Adlik team! We look forward to continued growth and success as part of the LF AI Foundation. To learn about hosting an open source project with us, visit the LF AI Foundation website.
Adlik Key Links
LF AI Resources