
Adlik 0.2.0 Bear Release Now Available!

By Andrew Bringaze | November 20, 2020

Adlik, an LF AI & Data Foundation Incubation-Stage Project, has released version 0.2.0, called Bear. We’re thrilled to see a release from this community, which has been hard at work over the past few months! Adlik is a toolkit for accelerating deep learning inference: it provides end-to-end support for bringing trained models into production and eases the learning curve across different inference frameworks. In Adlik, the Model Optimizer and Model Compiler deliver optimized, compiled models for a given hardware environment, and the Serving Engine provides deployment solutions for cloud, edge, and device.

In version 0.2.0, Adlik enhances existing features, improves usability, and fixes miscellaneous bugs. A few of the release highlights include the following:

New Compiler

  • Support DAG generation for end-to-end compilation of models across different representations
      ◦ Source representations: H5, CKPT, PB, PTH, ONNX, and SavedModel
      ◦ Target representations: SavedModel, OpenVINO IR, TensorRT Plan, and TfLite
  • Support model quantization for TfLite and TensorRT (see the quantization sketch after this list)
      ◦ Int8 quantization for TfLite
      ◦ Int8 and fp16 quantization for TensorRT
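
To give a feel for what the TfLite int8 path produces, here is a minimal post-training quantization sketch. It uses TensorFlow’s public TFLiteConverter rather than Adlik’s own compiler API, and the model file, input shape, and calibration data are placeholders; in a real workflow the Adlik compiler drives this kind of conversion as one stage of its compilation DAG.

```python
# Illustrative sketch only -- not the Adlik compiler API. It shows the kind of
# source -> target conversion the compiler automates: a Keras H5 model is
# post-training quantized to an int8 TFLite flatbuffer.
import numpy as np
import tensorflow as tf

# Placeholder path and input shape; substitute your own trained model.
model = tf.keras.models.load_model("resnet50.h5")

def representative_dataset():
    # Full int8 quantization needs a small calibration set; random data is
    # used here purely as a stand-in.
    for _ in range(100):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("resnet50_int8.tflite", "wb") as f:
    f.write(converter.convert())
```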

Inference Engine

  • Support hybrid scheduling of ML and DL inference jobs (a conceptual sketch follows this list)
  • Support image-based deployment of the Adlik compiler and inference engine in cloud-native environments
      ◦ Deployment and functions have been tested on Docker (v19.03.12) and Kubernetes (v1.13)
  • Support running Adlik on Raspberry Pi and Jetson Nano
  • Support the latest versions of OpenVINO (2021.1.110) and TensorFlow (2.3.1)
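
As a rough illustration of what hybrid scheduling means, the sketch below routes traditional-ML and deep-learning inference jobs to separate worker pools. The class and function names are hypothetical and this is a conceptual toy, not Adlik’s scheduler.

```python
# Conceptual sketch only: how a hybrid scheduler might route traditional-ML
# and deep-learning inference jobs to separate runtime pools. Names and
# structure are hypothetical and do not reflect Adlik's implementation.
from dataclasses import dataclass
from queue import Queue
from threading import Thread

@dataclass
class InferenceJob:
    model_name: str
    kind: str          # "ml" (e.g. tree/linear models) or "dl" (neural nets)
    payload: list

def run_ml(job: InferenceJob):
    print(f"[ML runtime] served {job.model_name}")

def run_dl(job: InferenceJob):
    print(f"[DL runtime] served {job.model_name}")

def worker(queue: Queue, handler):
    # Each runtime pool drains its own queue of inference jobs.
    while True:
        job = queue.get()
        handler(job)
        queue.task_done()

ml_queue, dl_queue = Queue(), Queue()
Thread(target=worker, args=(ml_queue, run_ml), daemon=True).start()
Thread(target=worker, args=(dl_queue, run_dl), daemon=True).start()

def schedule(job: InferenceJob):
    # The scheduler inspects the job type and dispatches it to the matching
    # runtime pool, so ML and DL workloads share a single serving engine.
    (ml_queue if job.kind == "ml" else dl_queue).put(job)

schedule(InferenceJob("churn-xgboost", "ml", [0.1, 0.2]))
schedule(InferenceJob("resnet50", "dl", [[0.0] * 224]))
ml_queue.join()
dl_queue.join()
```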

Benchmark Test

  • Support benchmark tests for models including ResNet-50, Inception V3, YOLO v3, and BERT across the devices and five runtimes supported by Adlik (a minimal latency-measurement sketch follows this list)
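
For context on what such a benchmark measures, here is a minimal latency loop over a compiled TfLite artifact using TensorFlow’s interpreter. The model path is a placeholder and this is not the Adlik benchmark harness, which also covers the other runtimes and devices.

```python
# Minimal latency sketch -- not the Adlik benchmark suite. It times repeated
# inference on a compiled TfLite artifact (the model path is a placeholder).
import time
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="resnet50_int8.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
dummy = np.zeros(inp["shape"], dtype=inp["dtype"])

# Warm up once, then average wall-clock latency over repeated runs.
interpreter.set_tensor(inp["index"], dummy)
interpreter.invoke()

runs = 100
start = time.perf_counter()
for _ in range(runs):
    interpreter.set_tensor(inp["index"], dummy)
    interpreter.invoke()
print(f"mean latency: {(time.perf_counter() - start) / runs * 1000:.2f} ms")
```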

Beyond these features, an important goal of this release is to make Adlik, and especially the Inference Engine, easier for developers to use. In addition, the Inference Engine in this release is the first lightweight inference engine to support hybrid scheduling of ML and DL inference jobs, and it has already been integrated into ZTE base station products. See the release notes here.

The Adlik team expressed a special thank you to all of the contributors for their extra hard work.

The Adlik Project invites you to adopt or upgrade to Bear, version 0.2.0, and welcomes feedback. To learn more about the Adlik 0.2.0 release, check out the full release notes. Want to get involved with Adlik? Be sure to subscribe to the Adlik-Announce and Adlik Technical-Discuss mailing lists to join the community and stay connected on the latest updates.

Congratulations to the Adlik team! We look forward to continued growth and success as part of the LF AI & Data Foundation. To learn about how to host an open source project with us, visit the LF AI & Data website.

Author

  • Andrew Bringaze

    Andrew Bringaze is the senior developer for The Linux Foundation. With over 10 years of experience, his focus is on open source code, WordPress, React, and site security.