Elastic Deep Learning (EDL), an LF AI Foundation Incubation-Stage Project, has released version 0.3.0. Congratulations to the EDL community for all the hard work that made this release possible!
EDL is a framework that can dynamically adjust the parallelism (the number of training workers) of a deep neural network training job. It supports multi-tenant cluster management, balancing job completion time against job waiting time and making the most of otherwise idle resources. The project contains the EDL framework itself and applications built on it, such as knowledge distillation and neural architecture search (NAS).
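The core idea behind elastic training can be illustrated with a short sketch. The Python below is a hypothetical illustration, not EDL's actual API: the live worker set is re-read between epochs (names like `discover_workers` are placeholders for a real service-discovery lookup), and the data is re-sharded to match, so the job keeps running as workers join or leave.

```python
# Hypothetical sketch of elastic data-parallel training: the worker count is
# re-read between epochs and the dataset is re-sharded to match. Names such
# as `discover_workers` are illustrative; EDL's real API differs.

def discover_workers():
    # Stand-in for a service-discovery lookup (e.g. against a registry);
    # returns the IDs of the workers currently alive.
    return ["worker-0", "worker-1", "worker-2"]

def shard(dataset, rank, world_size):
    # Give each worker an interleaved slice of the dataset.
    return dataset[rank::world_size]

def train_epoch(my_rank, dataset):
    workers = discover_workers()   # parallelism may have changed
    world_size = len(workers)
    for sample in shard(dataset, my_rank, world_size):
        pass  # forward/backward/allreduce step would go here

if __name__ == "__main__":
    data = list(range(100))
    for epoch in range(3):
        train_epoch(my_rank=0, dataset=data)
```

The point of the sketch is only that, as long as sharding is recomputed from the live worker set, the number of workers can change between epochs without breaking training.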
Major features of the EDL 0.3.0 release include:
- Support for elastic training alongside inference-type services during training, such as the teacher models used in knowledge distillation.
- Automatic registration of inference-type services through EDL's service discovery (see the sketch after this list).
- Knowledge distillation examples in computer vision and natural language processing.
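The following Python sketch shows the pattern these features enable: a student training job discovers a separately deployed teacher service and trains against its soft labels. It is a minimal, self-contained illustration; `lookup_teachers` and `query_teacher` are hypothetical stand-ins for EDL's real service-discovery and RPC layers, and the loss is the classic temperature-softened soft-label cross-entropy.

```python
import math

# Hedged sketch of the knowledge-distillation pattern: a student queries a
# teacher inference service for soft labels and trains against them.
# `lookup_teachers` and `query_teacher` are hypothetical placeholders.

def lookup_teachers(service_name):
    # Teacher services register themselves and students find them via
    # service discovery; here we just return a fixed endpoint.
    return ["teacher-host:9000"]

def query_teacher(endpoint, batch):
    # Stand-in for an RPC to the teacher returning logits per sample.
    return [[2.0, 0.5, 0.1] for _ in batch]

def softmax(logits, temperature=1.0):
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distill_loss(student_logits, teacher_logits, temperature=2.0):
    # Cross-entropy between temperature-softened teacher and student
    # distributions (the classic soft-label loss).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

if __name__ == "__main__":
    teachers = lookup_teachers("resnet50-teacher")
    batch = ["img-0", "img-1"]
    teacher_out = query_teacher(teachers[0], batch)
    student_out = [[1.0, 0.8, 0.2], [0.5, 1.5, 0.3]]
    for s, t in zip(student_out, teacher_out):
        print(f"soft-label loss: {distill_loss(s, t):.4f}")
```

Because the teacher runs as a separate service rather than inside the training process, the two sides can be scaled and scheduled independently, which is what makes the elastic setup useful for distillation.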
A later release will also automatically adjust the number of teacher instances based on traffic.
The EDL Project invites you to adopt or upgrade to version 0.3.0, and welcomes feedback. To learn more about the EDL 0.3.0 release, check out the full release notes. Want to get involved with EDL? Be sure to subscribe to the EDL-Announce and EDL Technical-Discuss mailing lists to join the community and stay connected on the latest updates.
Congratulations to the EDL team! We look forward to continued growth and success as part of the LF AI Foundation. To learn about hosting an open source project with us, visit the LF AI Foundation website.
Elastic Deep Learning Key Links
LF AI Resources
- Learn about membership opportunities
- Explore the interactive landscape
- Check out our technical projects
- Join us at upcoming events
- Read the latest announcements on the blog
- Subscribe to the mailing lists
- Follow us on Twitter or LinkedIn
- Access other resources on LF AI’s GitHub or Wiki