Overview
This course teaches learners how to use TensorFlow Lite for on-device machine learning inference on mobile and embedded devices. By the end of the course, students will be able to deploy TensorFlow Lite models, optimize their performance, and implement machine learning solutions across a range of platforms. Topics include TensorFlow Lite models, the support library, performance optimization techniques, and applications on microcontrollers and Arduino. The material is presented by industry experts and includes discussions of performance optimization and real-world applications. The course is intended for developers, data scientists, and machine learning enthusiasts who want to run machine learning models on resource-constrained devices.
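As a taste of the workflow the course covers, the sketch below converts a tiny Keras model to the TensorFlow Lite format and runs inference through the TFLite Interpreter, the same API used for on-device deployment. The model itself is a hypothetical stand-in, not one from the course.

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in model: 4 features -> 2-class softmax.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])

# Convert the Keras model to a TensorFlow Lite flatbuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Load the flatbuffer into the TFLite Interpreter and run inference,
# mirroring what happens on a mobile or embedded device.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

x = np.random.rand(1, 4).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], x)
interpreter.invoke()
probs = interpreter.get_tensor(output_details[0]["index"])
print(probs.shape)  # one row of class probabilities
```

On devices, the same flatbuffer is loaded by the platform-specific TFLite runtime (Java/Kotlin on Android, Swift on iOS, C++ on microcontrollers) rather than the Python API shown here.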
Syllabus
Intro
Slow Motion
MobileBERT
TensorFlow Lite Models
Support Library
Performance
Techniques
Op Coverage
Selective Registration
Microcontrollers
Interpreter
Arduino
Applications
Person Detection
Conclusion
Taught by
TensorFlow