
Saving 95% of Your Edge Power with Sparsity to Enable TinyML

tinyML via YouTube

Overview

This course explains how exploiting sparsity can cut edge power consumption by up to 95%, enabling tiny machine learning (tinyML). Learners will come away understanding the main types of sparsity (temporal, spatial, connectivity, and activation) and how each can be exploited to reduce the computation required for low-latency, low-power edge processing. The course also introduces the GrAI Core architecture and its event-based processing paradigm, which is designed to maximize the use of sparsity. Delivered as a webcast presentation, it emphasizes practical applications and implications for tinyML tasks, and is aimed at anyone interested in edge processing, tinyML, and reducing the power cost of ML inference at the edge.
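
To make the two central ideas concrete, here is a minimal Python sketch (not GrAI's actual implementation; the function names and the threshold parameter are illustrative assumptions) showing activation sparsity, where multiply-accumulates are skipped for zero activations, and temporal (event-based) sparsity, where only inputs that changed since the last time step are processed:

    import numpy as np

    # Activation sparsity: ReLU layers zero out many activations, so a
    # sparsity-aware compute loop can skip those entries entirely.
    def sparse_dense_layer(activations, weights):
        """Compute weights @ activations, touching only nonzero activations."""
        out = np.zeros(weights.shape[0])
        for j in np.flatnonzero(activations):   # work scales with sparsity
            out += weights[:, j] * activations[j]
        return out

    # Temporal (event-based) sparsity: emit work only for inputs that
    # changed meaningfully since the previous frame, which is where the
    # savings come from on slowly-changing sensor streams.
    def delta_events(prev_frame, frame, threshold=0.01):
        """Yield (index, new_value) events for entries that changed."""
        diff = np.abs(frame - prev_frame)
        for idx in np.flatnonzero(diff > threshold):
            yield idx, frame.flat[idx]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        acts = np.maximum(rng.normal(size=256), 0.0)  # ReLU -> ~50% zeros
        w = rng.normal(size=(64, 256))
        assert np.allclose(w @ acts, sparse_dense_layer(acts, w))

        f0 = rng.random(1024)
        f1 = f0.copy()
        f1[:10] += 0.5                                # only 10 values change
        events = list(delta_events(f0, f1))
        print(f"{len(events)} events for 1024 inputs -> "
              f"{100 * (1 - len(events) / 1024):.1f}% of work skipped")

In both cases the arithmetic performed is proportional to the number of nonzero (or changed) values rather than to the layer or frame size, which is the mechanism behind the power savings the talk describes.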

Syllabus

Intro
About Jon Tapson
Edge workloads are different
Edge data is massive
Speech waveforms
What is sparsity
Deep neural networks
Fanout
Basic CNN
Typical gains
Neural Network Accelerator
How it works
Events
Use cases
Software stack
Runtime support
Sparsity performance
Summary
Questions
Conclusion
Edge Impulse
Sponsor
Next talk
Thanks

Taught by

tinyML

