

Lessons Learned from Evaluating the Robustness of Defenses to Adversarial Examples

USENIX via YouTube

Overview

This course examines how to evaluate the robustness of defenses against adversarial examples for machine learning classifiers. Topics include understanding adversarial examples, defense mechanisms such as adversarial training and input transformation, and methods for assessing whether a defense actually works. The talk surveys past defense breaks, offers recommendations for thorough defense evaluations, and compares evaluation practice in machine learning security with that of other security fields. The intended audience includes researchers, practitioners, and students interested in machine learning security and adversarial attacks.
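
For orientation, here is a minimal sketch of the Fast Gradient Sign Method (FGSM), one of the attacks listed in the syllabus below, written in PyTorch. The classifier model, input batch x, and labels y are illustrative assumptions, not material from the talk.

    # Minimal FGSM sketch (illustrative only).
    # Assumes a PyTorch classifier `model`, an input batch `x` with pixel
    # values in [0, 1], and integer class labels `y`.
    import torch
    import torch.nn.functional as F

    def fgsm_attack(model, x, y, epsilon=0.03):
        """Perturb x by epsilon in the direction of the loss gradient's sign."""
        x = x.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(x), y)    # loss on the clean inputs
        loss.backward()                        # gradient of the loss w.r.t. x
        x_adv = x + epsilon * x.grad.sign()    # one signed gradient step
        return x_adv.clamp(0.0, 1.0).detach()  # keep pixels in the valid range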

Syllabus

Introduction
Adversarial Examples
Why Care
What are Defenses
Adversarial Training
Thermometer Encoding
Input Transformation
Evaluating the Robustness
Why are defenses easily broken
Lessons Learned
Adversary Training
Empty Set
Evaluating Adversely
Actionable Advice
Evaluation
Holding Out Data
FGSM
Gradient Descent
No Bounds
Random Classification
Negative Things
Evaluate Against the Worst Attack
Accuracy vs Distortion
Verification
Gradient Free
Random Noise
Conclusion
AES 1997
Attack success rates in insecurity
Why are we not yet crypto
How much we can prove
Still a lot of work to do
L2 Distortion
We don't know what we want
We don't have that today
Summary
Questions
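
The syllabus item "Evaluate Against the Worst Attack" refers to reporting per-example robustness against the strongest attack in a suite, rather than averaging over attacks. A minimal sketch of that measurement, assuming a PyTorch model, a data loader yielding (x, y) batches, and a list of attack callables, might look like this:

    # Hedged sketch of worst-case evaluation. `model`, `loader`, and the
    # attack callables (signature attack(model, x, y) -> x_adv) are
    # illustrative assumptions, not the speaker's code.
    import torch

    def worst_case_accuracy(model, loader, attacks):
        """An example counts as robust only if it resists every attack."""
        correct, total = 0, 0
        for x, y in loader:
            robust = torch.ones(y.shape[0], dtype=torch.bool)
            for attack in attacks:          # e.g., FGSM, PGD, a gradient-free attack
                x_adv = attack(model, x, y)
                with torch.no_grad():
                    pred = model(x_adv).argmax(dim=1)
                robust &= pred.eq(y)        # must survive every attack tried
            correct += int(robust.sum())
            total += y.shape[0]
        return correct / total

The key design choice is the per-example AND across attacks: a defense gets credit only for inputs that survive every attack in the suite.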

Taught by

USENIX

