
LEMNA - Explaining Deep Learning Based Security Applications

Association for Computing Machinery (ACM) via YouTube

Overview

This course introduces techniques for explaining deep learning-based security applications. Learners will examine why opaque deep learning models raise trust concerns, survey existing explanation techniques and their limitations, and study LEMNA, a local explanation method that models feature dependency to derive explanations from deep neural networks. The course also covers evaluating explanation accuracy and using explanations to build trust in target models and to troubleshoot and patch model errors. The teaching method combines theoretical explanations with demonstrations. The intended audience includes anyone interested in deep learning, security applications, and model interpretability.
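To make the idea of a local explanation concrete, here is a minimal sketch of the general approach LEMNA belongs to: perturb an input, query the black-box model on the perturbations, fit a sparse linear surrogate, and rank features by surrogate weight. The function name `explain_locally` is hypothetical, and plain Lasso stands in for LEMNA's mixture regression with fused lasso.

```python
import numpy as np
from sklearn.linear_model import Lasso

def explain_locally(predict_fn, x, n_samples=500, n_top=5, seed=0):
    """Sketch of a local explanation: perturb x, fit a sparse linear
    surrogate to the target model's outputs, rank features by weight.
    (Plain Lasso is a stand-in for LEMNA's fused-lasso mixture model.)"""
    rng = np.random.default_rng(seed)
    # Perturb the instance by randomly zeroing out features.
    mask = rng.integers(0, 2, size=(n_samples, x.size))
    samples = mask * x
    y = predict_fn(samples)  # black-box model's scores on the perturbations
    surrogate = Lasso(alpha=0.01).fit(samples, y)
    # Features with the largest surrogate weights matter most locally.
    return np.argsort(-np.abs(surrogate.coef_))[:n_top]

# Toy target model: the score depends only on features 0 and 3.
predict = lambda X: 2.0 * X[:, 0] + 1.5 * X[:, 3]
x = np.ones(6)
top = explain_locally(predict, x)  # features 0 and 3 rank highest
```

The sparse surrogate is the key design choice: it is interpretable where the deep model is not, and it only needs to be faithful in the neighborhood of the one input being explained.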

Syllabus

Intro
The Concerns of Opaque Deep Learning Model
Existing Explanation Techniques & Limitations
One Example of Model Explanation (LIME, KDD'16)
Limitation of Existing Explanation Techniques
LEMNA: Local Explanation Method using Nonlinear Approximation
Supporting Locally Non-linear Decision Boundaries
Modeling the Feature Dependency: Mixture Regression Model with Fused Lasso
Deriving an Explanation from DNN with LEMNA
Explanation Accuracy Evaluation
Demonstration of LEMNA in Identifying Binary Function Start
Building Trust in the Target Models
Troubleshooting and Patching Model Errors
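The "Explanation Accuracy Evaluation" step above can be illustrated with a simple feature-deduction check: nullify the features an explanation ranks highest and measure how much the model's score drops. A faithful explanation should cause a large drop. This is a hedged sketch, not the paper's evaluation code; `feature_deduction_test` is a hypothetical helper.

```python
import numpy as np

def feature_deduction_test(predict_fn, x, top_features):
    """Fidelity sketch: zero out the explanation's top-ranked features
    and return the resulting drop in the model's score. A faithful
    explanation removes features the model actually relies on."""
    x_ablated = x.copy()
    x_ablated[list(top_features)] = 0.0
    before = predict_fn(x[None, :])[0]
    after = predict_fn(x_ablated[None, :])[0]
    return before - after

# Toy target model: the score depends only on features 0 and 3.
predict = lambda X: 2.0 * X[:, 0] + 1.5 * X[:, 3]
x = np.ones(6)
drop_good = feature_deduction_test(predict, x, [0, 3])  # large drop
drop_bad = feature_deduction_test(predict, x, [1, 2])   # no drop
```

Comparing the drop across explanation methods gives a quantitative accuracy measure without needing ground-truth feature importance.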

Taught by

Association for Computing Machinery (ACM)
