Online Course
機器學習基石下 (Machine Learning Foundations)---Algorithmic Foundations
National Taiwan University via Coursera
Syllabus
Lecture 9: Linear Regression
- the weight vector for linear hypotheses under squared error is calculated instantly by an analytic solution (see the sketch below)
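A minimal sketch of that one-step analytic solution in NumPy; the synthetic data, dimensions, and variable names are illustrative and not from the course.

import numpy as np

# Illustrative synthetic data: N examples, d raw features, plus a constant x0 = 1 column.
rng = np.random.default_rng(0)
N, d = 100, 3
X = np.hstack([np.ones((N, 1)), rng.normal(size=(N, d))])
true_w = np.array([0.5, 1.0, -2.0, 3.0])
y = X @ true_w + 0.1 * rng.normal(size=N)      # targets with a little noise

# The squared-error minimizer in one step: w_lin = pseudo-inverse(X) @ y.
w_lin = np.linalg.pinv(X) @ y
print(w_lin)                                   # close to true_w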
Lecture 10: Logistic Regression
- gradient descent on the cross-entropy error to obtain a good logistic hypothesis (see the sketch below)
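A minimal sketch of batch gradient descent on the cross-entropy error, assuming NumPy, labels in {-1, +1}, and an illustrative fixed learning rate and step count.

import numpy as np

def logistic(s):
    return 1.0 / (1.0 + np.exp(-s))

def cross_entropy_gradient(w, X, y):
    # gradient of (1/N) * sum_n ln(1 + exp(-y_n * w.x_n))
    return np.mean((-y * logistic(-y * (X @ w)))[:, None] * X, axis=0)

def fit_logreg(X, y, eta=0.1, steps=2000):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        w -= eta * cross_entropy_gradient(w, X, y)   # fixed-rate gradient descent
    return w

# Illustrative usage on toy data with a constant x0 = 1 column.
rng = np.random.default_rng(1)
X = np.hstack([np.ones((200, 1)), rng.normal(size=(200, 2))])
y = np.sign(X @ np.array([0.2, 1.0, -1.0]) + 0.3 * rng.normal(size=200))
w = fit_logreg(X, y)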
Lecture 11: Linear Models for Classification
- binary classification via (logistic) regression; multiclass classification via OVA/OVO decomposition (see the sketch below)
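A minimal sketch of one-versus-all (OVA) decomposition; for brevity it uses linear regression on ±1 targets as the per-class binary scorer (the lecture also treats logistic regression and one-versus-one), and the data and class labels are illustrative.

import numpy as np

def fit_ova(X, y_multi, classes):
    # One binary linear scorer per class: class k (+1) versus the rest (-1).
    return {k: np.linalg.pinv(X) @ np.where(y_multi == k, 1.0, -1.0) for k in classes}

def predict_ova(X, models):
    classes = list(models)
    scores = np.column_stack([X @ models[k] for k in classes])
    return np.array(classes)[np.argmax(scores, axis=1)]   # most "class-like" score wins

# Illustrative usage: three Gaussian blobs with labels 0/1/2 and a constant x0 = 1 column.
rng = np.random.default_rng(2)
centers = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
y_multi = rng.integers(0, 3, size=300)
X = np.hstack([np.ones((300, 1)), centers[y_multi] + rng.normal(size=(300, 2))])
models = fit_ova(X, y_multi, classes=[0, 1, 2])
print(np.mean(predict_ova(X, models) == y_multi))          # training accuracy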
Lecture 12: Nonlinear Transformation
- nonlinear models via a nonlinear feature transform plus a linear model, at the price of added model complexity (see the sketch below)
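A minimal sketch of the "nonlinear transform + linear model" recipe, assuming NumPy and an illustrative 2nd-order polynomial transform of two raw features; the extra Z-space dimensions are the model-complexity price mentioned above.

import numpy as np

def phi2(X):
    # 2nd-order polynomial transform: z = (1, x1, x2, x1^2, x1*x2, x2^2)
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 ** 2, x1 * x2, x2 ** 2])

# A circular target is not linearly separable in X-space...
rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(300, 2))
y = np.sign(X[:, 0] ** 2 + X[:, 1] ** 2 - 1.0)

# ...but a linear model in Z-space learns a quadratic boundary in X-space
# (6 weights instead of 3: the price paid in model complexity).
w_tilde = np.linalg.pinv(phi2(X)) @ y
print(np.mean(np.sign(phi2(X) @ w_tilde) == y))   # training accuracy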
Lecture 13: Hazard of Overfitting
- overfitting happens with excessive model power, stochastic/deterministic noise, and limited data (see the illustration below)
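A small illustrative experiment (not from the course) combining the three ingredients in the bullet above: a high-order polynomial fit to a few noisy samples of a sine target typically drives in-sample error down while out-of-sample error explodes.

import numpy as np

rng = np.random.default_rng(4)
target = lambda x: np.sin(2 * np.pi * x)               # target outside the model: deterministic noise
x_train = rng.uniform(size=15)                         # limited data
y_train = target(x_train) + 0.2 * rng.normal(size=15)  # stochastic noise
x_test = np.linspace(0.0, 1.0, 500)
y_test = target(x_test)

for degree in (3, 12):                                 # modest vs. excessive power
    coeffs = np.polyfit(x_train, y_train, degree)
    e_in = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    e_out = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: E_in = {e_in:.4f}, E_out = {e_out:.4f}")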
Lecture 14: Regularization
- minimize the augmented error, where the added regularizer effectively limits model complexity (see the sketch below)
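A minimal sketch of minimizing the augmented error for linear regression, E_aug(w) = E_in(w) + (lambda/N) * w'w, whose minimizer has the closed form below (weight decay / ridge regression); the data and lambda values are illustrative.

import numpy as np

def ridge(X, y, lam):
    # argmin_w (1/N)||Xw - y||^2 + (lam/N) w.w  =  (X'X + lam*I)^{-1} X'y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Larger lambda shrinks the weights, effectively limiting model complexity.
rng = np.random.default_rng(5)
X = np.hstack([np.ones((30, 1)), rng.normal(size=(30, 5))])
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=30)
print(np.linalg.norm(ridge(X, y, lam=0.0)))     # plain linear regression
print(np.linalg.norm(ridge(X, y, lam=10.0)))    # visibly smaller weights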
Lecture 15: Validation
- reserve validation data (possibly in a cross-validation manner) to simulate the testing procedure for model selection (see the sketch below)
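A minimal sketch of V-fold cross-validation for model selection, assuming NumPy; it re-defines the ridge helper from the previous sketch so the snippet stands alone, and the lambda candidates, fold count, and data are illustrative.

import numpy as np

def ridge(X, y, lam):
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def cv_error(X, y, lam, folds=5, seed=0):
    # Each fold is held out once to simulate the testing procedure.
    parts = np.array_split(np.random.default_rng(seed).permutation(len(y)), folds)
    errors = []
    for held_out in parts:
        train = np.setdiff1d(np.arange(len(y)), held_out)
        w = ridge(X[train], y[train], lam)
        errors.append(np.mean((X[held_out] @ w - y[held_out]) ** 2))
    return np.mean(errors)

# Select the lambda with the smallest cross-validation error, then retrain on all data.
rng = np.random.default_rng(6)
X = np.hstack([np.ones((60, 1)), rng.normal(size=(60, 8))])
y = X @ rng.normal(size=9) + 0.3 * rng.normal(size=60)
candidates = [0.01, 0.1, 1.0, 10.0]
best_lam = min(candidates, key=lambda lam: cv_error(X, y, lam))
w_final = ridge(X, y, best_lam)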
Lecture 16: Three Learning Principles
- be aware of model complexity, data quality, and your own professionalism
Taught by
Hsuan-Tien Lin, 林軒田
Related Courses
- 機器學習技法 (Machine Learning Techniques), National Taiwan University (5.0)
- 機器學習基石上 (Machine Learning Foundations)---Mathematical Foundations, National Taiwan University (5.0)
- Applied Machine Learning in Python, University of Michigan (4.0)
- Machine Learning and AI Foundations: Classification Modeling
- Advanced Machine Learning, ITMO University (4.6)
- Machine Learning: Classification, University of Washington (4.8)
Reviews
0.0 rating, based on 0 reviews