
Presentation material (in English) for a lab reading club on The Elements of Statistical Learning by Hastie et al.
The sections covered include:
・Properties of logistic regression compared with least-squares fitting
・Differences between logistic regression and linear discriminant analysis
・Rosenblatt's perceptron algorithm
・Derivation of the optimal separating hyperplane, which forms the basis for SVMs
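As a quick illustration of one of the topics listed above, Rosenblatt's perceptron can be sketched as follows. This is a minimal example on made-up toy data, not code from the slides; the function name and parameters are my own.

```python
import numpy as np

def perceptron(X, y, lr=1.0, epochs=100):
    """Rosenblatt's perceptron for labels y in {-1, +1}; X has shape (n, d)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            # Update only on a misclassified point: w <- w + lr*y*x, b <- b + lr*y
            if yi * (xi @ w + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
                errors += 1
        if errors == 0:  # converged: all points correctly classified
            break
    return w, b

# Linearly separable toy data (hypothetical), so convergence is guaranteed
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -1.0], [-2.0, 0.5]])
y = np.array([1, 1, -1, -1])
w, b = perceptron(X, y)
```

Because the data are linearly separable, the perceptron convergence theorem guarantees the loop terminates with a hyperplane that classifies every training point correctly (though, unlike the optimal separating hyperplane, it makes no margin guarantee).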
