Lecture materials by Keisuke Fukuda (PFN) for the University of Tokyo graduate course 「融合情報学特別講義Ⅲ」 (Special Lecture on Integrated Informatics III), delivered October 19, 2022.
・Introduction to Preferred Networks
・Our developments to date
・Our research & platform
・Simulation ✕ AI
- The document discusses linear regression models and methods for estimating coefficients, including ordinary least squares and regularization methods like ridge regression and lasso regression.
- It explains how lasso regression, unlike ordinary least squares and ridge regression, has the property of driving some of the coefficient estimates exactly to zero, allowing for variable selection.
- An example using crime rate data shows how lasso regression can select a more parsimonious model than other methods by setting some coefficients to zero.
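The variable-selection property described above can be sketched in a few lines. This is an illustrative example with synthetic data and scikit-learn (not the crime-rate dataset or code from the original document): only two of five features carry signal, and lasso zeroes out the irrelevant coefficients while ridge merely shrinks them.

```python
# Sketch: lasso vs. ridge coefficient shrinkage (illustrative, synthetic data).
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
# Only the first two features actually influence y.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

lasso = Lasso(alpha=0.5).fit(X, y)
ridge = Ridge(alpha=0.5).fit(X, y)

# Lasso drives irrelevant coefficients exactly to zero (variable selection);
# ridge only shrinks them toward small nonzero values.
print("lasso:", lasso.coef_.round(2))
print("ridge:", ridge.coef_.round(2))
```

The L1 penalty's corners at zero are what make exact zeros possible; the L2 penalty is smooth, so ridge coefficients are small but nonzero.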
Computing for Isogeny Kernel Problem by Groebner Basis (Yasu Math)
Today, Tani's claw-finding algorithm is the fastest known method for the isogeny kernel problem. However, Tani's algorithm does not exploit the structure of elliptic curves and isogenies. We propose a new method for computing the isogeny kernel using Vélu's formulas and Groebner bases.
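For readers unfamiliar with the algebraic machinery, a Groebner basis turns a polynomial system into an equivalent triangular system that can be solved by back-substitution. The toy example below (SymPy, a circle intersected with a line) is purely illustrative of that mechanism; it is not the isogeny-kernel system from the abstract.

```python
# Illustrative only: a generic Groebner-basis computation with SymPy,
# NOT the paper's isogeny-kernel system.
from sympy import groebner, symbols

x, y = symbols("x y")
# Toy system: a unit circle intersected with the line x = y.
G = groebner([x**2 + y**2 - 1, x - y], x, y, order="lex")

# Under lex order the basis is triangular: one polynomial in y alone,
# plus one expressing x in terms of y, so the system solves by back-substitution.
print(G.exprs)
```

In the proposed approach, the polynomial constraints would instead come from Vélu's formulas describing the isogeny, with the kernel recovered from the basis's solutions.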