These are the slides for a talk at Nagoya University on December 8, 2014. The slides cover some history of neural networks: the perceptron, error backpropagation, the neocognitron, and convolutional networks. I also discuss a few topics on sparse coding in deep architectures.
Appropriate Mesh Density for the Optical Simulation of a Silver Nanoparticle (kagikenco)
Abstract: Electromagnetic analysis of a metal nanoparticle or nanostructure makes it possible to quantify the concentration of the localized enhanced electric field and to investigate its optical properties. However, few papers examine the proper mesh density for this kind of analysis. In this presentation, the appropriate mesh density is clarified for the optical simulation of a silver nanoparticle.
Scan Registration for Autonomous Mining Vehicles Using 3D-NDT (Kitsukawa Yuki)
Presentation slides for a paper introduction at a lab seminar.
Magnusson, M., Lilienthal, A. and Duckett, T. (2007), Scan registration for autonomous mining vehicles using 3D-NDT. J. Field Robotics, 24: 803–827. doi: 10.1002/rob.20204
2018/8/27, IEE Japan Technical Meeting on "Systems"
English title: A Majorization-Minimization-Based Kalman Filter with Hyperbolic Secant Measurement Noise
Authors: H. Tanji, T. Murakami, H. Kamata
Institution: Meiji University
Presented in Technical Meeting on "Systems", IEE Japan
Structural Data Analysis Based on Multilayer Networks (tm1966)
An introduction to data analysis based on multilayer networks (in Japanese). References to tools, datasets, conferences, and Web sites are also given.
The document provides a course calendar for a class on Bayesian estimation methods. It lists the dates and topics for 15 class periods from September to January. The topics progress from basic concepts such as Bayes estimation and the Kalman filter to more modern methods such as particle filters, hidden Markov models, and Bayesian decision theory, along with applications of principal component analysis and independent component analysis. One date is noted as having no class.
2012 MDSP PR12: k-means and Mixture of Gaussians (nozomuhamada)
The document provides the course calendar and lecture plan for a machine learning course. The course calendar lists the class dates and topics to be covered from September to January, including Bayes estimation, Kalman filters, particle filters, hidden Markov models, Bayesian decision theory, principal component analysis, and clustering algorithms. The lecture plan focuses on clustering methods, including k-means clustering, mixtures of Gaussians models, and using the expectation-maximization (EM) algorithm to estimate the parameters of Gaussian mixture models.
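The alternating two-step structure described above can be illustrated with a minimal k-means sketch; the toy data, the farthest-point initialization, and all parameters are my own illustration, not material from the course:

```python
import numpy as np

def kmeans(X, k, n_iter=100):
    """Plain k-means: alternate assignment and centroid-update steps."""
    # Deterministic farthest-point initialization (a simple k-means++ variant)
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(n_iter):
        # Assignment step: attach each point to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: move each centroid to the mean of its points
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers, labels

# Two well-separated 2-D blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
centers, labels = kmeans(X, 2)
```

EM for a Gaussian mixture generalizes the same two steps: the hard assignment becomes a posterior responsibility (E-step), and the centroid update becomes weighted mean, covariance, and mixing-weight estimates (M-step).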
2012 MDSP PR11: ICA Part 2, Face Recognition (nozomuhamada)
The document describes using independent component analysis (ICA) for face recognition. ICA is applied to a data matrix of face images to extract statistically independent basis images that represent local facial features. These basis images can then be used as a feature vector to identify faces. Specifically, ICA is applied to a training set of 425 face images to extract 25 statistically independent component basis images. These basis images provide local facial features that can be used to represent faces for recognition.
This document provides a summary of the course contents for an Independent Component Analysis (ICA) class.
1. The class covers the basics of ICA, including problem formulation, whitening using Principal Component Analysis, and measuring non-Gaussianity.
2. Key topics include linear mixing models, assumptions of source signal independence and non-Gaussianity, whitening observed signals to obtain independent components, and maximizing non-Gaussianity measures to separate the sources.
3. Non-Gaussian source examples and Gaussian limitations are discussed. Kurtosis is introduced as a classical measure of non-Gaussianity to formulate ICA as an optimization problem.
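The kurtosis-based formulation can be sketched end to end on toy data: mix two sub-Gaussian (uniform) sources, whiten with PCA, then search over rotations of the whitened signals for the projection with maximal |kurtosis|. The mixing matrix and all numbers below are illustrative assumptions, not taken from the class:

```python
import numpy as np

def kurtosis(y):
    """Excess kurtosis of a (roughly) zero-mean, unit-variance signal."""
    return np.mean(y**4) - 3.0

rng = np.random.default_rng(0)
# Two independent uniform sources (sub-Gaussian, unit variance)
S = rng.uniform(-np.sqrt(3), np.sqrt(3), (2, 5000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])      # "unknown" mixing matrix
X = A @ S

# Whitening via PCA: after this step E[zz^T] ≈ I
X = X - X.mean(axis=1, keepdims=True)
vals, vecs = np.linalg.eigh(np.cov(X))
Z = np.diag(vals**-0.5) @ vecs.T @ X

# After whitening, demixing reduces to a rotation; scan the angle
# and keep the projection with the largest |kurtosis|
angles = np.linspace(0, np.pi, 180)
best = max(angles,
           key=lambda t: abs(kurtosis(np.cos(t)*Z[0] + np.sin(t)*Z[1])))
y = np.cos(best)*Z[0] + np.sin(best)*Z[1]
```

Up to sign and permutation, `y` should correlate strongly with one of the original sources; practical algorithms such as FastICA replace the angle scan with a fixed-point iteration.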
This document contains the course calendar for a machine learning course covering topics such as Bayesian estimation, Kalman filters, particle filters, hidden Markov models, Bayesian decision theory, principal component analysis, independent component analysis, and clustering algorithms. The calendar lists 15 classes over the semester, the topics covered in each class, and any dates with no class. It also includes lecture plans and slides on principal component analysis, linear discriminant analysis, and a comparison of PCA and LDA.
This document provides a course calendar for a machine learning course with the following contents:
- The course covers topics like Bayesian estimation, Kalman filters, particle filters, hidden Markov models, Bayesian decision theory, principal component analysis, independent component analysis, and clustering algorithms over 13 classes between September and January.
- One lecture plan discusses nonparametric density estimation approaches like histogram density estimation, kernel density estimation, and k-nearest neighbor density estimation. It also covers cross-validation techniques.
- Another document section provides an example of applying kernel density estimation and k-nearest neighbor classification to automatically sort fish based on lightness, including discussing training and test phase classification. It compares different bandwidths and values of k.
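The density-based classification in that example can be sketched as follows; the "lightness" numbers, class names, and bandwidth are made up for illustration and are not the course's data:

```python
import numpy as np

def kde(x, samples, h):
    """1-D Gaussian kernel density estimate with bandwidth h."""
    u = (x - samples[:, None]) / h
    return np.mean(np.exp(-0.5 * u**2) / (h * np.sqrt(2 * np.pi)), axis=0)

# Hypothetical lightness readings for two fish classes
rng = np.random.default_rng(0)
salmon = rng.normal(3.0, 0.5, 200)
sea_bass = rng.normal(6.0, 0.7, 200)

def classify(x, h=0.4):
    # Equal priors assumed: pick the class with the higher estimated density
    return np.where(kde(x, salmon, h) > kde(x, sea_bass, h),
                    "salmon", "sea_bass")
```

A small bandwidth `h` yields a spiky, overfit density; a large one oversmooths — which is exactly the trade-off the bandwidth comparison in the slides explores, and what cross-validation is used to tune.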
1. The document outlines the course calendar for a Bayesian estimation course, with topics including Bayes estimation, Kalman filters, particle filters, hidden Markov models, Bayesian decision theory, and applications of principal component analysis and independent component analysis.
2. The lecture on Bayesian decision theory introduces classification and decision problems, Bayes' decision theory, discriminant functions, and the Gaussian case. It discusses classifying observations to categories based on loss functions and conditional risk to minimize overall risk.
3. Bayesian decision theory aims to assign observations to categories to minimize the expected loss. It considers prior probabilities, likelihood functions, posterior probabilities, and loss functions to derive decision rules that minimize risk or probability of error.
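A minimal numeric sketch of that decision rule, assuming two classes with known Gaussian likelihoods and a 0-1 loss (all numbers are illustrative, not from the lecture):

```python
import numpy as np

def gauss(x, mu, sigma):
    """Gaussian likelihood p(x | class)."""
    return np.exp(-0.5 * ((x - mu) / sigma)**2) / (sigma * np.sqrt(2 * np.pi))

priors = np.array([0.7, 0.3])                 # prior probabilities P(class)
mus, sigmas = np.array([0.0, 3.0]), np.array([1.0, 1.0])

# Loss matrix L[i, j]: cost of deciding class i when the truth is class j
L = np.array([[0.0, 1.0],
              [1.0, 0.0]])                    # 0-1 loss

def decide(x):
    # Posterior ∝ likelihood × prior; conditional risk R(i|x) = Σ_j L[i,j] P(j|x)
    post = gauss(x, mus, sigmas) * priors
    post /= post.sum()
    risk = L @ post
    return risk.argmin()                      # decision minimizing conditional risk
```

With a 0-1 loss, minimizing the conditional risk reduces to picking the maximum-posterior class; an asymmetric loss matrix shifts the decision boundary toward avoiding the costlier error.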
The document provides details on a course calendar and lecture plan for hidden Markov models (HMM).
1) The course calendar covers topics like Bayesian estimation, Kalman filters, particle filters, hidden Markov models, supervised learning, and clustering algorithms over 14 weeks.
2) The HMM lecture plan introduces discrete-time HMMs and their applications. It covers the three main problems of HMMs - evaluation, decoding, and learning. Evaluation calculates the probability of an output sequence, decoding finds the most probable hidden state sequence, and learning estimates model parameters from training data.
3) The trellis diagram and forward algorithm are described for solving the evaluation problem, while the Viterbi and forward-backward algorithms are mentioned for the decoding and learning problems.
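The forward recursion over the trellis can be sketched for a toy two-state HMM; the transition, emission, and initial probabilities below are assumed for illustration:

```python
import numpy as np

# Toy HMM: 2 hidden states, 2 output symbols (parameters are illustrative)
A  = np.array([[0.7, 0.3], [0.4, 0.6]])   # A[i, j] = P(state j | state i)
B  = np.array([[0.9, 0.1], [0.2, 0.8]])   # B[i, k] = P(symbol k | state i)
pi = np.array([0.5, 0.5])                 # initial state distribution

def forward(obs):
    """Evaluation problem: P(observation sequence) via the forward algorithm."""
    alpha = pi * B[:, obs[0]]             # alpha_1(i) = pi_i * b_i(o_1)
    for o in obs[1:]:
        # alpha_t(j) = [Σ_i alpha_{t-1}(i) a_ij] * b_j(o_t)
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()                    # sum over final states
```

Replacing the sum over previous states with a max (and keeping back-pointers) turns the same recursion into the Viterbi algorithm for the decoding problem.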
This document provides a course calendar and lecture plans for topics related to Bayesian estimation methods. The course calendar lists 12 class dates from September to December covering topics like Bayes estimation, Kalman filters, particle filters, hidden Markov models, supervised learning, and clustering algorithms. One lecture plan provides details on the hidden Markov model, including the introduction, definition of HMMs, and problems of evaluation, decoding, and learning. Another lecture plan covers particle filters, including the sequential importance sampling algorithm, choice of proposal density, and the particle filter algorithm of sampling, weight update, resampling, and state estimation.
This document provides a summary of a lecture on simulation-based Bayesian estimation methods, specifically particle filters. It begins by explaining why simulation-based methods are needed for nonlinear and non-Gaussian problems where analytical solutions are not possible. It then discusses Monte Carlo sampling methods including historical examples, Monte Carlo integration to approximate integrals, and importance sampling to generate samples from a target distribution. The key steps of importance sampling are outlined.
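Those steps can be sketched for a simple case: estimating the mean of an N(2, 1) target by drawing from a broad N(0, 3²) proposal and reweighting. The target, proposal, and sample count are my own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def p(x):
    """Unnormalized target density, proportional to N(2, 1)."""
    return np.exp(-0.5 * (x - 2.0)**2)

# Step 1: draw samples from an easy-to-sample proposal q = N(0, 3^2)
xs = rng.normal(0.0, 3.0, 100_000)
q = np.exp(-0.5 * (xs / 3.0)**2) / (3.0 * np.sqrt(2 * np.pi))

# Step 2: importance weights w = p(x) / q(x)
w = p(xs) / q

# Step 3: self-normalized estimate of E_p[X] (normalizing constant cancels)
est = np.sum(w * xs) / np.sum(w)
```

The self-normalized form is the reason the target only needs to be known up to a constant, which is exactly what sequential importance sampling exploits inside the particle filter.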
1) The document outlines the key steps and equations of the Kalman filter algorithm for optimal state estimation.
2) It describes the Kalman filter as a recursive algorithm that uses a system's dynamics model and noisy measurements to produce optimal estimates of unknown variables.
3) The algorithm involves two main steps - prediction using the system model to produce an estimate, and correction using new measurements to update the estimate.
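The two-step predict/correct cycle can be sketched in the simplest scalar case, tracking a constant value through noisy measurements; the model and noise values are illustrative assumptions, not from the lecture:

```python
import numpy as np

def kalman_1d(zs, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for x_k = x_{k-1} + w_k, z_k = x_k + v_k."""
    x, p, out = x0, p0, []
    for z in zs:
        # Prediction: propagate the estimate; process noise inflates variance
        p = p + q
        # Correction: Kalman gain blends prediction and new measurement
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1 - k) * p
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(0)
zs = 5.0 + rng.normal(0, 0.5, 200)   # noisy measurements of the constant 5.0
est = kalman_1d(zs)
```

Each pass first inflates the error variance (prediction), then shrinks the estimate toward the measurement in proportion to the gain (correction); as confidence grows, the gain falls and new measurements move the estimate less.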
This document outlines the course calendar for a class on Bayesian estimation methods. It includes 12 topics that will be covered from September to December such as Bayes estimation, Kalman filtering, particle filtering, hidden Markov models, supervised learning, PCA, ICA, and clustering algorithms. It also provides notations and facts about key probability concepts like probability density functions, random variables, mean, variance, covariance, conditional probability, and error evaluation.
This document outlines a course on multi-dimensional signal processing and pattern recognition. The course is taught in the fall semester and covers topics such as Bayesian signal processing, machine learning, pattern recognition, and applications. It includes the course calendar, prerequisites, references, and an overview of the two parts of the course: Bayesian signal processing techniques and pattern recognition methods. Grading is based on homework assignments and a final report, and lecture notes will be available online.
2. Norbert Wiener
(1894–1964)
In scientific work, it is not enough for the researcher simply to be able to solve the problem he is given. He must examine the problem he has solved from every angle and discover what problem it was that he actually solved.
— from "I Am a Mathematician"
3. From Art (craft) to Science
Norbert Wiener
> Circuit synthesis theory
> Linear prediction and optimal filtering
> Mathematical theory of communication and information; coding
> Feedback control theory