This document contains contact information for several researchers from the Machine Perception and Robotics Group at Chubu University in Japan, including professors, lecturers, and research assistants. It lists their names, titles, contact details such as phone numbers and email addresses, and web links for the group's website. The group is part of the Department of Robotics Science and Technology or Department of Computer Science within the College of Engineering at Chubu University.
A casual talk about object detection disguised as an explanation of You Only Look One-level Feature (Yusuke Uchida)
Presentation slides for the 7th All-Japan Computer Vision Study Group "CVPR2021 Reading Session" (Part 1).
https://kantocv.connpass.com/event/216701/
Explains You Only Look One-level Feature, together with broader discussion of the YOLO family and related object detection methods.
Layout impact of resolution enhancement in design for manufacturing (DFM) in ... (Kumar Goud)
Abstract: As VLSI technology scales to 65 nm and below, close communication between design and manufacturing becomes more and more important. Gone are the days when designers simply passed the GDSII design file to the fab and expected perfect manufacturability and parametric yield. This is largely due to the big challenges in the manufacturing stage as the feature size continues to shrink. Thus, the concept of DFM (Design for Manufacturing) is becoming very popular. Even though there is no universally accepted definition of DFM, one of its main elements is to bring manufacturing information into the design stage in a form that designers can understand, so that they can act on it to improve both manufacturability and parametric yield. This work presents several attempts to close the gap between the design and manufacturing communities: Alt-PSM-aware standard cell design, printability improvement for detailed routing, and an ASIC design flow with litho-aware static timing analysis. Experimental results show that these techniques greatly improve the manufacturability of the designs and significantly reduce design pessimism for easier design closure.
Keywords: Layout, Cell, PSM, OAI, RSM, RET, SRAF, Optimization
A Novel Blind SR Method to Improve the Spatial Resolution of Real Life Video ... (IRJET Journal)
This document proposes a novel blind super resolution method to improve the spatial resolution of real-life video sequences. The key aspects of the proposed method are:
1) It estimates blur without knowing the point spread function or noise statistics using a non-uniform interpolation super resolution method and multi-scale processing.
2) It uses a cost function with fidelity and regularization terms of a Huber-Markov random field to preserve edges and fine details in the reconstructed high resolution frames.
3) It performs masking to suppress artifacts from inaccurate motions, adaptively weighting the fidelity term at each iteration for faster convergence.
The method is tested on real-life videos with complex motions, objects, and brightness changes, showing
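The fidelity-plus-regularization cost described in points 2) and 3) can be sketched in a toy 1-D form. All names, the 2x decimation degradation model, and the default parameter values below are illustrative assumptions, not details from the paper:

```python
def huber(x, delta=1.0):
    """Huber penalty: quadratic near zero, linear in the tails (edge-preserving)."""
    a = abs(x)
    return 0.5 * a * a if a <= delta else delta * (a - 0.5 * delta)

def sr_cost(hr, lr_obs, downsample, lam=0.1, delta=1.0, weights=None):
    """Toy 1-D version of a fidelity + Huber-regularized SR cost.
    hr: candidate high-res signal (list of floats)
    lr_obs: observed low-res signal
    downsample: function simulating the degradation (hr -> low-res)
    weights: optional per-sample fidelity weights (the adaptive masking idea)."""
    sim = downsample(hr)
    if weights is None:
        weights = [1.0] * len(lr_obs)
    # Fidelity: weighted squared error between simulated and observed low-res
    fidelity = sum(w * (s - o) ** 2 for w, s, o in zip(weights, sim, lr_obs))
    # Huber-Markov prior on first differences: smooths noise, keeps edges
    regularity = sum(huber(hr[i + 1] - hr[i], delta) for i in range(len(hr) - 1))
    return fidelity + lam * regularity

# 2x decimation as the toy degradation model
down2 = lambda x: x[::2]
```

The `weights` argument corresponds to the adaptive weighting of the fidelity term: samples affected by inaccurate motion estimates would get small weights so they do not dominate the reconstruction.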
Realtime face matching and gender prediction based on deep learning (IJECEIAES)
Face analysis is an essential topic in computer vision that deals with human faces for recognition or prediction tasks. The face is one of the easiest ways to distinguish a person's identity. Face recognition is a type of personal identification system that employs a person's traits to determine their identity. A human face recognition scheme generally consists of four steps: face detection, alignment, representation, and verification. In this paper, we propose to extract information from the human face for several tasks based on a recent advanced deep learning framework. The proposed approach outperforms state-of-the-art results.
Despeckling of SAR Image using Curvelet Transform (IRJET Journal)
This document presents a method for reducing speckle noise in synthetic aperture radar (SAR) images using the curvelet transform. SAR images are affected by speckle noise during image capture and transmission. The curvelet transform is used to decompose the SAR image into different scales and orientations. Thresholding is applied to the curvelet coefficients to remove coefficients corresponding to noise. The inverse curvelet transform is then applied to reconstruct the denoised image. Experimental results on SAR images show that the proposed curvelet-based method achieves higher peak signal-to-noise ratio and lower mean squared error than conventional filters, indicating it more effectively removes noise while preserving image detail.
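The thresholding step at the heart of such transform-domain despeckling is simple coefficient shrinkage. Below is a minimal sketch of soft thresholding applied to a flat list of coefficients; the curvelet transform itself requires a dedicated library (e.g. CurveLab), so only the shrinkage rule is shown:

```python
def soft_threshold(coeffs, t):
    """Shrink transform coefficients toward zero: small (noise-dominated)
    coefficients are zeroed, large (signal) coefficients are reduced by t."""
    out = []
    for c in coeffs:
        if c > t:
            out.append(c - t)
        elif c < -t:
            out.append(c + t)
        else:
            out.append(0.0)
    return out
```

After shrinking the coefficients at each scale and orientation, the inverse transform reconstructs the denoised image.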
IRJET - Human Eye Pupil Detection Technique using Center of Gravity Method (IRJET Journal)
This document presents a pupil detection technique using the center of gravity method. It first applies Gaussian filtering, double thresholding, and morphological closing to an eye image to isolate the pupil region. It then uses the center of gravity method to calculate the x and y coordinates of the pupil center by dividing the total pixel values in each dimension by the total number of black pixels. Experimental results on the CASIA iris database demonstrate the accuracy of the proposed computationally efficient pupil detection method. A hardware implementation of the technique is also presented, which could be used for real-time iris localization in biometric recognition applications.
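The center-of-gravity step can be sketched directly; the binary-image convention below (1 = pupil pixel after thresholding and morphological closing) is an assumption for illustration:

```python
def pupil_center(binary):
    """Center of gravity of the pupil pixels in a binary image.
    binary: 2-D list of 0/1 values, 1 = pupil pixel.
    Returns (x, y): coordinate sums divided by the pixel count."""
    xs = ys = n = 0
    for y, row in enumerate(binary):
        for x, v in enumerate(row):
            if v:
                xs += x
                ys += y
                n += 1
    if n == 0:
        raise ValueError("no pupil pixels found")
    return xs / n, ys / n
```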
IEEE projects 2012-2013 - Digital Image Processing (K Sundaresh Ka)
This summarizes an academic paper that proposes an unsupervised algorithm to detect regions of interest (ROIs) in images using fast feature detectors. It detects keypoints using Speeded-Up Robust Features (SURF) and Features from Accelerated Segment Test (FAST) to maximize interest points. It categorizes keypoints as foreground or background using k-nearest neighbors classification on texture descriptors. ROIs are identified as groups of foreground keypoints. Preliminary experiments showed this approach can efficiently detect ROIs without computationally expensive comparisons between images.
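The foreground/background vote can be sketched as plain k-nearest-neighbour majority voting; the 1-D descriptor distance and the sample format below are illustrative stand-ins for the paper's texture descriptors:

```python
def knn_label(query, samples, k=3):
    """Classify a keypoint descriptor as foreground or background by majority
    vote among its k nearest labelled descriptors.
    samples: list of (descriptor_value, label) pairs; query: a descriptor value."""
    nearest = sorted(samples, key=lambda s: abs(s[0] - query))[:k]
    labels = [lab for _, lab in nearest]
    return max(set(labels), key=labels.count)
```

ROIs would then be formed by grouping the keypoints voted "foreground".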
ANALYSIS OF LUNG NODULE DETECTION AND STAGE CLASSIFICATION USING FASTER RCNN ... (IRJET Journal)
This document presents a method for detecting and classifying lung nodules using the Faster R-CNN technique. It first segments the lung from CT images and extracts features using the Dual-Tree Complex Wavelet Transform. A Back Propagation Neural Network is then used to classify patterns of interstitial lung disease detected in the images. Fuzzy clustering is also proposed to segment abnormal regions of the lung. The method aims to help identify and diagnose common lung diseases such as pleural effusion and interstitial lung disease in an automated manner from CT images.
Possibility fuzzy c means clustering for expression invariant face recognition (IJCI JOURNAL)
Face, being the most natural method of identification for humans, is one of the most significant biometric modalities, and various methods to achieve efficient face recognition have been proposed. However, changes in the face owing to different expressions, pose, makeup, illumination, and age bring about marked variations in the facial image. These changes will inevitably occur; they can be controlled only up to a certain degree, beyond which they are bound to happen and will adversely impact the performance of any face recognition system. This paper proposes a strategy to improve the classification methodology in face recognition by using Possibility Fuzzy C-Means Clustering (PFCM). This clustering technique was chosen for face recognition due to properties such as outlier insensitivity, which make it a suitable candidate for designing robust applications. PFCM is a hybridization of the Possibilistic C-Means (PCM) and Fuzzy C-Means (FCM) clustering algorithms. It is a robust clustering technique, especially notable for its noise insensitivity, and it also resolves the coincident-clusters problem faced by other clustering techniques. The technique can therefore be used to increase the overall robustness of a face recognition system, increasing its invariance and making it a reliably usable biometric modality.
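As a rough sketch of the FCM half that PFCM hybridizes (the possibilistic typicality term contributed by PCM is omitted here), the fuzzy membership computation for 1-D points looks like:

```python
def fcm_memberships(points, centers, m=2.0):
    """Fuzzy C-Means membership matrix for 1-D points.
    u[i][k] = 1 / sum_j (d_ik / d_jk)^(2/(m-1)),
    where d_ik is the distance from point i to center k and m > 1 is the
    fuzzifier. Each row sums to 1 (the probabilistic constraint that PFCM
    relaxes with its possibilistic typicality values)."""
    p = 2.0 / (m - 1.0)
    u = []
    for x in points:
        d = [max(abs(x - c), 1e-12) for c in centers]  # avoid division by zero
        row = [1.0 / sum((dk / dj) ** p for dj in d) for dk in d]
        u.append(row)
    return u
```

A point on top of a center gets membership near 1 for that cluster; a point midway between two centers gets 0.5 for each.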
Iaetsd multi-view and multi band face recognition (Iaetsd)
The document discusses multi-view and multi-band face recognition using wavelet transforms. It begins with an abstract describing the challenges of face recognition due to variations in lighting, expression, and aging, then introduces a multi-band face recognition algorithm that uses wavelet transforms to extract features from multiple video bands. It covers preprocessing of images, feature extraction using PCA and wavelet transforms, and feature matching, and concludes from the experimental results that wavelet transforms enable feature extraction and face matching with high accuracy and less response time.
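Wavelet feature extraction of the kind described rests on repeated averaging and differencing. A minimal one-level 1-D Haar step, shown as an illustration rather than the paper's exact filter bank:

```python
def haar_step(signal):
    """One level of the 1-D Haar wavelet transform.
    Returns (approximation, detail): pairwise averages carry the coarse
    content, pairwise half-differences carry the fine detail.
    signal length must be even."""
    avg = [(signal[i] + signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    return avg, det
```

Applying the step recursively to the approximation yields the multi-band decomposition from which features are taken.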
IRJET- Interactive Image Segmentation with Seed Propagation (IRJET Journal)
The document proposes a method called Adaptive Constraint Propagation Cut (ACPCut) for interactive image segmentation that uses seed propagation to learn a global discriminative structure from limited user inputs. ACPCut adaptively propagates characteristics of user markers across the image to avoid bias while preserving data coherence. Experimental results show that ACPCut achieves state-of-the-art performance in interactive image segmentation in terms of effectiveness and efficiency compared to other methods.
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
IRJET- Object Detection in Underwater Images using Faster Region based Convol... (IRJET Journal)
The document presents a method for object detection in underwater images using Faster Region-Based Convolutional Neural Networks (R-CNN). The proposed method uses color compensation and enhancement techniques to preprocess underwater images. Faster R-CNN is then applied to classify image pixels into different object classes in real-time. Various geometric reasoning methods are used to improve detection accuracy. The method achieves 90% accuracy on test underwater images containing objects like fish and pipelines. It is able to detect objects under different illumination conditions, water depths and camera angles.
Detection of Attentiveness from Periocular Information (IRJET Journal)
This document presents an approach to detect attentiveness from the periocular region surrounding the eyes. It first detects faces in an image using a classifier trained on facial features. It then isolates the periocular region by reducing the height of the bounding box around the detected face. Features are extracted from the periocular region using HOG descriptors and fed into an SVM classifier trained to identify attentiveness. The approach aims to predict attentiveness with minimal computational cost by focusing analysis on the periocular region rather than full face recognition or extensive image processing.
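The bounding-box reduction step can be sketched as below; the keep/offset fractions are hypothetical values for illustration, not taken from the paper:

```python
def periocular_box(face_box, keep=0.35, offset=0.2):
    """Shrink a detected face box to the periocular band around the eyes.
    face_box: (x, y, w, h) with (x, y) the top-left corner.
    offset: fraction of face height skipped from the top (forehead).
    keep: fraction of face height retained (eye region).
    Both fractions are illustrative assumptions."""
    x, y, w, h = face_box
    return (x, y + round(h * offset), w, round(h * keep))
```

HOG features would then be computed on the cropped band and passed to the SVM classifier.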
IRJET- Human Fall Detection using Co-Saliency-Enhanced Deep Recurrent Convolu... (IRJET Journal)
This document summarizes a research paper that proposes a new method for detecting human falls in videos using deep learning. The method uses a recurrent convolutional neural network (RCN) that applies convolutional neural networks (CNNs) to video segments and connects them with long short-term memory (LSTM) to model temporal relationships. It also enhances video frames using co-saliency detection to highlight important human activity regions before feeding them to the RCN. The researchers tested the method on a dataset of 768 video clips from 4 activity classes and achieved 98.12% accuracy at detecting falls, demonstrating the effectiveness of the co-saliency-enhanced RCN approach.
IRJET - Dehazing of Single Nighttime Haze Image using Superpixel Method (IRJET Journal)
This document presents a new super-pixel based algorithm for removing haze from single nighttime images. It first decomposes the input hazy nighttime image into a glow image and a glow-free hazy image using their relative smoothness. It then uses super-pixel segmentation to compute the atmospheric light and dark channel values for each pixel in the glow-free image. The transmission map is estimated from the dark channel using a weighted guided image filter. Compared to patch-based methods, using super-pixels can reduce morphological artifacts and allows a smaller filter radius to better preserve details. The proposed method is tested on nighttime hazy images and is able to effectively remove haze and restore clear nighttime scenes.
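The dark-channel and transmission estimates mentioned above can be sketched in their classical patch-based form; the paper's super-pixel refinement and weighted guided filtering are not shown, and images are represented as nested lists for illustration:

```python
def dark_channel(img, patch=1):
    """Per-pixel dark channel: minimum over RGB within a local window.
    img: H x W x 3 nested lists with values in [0, 1]."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = []
            for yy in range(max(0, y - patch), min(h, y + patch + 1)):
                for xx in range(max(0, x - patch), min(w, x + patch + 1)):
                    vals.append(min(img[yy][xx]))
            out[y][x] = min(vals)
    return out

def transmission(dark, airlight, omega=0.95):
    """Haze transmission estimate t = 1 - omega * dark / A,
    where A is the (scalar) atmospheric light."""
    return [[1.0 - omega * d / airlight for d in row] for row in dark]
```

A bright hazy region (dark channel near 1) yields low transmission; a haze-free region (dark channel near 0) yields transmission near 1.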
Similar to MIRU2020長尾賞受賞論文解説:Attention Branch Networkの展開
This presentation provides valuable insights into effective cost-saving techniques on AWS. Learn how to optimize your AWS resources by rightsizing, increasing elasticity, picking the right storage class, and choosing the best pricing model. Additionally, discover essential governance mechanisms to ensure continuous cost efficiency. Whether you are new to AWS or an experienced user, this presentation provides clear and practical tips to help you reduce your cloud costs and get the most out of your budget.
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Unlock the Future of Search with MongoDB Atlas: Vector Search Unleashed (Malak Abu Hammad)
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
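The similarity ranking underneath vector search can be sketched as a brute-force cosine scan. The document structure and names below are illustrative; Atlas Vector Search performs the same ranking at scale with approximate nearest-neighbour indexes rather than a linear scan:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors; 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def nearest(query_embedding, docs):
    """Brute-force vector search: return the document whose embedding is most
    similar to the query embedding."""
    return max(docs, key=lambda d: cosine_similarity(query_embedding, d["embedding"]))
```

In a real deployment the embeddings come from a model (e.g. an LLM embedding endpoint) and the scan is replaced by an index, but the relevance criterion is the same.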
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slack (shyamraj55)
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Skybuffer SAM4U tool for SAP license adoption (Tatiana Kojar)
Manage and optimize your license adoption and consumption with SAM4U, a free SAP software asset management tool for customers.
SAM4U, an SAP complimentary software asset management tool for customers, delivers a detailed and well-structured overview of license inventory and usage with a user-friendly interface. We offer a hosted, cost-effective, and performance-optimized SAM4U setup in the Skybuffer Cloud environment. You retain ownership of the system and data, while we manage the ABAP 7.58 infrastructure, ensuring fixed Total Cost of Ownership (TCO) and exceptional services through the SAP Fiori interface.
Your One-Stop Shop for Python Success: Top 10 US Python Development Providers (akankshawande)
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
Dive into the realm of operating systems (OS) with Pravash Chandra Das, a seasoned Digital Forensic Analyst, as your guide. 🚀 This comprehensive presentation illuminates the core concepts, types, and evolution of OS, essential for understanding modern computing landscapes.
Beginning with the foundational definition, Das clarifies the pivotal role of OS as system software orchestrating hardware resources, software applications, and user interactions. Through succinct descriptions, he delineates the diverse types of OS, from single-user, single-task environments like early MS-DOS iterations, to multi-user, multi-tasking systems exemplified by modern Linux distributions.
Crucial components like the kernel and shell are dissected, highlighting their indispensable functions in resource management and user interface interaction. Das elucidates how the kernel acts as the central nervous system, orchestrating process scheduling, memory allocation, and device management. Meanwhile, the shell serves as the gateway for user commands, bridging the gap between human input and machine execution. 💻
The narrative then shifts to a captivating exploration of prominent desktop OSs, Windows, macOS, and Linux. Windows, with its globally ubiquitous presence and user-friendly interface, emerges as a cornerstone in personal computing history. macOS, lauded for its sleek design and seamless integration with Apple's ecosystem, stands as a beacon of stability and creativity. Linux, an open-source marvel, offers unparalleled flexibility and security, revolutionizing the computing landscape. 🖥️
Moving to the realm of mobile devices, Das unravels the dominance of Android and iOS. Android's open-source ethos fosters a vibrant ecosystem of customization and innovation, while iOS boasts a seamless user experience and robust security infrastructure. Meanwhile, discontinued platforms like Symbian and Palm OS evoke nostalgia for their pioneering roles in the smartphone revolution.
The journey concludes with a reflection on the ever-evolving landscape of OS, underscored by the emergence of real-time operating systems (RTOS) and the persistent quest for innovation and efficiency. As technology continues to shape our world, understanding the foundations and evolution of operating systems remains paramount. Join Pravash Chandra Das on this illuminating journey through the heart of computing. 🌟
Skybuffer AI: Advanced Conversational and Generative AI Solution on SAP Busin... (Tatiana Kojar)
Skybuffer AI, built on the robust SAP Business Technology Platform (SAP BTP), is the latest and most advanced version of our AI development, reaffirming our commitment to delivering top-tier AI solutions. Skybuffer AI harnesses all the innovative capabilities of the SAP BTP in the AI domain, from Conversational AI to cutting-edge Generative AI and Retrieval-Augmented Generation (RAG). It also helps SAP customers safeguard their investments into SAP Conversational AI and ensure a seamless, one-click transition to SAP Business AI.
With Skybuffer AI, various AI models can be integrated into a single communication channel such as Microsoft Teams. This integration empowers business users with insights drawn from SAP backend systems, enterprise documents, and the expansive knowledge of Generative AI. And the best part of it is that it is all managed through our intuitive no-code Action Server interface, requiring no extensive coding knowledge and making the advanced AI accessible to more users.
Taking AI to the Next Level in Manufacturing (ssuserfac0301)
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
5. Ideas and approaches to help build your organization's AI strategy.
Letter and Document Automation for Bonterra Impact Management (fka Social Sol...) (Jeffrey Haguewood)
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on automated letter generation for Bonterra Impact Management using Google Workspace or Microsoft 365.
Interested in deploying letter generation automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Azure API Management to expose backend services securely
MIRU2020長尾賞受賞論文解説:Attention Branch Networkの展開
1. Machine Perception and Robotics Group, Chubu University (contact cards)

Ayumi Miyako
Machine Perception and Robotics Group
Department of Robotics Science and Technology, College of Engineering, Chubu University
1200 Matsumoto-cho, Kasugai, Aichi 487-8501 Japan
Tel +81-568-51-9096 / Fax +81-568-51-9409
miya@vision.cs.chubu.ac.jp
http://vision.cs.chubu.ac.jp

Yuji Yamauchi (Research Assistant, Dr.Eng.)
Machine Perception and Robotics Group
Department of Robotics Science and Technology, College of Engineering, Chubu University
1200 Matsumoto-cho, Kasugai, Aichi 487-8501 Japan
Tel +81-568-51-8249 / Fax +81-568-51-9409
yuu@vision.cs.chubu.ac.jp
http://vision.cs.chubu.ac.jp

Takayoshi Yamashita (Lecturer, Dr.Eng.)
Machine Perception and Robotics Group
Department of Computer Science, College of Engineering, Chubu University
1200 Matsumoto-cho, Kasugai, Aichi 487-8501 Japan
Tel +81-568-51-9670 / Fax +81-568-51-1540
yamashita@cs.chubu.ac.jp
http://vision.cs.chubu.ac.jp

Hironobu Fujiyoshi (Professor, Dr.Eng.)
Machine Perception and Robotics Group
Department of Robotics Science and Technology, College of Engineering, Chubu University
1200 Matsumoto-cho, Kasugai, Aichi 487-8501 Japan
Tel +81-568-51-9096 / Fax +81-568-51-9409
hf@cs.chubu.ac.jp
http://vision.cs.chubu.ac.jp
CONFIDENTIAL EXTENDED ABSTRACT.
DO NOT DISTRIBUTE ANYWHERE.
[STA-GCN extended abstract (Japanese); most of the body text was lost in extraction. Recoverable content: the paper proposes Spatial Temporal Attention Graph Convolutional Networks (STA-GCN), which extend Spatial Temporal GCN (ST-GCN) [1] with an Attention node and an Attention edge that together form a spatial-temporal attention graph, and add a Mechanics-stream; the method is evaluated on the NTU-RGB+D and NTU-RGB+D120 datasets. Author contacts: siraki@mprg.cs.chubu.ac.jp, hirakawa@mprg.cs.chubu.ac.jp, takayoshi@isc.chubu.ac.jp, fujiyoshi@isc.chubu.ac.jp.]
Attention Branch Network:
Learning of Attention Mechanism for Visual Explanation
Hiroshi Fukui, Tsubasa Hirakawa, Takayoshi Yamashita, Hironobu Fujiyoshi
Chubu University
1200 Matsumotocho, Kasugai, Aichi, Japan
{fhiro@mprg.cs, hirakawa@mprg.cs, yamashita@isc, fujiyoshi@isc}.chubu.ac.jp
Abstract

Visual explanation enables humans to understand the decision making of deep convolutional neural network (CNN), but it is insufficient to contribute to improving CNN performance. In this paper, we focus on the attention map for visual explanation, which represents a high response value as the attention location in image recognition. This attention region significantly improves the performance of CNN by introducing an attention mechanism that focuses on a specific region in an image. In this work, we propose Attention Branch Network (ABN), which extends a response-based visual explanation model by introducing a branch structure with an attention mechanism. ABN can be applicable to several image recognition tasks by introducing a branch for the attention mechanism and is trainable for visual explanation and image recognition in an end-to-end manner. We evaluate ABN on several image recognition tasks such as image classification, fine-grained recognition, and multiple facial attribute recognition. Experimental results indicate that ABN outperforms the baseline models on these image recognition tasks while generating an attention map for visual explanation. Our code is available¹.
1. Introduction

Deep convolutional neural network (CNN) [1, 17] models have achieved great performance on various image recognition tasks [25, 9, 7, 34, 8, 12, 18]. However, despite CNN models performing well on such tasks, it is difficult to interpret the decision making of CNN in the inference process. To understand the decision making of CNN, methods of interpreting CNN have been proposed [39, 41, 26, 4, 24, 3, 22].
"Visual explanation" has been used to interpret the decision making of a CNN by highlighting the attention location in a top-down manner during the inference process. Visual explanation can be categorized as gradient-based or response-based. Gradient-based visual explanation typically uses gradients together with auxiliary data, such as noise [4] or a class index [24, 3]. Although these methods can interpret the decision making of a CNN without re-training or modifying the architecture, they require the backpropagation process to obtain gradients. In contrast, response-based visual explanation can interpret the decision making of a CNN during the inference process alone. Class activation mapping (CAM) [41], a representative response-based visual explanation, obtains an attention map for each category using the response of the convolution layer. CAM applies a convolution layer and global average pooling (GAP) [20] and obtains an attention map whose high-response positions represent the class, as shown in Fig. 1(a). However, CAM requires replacing the fully-connected layer with a convolution layer and GAP, thus decreasing the performance of the CNN.
¹ https://github.com/machine-perception-robotics-group/attention_branch_network
[Figure 1. Network structures of class activation mapping and the proposed attention branch network. The original figure (attention maps for "Great grey owl" and "Ruffed grouse"; feature extractor, attention branch, attention mechanism, and perception branch; losses L_att(x_i), L_per(x_i), and L(x_i); class weights w_c with GAP and fc) could not be recovered from the extraction.]
To avoid this problem, gradient-based methods are often used instead.
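As a concrete illustration of the response-based idea, CAM's attention map is a class-weighted sum over the final convolutional feature maps. A minimal NumPy sketch, with shapes, names, and the toy data being illustrative assumptions rather than the paper's code:

```python
import numpy as np

def class_activation_map(feature_maps, fc_weights, class_idx):
    """CAM for one class: weight the C feature maps (C, H, W) by the
    fully-connected weights w_c of that class, sum over channels, and
    normalize the result to [0, 1]."""
    weights = fc_weights[class_idx]                    # shape (C,)
    cam = np.tensordot(weights, feature_maps, axes=1)  # shape (H, W)
    cam -= cam.min()
    if cam.max() > 0:
        cam /= cam.max()
    return cam

# toy example: 4 feature maps of size 7x7, 10 classes
rng = np.random.default_rng(0)
fmaps = rng.random((4, 7, 7))
fc_w = rng.random((10, 4))
cam = class_activation_map(fmaps, fc_w, class_idx=3)
print(cam.shape)  # (7, 7)
```

Upsampling `cam` to the input resolution and overlaying it on the image gives the familiar heatmap visualization.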
MACHINE PERCEPTION AND ROBOTICS GROUP
Chubu University, College of Engineering
Department of Robotics Science and Technology
Professor Hironobu Fujiyoshi, Dr.Eng.
1200 Matsumoto-cho, Kasugai, Aichi 487-8501, Japan
Tel: +81-568-51-9096 / Fax: +81-568-51-9409
hf@cs.chubu.ac.jp
http://vision.cs.chubu.ac.jp
$L_{all}(x) = E_{att}(x) + E_{per}(x)$
How Small Network Can Detect Pedestrian?
Anonymous CVPR submission
Paper ID ****
Abstract
1. Introduction
$t \log y + (1 - t) \log (1 - y)$  (1)
$v_i^c = \frac{1}{M \times N} \sum_{m=1}^{M} \sum_{n=1}^{N} f_{m,n}^c(x_i)$  (2)
$v_i^1, v_i^2, v_i^3, \ldots, v_i^c$  (3)
$g'(x_i) = g(x_i) \cdot \frac{1}{C} \sum_{c=1}^{C} f^c(x_i)$  (4)
$f(x_i, y_i)$  (5)
2. Conclusion
References
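Equation (2) above is per-channel global average pooling: each class response $v_i^c$ averages the $c$-th feature map over all $M \times N$ spatial positions. A minimal sketch under assumed shapes:

```python
import numpy as np

def global_average_pool(f):
    """f: feature maps of shape (C, M, N) -> vector of shape (C,),
    v^c = (1 / (M*N)) * sum_{m,n} f^c_{m,n}   (Eq. 2)."""
    C, M, N = f.shape
    return f.reshape(C, M * N).mean(axis=1)

# toy example: 2 channels, 3x4 spatial grid
f = np.arange(2 * 3 * 4, dtype=float).reshape(2, 3, 4)
v = global_average_pool(f)
print(v)  # [ 5.5 17.5]
```

The resulting vector $(v_i^1, \ldots, v_i^c)$ in Eq. (3) then feeds the classifier, which is why GAP can replace a fully-connected layer.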
CVPR 2017 Submission #****. CONFIDENTIAL
[ABN architecture figure: a feature extractor (residual blocks) feeds an attention branch, which produces an attention map via convolution and GAP, and a perception branch, which outputs the label; the branch losses L_att(x) and L_per(x) and the map loss L_map are attached as below.]
$L_{map} = \alpha \| x - y \|_2^2$
$L_{all}(x) = L_{att}(x) + L_{per}(x) + L_{map}$
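The total objective above sums the attention-branch loss, the perception-branch loss, and the map regression term $L_{map} = \alpha \|x - y\|_2^2$. A hedged NumPy sketch of how the three terms might be combined; the cross-entropy form, $\alpha$, and all shapes are illustrative assumptions, not the slide's implementation:

```python
import numpy as np

def cross_entropy(probs, label):
    """Negative log-likelihood of the true class (probs assumed softmaxed)."""
    return -np.log(probs[label] + 1e-12)

def abn_total_loss(p_att, p_per, att_map, target_map, label, alpha=0.1):
    """L_all = L_att + L_per + L_map, with
    L_map = alpha * ||att_map - target_map||_2^2."""
    l_att = cross_entropy(p_att, label)
    l_per = cross_entropy(p_per, label)
    l_map = alpha * np.sum((att_map - target_map) ** 2)
    return l_att + l_per + l_map

# toy data: 3-class predictions from both branches, 4x4 attention map
p = np.array([0.1, 0.7, 0.2])
att = np.ones((4, 4))
target = np.zeros((4, 4))
loss = abn_total_loss(p, p, att, target, label=1, alpha=0.1)
print(loss > 0)
```

Because all three terms are summed, gradients from both recognition branches and the map constraint flow back through the shared feature extractor in one end-to-end step.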
[Figure residue: a CNN diagram with input, convolutions, full connection, and output.]
$V = \{v_0, v_1, \ldots, v_N\}$
$A_{ij} = \begin{cases} 1 & \text{if } v_i \text{ and } v_j \text{ are connected} \\ 0 & \text{otherwise} \end{cases}$
$A \in \mathbb{R}^{N \times N}$
MACHINE PERCEPTION AND ROBOTICS GROUP
Chubu University
Department of Robotics Science and Technology
College of Engineering
Professor Hironobu Fujiyoshi, Dr.Eng.
Machine Perception and Robotics Group
1200 Matsumoto-cho, Kasugai, Aichi 487-8501 Japan
Tel +81-568-51-9096
Fax +81-568-51-9409
hf@cs.chubu.ac.jp
http://vision.cs.chubu.ac.jp
$\tilde{A} = A + I$

$A_{ij} = \begin{cases} 1 & (v_i, v_j) \in E \\ 0 & (v_i, v_j) \notin E \end{cases}$
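The construction above (an adjacency matrix with self-loops added, Ã = A + I, where A_ij is 1 when an edge (v_i, v_j) exists and 0 otherwise) can be sketched in a few lines. This is a minimal illustration, not code from the slides; the edge list and graph size are made up, and an undirected graph is assumed.

```python
import numpy as np

# Illustrative toy graph: a 3-node cycle (edge list is an assumption).
edges = [(0, 1), (1, 2), (2, 0)]  # (v_i, v_j) pairs
n = 3

# Build A with A_ij = 1 when edge (v_i, v_j) exists, 0 otherwise.
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1  # undirected graph: keep A symmetric

# Add self-loops: \tilde{A} = A + I
A_tilde = A + np.eye(n)
```

Adding the identity gives every node an edge to itself, so a node's own features are included when aggregating over its neighbors.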