This document provides an overview of sensors for robotics. It begins by classifying sensors as either internal state sensors, which measure positions and velocities of robot joints, or external state sensors, which monitor the robot's environment. Popular internal state sensors discussed include encoders, gyroscopes, and accelerometers. External state sensors can be vision, touch, acoustic, or movement-based. The benefits of sensor fusion are then covered. Various sensor characteristics and types of errors are also defined. The document concludes by describing specific sensors like wheel encoders, magnetic compasses, and gyroscopes in more detail.
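The sensor-fusion benefit mentioned above can be illustrated with a classic internal-state example: blending a gyroscope (smooth but drifting) with an accelerometer (noisy but drift-free) into one tilt estimate. The sketch below is a minimal complementary filter; the bias, tilt, and blend values are illustrative assumptions, not taken from the document.

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth but drifting) with the
    accelerometer tilt (noisy but drift-free) into one estimate."""
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle

# Illustrative run: robot held at a 10-degree tilt; gyro has a 0.5 deg/s bias.
# The accelerometer term keeps the estimate from drifting off with the bias.
angle = 0.0
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate=0.5, accel_angle=10.0, dt=0.01)
```

The blend factor alpha trades gyro smoothness against accelerometer correction; 0.98 is a common starting point, not a tuned value.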
1) Sensors are devices that detect physical quantities and convert them into signals that can be measured. They are needed for industrial monitoring, safety, and automation.
2) Common sensors include position, proximity, range, touch, and force sensors. Position sensors like LVDT and RVDT convert linear or angular displacement into electrical signals.
3) Sensors have characteristics like range, sensitivity, accuracy, and response time that determine their effectiveness. Understanding sensor types and properties is important for robotics applications.
The document describes the design and construction of an autonomous mobile mini-sumo robot. Key components of the robot include an Arduino Uno board for control, line and distance sensors for navigation, and two DC motors powered by a motor driver shield. The robot was designed using Fritzing software and programmed using the Arduino language to follow the regulations of sumo robot competitions.
This document provides information about robotics architecture, sensors, components, and reactive systems. It discusses the key components of a robot including actuators, power supply, and various sensors. Vision, touch, hearing, proximity, range, and other sensors used in robotics are described. Reactive robotic systems are defined as having behaviors as basic building blocks, avoiding abstract representations, and being inspired by animal models. The document also covers robot architecture and the sense-plan-act model, as well as hierarchical, reactive, and hybrid control systems.
1. Sensors allow machines and robots to monitor their surroundings in various ways and for many applications.
2. The document discusses different types of sensors like position, proximity, range sensors and their uses in industry, environment, and safety.
3. It also explains the characteristics and working principles of important position sensors like LVDT and RVDT, which can convert linear or angular displacement into electrical signals.
IRJET - Can Robots are Changing the World: A Review (IRJET Journal)
This document summarizes a research paper on robots and how they are changing the world. It discusses how robots are rapidly developing from industrial uses to serving as companions. It proposes a hierarchical probabilistic representation of space that would allow robots to understand their environments in a way that is compatible with human users. This conceptual representation of space would be useful for robots to become familiar with their surroundings and interact in a semantically and socially intelligent manner. The paper reviews robot capabilities, sensors, effectors, software architectures, applications, and debates around machine intelligence and thinking.
Unit III - Solved Question Bank - Robotics Engineering (Sanjay Singh)
This Question Bank for Robotics Engineering is for academic purposes only and not for any commercial use. Students of Anna University and other universities can use it for reference and knowledge.
Transducers and sensors
Sensors in robotics
Tactile sensors
Proximity and range sensors
Miscellaneous sensors and sensor based system
Use of sensors in Robotics
VARIOUS SENSOR USED IN ROBOTICS WITH APPLICATIONS | J4RV3I12003 (Journal For Research)
This paper gives a brief introduction to various sensors used in robotics and their applications. A sensor is a device that detects changes in electrical, physical, or other quantities and thereby produces an output; its purpose is to detect events or changes in its environment and send that information to other electronic devices. Robotic sensors are used to estimate a robot's condition and environment. Sensors in robots are modeled on the functions of human sensory organs; they provide intelligence to the robot and improve its performance.
A variety of sensors can enhance a FIRST robotics competition robot's operation, including ultrasonic sensors, gyroscopes, accelerometers, encoders, photoelectric sensors, and joysticks. These sensors provide input to drive the robot and make autonomous decisions. Common sensors include ultrasonic detectors for measuring distance, gyroscopes and accelerometers for sensing motion, encoders for monitoring wheel rotations and gear health, and photoelectric sensors for detecting objects. Joysticks also act as sensors, providing directional control and buttons that can be programmed to control additional robot functions. Diagrams and sample code are provided to illustrate how these sensors integrate with the robot control system.
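The distance-measuring role of the ultrasonic sensor described above comes down to one conversion: the echo pulse width covers the round trip of a sound pulse, so the one-way distance is half the time of flight times the speed of sound. A sketch of that conversion (the speed of sound at roughly room temperature is an assumption):

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # approx. at 20 C; varies with temperature

def echo_to_distance_cm(echo_us):
    """HC-SR04-style reading: the echo pulse spans the round trip,
    so halve it before converting microseconds to centimetres."""
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2

# An echo of about 5830 us corresponds to roughly one metre.
d = echo_to_distance_cm(5830)
```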
This document discusses the use of sensors in robotics. It begins by introducing how sensors give robots human-like sensing abilities like vision, touch, hearing, and movement. It then describes several key sensors used in robotics - vision sensors that allow robots to see their environment, touch sensors that allow robots to feel contact and interpret emotions, and hearing sensors that allow robots to perceive speech. The document also lists and describes other common sensors like proximity, range, tactile, light, sound, temperature, contact, voltage, and current sensors and their applications in robotics.
A one decade survey of autonomous mobile robot systems (IJECE, IAES)
Autonomous mobile robots have recently gained popularity due to their relevant technology and application in real-world situations, and the global market for mobile robots is expected to grow significantly over the next 20 years. They are found in many fields, including institutions, industry, business, hospitals, agriculture, and private households, for the purpose of improving day-to-day activities and services. Technological development has increased the requirements placed on mobile robots because of the services and tasks they provide, such as rescue and research operations, surveillance, and carrying heavy objects. Researchers have conducted many works on the importance of robots, their uses, and their problems. This article aims to analyze the control system of mobile robots and the way robots move in the real world to achieve their goals. Several technological directions in the mobile robot industry must be observed and integrated for a robot to function properly: navigation systems, localization systems, detection systems (sensors), and motion, kinematics, and dynamics systems. All of these should be united through a control unit so that the mission of the mobile robot is conducted reliably.
PC-based mobile robot navigation system (Ankit Surati)
The document describes the design and components of a PC-based mobile robot for navigation. It uses a vision system with image processing and feature matching to navigate. A fuzzy logic controller computes the speed and angular speed needed by the two motors. An obstacle avoidance algorithm detects obstacles using pixel appearance differences. The robot is powered by a 12V battery and controlled through a software interface on a PC. Testing showed the fuzzy controller could achieve desired turns to navigate autonomously.
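The fuzzy logic controller described above maps an input such as heading error onto motor commands through overlapping membership functions. The sketch below is a generic zero-order Sugeno-style controller with made-up membership ranges and output rates; it is not the paper's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a to b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_turn_rate(heading_error_deg):
    """Three rules: error negative -> turn one way, near zero -> go
    straight, positive -> turn the other way. Defuzzify by weighted
    average of each rule's crisp output (deg/s)."""
    mu_neg = tri(heading_error_deg, -90, -45, 0)
    mu_zero = tri(heading_error_deg, -10, 0, 10)
    mu_pos = tri(heading_error_deg, 0, 45, 90)
    outputs = {-30.0: mu_neg, 0.0: mu_zero, 30.0: mu_pos}
    num = sum(out * mu for out, mu in outputs.items())
    den = sum(outputs.values()) or 1.0
    return num / den
```

Between rule peaks the output blends smoothly, which is the practical advantage of a fuzzy controller over hard thresholds.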
Robotic assistance for the visually impaired and elderly people (theijes)
The International Journal of Engineering & Science is aimed at providing a platform for researchers, engineers, scientists, or educators to publish their original research results, to exchange new ideas, to disseminate information in innovative designs, engineering experiences and technological skills. It is also the Journal's objective to promote engineering and technology education. All papers submitted to the Journal will be blind peer-reviewed. Only original articles will be published.
The papers for publication in The International Journal of Engineering & Science are selected through rigorous peer reviews to ensure originality, timeliness, relevance, and readability.
The document provides an introduction to robotics and line path followers. It discusses what robots are and their common applications. Robots are mechanical devices that perform automated tasks under direct or indirect human control. They are used for dangerous, difficult, repetitive or dull tasks. The document also describes different types of mobile robots, including rolling and walking robots. It then discusses the basic components, working and circuit diagram of a line path follower robot. Key sensors used in robots like IR sensors and their operating methods of through-beam, reflex and proximity detection are explained.
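The reflex-detection IR sensors described above typically drive a simple steering rule in a line follower. A sketch, assuming one common two-sensor layout in which both sensors sit just outside the line while the robot is centered (conventions vary by build):

```python
def line_follow_step(left_sees_line, right_sees_line):
    """Two reflex IR sensors straddle the line. Neither sees it while
    centered; a sensor seeing the line means the robot has drifted
    toward the opposite side and should steer back."""
    if not left_sees_line and not right_sees_line:
        return "forward"
    if left_sees_line and not right_sees_line:
        return "turn_left"
    if right_sees_line and not left_sees_line:
        return "turn_right"
    return "stop"  # both on the line: junction or end marker
```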
This document provides an outline and overview of a project to design an autonomous robot that can avoid obstacles using Arduino and ultrasonic sensors. The summary includes:
1) The project aims to design and build a fully autonomous robot that can avoid any obstacles in its path as it moves.
2) The robot uses Arduino, ultrasonic sensors, and a motor driver IC to detect obstacles and navigate around them autonomously without human intervention.
3) The document outlines the components, software, circuit diagram, and working process of the robot, and discusses its applications in dangerous environments or for tasks like automatic vacuum cleaning.
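The working process outlined in points 1–3 reduces to a sense-decide loop: read the ultrasonic distance, drive forward while the path is clear, and back off and turn otherwise. A sketch of one control step; the threshold and the maneuver sequence are assumptions, not the project's exact values:

```python
def avoid_step(distance_cm, threshold_cm=20.0):
    """Return the motor commands for one loop iteration: keep driving
    while the nearest obstacle is beyond the threshold, otherwise
    stop, reverse, and turn away before re-checking."""
    if distance_cm > threshold_cm:
        return ["forward"]
    return ["stop", "reverse", "turn_right"]
```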
Sensor and different types of sensor.pdf (Ibrahim Tareq)
A sensor is a device that detects and responds to some type of input from the physical environment. Common sensors include light sensors, sound sensors, contact sensors, tilt sensors, distance sensors, proximity sensors, navigation sensors, acceleration sensors, chemical sensors, and biosensors. Each sensor detects a specific type of input or environmental phenomenon and outputs a signal that can be read or processed.
Hand gesture recognition using ultrasonic sensor and atmega128 microcontroller (eSAT Publishing House)
IJRET : International Journal of Research in Engineering and Technology is an international peer reviewed, online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together Scientists, Academician, Field Engineers, Scholars and Students of related fields of Engineering and Technology.
This document summarizes the design and implementation of an autonomous biped robot using an Arduino microcontroller. The robot uses ultrasonic sensors for obstacle avoidance and a sound sensor to take instructions. It is able to walk in a balanced manner through a sequence of motions of its servo motors in the legs and central plate. The Arduino allows for programming of the robot's behavior through an open-source IDE. The goals are to demonstrate integration of AI, bipedal mechanisms, and sensors on a low-cost and open-source platform to further research in humanoid robotics.
IRJET - Survey Paper on Human Following Robot (IRJET Journal)
The document summarizes research on developing an autonomous human following robot. It discusses using triangulation of radio signals from a tag worn by a human to calculate the tag's location using multiple antennas on the robot. The robot would use triangulation and received signal strength to determine the tag's position and direction to follow the human. It reviews several localization algorithms and navigation techniques used in other projects. The proposed method is to use triangulation of signals from three antennas to accurately calculate the tag's position and allow the robot to autonomously follow or be remotely controlled via Bluetooth.
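The triangulation approach described above can be sketched in two steps: convert received signal strength to a distance estimate with a log-distance path-loss model, then intersect the three distance circles by linearizing the circle equations. All parameters below (TX power, path-loss exponent, antenna positions) are illustrative assumptions, not values from the paper.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, n=2.0):
    """Log-distance path-loss model: distance in metres from RSSI.
    tx_power_dbm is the RSSI at 1 m; n is the path-loss exponent."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

def trilaterate(p1, r1, p2, r2, p3, r3):
    """2-D trilateration: subtracting the first circle equation from
    the other two cancels the quadratic terms, leaving a 2x2 linear
    system in (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det
```

With noisy real-world RSSI the three circles rarely meet in a point, so practical systems refine this with least squares or filtering.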
1. Sensors are devices that detect physical parameters and convert them into signals that can be processed by systems. Common sensors measure temperature, pressure, velocity, rotation, flow, and other variables.
2. Sensors are needed in industry to monitor machinery and prevent failures, in the environment to detect hazards, and for safety and security applications like fire detection.
3. Common sensors used in robotics include position sensors, proximity sensors, range sensors, tactile sensors, and force sensors. Position sensors like LVDTs and RVDTs convert linear or angular displacement into electrical signals.
Dennis C. Erickson is a senior mentor for FRC teams 1510 and 2898. He provides guidance on using various sensors to enhance robot operations, including ultrasonic detectors, gyro sensors, accelerometers, and cameras. The document outlines different sensor types and provides code examples and diagrams to help understand how to implement the sensors.
This document discusses using hand gestures to control swarm robots. It describes detecting hands using skin color tracking and extracting features from hands using Gabor filters. Gestures are recognized by comparing features to a database. The swarm robots communicate using Bluetooth to reach consensus on recognized gestures and navigate accordingly. The system was implemented using foot-bot robots with a webcam for input and motors/sensors for movement and communication. The approach aims to allow effective control of swarm robots through robust gesture recognition that handles noise and varied environments.
Project report on IoT Based Garbage Monitoring System (Alok Pal)
This is an innovative project idea based on the Internet of Things that helps keep the city clean. The system monitors the garbage level in garbage bins and reports the level of garbage collected in each bin via a web page. The project uses a DHT11 sensor, ultrasonic sensors, and a MediaTek LinkIt One board; the MQTT protocol is used to publish the status of each garbage bin to the cloud, where it can be accessed by the responsible authorities.
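Bin-level monitoring of this kind usually works by pointing the ultrasonic sensor down from the lid: the shorter the measured distance to the waste surface, the fuller the bin. A sketch of that conversion, where the bin geometry is an assumption:

```python
def fill_percent(distance_to_waste_cm, bin_depth_cm, sensor_offset_cm=0.0):
    """Lid-mounted ultrasonic sensor measures down to the waste
    surface; the fill level is the remainder of the bin depth,
    clamped to the 0-100% range against noisy readings."""
    level = bin_depth_cm - (distance_to_waste_cm - sensor_offset_cm)
    return max(0.0, min(100.0, 100.0 * level / bin_depth_cm))
```

The resulting percentage is what a system like this would publish over MQTT for the monitoring page.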
This document describes a proposed vehicle density sensor system to manage traffic congestion. The system uses sensors like infrared sensors placed on roads to detect vehicle density in real-time. A microcontroller collects this sensor data and sends it to a computer. Based on the detected vehicle density, the computer then dynamically adjusts the timing of traffic lights at intersections to reduce congestion. By taking into account current traffic conditions, this proposed system aims to more efficiently control traffic flows compared to traditional fixed-time traffic light systems.
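The dynamic signal timing described above can be sketched as a proportional split: each approach gets a share of the cycle proportional to its sensed vehicle density, with a minimum green time so an empty approach is never starved. The cycle length and minimum below are illustrative assumptions:

```python
def green_times(densities, cycle_s=120, min_green_s=10):
    """Split one signal cycle among approaches in proportion to
    sensed vehicle density, after reserving a minimum green each."""
    budget = cycle_s - min_green_s * len(densities)
    total = sum(densities) or 1  # avoid dividing by zero on empty roads
    return [min_green_s + budget * d / total for d in densities]

# A busy approach (30 vehicles) vs. a quiet one (10) on a 120 s cycle.
times = green_times([30, 10])
```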
Tools & Techniques for Commissioning and Maintaining PV Systems W-Animations ...Transcat
Join us for this solutions-based webinar on the tools and techniques for commissioning and maintaining PV Systems. In this session, we'll review the process of building and maintaining a solar array, starting with installation and commissioning, then reviewing operations and maintenance of the system. This course will review insulation resistance testing, I-V curve testing, earth-bond continuity, ground resistance testing, performance tests, visual inspections, ground and arc fault testing procedures, and power quality analysis.
Fluke Solar Application Specialist Will White is presenting on this engaging topic:
Will has worked in the renewable energy industry since 2005, first as an installer for a small east coast solar integrator before adding sales, design, and project management to his skillset. In 2022, Will joined Fluke as a solar application specialist, where he supports their renewable energy testing equipment like IV-curve tracers, electrical meters, and thermal imaging cameras. Experienced in wind power, solar thermal, energy storage, and all scales of PV, Will has primarily focused on residential and small commercial systems. He is passionate about implementing high-quality, code-compliant installation techniques.
Software Engineering and Project Management - Introduction, Modeling Concepts...Prakhyath Rai
Introduction, Modeling Concepts and Class Modeling: What is Object orientation? What is OO development? OO Themes; Evidence for usefulness of OO development; OO modeling history. Modeling
as Design technique: Modeling, abstraction, The Three models. Class Modeling: Object and Class Concept, Link and associations concepts, Generalization and Inheritance, A sample class model, Navigation of class models, and UML diagrams
Building the Analysis Models: Requirement Analysis, Analysis Model Approaches, Data modeling Concepts, Object Oriented Analysis, Scenario-Based Modeling, Flow-Oriented Modeling, class Based Modeling, Creating a Behavioral Model.
Applications of artificial Intelligence in Mechanical Engineering.pdfAtif Razi
Historically, mechanical engineering has relied heavily on human expertise and empirical methods to solve complex problems. With the introduction of computer-aided design (CAD) and finite element analysis (FEA), the field took its first steps towards digitization. These tools allowed engineers to simulate and analyze mechanical systems with greater accuracy and efficiency. However, the sheer volume of data generated by modern engineering systems and the increasing complexity of these systems have necessitated more advanced analytical tools, paving the way for AI.
AI offers the capability to process vast amounts of data, identify patterns, and make predictions with a level of speed and accuracy unattainable by traditional methods. This has profound implications for mechanical engineering, enabling more efficient design processes, predictive maintenance strategies, and optimized manufacturing operations. AI-driven tools can learn from historical data, adapt to new information, and continuously improve their performance, making them invaluable in tackling the multifaceted challenges of modern mechanical engineering.
Discover the latest insights on Data Driven Maintenance with our comprehensive webinar presentation. Learn about traditional maintenance challenges, the right approach to utilizing data, and the benefits of adopting a Data Driven Maintenance strategy. Explore real-world examples, industry best practices, and innovative solutions like FMECA and the D3M model. This presentation, led by expert Jules Oudmans, is essential for asset owners looking to optimize their maintenance processes and leverage digital technologies for improved efficiency and performance. Download now to stay ahead in the evolving maintenance landscape.
Build the Next Generation of Apps with the Einstein 1 Platform.
Rejoignez Philippe Ozil pour une session de workshops qui vous guidera à travers les détails de la plateforme Einstein 1, l'importance des données pour la création d'applications d'intelligence artificielle et les différents outils et technologies que Salesforce propose pour vous apporter tous les bénéfices de l'IA.
Null Bangalore | Pentesters Approach to AWS IAMDivyanshu
#Abstract:
- Learn more about the real-world methods for auditing AWS IAM (Identity and Access Management) as a pentester. So let us proceed with a brief discussion of IAM as well as some typical misconfigurations and their potential exploits in order to reinforce the understanding of IAM security best practices.
- Gain actionable insights into AWS IAM policies and roles, using hands on approach.
#Prerequisites:
- Basic understanding of AWS services and architecture
- Familiarity with cloud security concepts
- Experience using the AWS Management Console or AWS CLI.
- For hands on lab create account on [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
# Scenario Covered:
- Basics of IAM in AWS
- Implementing IAM Policies with Least Privilege to Manage S3 Bucket
- Objective: Create an S3 bucket with least privilege IAM policy and validate access.
- Steps:
- Create S3 bucket.
- Attach least privilege policy to IAM user.
- Validate access.
- Exploiting IAM PassRole Misconfiguration
-Allows a user to pass a specific IAM role to an AWS service (ec2), typically used for service access delegation. Then exploit PassRole Misconfiguration granting unauthorized access to sensitive resources.
- Objective: Demonstrate how a PassRole misconfiguration can grant unauthorized access.
- Steps:
- Allow user to pass IAM role to EC2.
- Exploit misconfiguration for unauthorized access.
- Access sensitive resources.
- Exploiting IAM AssumeRole Misconfiguration with Overly Permissive Role
- An overly permissive IAM role configuration can lead to privilege escalation by creating a role with administrative privileges and allow a user to assume this role.
- Objective: Show how overly permissive IAM roles can lead to privilege escalation.
- Steps:
- Create role with administrative privileges.
- Allow user to assume the role.
- Perform administrative actions.
- Differentiation between PassRole vs AssumeRole
Try at [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
Height and depth gauge linear metrology.pdfq30122000
Height gauges may also be used to measure the height of an object by using the underside of the scriber as the datum. The datum may be permanently fixed or the height gauge may have provision to adjust the scale, this is done by sliding the scale vertically along the body of the height gauge by turning a fine feed screw at the top of the gauge; then with the scriber set to the same level as the base, the scale can be matched to it. This adjustment allows different scribers or probes to be used, as well as adjusting for any errors in a damaged or resharpened probe.
Accident detection system project report.pdfKamal Acharya
The Rapid growth of technology and infrastructure has made our lives easier. The
advent of technology has also increased the traffic hazards and the road accidents take place
frequently which causes huge loss of life and property because of the poor emergency facilities.
Many lives could have been saved if emergency service could get accident information and
reach in time. Our project will provide an optimum solution to this draw back. A piezo electric
sensor can be used as a crash or rollover detector of the vehicle during and after a crash. With
signals from a piezo electric sensor, a severe accident can be recognized. According to this
project when a vehicle meets with an accident immediately piezo electric sensor will detect the
signal or if a car rolls over. Then with the help of GSM module and GPS module, the location
will be sent to the emergency contact. Then after conforming the location necessary action will
be taken. If the person meets with a small accident or if there is no serious threat to anyone’s
life, then the alert message can be terminated by the driver by a switch provided in order to
avoid wasting the valuable time of the medical rescue team.
1. EE6221 Robotics and Intelligent Sensors
(Part 3)
Lecture 1: Sensors
Guoqiang Hu
School of EEE, NTU
2. Outline
• Sensor classification
• Sensor characterization
• Motor encoders (position)
• Heading/orientation sensors
• Ground-based beacons
• Active ranging
• Speed sensors
• Vision (sensors, depth from focus, stereo vision, optic
flow, etc.)
Reference:
• Introduction to Autonomous Mobile Robots (ch. 4), R. Siegwart and I. Nourbakhsh, MIT Press, 2004
• Sensors for Mobile Robots, Theory and Applications, H.R. Everett, A K Peters Ltd.,
1995
3. Robot Sensors
• Robot sensors can be divided into two basic classes:
1) Internal State Sensors
2) External State Sensors
• Internal sensors are used for control of the internal
states of the robot/machine.
• External sensors are needed for the robot/machine to
react intelligently with its surroundings.
Sonar, IR, haptic feedback, tactile sensors, range-of-motion sensors, vision sensors (CCD cameras), etc.
4. Internal State Sensors
• Internal state sensors consist of devices used to measure position, velocity, or acceleration of robot joints and/or the end effectors. The following devices are some of the sensors that belong to this class:
• Potentiometers
• Synchros
• Linear inductive scales
• Differential transformers (LVDTs and RVDTs)
• Optical interrupters
• Optical encoders
• Tachometers
• Accelerometers
5. Internal State Sensors
The successful control of most robots depends on being able to obtain accurate information about the joint and/or end effectors.
It is necessary to have devices (transducers) that provide such information and can be readily utilized in a robot for this purpose.
In particular, position, velocity, and/or acceleration (or at least analog or digital representations of these quantities) must be measured to ensure that the robotic machine moves in a desired manner.
6. Internal State Sensors
The internal state sensors must not only permit the required degree of accuracy to be achieved, they must also be cost-effective, since each of the robot's axes will normally utilize such devices.
The sensor selection and the decision to place it either on the load side or on the output of the joint actuator itself is influenced by such factors as:
1) overall sensor cost,
2) power needs for a particular joint,
3) maximum permissible size of the actuator,
4) sensor resolution, and
5) the need to monitor directly the actions of the joint.
7. External State Sensors
The second class, called external state sensors, is used to monitor the robot's geometric and/or dynamic relation to its task, environment, or the objects that it is handling.
Although it is possible to utilize a robot without any external sensing, in order to achieve greater flexibility and intelligent abilities the robot will require such devices to be aware of its surroundings.
Robotic sensing mainly gives robots the ability to see, touch, hear and move, and uses algorithms that require environmental feedback.
Robots use different sensors to detect different factors of the environment.
8. External State Sensors
The usage of external state sensors:
• Vision: Visual sensors help robots to identify the surrounding environment and take appropriate action.
• Touch: Touch patterns enable robots to interpret human emotions in interactive applications. Robots use touch signals to map the profile of a surface in a hostile environment such as a water pipe.
• Hearing: Robots can perceive our emotions through the way we talk. Acoustic and linguistic features are generally used to characterize emotions.
• Movement: Provides a guidance system for an automated robot to determine the ideal path to perform its task.
9. External State Sensors
Instruments of external state sensors (not a full list):
Vision (camera, infrared, laser, RFID) for landmark recognition, 3D scene recovery, scene understanding, etc.
Tactile sensors: Four measurable features (force, contact time, repetition, and contact area change) can effectively categorize touch patterns.
Range finders (lasers, ultrasonic) for 3D scene recovery,
local position.
GPS (differential GPS) for global positioning
Odometry (encoders, gyroscopes, accelerometers) –
absolute motion recovery.
10. The Role of Multi-Sensor Integration and Fusion
The purpose of external sensors is to provide a system with useful information concerning some features of interest in the system's environment. This enables the machine to act intelligently via some intelligent programme.
The potential advantages in integrating and/or fusing information from multiple sensors are that the information can be obtained more accurately, concerning features that are impossible to perceive with individual sensors, as well as in less time, and at a lesser cost.
These advantages correspond, respectively, to the notions of redundancy, complementarity, timeliness, and cost of the information provided to the system.
11. The Role of Multi-Sensor Integration and Fusion
Redundant information is provided from a group of sensors (or a single sensor over time) when each sensor is perceiving, possibly with a different fidelity, the same features in the environment.
The integration or fusion of redundant information can reduce overall uncertainty and thus serve to increase the accuracy with which the features are perceived by the system.
Multiple sensors providing redundant information can also serve to increase reliability in the case of sensor error or failure.
12. The Role of Multi-Sensor Integration and Fusion
Complementary information from multiple sensors allows features in the environment to be perceived that are impossible to perceive using just the information from each individual sensor operating separately.
More timely information, as compared to the speed at which it could be provided by a single sensor, may be provided by multiple sensors either because of the actual speed of operation of each sensor, or because of the processing parallelism that may possibly be achieved as part of the integration process.
13. The Role of Multi-Sensor Integration and Fusion
Less costly information, in the context of a system with multiple sensors, is information obtained at a lesser cost as compared to the equivalent quality of information that could be obtained from a single sensor.
Unless the information provided by the single sensor is being used for additional functions in the system, the total cost of the single sensor should be compared to the total cost of the integrated multi-sensor system.
With redundancy, it is then possible to use lower-cost, lower-quality sensors that can be combined to achieve the same quality of reading as a single high-quality (thereby costly) sensor.
14. Classification of Sensors for Robots
Classification based on functions:
• Proprioceptive sensors
– Measure the internal state of the system (robot), e.g., angular position of motor wheel, motor speed, wheel load, battery status, etc.
– Examples: GPS, INS, shaft encoders, compass, etc.
15. Classification of Sensors for Robots
• Exteroceptive sensors
– Determine the measurements of objects relative to a robot's frame of reference (i.e., information about the robot's environment)
– Measure: e.g., distance from the robot to another
object
– Keep the robot from colliding with other objects
– Types: contact sensors, range sensors, vision sensors
• Exproprioceptive sensors (Directional Sensors)
16. Classification of Sensors for Robots
Classification based on where energy comes from:
• Active sensors
– emit their own energy and measure the reaction
– better performance, but some influence on the environment and generally consume more energy
– Examples: ultrasonic, laser, and IR sensors.
• Passive sensors
– receive energy already present in the environment
– consume less energy, but often have signal and noise problems
– Examples: camera
19. Sensor Characteristics
• Size, weight, power consumption, voltage, interface
• Field of view (FOV)
• Range (or full scale)
– lower and upper limits
– e.g., a triangulation range finder may provide unpredictable values outside its working range
• Dynamic range (e.g., camera pixels)
– ratio between the smallest and largest possible values of a changeable quantity (e.g., sound, light, etc.)
– often specified in decibels (ratio between powers)
– e.g., power measurement from 1 mW to 20 W: 10·log10(20/0.001) = 43 dB
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
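The decibel figure on this slide is easy to verify. A minimal Python sketch (the function name is illustrative, not from the lecture):

```python
import math

def dynamic_range_db(p_min, p_max):
    """Dynamic range in decibels for a ratio of two power levels."""
    return 10 * math.log10(p_max / p_min)

# The slide's example: power measurements from 1 mW to 20 W.
print(round(dynamic_range_db(0.001, 20.0)))  # 43 (dB)
```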
20. Sensor Characteristics
• Resolution
– smallest input change that can be detected (due to noise, physical limitations, or determined by the A/D conversion)
• Linearity
– variation of output signal as a function of the input signal
– sometimes expressed as % of full scale
• Hysteresis
– path-dependency (e.g., thermostats, some tactile sensors)
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
21. Sensor Characteristics
• Bandwidth or Frequency
– the speed at which a sensor can provide a stream of readings
– usually there is an upper limit depending on the
sensor and the sampling rate
– one has also to consider delay (phase) of the signal
• Sensitivity
– ratio of output change to input change
• Cross-sensitivity (and cross-talk)
– sensitivity to other environmental parameters
– influence of other active sensors
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
22. Sensor Characteristics
• Accuracy (exactitude, opposite of error)
– degree of conformity between the measurement and the true value
– often expressed as a portion of the true value: accuracy = 1 - |m - v|/v, where m = measured value, v = true value
• Precision (or repeatability)
– relates to the reproducibility of sensor results: precision = range divided by standard deviation
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
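Both definitions translate directly into code. A small sketch, following the slide's accuracy formula and computing precision from a list of repeated readings (the example numbers are illustrative):

```python
import statistics

def accuracy(m, v):
    """Accuracy = 1 - |m - v| / v, with m the measured and v the true value."""
    return 1 - abs(m - v) / v

def precision(readings):
    """Precision (repeatability) = range of readings / standard deviation."""
    return (max(readings) - min(readings)) / statistics.stdev(readings)

print(accuracy(9.8, 10.0))               # ~0.98, i.e. 98% accurate
print(precision([9.8, 10.1, 9.9, 10.2])) # dimensionless repeatability score
```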
23. Types of Errors
• Systematic errors: deterministic
– caused by factors that can (in theory) be modeled and predicted
– e.g., distortion caused by the optics of a camera
• Random errors: non-deterministic
– no prediction possible
– however, they may be described probabilistically
– e.g., hue instability of camera, black level noise of
photoreceptors, non-returning echoes in ultrasonic sensors, etc.
• Others
– cross-sensitivity of sensors, motion blur
– rarely possible to model -> appear as “random” errors but are neither systematic nor random
– systematic errors and random errors might be well defined in a controlled environment. This is often not the case for mobile robots!
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
24. Wheel/Motor Encoders
• Measure position or speed of the wheels or steering.
• Integrate wheel movements to get an estimate of the position ->
odometry
• Optical encoders are proprioceptive sensors; thus the position estimation in relation to a fixed reference frame is only valuable for short movements.
• Typical resolutions: 64 - 2048 increments per revolution.
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
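To make the "integrate wheel movements" step concrete, here is a minimal differential-drive odometry update in Python; the wheel radius, wheel base, and encoder resolution are assumed example values, not from the slides:

```python
import math

def update_pose(x, y, theta, ticks_left, ticks_right,
                ticks_per_rev=512, wheel_radius=0.03, wheel_base=0.15):
    """Integrate one pair of encoder readings into the robot pose (odometry).

    Geometric parameters are illustrative values, not from the lecture.
    """
    # Distance traveled by each wheel.
    per_tick = 2 * math.pi * wheel_radius / ticks_per_rev
    d_left = ticks_left * per_tick
    d_right = ticks_right * per_tick
    # Mean translation and heading change over the sampling interval.
    d = (d_left + d_right) / 2
    dtheta = (d_right - d_left) / wheel_base
    x += d * math.cos(theta + dtheta / 2)
    y += d * math.sin(theta + dtheta / 2)
    return x, y, theta + dtheta

# Equal tick counts -> straight-line motion along the current heading.
x, y, th = update_pose(0.0, 0.0, 0.0, 512, 512)
print(x, y, th)  # one full wheel revolution forward: x = 2*pi*0.03, y = 0, th = 0
```

Because each update only accumulates increments, any per-step error also accumulates, which is why the slide warns that the estimate is only valuable for short movements.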
25. Magnetic compasses
• In use since over 2000 B.C.
– when the Chinese suspended a piece of natural magnetite from a silk thread and used it to guide a chariot over land.
• Magnetic field on earth
– absolute measure of orientation.
• Large variety of solutions to measure the Earth's magnetic field
– mechanical magnetic compass
– direct measure of the magnetic field: Hall-effect, flux gate, magnetoresistive sensors
• Major drawback
– weakness of the Earth's magnetic field (0.3-0.6 gauss, i.e., 30-60 μT)
– easily disturbed by magnetic objects or other sources
– not working in indoor environments (except locally)
• Often used in attitude sensing…
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
26. Gyroscopes
• The goal of gyroscopic systems is to measure changes in vehicle orientation by taking advantage of physical laws that produce predictable effects under rotation.
• Three functioning principles:
– Mechanical
– Optical
– Micro-electromechanical systems (MEMS)
• Two categories
– Orientation gyros -> directly measure angles (very rare in robotics!)
– Rate gyros -> measure rotation velocity, which can be integrated…
• A problem common to all gyroscopes is that of drift => unless the error is corrected through reference to some alternate measurement, the drift will eventually exceed the required accuracy of the system.
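The drift problem for rate gyros can be illustrated in a few lines: integrating a constant, uncorrected bias makes the heading error grow without bound even when the vehicle is stationary (all numbers below are illustrative):

```python
def integrate_heading(rates, dt, bias=0.0):
    """Integrate rate-gyro readings (rad/s) into a heading angle (rad).

    'bias' models a constant sensor offset that the integration cannot
    distinguish from real rotation.
    """
    heading = 0.0
    for omega in rates:
        heading += (omega + bias) * dt
    return heading

dt = 0.01                  # 100 Hz sampling
true_rates = [0.0] * 6000  # the robot actually stands still for 60 s
ideal = integrate_heading(true_rates, dt)                # stays at 0.0
drifted = integrate_heading(true_rates, dt, bias=0.001)  # ~0.06 rad after 60 s
print(ideal, drifted)
```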
27. Mechanical Gyroscopes
• Rely on angular momentum (gyroscopic inertia): the tendency of a rotating object to keep rotating at the same angular speed about the same axis of rotation in the absence of an external torque.
• Reactive torque is proportional to the spinning speed, the rate of precession, and the wheel's inertia.
• No torque can be transmitted from the outer pivot to the wheel axis => the spinning axis will therefore be space-stable.
• Nevertheless, remaining friction in the bearings of the gyro axis introduces some errors over time (drift).
• A high-quality mechanical gyro can cost up to $100’000 and drift
may be kept as low as 0.02°/h
• Mechanically complex => not much used in mobile robotics.
28. Optical Gyroscopes
• Based on the behavior of an optical standing wave in a rotating frame (the Sagnac effect).
• Optical gyroscopes use the interference of light to detect mechanical rotation:
– two light beams travel (e.g. along an optical fiber of up to 5 km in
length) in opposite directions
– the beam traveling against the rotation experiences a slightly
shorter path than the other beam
– the resulting phase shift affects how the beams interfere with
each other when they are combined
– the intensity of the combined beam then depends on the rotation
rate of the device
• Resolution can be smaller than 0.0001°/s
29. MEMS Rate Gyros
• Based on vibrating mechanical elements to sense Coriolis acceleration.
• Coriolis acceleration is the apparent acceleration that arises in a rotating frame of reference.
• Various possible MEMS structures: tuning-fork, vibrating wheel, wineglass resonator
• MEMS gyros have no rotating parts, have low-power consumption
requirements, and are small => they are quickly replacing
mechanical and optical gyros in robotic applications.
30. Accelerometers and Pressure Sensors
• Accelerometers / inclinometers
– An accelerometer is a device that measures the proper acceleration
of the device.
– Conceptually, an accelerometer behaves as a damped mass on a spring.
– MEMS are based on a cantilever beam with a proof mass.
– The way of measuring the beam deflection is often capacitive or piezoresistive.
– MEMS accelerometers can have three axes => inclinometers.
• Pressure / airspeed / altitude
– Deforming membrane (piezoresistive, capacitive, optical,
electromagnetic, etc.)
– Either absolute or differential (Pitot tube)
– Example of the Freescale MPX:
• resolution of 4 cm in altitude
• variations due to atmospheric pressure changes over one day: up to 45 m
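Since a static three-axis accelerometer measures only gravity, tilt can be recovered from its readings; this is the inclinometer use mentioned above. A sketch using the standard atan2 formulation (valid only when no linear acceleration is present):

```python
import math

def tilt_angles(ax, ay, az):
    """Roll and pitch (rad) from a static 3-axis accelerometer reading.

    Assumes the sensor measures gravity alone (no linear acceleration).
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# Sensor tilted 45 degrees about its x axis: gravity splits between y and z.
g = 9.81
roll, pitch = tilt_angles(0.0, g / math.sqrt(2), g / math.sqrt(2))
print(math.degrees(roll))  # ~45
```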
31. Inertial Measurement Units (IMU) or Inertial Navigation
System (INS)
• A device that utilizes sensors such as gyroscopes and accelerometers to estimate the relative position (velocity, and acceleration) of a vehicle in motion.
• A complete IMU/INS maintains a 6-DOF estimate of the vehicle pose: position (x, y, z) and orientation (yaw, pitch, and roll).
• This requires integrating information from the sensors in real time.
• IMU/INS are extremely sensitive to measurement errors => an external source of information such as GPS and/or a magnetometer is often required to correct this (cf. ch. 8 for more information on sensor fusion).
• List of IMU/INS products: http://damien.douxchamps.net/research/imu/
(adapted from the Handbook of Robotics, 2008, ch.20.4)
32. Beacon-based positioning
• Beacons are signaling devices with a precisely known position.
• Beacon-based navigation has been used since humans started to travel
– Natural beacons (landmarks) like stars, mountains or the sun
– Artificial beacons like lighthouses
• The recently introduced Global Positioning System (GPS) revolutionized modern navigation technology
– Already one of the key sensors for outdoor mobile robotics
– For indoor robots GPS is not applicable
• Major drawback with the use of beacons indoors:
– Beacons require changes in the environment -> costly
– Limit flexibility and adaptability to changing environments
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
33. Passive optical beacons
• Passive retroreflective beacons of known position
• Distance and heading of at least two beacons are measured (e.g.
using a scanning laser range finder)
(adapted from Siegwart & Nourbakhsh, 2004, ch. 5)
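Measuring distance and heading to two beacons of known position determines the full 2D pose. A sketch under idealized assumptions (perfect measurements, with the two-circle ambiguity resolved by the second bearing); all names are illustrative:

```python
import math

def locate(b0, b1, r0, r1, bearing0, bearing1):
    """2D robot pose (x, y, theta) from ranges and robot-relative bearings
    to two beacons of known position b0, b1."""
    (x0, y0), (x1, y1) = b0, b1
    d = math.hypot(x1 - x0, y1 - y0)
    # Intersect the two range circles -> two candidate positions.
    a = (r0**2 - r1**2 + d**2) / (2 * d)
    h = math.sqrt(max(r0**2 - a**2, 0.0))
    mx, my = x0 + a * (x1 - x0) / d, y0 + a * (y1 - y0) / d
    candidates = [(mx + h * (y1 - y0) / d, my - h * (x1 - x0) / d),
                  (mx - h * (y1 - y0) / d, my + h * (x1 - x0) / d)]
    best = None
    for x, y in candidates:
        # Heading implied by the bearing to beacon 0,
        theta = math.atan2(y0 - y, x0 - x) - bearing0
        # checked against the bearing predicted for beacon 1.
        diff = math.atan2(y1 - y, x1 - x) - theta - bearing1
        err = abs(math.atan2(math.sin(diff), math.cos(diff)))  # wrap to [-pi, pi]
        if best is None or err < best[0]:
            best = (err, x, y, theta)
    return best[1:]

# Robot truly at (1, 2) with heading 0.3 rad; beacons at (0, 0) and (5, 0).
x, y, th = locate((0, 0), (5, 0), math.sqrt(5), math.sqrt(20),
                  math.atan2(-2, -1) - 0.3, math.atan2(-2, 4) - 0.3)
print(x, y, th)  # ~ (1.0, 2.0, 0.3)
```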
34. Active ultrasonic beacons
• Robots must know the emitter locations
• Using time of flight (TOF), they can deduce their position
• Time synchronization is required (e.g. RF or IR)
35. Global positioning system (GPS)
• Facts:
– Developed for military use by the US (NAVSTAR)
– Recently became accessible for commercial applications
– 24 satellites (including three spares) orbiting the Earth every 12
hours at a height of 20’190 km
– 4 satellites are located in each of six planes inclined by 55° with respect to the plane of the Earth's equator
– Location of any GPS receiver is determined through a time-of-flight measurement
• Technical challenges:
– Time synchronization between the individual satellites and the GPS receiver
– Real-time update of the exact location of the satellites
– Precise measurement of the time of flight
– Interferences with other signals, reflections
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
37. Global positioning system (GPS)
• Time synchronization:
– ultra-precision time synchronization is extremely important
– roughly 0.3m/ns => position accuracy proportional to precision of time measurement
– atomic clocks on each satellite
• Real time update of the exact location of the satellites:
– monitoring the satellites from a number of widely distributed ground stations
– master station analyses all the measurements and transmits the actual position to each of the satellites
• Exact measurement of the time of flight
– the receiver correlates a pseudocode with the same code coming from the satellite
– the delay time for best correlation represents the time of flight
the delay time for best correlation represents the time of flight
– quartz clock on the GPS receivers are not very precise
– range measurements to four satellites make it possible to determine the three
position values (x, y, z) and the clock correction dT
• Recent commercial GPS receivers have a position accuracy within 20 m (typ. 10 m) in the
horizontal plane and 45 m in the vertical plane, depending on the number of satellites within
line of sight and multipath issues. WAAS brings the accuracy close to 1-2 m.
• The update rate is typically between 1 and 4 Hz only.
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
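The four-satellite fix described above can be sketched numerically: with four pseudoranges rho_i = |p - s_i| + c*dT there are four equations for the four unknowns (x, y, z, dT), solvable by Newton iteration. This is an illustrative sketch, not a production solver; the satellite coordinates in the test and the helper names are made up.

```python
C = 299_792_458.0  # speed of light, m/s

def solve4(A, r):
    """Gaussian elimination with partial pivoting for a small square system."""
    n = len(A)
    M = [A[i][:] + [r[i]] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda k: abs(M[k][col]))
        M[col], M[piv] = M[piv], M[col]
        for row in range(col + 1, n):
            f = M[row][col] / M[col][col]
            for c in range(col, n + 1):
                M[row][c] -= f * M[col][c]
    x = [0.0] * n
    for row in range(n - 1, -1, -1):
        s = sum(M[row][c] * x[c] for c in range(row + 1, n))
        x[row] = (M[row][n] - s) / M[row][row]
    return x

def gps_fix(sats, pseudoranges, guess=(0.0, 0.0, 6.371e6), iters=15):
    """Newton iteration on rho_i = |p - s_i| + b for (x, y, z) and the
    receiver clock bias b = c * dT. Exactly four satellites: square system."""
    x, y, z = guess  # initial guess: somewhere on the Earth's surface
    b = 0.0
    for _ in range(iters):
        J, res = [], []
        for (sx, sy, sz), rho in zip(sats, pseudoranges):
            d = ((x - sx)**2 + (y - sy)**2 + (z - sz)**2) ** 0.5
            # Jacobian row: unit vector from satellite towards the receiver,
            # plus 1 for the clock-bias term
            J.append([(x - sx) / d, (y - sy) / d, (z - sz) / d, 1.0])
            res.append(rho - (d + b))
        dx = solve4(J, res)
        x += dx[0]; y += dx[1]; z += dx[2]; b += dx[3]
    return (x, y, z), b / C  # position and clock correction dT
```

The sketch also illustrates the 0.3 m/ns rule of thumb: a clock bias of a single nanosecond shifts every pseudorange by about 0.3 m, which is why the bias must be estimated alongside the position.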
38. TOF Range Sensors
• Range information:
– key element for obstacle avoidance, localization and environment modeling
• Ultrasonic sensors and laser range sensors exploit the propagation
speed of sound and of electromagnetic waves, respectively.
The distance traveled by a sound or electromagnetic wave is given
by d = c * t
• Where
– d = distance traveled (usually round-trip)
– c = speed of wave propagation
– t = time of flight
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
39. TOF Range Sensors
• Important to keep in mind:
– Propagation speed of sound: 0.3 m/ms
– Propagation speed of electromagnetic signals: 0.3 m/ns
– 3 meters correspond to 10 ms for an ultrasonic system versus only
10 ns for a laser range sensor
=> time of flight with electromagnetic signals involves very fast
electronics
=> laser range sensors are more expensive and delicate to design
• The quality of TOF range sensors mainly depends on:
– Uncertainties about the exact time of arrival of the reflected signal
– Inaccuracies in the time of flight measurement (laser range sensors)
– Opening angle of transmitted beam (especially ultrasonic range
sensors)
– Interaction with the target (surface, diffuse/specular reflections)
– Variation of propagation speed (sound)
– Speed of mobile robot and target (if not at stand still)
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
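The numbers above can be checked directly; the speeds are the slide's round figures, not exact physical constants:

```python
# Round figures from the slide: sound ~0.3 m/ms, light ~0.3 m/ns
SOUND_M_PER_S = 0.3e3  # 0.3 m/ms = 300 m/s (about 343 m/s in air at 20 C)
LIGHT_M_PER_S = 0.3e9  # 0.3 m/ns = 3e8 m/s

def time_of_flight_s(distance_m, speed_m_per_s):
    """t = d / c for a wave traveling distance d at speed c."""
    return distance_m / speed_m_per_s

# 3 m of travel takes ~10 ms for ultrasound but only ~10 ns for light,
# which is why laser TOF measurement needs very fast electronics.
```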
40. Ultrasonic Sensors
• Transmit a packet of (ultrasonic) pressure waves
• The distance d of the echoing object can be calculated from the
propagation speed of sound c and the time of flight t: d = c * t / 2
• Typical frequency: 40 - 180kHz
• Generation of sound wave: piezo transducer
– transmitter and receiver can be separated or not
• Sound beam propagates in a cone (approx.)
– opening angles around 20 to 70 degrees
– regions of constant depth
– segments of an arc
(sphere for 3D)
Typical intensity distribution of an ultrasonic sensor
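A minimal helper for the echo formula above (343 m/s is the speed of sound in air at about 20 °C; the slide's 0.3 m/ms is a round figure):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def echo_distance_m(tof_s, c=SPEED_OF_SOUND):
    """d = c * t / 2: the pulse travels to the object and back, hence the /2."""
    return c * tof_s / 2
```

A 10 ms round trip thus corresponds to an object roughly 1.7 m away, well within the range of a typical ultrasonic module.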
41. Laser Rangefinders
• Also known as laser radar or LIDAR (LIght Detection And Ranging)
• Transmitted and received beams are coaxial
• Transmitter illuminates a target with a collimated beam (laser)
• Diffuse reflection with most surfaces, because the wavelength is
typically 824 nm
• Receiver detects the time needed for round-trip
• An optional mechanism sweeps the light beam to cover the required
scene (in 2D or 3D).
(adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
42. Laser Rangefinders
• Time of flight measurement is generally achieved
using one of the following methods:
– Pulsed laser (e.g. Sick)
• direct measurement of time of flight
• requires resolving picoseconds (3m = 10ns)
– Phase shift measurement (e.g. Hokuyo)
• sensor transmits 100% amplitude modulated
light at a known frequency and measures the
phase shift between the transmitted and
reflected signals
• technically easier than the above method
43. Phase-shift laser Rangefinders
• Distance D between the beam splitter and the target is given by
(Woodbury et al., 1993): D = lambda * theta / (4 * pi)
where theta is the phase difference between the transmitted and
reflected beams and lambda = c / f is the modulation wavelength,
with c the speed of light and f the modulating frequency
• The uncertainty of the range (phase/time estimate) is inversely
proportional to the square of the received signal amplitude.
• Hence dark, distant objects will not produce as good range
estimates as closer, brighter objects.
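The phase-shift relation can be sketched numerically; lambda = c/f is the modulation wavelength, and the measurement is unambiguous only within half a wavelength, since theta wraps at 2*pi:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_shift_distance_m(theta_rad, mod_freq_hz):
    """D = lambda * theta / (4*pi), with lambda = c / f.

    The round trip covers 2*D, so a full 2*pi of phase shift corresponds
    to one modulation wavelength of round-trip travel, i.e. D = lambda/2.
    """
    lam = C / mod_freq_hz
    return lam * theta_rad / (4 * math.pi)
```

For example, at a 10 MHz modulation frequency the wavelength is about 30 m, so distances up to roughly 15 m can be measured without phase ambiguity; higher modulation frequencies trade ambiguity range for resolution.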