Smart home technologies aim to automate functions in the home and provide services to inhabitants through intelligent environments. Automation and robotics play an important role by controlling the physical environment through devices like automated blinds, thermostats, and doors. Personal service robots can assist with tasks like cleaning, lawn mowing, and assisting the elderly or disabled. For robots to operate autonomously in intelligent environments, they require the ability to integrate sensors, adapt to changes, and interact intuitively with humans.
This document discusses automation and robotics in intelligent environments. It describes how robots can automate functions in the home and provide services to inhabitants. It outlines various types of robots that can be used for tasks like home cleaning, lawn mowing, and assisting the elderly. The document also discusses the requirements for robots in intelligent environments, including autonomy, intuitive interfaces, and adaptation. It covers topics like modeling robot mechanisms, sensor selection, control architectures, and dealing with uncertainty.
Simultaneous Mapping and Navigation For Rendezvous in Space Applications Nandakishor Jahagirdar
1. The document describes a project to develop an autonomous navigation system for a robot using image processing. A camera on the robot captures images and sends them wirelessly to a workstation where edge detection algorithms are used to identify obstacles and determine a safe path for the robot.
2. An omni-directional robot platform called FIREBIRD V is used, which has three wheels placed 120° apart. Images are captured and transmitted to a workstation running MATLAB for processing using algorithms like Prewitt edge detection.
3. The processed images are used to detect edges in the environment and identify a safe local path for the robot to follow without collisions while navigating autonomously. This system could have applications for rendezvous in space.
SIMULTANEOUS MAPPING AND NAVIGATION FOR RENDEZVOUS IN SPACE APPLICATIONS Nandakishor Jahagirdar
The project develops an autonomous navigation system along with mapping of the path.
The robot senses the edges of objects in its path and moves without colliding with them. The main component is a camera, which captures images and transmits them wirelessly to a workstation.
Image processing is done on the workstation using MATLAB 2013a. An IR ranging device senses any objects ahead of the robot, and the robot changes direction accordingly to avoid collisions.
Thus, even when circumstances lead to errors in the output of the image processing algorithm, a decision can still be made using input from the IR sensors.
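The pipeline above runs capture, Prewitt edge detection, then a path decision in MATLAB on the workstation. As a rough illustration of the edge-detection step only, here is a minimal NumPy sketch of the 3x3 Prewitt operator; the threshold value and the synthetic test image are illustrative assumptions, not details from the project:

```python
import numpy as np

def prewitt_edges(img, thresh=0.25):
    """Binary edge map from 3x3 Prewitt kernels (naive convolution)."""
    kx = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(patch * kx)   # horizontal gradient
            gy[i, j] = np.sum(patch * ky)   # vertical gradient
    mag = np.hypot(gx, gy)
    if mag.max() > 0:
        mag /= mag.max()                    # normalize to [0, 1]
    return mag > thresh

# Synthetic 8x8 image with a vertical brightness step at column 4.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
edges = prewitt_edges(img)
```

The resulting boolean map marks the columns around the brightness step as edges, which is the kind of local obstacle boundary the navigation step would then route around.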
Smart home technologies aim to automate functions and provide services to inhabitants through intelligent environments. Automation can control the physical environment through devices like automated blinds and thermostats. Robots are also used for tasks like cleaning, lawn mowing, and assisting the elderly. For robots to operate autonomously in intelligent environments, they must be able to make decisions using sensors, complete complex tasks, and interact intuitively with humans. Control architectures balance deliberative planning with reactive behaviors to achieve goals while responding to changes.
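The balance between deliberative planning and reactive behaviors mentioned above can be caricatured in a few lines. This sketch is illustrative only (the layer structure and action names are assumptions, not from any cited system): the reactive layer gets priority for safety, and the deliberative plan fills in otherwise.

```python
def hybrid_step(goal, position, obstacle_near):
    """One step of a minimal hybrid controller."""
    if obstacle_near:            # reactive layer: safety behavior wins
        return "stop"
    # deliberative layer: follow the plan toward the goal
    if goal > position:
        return "forward"
    if goal < position:
        return "backward"
    return "idle"

action = hybrid_step(goal=10, position=0, obstacle_near=False)
```

Real architectures (subsumption, three-layer, etc.) are far richer, but the priority ordering shown here, reaction first, plan second, is the common core.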
This annual report summarizes the activities of the National Association of Private ICT Companies from Moldova (ATIC) in 2011. It provides an overview of ATIC's mission to promote the competitiveness of the ICT sector and its objectives. The report outlines ATIC's organizational structure, lists its board members and staff. It also recognizes ATIC's gold partners and sponsors. Finally, the report summarizes ATIC's accomplishments in key areas like education, entrepreneurship support, improving the business environment, market development, government engagement, and association promotion.
A document discusses parliamentary elections for Grand Duke and mentions various political factions and wine varieties. Elections are occurring and different political groups are involved including ones referred to as "Kuddelfleck", "kachkéis", and "Pastaschutta". Various white wine grapes are also listed such as Riesling, Pinot Gris, Pinot Blanc, and Auxerrois.
Meller et al (2012) Single Unit Firing Rates In Macaque SI (In Review) David
A manuscript describing my recent doctoral work characterizing the cortical representation of cutaneous sensory information. Currently in review for publication
Describes work as Program Manager at a small medical device start-up and work building a neurophysiology recording lab during my doctoral research at Arizona State University
This document provides information about an after-prom event for juniors and seniors attending prom. It states that the event will be alcohol, drug, and smoke free and supervised. Students can enjoy free food, drinks, games and activities all night with prizes. The event is on Friday, May 10th from 11:30pm to 5:00am at Bohrer Park and tickets are required. Students should dress comfortably and actively. The event is free to students and is funded entirely through donations from local businesses, school groups, fundraising events, and parents. The document provides tips for parents, advising against hosting their own after-prom parties or allowing alcohol and instead encouraging safe and responsible behavior.
Este documento presenta una breve introducción a la ciencia y la metrología. Explica que la ciencia es la recopilación y desarrollo del conocimiento del planeta Tierra y el universo a través de la experimentación metodológica. También describe que la metrología es la ciencia de la medición, que trata de los sistemas de unidades adoptados e instrumentos usados para realizar y interpretar medidas en diversos campos como la temperatura, el tiempo, la electricidad y la longitud.
This document provides a name - Lizelle Anne. No other context or details are given about this person. The single word "Lizelle Anne" is the only information contained within the document.
The document summarizes the main decision-making processes in the European Union. There are three main institutions involved - the European Parliament, the Council of the European Union, and the European Commission. The Commission proposes new legislation, which is then passed by the Parliament and Council. The three main legislative procedures are co-decision, consultation, and assent. Co-decision involves proposals going between the Parliament and Council through multiple readings and potential conciliation. Consultation allows Parliament to approve, reject, or request amendments to Commission proposals. Assent requires the Parliament's approval of important Council decisions.
Montenegro es un país candidato a la UE ubicado en los Balcanes. Su capital y ciudad más grande es Podgorica, con alrededor de 150,000 habitantes. El país tiene un sistema político presidencialista y su presidente actual es Milo Đukanović. La economía de Montenegro depende en gran medida del turismo y la agricultura.
This document contains a single name - Amira Luzille. No other context or details are provided about this person. The name Amira Luzille is the only information given in the original document.
The Republic of Austria is a sovereign member of the European Union with a population of 8.3 million people. Vienna is the capital city, located on the Danube River in central Europe. Austria joined the EU in 1995 and uses the euro as its currency. The country is a federal parliamentary republic and predominantly German-speaking.
Hand washing is important to prevent the spread of germs. Proper hand washing includes washing thoroughly with soap and water before and after meals and any time hands are visibly dirty. References provide more information on hand washing from medical experts to encourage healthy habits.
Christmas in Austria involves visiting Christmas markets, decorating trees with candles, and having a Christmas lunch that usually includes Austrian trout, turkey, Sachertorte cake, and treats from Weihnachtsbaeckerei bakeries. People also exchange gifts and take part in various Christmas activities.
Finland is located in Northern Europe between Sweden and Russia. It has a total area of 337,030 km2 and a population of around 5.3 million people. Helsinki is the capital and largest city of Finland. Finland has been a member of the European Union since 1995 and uses the euro as its currency. The country has a predominantly Lutheran population and Finnish and Swedish are its official languages.
The document provides details about the Securities Market (Advanced) Module offered by the National Stock Exchange of India. It lists 34 modules offered along with details of each module such as fees, duration of test, number of questions, maximum marks, and pass percentage. The modules cover a wide range of topics related to financial markets including equity, debt, derivatives, banking, insurance, and financial planning. The document emphasizes the important role of securities markets in promoting economic growth and outlines various reforms implemented in the Indian capital markets.
The document provides information about the country of Cyprus. It discusses that Cyprus is an island in the Mediterranean Sea located south of Turkey. Nicosia is the divided capital city, with the southern and northern portions separated by a Green Line. The official languages are Greek and Turkish, with Greek spoken by 84% of the population. Cyprus uses the Euro as currency and has a culture with influences from both Greek and Turkish traditions.
Christmas traditions in Sweden include celebrating Advent and St. Lucia's Day, decorating with ornaments, and exchanging gifts on Christmas Eve followed by a meal. Christmas Day is spent with family, and celebrations continue on Stephen's Day and Knut's Day.
A Review On AI Vision Robotic Arm Using Raspberry Pi Angela Shin
This document summarizes a research project that designed an artificial intelligence (AI) vision robotic arm controlled by a Raspberry Pi. The robotic arm has 6 degrees of freedom and is intended to perform multifunctional tasks like detecting, identifying, grasping, and repositioning objects. A computer vision system with a camera recognizes objects and their spatial positions to control the robotic arm's movement. The vision system runs on the Raspberry Pi's computing power to recognize objects in real time based on software commands. The study aims to actuate and automate the various axes of the manipulator to lift, carry, and place objects as desired using integrated electric motors and a vision-based control system.
Design and Analysis of Robotic Rover with Gripper Arm using Embedded C Editor IJCATR
This paper presents the development and working of a robotic rover performing various autonomous tasks to identify, pick, and drop an object at an appropriate position using an 8051 microcontroller in conjunction with embedded C programming. The system employs infrared proximity sensors, DC geared motors, a microcontroller board, etc. Robotic rover technology offers many applications in space exploration, military operations, etc. The motive of this research work is to design a wireless, autonomously controlled robotic rover capable of completing tasks with high accuracy on smooth terrain.
Applications Of Mems In Robotics And Bio Mems Using Psoc With Metal Detector ... IOSR Journals
Abstract: This project deals with an accelerometer-controlled robot with wireless image and voice transmission as well as a metal detector. The robot is a prototype for the “Path Finder”. It is controlled by a PSoC device using a MEMS accelerometer remote, and can move in forward and reverse directions using 60 RPM geared motors. The robot can also take sharp turns to the left and right. A highly sensitive induction-type metal detector, designed using the Colpitts oscillator principle, is fixed to the robot, and a wireless camera with voice is interfaced to the kit. When the robot is moving on a surface and metal is detected, the system produces a beep sound that is transmitted to a remote place. Simultaneously, images around the robot are transmitted to the remote place, so the user can monitor the images and metal detection alarms on a television. Keywords: PSoC Designer 1.0, Keil C, PSoC device (CY8C29466), AT89S52.
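The Colpitts detection principle mentioned in the abstract rests on a standard formula: the oscillator resonates with the two tap capacitors in series across the coil, and metal near the coil shifts the effective inductance and hence the frequency, which is what the detector senses. A small sketch with illustrative component values (not taken from the paper):

```python
import math

def colpitts_freq(L, C1, C2):
    """Resonant frequency of a Colpitts oscillator, in Hz.

    The two capacitors appear in series across the inductor:
    f = 1 / (2 * pi * sqrt(L * C1*C2/(C1+C2))).
    """
    C = C1 * C2 / (C1 + C2)
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

# Illustrative values: a 100 uH search coil with two 10 nF capacitors.
f0 = colpitts_freq(100e-6, 10e-9, 10e-9)

# Metal near the coil lowers the effective inductance (here, assumed
# to drop to 90 uH), raising the oscillation frequency; the detector
# triggers its beep on that shift.
f_metal = colpitts_freq(90e-6, 10e-9, 10e-9)
```

With these assumed values the base frequency comes out around 225 kHz, and the simulated metal-induced inductance drop raises it by roughly 12 kHz.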
1) The document describes the design and implementation of a pick and place robot using a PIC microcontroller, sensors, and DC motors. It includes the mechanical design of the robotic arm and gripper.
2) Simulation results show the robot arm moving in response to signals from the PIC microcontroller to the DC motors. The real-world behavior is then compared to the simulation results.
3) Different robot configurations - including Cartesian, cylindrical, parallel, and SCARA - are evaluated in terms of their advantages and disadvantages for various applications. The document concludes that the articulated robot arm performed pick and place tasks as intended.
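Pick-and-place arm motion like that described above is commonly reasoned about through forward kinematics: given joint angles, compute where the gripper ends up. As a hedged aside, the 2-link planar simplification and the link lengths below are assumptions for illustration, not details from the document:

```python
import math

def fk_2link(theta1, theta2, l1=0.20, l2=0.15):
    """Forward kinematics of a planar 2-link arm.

    theta1, theta2: joint angles in radians.
    l1, l2: link lengths in meters (illustrative values).
    Returns the gripper (x, y) position.
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# With both joints at zero the arm is fully extended along x.
x, y = fk_2link(0.0, 0.0)
```

Comparing such a kinematic model against measured motor positions is essentially what the simulation-versus-real-world comparison in point 2 amounts to.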
The document describes the development of a low-cost robotic arm sample changer for a neutron powder diffractometer. A $500 hobby robotic arm was modified with an antibacklash spring and used to retrieve samples stored horizontally on a sample plate. Self-centering mechanisms like conical surfaces and magnets helped ensure reproducibility despite imprecisions in the arm. Programming techniques like common starting positions and minimal on-time further reduced errors. The system demonstrated that inexpensive model robots can automate simple crystallography tasks when advanced options are prohibitive due to space or funding.
This paper introduces a system that uses an ARM-based microcontroller and wireless sensors to control various devices and to monitor information regarding CNC machine parameters using Wi-Fi technology. If there is any error in a machine, it cannot be recognized by a person sitting in the office, and the existing system is difficult to maintain and consumes a lot of time in communication between technical persons. To overcome this problem, we develop a system that gives information to the respective technical person according to the error detected.
This paper focuses on developing a platform that helps researchers create, verify, and implement their machine learning algorithms on a humanoid robot in a real environment. The presented platform is durable, easy to fix and upgrade, fast to assemble, and cheap. Using this platform, we also present an approach to a humanoid balancing problem that uses only a fully connected neural network for real-time balancing. The method consists of three main steps: 1) different types of sensors detect the current position of the body and generate the input for the neural network, 2) the fully connected neural network produces the correct output, and 3) servomotors make movements that change the current position to the new one. During field tests the humanoid robot can balance on a moving platform that tilts up to 10 degrees in any direction. Finally, we show that our platform lets researchers compare different neural networks under similar conditions, which is important for analyses in machine learning and robotics.
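The sensors-to-servos pipeline in the three steps above can be sketched as a single NumPy forward pass: sensor readings in, fully connected network, joint corrections out. The layer sizes, random weights, and sensor layout below are assumptions for illustration, since the summary does not specify them:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, W1, b1, W2, b2):
    """Fully connected network: sensor vector -> servo corrections."""
    h = np.tanh(W1 @ x + b1)      # hidden layer with tanh activation
    return W2 @ h + b2            # linear output: joint corrections

# Assumed sizes: 6 sensor channels (e.g. accelerometer + gyro axes),
# 16 hidden units, 4 servo outputs. Weights here are random stand-ins
# for whatever the training procedure would produce.
W1 = rng.normal(0.0, 0.1, (16, 6)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.1, (4, 16)); b2 = np.zeros(4)

tilt = np.array([0.17, 0.0, 0.0, 0.02, 0.0, 0.0])  # ~10 deg of roll
correction = forward(tilt, W1, b1, W2, b2)
```

In the deployed loop, step 3 would send `correction` to the servomotors each control tick, moving the body back toward balance before the next sensor read.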
This document outlines the design of a tour guide robot for the Chambers Technology Center building. It includes sections on the system design, hardware and software research, project development, justifications for design choices, test results, conclusions, applications, lessons learned, and future improvements. The robot uses sensors and microcontrollers to navigate autonomously around obstacles while providing verbal descriptions of points of interest on its tour route. Hardware includes ultrasonic sensors for obstacle avoidance, a compass sensor for navigation, and a Raspberry Pi for voice recognition and speech. Software includes algorithms for navigation and the Voicecommand program. The team developed the system over the semester and tested its performance.
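The ultrasonic obstacle-avoidance readings described above come from a standard time-of-flight calculation: the ping travels out and back, so the range is half the round trip at the speed of sound. This sketch is a generic illustration, not the team's code:

```python
def ultrasonic_distance_m(echo_time_s, c=343.0):
    """Range from an ultrasonic echo.

    echo_time_s: measured round-trip time of the ping, in seconds.
    c: speed of sound in air at ~20 C, in m/s.
    The echo covers the distance twice, so halve the round trip.
    """
    return c * echo_time_s / 2.0

# A round trip of about 5.83 ms corresponds to roughly 1 meter.
d = ultrasonic_distance_m(0.00583)
```

A navigation loop would compare ranges like `d` against a clearance threshold before committing to the next segment of the tour route.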
International Journal of Computational Engineering Research (IJCER) is an international, English-language monthly online journal. This journal publishes original research work that contributes significantly to furthering scientific knowledge in engineering and technology.
The document summarizes a student project to design a data collection system using SunSPOT devices and Bioloid robots. The project involves developing software for the SunSPOT and robot architectures, as well as a user interface to gather and display sensor data from the SunSPOT devices. Within their budget and timeline, the student team was able to complete over 80% of the specified requirements and learned valuable lessons around project management and technical skills.
Precision robotic assembly using attractive regions (ijmech)
This document summarizes a research paper on precision robotic assembly using attractive regions. The key points are:
1) Researchers developed a new method to decompose the 6-dimensional configuration space of a peg-in-hole assembly task into two 3D subspaces using the concept of "attractive regions."
2) An impedance controller is integrated with the attractive regions approach to allow the robotic system to achieve human-like assembly performance without force sensing.
3) The approach is experimentally validated using a 7 degree-of-freedom robotic arm to insert three different prismatic pegs into holes on a fixed base, guided by stereo vision identification of the parts.
Design and development of touch screen controlled stairs climbing robot (eSAT Journals)
Abstract: This paper presents a method of developing a stairs-climbing robot with a self-balancing chair mounted on top. This is a major task in the field of mechatronics, requiring a mechanical arrangement and electronic control of the actuators using wireless technology. In most mechanisms it is hard to maintain the slope position of the seat while carrying goods on it, so taking all these conditions into account, the robot is designed and developed [1] to climb stairs and adjust itself to environmental conditions. Keywords: Accelerometer, CC2500, Touch Screen, Microcontroller, Relays
The document describes a robotic arm subsystem for a free-flying robot on the International Space Station. It provides an overview of the free-flying robot and its components, including a robotic arm. Key requirements for the arm subsystem are listed, such as a mass limit of 1kg, range of motion specifications, grasping capabilities, and electrical safety standards. The task is to decompose the arm subsystem and document the process.
This document describes a final year project to develop a gesture controlled robotic arm. A team of 4 students will build the robotic arm and a wearable hand glove controller. Sensors in the glove will detect hand gestures which will wirelessly control the motion of the robotic arm. The aim is to allow intuitive human-machine interaction. The robotic arm will use servos for motion and the glove will use flex sensors and an accelerometer to detect gestures. An Arduino microcontroller will process the glove sensor data and send commands to the arm over Bluetooth. Potential applications include industrial tasks like assembly and materials handling.
The document outlines a project proposal for developing an Arduino-controlled robotic arm. It includes sections on motivation, aims and objectives, literature review, block diagram, hardware and software requirements, circuit diagram, interfacing diagram, component specifications, timeline, advantages and limitations, conclusions, and references. The overall goal is to create a simple robotic arm that can be programmed to perform repetitive tasks as a way to increase productivity in industrial settings.
This document describes a final year project to build a gesture controlled robotic arm. A team of 4 students will build both a robotic arm and a gesture controlled glove. The arm will have 6 axes of rotation and be able to lift up to 1kg. The glove will contain flex sensors and an accelerometer to detect hand gestures and wirelessly control the arm's movement. The goal is to allow intuitive control of the robotic arm through natural hand gestures. Applications could include industrial tasks like welding or materials handling.
Humans have evolved to survive better, and their inventions have evolved with them. Today, large numbers of robots replace manpower in severe or dangerous workplaces, and it is important to nurture this technology as robot development progresses. This paper proposes an autonomous moving system that automatically finds a target in a scene, locks onto it, approaches it, and strikes it through a shooting mechanism. The main objective is to provide a reliable, cost-effective, and accurate technique to destroy an unusual threat in the environment using image processing.
1) The document describes a wireless mobile robot designed for military applications that uses sensors to interact with the physical world and navigate under the control of a base station.
2) The robot is designed to complete all tasks like detecting obstacles, border security, recording audio/video, and firing within critical time limits for safety.
3) The robot's functions are managed by the real-time operating system Salvo on an 8051 microcontroller to ensure tasks meet deadlines even when resources are fully utilized.
Virtual environment for assistant mobile robot (IJECEIAES)
This paper presents the development of a virtual environment for a mobile robotic system that recognizes basic voice commands, specifically commands to bring or take an object from a specific destination in residential spaces. Recognition of the voice command and of the objects with which the robot assists the user is performed by a machine vision system based on captures of the scene where the robot is located. For each captured image, a region-based convolutional network with transfer learning identifies the objects of interest. For human-robot interaction through voice, a six-convolution-layer convolutional neural network (CNN) recognizes the commands to carry and bring specific objects inside the residential virtual environment. The convolutional networks allowed adequate recognition of words and objects which, through the associated robot kinematics, drive the execution of carry/bring commands, yielding a navigation algorithm that operates successfully, with object manipulation success exceeding 90% and the robot able to move in the virtual environment even when objects obstruct the navigation path.
2. THE SMORG NEUROPHYSIOLOGY LABORATORY
Introduction
The SensoriMotor Research Group (SMoRG) was founded at Arizona State University in
2006 to investigate sensorimotor learning and representations in the nervous system, as well as
the neural mechanisms that enable fine motor skills. At its inception, total SMoRG assets
included people, ideas and a profoundly empty laboratory space in which to combine them to
produce meaningful science. This chapter will describe the process of developing the neural
recording laboratory, where the experimental work of this manuscript was accomplished. A
description of this work is fitting because it featured significant technical accomplishments,
produced a novel experimental facility and required a sustained effort of more than two years to
complete. The overall goal was clear, even if the path to achieve it was not: develop an
experimental facility that included a robot arm, 3D motion capture, virtual reality simulation, a
cortical neural recording system and custom software to integrate it all.
Robot Arm
Installation. A six-axis industrial robot (model VS-6556G, Denso Robotics, Long Beach,
CA, USA) was acquired for object presentation during behavioral experimental tasks (Figure
8(a)). The very first task required fabrication of a platform on which to mount the robot in a
secure yet mobile way. A space frame cube was assembled from extruded aluminum segments
(1530 T-slotted series, 80/20® Inc., Columbia City, IN, USA) with bolted corner gussets for
maximum structural integrity. The top and bottom faces of the cube were covered with a single
piece of 0.25 in. thick plate steel to which the base of the robot was attached with stainless steel
bolts. The entire robot platform was supported by swivel joint mounting feet at the corners and
rested on a 0.5 in. thick rubber pad to dampen the vibration and inertial loads resulting from robot
movement.
Figure 8. Robot and Associated Hardware. A. The 6-axis industrial robot was mounted on a
sturdy platform and controlled using custom software. Dedicated signal and air channels routed
through the robot enabled feedback from a 6-DOF F/T sensor, object touch sensors and control of
a pneumatic tool changer. B. The robot end effector. The F/T sensor (b2) was mounted directly to
the robot end effector (b1). The master plate of the tool changer (b3) was mounted to the F/T
sensor using a custom interface plate. Air lines originating from ports on the robot controlled the
locking mechanism of the master plate. C. Grasp object assembly. The object was mounted to a
six-inch standoff post that mounted to a tool plate. Touch sensors were mounted flush with the
object surface and wires were routed to the object interior for protection. Power and signal lines
were routed through a pass-through connector (not visible), through the robot interior to an
external connector on the robot base. Small felt discs on each sensor were used for grasp training.
The large flange extending from the bottom of the object was a temporary training aide to guide
the subject’s hand to the correct location.
Programming. The robot included a tethered teach pendant interface device through
which basic simple operation of the robot could be accomplished either through direct control of
a specific axis, or by executing a script written in the PAL programming language. The
behavioral task planned for our experiments required real-time programmatic control of robot
actions, requiring the development of custom software routines using a software development kit
(SDK) provided by the manufacturer (ORiN-II, Denso Robotics). The routines implemented basic
movement commands to pre-defined poses in the working space. Pose coordinates (position,
rotation, pose type) were determined by manually driving the robot to a desired pose using the
teach pendant, then using motor encoder data to read back the actual coordinates. Programmatic
control included the ability to select the desired pose, speed, acceleration and other secondary
movement parameters. For selected operations involving a stereotyped sequence of basic
movements (retrieving or replacing grasp objects) individual commands were grouped into
compound movements to simplify user programming and operation.
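The grouping of basic pose moves into compound movements can be sketched as follows. This is an illustrative Python stand-in, not the actual C++/ORiN-II routines: the `Robot` class, pose names, and `move()` signature are hypothetical.

```python
# Sketch of grouping basic pose moves into compound movements. The pose
# names, Robot class, and move() parameters are hypothetical stand-ins
# for the ORiN-II/LabVIEW routines described in the text.

class Robot:
    def __init__(self):
        self.log = []  # record of commanded movements

    def move(self, pose, speed=50, accel=50):
        """Command a basic movement to a pre-defined pose."""
        self.log.append((pose, speed, accel))

def retrieve_object(robot, holder_slot):
    """Compound movement: stereotyped sequence to fetch a grasp object."""
    robot.move("approach_" + holder_slot, speed=80)
    robot.move("dock_" + holder_slot, speed=20)   # slow, precise docking
    robot.move("lift_" + holder_slot, speed=20)
    robot.move("present", speed=80)               # carry object to workspace

robot = Robot()
retrieve_object(robot, "slot1")
assert len(robot.log) == 4
```

A single `retrieve_object` call thus replaces four individually scripted pose commands, which is the simplification the compound movements provided to the operator.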
The custom software routines were developed in the C++ programming language and
compiled into a library of functions accessed by code interface modules in the LabVIEW®
graphical programming environment (National Instruments Corporation, Austin, TX, USA). An
intuitive graphical user interface (GUI) was developed in LabVIEW® allowing the user to easily
operate the robot from a computer connected to the robot controller through the local network.
Safety Measures. Errors in robot operation were capable of causing considerable damage
to the experimental setup, including the robot itself. To mitigate this possibility, robot control
programs developed in LabVIEW® actively monitored force and torque data acquired from a 6-
axis force/torque (F/T) sensor (Mini85, ATI Industrial Automation, Inc., Apex, NC, USA)
mounted on the robot end effector (Figure 8(b)). Maximum force and torque limits for each
movement were specified, tailored to purely inertial loads during movement or to direct loading
during object retrieval, replacement and behavioral manipulation. The robot was immediately
halted if these limits were exceeded. The addition of the F/T sensor also added the capability of
monitoring kinetics of object manipulation for scientific analysis.
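The safety check amounts to comparing each component of the F/T reading against per-movement limits and halting on any violation. A minimal Python sketch, with illustrative limit values (the actual monitoring ran in LabVIEW):

```python
# Minimal sketch of the force/torque safety check: flag a halt as soon as
# any component of the 6-axis F/T reading exceeds the limit set for the
# current movement. Limit values here are illustrative.

def ft_exceeds_limits(reading, force_limit, torque_limit):
    """reading = (Fx, Fy, Fz, Tx, Ty, Tz) from the 6-DOF F/T sensor."""
    fx, fy, fz, tx, ty, tz = reading
    force_ok = all(abs(f) <= force_limit for f in (fx, fy, fz))
    torque_ok = all(abs(t) <= torque_limit for t in (tx, ty, tz))
    return not (force_ok and torque_ok)

# Tight limits suit purely inertial moves; direct loading during object
# retrieval would be checked against looser limits.
assert not ft_exceeds_limits((1.0, 0.5, -2.0, 0.1, 0.0, 0.2), 5.0, 1.0)
assert ft_exceeds_limits((1.0, 0.5, -7.5, 0.1, 0.0, 0.2), 5.0, 1.0)  # halt
```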
Tool Changer. A pneumatic tool changer (QC-11, ATI Industrial Automation Inc.) was
the final element of the overall robot system (Figure 8(b)). This enabled the robot to retrieve
presentation objects from the tool holder mounted to the side of the robot platform. A master
plate was mounted directly to the force/torque sensor via a custom interface plate and connected
to compressed air lines, which operated the locking mechanism. The air lines connected directly
to dedicated air channels routed through the interior of the robot, which was supplied by a gas
cylinder mounted nearby. Internal solenoid valves in the robot were controlled programmatically
(via LabVIEW®) to operate the tool changer during object retrieval and replacement. A tool plate
was attached to each grasp object to interface with the master plate. Each tool plate was fitted
with four mounting posts that aligned the tool in the object holder for reliable and repeatable
object retrieval.
Grasp Objects
Object Design. Grasp objects were designed to elicit in the experimental subject a
variety of hand postures in order to investigate the sensory feedback resulting from each. Initially,
up to seven different objects were envisioned including simple polygonal shapes (cylinder,
rectangular polygon, convex polygon and concave polygon) as well as objects requiring specific
manipulation (e.g., squeezing, pulling, etc.) for successful task completion. The behavioral task
used for the research described in this manuscript required just two objects; small and large
versions of a modified cylinder design. An early version of the small object used for training is
shown in Figure 8(c).
Initially, grasp objects were machined out of solid polymer materials such as
polytetrafluoroethylene (Teflon®, DuPont) or polyacetal (Delrin®, DuPont). However, fabrication
quickly shifted to stereolithography (rapid prototyping) techniques to speed production and
reduce cost during numerous design iterations. The resulting prototype objects proved to be
sufficiently robust to withstand the rigors of repeated use. The modified cylinder design was
developed primarily in response to the need to register precise finger placement during grasping
using surface mounted resistive touch sensors (TouchMini v1.2, Infusion Systems, Montreal,
Quebec, Canada). Simple cylindrical designs could not balance the size of the object (cylinder
diameter, which drove hand aperture) with the need for a relatively planar surface on which to
attach the touch sensors. Mounting the flexible sensors on a curved surface would have
introduced an undesired bias into the output, which was modulated by deformation or bending of
the sensor. The solution was to essentially unfold the surface of a cylinder into an extended
surface whose center portion was curved to accept the palm of the hand, while the peripheral
portions merged into a relatively flat surface. These complex shapes were perfectly suited to the
stereolithography process, and had the added benefit of opening up additional space in the interior
of the object that was used to route and protect delicate electrical connections from wear and tear.
Touch Sensors. Thin (0.04 in, 1 mm), circular (∅0.75 in, 19 mm) resistive touch sensors
were glued directly to the outer surface of the object in shallow indentations that perfectly
matched the thickness and diameter of the sensor. This prevented the monkeys from picking at
the edges since the surface appeared to be uniform except for a slight change in texture. At the
center of each indentation was a deeper well that permitted further indentation of the flexible
sensor, which increased the magnitude and reliability of the output in comparison to mounting on
a flat surface. Wires were immediately routed inside of the object for protection. Sensors were
placed at locations where the distal phalange of the thumb, index and middle fingers contacted the
object surface when a prototype version was pressed into the hand of the first monkey to be
trained in the behavioral task (monkey F). Electrical connections were routed to a 10-pin pass-
through connector on the tool plate that made electrical connections to a corresponding connector
when an object was retrieved by the tool changer. Signals were routed through dedicated lines
inside the robot and emerged at a master connector on the robot base. From here, the signals were
routed to the behavioral control software (LabVIEW®) and actively monitored to indicate
successful object grasping.
Grasp Training. The F/T sensor and touch sensors were excellent tools for training
monkeys to grasp the objects in a specific and repeatable way. The desired interaction was a
precision grip in which the distal phalange of the thumb, index and middle digits contacted the
object at the location of the sensors and maintained simultaneous supra-threshold contact for at
least 250 ms.
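The grasp criterion — simultaneous supra-threshold contact on all three sensors for at least 250 ms — can be sketched as a simple scan over time-stamped sensor samples. The threshold value and sample layout are illustrative:

```python
# Sketch of the grasp criterion: all three touch sensors must be above
# threshold simultaneously for at least 250 ms. Timestamps are in
# seconds; the threshold value is illustrative.

def grasp_detected(samples, threshold=0.5, hold_s=0.25):
    """samples: list of (t, thumb, index, middle) sensor readings."""
    start = None
    for t, *vals in samples:
        if all(v > threshold for v in vals):
            if start is None:
                start = t              # contact onset
            if t - start >= hold_s:
                return True            # held long enough
        else:
            start = None               # contact broken; reset timer
    return False

samples = [(i * 0.01, 0.8, 0.9, 0.7) for i in range(30)]  # 300 ms contact
assert grasp_detected(samples)
assert not grasp_detected(samples[:20])  # contact too brief
```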
Basic Interaction. The first training stage was to establish the connection between the
object and reward. Touch sensor feedback was not used during this stage. Instead, feedback from
the F/T sensor was used to register physical interaction with an object presented directly ahead of
the monkey. Any contact with the object was immediately rewarded with several drops of juice.
Initially, these interactions often involved slapping or scratching the object. This behavior was
steadily eliminated by withholding reward (and an audible cue) when such actions resulted in
excessive force or torque levels. The basic interaction training stage was complete when the
monkey had learned to consistently place its hand on the object without exceeding force and
torque thresholds.
Fine Tuning. This stage involved training the monkey to place the thumb, index and
middle digits directly on the touch sensors. F/T feedback was not used to register successful
interaction, rather, only to detect excessive force applied to the object. In this case, the audible
cue was played, the object was withdrawn from the workspace and no reward was given. Small
felt discs approximately 2 mm in height were attached to each touch sensor to attract the
monkey’s attention during haptic exploration of the object. Initially, brief (10 ms) contact with
any of the three sensors was sufficient to earn a juice reward. Next, brief simultaneous contact
with any two sensors was sufficient then, finally, contact with all three sensors was required to
earn the juice reward. The final step was to steadily increase the required grasp duration to 250
ms.
Motion Capture
Our experiments required that the 3D position and orientation of the subject’s hand were
captured at all times. This information served two primary functions. First, it was used to animate
the motion of hand and object models in a virtual reality simulation in which subjects would
eventually be trained to carry out the behavioral task. Second, the data were used to reconstruct
the kinematics of hand movement during the task, which could be correlated with simultaneously
recorded neural activity.
Kinematic analysis of hand movement is a technically challenging undertaking,
especially for the hand and even more so for the child-sized hand of the juvenile macaques used
in this research. Detailed reconstructions require tracking the orientation of individual digit
segments (implying two markers per segment) with millimeter precision. Markers attached to the
segments are often occluded by the movement of adjacent digits or by intervening experimental
apparatus. Active markers require power and signal connections, which quickly becomes a
logistical challenge of routing wires and connections while minimizing the impact to the
underlying behavioral task.
Approach #1: Passive Marker Motion Capture. The first approach was to implement a
camera-based motion capture system using passive detection markers (Vitrius, Tenetec
Innovations AG, Zürich, Switzerland). In theory, this approach offered several advantages for
mitigating the challenges of motion capture described above. First, passive markers required no
power or signal lines, thus eliminating a significant degree of logistical complexity and increasing
reliability. Second, the Vitrius system was predicated on a unique approach that promised to
dramatically reduce the number of cameras and markers required for high-precision motion
capture; 3D position determination with just one camera and one marker. All other known
camera-based motion capture systems ultimately derive 3D position from an estimation of the
parallax between two distributed observations of a point in space. By contrast, the Vitrius system
calculated position by estimating the linear distance between the camera focal plane and a flat,
square marker of known size. The relationship was simple; the smaller the marker’s focal plane
representation (pixel area), the further its distance along a ray extending from the center of the
detected area. The trajectory of the ray itself was determined by the optical properties of the lens
and the orientation of the camera. A unique pixelated pattern on each marker was used for
identification. An example of a Vitrius marker is shown in Figure 9(d), where several individual
markers have been attached to the faces of a cube-shaped base.
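The single-camera range estimate described above follows from the pinhole model: a square marker of known side length S at distance Z projects to a side of f·S/Z pixels, so Z = f·S/√(pixel area). A sketch with illustrative focal length and marker size:

```python
import math

# Sketch of the Vitrius-style single-camera range estimate: the smaller
# the marker's pixel area, the farther it is along the viewing ray.
# Focal length and marker size below are illustrative.

def marker_distance(pixel_area, marker_side_m, focal_px):
    side_px = math.sqrt(pixel_area)        # apparent side length in pixels
    return focal_px * marker_side_m / side_px

# A 3 mm marker seen at 0.5 m with f = 1000 px projects to a 6 px side:
z = marker_distance(36.0, 0.003, 1000.0)
assert abs(z - 0.5) < 1e-9
```

Note that this model assumes the marker faces the camera squarely; viewing it off-perpendicular shrinks the apparent area, which is exactly the distance over-estimation the text reports.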
Numerous shortcomings of the Vitrius system quickly became apparent. At best, position
accuracy was 10-20 mm, an order of magnitude greater than the required value. Camera
resolution was insufficient to adequately capture the small (3 mm²) markers required for the
monkey hand. When markers were viewed by the cameras at any angle other than perpendicular
to the focal plane, the apparent decrease in detected marker area resulted in an accompanying
over-estimation of marker distance. The system had no integrated calibration procedure, requiring
the user to manually measure the position and orientation of each camera. Finally, the Vitrius
software was poorly designed and implemented, resulting in frequent system crashes and loss of
data.
Approach #2: Data Gloves. Given the substantial shortcomings of the Vitrius system,
effort quickly shifted to developing non-camera based methods for capturing hand posture. One
option was to adapt a data glove (Figure 9(a)) for use on a monkey hand. Developed for
applications such as virtual reality simulations, video gaming, and animation, data gloves are
outfitted with an array of sensors to capture hand motion and return real-time joint angle data.
Gloves normally include up to three resistive bend sensors per digit (spanning each joint),
abduction/adduction sensors between digits and, in some cases, palm bend sensors and wrist
angle sensors. Typically, these systems do not measure 3D position, requiring the addition of a
motion tracking system to the dorsal surface of the hand or forearm.
The advantages of this approach were promising, yet significant challenges remained.
First, the lack of position and orientation sensing implied that motion tracking could not be
abandoned completely. Second, the cost of a typical data glove was prohibitive, especially
considering the potential wear and tear when used in non-human primate research. Neither would
any manufacturer even consider the possibility of customizing the glove for the monkey hand.
Finally, a glove covering the hand would interfere with the basic research goals of investigating
cutaneous feedback during reaching and grasping.
The first solution was to personally customize an inexpensive data glove, combined with
Vitrius motion capture for position and orientation sensing (Figure 9(b)). This approach
combined the benefits of a data glove while minimizing the use of the Vitrius system. Bend
sensors were removed from the original glove and reassembled into the new Monkey Glove,
where they were restrained within an inner pocket on the dorsal aspect of each digit. Electronics
(wires, circuit boards) were encased in epoxy for protection and sewn into a pocket on the hand
dorsum. An array of three Vitrius markers was also mounted on the hand dorsum to track the
orientation of the hand. Initially, the glove fingers were attached to the digits using only narrow
loops of fabric at the intermediate and distal phalangeal joints in order to expose the skin that
would come into contact with the grasp objects. Eventually, however, the entire finger was
removed from the glove and bend sensors were held loosely in place using thin plastic fasteners at
the aforementioned joints.
Numerous refinements to this approach were devised, including the addition of a wireless
transmitter, and a 2-axis accelerometer for measuring pitch and roll (Figure 9(c)). Despite these
improvements, numerous problems plagued this approach. The Vitrius system was still required
for position tracking and the estimation of finger posture from a single bend sensor was
inaccurate and unreliable. A strategy was developed to utilize larger (5 mm²) Vitrius markers
attached to the faces of finger-mounted cubes to improve camera visibility (Figure 9(d)). This
approach used fewer markers and was able to capture only crude measures of hand posture.
Figure 9. Approaches to Hand Posture Measurement. A. Commercially available data gloves
feature numerous integrated bend sensors to capture the posture of the digits and palm but were
prohibitively expensive and difficult to customize to the monkey hand. B. An early prototype of
the custom Monkey Glove. Bend sensors and electronics were removed from a gaming glove and
reconfigured to the monkey hand. C. A wireless version of the Monkey Glove with rechargeable
battery, accelerometer and transmitter encased in protective epoxy. D. An alternative strategy for
passive motion capture. Cube markers with finger attachment clips were developed to utilize
larger markers for improved camera visibility. A smaller set of these markers captured only crude
measures of hand posture.
Approach #3: Active Marker Motion Capture. Ultimately, the solution to the
motion capture dilemma was to implement an active marker motion capture system (Impulse
System, Phasespace Inc., San Leandro, CA, USA). The Impulse system used active LED markers
and eight cameras equipped with linear sensors, each with a digitally-enhanced effective
resolution of 900 megapixels, to capture marker positions at frame rates up to 480 Hz. A robust
calibration routine used a linear wand outfitted with several active markers to define the capture
space by systematically sweeping the wand through the field of view of the cameras. A single
marker was glued directly to the nail of each digit and an array of three markers was placed on
the hand dorsum for tracking the position and orientation of the overall hand. Each finger marker
and the dorsal array were encased in epoxy for protection and a single wire was routed along the
arm to a nearby device that transmitted data wirelessly to a server computer outside the testing
room. Data were acquired in real-time through a network interface using custom software
developed in C++ using an SDK from the system manufacturer. Data were simultaneously saved
to a file for later analysis and routed to the virtual reality simulation, running on a stand-alone
computer, to animate the position and posture of a virtual hand model. The data required no
filtering and no perceptible time lag was observed due to network transmission delays.
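Recovering the hand's position and orientation from the three-marker dorsal array is a standard rigid-body construction: one marker fixes position, and the triangle the three markers form fixes an orthonormal frame. A sketch, with an illustrative marker layout:

```python
import math

# Sketch of building a hand frame from the three dorsal markers: one
# marker gives the origin; the two in-plane directions of the marker
# triangle give an orthonormal, right-handed frame. Layout illustrative.

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def norm(v):
    m = math.sqrt(sum(x * x for x in v))
    return [x / m for x in v]

def hand_frame(p0, p1, p2):
    """Return (origin, x-axis, y-axis, z-axis) of the hand frame."""
    x = norm(sub(p1, p0))              # along the first marker pair
    z = norm(cross(x, sub(p2, p0)))    # normal to the marker plane
    y = cross(z, x)                    # completes right-handed frame
    return p0, x, y, z

origin, x, y, z = hand_frame([0, 0, 0], [1, 0, 0], [0, 1, 0])
assert z == [0.0, 0.0, 1.0]
```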
Virtual Reality Simulation
Simulation Hardware. The virtual reality simulation provided all visual cues to the
monkeys during the behavioral task. It was displayed on a flat screen 3D monitor (SeeReal
Technologies S.A., Luxembourg) mounted horizontally and directly above the seating area.
Subjects were not required to wear anaglyphic glasses. The monitor generated a 3D screen image
by vertically interlacing distinct left and right eye images, then projecting each through a beam
splitter to the appropriate eye. This system did require the subject to maintain position in a sweet
spot to produce the optimal 3D effect, which was easily accomplished since the subject’s head
was restrained throughout the course of an experiment. A mirror was located four inches in
front of the monkey at a 45° angle to reflect the screen image from the monitor. The mirror
extended down to approximately chin level, allowing subjects to use the arms freely in the workspace while at the same time hiding them from view. The simulation was generated using a
dedicated computer to ensure that the computational load did not affect the operation of the
Master Control Program (MCP), which was implemented in LabVIEW® on a separate computer.
The MCP continuously read current motion capture data from the network, computed kinematic
parameters, then transmitted them to the simulation computer (via User Datagram Protocol, UDP)
which used the parameters to animate a virtual hand model.
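The MCP-to-simulation link can be sketched as a UDP datagram carrying the kinematic parameters. This is an illustrative Python version (the actual MCP ran in LabVIEW); the host, port, and JSON packet layout are assumptions:

```python
import json
import socket

# Sketch of the MCP -> simulation link: kinematic parameters are packed
# into a datagram and sent over UDP to the simulation computer. Packet
# layout and addressing are illustrative.

def send_kinematics(sock, addr, position, rotation, aperture):
    packet = json.dumps({"pos": position, "rot": rotation, "ap": aperture})
    sock.sendto(packet.encode(), addr)

# Loopback demonstration: a receiver stands in for the simulation computer.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))                      # any free port
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_kinematics(send_sock, recv.getsockname(),
                [0.1, 0.2, 0.3], [0.0, 0.0, 0.0], 0.5)
data, _ = recv.recvfrom(4096)
assert json.loads(data)["ap"] == 0.5
```

UDP suits this role because each frame supersedes the last: a dropped packet costs one animation frame rather than stalling the simulation the way a TCP retransmission would.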
Virtual Modeling. The software implementation used a software toolkit (Vizard,
WorldViz LLC, Santa Barbara, CA, USA) based on the Python (Python Software Foundation
Corporation, DE, USA) programming language. The virtual hand model was a fully articulated
(all digit joints) human hand included with the software toolkit. Animated degrees of freedom
included 3D position, 3-axis rotation and grasp aperture (all digits). To animate the grasping
motion, the rotation angle of all digit joints was scaled according to the current aperture estimate
to produce a realistic representation of grasping movement.
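The aperture-driven grasp animation reduces to interpolating each joint between an open and a closed angle by a single scalar. A sketch with hypothetical joint names and angle ranges:

```python
# Sketch of the grasp animation: every digit joint's rotation is scaled
# between its open and closed angle by one aperture estimate in [0, 1].
# Joint names and angle ranges below are illustrative.

OPEN_ANGLES = {"mcp": 0.0, "pip": 5.0, "dip": 5.0}       # degrees, hand open
CLOSED_ANGLES = {"mcp": 60.0, "pip": 90.0, "dip": 45.0}  # hand closed

def joint_angles(aperture):
    """aperture = 1.0 fully open, 0.0 fully closed."""
    closure = 1.0 - aperture
    return {j: OPEN_ANGLES[j] + closure * (CLOSED_ANGLES[j] - OPEN_ANGLES[j])
            for j in OPEN_ANGLES}

assert joint_angles(1.0)["pip"] == 5.0     # open hand
assert joint_angles(0.0)["mcp"] == 60.0    # closed hand
assert joint_angles(0.5)["dip"] == 25.0    # halfway closed
```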
Virtual models of the grasp objects were generated from the same CAD models used to
design and fabricate the physical objects with stereolithography. CAD models were simply
converted to the Virtual Reality Modeling Language (VRML) format and imported into the
virtual environment, resulting in exact representations of the original objects. Virtual grasp
objects were located in the simulation environment in correspondence with physical objects
presented in the workspace. That is, the transformation from camera coordinates (millimeters,
origin at task start position) to simulation coordinates (non-dimensional) was tuned so that when
the subject made contact with a physical object in the workspace, the virtual hand intersected the
virtual object in the simulation.
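The camera-to-simulation mapping described above can be sketched as a scale plus offset applied per axis; the scale and offset values here are illustrative stand-ins for the tuned calibration:

```python
# Sketch of the camera -> simulation coordinate tuning: a uniform scale
# and a per-axis offset map millimetre camera coordinates (origin at the
# task start position) onto non-dimensional simulation coordinates so
# that physical and virtual contact coincide. Values are illustrative.

def to_sim(p_mm, scale=0.001, offset=(0.0, 0.5, -0.3)):
    return tuple(scale * c + o for c, o in zip(p_mm, offset))

# The camera origin (task start position) lands on the virtual hold pad:
assert to_sim((0.0, 0.0, 0.0)) == (0.0, 0.5, -0.3)
# A point 200 mm ahead maps 0.2 simulation units ahead:
assert abs(to_sim((0.0, 0.0, 200.0))[2] - (-0.1)) < 1e-12
```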
Individual trials of the behavioral task began when a subject placed its right hand on a 4-
in square hold pad located at mid-abdominal height. The virtual model of the hold pad was a
simple flattened cube displayed in a position corresponding to the physical hold pad. Contact with
the hold pad was monitored by a single touch sensor, identical to those used to sense finger
placement on the physical objects.
Virtual Task Training. Training in the virtual task required subjects to carry out the
physical task, learned previously in full sight and with the aid of the F/T sensor and surface-
mounted touch sensors, using cues only from the virtual environment. In actuality, this was a
combined physical/virtual task with two variants. In the physical variant, an object was presented
in the workspace, while in the virtual variant no object was presented. In both task variants, the
appearance of grasp objects was used as a training aid. At the start of a trial, only the virtual hand
and hold pad were displayed, the latter in red. Hand contact with the physical hold pad caused the
virtual hold pad to turn green. After the required hold period, the virtual hold pad was removed,
followed by presentation of a physical object in the workspace. The corresponding virtual object
was then initially displayed in white but turned green whenever the subject physically interacted
with the object in the workspace. Interaction was again determined by either the F/T sensor or
touch sensors. These visual cues facilitated the training process for the virtual task and remained
in place throughout the course of experimentation. In the virtual task variant, collision of the
virtual hand and object models (detected by the simulation software) triggered the change in
color.
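The visual-cue sequence of a training trial can be summarized as a small state machine. The stage names and transition conditions below paraphrase the text; the exact internal logic of the MCP is not given, so this is a sketch rather than the actual implementation.

```python
from enum import Enum, auto

class Stage(Enum):
    HOLD_WAIT = auto()     # only hand and red hold pad displayed
    HOLDING = auto()       # hold pad turns green while hand is on pad
    OBJECT_SHOWN = auto()  # hold pad removed, virtual object shown in white
    INTERACTING = auto()   # object turns green on physical contact or
                           # virtual hand/object collision

def next_stage(stage, hand_on_pad, hold_done, contact):
    """Advance the trial's visual-cue sequence one step.

    `contact` stands for either sensor-detected interaction (physical
    variant) or a simulation-detected collision (virtual variant).
    """
    if stage is Stage.HOLD_WAIT and hand_on_pad:
        return Stage.HOLDING
    if stage is Stage.HOLDING and hold_done:
        return Stage.OBJECT_SHOWN
    if stage is Stage.OBJECT_SHOWN and contact:
        return Stage.INTERACTING
    return stage
```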
Neural Recording System
Neurophysiological experimentation was accomplished using a 16-channel acquisition
system for amplification, filtering and recording neural activity (MAP System, Plexon Inc.,
Dallas, TX, USA). The MAP box itself used digital signal processing for 40 kHz (25 µs)
analog-to-digital conversion on each channel with 12-bit resolution. Included control software
provided a suite of programs for real-time spike sorting, visualization and analysis of neural
activity, all of which were run on a dedicated computer independent of the MCP. Digital Events
were generated by the MCP to mark the occurrence of significant events during an experiment.
Each event type was encoded as a unique 8-bit digital word using digital outputs from a 68-pin
terminal block (SCC-68, National Instruments Corporation) and input directly to the MAP box,
which saved spike times and digital event data (word value, time) to a single file.
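The 8-bit event words can be illustrated as follows. The specific event names and code values are hypothetical (the text does not list the actual word assignments); only the idea of one unique 8-bit word per event type, driven onto eight digital output lines, comes from the source.

```python
# Hypothetical event codes; the actual word assignments are not
# given in the text.
EVENT_CODES = {
    "hold_start": 0x01,
    "object_presented": 0x02,
    "object_contact": 0x03,
    "trial_end": 0x04,
}

def encode_event(name):
    """Return the 8-bit word for an event type as individual bit
    values, one per digital output line (least-significant bit first).
    """
    code = EVENT_CODES[name]
    assert 0 <= code <= 0xFF, "event word must fit in 8 bits"
    return [(code >> bit) & 1 for bit in range(8)]
```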
System Integration
System Architecture. The preceding sections of this chapter described the sub-systems
of the overall laboratory setup, including a 6-axis industrial robot, 6-DOF F/T sensor, tool
changer, 3D motion capture system, virtual reality simulation and a neural data acquisition
system. The LabVIEW® graphical programming environment was used to integrate these sub-
systems into a coordinated whole. Initially, all subsystems were to be controlled by software
running in a real-time operating environment (LabVIEW® Real-Time Module, National
Instruments Corporation) uploaded to a dedicated target processor (PXI-6259, National
Instruments Corporation) for deterministic performance. However, several subsystems required
the Windows® operating system (Microsoft Corporation, Redmond, WA, USA) for programmatic
control, which precluded a purely real-time application (the real-time environment itself ran on
the VxWorks operating system; Wind River Corporation, Alameda, CA, USA). Instead, a hybrid system was
developed that coordinated the overall programmatic control of a single LabVIEW® application
between the real-time processor (the target) and a standard personal computer (the host). System
timing was governed by the target processor (1 µs resolution), and communication between the target
and host was mediated through the local network (network shared variables). Program
development took place on the host computer. At run time, the MCP code was uploaded to the
target computer where it was compiled and executed.
System Operation. The MCP, which included compatible inputs (touch sensors,
incoming UDP messages) and outputs (digital events, outgoing UDP messages), was executed on
the real-time target. The robot control program ran on the host, awaiting movement commands
from the MCP according to the progression of behavioral task stages. A separate program for
monitoring F/T sensor output also ran on the host, providing continuous feedback related to
object contact and monitoring the robot arm for excessive loading conditions. The MCP
generated digital events in response to task events, which were routed directly to the MAP box of
the neural recording system. The MCP also featured a continuous loop that read data from the
motion capture server (TCP/IP protocol) and saved it to a local data file. Digital event markers
were also written to the camera data file so that kinematic data and neural data, which were saved
to different files, could later be temporally aligned by matching corresponding event markers.
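The marker-based alignment can be sketched as below. The function assumes a constant clock offset between the two files (no drift) and that corresponding markers have already been matched by event word; both are simplifying assumptions not spelled out in the text.

```python
def align_offset(neural_marker_times, camera_marker_times):
    """Estimate the clock offset between the neural data file and the
    camera data file from corresponding digital event markers.

    Each list holds timestamps of the same events as recorded in one
    file; the mean pairwise difference gives the offset to subtract
    from camera times to place both streams on a common clock.
    """
    diffs = [n - c for n, c in zip(neural_marker_times,
                                   camera_marker_times)]
    return sum(diffs) / len(diffs)
```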
Kinematic parameters derived from the camera data were sent (via UDP) to the virtual reality
simulation computer to animate the virtual hand model. UDP messages were also sent from the
simulation to the MCP to report virtual object collisions.
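A minimal sketch of the kinematic UDP messages might look like the following. The address and the packet layout (seven little-endian doubles: 3D position, 3-axis rotation, aperture) are illustrative assumptions; the text specifies only that kinematic parameters travel over UDP to the simulation computer.

```python
import socket
import struct

# Illustrative address; the actual host/port are not given in the text.
SIM_ADDR = ("192.168.0.10", 5005)

def pack_hand_state(position, rotation, aperture):
    """Pack 3D position, 3-axis rotation and grasp aperture
    (7 little-endian doubles) into one UDP payload."""
    return struct.pack("<7d", *position, *rotation, aperture)

def send_hand_state(sock, position, rotation, aperture):
    """Send one hand-state datagram to the simulation computer."""
    sock.sendto(pack_hand_state(position, rotation, aperture), SIM_ADDR)
```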