Kerrie Noble 200948192 DM309 – Mechatronics Design and Applications 19/11/2011
Table of Contents
Introduction
A Brief Outline of Vision Systems
New Technology Available In Vision Systems
How Omnidirectional Vision Works
Existing Applications of Omnidirectional Vision Systems
Automated Manufacture with Omnidirectional Vision Systems
Conclusion
References
Appendix 1
Appendix 2
Appendix 3
Introduction
In many circumstances the reality of robotic capability still lags behind the science fiction portrayal. [1] C-3PO,
the famous fictional character from the Star Wars saga, is a protocol droid who boasts that he is fluent ‘in over
six million forms of communication’. [2] In my opinion this epitomises the visionary expectations for the robotic and automated intelligence systems of the coming century, as predicted by scientists in the 1970s. The reality is quite different, although many improvements have been made. Any automated system has three main aspects: the strength relating to the physical payload the robot can move, the physical structure of the robot in relation to its payload, and robotic intelligence. The area in which the most significant technological advances have been made within the last few years is robotic intelligence. The amount of manual interaction needed by an automated system has been reduced, and the ability of the system to think and carry out independent tasks has greatly improved; this is partly due to the use of integrated vision systems and the development of the omnidirectional vision system. [1]
A Brief Outline of Vision Systems
Machine vision systems can be developed and refined to meet the user’s specific application requirements, so selecting the correct system for the required operation can be challenging. The wrong initial choice can result in inadequate inspections, decreased productivity and increased rejections, as well as incurring a large financial cost to the company. Take, for example, a manufacturer of precision-engineered engine parts for the aerospace industry. Each product must be inspected to ensure the dimensions, shape and formation of the part are correct. This requires image collection from the camera, followed by vision code to extract the edges of the component. The programme then requires additional, efficient code to determine the exact spacing and form of the component. Finally, the programme must run an analysis phase to decide whether the part is to be rejected or moved further along the production process. This involves hundreds of engineering hours and, because of the rapid turnover of the electronic systems used in PCs and cameras, duplicate components that may be required to keep a system running can become unavailable. Consequently the vision system has to be modified, retested and any advanced code debugged. [3] It is therefore no coincidence that the uptake of machine vision systems was insignificant until recent technological developments allowed the system as a whole to be upgraded. [4]
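To make this inspection sequence more concrete, the following is a minimal sketch of such a pipeline written in Python with the OpenCV library; the file name, calibration factor, nominal dimensions and tolerance are illustrative assumptions rather than values from any real system.

import cv2

# Nominal part dimensions in millimetres and the allowed tolerance (assumed values).
NOMINAL_WIDTH_MM = 50.0
NOMINAL_HEIGHT_MM = 20.0
TOLERANCE_MM = 0.2
MM_PER_PIXEL = 0.05  # assumed calibration factor for this camera set-up

def inspect_part(image_path):
    """Return True if the part passes the dimensional check, False otherwise."""
    # 1. Image collection: load the frame captured by the camera.
    image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if image is None:
        raise IOError("Could not read image: " + image_path)

    # 2. Edge extraction: find the outline of the component.
    blurred = cv2.GaussianBlur(image, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)

    # 3. Measurement: take the largest contour as the part boundary and measure it.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False  # nothing found; reject
    part = max(contours, key=cv2.contourArea)
    _, _, w_px, h_px = cv2.boundingRect(part)
    width_mm = w_px * MM_PER_PIXEL
    height_mm = h_px * MM_PER_PIXEL

    # 4. Analysis: accept only if both dimensions are within tolerance.
    return (abs(width_mm - NOMINAL_WIDTH_MM) <= TOLERANCE_MM and
            abs(height_mm - NOMINAL_HEIGHT_MM) <= TOLERANCE_MM)

if __name__ == "__main__":
    print("Part accepted" if inspect_part("part_0001.png") else "Part rejected")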
New Technology Available In Vision Systems
According to a recently published report by Reportlinker.com, recent advancements in machine vision technology, such as smart cameras and vision-guided robotics, have increased the scope of the machine vision market for wider application in both industrial and non-industrial sectors. One of the most rapidly advancing technologies is that behind 3D vision systems, often termed omnidirectional vision systems within the industry. These systems are used as a tool for solving complex and challenging vision tasks within the automated manufacturing environments of many production lines across several manufacturing sectors. Enabling 3D vision within a machine will enhance flexibility and robotic intelligence and consequently address some of the issues which are apparent within current machine vision systems. [5]
The most effective way of realising 3D vision within a manufacturing robotics system is to use two cameras placed side by side to produce stereo vision, which provides an almost instantaneous estimate of the distance to an object placed within a scene. Distance detection is a primary cue for detecting things which stand out in depth from the background. Stereo vision is also highly effective for segmenting objects and gauging their shape and size. [6] This is only possible because of the technical workings of the system.
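As a rough illustration of how a stereo pair yields distance, the sketch below uses the standard rectified-stereo relation depth = focal length × baseline / disparity; the focal length, baseline, file names and the choice of OpenCV's block-matching routine are assumptions made for the example and would need proper calibration on a real rig.

import cv2
import numpy as np

FOCAL_LENGTH_PX = 700.0   # assumed focal length in pixels
BASELINE_M = 0.12         # assumed distance between the two cameras in metres

def depth_map(left_gray, right_gray):
    """Estimate a per-pixel depth map (metres) from a rectified stereo pair."""
    # Block matching produces a disparity map in 1/16-pixel units.
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan  # mark invalid matches
    # Triangulation for a rectified pair: Z = f * B / d
    return FOCAL_LENGTH_PX * BASELINE_M / disparity

if __name__ == "__main__":
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
    depths = depth_map(left, right)
    print("Median scene depth: %.2f m" % np.nanmedian(depths))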
How Omnidirectional Vision Works
An omnidirectional vision system is composed of a CCD camera and a mirror which faces the camera. This produces a viewing range of 360° in the horizontal direction and 120° in the vertical direction.
There is therefore an inherent blind spot within the viewing range of this type of camera, and so research and prototypes have been aimed at developing a fully functioning omnidirectional vision system without this blind spot.
The diagram below (figure 1.1) illustrates the structure of an omnidirectional vision system which still has a blind spot but obtains real-time 360° x 240° omnidirectional images. The lens of the camera is fixed at a single viewpoint behind a hyperbolic mirror. As can be seen on the diagram, there are two reflection mirrors. At the centre of the primary reflection mirror there is a small hole through which the camera shoots video information. Beyond the primary reflection mirror there is a secondary reflection mirror. At the centre of the secondary reflection mirror there is another small hole with an embedded wide-angle lens. Therefore, as the camera is shooting, the information being gathered undergoes two reflections. The first reflection occurs at the wide-angle lens in the secondary reflection mirror; a second reflection then occurs at the small hole within the primary reflection mirror before the image is captured by the camera. These reflections serve to move the position of the imaging point, the point at which the image being captured is formed on the lens of the camera. This structure has two imaging points: the first is between the wide-angle lens and the lens of the camera, and the second is at the focus of the lens on the camera. This use of the reflection of light and the hyperbolic mirrors therefore eliminates the occurrence of a dead angle, or blind spot, before the primary reflection mirror, i.e. it ensures the image the camera is trying to capture is channelled towards the camera lens. This makes the design of the mirror component critical.
The design of the catadioptric mirror for this omnidirectional vision system adopts an average angular resolution. The relationship between a point on the imaging plane and the incidence angle occurring on the mirror is linear; this can be seen in appendix 1. As shown in appendix 1, the incidence beam, V1, of a light source, P, hits the primary reflection mirror and is reflected. This reflected beam of light, V2, strikes the secondary reflection mirror and is again reflected. The reflected beam of light, V3, then enters the camera lens with an incidence angle of ɸ1 and images on the camera unit. Through the use of the equations outlined in appendix 2, the curvature of the primary and secondary reflection mirrors can be accurately calculated, giving the result shown in figure 1.2.
Once the curvature of the mirrors has been established, the next most important factor to consider is the lens combination of the imaging unit itself. The video information before the secondary reflection mirror is invisible within the current design of the vision system. In order to obtain information from before the secondary reflection mirror a wide-angle lens must be used. The wide-angle lens and the camera lens together compose a combination lens device, as shown in figure 1.1. The wide-angle lens is situated in front of the primary reflection mirror and embedded in the secondary reflection mirror. The central axis of the camera lens, the central axis of the wide-angle lens, the primary reflection mirror and the secondary reflection mirror are positioned along the same axis line, as displayed in figure 1.3.
The projected picture passing through the hole in the primary reflection mirror images between the wide-angle lens and the camera lens; this is known as the first imaging point. The projected picture coming through the camera lens images in the camera component. With all of this information the lens diameter and focus of the camera can then be determined and the relationships between these entities derived; this is summarised in appendix 3.

FIG.1.1. The diagrammatical layout of an omnidirectional vision system. [PIC1]
FIG.1.2. The designed curvature of the primary and secondary reflection mirrors. [PIC2]
FIG.1.3. A diagram showing the positioning of the camera, wide-angle lens and mirrors within the system. [PIC3]
This camera design enables a 360° x 240° viewpoint; however, placing two of these omnidirectional vision systems together in a back-to-back configuration can produce the desired 360° x 360° viewpoint, with all images being captured in real time. The corresponding camera design and its associated image disposal flow are shown in figure 1.4.
Each camera lens and wide-angle lens of the omnidirectional vision system captures images, and the centre of each image is detected automatically. Before the unwrapping of the omnidirectional image occurs, the central part of the image needs to be separated. This is done to obtain the most faithful image possible; it also explains the use of a connector to join the two omnidirectional vision systems together. Both of the systems have the same average angular resolution so that no dead angle is incurred. The video cable and power cable exit through a hole within the connector. Each video cable from the separate omnidirectional vision systems connects to a video image access unit. As each camera within an omnidirectional vision system has a view range of 360° x 240° and the same average angular resolution in the vertical direction, the image information from the two omnidirectional vision systems can be fused easily. The video access unit reads the image information from both vision systems separately, continuously storing the images and then splitting the circular video images captured by the combination lens. The separate images obtained by the two omnidirectional vision systems are unwrapped through the use of an unwrapping algorithm. After unwrapping, the two separate images are stitched together; the result is shown in figure 1.5. [7] This type of imaging has been used in many applications for a few years now.
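The unwrapping step is essentially a polar-to-rectangular remapping of the circular image. The sketch below shows one simple way this might be done in Python with OpenCV, assuming the image centre and the inner and outer radii of the mirror image are already known; it is not the specific algorithm used in the referenced design.

import cv2
import numpy as np

def unwrap(omni_image, cx, cy, r_inner, r_outer, out_width=1440, out_height=360):
    """Unwrap a circular omnidirectional image into a rectangular panorama."""
    # Each output column corresponds to an azimuth angle, each row to a radius.
    angles = np.linspace(0, 2 * np.pi, out_width, endpoint=False)
    radii = np.linspace(r_inner, r_outer, out_height)
    theta, r = np.meshgrid(angles, radii)
    # Map every panorama pixel back to its source position in the circular image.
    map_x = (cx + r * np.cos(theta)).astype(np.float32)
    map_y = (cy + r * np.sin(theta)).astype(np.float32)
    return cv2.remap(omni_image, map_x, map_y, interpolation=cv2.INTER_LINEAR)

if __name__ == "__main__":
    img = cv2.imread("omni_top.png")  # hypothetical frame from one of the two units
    panorama = unwrap(img, cx=640, cy=480, r_inner=60, r_outer=450)
    cv2.imwrite("panorama_top.png", panorama)
    # The panorama from the second, back-to-back unit could then be stitched below this one.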
Existing Applications of Omnidirectional Vision Systems
One of the most well-known robots to use a 3D vision system is Honda’s humanoid robot, Asimo, which was first unveiled at the turn of the millennium. Asimo is now capable of identifying people, their postures and gestures, and therefore moves independently in response. The camera mounted in Asimo’s head is capable of detecting the movements of multiple objects while assessing distance and direction, much like the omnidirectional vision system outlined above. However, the intelligence which is most applicable to a manufacturing setting is that of environment recognition. Asimo is capable of assessing the immediate environment, recognising the position of obstacles, including people and immoveable objects, and avoiding them to prevent a collision. [8] From a safety point of view this would be a desirable quality for the manufacturing industry. Asimo, however, is not the only robot using this technology.
FIG.1.4. The layout of a ‘back-to-back’ omnidirectional vision system and its image disposal flow. [PIC4]
FIG.1.5. An unwrapped image created using this vision system. [PIC5]
3D vision systems are an integral part of the autonomous intelligence within an unmanned aerial vehicle (UAV). The GD170 is a sophisticated two-axis, high-resolution UAV vision system. This lightweight, stand-alone, gyro-stabilised daylight observation system is protected against wind loads, humidity and dust by an optically perfect Lexan dome. The vision system assists with applications such as damage assessment, search and rescue, traffic surveillance, coastal and border control, anti-terrorism operations, surveillance, anti-smuggling surveillance and infrastructure inspection. [9] Some of these operations are very similar to those that occur within an industrial manufacturing environment, so it is easy to see how these types of vision system could be utilised within this sector.

A novel and interesting application of a type of 3D vision system is a robot which assembles Lego. A video detailing the robot and how it works can be viewed at the following link: http://www.youtube.com/watch?v=n6tQiJq9pQA. This robot has been designed to complete repetitive and monotonous tasks, similar to those many people find themselves undertaking in the manufacturing industry. [10] The intelligence used here is therefore easily transferrable to a manufacturing environment.
Automated Manufacture with Omnidirectional Vision Systems
According to the video link, the technology used within the robot at the International Robot Exhibition 2009 was a world first. Within manufacturing and engineering there is a need to raise automation standards, and this would be derived from the use of systems such as the one found in the Lego-building robot. This intelligent technology is now making robots and machinery with such systems embedded within them a superior alternative to human labour.

The ability of an omnidirectional vision system to deliver high accuracy while maintaining throughput on the production line enables the sought-after process and quality control needed to produce lean and flexible manufacturing systems. The depth of vision, together with the high-quality image produced, enables the 3D omnidirectional vision system to serve as an efficient quality control tool. [11]
Relating back to current applications of omnidirectional vision systems, we can also quickly identify other areas within manufacturing where this type of technology could be beneficial. The robot from the International Robot Exhibition shows that people could easily be replaced with robots; while this is probably undesirable for sociological reasons, monotonous, repetitive and dangerous jobs could be done by an automated system with omnidirectional vision, cutting cost and improving safety. As Asimo and UAVs have shown, people recognition is a key aspect of this type of robotic intelligence, and it could be useful within manufacturing in two ways: 1) the identification of people would dramatically improve safety, as many conventional automated systems rely on electronic safety precautions which still leave a certain amount of scope for an industrial accident to occur; 2) the ability to sense depth and distance while avoiding obstacles could be utilised within automated guided vehicles, helping to make that technology more efficient.
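Purely as an illustration of the safety idea, the sketch below combines a stock pedestrian detector with a depth threshold to flag when a person comes within an assumed 2 m of an automated cell; the detector, the source of the depth map and the robot.stop() call are hypothetical stand-ins, not a certified safety mechanism.

import cv2
import numpy as np

SAFETY_DISTANCE_M = 2.0  # assumed minimum safe separation

# Stock HOG-based pedestrian detector shipped with OpenCV.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def people_too_close(frame_bgr, depth_m):
    """Return True if any detected person is nearer than the safety distance.

    depth_m is assumed to be a per-pixel depth map in metres, e.g. produced by
    the stereo or omnidirectional system described above.
    """
    boxes, _ = hog.detectMultiScale(frame_bgr, winStride=(8, 8))
    for (x, y, w, h) in boxes:
        region = depth_m[y:y + h, x:x + w]
        if region.size and np.nanmin(region) < SAFETY_DISTANCE_M:
            return True
    return False

# Example usage inside a control loop (hypothetical inputs and robot interface):
# if people_too_close(camera_frame, depth_map):
#     robot.stop()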
Conclusion
It is evident that the intelligence within machine vision systems is developing at a fast pace. An extensive amount of research into the use of 3D omnidirectional vision systems is currently taking place across the globe, and these systems are set to replace older machine vision sensing techniques such as the solid-state camera. The incorporation of newly developed 3D vision technology will enhance flexibility within the manufacturing process and robotic technologies, whilst also improving upon some of the current issues with machine vision, including complexity, cost and the length of time needed to obtain information.
The design outlined above represents the latest research depicting the most efficient and effective way of producing a system with 360° x 360° 3D vision. It has been tried and tested but, as yet, has not been embedded and used for any significant applications within manufacturing. However, vision systems similar to the design depicted above have been used in applications such as unmanned aerial vehicles and humanoid robotic systems. This technology could readily be placed within a manufacturing setting for applications such as quality control, safety and the improvement of efficiency within automated guided vehicles. It is easy to see that this intelligence and technology could be of great use within the industry; however, the beneficial results and improvements must be considered alongside the sociological consequences of replacing human labour with that of an intelligent machine.
References
[1] http://www.jimpinto.com/writings/robotics.html – accessed 22/10/2011
[2] http://en.wikipedia.org/wiki/C-3PO – accessed 22/10/2011
[3] http://www.controleng.com/search/search-single-display/inside-machines-embedded-machine-vision-systems-an-alternative-to-pc-vision-systems/922912c575.html – accessed 1/11/2011
[4] http://www.frost.com/prod/servlet/report-brochure.pag?id=D366-01-00-00-00 – accessed 1/11/2011
[5] http://www.digikey.com/us/en/techzone/sensors/resources/articles/five-senses-of-sensors-vision.html – accessed 1/11/2011
[6] http://www.tyzx.com/news/pdf/IEEE%20Computer%20article%20for%20post.pdf – accessed 8/11/2011
[7] http://www.intechopen.com/articles/show/title/design-of-stereo-omni-directional-vision-sensors-with-full-sphere-view-and-without-dead-angle – accessed 17/11/2011
[8] http://world.honda.com/ASIMO/technology/intelligence.html – accessed 17/11/2011
[9] http://www.uavvision.com/gimbals/gd170.html – accessed 17/11/2011
[10] http://www.youtube.com/watch?v=n6tQiJq9pQA – accessed 17/11/2011
[PIC1] – http://www.intechopen.com/articles/show/title/design-of-stereo-omni-directional-vision-sensors-with-full-sphere-view-and-without-dead-angle – accessed 17/11/2011
[PIC2] – http://www.intechopen.com/articles/show/title/design-of-stereo-omni-directional-vision-sensors-with-full-sphere-view-and-without-dead-angle – accessed 17/11/2011
[PIC3] – http://www.intechopen.com/articles/show/title/design-of-stereo-omni-directional-vision-sensors-with-full-sphere-view-and-without-dead-angle – accessed 17/11/2011
[PIC4] – http://www.intechopen.com/articles/show/title/design-of-stereo-omni-directional-vision-sensors-with-full-sphere-view-and-without-dead-angle – accessed 17/11/2011
[PIC5] – http://www.intechopen.com/articles/show/title/design-of-stereo-omni-directional-vision-sensors-with-full-sphere-view-and-without-dead-angle – accessed 17/11/2011
Appendix 1 – The Relationship Between the Point on the Imaging Plane and the Angle of Incidence Occurring on the Mirror

A relationship is built between the distance, P, from the pixel to the spindle, Z, and the incidence angle φ:

\[ \phi = a_0 \cdot P + b_0 \]

where \( a_0 \) and \( b_0 \) are arbitrary parameters.

If f is the focal length of the camera unit, P is the distance from the pixel to the spindle Z, and P₂(t₂, F₂) is the reflection point on the secondary reflection mirror, then according to the imaging principle:

\[ P = f \cdot \frac{t_2}{F_2} \]

Source: http://www.intechopen.com/articles/show/title/design-of-stereo-omni-directional-vision-sensors-with-full-sphere-view-and-without-dead-angle – pages 32–33, accessed 20/11/2011
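As a quick numerical illustration of this linear mapping, the short sketch below picks arbitrary values for a0 and b0 and converts a few pixel radii into incidence angles; the numbers are invented purely to show how the relationship would be applied.

import math

# Illustrative (made-up) calibration parameters for phi = a0 * P + b0.
a0 = math.radians(0.25)   # 0.25 degrees of incidence angle per pixel of radius
b0 = math.radians(5.0)    # angle offset at the image centre

def incidence_angle(pixel_radius):
    """Incidence angle (radians) for a pixel at a given radius from the spindle Z."""
    return a0 * pixel_radius + b0

for p in (0, 100, 200, 300):
    print("P = %3d px  ->  phi = %5.1f deg" % (p, math.degrees(incidence_angle(p))))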
Appendix 2 – The Curvature of the Primary and Secondary Reflection Mirrors

Substituting

\[ P = f \cdot \frac{t_2}{F_2} \]

into

\[ \phi = a_0 \cdot P + b_0 \]

gives

\[ \phi = a_0 \cdot \left( f \cdot \frac{t_2}{F_2} \right) + b_0 \]

According to the catadioptric principle,

\[ \tan^{-1}\!\left( \frac{t_1}{F_1 - s} \right) = a_0 \cdot \left( f \cdot \frac{t_2}{F_2} \right) + b_0 \]

Using this equation along with

\[ F_1^2 - 2\alpha F_1 - 1 = 0 \]

and

\[ F_2^2 - 2\beta F_2 - 1 = 0 \]

a numerical solution for F₁ and F₂ can be found, giving the appropriate curvature values for both reflection mirrors.

Source: http://www.intechopen.com/articles/show/title/design-of-stereo-omni-directional-vision-sensors-with-full-sphere-view-and-without-dead-angle – pages 32–33, accessed 20/11/2011
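The final step can be illustrated numerically: for assumed mirror shape parameters α and β, the sketch below takes the positive root of each quadratic to obtain F1 and F2, which is the kind of numerical solution the appendix refers to; the α and β values are illustrative only.

import math

def solve_mirror(coefficient):
    """Positive root of F**2 - 2*c*F - 1 = 0 via the quadratic formula."""
    return coefficient + math.sqrt(coefficient ** 2 + 1)

alpha, beta = 1.2, 0.8   # illustrative mirror shape parameters
F1 = solve_mirror(alpha)
F2 = solve_mirror(beta)
print("F1 = %.4f, F2 = %.4f" % (F1, F2))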
Appendix 3 – The Relationship Between Lens Diameter and Focus
According to the lens imaging equations,

\[ \frac{1}{f_1} = \frac{1}{s_1} + \frac{1}{s_2}, \qquad \frac{1}{f_2} = \frac{1}{s_3} + \frac{1}{s_4}, \qquad d = s_2 + s_3 \]

Taking the combination lens focus into account gives

\[ \frac{1}{f_3} = \frac{f_1 + f_2 - d}{f_1 f_2} \]

The lens diameter, D, has a magnification of

\[ n = \frac{D}{f_3} \]

In order for both entities to have the same average angle, the design must satisfy

\[ n = \frac{D}{f_3} = 2\phi_{1\,MAX} \]

where \( \phi_{1\,MAX} \) is the maximum angle between the secondary reflected light, V2, and the Z axis.

Source: http://www.intechopen.com/articles/show/title/design-of-stereo-omni-directional-vision-sensors-with-full-sphere-view-and-without-dead-angle – page 34, accessed 20/11/2011
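To show how these relationships fit together, the sketch below computes the combined focal length f3 and the lens diameter D implied by the average-angle condition for illustrative values of f1, f2, d and φ1MAX; none of these numbers are taken from the referenced design.

import math

# Illustrative values only (not from the referenced design).
f1 = 8.0      # wide-angle lens focal length, mm
f2 = 12.0     # camera lens focal length, mm
d = 4.0       # separation between the two lenses, mm
phi1_max = math.radians(30.0)  # maximum angle between V2 and the Z axis

# Combined focal length from 1/f3 = (f1 + f2 - d) / (f1 * f2).
f3 = (f1 * f2) / (f1 + f2 - d)

# Matching the average angular resolution requires n = D / f3 = 2 * phi1_max,
# so the required lens diameter follows directly.
n = 2 * phi1_max
D = n * f3
print("f3 = %.2f mm, required lens diameter D = %.2f mm" % (f3, D))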