International Journal of Informatics and Communication Technology (IJ-ICT)
Vol.9, No.3, December 2020, pp. 205~211
ISSN: 2252-8776, DOI: 10.11591/ijict.v9i3.pp205-211
Journal homepage: http://ijict.iaescore.com
Augmented reality and virtual reality in our daily life
Vallidevi Krishnamurthy1, Nagarajan K. K.2, D. Venkata Vara Prasad3, Ganesh Kumar4, Arjith Natarajan5,
Shailesh Saravanan6, Akshaya Natarajan7, Shankaran Murugan8, Dattuluri Rushitaa9
1-3, 5-9 SSN College of Engineering, India
4 SRM University, Chennai, India
Article Info

Article history:
Received Feb 3, 2020
Revised Mar 3, 2020
Accepted Mar 27, 2020

ABSTRACT
Augmented reality, a new-age technology, has widespread applications in
almost every field imaginable and has proven to be an inflection point in
numerous verticals, improving both lives and performance. In this paper, we
explore possible applications of Augmented Reality (AR) in the field of
medicine. The motivation for using AR in medicine, or in any field, is that
it helps motivate the user, makes sessions interactive, and assists in
faster learning. We discuss the applicability of AR to medical diagnosis:
AR technology reinforces remote collaboration, allowing doctors to diagnose
patients from a different location. Additionally, we believe that a much
more pronounced effect can be achieved by bringing together the
cutting-edge technology of AR and the lifesaving field of medical sciences.
AR is a mechanism that could be applied to the learning process as well.
Similarly, virtual reality could be used in fields where practical
experience is essential, such as driving, sports, and neonatal care training.
Keywords:
Augmented reality
Driving
Education
Medical diagnosis
Neonatal care training
Virtual reality
This is an open access article under the CC BY-SA license.
Corresponding Author:
Vallidevi Krishnamurthy,
SSN College of Engineering,
Chennai, Tamil Nadu 603110, India.
Email: vallidevik@ssn.edu.in
1. INTRODUCTION
The fundamental value added by Augmented Reality (AR) is that it brings components of the digital
world into a person's perception of the real world. Using AR, we can augment virtually any object into the real
environment. Many existing applications already use AR, and many more are being developed in their respective
fields. One of the earliest AR applications, built in the early 1990s, applied heads-up display technology to
manual manufacturing processes [1], and since then AR has been adopted across a variety of fields. The
significance of AR is most apparent in training tasks that are difficult or risky to carry out in real life,
such as pilot training, medical diagnosis, medical experiments, army training, and the automobile industry.
Augmented reality, used in conjunction with technologies such as the IoT and virtual reality, has evolved to a
point where building intelligent, interactive applications has become far more tractable. This paper explains
a model that reduces the time taken to perform preliminary examinations in medical diagnosis.
The preliminary examination process is eliminated by digitally transporting the patient's details
to the doctor, so that the doctor can evaluate them whenever available and then suggest the required tests and
diagnosis for the respective patient. AR could also be used in medical education to increase student
interaction in class and allow more practical discussion, which triggers the students' interest. Virtual
reality takes the person to the place needed for learning and lets them learn new things within that
environment. In Virtual Reality, a whole new environment is created on demand. In Augmented Reality, on the
other hand, images are
superimposed onto the real world rather than creating a completely new environment. Augmented Reality
supplements reality instead of completely replacing it. The virtual objects display information that the user
cannot directly detect with his or her own senses. AR not only accelerates learning but also improves the
brain's productivity; it changes the way one thinks, works, and learns. Instead of having just one angle on a
concept, we get multiple angles of the same concept.
Sannikov et al. [9] discuss an interactive educational environment built on AR and 3D visualization.
Bouras [8] reviews e-learning through distributed virtual environments, and Agarwal and Thakur [4] survey the
recent practices of AR in daily life. References [5-7] examine e-learning and the e-learning-based improvement
of the user's skill set, while White et al. [2] give an overview of the applications of AR.
In addition, [14] describes the KinectFusion technology and the implementation of the Kinect depth camera,
and [3] reviews the advantages and limitations of AR in educational environments. Genc et al. [10] present a
learning-based approach to marker-less tracking for AR. Lee [11] uses a structural equation modelling
approach to explain how desktop VR enhances learning outcomes, and [12] provides a quantitative analysis of
the educational value of social learning networks.
2. PROPOSED ALGORITHM
2.1. Augmented reality and applications
To determine where virtual objects are to be blended into the real environment, the following
approaches are considered; a minimal detection sketch is given after the list.
− Marker-based approach: In a marker-based approach [13], the images and the corresponding image descriptors
to be recognized are provided beforehand. The markers can be QR codes or black-and-white square markers.
− Marker-less approach: In a marker-less approach, the AR application recognizes things that were not directly
provided to it beforehand. This scenario is much more difficult to implement, because the recognition
algorithm running in the AR application has to identify patterns, colours, or other features that may appear
in the camera frames.
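As an illustration of the marker-based approach, the sketch below uses OpenCV's ArUco module (assuming OpenCV 4.7 or newer) to detect square fiducial markers in a webcam feed; the camera index and marker dictionary are illustrative choices, not values taken from this paper.

```python
import cv2

# Hypothetical marker-based detection sketch (assumes OpenCV >= 4.7 with the ArUco module).
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)                      # default webcam as the AR feed
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is not None:
        # Each marker's four corners give the anchor points where a virtual
        # object would be registered and rendered in a full AR pipeline.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("marker-based detection (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

A marker-less pipeline would replace the fixed-dictionary detector with feature or object recognition over the same camera frames.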
Technology in the field of medicine has advanced to an extent that makes diagnosis much simpler and
more efficient. Doctors can use augmented reality as a visualization technique and as a training aid for
surgery [14]. The patient's input data, whether images or metadata such as heart rate, is transported to the
doctor along with the patient's 3D hologram. Using AR in this way gives the doctor a complete, X-ray-like
view of the patient. The goal is to accelerate the diagnosis process.
2.2. Virtual reality and its applications
Two virtual-reality-based e-learning systems are discussed in the following sections. The first is the
e-learning-based virtual car driving system (VIRECAR) and the second is an e-learning-based virtual cricket
training system. The car driving system teaches road safety measures, which can help learners avoid
accidents. The cricket system helps players learn and train without expending energy on the field before
appearing in a real match.
3. RESEARCH METHOD
This paper proposes a system, shown in Figure 1, that aims to simplify the preliminary doctor-patient
examination. The idea is to digitally transport the patient to the doctor's room as a simplified 3D object,
with all the details necessary to facilitate the findings. Rather than merely transporting a static digital
object, the system makes these objects interactive: at the doctor's side they are rendered as augmented
reality models superimposed on the environment, thereby simulating the patient's appointment.
− Data Collection Module
The first module of the system is the collection of patient data, carried out at the clinical premises.
Patient data includes preliminary medical examination data such as temperature, breathing rate, and blood
pressure, gathered by an assistant using the respective devices. The image of the patient is then captured
using Kinect cameras, which build a 3D image of the patient from appropriate depth and colour information.
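One possible shape for the record produced by this module is sketched below; the field names, units, and example values are assumptions made for illustration, not part of the system described in this paper.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Tuple

# Illustrative preliminary-examination record for the data collection module.
@dataclass
class PreliminaryRecord:
    patient_id: str
    temperature_c: float                     # body temperature in Celsius
    breathing_rate: int                      # breaths per minute
    blood_pressure: Tuple[int, int]          # (systolic, diastolic) in mmHg
    complaint: str                           # complaint as narrated by the patient
    captured_at: datetime = field(default_factory=datetime.now)
    depth_frames: List[str] = field(default_factory=list)   # paths to Kinect depth frames

record = PreliminaryRecord(
    patient_id="P-001",
    temperature_c=37.8,
    breathing_rate=18,
    blood_pressure=(120, 80),
    complaint="persistent cough for three days",
)
print(record.patient_id, record.blood_pressure)
```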
− 3D Model Building
Videos featuring the patient are recorded with the help of Kinect cameras, which are special cameras
with sensors that detect the depth of the patient dynamically. Images and videos are captured
by these depth cameras and initially appear as 2D images. When fed into Kinect Fusion, they are turned into a
3D model of the captured scene and its objects.
Figure 1. Premedical diagnosis system
− Conversion
The conversion of 2D images can be performed with the Kinect APIs. Although 2D images can be modelled
into 3D in real time using Kinect Fusion [14], real-time operation is traded here for more accurate and
detailed models by processing the data separately. Although this process takes considerably longer, the
required quality of the rendered 3D images is achieved. The photographs captured are then overlaid onto the
3D objects to improve their fidelity. The same process is applied to whichever part of the body is under
examination.
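The core of converting a 2D depth image into 3D geometry is back-projection through the pinhole camera model. The sketch below shows only this step; the intrinsics (fx, fy, cx, cy) are placeholder values, not actual Kinect calibration.

```python
import numpy as np

# Hedged sketch of 2D-to-3D conversion: back-projecting a depth image into a point cloud.
def depth_to_points(depth_m, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """depth_m: HxW array of depths in metres, 0 where the sensor has no reading."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx                    # back-project each pixel
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]          # keep only valid-depth pixels

# Synthetic 480x640 frame in which every pixel is 1.2 m away
fake_depth = np.full((480, 640), 1.2)
cloud = depth_to_points(fake_depth)
print(cloud.shape)                           # (307200, 3)
```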
− Exclusion
The subject of interest is only the patient and his or her vitals; any other objects present in the
surroundings can be ignored, so this component of the system removes them. Selective capture of images is not
possible with Kinect cameras, hence the approach used is to remove the unnecessary objects from the scene
after capture. This additional processing can be achieved with image/graphics processing libraries such as
OpenCV or OpenGL. The sequence of steps is: a) detect unnecessary surfaces, b) select them iteratively,
c) remove them. The approach may vary for each patient; a minimal sketch is given below.
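One simple way to illustrate the exclusion step is to keep only depth readings inside an assumed working band around the patient and discard the rest; the 0.5-2.0 m band below is a placeholder, not a parameter from the paper.

```python
import numpy as np

# Illustrative exclusion step: zero out depth readings outside an assumed patient band.
def exclude_background(depth_m, near=0.5, far=2.0):
    mask = (depth_m > near) & (depth_m < far)
    return np.where(mask, depth_m, 0.0), mask

fake_depth = np.random.uniform(0.3, 4.0, size=(480, 640))
patient_only, mask = exclude_background(fake_depth)
print(f"kept {mask.mean():.0%} of pixels")
```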
− Order of Processing
It is important to process the video in a particular order: the 3D models are constructed first, and the
background objects are then removed from the video. The reverse order may not work, because objects in the
surrounding area are vital for depth sensing.
− Medical Data Correlator
The patient's data includes his or her appearance; in most cases patients come with complaints rather
than particulars. In any medical check-up, the patient's complaint forms the core of the collected data. The
complaint as narrated by the patient and recorded by the medical assistant is the vital information behind any
decision; general notes and observations about the patient's mental state are also included, together with the
vital signs and the basic examination.
Technologies and their definitions: the intention of the proposed system is to use existing software
technologies to make the preliminary diagnosis process effortless and uncomplicated. Some of the technologies
used in the system are briefly defined here for readers who do not have a clear idea of them.
− Kinect Fusion
Kinect Fusion [14] is built on a widely available commodity sensor platform that incorporates a
structured-light depth sensor. Real-time, infrastructure-free tracking of a handheld camera while
simultaneously mapping the physical scene in high detail promises new possibilities for augmented and mixed
reality applications. Reference [14] presents a detailed method, with analysis, of what may be the first
system that permits real-time, dense volumetric reconstruction of objects using a handheld Kinect depth
sensor. Users can simply pick up and move a Kinect device to generate a continuously updating, smooth, fully
fused 3D surface reconstruction.
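To give a feel for why fusing many depth frames yields a smooth surface, the toy sketch below simply averages successive noisy frames per pixel under a static-camera assumption; the real KinectFusion pipeline [14] additionally tracks the camera and integrates a truncated signed distance function, which is not reproduced here.

```python
import numpy as np

# Toy per-pixel running average of noisy depth frames (static camera assumed).
class DepthFuser:
    def __init__(self, shape):
        self.total = np.zeros(shape)
        self.count = np.zeros(shape)

    def integrate(self, depth_m):
        valid = depth_m > 0                  # ignore missing readings
        self.total[valid] += depth_m[valid]
        self.count[valid] += 1

    def fused(self):
        return np.where(self.count > 0, self.total / np.maximum(self.count, 1), 0.0)

fuser = DepthFuser((480, 640))
for _ in range(30):                          # 30 noisy frames of a surface 1.2 m away
    fuser.integrate(1.2 + np.random.normal(0.0, 0.01, (480, 640)))
print(round(float(fuser.fused().std()), 4))  # far below the 0.01 single-frame noise
```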
− Unity 3D Kit
Unity is a cross-platform game engine used primarily to create three-dimensional and two-dimensional
games and simulations for computers. Unity can be used together with AR to create applications that
superimpose objects onto the real world; one example of such a game is Pokémon Go. To build the application,
we use Vuforia, an AR software development kit (SDK) that integrates with Unity. It uses computer vision
technology to identify objects and register their position and orientation. Vuforia is free to use and is an
ideal solution for this purpose.
− HoloLens
As one of the first AR headsets available on the market, the Microsoft HoloLens is bound to have a
profound impact on the development of AR applications [15]. The HoloLens is a head-mounted display unit
connected to an adjustable, cushioned inner headband, which allows the HoloLens to be tilted up and down as
well as moved forward and backward.
3.1. Smart education system for deaf-mute children
EduAR, a new-generation system, combines augmented reality with holograms and can be deployed in the
field of education. AR adds to the reality that is ordinarily seen rather than replacing it. The ultimate goal
of EduAR is to create a convenient and natural immersion so that it becomes extremely easy to understand and
interpret what we learn. Compared with previous years, education today has improved and expanded to such an
extent that it is hard to follow and keep track of everything being taught, and this is where EduAR helps. One
of its unique features is the ability to ease the learning of deaf-mute students: using their vision and
EduAR, these students can understand concepts as easily as any other student.
Consider the scenario of a biology class. Using EduAR, holographic images of the brain are used to
explain its minute and intricate parts, which in reality are difficult for teachers to describe based on
textbooks alone. EduAR thus helps students understand visually how a brain works, thereby helping them perform
better later, for example during operations. It also enables medical students to acquire knowledge and
understanding of the human body through interaction within a virtual environment in which no patients are at
risk. This can further be extended to other streams that are usually difficult for a student to interpret.
Teaching and learning have to be much more than a simple process of knowledge transfer by memorizing.
EduAR improves the student's learning experience and thereby develops the interests of the individual. Many
labs that have physical 3D models of brains or hearts do not really convey much information about the deeper
layers of the organ; exposing these layers is one of the major advantages of EduAR. Dynamic labelling is
another asset: teachers can modify or add labels of parts as and when required rather than buying a new 3D
model every time. Thus, EduAR focuses on transforming the overall education experience.
3.2. E-learning based virtual car driving system
Simulations of real-world scenarios can be achieved with a high degree of realism using virtual
reality. This characteristic of virtual reality environments gives it an advantage in e-learning, where the
solutions to real-world problems can be learned virtually. Gesture recognition is another technology that has
seen a steady increase in popularity as a method of human-computer interaction: specific gestures can be
translated into specific tasks. With hand-gesture detection devices like Leap Motion, using gestures to
communicate with an environment has become simple and usable. Combining such technologies with the learning
process yields methodologies that can be leveraged to simplify learning. After implementing the VIRECAR
virtual simulator system, it was analyzed using feedback from a sample of 20 licensed drivers and 20 new
driving learners. The e-learning process in VIRECAR is shown in Figure 2.
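To illustrate how recognised gestures can be translated into driving tasks, the sketch below maps gesture labels to simulator actions; the gesture names, the action set, and the idea of a recognition backend emitting these labels are all assumptions for this sketch, not the VIRECAR implementation or the Leap Motion API.

```python
# Illustrative mapping from recognised hand gestures to simulator actions.
GESTURE_ACTIONS = {
    "open_palm_forward": "accelerate",
    "closed_fist": "brake",
    "tilt_left": "steer_left",
    "tilt_right": "steer_right",
}

def handle_gesture(gesture: str) -> str:
    """Translate one recognised gesture label into a driving action."""
    action = GESTURE_ACTIONS.get(gesture, "no_op")
    print(f"gesture '{gesture}' -> action '{action}'")
    return action

handle_gesture("closed_fist")    # -> brake
handle_gesture("wave")           # unknown gesture -> no_op
```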
Figure 2. E-learning process in VIRECAR
The learning experience obtained from VIRECAR is explained in three stages in this paper: the first
stage corresponds to the pre-class content in e-learning, the second to the in-class content, and the third to
the post-class content; these three stages of e-learning are explained in [5], [6]. In the first stage, Basics
of Driving, the user is required to understand the road and safety regulations prescribed by the Road
Transport Office, as well as the basic functionality of the car. Towards the completion of this phase, the
learners take a test evaluating their competency in the basics of driving a car.
In the second stage, Learning through VIRECAR, the simulator is used to understand the working of a
car and to get used to the driving environment. At this stage the concept of an instructor is introduced: the
instructor is present at regular intervals, evaluating the learner's progress and providing feedback as and
when necessary.
The third stage, Actual Driving, which can also be described as a post-class exercise, is to drive an
actual car based on the learning obtained from VIRECAR. In this phase the instructor is present full-time,
which allows him to give continuous feedback. The learner at this stage is expected to drive a car with the
experience and feedback obtained from the second stage and the knowledge of the rules and regulations obtained
from the first stage.
These three steps, taken for the first time, constitute the first iteration of the learning process. In
subsequent iterations the learner is not required to go through the first stage again, since mastering its
concepts is the prerequisite for proceeding to the second stage. Thus, a form of back-propagation is used:
based on the feedback obtained from the third stage, the learner might have to gain more experience in the
virtual environment before gaining that experience in the real world. Essentially, the learner is expected to
experience every aspect of driving a car, including changing gears, using the pedals for acceleration, brake,
and clutch, and steering the wheel. This provides a safe environment in which the learner can effectively go
through the learning process described in this paper, aimed at learning to drive a car properly. A minimal
sketch of this loop is given below.
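The sketch below is a hypothetical rendering of the three-stage loop just described. The scoring functions are toy stand-ins; only the control flow (pass the basics once, then loop between simulator practice and instructor-assessed real driving) reflects the process in the text.

```python
# Toy model of the three-stage VIRECAR learning loop (assumed thresholds and scores).
def basics_mastered(learner):                      # stage 1: rules and car basics
    return learner.get("theory_score", 0.0) >= 0.8

def simulator_session(learner):                    # stage 2: practice in VIRECAR
    learner["sim_sessions"] = learner.get("sim_sessions", 0) + 1

def instructor_feedback(learner):                  # stage 3: toy feedback model (0-1)
    return min(1.0, 0.5 + 0.1 * learner["sim_sessions"])

def virecar_learning(learner, pass_threshold=0.8, max_iterations=10):
    if not basics_mastered(learner):
        raise ValueError("stage 1 (basics of driving) must be mastered first")
    feedback = 0.0
    for _ in range(max_iterations):
        simulator_session(learner)                 # stage 2
        feedback = instructor_feedback(learner)    # stage 3
        if feedback >= pass_threshold:             # satisfactory: stop looping
            break                                  # otherwise loop back to stage 2
    return feedback

print(virecar_learning({"theory_score": 0.9}))     # reaches the threshold after a few sessions
```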
4. RESULTS AND DISCUSSION
To analyze the VIRECAR simulator system described in this paper, the system was tested with a sample of
12 licensed drivers, and the efficiency of VIRECAR was evaluated from their feedback. They were introduced to
the whole system as described in Figure 2. Furthermore, they were asked to try the simulator and answer certain
questions deemed vital for evaluating VIRECAR. Some of the questions put forward were:
− How close to reality was it?
− How helpful was it in understanding the driving scenario?
− How useful was the system in learning rules and regulations?
− If you had not driven an actual car, would you prefer to learn using VIRECAR?
− Do you think you will be able to rectify any mistakes through VIRECAR?
Table 1 shows the consolidated data obtained from the testers of VIRECAR. The drivers gave average
scores of 7.833 and 8.167 to the questions regarding understanding the driving scenario and learning the rules
and regulations, as shown in Table 2. This suggests that phase one of VIRECAR should be made more stringent
and that some elements of phase one should also be carried into phase two to provide a sense of continuity.
This paper discusses the use of virtual reality in an e-learning-based car driving system. Similar mobile-based
learning systems could be developed with the learning platforms and frameworks discussed in [16]. The paper
also discusses the use of augmented reality in the field of medicine, which will help patients get a second
opinion from doctors who, from their experience, are capable of detecting abnormalities in the patient even
when they are not in the same geographical location. Similar video-based abnormality detection is discussed in
[17]. These two systems will be very useful to end users, since they replace the original working setups while
giving the same experience. In the case of the e-learning-based car driving system there is no fuel wastage,
no added traffic due to novice drivers, and no added pollution. Nowadays there is also research on replacing
fuel vehicles with rechargeable electric vehicles [18]. Such problems are avoided because the driving system
developed by the authors is fully based on virtual reality.
Table 1. Feedback acquired from 12 users of VIRECAR
User  1  2  3   4  5  6  7  8  9  10  11  12  Mean
Q1    7  8  9   6  7  7  8  6  7   7   8   7  7.250
Q2    8  8  8   7  7  9  7  7  9   8   7   9  7.833
Q3    9  8  9   7  9  8  9  7  8   9   7   8  8.167
Q4    8  9  10  7  9  8  9  8  9   9   9   9  8.667
Q5    Y  Y  Y   N  Y  Y  Y  N  Y   Y   Y   Y  -
Table 2. Average rating from 12 users of VIRECAR
Question Average Rating
Q1 7.250
Q2 7.833
Q3 8.167
Q4 8.667
Q5 11 - Yes; 1 - No
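As a quick check, the average ratings reported in Table 2 can be reproduced from the raw scores in Table 1:

```python
import statistics

# Reproducing the average ratings in Table 2 from the raw scores in Table 1.
scores = {
    "Q1": [7, 8, 9, 6, 7, 7, 8, 6, 7, 7, 8, 7],
    "Q2": [8, 8, 8, 7, 7, 9, 7, 7, 9, 8, 7, 9],
    "Q3": [9, 8, 9, 7, 9, 8, 9, 7, 8, 9, 7, 8],
    "Q4": [8, 9, 10, 7, 9, 8, 9, 8, 9, 9, 9, 9],
}
for question, ratings in scores.items():
    print(question, round(statistics.mean(ratings), 3))
# Q1 7.25, Q2 7.833, Q3 8.167, Q4 8.667
```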
5. CONCLUSION
Although augmented reality and virtual reality sound appealing on paper, there are several challenges
in implementing such systems in practice. They fall into two categories: technical and social. From the
technical point of view, many components must work together for the system to function properly. Object
recognition is the first step in any AR system: if objects in the real world and the virtual world are not
aligned with each other, the illusion of two coexisting worlds breaks down. Hence, the quality of the
components is an important factor, and problems such as inaccurate sensor readings and poor pattern
recognition can greatly affect the accuracy of the system. From the social point of view, augmented reality is
still at a nascent stage and remains widely unknown to the general public. Virtual reality faces similar
implementation problems, as it needs more components and new technologies to be learned.
There are many more applications in which AR and VR could be used effectively in our day-to-day lives.
One interesting application is a neonatal care training system, which could be developed with these new
technologies. In spite of the many implementation issues, these technologies have the benefit of not putting
the public at risk during the initial learning process.
REFERENCES
[1] T. P. Caudell and D. W. Mizell, “Augmented reality: an application of heads-up display technology to manual
manufacturing processes,” in Proceedings of the Twenty-Fifth Hawaii International Conference on System
Sciences, pp. 659-669, 1992.
[2] J. White, D. C. Schmidt, and M. Golparvar-Fard, “Applications of Augmented Reality,” Proceedings of the
IEEE, vol. 102, no. 2, 2014.
[3] N. F. Saidin, N. D. Abd Halim, and N. Yahaya, “A Review of Research on Augmented Reality in Education:
Advantages and Applications,” International Education Studies, no. 13, 2015.
[4] C. Agarwal and N. Thakur, “The Evolution and Future Scope of Augmented Reality,” International Journal of
Computer Science Issues (IJCSI), vol. 11, no. 6, p. 1, 2014.
[5] Vladimir U., Smart Education and Smart e-Learning, Springer Publishing Company, Incorporated, 2015.
[6] R. Gleadow, “Design for learning – a case study of blended learning in a science unit,” F1000 Research,
vol. 4, no. 898, 2015.
[7] Senthil K. N., “Enhancement of Skills through E-Learning: Prospects and Problems,” The Online Journal of
Distance Education and e-Learning, vol. 4, no. 3, pp. 24-32, 2016.
[8] C. Bouras, “An e-Learning Through Distributed Virtual Environments,” Journal of Network and Computer
Applications, vol. 24, pp. 175-199, 2001.
[9] S. Sannikov, F. Zhdanov, P. Chebotarev, and P. Rabinovich, “Interactive Educational Content Based on
Augmented Reality and 3D Visualization,” 4th International Young Scientists Conference on Computational
Science, pp. 720-729, 2015.
[10] Y. Genc, S. Riedel, F. Souvannavong, C. Akinlar, and N. Navab, “Marker-less tracking for AR: a
learning-based approach,” Proceedings of the International Symposium on Mixed and Augmented Reality,
pp. 295-304, 2002.
[11] E. Ai-Lim Lee, “How does desktop virtual reality enhance learning outcomes? A Structural Equation Modeling
Approach,” Computers & Education, vol. 55, no. 4, pp. 1424-1442, 2010.
[12] Georgios D., “Investigating the educational value of social learning networks: a quantitative analysis,”
Interactive Technology and Smart Education, vol. 13, no. 4, pp. 305-322, 2016.
[13] M. Nguyen and A. Yeap, “StereoTag: A novel stereogram-marker-based approach for Augmented Reality,” 2016
23rd International Conference on Pattern Recognition (ICPR), Cancun, pp. 1059-1064, 2016.
[14] R. A. Newcombe, S. Izadi, O. Hilliges, D. Molyneaux, D. Kim, A. J. Davison, P. Kohli, J. Shotton,
S. Hodges, and A. Fitzgibbon, “KinectFusion: Real-Time Dense Surface Mapping and Tracking,” IEEE International
Symposium on Mixed and Augmented Reality 2011 Science and Technology Proceedings, Basel, Switzerland,
October 26-29, 2011.
[15] M. Garon, P. O. Boulet, J. P. Doironz, L. Beaulieu, and J. F. Lalonde, “Real-Time High Resolution 3D Data
on the HoloLens,” 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Merida,
pp. 189-191, 2016.
[16] A. S. B. Moh Noor, M. N. Y. Atoom, and R. Mamat, “A review of cloud oriented mobile learning platform and
frameworks,” International Journal of Electrical and Computer Engineering (IJECE), vol. 9, no. 6,
pp. 5529-5536, 2019.
[17] A. Al-Dhamari, R. Sudirman, N. H. Mahmood, N. H. Khamis, and A. Yahya, “Online video-based abnormal
detection using highly motion techniques and statistical measures,” TELKOMNIKA, vol. 17, no. 4,
pp. 2039-2047, 2019.
[18] J. Lee and G.-L. Park, “Design of a Monitoring-combined Siting Scheme for Electric Vehicle Chargers,”
International Journal of Electrical and Computer Engineering (IJECE), vol. 8, no. 6, pp. 5303-5310, 2018.

More Related Content

What's hot

IRJET- Logistics Network Superintendence Based on Knowledge Engineering
IRJET- Logistics Network Superintendence Based on Knowledge EngineeringIRJET- Logistics Network Superintendence Based on Knowledge Engineering
IRJET- Logistics Network Superintendence Based on Knowledge EngineeringIRJET Journal
 
A survey paper on various biometric security system methods
A survey paper on various biometric security system methodsA survey paper on various biometric security system methods
A survey paper on various biometric security system methodsIRJET Journal
 
Assistance Application for Visually Impaired - VISION
Assistance Application for Visually  Impaired - VISIONAssistance Application for Visually  Impaired - VISION
Assistance Application for Visually Impaired - VISIONIJSRED
 
ANALYSIS OF SYSTEM ON CHIP DESIGN USING ARTIFICIAL INTELLIGENCE
ANALYSIS OF SYSTEM ON CHIP DESIGN USING ARTIFICIAL INTELLIGENCEANALYSIS OF SYSTEM ON CHIP DESIGN USING ARTIFICIAL INTELLIGENCE
ANALYSIS OF SYSTEM ON CHIP DESIGN USING ARTIFICIAL INTELLIGENCEijesajournal
 
Impact Of Artificial Intelligence On Wildlife
Impact Of Artificial Intelligence On WildlifeImpact Of Artificial Intelligence On Wildlife
Impact Of Artificial Intelligence On WildlifeaNumak & Company
 
IRJET- IoT based Facial Recognition Biometric Attendance
IRJET- IoT based Facial Recognition Biometric AttendanceIRJET- IoT based Facial Recognition Biometric Attendance
IRJET- IoT based Facial Recognition Biometric AttendanceIRJET Journal
 
IRJET- Virtual Fitness Trainer with Spontaneous Feedback using a Line of Moti...
IRJET- Virtual Fitness Trainer with Spontaneous Feedback using a Line of Moti...IRJET- Virtual Fitness Trainer with Spontaneous Feedback using a Line of Moti...
IRJET- Virtual Fitness Trainer with Spontaneous Feedback using a Line of Moti...IRJET Journal
 
Protection for images transmission
Protection for images transmissionProtection for images transmission
Protection for images transmissionIAEME Publication
 
Dermatological diagnosis by mobile application
Dermatological diagnosis by mobile applicationDermatological diagnosis by mobile application
Dermatological diagnosis by mobile applicationjournalBEEI
 
IRJET - A Review on Face Recognition using Deep Learning Algorithm
IRJET -  	  A Review on Face Recognition using Deep Learning AlgorithmIRJET -  	  A Review on Face Recognition using Deep Learning Algorithm
IRJET - A Review on Face Recognition using Deep Learning AlgorithmIRJET Journal
 
IRJET- Sign Language Interpreter
IRJET- Sign Language InterpreterIRJET- Sign Language Interpreter
IRJET- Sign Language InterpreterIRJET Journal
 
Face Recognition and Increased Reality System for Mobile Devices
Face Recognition and Increased Reality System for Mobile DevicesFace Recognition and Increased Reality System for Mobile Devices
Face Recognition and Increased Reality System for Mobile Devicesijtsrd
 
IRJET- Self Driving Car using Deep Q-Learning
IRJET-  	  Self Driving Car using Deep Q-LearningIRJET-  	  Self Driving Car using Deep Q-Learning
IRJET- Self Driving Car using Deep Q-LearningIRJET Journal
 
IRJET- Review on Human Action Detection in Stored Videos using Support Vector...
IRJET- Review on Human Action Detection in Stored Videos using Support Vector...IRJET- Review on Human Action Detection in Stored Videos using Support Vector...
IRJET- Review on Human Action Detection in Stored Videos using Support Vector...IRJET Journal
 
Saksham presentation
Saksham presentationSaksham presentation
Saksham presentationSakshamTurki
 
Iris recognition system
Iris recognition systemIris recognition system
Iris recognition systemAnil Shrestha
 
A Simulation Method of Soft Tissue Cutting In Virtual Environment with Haptics
A Simulation Method of Soft Tissue Cutting In Virtual Environment with HapticsA Simulation Method of Soft Tissue Cutting In Virtual Environment with Haptics
A Simulation Method of Soft Tissue Cutting In Virtual Environment with HapticsIJERA Editor
 
MOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATION
MOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATIONMOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATION
MOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATIONijma
 

What's hot (20)

IRJET- Logistics Network Superintendence Based on Knowledge Engineering
IRJET- Logistics Network Superintendence Based on Knowledge EngineeringIRJET- Logistics Network Superintendence Based on Knowledge Engineering
IRJET- Logistics Network Superintendence Based on Knowledge Engineering
 
A survey paper on various biometric security system methods
A survey paper on various biometric security system methodsA survey paper on various biometric security system methods
A survey paper on various biometric security system methods
 
Assistance Application for Visually Impaired - VISION
Assistance Application for Visually  Impaired - VISIONAssistance Application for Visually  Impaired - VISION
Assistance Application for Visually Impaired - VISION
 
Published_Papers
Published_PapersPublished_Papers
Published_Papers
 
ANALYSIS OF SYSTEM ON CHIP DESIGN USING ARTIFICIAL INTELLIGENCE
ANALYSIS OF SYSTEM ON CHIP DESIGN USING ARTIFICIAL INTELLIGENCEANALYSIS OF SYSTEM ON CHIP DESIGN USING ARTIFICIAL INTELLIGENCE
ANALYSIS OF SYSTEM ON CHIP DESIGN USING ARTIFICIAL INTELLIGENCE
 
Impact Of Artificial Intelligence On Wildlife
Impact Of Artificial Intelligence On WildlifeImpact Of Artificial Intelligence On Wildlife
Impact Of Artificial Intelligence On Wildlife
 
IRJET- IoT based Facial Recognition Biometric Attendance
IRJET- IoT based Facial Recognition Biometric AttendanceIRJET- IoT based Facial Recognition Biometric Attendance
IRJET- IoT based Facial Recognition Biometric Attendance
 
IRJET- Virtual Fitness Trainer with Spontaneous Feedback using a Line of Moti...
IRJET- Virtual Fitness Trainer with Spontaneous Feedback using a Line of Moti...IRJET- Virtual Fitness Trainer with Spontaneous Feedback using a Line of Moti...
IRJET- Virtual Fitness Trainer with Spontaneous Feedback using a Line of Moti...
 
Protection for images transmission
Protection for images transmissionProtection for images transmission
Protection for images transmission
 
Dermatological diagnosis by mobile application
Dermatological diagnosis by mobile applicationDermatological diagnosis by mobile application
Dermatological diagnosis by mobile application
 
IRJET - A Review on Face Recognition using Deep Learning Algorithm
IRJET -  	  A Review on Face Recognition using Deep Learning AlgorithmIRJET -  	  A Review on Face Recognition using Deep Learning Algorithm
IRJET - A Review on Face Recognition using Deep Learning Algorithm
 
IRJET- Sign Language Interpreter
IRJET- Sign Language InterpreterIRJET- Sign Language Interpreter
IRJET- Sign Language Interpreter
 
Face Recognition and Increased Reality System for Mobile Devices
Face Recognition and Increased Reality System for Mobile DevicesFace Recognition and Increased Reality System for Mobile Devices
Face Recognition and Increased Reality System for Mobile Devices
 
AUGMENTED REALITY
AUGMENTED REALITYAUGMENTED REALITY
AUGMENTED REALITY
 
IRJET- Self Driving Car using Deep Q-Learning
IRJET-  	  Self Driving Car using Deep Q-LearningIRJET-  	  Self Driving Car using Deep Q-Learning
IRJET- Self Driving Car using Deep Q-Learning
 
IRJET- Review on Human Action Detection in Stored Videos using Support Vector...
IRJET- Review on Human Action Detection in Stored Videos using Support Vector...IRJET- Review on Human Action Detection in Stored Videos using Support Vector...
IRJET- Review on Human Action Detection in Stored Videos using Support Vector...
 
Saksham presentation
Saksham presentationSaksham presentation
Saksham presentation
 
Iris recognition system
Iris recognition systemIris recognition system
Iris recognition system
 
A Simulation Method of Soft Tissue Cutting In Virtual Environment with Haptics
A Simulation Method of Soft Tissue Cutting In Virtual Environment with HapticsA Simulation Method of Soft Tissue Cutting In Virtual Environment with Haptics
A Simulation Method of Soft Tissue Cutting In Virtual Environment with Haptics
 
MOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATION
MOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATIONMOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATION
MOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATION
 

Similar to 07 20278 augmented reality...

Augmented Reality for Smart Profile Display Part-1
Augmented Reality for Smart Profile Display Part-1Augmented Reality for Smart Profile Display Part-1
Augmented Reality for Smart Profile Display Part-1IRJET Journal
 
Augmented Reality for Smart Profile Display Part-1
Augmented Reality for Smart Profile Display Part-1Augmented Reality for Smart Profile Display Part-1
Augmented Reality for Smart Profile Display Part-1IRJET Journal
 
Augmented Reality (AR) for Education System
Augmented Reality (AR) for Education SystemAugmented Reality (AR) for Education System
Augmented Reality (AR) for Education SystemIRJET Journal
 
3D Object Tracking And Manipulation In Augmented Reality
3D Object Tracking And Manipulation In Augmented Reality3D Object Tracking And Manipulation In Augmented Reality
3D Object Tracking And Manipulation In Augmented RealitySabrina Ball
 
IRJET-Mixed Reality in Healthcare Education
IRJET-Mixed Reality in Healthcare EducationIRJET-Mixed Reality in Healthcare Education
IRJET-Mixed Reality in Healthcare EducationIRJET Journal
 
Augmented Reality (AR) In Healthcare
Augmented Reality (AR) In HealthcareAugmented Reality (AR) In Healthcare
Augmented Reality (AR) In HealthcareAudrey Britton
 
Building EdTech Solutions with AR
Building EdTech Solutions with ARBuilding EdTech Solutions with AR
Building EdTech Solutions with ARIRJET Journal
 
IRJET- Overview of Augmented Reality in Education
IRJET- Overview of Augmented Reality in EducationIRJET- Overview of Augmented Reality in Education
IRJET- Overview of Augmented Reality in EducationIRJET Journal
 
GYM MANAGEMENT SYSTEM USING AUGMENTED REALITY
GYM MANAGEMENT SYSTEM USING AUGMENTED REALITYGYM MANAGEMENT SYSTEM USING AUGMENTED REALITY
GYM MANAGEMENT SYSTEM USING AUGMENTED REALITYIRJET Journal
 
IRJET- Educatar: Dissemination of Conceptualized Information using Augmented ...
IRJET- Educatar: Dissemination of Conceptualized Information using Augmented ...IRJET- Educatar: Dissemination of Conceptualized Information using Augmented ...
IRJET- Educatar: Dissemination of Conceptualized Information using Augmented ...IRJET Journal
 
Case Study of Augmented Reality Applications in Medical Field
Case Study of Augmented Reality Applications in Medical FieldCase Study of Augmented Reality Applications in Medical Field
Case Study of Augmented Reality Applications in Medical Fieldijtsrd
 
Augmented Reality And Its Science
Augmented Reality And Its ScienceAugmented Reality And Its Science
Augmented Reality And Its ScienceLisa Graves
 
Rise of augmented reality : current and future applications
Rise of augmented reality : current and future applicationsRise of augmented reality : current and future applications
Rise of augmented reality : current and future applicationsU Reshmi
 
Augmented Reality in Medical Education
Augmented Reality in Medical EducationAugmented Reality in Medical Education
Augmented Reality in Medical EducationIRJET Journal
 
ENHANCED EDUCATION THROUGH AUGMENTED REALITY
ENHANCED EDUCATION THROUGH AUGMENTED REALITYENHANCED EDUCATION THROUGH AUGMENTED REALITY
ENHANCED EDUCATION THROUGH AUGMENTED REALITYIRJET Journal
 
Augmented Reality for Smart Profile Display Part-2
Augmented Reality for Smart Profile Display Part-2Augmented Reality for Smart Profile Display Part-2
Augmented Reality for Smart Profile Display Part-2IRJET Journal
 
IRJET - Application of AR in Education
IRJET - Application of AR in EducationIRJET - Application of AR in Education
IRJET - Application of AR in EducationIRJET Journal
 
New Era of Teaching Learning : 3D Marker Based Augmented Reality
New Era of Teaching Learning : 3D Marker Based Augmented RealityNew Era of Teaching Learning : 3D Marker Based Augmented Reality
New Era of Teaching Learning : 3D Marker Based Augmented Realityijistjournal
 

Similar to 07 20278 augmented reality... (20)

Augmented Reality for Smart Profile Display Part-1
Augmented Reality for Smart Profile Display Part-1Augmented Reality for Smart Profile Display Part-1
Augmented Reality for Smart Profile Display Part-1
 
Augmented Reality for Smart Profile Display Part-1
Augmented Reality for Smart Profile Display Part-1Augmented Reality for Smart Profile Display Part-1
Augmented Reality for Smart Profile Display Part-1
 
Augmented Reality
Augmented RealityAugmented Reality
Augmented Reality
 
Augmented Reality (AR) for Education System
Augmented Reality (AR) for Education SystemAugmented Reality (AR) for Education System
Augmented Reality (AR) for Education System
 
3D Object Tracking And Manipulation In Augmented Reality
3D Object Tracking And Manipulation In Augmented Reality3D Object Tracking And Manipulation In Augmented Reality
3D Object Tracking And Manipulation In Augmented Reality
 
IRJET-Mixed Reality in Healthcare Education
IRJET-Mixed Reality in Healthcare EducationIRJET-Mixed Reality in Healthcare Education
IRJET-Mixed Reality in Healthcare Education
 
Augmented Reality (AR) In Healthcare
Augmented Reality (AR) In HealthcareAugmented Reality (AR) In Healthcare
Augmented Reality (AR) In Healthcare
 
Building EdTech Solutions with AR
Building EdTech Solutions with ARBuilding EdTech Solutions with AR
Building EdTech Solutions with AR
 
IRJET- Overview of Augmented Reality in Education
IRJET- Overview of Augmented Reality in EducationIRJET- Overview of Augmented Reality in Education
IRJET- Overview of Augmented Reality in Education
 
AR BOOK - EDUCATION
AR BOOK - EDUCATIONAR BOOK - EDUCATION
AR BOOK - EDUCATION
 
GYM MANAGEMENT SYSTEM USING AUGMENTED REALITY
GYM MANAGEMENT SYSTEM USING AUGMENTED REALITYGYM MANAGEMENT SYSTEM USING AUGMENTED REALITY
GYM MANAGEMENT SYSTEM USING AUGMENTED REALITY
 
IRJET- Educatar: Dissemination of Conceptualized Information using Augmented ...
IRJET- Educatar: Dissemination of Conceptualized Information using Augmented ...IRJET- Educatar: Dissemination of Conceptualized Information using Augmented ...
IRJET- Educatar: Dissemination of Conceptualized Information using Augmented ...
 
Case Study of Augmented Reality Applications in Medical Field
Case Study of Augmented Reality Applications in Medical FieldCase Study of Augmented Reality Applications in Medical Field
Case Study of Augmented Reality Applications in Medical Field
 
Augmented Reality And Its Science
Augmented Reality And Its ScienceAugmented Reality And Its Science
Augmented Reality And Its Science
 
Rise of augmented reality : current and future applications
Rise of augmented reality : current and future applicationsRise of augmented reality : current and future applications
Rise of augmented reality : current and future applications
 
Augmented Reality in Medical Education
Augmented Reality in Medical EducationAugmented Reality in Medical Education
Augmented Reality in Medical Education
 
ENHANCED EDUCATION THROUGH AUGMENTED REALITY
ENHANCED EDUCATION THROUGH AUGMENTED REALITYENHANCED EDUCATION THROUGH AUGMENTED REALITY
ENHANCED EDUCATION THROUGH AUGMENTED REALITY
 
Augmented Reality for Smart Profile Display Part-2
Augmented Reality for Smart Profile Display Part-2Augmented Reality for Smart Profile Display Part-2
Augmented Reality for Smart Profile Display Part-2
 
IRJET - Application of AR in Education
IRJET - Application of AR in EducationIRJET - Application of AR in Education
IRJET - Application of AR in Education
 
New Era of Teaching Learning : 3D Marker Based Augmented Reality
New Era of Teaching Learning : 3D Marker Based Augmented RealityNew Era of Teaching Learning : 3D Marker Based Augmented Reality
New Era of Teaching Learning : 3D Marker Based Augmented Reality
 

More from IAESIJEECS

08 20314 electronic doorbell...
08 20314 electronic doorbell...08 20314 electronic doorbell...
08 20314 electronic doorbell...IAESIJEECS
 
06 17443 an neuro fuzzy...
06 17443 an neuro fuzzy...06 17443 an neuro fuzzy...
06 17443 an neuro fuzzy...IAESIJEECS
 
05 20275 computational solution...
05 20275 computational solution...05 20275 computational solution...
05 20275 computational solution...IAESIJEECS
 
04 20268 power loss reduction ...
04 20268 power loss reduction ...04 20268 power loss reduction ...
04 20268 power loss reduction ...IAESIJEECS
 
03 20237 arduino based gas-
03 20237 arduino based gas-03 20237 arduino based gas-
03 20237 arduino based gas-IAESIJEECS
 
02 20274 improved ichi square...
02 20274 improved ichi square...02 20274 improved ichi square...
02 20274 improved ichi square...IAESIJEECS
 
01 20264 diminution of real power...
01 20264 diminution of real power...01 20264 diminution of real power...
01 20264 diminution of real power...IAESIJEECS
 
08 20272 academic insight on application
08 20272 academic insight on application08 20272 academic insight on application
08 20272 academic insight on applicationIAESIJEECS
 
07 20252 cloud computing survey
07 20252 cloud computing survey07 20252 cloud computing survey
07 20252 cloud computing surveyIAESIJEECS
 
06 20273 37746-1-ed
06 20273 37746-1-ed06 20273 37746-1-ed
06 20273 37746-1-edIAESIJEECS
 
05 20261 real power loss reduction
05 20261 real power loss reduction05 20261 real power loss reduction
05 20261 real power loss reductionIAESIJEECS
 
04 20259 real power loss
04 20259 real power loss04 20259 real power loss
04 20259 real power lossIAESIJEECS
 
03 20270 true power loss reduction
03 20270 true power loss reduction03 20270 true power loss reduction
03 20270 true power loss reductionIAESIJEECS
 
02 15034 neural network
02 15034 neural network02 15034 neural network
02 15034 neural networkIAESIJEECS
 
01 8445 speech enhancement
01 8445 speech enhancement01 8445 speech enhancement
01 8445 speech enhancementIAESIJEECS
 
08 17079 ijict
08 17079 ijict08 17079 ijict
08 17079 ijictIAESIJEECS
 
07 20269 ijict
07 20269 ijict07 20269 ijict
07 20269 ijictIAESIJEECS
 
06 10154 ijict
06 10154 ijict06 10154 ijict
06 10154 ijictIAESIJEECS
 
05 20255 ijict
05 20255 ijict05 20255 ijict
05 20255 ijictIAESIJEECS
 
04 18696 ijict
04 18696 ijict04 18696 ijict
04 18696 ijictIAESIJEECS
 

More from IAESIJEECS (20)

08 20314 electronic doorbell...
08 20314 electronic doorbell...08 20314 electronic doorbell...
08 20314 electronic doorbell...
 
06 17443 an neuro fuzzy...
06 17443 an neuro fuzzy...06 17443 an neuro fuzzy...
06 17443 an neuro fuzzy...
 
05 20275 computational solution...
05 20275 computational solution...05 20275 computational solution...
05 20275 computational solution...
 
04 20268 power loss reduction ...
04 20268 power loss reduction ...04 20268 power loss reduction ...
04 20268 power loss reduction ...
 
03 20237 arduino based gas-
03 20237 arduino based gas-03 20237 arduino based gas-
03 20237 arduino based gas-
 
02 20274 improved ichi square...
02 20274 improved ichi square...02 20274 improved ichi square...
02 20274 improved ichi square...
 
01 20264 diminution of real power...
01 20264 diminution of real power...01 20264 diminution of real power...
01 20264 diminution of real power...
 
08 20272 academic insight on application
08 20272 academic insight on application08 20272 academic insight on application
08 20272 academic insight on application
 
07 20252 cloud computing survey
07 20252 cloud computing survey07 20252 cloud computing survey
07 20252 cloud computing survey
 
06 20273 37746-1-ed
06 20273 37746-1-ed06 20273 37746-1-ed
06 20273 37746-1-ed
 
05 20261 real power loss reduction
05 20261 real power loss reduction05 20261 real power loss reduction
05 20261 real power loss reduction
 
04 20259 real power loss
04 20259 real power loss04 20259 real power loss
04 20259 real power loss
 
03 20270 true power loss reduction
03 20270 true power loss reduction03 20270 true power loss reduction
03 20270 true power loss reduction
 
02 15034 neural network
02 15034 neural network02 15034 neural network
02 15034 neural network
 
01 8445 speech enhancement
01 8445 speech enhancement01 8445 speech enhancement
01 8445 speech enhancement
 
08 17079 ijict
08 17079 ijict08 17079 ijict
08 17079 ijict
 
07 20269 ijict
07 20269 ijict07 20269 ijict
07 20269 ijict
 
06 10154 ijict
06 10154 ijict06 10154 ijict
06 10154 ijict
 
05 20255 ijict
05 20255 ijict05 20255 ijict
05 20255 ijict
 
04 18696 ijict
04 18696 ijict04 18696 ijict
04 18696 ijict
 

Recently uploaded

Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Patryk Bandurski
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsMemoori
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Alan Dix
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024BookNet Canada
 
Science&tech:THE INFORMATION AGE STS.pdf
Science&tech:THE INFORMATION AGE STS.pdfScience&tech:THE INFORMATION AGE STS.pdf
Science&tech:THE INFORMATION AGE STS.pdfjimielynbastida
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationSafe Software
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr LapshynFwdays
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubKalema Edgar
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
Bluetooth Controlled Car with Arduino.pdf
Bluetooth Controlled Car with Arduino.pdfBluetooth Controlled Car with Arduino.pdf
Bluetooth Controlled Car with Arduino.pdfngoud9212
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersThousandEyes
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationSlibray Presentation
 
Build your next Gen AI Breakthrough - April 2024
Build your next Gen AI Breakthrough - April 2024Build your next Gen AI Breakthrough - April 2024
Build your next Gen AI Breakthrough - April 2024Neo4j
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024BookNet Canada
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphNeo4j
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsAndrey Dotsenko
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxnull - The Open Security Community
 

Recently uploaded (20)

Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial Buildings
 
Human Factors of XR: Using Human Factors to Design XR Systems
AR could be used for medical education, where it can increase student interaction in class and encourage more practical discussion, which in turn triggers the students' interest. Virtual reality transports the learner to the place of interest and supports learning inside whatever environment the learning process requires. In virtual reality, a whole new environment is created on demand. In augmented reality, on the other hand, images are
superimposed onto the real world rather than being placed in a completely new environment. Augmented reality supplements reality instead of replacing it: the virtual objects display information that the user cannot directly perceive with his or her own senses. AR not only shortens the learning curve but also improves the learner's productivity. It changes the way one thinks, works and learns; instead of a single view of a concept, the learner is given multiple views of the same concept.

Sergey et al. [2] discuss an interactive environment for education using AR and 3D visualization, and [8] explains further applications of AR; both were referred to for the AR technology used in this work. Bouras [3] reviews the importance of virtual environments in e-learning. Charvi Agarwal et al. [4] survey recent practices of AR in daily life. [5-7] examine e-learning and the e-learning based improvement of the user's skill set. [9] discusses the Kinect Fusion technology and the use of the Kinect camera. [10] discusses the advantages and limitations of AR in educational environments. [11] presents a structural equation modeling approach for explaining the enhancement of learning outcomes through desktop VR. [12] presents a quantitative analysis of the educational value of social learning networks.

2. PROPOSED ALGORITHM
2.1. Augmented reality and its applications
To decide where virtual objects are to be blended into the real environment, two approaches are considered. In a marker-based approach [13], the images and the corresponding image descriptors to be recognized are provided beforehand; the markers used can be QR codes or black-and-white square markers. In a marker-less approach, the AR application recognizes things that were not directly provided to it beforehand. This scenario is much harder to implement, because the recognition algorithm running in the AR application has to identify patterns, colors or other features that may appear in camera frames. (A minimal code sketch of marker recognition is given at the end of Section 2.)

Technology in the field of medicine has advanced to an extent that makes diagnosis simple and efficient. Doctors can use augmented reality as a visualization technique and as a training aid for surgery [14]. The input data of the patient can be images or metadata such as the heart rate, which is transported to the doctor along with the patient's 3D hologram. Using AR in this way gives the doctor a complete, X-ray-like view of the patient. The goal is to accelerate the diagnosis process.

2.2. Virtual reality and its applications
Two virtual reality based e-learning systems are discussed in the following sections. The first is the e-learning based virtual car driving system (VIRECAR) and the second is an e-learning based virtual cricket training system. The car driving system teaches road safety measures that can help learners avoid accidents, while the cricket system lets players train without expending energy on the field before appearing for a real match.
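As an illustration of the marker-based approach described in Section 2.1, the short Python sketch below uses OpenCV's built-in QR-code detector to locate a printed QR marker in camera frames and report its corner coordinates, which an AR engine could then use to anchor a virtual object. The camera index and the drawing step are placeholders; this is a minimal sketch, not the implementation used by the authors.

```python
# Minimal sketch: locate a QR-code marker in webcam frames with OpenCV.
# The corner coordinates found here are what an AR engine would use to
# estimate the marker pose and anchor a virtual object on top of it.
import cv2

detector = cv2.QRCodeDetector()
capture = cv2.VideoCapture(0)          # 0 = default camera (placeholder)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    data, points, _ = detector.detectAndDecode(frame)
    if points is not None and len(points) > 0:
        # points holds the four corner coordinates of the detected marker.
        corners = points.reshape(-1, 2)
        for x, y in corners:
            cv2.circle(frame, (int(x), int(y)), 6, (0, 255, 0), -1)
        cv2.putText(frame, data or "marker",
                    (int(corners[0][0]), int(corners[0][1]) - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
    cv2.imshow("marker-based AR (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```

A full AR pipeline would go on to estimate the marker's 3D pose from these corners and render content over it; the sketch stops at detection, which is the step the marker-based approach hinges on.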
3. RESEARCH METHOD
The system newly proposed in this paper, shown in Figure 1, aims to simplify the preliminary doctor/patient examination. The idea is to digitally transport the patient into the doctor's room as a simplified 3D object, together with all the details needed to support the findings. Rather than transporting a static digital object, the system makes these objects interactive: at the doctor's side they are rendered as augmented reality models superimposed on the environment, thereby simulating the patient's appointment.

Data collection module. The first module of the system collects the patient data at the clinical premises. Patient data includes preliminary medical examination readings such as temperature, breathing rate and blood pressure, gathered by an assistant using the respective devices. The image of the patient is then captured using Kinect cameras, which build a 3D image of the patient with appropriate depth and coloring. (A sketch of the kind of record this module assembles is given below.)
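To make the data flow concrete, the sketch below models the kind of record the data collection module might assemble before it is sent to the doctor's side. The field names, units and file paths are illustrative assumptions, not the schema used by the authors.

```python
# Illustrative record assembled by the data collection module (hypothetical
# field names and units); serialized to JSON for transport to the doctor's side.
from dataclasses import dataclass, asdict, field
from typing import List
import json

@dataclass
class Vitals:
    temperature_c: float        # body temperature in degrees Celsius
    breathing_rate_bpm: int     # breaths per minute
    blood_pressure: str         # e.g. "120/80"

@dataclass
class PatientRecord:
    patient_id: str
    complaint: str              # complaint as narrated by the patient
    assistant_notes: str        # general observations by the medical assistant
    vitals: Vitals
    capture_frames: List[str] = field(default_factory=list)  # depth/color frame files

    def to_json(self) -> str:
        return json.dumps(asdict(self), indent=2)

record = PatientRecord(
    patient_id="P-001",
    complaint="persistent cough for two weeks",
    assistant_notes="patient calm, no visible distress",
    vitals=Vitals(temperature_c=37.8, breathing_rate_bpm=22, blood_pressure="118/76"),
    capture_frames=["frames/depth_0001.png", "frames/color_0001.png"],
)
print(record.to_json())
```

At the doctor's side such a record would be deserialized and displayed alongside the rendered 3D model of the patient.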
3D model building. The videos featuring the patient are recorded with the help of Kinect cameras, special cameras with sensors that detect the depth of the patient dynamically. The images and videos are captured via these depth cameras and initially appear as 2D images; when they are fed into Kinect Fusion, a 3D model of the captured scene and its objects is produced.

Figure 1. Premedical diagnosis system

− Conversion: The conversion of the 2D images can be performed with the Kinect APIs. Although 2D images can be turned into 3D models in real time using Kinect Fusion [14], real-time operation is traded here for more accurate and detailed models by processing the data separately. The offline process takes considerably longer, but it achieves the required quality of the rendered 3D images. The captured photographs are then overlaid onto the 3D objects to make them resemble the originals more closely, and the same process is applied to whichever part of the body is under examination.

− Exclusion: The only subject of interest is the patient and his or her vitals; any other objects in the surroundings can be ignored, so this component removes them. Selective capture is not possible with Kinect cameras, hence the approach is to remove the unnecessary objects from the scene afterwards. This additional processing can be done with image/graphics processing libraries like OpenCV, OpenGL and so on. The sequence of steps is: a) detect the unnecessary surfaces, b) select them iteratively, c) remove them. The exact approach may vary from patient to patient (a minimal sketch of this step is given after this list).

− Order of processing: It is best to process the video in a particular order: the 3D models are constructed first and the background objects are removed afterwards. Reversing this order may not work, because the objects in the surrounding area are vital for depth sensing.

− Medical data correlator: The patient's data also includes his or her appearance; in most cases patients come with complaints and no particulars. In every medical check-up the patient's complaint forms the core of the collected data. The complaint, as narrated by the patient and recorded by the medical assistant, is the vital input to any decision; general notes and observations about the patient's mental state are also included, together with the vital signs and basic examination results.
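The exclusion step above can be approximated with a simple depth threshold: pixels whose depth reading lies beyond where the patient is assumed to sit are masked out before the frames reach the modeling stage. The sketch below uses OpenCV and NumPy; the file names and the 1.5 m cut-off are illustrative assumptions, and a real system would need a more robust segmentation.

```python
# Rough sketch of the "exclusion" step: keep only pixels closer than a
# depth cut-off (assumed to contain the patient) and black out the rest.
import cv2
import numpy as np

# Depth frame stored as a 16-bit PNG in millimetres, color frame as 8-bit BGR
# (hypothetical file names; Kinect SDKs expose equivalent buffers directly).
depth_mm = cv2.imread("frames/depth_0001.png", cv2.IMREAD_UNCHANGED)
color = cv2.imread("frames/color_0001.png", cv2.IMREAD_COLOR)

MAX_PATIENT_DEPTH_MM = 1500            # assume the patient sits within 1.5 m
mask = (depth_mm > 0) & (depth_mm < MAX_PATIENT_DEPTH_MM)

# Clean the mask a little so isolated background pixels do not survive.
mask = cv2.morphologyEx(mask.astype(np.uint8) * 255, cv2.MORPH_OPEN,
                        np.ones((5, 5), np.uint8))

foreground = cv2.bitwise_and(color, color, mask=mask)
cv2.imwrite("frames/color_0001_patient_only.png", foreground)
```

This assumes the depth and color frames are registered to the same viewpoint, which Kinect-style sensors can provide.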
Technologies and their definitions. The proposed system intends to use existing software technologies to make the preliminary diagnosis process effortless and uncomplicated. Some of the technologies used in the system are briefly defined here, for readers who do not have a clear idea of what they are.

− Kinect Fusion: Kinect Fusion [14] builds on the Kinect, a widely available commodity sensor platform that incorporates a structured-light depth sensor. Real-time, infrastructure-free tracking of a handheld camera while simultaneously mapping the physical scene in high detail opens new possibilities for augmented and mixed reality applications. Reference [14] presents a detailed method and analysis of what may be the first system that permits real-time, dense volumetric reconstruction of objects using a handheld Kinect depth sensor: users can simply pick up and move a Kinect device to generate a continuously updating, smooth, fully fused 3D surface reconstruction. (A rough open-source sketch of this style of reconstruction is given after these definitions.)

− Unity 3D kit: Unity is a cross-platform game engine used primarily to create three-dimensional and two-dimensional games and simulations for computers. Unity can be used together with AR to create applications that superimpose objects onto the real world; Pokemon Go is one example of such a game. To build the application, an AR software development kit (SDK) that integrates with Unity, called Vuforia, is used. Vuforia uses computer vision to identify objects and to register their position and orientation; it is available free of charge and serves the purpose well.

− HoloLens: As one of the first AR headsets available on the market today, the Microsoft HoloLens is bound to have a profound impact on the development of AR applications [15]. The HoloLens is a head-mounted display unit attached to an adjustable, cushioned inner headband that allows the display to be tilted up and down as well as moved forward and backward.
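Kinect Fusion itself ships with the Microsoft Kinect SDK, but the same idea, fusing a stream of depth frames into a truncated signed distance field (TSDF) and extracting a surface mesh, can be sketched with the open-source Open3D library. The frame list, camera poses and intrinsics below are placeholders; KinectFusion additionally estimates the camera pose on the fly, which is assumed as given here.

```python
# Rough open-source analogue of KinectFusion-style reconstruction using Open3D:
# fuse registered color/depth frames into a TSDF volume and extract a mesh.
import numpy as np
import open3d as o3d

volume = o3d.pipelines.integration.ScalableTSDFVolume(
    voxel_length=0.01,                 # 1 cm voxels
    sdf_trunc=0.04,
    color_type=o3d.pipelines.integration.TSDFVolumeColorType.RGB8)

# PrimeSense/Kinect-style default intrinsics; replace with calibrated values.
intrinsic = o3d.camera.PinholeCameraIntrinsic(
    o3d.camera.PinholeCameraIntrinsicParameters.PrimeSenseDefault)

# frames: list of (color file, depth file, 4x4 camera-to-world pose) tuples.
# KinectFusion estimates these poses itself; here they are assumed known.
frames = [("frames/color_0001.png", "frames/depth_0001.png", np.eye(4))]

for color_file, depth_file, pose in frames:
    color = o3d.io.read_image(color_file)
    depth = o3d.io.read_image(depth_file)
    rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
        color, depth, depth_trunc=3.0, convert_rgb_to_intensity=False)
    volume.integrate(rgbd, intrinsic, np.linalg.inv(pose))  # expects world-to-camera

mesh = volume.extract_triangle_mesh()
mesh.compute_vertex_normals()
o3d.io.write_triangle_mesh("patient_model.ply", mesh)
```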
3.1. Smart education system for deaf-mute children
EduAR is a system that combines augmented reality with holograms and can be deployed in the field of education. AR adds to the reality that is ordinarily seen rather than replacing it. The ultimate goal of EduAR is to create a convenient and natural immersion so that what is learned becomes extremely easy to understand and interpret. Compared with previous years, education has expanded to an extent that makes it hard to follow and keep track of everything that is taught, and this is where EduAR helps. One of its distinctive features is the ability to ease learning for deaf-mute students: using their vision and EduAR, these students can understand concepts as easily as any other student. Consider a biology class. With EduAR, holographic images of the brain are used to explain its minute and intricate parts, which are difficult for teachers to describe from textbooks alone. EduAR thus helps students understand visually how the brain works, which can later help them perform better during operations. It also enables medical students to acquire knowledge and understanding of the human body through interaction within a virtual environment in which no patients are at risk. This can further be extended to other streams whose material is usually difficult for a student to interpret. Teaching and learning have to be much more than a simple process of knowledge transfer by memorizing; EduAR improves the student's learning experience and thereby develops the interests of the individual.

Many labs that have 3D models of brains or hearts do not give much information about the deeper layers of the organ, and providing that will be one of the major advantages of using EduAR. Dynamic labeling is another asset: teachers can modify or add labels for parts as and when required, rather than buying a new 3D model every time. Thus, EduAR focuses on transforming the overall education experience.

3.2. E-learning based virtual car driving system
Simulations of real-world scenarios can be achieved with a high degree of realism using virtual reality. This characteristic of virtual reality environments is an advantage for e-learning, where solutions to real-world problems can be learned virtually. Gesture recognition is another technology that has seen a steady increase in popularity as a method of human-computer interaction: specific gestures can be translated into specific tasks, and with hand-gesture detection devices like Leap Motion, using gestures to communicate with an environment has become simple and usable (a small sketch of such a gesture-to-control mapping is given at the end of this section). Combining such technologies with the learning process provides attractive methodologies that can be leveraged to simplify learning. After the VIRECAR virtual simulator system was implemented, it was analyzed with feedback from a sample of 20 licensed drivers and 20 new learners. The e-learning process in VIRECAR is shown in Figure 2.

Figure 2. E-learning process in VIRECAR
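The gesture-to-control translation mentioned above can be sketched as a simple dispatch table from recognized gesture labels to car-control commands. The gesture names and control actions below are hypothetical; in VIRECAR the labels would come from the hand-tracking device (for example a Leap Motion controller) rather than from hard-coded strings.

```python
# Minimal sketch: map recognized hand-gesture labels to simulator controls.
# Gesture labels and control actions are hypothetical placeholders.
from typing import Callable, Dict

def steer(direction: str) -> None:
    print(f"steering {direction}")

def pedal(name: str, amount: float) -> None:
    print(f"{name} pressed to {amount:.0%}")

GESTURE_TO_ACTION: Dict[str, Callable[[], None]] = {
    "palm_tilt_left":  lambda: steer("left"),
    "palm_tilt_right": lambda: steer("right"),
    "fist_closed":     lambda: pedal("brake", 1.0),
    "palm_forward":    lambda: pedal("accelerator", 0.5),
}

def handle_gesture(label: str) -> None:
    """Dispatch a recognized gesture to the corresponding simulator control."""
    action = GESTURE_TO_ACTION.get(label)
    if action is None:
        print(f"ignoring unrecognized gesture: {label}")
    else:
        action()

# Example: labels as they might arrive from the hand-tracking device.
for observed in ["palm_tilt_left", "palm_forward", "wave"]:
    handle_gesture(observed)
```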
The learning experience obtained from VIRECAR is explained in three stages in this paper: the first stage corresponds to the pre-class content of e-learning, the second stage to the in-class content, and the third stage to the post-class content, where these three stages of e-learning are explained in [5], [6].

In the first stage, Basics of Driving, the user is required to understand the road and safety regulations prescribed by the Road Transport Office as well as the basic functions of the car. Towards the end of this phase, learners take a test evaluating their competence in the basics of driving a car.

In the second stage, Learning through VIRECAR, the simulator is used to understand how a car works and to get used to the driving environment. At this stage the concept of an instructor is introduced: the instructor checks in at regular intervals, evaluates the learner's progress and provides feedback whenever necessary.

The third stage, Actual Driving, can also be described as a post-class exercise: the learner drives an actual car based on what was learned in VIRECAR. In this phase the instructor is present full time, which allows continuous feedback. The learner is expected to drive with the experience and feedback obtained in the second stage and the knowledge of the rules and regulations obtained in the first stage.

These three steps, taken for the first time, constitute the first iteration of the learning process. In later iterations the learner is not required to repeat the first stage, since mastering its concepts is the prerequisite for entering the second stage. Thus, a concept of back-propagation of feedback is used: based on the feedback from the third stage, the learner may have to gain more experience in the virtual environment before gaining it in the real world. Essentially, the learner is expected to experience every aspect of driving a car, including changing gears, using the accelerator, brake and clutch pedals, and steering, in a safe environment before driving on the road.

4. RESULTS AND DISCUSSION
To analyze the VIRECAR simulator, the system was tested with a sample of 12 licensed drivers, and its effectiveness was evaluated from their feedback. They were introduced to the whole system as described in Figure 2 and were then asked to try the simulator and answer a set of questions deemed vital for evaluating VIRECAR:
− How close to reality was it?
− How helpful was it in understanding the driving scenario?
− How useful was the system in learning rules and regulations?
− If you had not driven an actual car, would you prefer to learn using VIRECAR?
− Do you think you will be able to rectify any mistakes through VIRECAR?
Table 1 shows the consolidated data obtained from the testers of VIRECAR. The drivers gave average scores of 7.833 and 8.167 to the questions on understanding the driving scenario and the rules and regulations, as shown in Table 2. This suggests that phase one of VIRECAR should be made more stringent and that some elements of phase one should also be carried into phase two to provide a sense of continuity.

This paper discusses the use of virtual reality in an e-learning based car driving system; similar mobile-based learning systems could be developed with the learning platforms and frameworks discussed in [16]. It also discusses the use of augmented reality in the field of medicine, which helps patients obtain a second opinion from experienced doctors who can detect abnormalities even when they are not in the same geographical location; similar video-based abnormality detection is discussed in [17]. Both systems are useful to end users because they replace the original working setup while providing the same experience. In the case of the e-learning based car driving system there is no fuel wastage, no added traffic from novice drivers, and no additional pollution. Nowadays there is also research on replacing fuel-driven vehicles with chargeable electric vehicles [18]; such problems do not arise at all, since the driving system developed by the authors is fully based on virtual reality.

Table 1. Feedback acquired from 12 users of VIRECAR
User:  1  2  3   4  5  6  7  8  9  10  11  12  Mean
Q1:    7  8  9   6  7  7  8  6  7  7   8   7   7.250
Q2:    8  8  8   7  7  9  7  7  9  8   7   9   7.833
Q3:    9  8  9   7  9  8  9  7  8  9   7   8   8.167
Q4:    8  9  10  7  9  8  9  8  9  9   9   9   8.667
Q5:    Y  Y  Y   N  Y  Y  Y  N  Y  Y   Y   Y   -

Table 2. Average rating from 12 users of VIRECAR
Question   Average rating
Q1         7.250
Q2         7.833
Q3         8.167
Q4         8.667
Q5         11 - Yes; 1 - No
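For completeness, the mean ratings reported in Table 2 can be reproduced from the raw per-user scores in Table 1 with a few lines of Python; the scores below are simply transcribed from the table.

```python
# Recompute the mean ratings in Table 2 from the raw scores in Table 1.
scores = {
    "Q1": [7, 8, 9, 6, 7, 7, 8, 6, 7, 7, 8, 7],
    "Q2": [8, 8, 8, 7, 7, 9, 7, 7, 9, 8, 7, 9],
    "Q3": [9, 8, 9, 7, 9, 8, 9, 7, 8, 9, 7, 8],
    "Q4": [8, 9, 10, 7, 9, 8, 9, 8, 9, 9, 9, 9],
}

for question, values in scores.items():
    print(f"{question}: mean = {sum(values) / len(values):.3f}")
```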
5. CONCLUSION
Although augmented reality and virtual reality sound appealing on paper, there are several challenges in implementing such systems in practice, and they fall into two categories: technical and social. From the technical point of view, many components must work together for the system to function properly. Object recognition is the first step in any AR system; if the objects in the real world and the virtual world are not aligned with each other, the illusion of two coexisting worlds breaks down. The quality of the components is therefore an important factor, and problems such as inaccurate sensor readings or poor pattern recognition can greatly affect the accuracy of the system. From the social point of view, augmented reality is still at a nascent stage and remains widely unknown to the general public. Virtual reality likewise poses implementation problems, since it requires additional components and new technologies to be learned. There are many more applications in which both AR and VR could be used effectively in our day-to-day life; one such interesting application is a neonatal care training system, which could be developed with these technologies. In spite of the implementation issues, these technologies have the benefit of not putting anyone at risk during the initial learning process.

REFERENCES
[1] T. P. Caudell and D. W. Mizell, "Augmented reality: an application of heads-up display technology to manual manufacturing processes," in Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, pp. 659-669, 1992.
[2] J. White, D. C. Schmidt, and M. Golparvar-Fard, "Applications of Augmented Reality," Proceedings of the IEEE, vol. 102, no. 2.
[3] N. F. Saidin, N. D. Abd Halim, and N. Yahaya, "A Review of Research on Augmented Reality in Education: Advantages and Applications," International Education Studies, no. 13, 2015.
[4] C. Agarwal and N. Thakur, "The Evolution and Future Scope of Augmented Reality," International Journal of Computer Science Issues (IJCSI), vol. 11, no. 6, p. 1, 2014.
[5] U. Vladimir, "Smart Education and Smart e-Learning," Springer Publishing Company, Incorporated, 2015.
[6] R. Gleadow, "Design for learning - a case study of blended learning in a science unit," F1000 Research, vol. 4, no. 898, 2015.
[7] K. N. Senthil, "Enhancement of Skills through E-Learning: Prospects and Problems," The Online Journal of Distance Education and e-Learning, vol. 4, no. 3, pp. 24-32, 2016.
[8] C. Bouras, "An e-Learning Through Distributed Virtual Environments," Journal of Network and Computer Applications, vol. 24, pp. 175-199, 2001.
[9] S. Sannikov, F. Zhdanov, P. Chebotarev, and P. Rabinovich, "Interactive Educational Content Based on Augmented Reality and 3D Visualization," 4th International Young Scientists Conference on Computational Science, pp. 720-729, 2015.
[10] Y. Genc, S. Riedel, F. Souvannavong, C. Akinlar, and N. Navab, "Marker-less tracking for AR: a learning-based approach," Proceedings, International Symposium on Mixed and Augmented Reality, pp. 295-304, 2002.
[11] E. Ai-Lim Lee, "How does desktop virtual reality enhance learning outcomes? A Structural Equation Modeling Approach," Elsevier Computers and Education, vol. 55, no. 4, pp. 1424-1442, 2010.
[12] D. Georgios, "Investigating the educational value of social learning networks: a quantitative analysis," Interactive Technology and Smart Education, vol. 13, no. 4, pp. 305-322, 2016.
[13] M. Nguyen and A. Yeap, "StereoTag: A novel stereogram-marker-based approach for Augmented Reality," 2016 23rd International Conference on Pattern Recognition (ICPR), Cancun, pp. 1059-1064, 2016.
[14] R. A. Newcombe, S. Izadi, O. Hilliges, D. Molyneaux, D. Kim, A. J. Davison, P. Kohli, J. Shotton, S. Hodges, and A. Fitzgibbon, "KinectFusion: Real-Time Dense Surface Mapping and Tracking," IEEE International Symposium on Mixed and Augmented Reality 2011 Science and Technology Proceedings, 26-29 October, Basel, Switzerland, 2011.
[15] M. Garon, P. O. Boulet, J. P. Doironz, L. Beaulieu, and J. F. Lalonde, "Real-Time High Resolution 3D Data on the HoloLens," 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Merida, pp. 189-191, 2016.
[16] A. S. Bin Moh Noor, M. N. Y. Atoom, and R. Mamat, "A review of cloud oriented mobile learning platform and frameworks," International Journal of Electrical and Computer Engineering (IJECE), vol. 9, no. 6, pp. 5529-5536, 2019.
[17] A. Al-Dhamari, R. Sudirman, N. H. Mahmood, N. H. Khamis, and A. Yahya, "Online video-based abnormal detection using highly motion techniques and statistical measures," TELKOMNIKA, vol. 17, no. 4, pp. 2039-2047, 2019.
[18] J. Lee and G.-L. Park, "Design of a Monitoring-combined Siting Scheme for Electric Vehicle Chargers," International Journal of Electrical and Computer Engineering (IJECE), vol. 8, no. 6, pp. 5303-5310, 2018.