Eye Tracking Based Human-Computer Interaction - Sharath Raj
This presentation aims at explaining how eye tracking works and how the Hough Circle Detection algorithm is used to detect the iris.
https://www.picostica.com
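As a sketch of the circle-detection step mentioned above, here is a minimal circular Hough transform in pure NumPy. All parameters and the helper name are illustrative; production code would typically run OpenCV's `cv2.HoughCircles` on an edge map of the eye image instead.

```python
import numpy as np

def hough_circles(edge_points, shape, radii, n_thetas=100):
    """Minimal circular Hough transform: every edge pixel votes for all
    centre/radius combinations that could explain it; the accumulator
    peak is the best-supported circle (e.g. the iris boundary)."""
    H, W = shape
    radii = list(radii)
    acc = np.zeros((H, W, len(radii)), dtype=np.int32)
    thetas = np.linspace(0.0, 2.0 * np.pi, n_thetas, endpoint=False)
    for y, x in edge_points:
        for ri, r in enumerate(radii):
            # Candidate centres lie on a circle of radius r around the edge pixel.
            cy = np.round(y - r * np.sin(thetas)).astype(int)
            cx = np.round(x - r * np.cos(thetas)).astype(int)
            ok = (cy >= 0) & (cy < H) & (cx >= 0) & (cx < W)
            np.add.at(acc, (cy[ok], cx[ok], ri), 1)  # accumulate votes
    cy, cx, ri = np.unravel_index(int(acc.argmax()), acc.shape)
    return cy, cx, radii[ri]
```

Given an edge map from, say, a Canny detector, the returned (cy, cx, r) locates the strongest circular boundary, such as the iris.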
Imagine being an intelligent, motivated, working person in the fiercely competitive information technology market, with just one problem: you can't use your hands, or you can't speak. How do you do your job? How do you stay employed? You can, thanks to a remarkable gift from the computer industry: the Eyegaze, a communication and control system you run with your eyes. In humans, gaze direction and ocular behaviour are probably among the first distant means of communication developed. Parents often try to understand what their baby is looking at, deducing that the observed object attracts his or her interest. This ability to interact with someone through a transitional object is called joint attention.
The Eyegaze System is a direct-select, vision-controlled communication and control system. It was developed in Fairfax, Virginia, by LC Technologies.
The Virtual Dimension Center (VDC) Fellbach has compiled the state of the art, the market situation, and the areas of application of the technology field "Eye Tracking" in a whitepaper.
This presentation is based on a poster presented at the 2008 PBIRG conference in Washington, D.C.
It demonstrates how we used only eye-gaze information to improve critical metrics in an ad that are related to later recall. Once the most important element within an ad is determined, we measure two critical metrics: "time to first fixation" and "total gaze duration".
Eye tracking and its economic feasibility - Jeffrey Funk
These slides use concepts from my (Jeff Funk) course, Analyzing Hi-Tech Opportunities, to analyze how the economic feasibility of eye tracking technology is improving through improvements in infrared LEDs, micro-projectors, image sensors, and microprocessors. The capability to track eye movement can help us better identify tired drivers and equipment operators, understand the eye movements of retail shoppers, and develop better human-computer interfaces. Tired drivers and machine operators cause accidents, and these accidents lead to loss of human life and equipment damage. Retailers would like to better understand the eye movements of their customers in order to better design retail stores. Eye trackers would enable one type of human-computer interface, Google Glass, to understand the information that users are viewing and thus what they want to access.
Eye tracking is done with a combination of infrared LEDs, micro-projectors, image sensors, and microprocessors. All of these components are experiencing rapid improvements in cost and performance as feature sizes shrink and the number of transistors increases. Improvements in image sensors have led to higher accuracy and precision, where precision refers to consistency. Many of these improvements have come from the higher pixel densities and sampling frequencies of the image sensors; the latter enables tracking even when there are head movements.
These improvements have also led to lower costs, and cost reductions continue to occur. The cost of high-end eye tracking systems has dropped from about 30,000 USD in 2000 to 18,000 USD in 2010 and 5,000 USD in 2013. Further reductions will occur as Moore's Law continues and as higher volumes enable lower margins.
IJRET: International Journal of Research in Engineering and Technology is an international peer-reviewed online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together scientists, academicians, field engineers, scholars and students of related fields of Engineering and Technology.
Gesture recognition is a topic in computer science and language technology that interprets human gestures via mathematical algorithms.
Gestures can originate from any bodily motion or state but commonly originate from the face or hand.
Gesture recognition enables humans to communicate with the machine (HMI) and interact naturally without any mechanical devices.
GPS Enabled Android Application for Bus - eSAT Journals
Abstract: Nowadays, in this fast-paced life, everyone is in a hurry to reach their destination. Waiting for a bus is hectic, and many of us are unaware of bus timings; to overcome this difficulty, this study presents a GPS Enabled Android Application for a Bus Schedule System. The major feature of the Android application is that the user can know all the Nasik City bus timings. It offers several ways to find an available bus: by giving the bus number, by providing the source and destination, or by providing the bus stop. The second module captures the current latitude, longitude, and location of the user using the GPS facility available on the mobile device. The third module provides a security-call facility: if the user feels unsafe or a disaster occurs, a single button press notifies their close contacts with the current position. The application is developed on the Android and Java platforms using the Eclipse tool. Keywords: GPS, Google Android, Android SDK, Google API.
The basic idea behind the "Smart Web Cam Motion Detection Surveillance System" is to stop intruders from getting into places where high-end security is required. This paper proposes a method for detecting the motion of a particular object being observed. Motion tracking surveillance has gained a lot of interest over the past few years. The system provides relief from normal video surveillance, which involves a time-consuming reviewing process. Through the study and evaluation of products, we propose a motion tracking surveillance system with its own motion detection method and graphical user interface.
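The motion-detection core of such a system can be sketched as simple frame differencing. The code below is a NumPy-only illustration; the threshold values are assumptions, not figures from the paper.

```python
import numpy as np

def motion_mask(prev_frame, frame, threshold=25):
    """Absolute per-pixel difference between consecutive grayscale frames,
    thresholded into a binary motion mask (1 = changed pixel)."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

def motion_detected(prev_frame, frame, threshold=25, min_changed_ratio=0.01):
    """Flag motion when more than min_changed_ratio of pixels changed,
    which filters out isolated sensor noise."""
    return motion_mask(prev_frame, frame, threshold).mean() >= min_changed_ratio
```

A real system would add background-model updates and morphological noise filtering, but the trigger logic is essentially this comparison.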
Automated Driver Fatigue Detection and Road Accident Prevention System: an intelligent approach to solve a fatal problem. At least 4,284 people, including 516 women and 539 children, were killed and 9,112 others were injured in 3,472 road accidents across Bangladesh in 2017. Some of those accidents could have been avoided if proper systems had been in place at the time. This project focuses on creating a system based on EEG (electroencephalogram) and ECG (electrocardiogram) signals from the driver, which will alert the driver about drowsiness while driving.
For college-going students, company employees, or the common man, the bus is the most comprehensive and affordable mode of transport. Using the bus reduces private vehicle usage and thus reduces fuel consumption; it also curbs traffic congestion. The user usually wants to know the accurate arrival time of the bus, and long waits at bus stops discourage the use of buses. Many unpredictable factors also delay a bus's schedule, such as harsh weather and traffic conditions. Toward reducing this problem, we propose a project that will assist bus travellers in predicting bus timings. The system described uses a DriverSideApp, a ClientSideApp, and a server.
In this project, we propose an innovative method for predicting bus information. No specific device is required for this purpose. Our sole objective is to build an application that helps students access the current location of the bus. Going further, our application focuses on providing more convenience with bus schedules and bus location information so that students are not delayed. The recent advent and popularity of Android technology motivates us to create an Android application for the same.
In today's competitive environment, security concerns have grown tremendously. In the modern world, possession is said to be nine-tenths of the law. Hence, it is imperative to be able to safeguard one's property from worldly harms such as theft, destruction of property, and people with malicious intent. With the advent of technology, the methodologies used by thieves and robbers for stealing have been improving exponentially. Therefore, it is necessary for surveillance techniques to also improve with the changing world. With the improvement in mass media and various forms of communication, it is now possible to monitor and control the environment to the advantage of the owners of the property.
Real Time Eye Blinking and Yawning Detection - ijtsrd
Detecting eye blinks and yawning is important, for example in systems that monitor the vigilance of a human operator, e.g. driver drowsiness. Driver fatigue is one of the leading causes of the world's deadliest road accidents. This applies in particular to the transport sector, where a driver of heavy vehicles often faces hours of monotonous driving, which causes fatigue without frequent rest periods. It is therefore essential to design a road accident prevention system that can detect the driver's drowsiness, determine the driver's level of carelessness, and warn when an imminent danger occurs. In this article, we propose a real-time system that uses eye detection, blinking, and yawning techniques. The system is designed as a non-intrusive, real-time monitoring system; the priority is to improve driver safety without being intrusive. In this work, the driver's eye blinks and yawns are detected. If the driver's eyes remain closed for more than a certain time and the driver's mouth is open in a yawn, the driver is judged to be fatigued. Ohnmar Win, "Real Time Eye Blinking and Yawning Detection", published in International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN: 2456-6470, Volume-3, Issue-5, August 2019. URL: https://www.ijtsrd.com/papers/ijtsrd28004.pdf
Paper URL: https://www.ijtsrd.com/engineering/electrical-engineering/28004/real-time-eye-blinking-and-yawning-detection/ohnmar-win
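A common non-intrusive way to implement the blink side of such a system is the eye aspect ratio (EAR) over facial landmarks. The sketch below is illustrative only: the landmark ordering follows the widely used 68-point face model, and the threshold and frame count are assumptions, not values from the paper.

```python
from math import dist

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, ordered p1..p6 as in the
    common 68-point face model (p1/p4 horizontal corners, p2/p3 upper lid,
    p6/p5 lower lid). EAR drops sharply when the eye closes."""
    p1, p2, p3, p4, p5, p6 = eye
    vertical = dist(p2, p6) + dist(p3, p5)
    horizontal = 2.0 * dist(p1, p4)
    return vertical / horizontal

def drowsy(ear_series, threshold=0.21, min_consecutive=48):
    """Flag fatigue if EAR stays below threshold for min_consecutive
    consecutive frames (roughly 2 seconds at 24 fps)."""
    run = 0
    for ear in ear_series:
        run = run + 1 if ear < threshold else 0
        if run >= min_consecutive:
            return True
    return False
```

Yawning can be handled the same way with a mouth aspect ratio over the mouth landmarks, combined with the EAR signal before raising an alarm.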
Human Activity Recognition using Smartphone Sensors - Pankaj Mishra
Human activity recognition plays a significant role in the medical field and in security systems. In this project we have designed a model that recognizes a person's activity based on a smartphone.
Three-dimensional smartphone sensors, the accelerometer and gyroscope, are used to collect time-series signals, from which 26 features are generated in the time and frequency domains. The activities are classified using two different learning methods: the k-nearest neighbour algorithm and the decision tree algorithm.
7Cs of Learning Design: How it really happens - UNISA Benchmark Workshop - tbirdcymru
This description of practical 7Cs of Learning Design training was presented to delegates of the University of South Africa on 24 February 2014 at the University of Leicester.
The Digital Science Laboratory - using new technologies to improve teaching and learning in Secondary Science
Ed Walsh, Science Adviser, Cornwall Learning.
http://cornwalllearning.org
Developing Competitive Strategies in Higher Education through Visual Data Mining - ertekg
Download Link > https://ertekprojects.com/gurdal-ertek-publications/blog/visual-data-mining-for-developing-competitive-strategies-in-higher-education/
Information visualization is a growing field of computer science that aims at visually mining data for knowledge discovery. In this paper, a data mining framework and a novel information visualization scheme are developed and applied to the domain of higher education. The presented framework consists of three main types of visual data analysis: discovering general insights, carrying out competitive benchmarking, and planning for High School Relationship Management (HSRM). The framework and the square tiles visualization scheme are described, and an application at a private university in Turkey with the goal of attracting the brightest students is demonstrated.
User Experience Design and Usability Testing for Mobile Technology Support in... - Renée Schulz
This is the virtual presentation used at EduLearn21.
BLENDED & MOBILE LEARNING
Event: EDULEARN21
Track: Digital & Distance Learning
Session type: VIRTUAL
Abstract: https://iated.org/concrete3/view_abstract.php?paper_id=88226
Proceedings of EDULEARN21 Conference
5th-6th July 2021
ISBN: 978-84-09-31267-2
pages 1056-1066
STEM / STEAM - integrating into a master's program - Eileen O'Connor
Science, technology, engineering and mathematics (STEM), often enhanced with the arts (STEAM), has become an important interdisciplinary perspective that can be brought to education, business and community-based projects. This presentation highlights the theoretical and academic underpinnings of this approach and provides examples from work done within SUNY Empire State College's master's program in these areas.
Slides from Keynote Presentation by Janine Bowes. In this presentation Janine will explore the skills and attributes that an online teacher needs in the 21st century to stay on top of the game. In considering the past two decades of online learning, it is useful to note some underlying principles that are timeless but also to be open to new possibilities.
Leveraging Eye-gaze and Time-series Features to Predict User Interests and Bu... - Nelson J. S. Silva
We developed a new concept to improve the efficiency of visual analysis through visual recommendations. It uses a novel eye-gaze based recommendation model that aids users in identifying interesting time-series patterns. Our model combines time-series features and eye-gaze interests, captured via an eye tracker; mouse selections are also considered. The system provides an overlay visualization with recommended patterns and an eye-history graph that supports users in the data exploration process. We conducted an experiment with five tasks in which 30 participants explored sensor data from a wind turbine. This work presents results on pre-attentive features and discusses the precision and recall of our model in comparison to the final selections made by users. Our model helps users efficiently identify interesting time-series patterns.
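Set-based precision and recall of recommendations against final user selections can be computed as follows (the pattern identifiers are illustrative, not from the study):

```python
def precision_recall(recommended, selected):
    """Precision: fraction of recommended patterns the user finally selected.
    Recall: fraction of the user's selections that were recommended."""
    recommended, selected = set(recommended), set(selected)
    hits = recommended & selected
    precision = len(hits) / len(recommended) if recommended else 0.0
    recall = len(hits) / len(selected) if selected else 0.0
    return precision, recall
```

Averaging these two numbers over tasks and participants gives exactly the kind of model-versus-user comparison the abstract describes.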
Similar to 2021_03_26 "Eye-tracking techniques and methods in e-learning environments" - Aleksandra Klasnja Milicevic
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
UiPath Test Automation using UiPath Test Suite series, part 5 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series, part 5. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of a CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdf - Peter Spielvogel
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
Essentials of Automations: The Art of Triggers and Actions in FME - Safe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
DevOps and Testing slides at DASA Connect - Kari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference, 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We closed with a lovely workshop in which participants tried to find different ways to think about quality and testing in different parts of the DevOps infinity loop.
Climate Impact of Software Testing at Nordic Testing Days - Kari Kakkonen
My slides at Nordic Testing Days 6.6.2024
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be extended with sustainability and then measured continuously. Test environments can be used less, at a smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices desire to take full advantage of the features
available on those devices, but many of the features provide convenience and capability but sacrifice security. This best practices guide outlines steps the users can take to better protect personal devices and information.
Removing Uninteresting Bytes in Software FuzzingAftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing xml documents, and Binutil's readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format). Our preliminary results show that AFL+DIAR does not only discover new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
GridMate - End to end testing is a critical piece to ensure quality and avoid...ThomasParaiso2
End to end testing is a critical piece to ensure quality and avoid regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Communications Mining Series - Zero to Hero - Session 1DianaGray10
This session provides introduction to UiPath Communication Mining, importance and platform overview. You will acquire a good understand of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfPaige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party will share these foundational concepts to build on:
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
2021_03_26 "Eye-tracking techniques and methods in e-learning environments" - Aleksandra Klasnja Milicevic
1. Eye-tracking Techniques and Methods in e-Learning Environments
Aleksandra Klasnja-Milicevic
Department of Mathematics and Informatics, Faculty of Sciences, University of Novi Sad, Serbia
2. Agenda
1. The human eye and eye-tracking device
2. Eye-tracker: possible usages
3. Eye-tracking technology in digital learning environments
4. Assessing Learning Styles Through Eye Tracking
3. Eye-tracking technology
Eye-tracking technology is a set of methods and techniques used to discover, identify and record eye-movement activity.
Eye-tracking systems have confirmed their usefulness for:
• identifying behavioural responses,
• assessing cognitive load,
• providing an alternative means of human-computer interaction,
• informing interface design, and
• adapting the appearance of elements according to user data.
Incorporating eye tracking into adaptive e-learning systems is useful in the process of adapting to the requirements and needs of the learner.
4. Structure of the Human Eye
https://scienceeasylearning.wordpress.com/2015/05/27/structure-of-human-eye-and-its-working-and-defects-in-human-eye/
• Rays of light enter the cornea and are focused through the pupil onto the lens.
• From the lens, the light is transmitted through the vitreous humor to the retina.
• Perceived by the photoreceptors of the retina, the light signal is delivered to the brain by means of the optic nerve.
5. Structure of the Human Eye
There are two types of photoreceptors in the retina:
1. Cones:
• much less sensitive to light than rods
• do not saturate in bright light
• can discriminate colors (red, green and blue)
2. Rods:
• work at very low levels of light
• used for night vision
• do not contribute to color vision
https://scienceeasylearning.wordpress.com/2015/05/27/structure-of-human-eye-and-its-working-and-defects-in-human-eye/
7. Eye tracking: a short history and definitions
• The history of eye tracking started at the end of the 1800s.
• In 1879, Louis Émile Javal observed that reading does not involve a smooth sweep of the eyes along the text, as previously assumed, but a series of short stops (called fixations) and quick saccades. This observation raised important questions about reading, which were explored during the 1900s:
• On which words do the eyes stop?
• For how long?
• When does the gaze regress back to already-seen words?
8. Eye tracking: a short history and definitions
• Eye tracking is the process of measuring the motion of the eye relative to the head position. The device used to take these measurements is called an eye tracker.
• A fixation occurs when a person looks at a specific point for longer than 100 ms.
• Saccades are fast movements from one point to another (e.g. skipping a certain number of pixels in the vertical and horizontal directions on a screen).
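As a rough sketch of how the 100 ms fixation criterion above can be applied to raw gaze samples, here is a minimal dispersion-based detector. The sample format `(x_px, y_px, t_ms)` and the 50 px dispersion threshold are illustrative assumptions, not something stated in the slides:

```python
def detect_fixations(samples, max_dispersion=50, min_duration=100):
    """Dispersion-threshold fixation detection (I-DT style sketch).

    samples: list of (x_px, y_px, t_ms) gaze points, in time order.
    A fixation is a run of points whose combined x+y spread stays under
    max_dispersion pixels and that lasts at least min_duration ms
    (matching the 100 ms definition above).
    Returns a list of (center_x, center_y, duration_ms).
    """
    fixations = []
    start = 0
    while start < len(samples):
        end = start + 1
        # Grow the window while gaze points stay within the dispersion limit.
        while end < len(samples):
            xs = [s[0] for s in samples[start:end + 1]]
            ys = [s[1] for s in samples[start:end + 1]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            end += 1
        duration = samples[end - 1][2] - samples[start][2]
        if duration >= min_duration:
            cx = sum(s[0] for s in samples[start:end]) / (end - start)
            cy = sum(s[1] for s in samples[start:end]) / (end - start)
            fixations.append((cx, cy, duration))
            start = end  # continue after the fixation
        else:
            start += 1  # too short: slide the window forward
    return fixations
```

Anything between two detected fixations would then be treated as a saccade.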
9. Eye tracking: a short history and definitions
• An example of fixations and saccades over text.
• This is the typical pattern of eye movement during reading.
• The eyes never move smoothly over still text.
http://www.digitimes.com.tw/tw/B2B/Seminar/Service/download/053A410120/053A410120_R559FP5NJ1W8L7F4MCOL.pdf
10. Gazeplots and heat-maps
Eye-tracking technology has been a useful tool for evaluating how effectively websites communicate information.
In eye-tracking research related to HCI, it is also common to create gazeplots and heat-maps.
https://vwo.com/blog/heatmap-and-ux/
http://www.digitimes.com.tw/tw/B2B/Seminar/Service/download/053A410120/053A410120_R559FP5NJ1W8L7F4MCOL.pdf
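The slides do not give a heat-map recipe, but the usual idea is to accumulate fixation dwell time into a 2-D grid over the screen. A minimal sketch (the 20 px cell size and the fixation format are illustrative assumptions):

```python
import numpy as np

def heatmap(fixations, width, height, cell=20):
    """Accumulate fixation durations into a coarse screen grid.

    fixations: iterable of (x_px, y_px, duration_ms).
    Returns a 2-D array where each cell holds the total dwell time;
    a real heat-map would then smooth and color-map this grid.
    """
    grid = np.zeros((height // cell + 1, width // cell + 1))
    for x, y, dur in fixations:
        # Weight each cell by how long the gaze dwelled there.
        grid[int(y) // cell, int(x) // cell] += dur
    return grid
```

A gazeplot, by contrast, keeps the fixations as ordered circles (radius proportional to duration) connected by saccade lines.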
11. Kinds of Eye trackers
• Electric-potential-based tracking
• Eye-attached tracking
• Optical tracking
http://www.jessebandersen.com/2014/03/the-eye-tribe-development-kit-box.html
12. Eye-tracking system
http://www.tobii.com/group/about/this-is-eye-tracking
14. Eye-tracker: possible usages
• Assistive interfaces: e.g. web browsing and writing
• Ocular control: e.g. keyboard / mouse simulation
• Graphic tools: help in drawing activities
• Museums: access information directly with the eyes
• RSVP (Rapid Serial Visual Presentation): browsing high quantities of images
15. Eye-tracker: possible usages
01 Classifiers: they make it possible to "train" a statistical instrument so that, given input data, it can return answers about something.
02 Aviation applications: eye tracking has already been studied for flight safety by comparing scan paths and fixation durations to evaluate the progress of pilot trainees, to estimate pilots' skills, and to analyze a crew's joint attention and shared situational awareness.
03 E-Learning: trying to figure out whether the user is understanding something, e.g. from pupil dimension and saccadic movements.
04 Biometrics: for authentication / validation processes.
16. Enhancing E-learning system with Eye Tracking Technologies
❑ Observe users' learning behavior in real time by monitoring characteristics such as:
❑ objects and areas of interest,
❑ time spent on objects,
❑ frequency of visits, and
❑ sequences in which content is consumed.
❑ Research could focus on analyzing eye-movement patterns during learning and linking these patterns with cognitive processes.
17. Enhancing E-learning system with Eye Tracking Technologies
Determining a learner's mental state.
• Fixation duration and saccade length can indicate the learner's mental state: whether the learner is concentrating, stressed, or tired.
• The system can provide encouragement to learners, or even offer additional material to keep their attention.
• At the same time, the system can let the teacher know which parts of the lesson need to be improved.
18. Enhancing E-learning system with Eye Tracking Technologies
❑ Saccadic velocity tends to decrease with increasing tiredness and to increase with increasing task difficulty.
❑ Blink rate: decreasing blink velocity and a decreasing degree of eye openness may be indicators of increasing tiredness.
❑ Thus, if tiredness is identified, adaptive e-learning mechanisms could suggest optimized strategies, such as the best time to take a break.
19. Enhancing E-learning system with Eye Tracking Technologies
In an e-learning course concerned with Alexander the Great's conquest of Persia, a map of Alexander's advance in the region is shown. The map content is updated to match the text paragraph currently being read by the learner. In the example, the second paragraph ("Granikos") is being read, and the map shows Alexander's journey from Macedonia to Granikos (green, yellow and red areas indicate fixations and gaze duration).
https://old.eurodl.org
20. Enhancing E-learning system with Eye Tracking Technologies
Identifying important teaching areas.
• By tracking the number of fixations on each area of interest, a teacher can make sure that learners are paying attention to the most important areas.
• If that is not the case, the teacher can modify the material.
• Also, if a learner spends too much time in one area of interest, that area may be problematic, and a more detailed explanation or more related material should be included.
21. Enhancing E-learning system with Eye Tracking Technologies
❑ The system can direct the activities of learners and generate referral links and actions in the learning process.
❑ The system can identify where the gaze is held and offer a more detailed explanation of the term or concept in the form of a dictionary entry.
❑ The student can also choose a word whose details he or she wants to see (button 'Q').
❑ The system identifies and analyses learning sequences and suggests to learners an optimal sequence according to successful final-test results.
❑ It reports on progress, test results, coursework and the learners' own learning styles.
22. Enhancing E-learning system with Eye Tracking Technologies
Learning style identification.
• Currently, learners are required to fill in a questionnaire, and we have no way of knowing whether they did so honestly.
• By using eye-tracking technologies to identify learning styles (using a sample lesson), we can be sure that the learning style they are assigned is the one that suits them best.
• Another option would be modifying the style throughout the course progression.
23. Enhancing E-learning system with Eye Tracking Technologies
❑ The learning process will be improved, because the system will create or deliver adapted content by means of tracked statistical data, optimising material to an individual's needs
❑ (e.g. by delivering more images/tables to learners who have problems with large and complicated texts).
❑ If somebody prefers text and ignores pictures, the number of pictures presented could be reduced, and vice versa (learning style theory: automatic identification of learning styles with the assistance of an eye tracker).
24. Enhancing E-learning system with Eye Tracking Technologies
• taking into account the learner's learning type, perception, cognition, fields of interest and knowledge level,
• supplying personalized course content,
• achieving better-quality progress through the course,
• providing more details about the perceptive and mental processes of the learner,
• recognizing potential learner problems and providing recommendations for improvement and adaptation,
• uncovering the need for additional material.
25. Assessing Learning Styles Through Eye Tracking for E-Learning Applications
N. Nugrahaningsih, M. Porta, A. Klasnja-Milicevic, Mirjana Ivanovic
26. Assessing Learning Styles Through Eye Tracking
• Adapting the presentation of learning material to the specific student's characteristics is useful for improving the overall learning experience, and learning styles can play an important role in this.
• The study concerns the possibility of distinguishing between Visual and Verbal learning styles from gaze data.
28. Research question
RQ: Is it possible to distinguish Visual and Verbal learners from the features of their gaze behavior:
- percentage of fixation duration,
- percentage of fixations, and
- average fixation duration,
recorded by an eye tracker?
29. Participants
• 90 volunteer students participated in the experiment: 57 males and 33 females, 18 years old on average.
• All of them were freshman Computer Engineering students of the Informatics Department of the University of Palangkaraya and had not attended any computer programming course yet.
• No personal data were stored, as all participants in the experiment were identified anonymously through numbers.
• The participants did not receive any academic credit for participating in the experiments.
30. Participants
• Six of the 90 participants did not fill in the questionnaire completely.
• Another four participants failed the eye-tracking calibration procedure (which consists of fixating the center of a circle appearing at different positions on the screen).
• Moreover, 25 participants tried the test more than once, due to problems occurring in the data-recording phase.
• Thus, in the end, we decided to consider eye data only from the 55 reliably recorded participants.
31. Materials
• To record gaze data: an Eye Tribe ET-1000 eye tracker, with a 60 Hz data sampling rate.
• Stimuli were displayed on a 21.5'' monitor.
32. Procedure
Experimental Phase 1.
• "Traditional" approach: the participants were initially asked to complete the Index of Learning Styles (ILS) questionnaire.
Experimental Phase 2.
• Three days after Experimental Phase 1, an eye-tracking experiment was conducted.
• The participants were not informed that this trial was related to the questionnaire they had answered in Phase 1.
• A within-subjects experimental design was used, in which participants tried all the available conditions.
33. Experimental Phase 1.
• The ILS questionnaire is an instrument composed of 44 multiple-choice questions which aims to distinguish four bipolar styles:
• Active/Reflective (AR),
• Sensing/Intuitive (SI),
• Visual/Verbal (VV), and
• Sequential/Global (SG).
• There are two answers (a and b) for each question. In our study, the original questionnaire was translated into Indonesian.
34. Experimental Phase 1.
The resulting score for each scale is an odd number between 1 and 11:
• If the score is 1 or 3: the respondent is well balanced on the two dimensions of that scale.
• If the score is 5 or 7: the respondent has a moderate preference for one dimension of the scale.
• If the score is 9 or 11: the respondent has a very strong preference for one dimension of the scale.
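The scoring bands above can be sketched as follows. The function names and the a/b answer format are hypothetical; the absolute-difference computation reflects standard ILS scoring, with 11 questions per scale (44 questions / 4 scales):

```python
def ils_score(answers):
    """Compute one scale's score from its 11 a/b answers.

    answers: list of 'a'/'b' strings for that scale's questions.
    With 11 answers, the absolute difference |#a - #b| is always odd,
    giving the odd 1..11 score described above.
    """
    a = answers.count("a")
    return abs(a - (len(answers) - a))

def ils_preference(score):
    """Map a scale score to the preference bands from the slide."""
    if score in (1, 3):
        return "balanced"
    if score in (5, 7):
        return "moderate preference"
    if score in (9, 11):
        return "strong preference"
    raise ValueError("ILS scores are odd numbers between 1 and 11")
```

For example, 9 "a" answers against 2 "b" answers yields a score of 7, a moderate preference toward the "a" pole of that scale.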
35. Experimental Phase 2.
• The eye-tracking experiment was conducted in a quiet room, with artificial illumination from the ceiling.
• Each participant was seated about 55 cm from the monitor.
• The task was to read and try to understand the topics presented in a group of slides.
• No time limit was set for each slide, so the participants could learn at their own pace (a new slide was loaded by pressing the space bar). In total there were seven slides.
36. Experimental Phase 2.
01 The first slide contained a description of the task;
02 the second slide consisted of a graphical overview of the topics;
03 the third slide explained the basic notion of a variable;
04 the fourth slide presented the concept of an algorithm;
05 the fifth, sixth and seventh slides, respectively, covered the three basic imperative programming constructs, namely sequence, selection, and iteration.
38. Analysis of Eye Tracking Data and Results
• In each slide, we defined two AOIs: one for the text section and another for the picture region.
• Text and pictures alternated between the left and right sides of the slides.
39. Analysis of Eye Tracking Data and Results
• The independent variables of the eye-tracking study were the positions of the picture and text areas on the slides (left-right or right-left).
• The controlled variables were the textual and graphical contents displayed in the slides.
• The dependent variables, besides the questionnaire outcomes from Phase 1, were, in Phase 2:
• the percentage of fixation duration,
• the percentage of fixations, and
• the average fixation duration.
• Percentages were preferred to absolute values because the time spent on each slide differed between participants.
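The three dependent variables above can be computed per AOI along these lines. The rectangular AOI format and the function name are illustrative assumptions, not the paper's actual implementation:

```python
def aoi_metrics(fixations, aoi):
    """Per-AOI dependent variables, as defined in the study.

    fixations: list of (x_px, y_px, duration_ms) over one slide.
    aoi: (x0, y0, x1, y1) rectangle, e.g. the text or picture region.
    """
    x0, y0, x1, y1 = aoi
    inside = [f for f in fixations if x0 <= f[0] < x1 and y0 <= f[1] < y1]
    total_dur = sum(f[2] for f in fixations)
    in_dur = sum(f[2] for f in inside)
    return {
        # Share of total dwell time spent inside this AOI.
        "pct_fixation_duration": 100 * in_dur / total_dur,
        # Share of the fixation count that landed inside this AOI.
        "pct_fixations": 100 * len(inside) / len(fixations),
        # Mean duration of the fixations inside this AOI.
        "avg_fixation_duration": in_dur / len(inside) if inside else 0.0,
    }
```

Using percentages, as the slide notes, normalizes away the different per-slide reading times across participants.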
41. Analysis of Eye Tracking Data and Results
• For a temporal analysis of eye behavior, we subdivided the whole time spent by each participant on a slide into ten intervals.
Score distributions
• The scores obtained from the Felder-Silverman questionnaire were not evenly distributed.
• We grouped them based on the median (MED) and median absolute deviation (MAD) of the scores. Specifically, we identified three groups:
• Group 1, with score < MED − MAD: "more verbal than visual"
• Group 2, with score > MED + MAD: "more visual than verbal"
• Group 3, with score in the range (MED − MAD) to (MED + MAD): "between visual and verbal"
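The MED/MAD grouping rule above can be sketched as follows. The signed-score convention (negative toward Verbal, positive toward Visual) is an assumption made here for illustration:

```python
import statistics

def group_by_med_mad(scores):
    """Assign each Visual/Verbal score to Group 1, 2 or 3.

    Grouping follows the slide: scores below MED - MAD form Group 1
    ("more verbal than visual"), scores above MED + MAD form Group 2
    ("more visual than verbal"), everything in between is Group 3.
    """
    med = statistics.median(scores)
    mad = statistics.median(abs(s - med) for s in scores)
    groups = []
    for s in scores:
        if s < med - mad:
            groups.append(1)   # "more verbal than visual"
        elif s > med + mad:
            groups.append(2)   # "more visual than verbal"
        else:
            groups.append(3)   # "between visual and verbal"
    return groups
```

MED/MAD thresholds are a robust alternative to mean/standard-deviation cuts, which matters here because the slide notes the scores were not evenly distributed.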
42. Analysis of Eye Tracking Data and Results
• The answer to the research question of our study ("Is it possible to distinguish Visual and Verbal learners from the features of the gaze behavior recorded by an eye tracker?"):
• A relation between gaze behavior and learners' group could be found:
• for Group 1 (participants who were classified as more verbal than visual) and
• for Group 2 (participants who were classified as more visual than verbal),
• but not for Group 3 (participants who were classified as being between visual and verbal).
• Specifically, the percentage of fixation duration on the text area, computed up to intervals 9 and 10, gives clear information about the user's style group (Group 1 or Group 2).
• If most of the time (at least 90%) spent on the slide is evaluated, Visual/Verbal learners can be successfully recognized.
43. Analysis of Eye Tracking Data and Results
• Gaze data were coupled with the outcomes of the Index of Learning Styles (ILS) questionnaire.
• A connection with the Visual/Verbal learning style was found for a specific information layout, which gives a constructive contribution to the field of e-learning in general, and to the area of automatic learning-style assessment in particular.
• Exploiting eye tracking in this field is important because it can enable "intelligent" e-learning systems in which learning styles are assessed seamlessly.
44. Conclusions and Future work
• The automatic recognition of users' learning styles is a very important step towards intelligent adaptive learning platforms.
• To achieve an adaptive e-learning system, it is essential to monitor learner behavior dynamically in order to diagnose their learning style.
• Eye tracking can serve that purpose by investigating eye-gaze movement while the learner engages with the e-learning environment.
• It would also be useful to consider applying eye-tracking technology in combination with other biosensor systems.
• Additional tools and analytical data might reveal hidden patterns in user behavior and activities.