Sensory system for implementing a human–computer interface based on electrooculography



  1. This paper describes a sensory system for implementing a human–computer interface based on electrooculography. An acquisition system captures electrooculograms and transmits them through the ZigBee protocol. The data acquired are analysed in real time using a microcontroller-based platform running the Linux operating system. The continuous wavelet transform and a neural network are used to process and analyse the signals to obtain highly reliable results in real time. To enhance system usability, the graphical interface is projected onto special eyewear, which is also used to position the signal-capturing electrodes.
  2. The purpose of this research paper is to develop a system to capture and analyse EOG signals in order to implement an HCI, as shown in Figure 1. The system comprises two electronic modules: the signal Acquisition Module (AM) and the Processing Module (PM). Eyewear incorporating a set of appropriately positioned dry electrodes captures the EOG signals, which the AM acquires, digitizes and transmits using the ZigBee protocol. The PM receives the signals from the AM and executes the algorithms to detect the direction of the user's gaze. Simultaneously, it projects the user interface onto the eyewear and, according to the selection made by the user, transmits commands via WiFi to a home automation system or performs other tasks (e.g., calling a nurse).
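The PM's control flow described above can be sketched as a small event loop. The following is a minimal, hypothetical sketch: all names and the threshold value are invented for illustration, and the ZigBee receive and WiFi send steps are stubbed out with plain Python values and a callback.

```python
def classify_gaze(h, v, threshold=50):
    """Map a (horizontal, vertical) EOG deflection to a gaze direction.

    The threshold is an invented placeholder for the minimum detection
    level; deflections below it are treated as "centre" (no selection).
    """
    if abs(h) < threshold and abs(v) < threshold:
        return "centre"
    if abs(h) >= abs(v):
        return "right" if h > 0 else "left"
    return "up" if v > 0 else "down"


def process_packet(packet, send_command):
    """One iteration of the PM loop: classify the sample and act on it."""
    direction = classify_gaze(packet["h"], packet["v"])
    if direction != "centre":
        # In the real system this would go out over WiFi to the home
        # automation controller; here it is just a callback.
        send_command({"gaze": direction})
    return direction


# Stubbed demo: two samples "received over ZigBee".
sent = []
process_packet({"h": 120, "v": 10}, sent.append)  # strong rightward deflection
process_packet({"h": 5, "v": -8}, sent.append)    # below threshold: no command
```

The separation between classification and command dispatch mirrors the paper's split between gaze detection and the WiFi-facing home automation interface.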
  3. The electrooculogram is captured by five electrodes placed around the eyes. The EOG signals are obtained by placing two electrodes to the right and left of the eyes (A-B) to detect horizontal movement and another pair above and below the left eye (C-D) to detect vertical movement. A reference electrode is placed above the right eye (E). The eyewear has a composite video input (PAL format) and displays high-colour, high-contrast images at 320 × 240 resolution, equivalent to a 46-inch screen viewed at a distance of 3 metres.
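The two differential channels described above can be expressed in a few lines. This is an illustrative sketch, not the authors' code; electrode potentials are assumed to be given relative to a common ground, with E serving as the reference.

```python
def eog_channels(a, b, c, d, e):
    """Derive the horizontal and vertical EOG channels from five electrodes.

    A-B (right/left of the eyes) gives horizontal movement; C-D (above/below
    the left eye) gives vertical movement; E is the reference electrode.
    """
    horizontal = (a - e) - (b - e)  # the reference cancels: equals a - b
    vertical = (c - e) - (d - e)    # equals c - d
    return horizontal, vertical
```

Because both inputs of each differential pair are measured against the same reference, the reference potential cancels, which is what makes the dry reference electrode placement above the right eye workable.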
  4. The analogue signal-acquisition hardware includes two differential inputs (CH1, CH2), which are digitized by the internal ADC of the microcontroller (LPC1756, 12-bit resolution, sampling frequency of 100-300 Hz, adjustable in steps of 10 Hz) before being transferred via the ZigBee protocol. A two-channel amplifier was designed to acquire the bioelectric signals; each channel can be configured dynamically and individually (enabling the channel, adjusting its offset, sampling frequency or amplification gain) via commands transmitted over the ZigBee protocol.
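The per-channel configuration commands could be modelled as a small binary frame. The field layout below is purely illustrative (the paper does not specify the command format); only the 12-bit ADC resolution and the sampling-frequency constraint of 100-300 Hz in 10 Hz steps come from the text, and the 3.3 V reference is an assumption.

```python
import struct


def adc_to_volts(code, vref=3.3):
    """Convert a 12-bit ADC code to volts (assumed 3.3 V reference)."""
    return code * vref / 4095


def encode_channel_config(channel, enabled, offset_mv, fs_hz, gain):
    """Pack a hypothetical ZigBee configuration command for one channel."""
    if not (100 <= fs_hz <= 300 and fs_hz % 10 == 0):
        raise ValueError("sampling frequency must be 100-300 Hz in 10 Hz steps")
    # <BBhHB: channel id, enable flag, signed offset (mV),
    #         sampling frequency (Hz), gain code -> 7 bytes, little-endian
    return struct.pack("<BBhHB", channel, int(enabled), offset_mv, fs_hz, gain)
```

Validating the frequency on the sender side keeps malformed commands off the ZigBee link entirely.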
  5. The Processing Module receives the EOG signals through the ZigBee protocol, applies the appropriate algorithms to detect the movement of the user's eyes, displays the user interface and sends the corresponding commands over WiFi to the home automation system. The Processing Module is based on a system-on-chip (SoC), the OMAP3530, which includes an ARM Cortex-A8 core as well as a C64x+ DSP, reaching 720 MHz. It has 512 MB of RAM and 512 MB of flash memory, and provides a direct composite video output (compatible with both PAL and NTSC formats) that drives the Vuzix Wrap 230 eyewear.
  6. Figure 5 shows the processing performed on the digitized EOG signal. The processing consists of gain correction of the received signal, application of a linear eye model, blink detection, and a neural network trained on the user's signals. The Acquisition Module allows adjustment of the channel gain. The eye model calculates the ratio between changes in the EOG and eye movements, as well as the minimum detection threshold.
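The linear eye model, which relates EOG changes to eye movement and applies a minimum detection threshold, can be sketched as follows. This is an assumed formulation (a slope fitted through the origin by least squares, with the threshold rejecting sub-noise deflections), not the paper's exact calibration procedure.

```python
def fit_eye_model(eog_deltas, angles_deg):
    """Least-squares slope through the origin: degrees of gaze per ADC count."""
    num = sum(d * a for d, a in zip(eog_deltas, angles_deg))
    den = sum(d * d for d in eog_deltas)
    return num / den


def eog_to_angle(delta, slope, min_delta):
    """Apply the model; changes below the minimum threshold count as no movement."""
    return 0.0 if abs(delta) < min_delta else delta * slope


# Hypothetical calibration: the user fixates targets at known angles
# while the corresponding EOG deflections are recorded.
slope = fit_eye_model([100, 200], [10.0, 20.0])
```

Fitting through the origin reflects the assumption that zero EOG change corresponds to no eye movement once the channel offset has been corrected.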
  7. The blink detector identifies sequences of 2 or 3 consecutive blinks; blinks that coincide with eye movements are discarded, as can be seen in the figures. The eye-movement detector block determines the validity of the detected movements.
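A minimal sketch of this blink-command logic follows. The grouping gap and the event representation are invented; only the "2 or 3 consecutive blinks" rule and the discarding of blinks that coincide with eye movement come from the text.

```python
def detect_blink_commands(blinks, max_gap_ms=500):
    """Return the sizes of deliberate blink bursts (2 or 3 blinks each).

    blinks: list of (timestamp_ms, during_movement) tuples in time order.
    Blinks that coincide with an eye movement are discarded; the rest are
    grouped into bursts separated by more than max_gap_ms (assumed value).
    """
    valid = [t for t, moving in blinks if not moving]
    bursts, current = [], []
    for t in valid:
        if current and t - current[-1] > max_gap_ms:
            bursts.append(len(current))
            current = []
        current.append(t)
    if current:
        bursts.append(len(current))
    # Only double or triple blinks are treated as deliberate commands;
    # single blinks are assumed involuntary and ignored.
    return [n for n in bursts if n in (2, 3)]
```

Requiring multiple consecutive blinks is what lets the interface distinguish deliberate selections from the user's natural blinking.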
  8. The EOG signals are sampled 100 times per second. 1.68 ms are needed to compute the CWT, while the linear model takes 0.012 ms to detect a movement and determine its magnitude. Blink detection takes 0.26 ms. A delay of 250 ms is applied after a blink is detected. Signal propagation through the RBF network takes 8.52 ms. Finally, the eye-movement detector block requires 0.035 ms.
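These figures can be checked against the 10 ms budget that 100 Hz sampling imposes. The split below assumes (it is not stated explicitly above) that the RBF propagation runs only when a candidate event is classified, not on every sample; under that assumption the steady-state load fits the budget, while a sample that also triggers an RBF classification slightly exceeds one period.

```python
per_sample_ms = {            # stages assumed to run on every sample
    "CWT": 1.68,
    "linear model": 0.012,
    "blink detection": 0.26,
    "movement detector": 0.035,
}
per_event_ms = {"RBF propagation": 8.52}  # assumed to run per detected event

sample_period_ms = 1000 / 100  # 100 Hz sampling -> 10 ms per sample

per_sample_total = sum(per_sample_ms.values())              # 1.987 ms
worst_case = per_sample_total + sum(per_event_ms.values())  # 10.507 ms
```

The worst case (10.507 ms vs 10 ms) overruns a single period only on event samples, which the per-sample stages' large slack can absorb.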