SWARM Extreme
Transcript

  • 1. SWARM Extreme
    Baturalp Torun
    baturalp@gmail.com
    Jitendra Bothra
    jitendrabothra@gmail.com
    Course: CS7780 - Special Topics in Networks
    Guide: Prof. Guevara Noubir (noubir@ccs.neu.edu)
    College of Computer and Information Science
    Northeastern University
    April 2011
  • 2. Agenda
    Hardware Used
    A.R. Drone
    Emotiv EPOC
    Similar Works
    Our Objective
    Approach
    Design
    Problems
    Future Enhancements
    Conclusion
  • 3. A.R. Drone Technical Details
    Embedded computer system
    ARM9 processor, 128MB RAM, Wi-Fi b/g, USB, Linux OS
    Inertial guidance systems
    3-axis accelerometer
    2-axis gyrometer
    1-axis yaw precision gyrometer
    Specs:
    Speed: 5 m/s (18 km/h)
    Weight: less than 1 pound
    Flight time: ~12 min
    Ultrasound altimeter
    Range: 6 meters – vertical stabilization
    Camera
    Vertical high speed camera: up to 60 fps – allows stabilization
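    The drone is driven over its Wi-Fi link with plain-text AT commands sent by UDP. A minimal takeoff/land sketch, assuming the stock firmware's default address (192.168.1.1), its AT command port (5556), and the commonly documented REF bit layout:

        import socket
        import time

        DRONE_IP = "192.168.1.1"   # default address when joined to the drone's Wi-Fi network
        AT_PORT = 5556             # UDP port for AT commands

        # REF bitfield: bits 18, 20, 22, 24 and 28 are always set; bit 9 selects takeoff.
        REF_BASE = (1 << 18) | (1 << 20) | (1 << 22) | (1 << 24) | (1 << 28)
        REF_TAKEOFF = REF_BASE | (1 << 9)

        def send_at(sock, seq, name, args):
            """Format and send one AT command; the firmware expects increasing sequence numbers."""
            msg = "AT*%s=%d,%s\r" % (name, seq, args)
            sock.sendto(msg.encode("ascii"), (DRONE_IP, AT_PORT))
            return seq + 1

        if __name__ == "__main__":
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            seq = send_at(sock, 1, "REF", str(REF_TAKEOFF))   # take off
            time.sleep(5)
            seq = send_at(sock, seq, "REF", str(REF_BASE))    # land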
  • 4. Emotiv EPOC headset tech specs
    Based on EEG, 14 sensors – positioned for accurate spatial resolution
    Facial expression detection is very fast (<10 ms)
    The wireless chip is proprietary and operates in the 2.4 GHz band
    Hacked for use via Python (Emokit):
    https://github.com/daeken/Emokit/blob/master/Announcement.md
    https://github.com/daeken/Emokit
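    Emokit exposes the decrypted sensor stream to Python. A rough polling sketch; the class and attribute names used here (Emotiv, dequeue, gyro_x/gyro_y) are assumptions about that library's interface and may need adjusting to the installed version:

        from emokit.emotiv import Emotiv   # https://github.com/daeken/Emokit

        headset = Emotiv()                 # assumption: the constructor opens the USB dongle
        while True:
            packet = headset.dequeue()     # assumption: returns one decrypted sample, or None
            if packet is None:
                continue
            # The raw gyro channels give a usable head-movement signal even before
            # any cognitive-suite training has been done.
            print(packet.gyro_x, packet.gyro_y)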
  • 5. Similar Works
    http://dsc.discovery.com/videos/prototype-this-mind-controlled-car.html
    http://www.autonomos.inf.fu-berlin.de/subprojects/braindriver
    http://sensorlab.cs.dartmouth.edu/pubs/neurophone.pdf
    http://www.engadget.com/2011/02/19/german-researchers-take-mind-controlled-car-for-a-carefully-cont/
  • 6. Our Goal
    Our goal is to control the A.R. Drone with thoughts via the Emotiv EPOC
    Control the A.R. Drone from a computer
    Get the commands from the Emotiv EPOC and process them
    Design an architecture that connects both and can be extended to incorporate multiple devices
    Establish the connections and fine-tune the data for smooth control
  • 7. Our Approach
    Map headset signals to reasonable commands (sketched below)
    Create a channel between the headset interface commands and the A.R. Drone
    client/server architecture
    allows us to control multiple A.R. Drones remotely
    programs can be extended to run in different environments
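    A minimal sketch of the mapping step; the event names on the left stand in for whichever cognitive or expressive actions are trained on the headset, and the command strings on the right are placeholders for the client protocol (both illustrative, not the project's actual tables):

        # Illustrative mapping from trained headset events to drone commands.
        COMMAND_MAP = {
            "push":  "forward",
            "pull":  "backward",
            "lift":  "takeoff",
            "drop":  "land",
            "smile": "hover",      # expressive (facial) events can be mapped as well
        }

        def map_event(event_name, default="hover"):
            """Translate a detected headset event into a drone command string."""
            return COMMAND_MAP.get(event_name, default)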
  • 8. Design (Server)
    [Server-side block diagram: the Emotiv EPOC headset and the Emotiv-provided components (Emotiv Connect, EmoComposer, EmoEngine core) feed the server, which holds the configurations and mappings]
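    A rough illustration of the server box above (not the project's code): poll the Emotiv side, apply the mappings, and stream the resulting command to a connected SWARM client. The port number and the newline-delimited string protocol are assumptions:

        import socket

        HOST, PORT = "0.0.0.0", 7777     # assumed listening address for SWARM clients

        def serve(next_event, map_event):
            """Accept one client and stream mapped commands to it.

            next_event: hook returning the latest headset event name (Emotiv side);
            map_event: the event-to-command mapping function sketched earlier.
            """
            srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind((HOST, PORT))
            srv.listen(1)
            conn, _addr = srv.accept()
            while True:
                command = map_event(next_event())
                conn.sendall((command + "\n").encode("ascii"))   # newline-delimited (assumed)

        # Example: serve(next_event=lambda: "lift", map_event=map_event)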
  • 9. Design (Client)
    [Client-side block diagram: the SWARM client (updates every 500 ms) drives a Win32 virtual input device (updates every 20 ms), which feeds the A.R. Drone controller and its buffer, and finally the quad-copter]
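    The two update rates above suggest a buffered two-rate loop on the client: pull new commands from the server roughly every 500 ms, and re-issue the buffered command to the drone every 20 ms so control input stays fresh. A sketch under those assumptions (the server address and the command-sending hook are placeholders):

        import socket
        import threading
        import time

        latest_command = "hover"            # shared buffer between the two update rates
        lock = threading.Lock()

        def receive_commands(server_addr=("127.0.0.1", 7777)):
            """Pull newline-delimited commands from the server at the client rate (~500 ms)."""
            global latest_command
            sock = socket.create_connection(server_addr)
            buf = b""
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                buf += data
                while b"\n" in buf:
                    line, buf = buf.split(b"\n", 1)
                    with lock:
                        latest_command = line.decode("ascii").strip()
                time.sleep(0.5)

        def drive_drone(send_command):
            """Re-issue the buffered command every 20 ms; send_command is a placeholder hook
            that turns a command string into AT packets (e.g. via the send_at sketch above)."""
            while True:
                with lock:
                    command = latest_command
                send_command(command)
                time.sleep(0.02)

        if __name__ == "__main__":
            threading.Thread(target=receive_commands, daemon=True).start()
            drive_drone(lambda cmd: print("->", cmd))   # stand-in for the real AT sender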
  • 10. Problems
    The Emotiv SDK is platform dependent
    The headset sends many signals
    States change very rapidly, which causes noisy intermediate states
    Training requires focus and is not interchangeable from person to person
    There is no universal training method that gives the same results
  • 11. Future Enhancements
    Current System:
    Enhance the system to connect with multiple clients
    Enable the system to work remotely over the Internet
    The client could be made more intelligent in order to handle emergency situations
    Long Term:
    The technology could be used to control devices we use in our daily routine, such as cars, phones, and other electronics
    In the long run, EEG devices could be improved to a level where controlling devices becomes as natural as controlling one's own body parts
  • 12. Conclusion
    We are able to fully control the A.R. Drone, at first using facial expressions and the headset gyro, and later using only cognitive commands. Given time, this system could be further enhanced to control multiple devices simultaneously with higher accuracy.
    The available technology for reading and processing thoughts is good enough to control a system with a limited command set, but it needs a lot of improvement before it can be used for complex systems.
    Great learning experience
