Slides talleres-interacion1

  1. WORKSHOP: AUDIOVISUAL LANGUAGES AND INTERACTIVE NARRATIVE
     • Nina Valkanova & Marco Romeo
  2. ABOUT NINA
     • Computer scientist and media designer
     • PhD candidate at the Interactive Technologies Group
     • Interested in interaction design for urban media displays
     • Teaching at the UPF
       • Taller de Sistemas Interactivos II (2009, 2010, 2011)
     • Contact
       • [email_address]
       • http://ninavalkanova.com
       • via the Moodle
  3. ROADMAP
     • THEORY – Concepts of Audio-Visual Language
     • THEORY – Hypertext, Interfaces, Interactive Narrative
     • WORKSHOP – Camera Interaction: Blob tracking, Color tracking, Fiducials
     • PRACTICE – Application of Concepts from Audio-Visual Language
  4. ABOUT THIS CLASS
     • Workshop on the creation of camera interactions
     • Practical exercises
     • Group work – 2 or 3 people
       • “Pair programming”
       • Form your group NOW and send us an email with your names TODAY
       • No FACEBOOK, please
     • Evaluation – next slide
  5. EVALUATION
     • Attendance – evaluated individually; 80% attendance required for the practice sessions (Marco) and the workshop (Marco and Nina)
     • Assignments – evaluated as a group
     • Assignments must be completed IN CLASS
     • Submission – via Moodle
  6. XBOX GAME
  7. COMPUTER VISION
     • Creating “seeing” applications
     • The application is fed a series of images (a video stream or a single image)
     • The images are analyzed for something particular (a face, light, color, an object) – a minimal sketch follows below
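     As a minimal illustration of this idea (not part of the original slides), the Processing sketch below grabs camera frames with JMyron and “analyzes” each one by computing its average brightness. It assumes the JMyron library from this workshop is installed and a camera is connected.

        import JMyron.*;                      // camera capture library used in this workshop

        JMyron m;                             // the camera

        void setup() {
          size(320, 240);
          m = new JMyron();
          m.start(width, height);             // capture at the sketch size
        }

        void draw() {
          m.update();                         // grab a new frame
          int[] img = m.image();              // raw pixel data of the frame

          // A trivial "analysis": the average brightness of the frame
          float sum = 0;
          for (int i = 0; i < img.length; i++) {
            sum += brightness(img[i]);
          }
          float avg = sum / img.length;

          // Show the camera image and visualize the measured value as a red bar
          loadPixels();
          arrayCopy(img, pixels);
          updatePixels();
          noStroke();
          fill(255, 0, 0);
          rect(0, height - 10, map(avg, 0, 255, 0, width), 10);
        }

        public void stop() {
          m.stop();                           // release the camera
          super.stop();
        }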
  8. APPLICATIONS
     • Robotics
     • Surveillance machines
     • Weapons systems
     • Games
     • Touchscreen interfaces
     • Interactive installations
  9. MINORITY REPORT (2002)
  10. FULL-BODY GAME
  11. M-TOUCH
  12. EXERCISE 1
     • Download the document “Processing_OpenCV_JMyron_PSEye.pdf” from the course page
     • Follow the installation instructions step by step
     • Make sure you have step 5 (the simple exercise) working – a sketch of what this roughly looks like follows below
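     The exact contents of step 5 are in the PDF and are not reproduced here. As a rough sketch of what such a “simple exercise” typically looks like (an assumption on our part), the code below captures frames with JMyron, hands them to the OpenCV library for Processing with the same opencv.copy() call used later in these slides, and displays the result.

        import JMyron.*;                      // camera capture
        import hypermedia.video.*;            // OpenCV library for Processing used in this workshop

        JMyron m;
        OpenCV opencv;

        void setup() {
          size(320, 240);
          m = new JMyron();
          m.start(width, height);             // start the camera at the sketch size
          opencv = new OpenCV(this);
          opencv.allocate(width, height);     // reserve an OpenCV buffer of the same size
        }

        void draw() {
          m.update();                         // grab a camera frame
          int[] img = m.image();              // JMyron pixel data
          // Hand the frame over to OpenCV (same call as in Exercise 2)
          opencv.copy(img, width, 0, 0, width, height, 0, 0, width, height);
          image(opencv.image(), 0, 0);        // display what OpenCV now holds
        }

        public void stop() {
          m.stop();
          super.stop();
        }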
  13. BLOB TRACKING
  14. BLOB TRACKING
  15. EXERCISE 2
     • Open the BLOBS example of the OpenCV library and have a look at it
     • Using the starting example from step 5 of the instructions, make the BLOBS example of the OpenCV library work with the image data obtained from JMyron
         opencv.copy(img, width, 0, 0, width, height, 0, 0, width, height);
  16. EXERCISE 2
     • Reminder from Exercise 1: this code makes the JMyron image data usable by OpenCV
         m.update();
         int[] img = m.image();
         loadPixels();
         opencv.copy(img, width, 0, 0, width, height, 0, 0, width, height);
         updatePixels();
     • Put this piece of code in a separate function called JMyron2OpenCV(int w, int h, JMyron buffer)
     • At the beginning of this function, put the following code, which clears the OpenCV buffer:
         int[] cleanPx = new int[w*h];
         opencv.copy(cleanPx, w, 0, 0, w, h, 0, 0, w, h);
     • A possible assembled version of the function is sketched below
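     Putting the two fragments from this slide together, one possible version of the helper (our reconstruction, not the official solution; it assumes a global OpenCV object named opencv, as in the step-5 sketch) looks like this:

        // Copies the current camera frame from JMyron into the OpenCV buffer.
        // w, h    – frame size (typically the sketch's width and height)
        // buffer  – the JMyron camera object
        void JMyron2OpenCV(int w, int h, JMyron buffer) {
          // Clear the OpenCV buffer first, as instructed on the slide
          int[] cleanPx = new int[w*h];
          opencv.copy(cleanPx, w, 0, 0, w, h, 0, 0, w, h);

          // Grab a fresh frame and hand it to OpenCV
          buffer.update();
          int[] img = buffer.image();
          loadPixels();
          opencv.copy(img, w, 0, 0, w, h, 0, 0, w, h);
          updatePixels();
        }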
  17. EXERCISE 2
     • The interesting part: where should you call the function JMyron2OpenCV?
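     One plausible answer (our sketch, not an official solution from the slides): call it at the top of draw(), so OpenCV always holds the latest frame before blobs() is asked to analyze it. This fragment assumes the setup from the step-5 sketch above; the blob parameters (minimum area, maximum area, maximum number of blobs) are arbitrary example values.

        void draw() {
          JMyron2OpenCV(width, height, m);    // refresh the OpenCV buffer with the newest frame

          image(opencv.image(), 0, 0);        // show the current frame

          // Detect blobs in the frame OpenCV now holds
          Blob[] blobs = opencv.blobs(20, width*height/4, 10, false);

          // Draw the bounding box of every detected blob
          noFill();
          stroke(255, 0, 0);
          for (int i = 0; i < blobs.length; i++) {
            Rectangle r = blobs[i].rectangle;  // bounding box (java.awt.Rectangle)
            rect(r.x, r.y, r.width, r.height);
          }
        }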
  18. EXERCISE 3
     • Control a video using camera interaction: extend your blob-tracking sketch as follows
       • Add a window showing video playback (similar to the exercise done with Marco; use the Loop video example)
       • Program an interaction that controls the playback of the video (stop / play) using the number of blobs detected by the camera – a possible sketch follows below
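     One way the finished sketch could look (a sketch under our own assumptions: the file name “myvideo.mov” is a placeholder for the movie from the Loop example, the blob parameters are arbitrary, and the JMyron2OpenCV helper from Exercise 2 is assumed to be in the same sketch):

        import JMyron.*;                      // camera capture
        import hypermedia.video.*;            // OpenCV for Processing (blob detection)
        import processing.video.*;            // Movie playback, as in the Loop example

        JMyron m;                             // camera
        OpenCV opencv;                        // computer-vision buffer
        Movie mov;                            // the video to control

        void setup() {
          size(320, 240);
          m = new JMyron();
          m.start(width, height);
          opencv = new OpenCV(this);
          opencv.allocate(width, height);
          // "myvideo.mov" is a placeholder – put the movie from the Loop example in the data folder
          mov = new Movie(this, "myvideo.mov");
          mov.loop();
        }

        void draw() {
          JMyron2OpenCV(width, height, m);    // helper from Exercise 2
          Blob[] blobs = opencv.blobs(20, width*height/4, 10, false);

          // Interaction rule (one possible choice): play while at least one blob
          // is visible, pause when the camera sees no blobs at all
          if (blobs.length > 0) {
            mov.play();
          } else {
            mov.pause();
          }

          image(mov, 0, 0, width, height);    // show the video
        }

        // Standard Movie callback: read each new frame as it becomes available
        void movieEvent(Movie video) {
          video.read();
        }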
  19. FOR NEXT TIME
     • Deadline: 1 June, 17:00
     • Submit to the Moodle a file with the commented code of “step 5: simple exercise”, explaining each step of the code
     • Submit to the Moodle your first blob-tracking sketch (Exercise 2)
     • Submit to the Moodle a ZIPPED archive containing your sketch for video interaction and blob tracking, plus the data folder with the video; use the Loop video example (Exercise 3)
