Autonomous Band Project Writeup

Writeup for our Autonomous Band project. We created an artificial intelligence system that detects and parses large sheet music with an overhead camera and plays the music on xylophones with a series of synchronized robotic arms. See the website link in the writeup for video demonstrations and more information.

Autonomous Band
EECS 498: Autonomous Robotics Laboratory, University of Michigan
Robert Bergen, Mark Isaacson, Sami Luber, and Zach Oligschalaeger
http://eecs498teammusic.webs.com/

Overview
For our EECS 498 Autonomous Robotics Lab final project, we created an autonomous band consisting of four robotic arms and two xylophones. An overhead camera is used to detect music notes, which are parsed into music and sent to the robotic arm controller. The arm controller uses motion planning to direct the arms to play notes while avoiding arm collisions.

Autonomously reading and playing music is challenging because it requires building a robust music-reading system, careful synchronization between the music reader and the music player components, and motion planning between the arms to ensure notes are played smoothly and without arm collisions. As described in detail in the Music Reading section, our system continuously detects musical notes on the music board, allowing users to see their desired music change in real time on the GUI. However, to allow the user to set their desired song before playing it, the user controls when the new music is sent to the music player. At this point, the music player discards previously sent music and clears queued notes to accommodate the new music. As described in detail in the Music Playing section, the music player component schedules each of the available robotic arms to play notes, using motion planning to avoid arm collisions.

Music Reading

Representing Music
Our system strives to resemble real sheet music as closely as possible. Notes are represented by red circles, uniform in color and size. Users can place these notes as desired on the music board. Our system supports reading and playing all G-clef notes (middle C through high G).

Camera Calibration
Our camera calibration works by first detecting the four corners (marked by blue squares) of the music board; the system pauses if more or fewer than four corners are detected. Using the locations of the four corners, a homography is built to project notes from camera pixel space to physical music board space, with respect to the center of the music board. We then use knowledge of the pre-defined line and stanza spacing to determine the note value and time of each music note based on its physical (x, y) position on the music board.

Detecting Notes
When detecting music, each stanza is scanned separately for music notes. We detect music notes using a blob detection algorithm. The algorithm applies a color threshold for notes, with colors represented in HSV (more robust in environments with varying lighting). The blob detection is also blob-size sensitive: there is a threshold range on how many pixels a blob must contain to be considered a note. If a note blob is larger than expected, we assume there are overlapping notes on the music board being detected as one large note blob. Using the covariance matrix of the overlapping note blob and the expected size of a single note, we determine the configuration of the overlapping notes and how many notes the blob should be broken into. The round shape of the notes is also important, as it improves the accuracy of our variance calculation when building the covariance matrix.
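The corner-to-board projection described above can be sketched as a standard direct linear transform. This is an illustrative reconstruction, not our original code, and the corner coordinates in the test pairing below are placeholders.

```python
import numpy as np

def homography_from_corners(pixel_corners, board_corners):
    """Build the 3x3 homography mapping camera pixel coordinates to
    physical music board coordinates from the four detected corner
    correspondences (direct linear transform)."""
    A = []
    for (x, y), (u, v) in zip(pixel_corners, board_corners):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A, recovered via SVD.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, x, y):
    """Project one detected note blob centroid into board space."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```

With the four corner pixels paired to the board's known physical corner positions, `project` yields the (x, y) board location from which a note's value and time are then read off.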
Once a note is detected, the note blob is projected from camera pixel space to physical music board space. The note's value and time (within an eighth count from the beginning of the stanza) are determined by a position threshold on the (x, y) physical location of the note on the music board. The time of the note is "snapped" (rounded) to the nearest eighth count so that the music flows more smoothly.

Finally, because our music reader actively reads music in real time, our system is sensitive to any changes made to the note configuration on the music board. The blob detection algorithm runs continuously and updates its parsed music based on any changes to the note configuration.

Parsing Music
Once all notes in each stanza are detected, we perform auto-spacing on the notes for better-sounding music. More specifically, if there are three or four notes in a stanza, regardless of the snapped times for each note, the note times are adjusted so that they are spread evenly across the stanza.

The music speed is also automatically adjusted for the robots' physical limits. In other words, the music is slowed down enough to allow the robot arms ample time to play each note. Using the system's GUI, the user can also further slow down the music.

GUI and Music Transmission
As the music reader dynamically detects music, a GUI shows the user the currently detected music notes, a camera feed of the actual music board, and the music currently being played on the robot arms. This allows the user to adjust the notes on the music board without interrupting the currently playing music and then, once satisfied with the new music, send it to the robot arms to be played. Upon sending the new music to the music player component, a clear message is sent that invalidates previously sent music notes and clears the queue of notes waiting to be played (allowing these notes to be replaced with the updated music).
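The snapping and auto-spacing rules described under Parsing Music above might look like the following sketch; the function names and time units here are ours, chosen for illustration.

```python
def snap_to_eighth(t, eighth):
    """Round a note's time offset (seconds from the start of the
    stanza) to the nearest eighth count."""
    return round(t / eighth) * eighth

def auto_space(times, stanza_len):
    """If a stanza holds three or four notes, spread them evenly
    across the stanza regardless of their snapped times; otherwise
    leave the snapped times alone."""
    n = len(times)
    if n in (3, 4):
        return [i * stanza_len / n for i in range(n)]
    return times
```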
An adjustable bar on the GUI also allows the user to change the speed of the music.
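The clear-and-replace handoff to the music player can be sketched as below; the message fields and class name are illustrative, not our actual LCM type.

```python
from collections import deque

class NotePlayer:
    """Illustrative player-side queue: a clear flag invalidates
    previously sent notes before the newly parsed song is enqueued."""

    def __init__(self):
        self.queue = deque()

    def on_music_message(self, msg):
        if msg.get("clear", False):
            self.queue.clear()           # drop stale notes from the old song
        self.queue.extend(msg["notes"])  # append the updated music
```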
Music Playing

Overview
The music actuation component of the Rage with the Machine project featured four robotic arms autonomously playing xylophones in conjunction and communication with the vision component. Our system managed the planning, timing, and mechanical aspects of playing a desired piece of music and was written to be expandable within our resource budget.

Planning
Our planner served as a scheduler for our robotic arms. Each arm had its availability to play a note mapped out in what we called a Planner Line: a sequence of actions to perform, each of which kept track of its start, key-hit, and completion times, which we calibrated with trial data from the mechanical component of the project (see below). The planner would receive the sequence of notes to play in an LCM (Lightweight Communications and Marshalling) message and distribute them to the Planner Lines via our arm allocation algorithm, either by appending to the existing song or by erasing and starting anew. As we continued to work, our algorithm grew more sophisticated to achieve our definition of optimality: maximizing the number of notes we could play in a given period of time. We implemented this in the following growth stages:

1. Query for the first free arm and use it to play the next note.
2. Query for all free arms and choose the one closest to the next note to play.

3. To improve demoability by increasing the number of occupied arms in a given period of time, we modified our algorithm to query for all free arms, narrow to those tied for being closest to the next note, and finally choose the arm to use at random from that list.

4. Use method 3 as a quick and almost always successful greedy choice, but upon failure to add a note, achieve optimality by redistributing future notes via a branch-and-bound family algorithm.

The planner would then maintain a thread that queried the Planner Lines to determine whether they should play the note sitting at their individual playheads and signal the proper arm's state machine to do so.

Arm Interface
Our arms were managed via an Arm class, which was responsible for knowing the configuration-space locations of positions corresponding to hitting every key on our xylophones, as well as positions directly above them, and for conducting smooth transitions between them. It also abstracted the mirroring of arms and their various positions from one side of a xylophone to the other.

Making these movements smooth and reliably accurate took a number of design features. We had to establish a procedure that moved our arms one joint at a time, in specific and varying orders, to prevent the collisions and unexpected behavior that arose from simply ordering the arm from place to place; this practice unfortunately increased the delay between playing notes consecutively.
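Stage 3 of the allocation algorithm can be sketched as follows, with each arm reduced to a one-dimensional key position and a free-at time; this is a simplification of the real Planner Line bookkeeping, and the dictionary fields are ours.

```python
import random

def choose_arm(arms, note_key, now):
    """Stage-3 allocation: among arms free at `now`, pick uniformly at
    random from those tied for smallest distance to the note's key."""
    free = [a for a in arms if a["free_at"] <= now]
    if not free:
        return None  # greedy step failed; stage 4 would reschedule here
    best = min(abs(a["position"] - note_key) for a in free)
    tied = [a for a in free if abs(a["position"] - note_key) == best]
    return random.choice(tied)
```

Randomizing among tied arms keeps more arms occupied over a song than always taking the first match, which was the point of this stage.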
To achieve accuracy, we implemented extremely tight thresholds on what it meant for an arm joint to have arrived at a location.

The above system performed extremely well after we tuned what it meant to be "above" a key to decrease playing delays. However, our system encountered servo overloads while attempting to sustain positions above keys at the edge of our arms' safe operating range, which required a combination of tactics to solve. We determined that by lowering a servo's maximum operating torque, we could allow it to last longer under stress, and we wrote our code to do this whenever an arm was idle for a sufficiently long period of time. To achieve the best demoability and reliability for our system, we further determined that it would be safest not only to lower torque after long idle periods, but also to move the arm to a known safe position, above
the middle xylophone key. All of these tactics for sustainability were conducted blind to the planner, abstracted away, and allowed for new instructions at a moment's notice.

Mechanics
The portion of the system we had the greatest difficulty refining was our mechanical interface with the xylophones. In the course of testing, we determined that in order to produce a sound from a xylophone, it was not enough to merely hit a note and release quickly; the striker had to pivot while doing so. After several design iterations and trials, we developed a rig consisting of a 3" x ¾" hex bolt with one degree of freedom for striking with a pivot, producing the appropriate ring from our xylophones.

Once the rig was constructed, we also had to ensure that we could actuate the servos so as to strike the key as intended. This required moving several joints in quick succession, was a process we changed several times over the course of development, and had the undesirable quality of often needing to be manually fine-tuned for specific keys rather than derived from a mathematical model. When we had settled on the motion of the arm, we ran a calibration program that stepped through every possible motion of the arm between our pre-programmed positions and recorded the times taken over several trials for use in our planner.

Success
We view our system as a success. We were able to completely abstract the playing of music in a reliable planner that achieved optimal playing capacity. We were able to play well-known songs at a reasonable tempo with a relatively small number of robotic arms. The limitations of our system were due either to our resource budget or to being pressed for time. While our program could accommodate any number of robotic arms (and therefore play more complex and interesting music), we
were limited by the number available, the space required for each arm (4 sq. ft.), and the number of USB ports and AC power sockets (one per arm). We could have further improved our system by reducing delays between notes, but were constrained by the time it would have taken to find positions and movement patterns, starting from closer above each note, that still produced quality sound; in the same way, we were limited in reducing servo overload issues by decreasing the operating radius of our arms. These known issues stated, we did manage to address both in software, by making our planning algorithm optimal and having our arms employ effective safety protocols, and as such these issues were resolved by demonstration day, and our system was a success.
