2. Topics
• Pololu M3PI Robot
• Demonstration
• Part I: Kiva Robot Imitation
• Getting started with Basic Testing
• Simple Line Following
• Intersections
• Bluetooth Communication
• Automated Intersection Detection
• Dual Modes of Operation
• Sharp Turns and Speed vs. P Control
• Part II: Extremum Seeking Controller
• Simulation
• Cosine Shaped Path
• High-Pass Filter and Perturbation
• Smoothing Factor
3. Pololu M3PI Robot
• Screen
• Bluetooth Module
• Light Sensors
• Two Wheels
• Coded in C++
14. Sharp Turns, P Control, and Speed
• Turns greater than 90°
• Original Line Position Function
• All five sensors used for line detection, including far left and right
• P control value
• Lower value for reliability
• Higher value for speed
18. Dual Modes of Operation
Manual Mode
• Asks for directions at detected intersections
• Error returned to terminal if an invalid direction is input
Automatic Mode
• Asks which storage unit it should navigate to
• Asks for directions again upon arrival
22. Extremum Seeking Controller
• Find and follow the largest change in gradient field
• Simulate sinusoidal path of movement
• Use controller to adjust direction
• Utilize additional controller to smooth movement
24. Summary
Part I
• Simple line following program
• 90° turns and intersection detection
• Bluetooth communication
Part II
• Robot Extremum Seeking Controller operation
• Gradient following program
• Operation using simulation results
Good afternoon, everyone. We are Team Enigma for Robot Programming. I am Jinhao, a Computer Science major. This is Airana, Computer Engineering; Rossen, Electrical Engineering; Nikki, Computer Engineering; and Lindsey, Computer Engineering. We are going to present what we have done over the past few weeks.
In this presentation, we will begin by introducing the concept of our robot programming project. The project is divided into two main parts: Kiva robot imitation and an extremum seeking controller. In Part I, our robot performs simple line following, handles intersections, and communicates over Bluetooth. In Part II, we cover a cosine-shaped path and a high-pass filter with perturbation. At the end of the presentation, we will give some recommendations.
OK, as you can see in this diagram, this is our robot, the Pololu M3PI. It has five identical light sensors at the front, which tell the robot where it is relative to the line. There is also a Bluetooth module and a screen for communicating with the robot, and a motorized wheel on each side. Code for this robot is compiled on the Mbed website, and libraries for the robot can be downloaded there as well.
This is the first part of our project: Kiva robot imitation.
Kiva is used by the online retailer Amazon, for the purpose of warehouse automation to make the retrieval of products more efficient. The Kiva robots are signaled to go to a specific storage unit by following a magnetic strip on the ground. The Kiva robot then lifts the storage unit and follows the magnetic track to the main area of the warehouse, where a worker is presented with the inventory item in question. Afterwards, the Kiva robot returns the storage unit back to its original location. In our task, we had to use the M3PI robot to similarly follow the lines on various tracks. Eventually, we programmed the robot to go to an assigned “storage unit” via Bluetooth, mimicking the job of the Kiva robots. A PID controller was used in order to make the movement of the robot smoother and the turning of the robot more accurate. Instead of utilizing magnetic strips on the ground, like the Kiva robots do, our task was to develop a controller which utilized the M3PI’s reflectance sensor to follow a black line on the ground.
The first program we wrote was called alpha run. All it did was move each motor in each direction. In testing it, we discovered that the motor commands were switched. We chose to keep them this way rather than changing the library, because we would have had to change the library for every new program we wrote, which would have been an easy place to make mistakes.
The second program we wrote was called betacalibrationline. This program calibrated the sensors, read the values on a scale of 0 to 1000, and displayed them on the M3PI's screen. We did this to determine the threshold between black and white, which turned out to be 300.
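The calibration step can be sketched as follows. This is an illustrative reconstruction, not the team's actual program: raw readings are rescaled into the 0-1000 range using the minimum and maximum seen during calibration, and the 300 threshold from the text separates white from black. The function names are ours.

```cpp
#include <algorithm>

// Black/white cutoff on the calibrated 0-1000 scale, as found by testing.
constexpr int kThreshold = 300;

// Rescale one raw sensor reading into the calibrated 0-1000 range,
// given the min/max raw values observed during calibration.
int calibrate(int raw, int seen_min, int seen_max) {
    if (seen_max <= seen_min) return 0;  // degenerate calibration data
    int scaled = (raw - seen_min) * 1000 / (seen_max - seen_min);
    return std::clamp(scaled, 0, 1000);
}

// Higher calibrated values mean less reflectance, i.e. black line.
bool is_black(int calibrated) { return calibrated > kThreshold; }
```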
Our next task was to write code that would follow a line. While looking through the library, we found a function that returns an estimate of where the line is, with 0 being all the way to the left and 1 being all the way to the right.
After finding the line position function, we put it into the P control equation.
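The P control step can be sketched like this, assuming the convention above (0.0 = line fully left, 1.0 = fully right, 0.5 = centered). The function name, base speed, and gain are illustrative, not the team's tuned constants.

```cpp
#include <utility>

// Proportional control: steer back toward the line by an amount
// proportional to the signed offset from center.
std::pair<double, double> p_control(double line_position,
                                    double base_speed, double kp) {
    double error = line_position - 0.5;            // signed offset from center
    double correction = kp * error;
    double left_speed  = base_speed + correction;  // line right: speed up left
    double right_speed = base_speed - correction;  // ...and slow down right
    return {left_speed, right_speed};
}
```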
After writing the simple line following program, we realized that we would need the outer two sensors for detecting intersections on the later tracks. To do this, we needed to customize the line position function to use only the middle three sensors.
We did this by reading the documentation for the original function and replicating its equations.
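A sketch of that customized estimate, assuming the library's weighted-average approach over calibrated 0-1000 readings; only the middle three of the five sensors contribute, freeing the outer two for intersection detection. The function name and weights are illustrative.

```cpp
#include <array>

// Estimate line position from the middle three sensors only.
// Sensors 1..3 are weighted by position: 0 (left), 1000, 2000 (right),
// then the weighted average is normalized to the 0..1 range.
double middle_line_position(const std::array<int, 5>& s) {
    long num = 0L * s[1] + 1000L * s[2] + 2000L * s[3];
    long den = s[1] + s[2] + s[3];
    if (den == 0) return 0.5;           // no line detected: assume centered
    return (double)num / den / 2000.0;  // 0.0 = left, 0.5 = center, 1.0 = right
}
```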
Since I have mentioned the PID controller: it stands for proportional, integral, and derivative controller, and it is very commonly used in control applications. The purpose of a PID controller is to produce a desired linear response from the robot using feedback control. The control system takes the raw output from the system and feeds it back into the PID controller, which combines three terms: the proportional term uses the current value of the control error; the integral term uses the accumulated past values of the error through integration; and the derivative term uses the rate of change of the error.
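The three terms just described can be captured in a minimal discrete-time sketch. Gains and the timestep are placeholders, not the values tuned on the robot.

```cpp
// Discrete-time PID: output = kp*error + ki*integral(error) + kd*d(error)/dt.
struct PID {
    double kp, ki, kd;        // proportional, integral, derivative gains
    double integral = 0.0;    // accumulated past error
    double prev_error = 0.0;  // for the derivative term

    double update(double error, double dt) {
        integral += error * dt;                         // past errors
        double derivative = (error - prev_error) / dt;  // error trend
        prev_error = error;
        return kp * error + ki * integral + kd * derivative;
    }
};
```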
For the fourth track, there were many turns greater than 90 degrees and no intersections.
For this, we decided to return to the original line position function provided by the library.
Here we focused on tuning the P control constant to balance speed and reliability: a lower value gives more reliable tracking, while a higher value allows more speed.
Now Rossen is going to introduce Part II of our project.
After we had the line following code working, we then moved on to turning.
We of course use the outer two sensors to detect any 90-degree turns.
To track where we are in the intersection, we use the sensors. In the diagram, you can see that in the first frame the turn is detected. The robot then pivots on the right wheel until the rightmost sensor sees white again, at which point the line positioning function takes back over.
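The pivot termination logic can be sketched as below, with the hardware polling loop replaced by a prerecorded trace of the rightmost sensor's readings; on the robot, each step would be one iteration of "left motor forward, right wheel held" until the sensor clears the line. The function name is ours.

```cpp
#include <vector>

// Count how many pivot steps run before the rightmost sensor sees white.
int pivot_right_steps(const std::vector<bool>& rightmost_sees_black) {
    int steps = 0;
    for (bool black : rightmost_sees_black) {
        if (!black) break;  // sensor sees white: line cleared, stop pivoting
        ++steps;            // keep pivoting on the right wheel
    }
    return steps;
}
```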
After finishing the first of these functions, we wrote one for each type of intersection and each possible direction.
Another capability we used for the third track was automated intersection detection.
The robot first detects the intersection, then moves forward over it and takes a set of sensor readings to determine whether it can turn right or left; it then moves further forward to determine whether it can continue straight through the intersection.
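A possible shape for that classification step, under the assumption that the outer sensors reveal side branches in the snapshot taken past the intersection bar, and a second check further ahead reveals a straight-through branch. The struct and names are illustrative, not the team's code.

```cpp
#include <array>

// Which exits are available at an intersection.
struct Exits { bool left, right, forward; };

// snapshot: five black/white flags (true = black) read past the bar;
// line_continues_ahead: whether the line is still visible further on.
Exits classify_intersection(const std::array<bool, 5>& snapshot,
                            bool line_continues_ahead) {
    return Exits{
        snapshot[0],          // leftmost sensor on black: left branch exists
        snapshot[4],          // rightmost sensor on black: right branch exists
        line_continues_ahead  // line continues straight through
    };
}
```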
After we had the intersection functions working on the second track, we moved onto the third track, which you can see here.
For the third track, we furthered our imitation of the Kiva robots by implementing two modes of operation: manual and automatic.
In manual mode, the robot follows the line until it reaches an intersection, detects its type, and asks the user which direction to turn.
In automatic mode, the robot waits to be told which “storage unit” it should go to, and then asks for directions again once it reaches its destination.
Now Rossen is going to explain how the Bluetooth communication was implemented for the third track.
Now that we've had a chance to see the robot in action, I'd like to summarize the phases of the project. Our first task of part 1 was to design a simple line following program using C++ and upload it to the robot. We became familiar with the M3PI library, coded the program, and ensured the robot performed correctly.
Next, we implemented 90-degree turns and intersection detection. We detected intersections by stopping the robot when all five sensors saw black. For 90-degree turns, we drove one motor forward while simultaneously driving the other in reverse.
After we achieved this, we needed to optimize the speed and the reverse turning capability. We did this mainly by adjusting the speed values through trial and error until we reached the ideal speed.
Our last task of part 1 was to program and establish Bluetooth communication. In order to do this, we coded a small program in Python to communicate from the command terminal on the laptop to the Bluetooth device on the robot. We programmed the robot to wait for commands from the laptop.
For the initial phase of Part II, we had to generate an ideal simulation of the robot's extremum seeking controller. We used MATLAB to produce the sample model and tuned the simulation's values through trial and error until an ideal model was reached.
After this was accomplished, we designed a gradient following program by converting the MATLAB code and simulation into C++. Using this basic model, the robot oscillated in the cosine-like motion we coded while steering itself toward the darker gradient on the track.
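The extremum seeking loop we simulated can be sketched in plain C++ as below: a sinusoidal perturbation is added to the current estimate, the measured objective passes through a washout (high-pass) filter, and the demodulated result nudges the estimate toward the extremum. The toy objective and every constant here are illustrative stand-ins, not the values from our MATLAB model.

```cpp
#include <cmath>

// One simulated extremum seeking run: returns the final estimate of the
// maximizer of a toy objective (here -(x-2)^2, whose maximum is at 2).
double run_extremum_seeking(double theta_hat, int steps) {
    const double dt = 0.01;  // integration timestep
    const double w  = 5.0;   // perturbation frequency (rad/s)
    const double a  = 0.2;   // perturbation amplitude
    const double k  = 1.0;   // adaptation gain
    const double h  = 1.0;   // washout filter cutoff
    double eta = 0.0;        // low-pass state of the washout filter

    for (int i = 0; i < steps; ++i) {
        double t = i * dt;
        double theta = theta_hat + a * std::sin(w * t);  // perturb estimate
        double y = -(theta - 2.0) * (theta - 2.0);       // measure objective
        eta += h * dt * (y - eta);                       // low-pass tracker
        double xi = y - eta;                             // high-pass output
        theta_hat += k * xi * std::sin(w * t) * dt;      // demodulate, adapt
    }
    return theta_hat;
}
```

Increasing the adaptation gain makes the estimate chase the gradient harder, which mirrors the overly aggressive turning we observed before retuning.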
As a result, the robot turned too aggressively toward the gradient. To fix this, we optimized operation using the simulation results, again altering several constants in the seeking equation to achieve smoother behavior.