Introduction
The main goal of our semester project was to construct a wall climbing robot that could be controlled remotely and would transmit a video image back to the user. We inherited most of the parts from the previous semester's work, but with only motion control functioning. Initially we had to figure out what every element did and get each of them working. The main components of the climber were the FASST controller/receiver pair, the servomotors, the camera, and the image transmitter. Over the semester we first got every individual component of the climber working and tested its functionality. We also implemented a battery status overlay on the screen, and we resoldered much of the internal circuitry of the climber in order to cut down on the total number of wires. We added remote control of the central VRAM motor, though only in an off-low-high fashion rather than the analogue fashion we had hoped for. We also wanted to add a camera switching option, but were unable to get the devices to work correctly. In the end we had a climber that was controlled remotely, sent an image with battery status back to the controller, and had a controllable VRAM motor.
Tech Rationale
A wall climbing robot can have many uses. Our robot specifically is useful for surveying an area without any humans having to enter a potentially dangerous space. The military applications are obvious: send a robot into a potentially dangerous area to check for enemy soldiers or bombs without endangering soldiers' lives. Another potential use would be to send the robot into areas a human could not physically reach, such as a crawl space or a cave, and gather data. Technically the robot is rather simple, consisting of just two small cameras, four servomotors, a wall climbing motor, and various integrated circuits. One potential downfall is that the robot needs a relatively smooth surface to traverse or climb, but in principle the robot demonstrates the ability to gather information remotely. We would also need to conduct experiments to determine the range over which the robot can be controlled and can send information back to the user.
Procedure
When we first arrived in lab, we were greeted by a smattering of parts. Our mission throughout the first few weeks was to determine the composition of the robot. Our first goal was to determine the function of the receiver that talked to the remote control. The receiver is a Futaba R617FS, which has 7 channels and receives at 2.4 GHz. In working out how the receiver was powered, we discovered that it is in fact powered by the components that plug into each of the 7 channels. Each channel has 3 pins corresponding to power, ground, and signal. In analyzing the signal pins, we found that many of the switches and sticks corresponded directly to channels on the receiver. We also analyzed the signal itself and found that each channel carried a nominally 1.5 ms pulse, which varied from about 1 to 2 ms when the control was pushed to its extreme. Scope traces are found in the appendix.
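As a rough illustration of what these traces imply, the sketch below (in Python, with function and parameter names of our own choosing) maps a measured pulse width to a normalized control value, taking 1.5 ms as neutral and roughly 1 to 2 ms as the full range:

```python
def pulse_to_control(pulse_ms, neutral_ms=1.5, half_range_ms=0.5):
    """Map an RC PWM pulse width in milliseconds to a value in [-1.0, 1.0].

    The 1.5 ms neutral point and the ~1 to 2 ms range match our scope
    traces; pulses outside that range are clamped.
    """
    value = (pulse_ms - neutral_ms) / half_range_ms
    return max(-1.0, min(1.0, value))
```

A neutral stick thus reads as 0.0, and the two extremes read as -1.0 and 1.0.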
From here, our first goal was to be able to drive the bot. We therefore removed all the components from the bot that were not directly related to driving the wheels. The remaining parts were the two batteries and their attached switch, the voltage regulators and driver circuits on each pair of wheels, and the receiver. After trying several combinations of receiver channels, we were able to drive the bot on the ground using the right stick to control the right wheels and the left stick to control the left wheels. These two sticks corresponded to channels 2 and 3 on the receiver. A full listing of switch/channel correspondences is found in our lab notes.
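To make the driving scheme concrete, the sketch below (names are our own; the stick-to-side assignment is from our lab notes) shows how the two stick pulses translate into signed speeds for the two wheel pairs, tank-drive style:

```python
def stick_to_speed(pulse_ms):
    """Convert a stick channel's pulse width (ms) to a signed speed in [-1, 1],
    with 1.5 ms as stop and ~1.0/2.0 ms as full reverse/forward."""
    return max(-1.0, min(1.0, (pulse_ms - 1.5) / 0.5))

def tank_drive(left_pulse_ms, right_pulse_ms):
    """Left stick drives the left wheels, right stick the right wheels."""
    return stick_to_speed(left_pulse_ms), stick_to_speed(right_pulse_ms)
```

Pushing only one stick forward therefore spins one side's wheels while the other side stays stopped, turning the bot in place.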
Our next goal was to figure out what was happening on the video side of things. We had considerable difficulty figuring out what each of the parts did. With some help from Dr. Janet, we found we had an EagleTree Elogger, which takes telemetry and packages it as a video signal, and an EagleTree Video OSD, which overlays the telemetry on top of video. We also had a video splitter that took two video signals and passed through one at a time, much like a multiplexer. Lastly, there was a 2.4 GHz 500 mW transmitter. At this point we did not seem to have a corresponding receiver; we received one from Dr. Janet later in the semester. We did, though, have what appeared to be a 12 V 2.5" LCD screen. Our goal was to incrementally build up the video circuit from basic parts until we had a fully working unit. A flow chart of our final system is provided in the appendix.
The first goal was to get any signal to display on the screen. We therefore took one of the cameras and plugged it directly into the screen. We were able to get an image once we provided the screen with 12 V and the camera with the 5 V that comes off the voltage regulators we also had. The next step was to try to overlay data. This was harder than it seemed, because the 3-pin sockets on the overlay unit were poorly labeled, and we had to try many combinations until we were able to obtain the video output signal from both the camera and the Elogger. We then attached the battery connector that went between the two power rails to the Elogger and were able to monitor the battery voltages across the rails. If we had so desired, and had had extra battery connectors, we could have designed the circuit to measure current as well, or possibly to measure the RPM of a motor, given the appropriate hardware.
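Although the Elogger handles this internally, the arithmetic behind monitoring a high battery rail is simple. The sketch below is a hypothetical illustration, not the Elogger's actual method; the divider ratio and ADC parameters are assumed values chosen only so a 22 V rail fits within a 5 V measurement range:

```python
def battery_voltage(adc_counts, vref=5.0, adc_max=1023, divider_ratio=5.0):
    """Recover a rail voltage from an ADC reading taken across a resistor divider.

    All parameters are illustrative assumptions: a 10-bit ADC with a 5 V
    reference, and a divider scaling the rail down by a factor of 5.
    """
    sensed = adc_counts / adc_max * vref   # voltage actually seen at the ADC pin
    return sensed * divider_ratio          # undo the divider to get the rail voltage
```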
Our next goal was to implement the video switcher, which would allow us to switch between two camera sources. First, though, it was necessary to resolder the control wires to the unit, as they had broken off. After this we connected the units but were unable to get any video to display. We read the datasheet and found that a 1.2 ms pulse switched the video to one source and a 1.8 ms pulse switched it to the other, with anything in between having no effect. We looked at the scope trace for that particular channel and found that precisely what we expected was being output. We did find, however, that if we pulled the signal line high, one video source was displayed. We believe either we resoldered the wires incorrectly or, more likely, that the chip had burnt out; either way, we did not have time to get it working.
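The datasheet behaviour, as we understood it, amounts to a simple selection rule. The sketch below models that rule; the tolerance window and names are our own assumptions:

```python
def select_source(pulse_ms, current_source, tol_ms=0.1):
    """Return the selected video source (1 or 2) for a given control pulse width.

    Per the datasheet: ~1.2 ms selects one source, ~1.8 ms selects the other,
    and anything in between leaves the selection unchanged.
    """
    if abs(pulse_ms - 1.2) <= tol_ms:
        return 1
    if abs(pulse_ms - 1.8) <= tol_ms:
        return 2
    return current_source  # in-between pulses have no effect
```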
Once we had received the 2.4 GHz receiver from Dr. Janet, our next task was to get wireless video working. We had heard that in the past there had been a large amount of interference between the remote control and the video transmission. We found this to be the case: when the controller was switched on, the video would become scrambled. We also knew that the remote only used seven channels, yet the transmitter had 8 available. We went through each of the combinations and found channel 6 to have little or no interference, so we used channel 6 for the remainder of the project. We did notice, though, that as we moved the bot around and away from the receiver, the signal would become choppier. We initially thought this was just interference, so we tried to order a transmitter pair at 1.2 GHz, only to find they were no longer stocked. Thus we had to make do. Through further trials, though, we found that we were able to get a strong signal most of the time, and that the signal quality seemed to have more to do with the placement of the transmitter and the objects lying between the transmitter and receiver than with whether the remote was on or off. We also found that by touching the antenna of the receiver, effectively turning our bodies into antennae, the signal quality improved markedly. We therefore concluded that the signal quality on this channel was limited not by interference but by the limited broadcast power of the transmitter: if interference were really the issue, then by touching the antenna we would be amplifying not only the video signal but the controller signal as well, and we would see no noticeable improvement. Since the signal quality did improve, we concluded it must be a power issue. We proposed that any remaining interference was likely due to the large amount of traffic in the 2.4 GHz range regardless of channel, since the controller outputs on the other 7 channels. We also attempted to disable some of the controller channels we were not using; we were able to, but found that a disabled channel simply output the 1.5 ms signal rather than going silent.
The last component of the project was the suction motor, which causes the bot to stick to walls. This does not, as we had initially assumed, just connect to the 22 V rails; instead it is driven by a controller called a Pixie 20p. The Pixie takes the PWM signal from the radio receiver and drives the motor accordingly. Since the Pixie is powered by the 22 V line, the power line to the receiver must be cut so that there is no short circuit between 22 and 5 volts. The Pixie controls the motor proportionally, from off to full on, based on where the pulse width falls in the roughly 1.1 to 1.9 ms range it receives. We tried to implement this using the dial on the controller, but the Pixie refused to arm: before it becomes operational, it must receive an off signal for 2 seconds, and it never armed when attached to the dial. We eliminated the Pixie as the problem by attaching it to one of the channels corresponding to the joysticks, where it worked superbly. On further analysis, we found that the dial was not varying the pulse width over the full range but only from about 1.4 to 1.6 ms, which was not enough for the Pixie. We tried to program the controller to operate the dial over a larger range but were unable to, so we assumed something was amiss with the dial itself. In the end, we attached the Pixie to the 3-position switch so that there would be off, low, and high settings.
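The arming behaviour we observed can be summarised in a small model. This is our own sketch of the Pixie's observed behaviour, not its actual firmware; the exact off-threshold and the class name are assumptions:

```python
class PixieModel:
    """Toy model of the Pixie 20p's observed behaviour: it arms only after
    seeing an "off" (short) pulse continuously for 2 seconds, then drives the
    motor in proportion to pulse width over roughly 1.1 to 1.9 ms."""

    OFF_THRESHOLD_MS = 1.15  # pulses below this count as "off" (assumed value)
    ARM_TIME_S = 2.0

    def __init__(self):
        self.armed = False
        self.off_time = 0.0

    def update(self, pulse_ms, dt):
        """Feed one control pulse observed for dt seconds; return throttle in [0, 1]."""
        if not self.armed:
            if pulse_ms < self.OFF_THRESHOLD_MS:
                self.off_time += dt
                if self.off_time >= self.ARM_TIME_S:
                    self.armed = True
            else:
                self.off_time = 0.0  # any non-off pulse restarts the arming timer
            return 0.0
        throttle = (pulse_ms - 1.1) / 0.8
        return max(0.0, min(1.0, throttle))
```

This model also explains why the dial never armed the unit: its 1.4 to 1.6 ms swing never dropped below the off threshold, so the 2 second off timer never completed.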
In terms of assembling the robot, we undid most of the previous wiring and soldered it back together in a slightly more reasonable manner, creating large nodes for the 22 volt rails to which many things could be connected. Unfortunately, the shortness of some wires limited the usefulness of rearranging things, and the length of some of the others left a large mess in the end anyway. Ultimately, we had two nodes which connected to: the terminals of the battery circuit, the battery monitor output, the Pixie, the two wheel motor controllers, and the voltage regulator for the camera circuit, which had the correct outputs for the camera and the transmitter.
As was demonstrated, the project was a success, as we were able to drive the bot
via the controller while viewing the camera with little interference. We were also able to
use the battery monitor and control the central motor remotely.
Main Costs
Futaba 7C R/C System $280.00
Eagle Tree Video OSD $80.99
Eagle Tree Elogger $69.99
LM 500mW Transmitter $49.50
2.4 GHz Receiver ~$50.00
Total >$530.48
Timeline
10/18: Introduced to robot, not working, most parts disconnected
11/1: Climber can drive around and video overlay works
11/22: Video link working, pixie received
11/28: Pixie working when not attached to dial
12/4: System integrated
Conclusion and Future Direction
Before out project really got going, we knew we wanted to implement a wall
climber that would be controlled by a user and could send information back. After seeing
what we had to work with, we narrowed our goal to a climber with motion control that
would send back an image. We accomplished these goals, as well as being able to
control the central VRAM motor, so that power can be saved when the climber is moving
on flat surfaces as opposed to climbing walls. We did not manage to add camera
switching to the climber which we had hoped to. Immediate future work on the robot
should be focused on implementing the camera switching. Another important task is to
fix the interference problem on the transmitted image. Because the controller and the
image transmitter are both 2.4 GHz there is interference. However, we believe that the
interference can be overcome with a more powerful image transmitter, without having to
change frequencies. The next steps beyond that would be to find or build a cover for the
chassis so that the climber can climb without having circuits fall out, as well as to include
a skirt for the VRAM motor for better wall climbing ability.