ADVANCED MICROPROCESSOR SYSTEMS
DESIGN OF AUTONOMOUS WHEELCHAIR
DESIGNED BY: ANISH GOEL
1. Features ……………………………………………………………………………………………………………. 3
2. Control System Block diagram & description……………………………………………………… 5
3. Description and functionality of Blocks………………………………………………………………. 8
4. Components Listing…………………………………………………………………………………………….. 19
5. Design Requirements and Physical Layout………………………………………………………….. 20
List of Figures and Tables
Figure 2.1: Control System Block Schematic……………………………………………………………………………….. 5
Figure 3.1: Internal Architecture LPC 2148 Microcontroller……………………………………………………….. 9
Figure 3.2: Internal Architecture of RSC-464………………………………………………………………………………. 10
Figure 3.3: Power Supply circuit…………………………………………………………………………………………………. 12
Figure 3.4: LIS302DL Internal Architecture…………………………………………………………………………. 12
Figure 3.5: SHARP GP2Y0A02YK Distance Sensor. a) Internal Architecture, b) Sensing Module…. 13
Figure 3.6: Circuit for Bi-Directional Control of Motors……………………………………………………………… 14
Figure 3.7: ERM320240SS-1 Internal Architecture…………………………………………………………………….. 16
Figure 3.8: Communication Interface Circuit Diagram……………………………………………………………….. 17
Figure 3.9: (a) Interface Circuit of 4x4 Keypad and (b) Keypad Configuration……………………………. 18
Figure 3.10: Equivalent Circuit of joystick………………………………………………………………………………….. 18
Figure 5.1: Physical Layout of Components……………………………………………………………………………….. 21
Table 3.1: CPU Interfaces…………………………………………………………………………………………………………… 8
Table 3.2: Voltage Requirements……………………………………………………………………………………………… 11
Table 3.3: Wheelchair Movement for relay conditions…………………………………………………………….. 15
Table 4.1: Component Listing…………………………………………………………………………………………………… 19
Includes training mode
Embedded with Artificial Intelligence
Safety and emergency operations
Direct interface to a computer for software upgrades and data transfer
The autonomous wheelchair design provides the following features, which are described below
with respect to their functionality.
Autonomous: The chair is operated through a control system that acts as its brain. The control system receives all its commands from the operator, i.e. the user. The sensors act as inputs to the control system and the driving motors act as its outputs; these I/Os correspond to the operation and movement of the chair. As far as the user interface is concerned, the input to the control system consists of a microphone for verbal commands and a keypad for selecting functions and interacting with the control system. The output consists of a speaker and a display for the graphical user interface. Thus, like a computer system, the control system reads the inputs, processes them using the instructions and data stored in system memory, and then drives the chair and performs the other output operations. The chair is autonomous in the sense that, once programmed, it can operate on its own without assistance from a human operator.
Training Mode: This functionality allows the user to train the wheelchair to perform a certain operation, such as going to a pre-defined place along a path specified by the user, by putting the control system in training mode. The user is provided with a joystick, a keypad and a display for the complete interface with the control system of the chair. The user selects the training mode, operates the chair using the joystick (moving it to a particular place) and gives the command "SAVE POSITION". While the user operates the chair with the joystick, the controller records all the steps carried out by the user and saves them into memory. The user may later use the command "GOTO <position name>" to automatically drive the chair to the pre-defined position. The commands can be specified using the keypad or verbally. The joystick acts as another input device to the control system; it can also be used to manually override the system in case some part of the system is not functioning properly.
Safety and Emergency issues: The chair is embedded with a number of sensors to ensure the safety of its user. A temperature sensor displays the temperature on the LCD screen and warns the user to take appropriate action in case the temperature of the environment is too high or too low. The control system can itself take safety actions if it does not receive any command from the user in such a case. Distance measuring sensors provide the ability to avoid collisions with any object in the path of the chair: they detect the presence of an object in the path and guide the controller to take an appropriate action.
The accelerometer measures the acceleration of the wheelchair along the coordinate axes and provides feedback regarding the speed of the chair. If the chair is moving down a slope and its speed increases, the accelerometer provides an input to the controller, which may take appropriate action to decrease the speed of the chair and keep it below the maximum permissible speed.
The password protection function provides security to the chair, as the control system asks the user to enter the password whenever the user switches between modes. The speech recognition function allows the system to accept verbal commands only from its user.
Artificial Intelligence: The wheelchair has a brain, the CPU of its control system. The sensors provide the inputs to this brain, which processes the data. An intelligent algorithm on the microcontroller chip analyzes all the activities performed by the wheelchair and the commands given by its user, allowing the chair to adjust to its user and environment accordingly. The self-learning algorithm guides the chair to take appropriate decisions without the user specifying commands. For instance, the chair may negotiate the same path more efficiently on successive runs, because the algorithm performs mathematical analysis to calculate the best possible path to the destination each time the chair moves to the same destination.
Interface to Personal Computer: To connect the control system of the wheelchair to a personal computer, a communication interface circuit is provided. This allows the chair to be connected to a personal computer for software upgrades, and it also provides data transfer functionality. A JTAG interface provides online test/debug functionality for the system.
2. Control System Block diagram & description
[Figure 2.1 shows the control system block schematic: the CPU with its input interfaces (microphone, keypad, joystick, MEMS accelerometer and temperature sensor, infra-red sensor array) and output interfaces (liquid crystal display, speaker, motor drivers), a power supply and regulator block, and a communication interface to a personal computer.]
Figure 2.1: Control System Block Schematic.
Description of the Block Diagram: The control system consists of two processing elements. The ARM-7
based Microcontroller is the CPU of the system and executes all the commands specified by the user, as
well as all the commands required for the basic functionality of the system. The program of the
complete system is stored in the flash memory of the microcontroller, and the interfaces to the external
world and the I/O devices are handled by the on-chip peripheral support. The CPU is supported
by a speech processor, the RSC-464, which executes all the commands related to the voice interface. This
processor is interfaced to the ARM microcontroller and derives all its commands from the CPU.
Though the programmer has the additional load of programming both processors, this
multi-processor system results in a speedup and increased throughput over a uni-processor system.
So in this case, the ARM microcontroller deals with all the input output interfaces while the co-processor
takes care of all the voice related operations. The keypad provides the user interface to the system to
input the commands and the LCD provides the interactive output to the user with a graphical user
interface. The joystick provides the user with the functionality to operate the chair directly in the training
mode and also acts as a backup in case the control system hangs up; this operation is referred to as
manual override, where the user can directly operate the chair without assistance from the control
system. The joystick is also used in the training mode to train the chair, as explained in the modes of
operation. A speaker is interfaced to the speech processor to output various messages to the user. The
system also has a direct RS-232 interface to the serial port of the personal computer to upload and
download the program from the system. A regulated power supply is provided to all the chips and
peripherals on the motherboard using a power supply circuit. As the motors need a high current for
their operation, the driver circuit provides suitable amplification of power for the motors. The
operations of all these modules are described in detail in the following sections.
Working of the system:
The system has three modes: Training mode, Autonomous mode and Joystick mode. The chair is first
trained according to the user so that it can adjust to the environment with respect to the user and then
it can be operated in autonomous mode. The functionality of all the modes is explained below.
Training Mode: This mode is used to train a chair so that it can be operated over a wide range of
terrains. Using the graphical user interface, the user can switch between the modes. The system asks the
user to enter a password when changing modes, to provide security. Once the user
enters the training mode, the user can perform the functions listed below:
1. Train the chair to move around the environment, save different positions and specify a voice
command for a particular destination. The system saves the position of the chair with respect
to the initial or start position and assigns the name specified by the user to that particular area.
2. Train the chair to move at different speeds over different range of terrains.
3. Make setting related to the operations that are to be performed during emergency and other
safety related issues.
While the chair is in training mode and the user is operating it, the control system provides the user with
suggestions and optimizations for performing a specific task, such as the shortest path to reach a
particular position in the environment. The user can also ask for assistance from the control system,
such as the demonstration mode, in which the chair executes previously stored standard commands
to introduce the user to its capabilities. Once the chair is trained for a specific location, the user can
give the command "GOBACK" to return to the position from where the training for that location began.
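One way the controller's step log and the "GOBACK" behavior could work is sketched below. This is an illustrative assumption, not the actual firmware: each joystick action is recorded as a (direction, duration) pair, and GOBACK replays the log in reverse order with every direction inverted.

```c
#include <assert.h>

/* Hypothetical training-mode step log: each joystick action is recorded
 * as a (direction, duration) pair. "GOBACK" replays the log in reverse
 * with every direction inverted. Names and sizes are assumptions. */

enum dir { FWD, REV, LEFT, RIGHT };

struct step { enum dir d; unsigned ms; };

static enum dir invert(enum dir d)
{
    switch (d) {
    case FWD:   return REV;
    case REV:   return FWD;
    case LEFT:  return RIGHT;
    default:    return LEFT;
    }
}

/* Build the GOBACK sequence: reversed order, inverted directions. */
static int make_goback(const struct step *log, int n, struct step *out)
{
    for (int i = 0; i < n; i++) {
        out[i].d  = invert(log[n - 1 - i].d);
        out[i].ms = log[n - 1 - i].ms;
    }
    return n;
}
```

For example, a recorded sequence of "forward 2 s, left 0.5 s" would be played back as "right 0.5 s, reverse 2 s".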
Autonomous Mode: In this mode the chair executes the commands provided by the user to reach a specific location. The user is provided with standard commands as mentioned earlier, such as MOVETO, EXECUTE, STOP, RESUME, etc. To reach a particular position that was trained earlier in the training mode and saved as <living room>, for instance, the user gives the chair the command "MOVETO living room". The system first recognizes the command (MOVETO) and then takes the second input (living room) as its data. The microcontroller compares the data against the previously stored locations for the MOVETO command; if the stored locations contain the entry LIVING ROOM, the chair moves to the specified location. If the location does not exist in memory, the system gives the message "UNDEFINED LOCATION". At this point the system offers the user the chance to train it for this location (living room) by asking "TRAIN for new location?". The user can skip this and give another command, or can switch to training mode immediately and train the chair to move to the new location.
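The command-and-data split described above can be sketched as follows. This is a minimal host-side illustration, assuming a fixed-size location table and case-sensitive matching; the names and buffer sizes are not taken from the actual firmware.

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Sketch of how the CPU might split "MOVETO <place>" into a command and
 * its data, then look the data up in the trained-location table. The
 * table contents and sizes here are illustrative assumptions. */

#define MAX_LOCS 8

static const char *stored_locs[MAX_LOCS] = { "LIVING ROOM", "KITCHEN" };

/* Returns 1 and copies the argument if the input is MOVETO <place>. */
static int parse_moveto(const char *input, char *arg, size_t argsz)
{
    if (strncmp(input, "MOVETO ", 7) != 0)
        return 0;
    snprintf(arg, argsz, "%s", input + 7);
    return 1;
}

/* Lookup against the trained locations; -1 means UNDEFINED LOCATION. */
static int find_location(const char *arg)
{
    for (int i = 0; i < MAX_LOCS && stored_locs[i]; i++)
        if (strcmp(stored_locs[i], arg) == 0)
            return i;
    return -1;
}
```

A return value of -1 would trigger the "UNDEFINED LOCATION" message and the "TRAIN for new location?" prompt.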
Joystick mode: This mode is used to operate the chair when the control system does not function properly or breaks down due to an accident. In this mode the system connects the output of the joystick directly to the motor drivers; a separate circuit and dedicated logic are needed to put the wheelchair into this mode. If the control system is functioning properly, the user can manually switch to this mode by entering a pass code. When the control system is in Joystick Mode, also called "Manual Override" mode, the control system does not interfere with the user's activities and allows the joystick to drive the motors directly under user control. In this case the outputs of all the sensors are masked by the microcontroller and none of the chair's activities are recorded by the control system.
The functioning of the control system in the training and autonomous modes is explained along with the description and functionality of all the system blocks in the next section.
3. Description and functionality of the blocks.
Central Processing Unit: The microcontroller LPC2148FBD64, a 16-bit/32-bit ARM7TDMI-S based microcontroller, is the CPU of the system. The following features of this chip support the designed control system:
512 KB flash memory for program and data storage and 32 KB RAM for internal operations.
10-bit, 14-channel on-chip analog-to-digital converter for analog interfacing.
Real-time clock with independent power and 32 kHz clock input.
Serial communication ports for UART and I2C interfaces.
45 general-purpose I/O pins, 5 V tolerant.
32-bit timers/external event counters and interrupt pins.
Maximum CPU clock of 60 MHz with on-chip circuitry for peripheral clock scaling.
I/O Device             Description                           Interfaced to CPU Block   Pins               Interface Type   Comments
RSC-464                Speech Processor                      General Purpose I/O       P1[27:16]          Parallel         Multiplexed with LCD
ERM320240SS-1          LCD Module                            General Purpose I/O       P1[27:16]          Parallel         Multiplexed with RSC-464 using CS signal
9SA50RR6559            Joystick                              A/D Converter             AD0[1:0]           Analog
PM8018-RP7361--63652   D.C. Motor                            General Purpose I/O       P0[9:7] & P0       Parallel         Connected via driver circuit
GP2Y0A02YK             Distance Measuring Sensors (8 Nos.)   A/D Converter             AD1[7:0]           Analog
LIS302DL               MEMS Accelerometer                    I2C Interface             SCL0,              I2C
Keypad                 4x4 Keypad                            General Purpose I/O       EINT0, P0[20:17]   Parallel         Interface can be serial also
JTAG Interface         Testing and Debug Circuit             Test/Debug Interface      TDI, TDO, TCK, TMS Serial
RS-232 Interface       Serial Interface                      UART                      TXD0,              Serial
Table 3.1: CPU Interfaces
Table 3.1 gives the I/O devices and co-processor connection details with the CPU. Figure 3.1 shows
the internal architecture of the LPC2148 microcontroller. Other connections of the CPU include:
RESET: External reset pin to reset the device, causing the I/O peripherals to take their default
states, and processor execution to begin from starting address.
XTAL1 and XTAL2: Connected to a crystal resonator for oscillator circuit.
RTCX1 and RTCX2: Crystal connections for the RTC circuit, connected to a 32.768 kHz crystal.
Vdd and Vss: Supply voltage for core.
VBAT: RTC power supply voltage. 3.3 V on this pin supplies the power to the RTC.
VDDA, VSSA and VRef: Supply and reference voltage to ADC.
Once programmed, the CPU derives all its commands from the user. The basic
functionality of the CPU is to handle the inputs and outputs and to communicate with
the co-processor. The load on the CPU is reduced by introducing off-chip modules for various
functions, such as the co-processor that takes care of speech processing and output
generation on the speaker. The LCD module contains an embedded controller that takes care of the
graphical user interface, and the keypad interface circuit includes a keypad encoder that
provides serial data as well as key de-bouncing logic. This reduces the software load considerably, as
the CPU commands these modules with specific tasks and then waits for the results. While the
tasks are being performed by these modules, the CPU can service other interfaces such as the sensors,
joystick and motors. This allows the CPU to service all the modules more frequently.
The CPU core voltage is 3.0 V to 3.6 V; however, as all the I/O pads are 5 V tolerant, all the devices
and modules are directly interfaced to the CPU.
Figure 3.1: Internal Architecture of the LPC2148 Microcontroller
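The round-robin servicing described above, where the CPU hands slow modules a task and polls the fast interfaces while waiting, can be sketched as a simple scheduler pass. The handler table below is an illustrative assumption; the stub handlers merely count calls so the logic can be exercised on a host.

```c
#include <assert.h>

/* Minimal sketch of the servicing scheme described in the text: while a
 * slow module (LCD, co-processor) works on a commanded task, the CPU
 * polls every fast interface once per scheduler pass. Handler names and
 * the table layout are assumptions for illustration. */

#define N_FAST 3

static int serviced[N_FAST];   /* call counters standing in for real work */

static void service_sensors(void)  { serviced[0]++; }
static void service_joystick(void) { serviced[1]++; }
static void service_motors(void)   { serviced[2]++; }

static void (*fast_ifaces[N_FAST])(void) = {
    service_sensors, service_joystick, service_motors
};

/* One scheduler pass: poll every fast interface once. */
static void service_pass(void)
{
    for (int i = 0; i < N_FAST; i++)
        fast_ifaces[i]();
}
```

Repeating `service_pass()` while the LCD or speech co-processor is busy gives each fast interface frequent, evenly spaced attention.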
Speech Recognition Processor: The RSC-464 is used to process the speech commands in the system. Though this chip is an embedded DSP microcontroller, it is used as a co-processor to the ARM microcontroller that serves as the system CPU. The main idea behind using a co-processor with a CPU is to separate the digital and analog blocks and to divide the load between the two processors, achieving a higher system throughput. Features of this chip that support the system include: speaker-independent and speaker-dependent recognition, availability in different languages for international use, and speaker verification (voice-password biometric security). The chip also includes an on-chip 8-bit microcontroller, 64 KB of ROM, a 16-bit ADC, a 10-bit DAC with PWM, a microphone preamplifier, twin DMA, a vector math accelerator, a multiplier, and 16 configurable I/O lines.
Note that this chip must first be programmed for its speech processing functionality and then interfaced with the CPU. This adds an additional overhead of communication between the CPU and the speech processor, which is handled by proper synchronization between the two processors. The CPU is connected to the co-processor at the co-processor's I/O lines, and a protocol is designed for reliable and fast communication. Figure 3.2 shows the block schematic of the internal architecture of the speech processing chip RSC-464 and its capability to function as a complete module for speech processing. As mentioned above, the CPU of the control system is interfaced to this chip at its I/O lines, and the chip thus derives all the signals for its operation from the CPU. The CPU has full access to the internal memory of this chip: it provides all the data and command signals along with the control signals and in return fetches the data processed by the chip. The chip also contains a DAC and speaker outputs that likewise derive their controls from the CPU. Hence the main functionalities of the on-chip microcontroller core of the speech processor are:
1. To process the speech input provided through the microphone.
2. To output appropriate signals to the DAC and speaker.
3. To communicate with the CPU, deriving command, data and control signals and
providing the CPU with data.
Figure 3.2: Internal Architecture of RSC-464
The figure above shows the block schematic of the speech recognition chip. As this chip is used as a co-processor in the system, the on-chip microcontroller is interfaced with the CPU via the general-purpose I/O lines, and hence the chip derives all the signals for its operation from the CPU. The CPU can also access the internal ROM of this chip through this interface. The 10-bit DAC and the PWM are used to drive the speaker for voice outputs from the system. The following modes of the chip are its main features with respect to speech processing and voice recognition in the system:
Speaker Independent (SI) recognition requires no user training. The RSC-464 can recognize up to 15
commands in an active set. Text-to-SI (T2SI), based on a hybrid of Hidden Markov Modeling and
Neural Net technologies, allows creation of accurate SI recognition sets in seconds.
Speaker Dependent recognition allows the user to create names for products or customize
recognition sets. SD is implemented with DTW (dynamic time warping) pattern matching
technology. SD requires programmable memory to store the personalized speech templates
(trained patterns) that may be on-chip SRAM, or off-chip serial EEPROM, Flash Memory, or
SRAM. As the on-chip memory has low capacity, an off-chip serial memory chip
(AT45DB041B) is interfaced to store additional user commands.
Speaker Verification enables the RSC-464 to authenticate a previously trained password
spoken by the target user. SV is also implemented with DTW technology, providing
authentication to the system, so that it recognizes commands only from its user.
Microphone interface for voice input: For user voice input, a microphone preamplifier and a 16-bit
Analog-to-Digital Converter (ADC) for speech and audio/analog input are provided on this chip. An
external microphone passes an audio signal to the preamplifier and ADC to convert the incoming
speech signal into digital data. Speech features are extracted using the Digital Filter engine. The
microcontroller (Processing element of RSC-464) processes these speech features using speech
recognition algorithms in firmware, with the help of the “L1” Vector Accelerator and enhanced
instruction set. The recognition results are then passed to the CPU, which takes the
corresponding control action. The on-chip processor may also store these results in system memory if
commanded by the system CPU.
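The document says a protocol is designed for the CPU/co-processor link but does not specify it, so the following is purely a hypothetical sketch of how a recognition result could be framed over the parallel GPIO link: a start byte, a command ID, a data byte, and a simple XOR checksum. All byte values are assumptions.

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical framing for passing a recognition result from the RSC-464
 * to the CPU: start byte, command ID, data byte, XOR checksum. The real
 * protocol is design-specific; this only illustrates the idea. */

#define FRAME_START 0xA5

/* Pack a (cmd, data) pair into a 4-byte frame. */
static void pack_frame(uint8_t cmd, uint8_t data, uint8_t frame[4])
{
    frame[0] = FRAME_START;
    frame[1] = cmd;
    frame[2] = data;
    frame[3] = frame[0] ^ frame[1] ^ frame[2];   /* checksum */
}

/* Validate and unpack a frame; returns 1 on success, 0 on error. */
static int unpack_frame(const uint8_t frame[4], uint8_t *cmd, uint8_t *data)
{
    if (frame[0] != FRAME_START)
        return 0;
    if ((uint8_t)(frame[0] ^ frame[1] ^ frame[2]) != frame[3])
        return 0;
    *cmd = frame[1];
    *data = frame[2];
    return 1;
}
```

The checksum lets the CPU reject a frame corrupted on the shared, multiplexed I/O lines and ask the co-processor to resend.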
Power Supply: A high-power battery rated 25.9 V, 40 Ah is used as the power supply for the
complete system. This voltage is stepped down to levels in the range of 1.8 V to 12 V for the
circuitry, while high power is drawn directly for the operation of the motors. As the
motors are high-torque motors and must provide enough power to move the chair along with its
passenger, this high-capacity battery is chosen to power the system for a
considerable period before recharging. Figure 3.3 shows the power supply circuit, which
derives all the required voltage levels from the battery. Regulator circuits using
78xx voltage regulators and LM317 adjustable regulators are used to obtain the voltage levels. The
motors are driven directly from the battery, i.e. at 25.9 V, to avoid conversion losses.
The voltages provided by the power supply to the different blocks and parts of the control system
are specified in table 3.2: (All voltages with respect to ground or 0 volts)
Voltages available: 25.9 V, 12 V, 5 V, 3.3 V, 1.8 V. Other voltages (e.g. 4.7 V) are obtained by operating
a Zener diode as a voltage regulator for the specific block/component.
Voltage Level (Volts) Blocks/Components
25.9 V Motors
12 V Optional (For 12V relays)
5V ULN2804, Relays, LCD (backlight and Controller), EDE1144, Comm. Interface
4.7 V Infra-red sensors, joystick, CPU ADC block (VDDA and Vref.)
3.3 V CPU (LPC2148), RSC-464, LIS302DL
Table 3.2: Voltage Requirements
[Figure 3.3 shows the power supply schematic for the autonomous wheelchair: the 25.9 V battery feeding 78xx fixed regulators and LM317 adjustable regulators (with 120 Ω and 360 Ω programming resistors), 1N4007 protection diodes, and input/output filter capacitors.]
Figure 3.3: Power Supply circuit
Sensors: The chair uses different kinds of sensors for its fully autonomous operation. The different
sensors and sensing modules used in the wheelchair are:
Accelerometer: It keeps track of the acceleration of the chair with respect to the co-ordinate axes and acts as an input to the controller when the chair is in motion. The control system adjusts the speed of the motors accordingly to provide a safe and constant speed; for instance, the system may decrease the speed while going down a slope.
The MEMS accelerometer LIS302DL by STMicroelectronics is used as the motion sensing element in the system; it measures the acceleration of the chair with respect to all three co-ordinate axes. The output of this integrated sensing element is digital. The complete device includes a sensing element and an IC interface able to take the information from the sensing element and provide a signal to the external world through an I2C/SPI serial interface. Figure 3.4 shows the internal architecture of the MEMS accelerometer chip. The inputs from all three co-ordinate axes are multiplexed and converted to digital outputs, which are then interfaced to the CPU over the I2C bus.
Figure 3.4: LIS302DL Internal Architecture
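Converting a raw LIS302DL sample into a usable acceleration value is straightforward, since the output registers hold two's-complement 8-bit data. The sketch below assumes the ±2 g range, where the datasheet sensitivity is about 18 mg/digit (the exact figure should be checked for the device variant used); the I2C register read itself is omitted.

```c
#include <assert.h>
#include <stdint.h>

/* Converts a raw 8-bit LIS302DL acceleration sample into milli-g.
 * Assumes the +-2 g range with ~18 mg/digit sensitivity; verify the
 * exact sensitivity against the datasheet for the part in use. */

#define LIS302DL_MG_PER_DIGIT 18

static int raw_to_mg(uint8_t raw)
{
    int8_t s = (int8_t)raw;              /* sign-extend two's complement */
    return (int)s * LIS302DL_MG_PER_DIGIT;
}
```

The CPU would apply this conversion to each of the X, Y and Z readings before using them for speed feedback.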
Sensing and distance measurement:
Sensors are required to navigate the chair to its destination. The chair should have distance
readings to all the objects in its surroundings. Infra-red sensing elements provide the distance
between the chair and surrounding objects; once the distance is provided to the system,
it can make appropriate decisions for the safe motion of the chair. This distance reading is used by
the system in both modes: in training mode, to help the user maintain an appropriate
distance from surrounding objects, and in autonomous mode, when the chair is moving
around on its own. The chair carries an array of infra-red sensors which sense the distance from the
surroundings in all directions. The module GP2Y0A02YK by SHARP is used as the
infra-red sensor. This sensor is only slightly influenced by the color and reflectivity of the detected
object, thanks to its optical distance-measuring method, and is hence suitable for detecting objects in
surroundings of different colors. The detection range of this sensor is
20 cm to 150 cm. The output Vo (in figure 3.5(a)) of these sensors can be connected directly to the
ADC inputs of the microcontroller. Figure 3.5(a) shows the internal architecture of the module's
circuitry and figure 3.5(b) shows the module, in which the transmitter and receiver LEDs are visible.
Figure 3.5: SHARP GP2Y0A02YK Distance Sensor. a) Internal Architecture, b) Sensing Module
Eight such modules are used in the system to obtain input from all directions around the chair, with two modules placed on each side, one toward the extreme left and the other toward the extreme right. As mentioned above, the minimum range of the sensors is 20 cm, which suits the application, as the chair should maintain a minimum distance from the surroundings for safe operation. If the user wishes to cross this limit, the system can be put in "Manual Override" and the chair can be driven anywhere using the joystick. The maximum range of these sensors is 150 cm; while traveling a long distance the system needs a longer view toward the destination, so a long-range distance sensor is required. This is applicable only in the forward direction, because the chair does not move sideways or in reverse for long distances. The outputs from the sensors are calibrated in terms of distance readings, which are stored in system memory; the sensor outputs are fed directly to the ADC inputs of the CPU. An additional module containing an ultrasonic distance sensor with a range of 3 m is placed at the front of the chair, adding the capability to cover a longer distance while moving forward.
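The calibration mentioned above could look roughly like the sketch below. The GP2Y0A02YK output voltage falls off roughly in inverse proportion to distance over its 20-150 cm span; the constant 60.0f is an illustrative curve fit, not a datasheet value, and each sensor would be calibrated individually as the text describes. A 10-bit ADC reading against the 4.7 V reference is assumed.

```c
/* Approximate GP2Y0A02YK voltage-to-distance conversion.
 * The inverse-law constant (60.0f) is an assumed fit for illustration;
 * real units should be calibrated as described in the text.
 * adc10 is a 10-bit ADC reading with a 4.7 V reference. */

static float gp2y_distance_cm(unsigned adc10)
{
    float volts = (float)adc10 * 4.7f / 1023.0f;
    if (volts < 0.4f)             /* below valid output: treat as far */
        return 150.0f;
    float d = 60.0f / volts;      /* assumed inverse-law fit */
    if (d < 20.0f)  d = 20.0f;    /* clamp to the rated 20-150 cm span */
    if (d > 150.0f) d = 150.0f;
    return d;
}
```

Note the sensor's response is non-monotonic below 20 cm, which is one reason the text's minimum-distance rule matters: readings inside the rated span cannot distinguish "very close" from "moderately close".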
Motors, Drivers and Turning Mechanism:
The system consists of two 24 V DC motors, each providing a torque of 268 lb to the
wheelchair. The shaft of each motor is connected to one wheel of the wheelchair, so when both
motors rotate in the same direction the chair moves forward. Rotating both motors in the opposite
direction provides backward motion. For turning, the wheelchair uses a differential turning mechanism in which one motor is rotated in one direction while the other motor is rotated in the opposite direction; the chair turns in the direction corresponding to the wheel that rotates forward. Reversing the motors with respect to the above condition turns the chair in the opposite direction. This also provides a sharp turn, as the chair rotates about its own center. For smoother turns, one motor is stopped completely while the other rotates; in this case the chair rotates about the stopped wheel. The motors are operated through DC relays that derive the appropriate control signals from the microcontroller. When a relay is switched ON, the 25.9 V DC supply obtained from the main power source is connected directly to the motor.
Motor model: PM8018-RP7361 -- 63652
Motor rating: shaft speed = 41 RPM
Wheel radius r = 30 cm
Circumference of wheel C = 2 × π × r = 2 × 3.14 × 30 = 188.4 cm
Therefore speed = 188.4 × 41 = 7724.4 cm/min
Speed S = 7724.4 / 60 ≈ 128.74 cm per second (approx. 1.3 meters per second)
This is the maximum speed achievable by the wheelchair. If a lower speed is required, a pulse-width
modulation technique may be used to switch the relays ON and OFF at a particular frequency.
The average human walking speed is 1.4 meters per second, so a speed of 1.3 meters per second
for the chair is suitable for safe movement.
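The duty-cycle calculation behind that PWM scheme can be sketched as follows, assuming (as an illustration) that average speed scales linearly with the relay ON fraction over each switching period:

```c
#include <assert.h>

/* Sketch of reducing the chair's speed below the ~1.3 m/s maximum by
 * duty-cycling the drive relays, as the text suggests. Returns the ON
 * time in milliseconds for one switching period. The linear
 * speed-vs-duty assumption and the period length are illustrative. */

#define MAX_SPEED_CM_S 129   /* approx. maximum speed from the motor rating */

static unsigned on_time_ms(unsigned target_cm_s, unsigned period_ms)
{
    if (target_cm_s >= MAX_SPEED_CM_S)
        return period_ms;                       /* relay held ON */
    return period_ms * target_cm_s / MAX_SPEED_CM_S;
}
```

In practice the switching frequency would have to stay low for electromechanical relays, whose contacts wear quickly under rapid cycling; a solid-state driver would suit a finer-grained PWM.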
Relay Logic for bi-directional control of DC motors: As the motors are operated through relays, a
particular logic is needed to drive the motors in both directions (forward and reverse). For this
functionality two relays are required per motor. Figure 3.6 shows the circuit to implement the bi-
directional control of DC motors.
[Figure 3.6 shows the bi-directional DC motor control circuit for the autonomous wheelchair: the microcontroller signals MG1 Forward, MG1 Reverse, MG2 Forward and MG2 Reverse drive a ULN2804 (inputs IN1-IN8, outputs OUT1-OUT8), whose outputs energize four SPDT relays with 5 V coils that switch the 24 V supply to the two motors MG1 and MG2.]
Figure 3.6: Circuit for Bi-Directional Control of Motors
Circuit Description: A ULN driver chip is used to drive the relays. The ULN2804 is a driver with an open-collector configuration, so a logic 1 at an input provides a ground at the corresponding output; hence the ULN outputs are connected to the ground side of the relay coils. Two Darlington pairs are used per relay to increase the driving capability of the circuit. Each relay connects 25.9 V or ground to the particular motor as it is switched ON or OFF. The signals from the microcontroller are connected to the ULN inputs to operate the relays. Table 3.3 shows the movement of the chair with respect to the signals from the microcontroller.
Signals From Microcontroller Relay Status Wheelchair
MG1 F MG1 R MG2 F MG2 R LS1 LS2 LS3 LS4 Movement
0 0 0 0 OFF OFF OFF OFF STOP
0 0 0 1 OFF OFF OFF ON TURN RIGHT SLOW
0 0 1 0 OFF OFF ON OFF TURN LEFT SLOW
0 0 1 1 OFF OFF ON ON STOP
0 1 0 0 OFF ON OFF OFF TURN LEFT SLOW
0 1 0 1 OFF ON OFF ON REVERSE
0 1 1 0 OFF ON ON OFF TURN LEFT FAST
0 1 1 1 OFF ON ON ON TURN LEFT SLOW
1 0 0 0 ON OFF OFF OFF TURN RIGHT SLOW
1 0 0 1 ON OFF OFF ON TURN RIGHT FAST
1 0 1 0 ON OFF ON OFF FORWARD
1 0 1 1 ON OFF ON ON TURN RIGHT SLOW
1 1 0 0 ON ON OFF OFF STOP
1 1 0 1 ON ON OFF ON TURN RIGHT SLOW
1 1 1 0 ON ON ON OFF TURN LEFT SLOW
1 1 1 1 ON ON ON ON STOP
Table 3.3: Wheelchair Movement for relay conditions
Thus, from the table it is clear that different movements of the chair can be achieved with the various
input combinations at the ULN. Also, more than one combination can put the chair in the same
state; for example, it is in the stop state for 4 different input combinations. This allows the
programmer to switch between states by changing fewer bits. Fast and slow turning are
achieved using the differential turning mechanism discussed above.
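Table 3.3 maps naturally onto a lookup from the desired movement to the 4-bit pattern {MG1F, MG1R, MG2F, MG2R} written to the ULN2804 inputs. The sketch below covers the unambiguous rows; where the table lists several patterns for one state (e.g. STOP), the all-relays-OFF pattern is chosen, and the enum names are illustrative.

```c
#include <assert.h>
#include <stdint.h>

/* Lookup from a desired movement (per Table 3.3) to the 4-bit pattern
 * {MG1F, MG1R, MG2F, MG2R} driven onto the ULN2804 inputs. Where the
 * table gives several patterns for one state, all-OFF is used here. */

enum move { STOP, FORWARD, REVERSE, TURN_LEFT_FAST, TURN_RIGHT_FAST };

static uint8_t relay_bits(enum move m)
{
    switch (m) {
    case FORWARD:         return 0xA;   /* 1010: MG1F + MG2F */
    case REVERSE:         return 0x5;   /* 0101: MG1R + MG2R */
    case TURN_LEFT_FAST:  return 0x6;   /* 0110: MG1R + MG2F */
    case TURN_RIGHT_FAST: return 0x9;   /* 1001: MG1F + MG2R */
    default:              return 0x0;   /* 0000: all relays OFF = STOP */
    }
}
```

The returned nibble would be written to the GPIO pins feeding the ULN2804, so a state change is a single port write.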
LCD Interface: The system is provided with an LCD interface to the CPU that gives the system a graphical
user interface. All the outputs are visible on the LCD module, such as:
Mode of Operation ( Training mode, Autonomous mode)
Time, Temperature and other general data
Navigation Map and present status etc.
The LCD derives all its signals from the CPU, and hence the appropriate functions (like "Display") should be serviced accordingly in the program. The CPU is interfaced to the EPSON S1D13350 LCD controller, which takes care of displaying data and graphics on the LCD screen. The controller can also display scrolling text and partition the display into multiple screens.
As shown in Figure 3.7, the LCD controller derives the following signals from the CPU: DB0-DB7, the data
to be displayed as ASCII values; WR and RD, the write and read signals used to write data and commands
into the LCD memory and to read the LCD status when it is busy with its internal memory; and A0, to
distinguish between control and data words. In addition to these signals, it also has reset, contrast-control
and power-supply pins for operation of the LCD controller, and separate power pins for the backlight of
the module.
Figure 3.7: ERM320240SS-1 Internal Architecture
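The command/data handshake described above can be sketched as follows. This is a hedged illustration: the busy-flag bit position and the A0 polarity (1 = control word) are assumptions rather than values taken from the controller datasheet, and the bus is abstracted behind function pointers (with a host-side stub) so the sequence can be traced without hardware.

```c
#include <stdint.h>

/* Bus access abstracted so the write sequence can run off-target;
 * on the LPC2148 these callbacks would toggle the GPIO pins wired
 * to DB0-DB7, A0, WR and RD. */
typedef struct {
    void    (*write)(uint8_t a0, uint8_t byte); /* drives A0 and DB0-DB7, pulses WR */
    uint8_t (*read_status)(void);               /* A0 high, pulses RD               */
} lcd_bus_t;

#define LCD_BUSY 0x40  /* assumed busy bit in the status byte */

static void lcd_wait_ready(const lcd_bus_t *bus)
{
    while (bus->read_status() & LCD_BUSY)
        ;                       /* spin until the controller is free */
}

void lcd_command(const lcd_bus_t *bus, uint8_t cmd)
{
    lcd_wait_ready(bus);
    bus->write(1, cmd);         /* A0 = 1: control word */
}

void lcd_data(const lcd_bus_t *bus, uint8_t d)
{
    lcd_wait_ready(bus);
    bus->write(0, d);           /* A0 = 0: display data */
}

/* Host-side stub standing in for the real bus, for tracing only. */
static uint8_t stub_a0, stub_db;
static void stub_write(uint8_t a0, uint8_t b) { stub_a0 = a0; stub_db = b; }
static uint8_t stub_status(void) { return 0; } /* never busy */
static const lcd_bus_t stub_bus = { stub_write, stub_status };
```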
Communication Interfaces to Personal Computer: The control system has the following interfaces to a
personal computer:
JTAG interface for testing and debugging.
RS-232 serial interface for communication and programming.
The RS-232 interface provides a serial link to the computer for upload/download of programs and data.
This is a standard serial communication interface carried over the RXD and TXD signals, with a level
translator converting TTL/CMOS levels to RS-232 levels and vice versa. Once the chair has been
programmed in training mode by the user, this data can be uploaded to the computer so that it can later
be referenced, or modified and optimized using algorithms that are not practical to run on the system
CPU. The uploaded program also serves as a backup: if the control system breaks down, the user need
not train the chair again but can instead download the previously uploaded program from the PC.
The JTAG interface provides online testing and debugging of the control system. Some software
platforms can also program the chip through the JTAG interface. The CPU contains boundary-scan test
logic, also referred to as the IEEE 1149.1 standard. Figure 3.8 gives the circuit implementation of the
communication interfaces.
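As a small illustration of the serial side, the LPC2148's UART baud-rate divisor is PCLK / (16 x baud); the helper below computes it with rounding. The PCLK and baud values used in the comments are assumptions for illustration; on the target, the 16-bit result would be split across the UART0 divisor-latch registers U0DLL/U0DLM with the DLAB bit set in U0LCR.

```c
#include <stdint.h>

/* Baud-rate divisor for the LPC2148 UART: PCLK / (16 * baud),
 * rounded to the nearest integer. For example, an assumed PCLK of
 * 15 MHz at 9600 baud gives 15000000 / 153600 = 97.66, i.e. 98.
 * Fractional-divider fine tuning is ignored in this sketch. */
static uint16_t uart_divisor(uint32_t pclk_hz, uint32_t baud)
{
    return (uint16_t)((pclk_hz + 8u * baud) / (16u * baud));
}
```

On the target this value would be loaded once during initialisation, before enabling the transmit and receive paths used for the program upload/download described above.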
[Schematic not reproducible in text: the JTAG interface is built from 74HC125 buffer ICs driving a DB25
parallel-port connector, and the RS-232 serial interface is built around a MAX-232 level translator (per
Table 4.1) with its charge-pump capacitors and 1N5817 protection diodes, driving a DB9 serial-port
connector.]
Figure 3.8: Communication Interface Circuit Diagram
Keypad Interface: The keypad is another input device to the system. The user communicates with the
control system and issues commands through the keypad, while the LCD displays the user's activity and
the menu of control-system functions. A 4x4 keypad is interfaced to the CPU via a keypad encoder that
provides 4-bit encoded data and hardware debouncing logic, decreasing the software load considerably
as well as the number of I/O pins required for the interface. The keypad interface circuit is custom
designed around the EDE1144 integrated circuit, which can also deliver the key data serially to the CPU
via its serial output pin, as shown in Figure 3.9(a). Keys can thus be read over the serial output pin once
the DATA VALID signal goes high. The DATA VALID signal may therefore be connected to an interrupt
line of the CPU to detect any activity on the keypad, with an appropriate function called to read the data
and store it. The chip also outputs a beep signal to a connected buzzer to indicate when a key is pressed.
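The interrupt-driven read described above can be sketched as below. The 4-bit code-to-key mapping is an assumption made for illustration (the real assignment comes from the encoder's datasheet and Figure 3.9(b)); on the target, keypad_decode would be called from the interrupt routine triggered by DATA VALID, with the code read from the encoder's data pins.

```c
#include <stdint.h>

/* Assumed row-major mapping of the encoder's 4-bit key code to the
 * Figure 3.9(b) layout: digits plus arrow keys and a select key.
 * U/S/D/L/R/E below are placeholders for up, select, down, left,
 * right and a spare key; the real assignment may differ. */
static const char keymap[16] = {
    '1', '2', '3', 'U',
    '4', '5', '6', 'S',   /* S: select (centre)       */
    '7', '8', '9', 'D',
    'L', '0', 'R', 'E'
};

/* Called with the 4-bit code latched when DATA VALID goes high. */
char keypad_decode(uint8_t code4)
{
    return keymap[code4 & 0x0F];
}
```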
Figure 3.9(a) shows the circuit implementation of the keypad interface, and 3.9(b) shows the key
configuration of the 4x4 keypad for the system. As seen in the configuration, the keypad consists of the
digits 0-9 and an array of four direction arrows with a select button (middle) for navigating the graphical
interface on the LCD. This keypad provides the following functionality to the system:
Make settings for the control system and save these settings.
Input the pass code when requested by control system.
Execute standard commands without providing verbal input in case user is mute.
Navigate the Graphical User Interface Screen.
[Schematic not reproducible in text: the EDE1144 encoder, clocked by a 4 MHz crystal (Y1), scans the 4x4
switch matrix (SW1-SW16) through 330-ohm series resistors, with 4.7K pull-up resistors on the supply
side.]
Figure 3.9: (a) Interface Circuit of 4x4 Keypad and (b) Keypad Configuration
Joystick Interface: A 9000-series contactless joystick from APEM technologies, with inductive sensing
circuitry, is interfaced as an input device to the control system and performs the following operations:
Training the chair, in training mode, by steering it to a specific position.
Operating the chair in "Manual Override" mode.
The joystick operates by passing an oscillating current through a drive coil, directly mounted at
the lower end of the operating lever, and immediately above the four sensing coils. When the shaft and
drive coil moves away from the centre, the signals detected in each opposing pair of coils increase
nominally in proportion to deflection. The phase of those signals determines the direction. Synchronous
electronic switches followed by integrating amplifiers provide DC signals directly equivalent to those of
potentiometer joysticks. Figure 3.10 shows the equivalent circuit of the joystick. Since the signals for
both axes are available simultaneously on the X-Axis and Y-Axis output pins (when the joystick is moved
diagonally), the CPU reads both signals and operates the motors accordingly. The outputs are directly
connected to the ADC inputs of the CPU, and the analog readings are calibrated in advance and stored in
memory. The internal ADC of the CPU is a 10-bit successive-approximation analog-to-digital converter,
with each channel capable of more than 400,000 10-bit samples per second, providing high input-processing
speed for the joystick.
Figure 3.10: Equivalent Circuit of joystick
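A minimal sketch of how the two calibrated 10-bit ADC readings might be mapped to a drive direction is given below; the centre value and deadband are illustrative assumptions standing in for the calibration data stored in memory.

```c
#include <stdint.h>

/* Assumed calibration constants for a 10-bit ADC (0-1023 range). */
#define ADC_CENTRE 512
#define DEADBAND    64

typedef enum { JS_NEUTRAL, JS_FORWARD, JS_REVERSE, JS_LEFT, JS_RIGHT } js_dir_t;

/* Map the X/Y readings to a coarse direction: deflections inside
 * the deadband are ignored, otherwise the dominant axis wins. */
js_dir_t joystick_direction(uint16_t x, uint16_t y)
{
    int dx = (int)x - ADC_CENTRE;
    int dy = (int)y - ADC_CENTRE;

    if (dx * dx + dy * dy < DEADBAND * DEADBAND)
        return JS_NEUTRAL;
    if ((dy > 0 ? dy : -dy) >= (dx > 0 ? dx : -dx))
        return dy > 0 ? JS_FORWARD : JS_REVERSE;
    return dx > 0 ? JS_RIGHT : JS_LEFT;
}
```

A proportional-speed scheme could return the deflection magnitudes as well; the coarse directions shown here correspond to the relay states of Table 3.3.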
4. Component Listing:
Table 4.1: Component Listing
Component             Description                Vendor               Comments
LPC2148               CPU                        Philips
RSC-464               Speech Processor           SENSORY Inc.
ERM320240SS-1         LCD Module                 East Ring Tech.      Contains embedded EPSON SED-1335 controller
9SA50RR6559           Joystick                   APEM
PM8018-RP7361 --      D.C. Motor                 Groschopp Inc.
PL-1055275-7S-TM      Polymer Li-Ion Battery     Powerizer
GP2Y0A02YK            Distance Measuring Sensor  SHARP
LIS302DL              MEMS Accelerometer IC      ST Microelectronics
Power Supply Circuit  Regulated Power Supply     Custom Designed      Designed using regulator ICs and discrete components
Keypad                4x4 Keypad                 Custom Designed      Designed using EDE1144
JTAG Interface        Testing and Debug Circuit  Custom Designed      Designed using buffer ICs and discrete components
RS-232 Interface      Serial Interface           Custom Designed      Designed using MAX-232 IC
Driver Circuit        Motor Driver               Custom Designed      Designed using ULN-2804
5. Design Requirements and Physical Layout
Design Name: @utochair-2008
Approximate cost of a single design for testing: $1850-2050
Approximate cost of the design for mass production: $1250-1450
Note: The above prices include the price of the control system with battery and the wheelchair.
Estimated Time for Construction:
Time for construction of a single module for testing: 3 months
Time for construction of a single module for mass production: 1 week/module for a single
assembly stage, 2-3 days with assembly-line production.
Note: The time for the single module includes the time required to develop the software for the system.
The development tools used are listed below.
For the ARM microcontroller:
Keil RVMDK-ARM development tool: includes the uVision IDE, debugger, simulator, RealView
compilation tools, and the RTX real-time kernel.
Runs on the Windows XP platform.
Programming language: C.
For the Sensory speech recognition processor:
Sensory's FluentChip firmware.
[Photograph not reproducible in text; the infra-red sensors mounted at the front are labelled in the figure.]
Figure 5.1: Physical Layout of Components
Figure 5.1 shows the placement of all the components of the system. Some of the components and
modules not visible in the picture are:
Sensing modules on the other two sides (back and right).
Motors, which are placed inside with their shafts connected to the rear wheels.
The microphone, which is embedded in the armrest of the chair.
Future design work includes:
An advanced navigation system using GPS and a digital-camera interface for place recognition.
Connectivity to a mobile device so that commands can be given to the chair from a remote location.
Connectivity to the Internet for data transfer and software upgrades.
Training of the chair on the graphical screen instead of by physical movement.
More advanced algorithms that optimize specified commands online.