Proceedings of the 1999 IEEE
International Conference on Robotics & Automation
Detroit, Michigan • May 1999
Share Control in Intelligent Arm/Hand Teleoperated System
You Song  Wang Tianmiao  Wei Jun  Yang Fenglei  Zhang Qixian
Robotics Institute of Beijing University of Aero. and Astro.
Beijing, China, 100083
Tel: (8610) 82317748 FAX: (8610) 62371315
Email: syou@public.buaa.edu.cn
Abstract: This system is mainly composed of an
industrial robot, a dexterous hand (BH-3), a graphic
simulation and planning module, a 6-DOF teleoperated
mechanical arm (BH-TMA), a 5-fingered 11-DOF data
glove (BHG-3), etc. It includes vision, force, torque,
fiber, angle, and fingertip tactile sensors. In order to
implement complex operations in the integrated
system, we propose a task-oriented hierarchical share
control model. Moreover, we also express our
viewpoints on share control in teleoperated systems.
Finally, experimental and simulation results are given
to show that the share control construction is efficient,
valuable and successful.
1. Introduction
Share control in teleoperated systems is a very
important issue at the frontier of space robotics. Much
research has been done on space robots, autonomous
agents, intelligent control, dexterous manipulation, etc.
Many experiments [1][2][3] show that it is impossible
for a space robot to autonomously perform space-
manipulation tasks in a complex environment;
therefore an astronaut or an operator on the ground
needs to remotely monitor and operate the executive
system. Simultaneously, under the influence of
communication time delay and micro-gravity
manipulation, erroneous judgments by the astronaut or
operator cannot be avoided, since the cause-effect
relation will be destroyed. The teleoperated share
control technique is a very effective method to resolve
the above questions by coordinating high-level human
monitoring and low-level autonomous control.
With the development of robot application and
research, a teleoperated robot system necessarily
depends on varied sensors and external instruments
to obtain the relevant environmental information, such
as vision, force, distance, tactile and so on.
Furthermore, as the quantity and variety of sensors
increase, each sensor type has its own characteristics
and functions, so it is not feasible to find a general
model for different sensors that is independent of the
physical sensors. Sensor integration, fusion and share
techniques are therefore becoming increasingly
important for improving performance and robustness
in such systems.
In this arm/hand system, because of the
disequilibrium and uncertainty of time, space, and
position, a single control model for various different
tasks is impossible. In terms of past references [8],
share control with multi-sensor integration and data
fusion is still a difficult task.
Traditional share control methods often adopt the
Bayes decision approach [6] and the Dempster-Shafer
evidence model [7], but these two methods have their
respective defects. The Bayes decision approach
cannot strictly distinguish between the uncertain and
the unknown. Dempster-Shafer evidence theory can
make up for this defect, but it lacks rigor in its
axiomatic mathematical definition. Here, we propose
a task-oriented multi-agent share construction for this
system.
2. Architecture
This teleoperated system is a platform for research
and application. It comprises 4 main modules: 1)
graphic simulation for task and trajectory planning
using BH-TMA and BHG-3; 2) local autonomous
control for tracking, grabbing and manipulating
workpieces in the workspace based on multiple sensors,
such as global and local vision, wrist force/torque,
optical fiber, and fingertip force; 3) telemanipulation
from global simulation to local planning, coordinating
the robotic arm/hand via remote data communication;
4) remote control of robotic arm/hand manipulation by
BH-TMA and BHG-3. The system physical diagram is
shown in Fig. 1.
There are 15 DOF in the autonomous control sub-
system, 6 for the arm and 9 for the hand, and 17 DOF
in the teleoperated sub-system, 6 for the teleoperated
mechanical arm and 11 for the data glove. With so
many degrees of freedom, an effective approach is to
decompose the search space into lower-dimensional
subsets that can be explored using heuristic search
techniques. Even then, well-chosen sensing and
reconstruction strategies are essential to reduce the
geometric complexity of the planning problem.
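As a minimal illustration of the decomposition idea above (the candidate grids, costs, and dimensions below are toy stand-ins chosen for illustration, not the system's actual planner), the joint 15-D search can be replaced by two much smaller sequential searches, arm first and hand second:

```python
import itertools

# Toy sketch: instead of searching the full arm+hand configuration space
# jointly, plan the arm over a coarse grid first, then the hand given the
# chosen arm pose. All values here are hypothetical stand-ins.

def plan_decomposed(arm_candidates, hand_candidates, arm_cost, hand_cost):
    """Greedy two-stage search: best arm pose, then best hand pose given it."""
    best_arm = min(arm_candidates, key=arm_cost)
    best_hand = min(hand_candidates, key=lambda h: hand_cost(best_arm, h))
    return best_arm, best_hand

# Toy costs: mean-joint distance to a target value for the arm,
# and a finger-spread penalty for the hand.
target = 0.5
arm_cost = lambda q: abs(sum(q) / len(q) - target)
hand_cost = lambda q_arm, q_hand: abs(max(q_hand) - min(q_hand))

arms = list(itertools.product([0.0, 0.5, 1.0], repeat=2))   # coarse stand-in for 6-D
hands = list(itertools.product([0.2, 0.4], repeat=3))       # stand-in for 9-D
arm, hand = plan_decomposed(arms, hands, arm_cost, hand_cost)
```

Searching the two subspaces sequentially visits 9 + 8 candidates here instead of the 72 of the joint product, at the cost of possibly missing solutions where arm and hand choices interact.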
0-7803-5180-0/99 $10.00 © 1999 IEEE 2489
Fig. 1 System physical diagram
3. Implementing Techniques
The hand/arm teleoperated system can control,
make decisions and execute based on multi-sensor
fusion information. It can adapt to environmental
change, track and locate objects, modify the planning
module, remotely manipulate workpieces, receive
simulation data, and harmoniously perform dexterous
assembly tasks.
In this system, the share control is mainly
composed of three modules: autonomous control,
teleoperation and simulation. In this article, our share
control emphasizes three different contexts:
• Sensor data share
• Multi-agent-based share
• Man/machine interactive share
The sensor data share is a basic share mode, and is
the foundation of low-level local autonomous control.
The multi-agent-based share is a behavior-based and
task-oriented share mode; it is important for the
hand/arm to perform dexterous and precise tasks. The
man/machine interactive share is a system-level share
mode to coordinate high-level planning and low-level
autonomous control, and it guarantees that various
manipulations can be remotely fulfilled in a safe
condition.
3.1 Autonomous control
The autonomous control module fuses the
environmental data obtained from sensors, and
compares and filters them with an optimized model. In
terms of the planning result of the high-level
simulation system, it determines action and task
sequences for path, orbit and grasping optimization,
and in turn controls the low-level controller and
mechanical platform to perform the respective task.
To fulfill these functions, the autonomous control
architecture is shown in the Fig. 2 diagram:
(Figure content: motion module planning and task planning; module and protocol share control; a priori image treatment.)
Fig. 2 Autonomous control Architecture
3.2 Sensor data share
The integrated system consists of many kinds of
sensors. Two CCD cameras provide the location
parameters to control the motion of the arm/hand and
calibrate the environment between the robot and the
worktable in the workspace. The 6-D wrist
force/torque sensor mounted on the end-effector of the
PUMA560 and the 3-D tactile sensor on the fingertip
of the BH-3 dexterous hand are used to measure force
and implement compliant control. Nine angular
potentiometers in the finger joints provide the grasp
space information. Three optical fiber sensors are used
to avoid obstacles and collisions. To handle so many
sensor data, we propose the low-level sensor data
share architecture shown in Fig. 3.
Fig. 3 Sensor Data Share Architecture
3.3 Multi-agent-based share
As the figure above shows, there are a lot of
sensors and control hardware in this system. How to
share so many external information resources well
and make full use of them is the key to implementing
manipulation tasks properly. Task-oriented sensor
fusion and share is the chief strategy. The fusion
module in the manipulating process can be expressed
by the following function:
    P = { P(S1),  S1 ∈ (condition S1)
          P(S2),  S2 ∈ (condition S2)
          P(S3),  S3 ∈ (condition S3)
          ...
where P is the logic control parameter based on multi-
sensor data, T is the task decision threshold used to
shift between different sub-task operations, and S
represents the sensor status used to assist fusion and
decision. This model can be simply illustrated as
follows:
When S1 is within its relative operating range in the
work process, the fusion data P are mainly acquired
from it, and the operation enters the relative task
section. When S1 is out of its relative operating range,
the fusion data are mainly acquired from S2 or S3 or
others, and the control system shifts into a different
task section.
The priority level of S is defined by the task, and S
will be integrated and take action under different
sensor conditions. A modular program with level
control performs sensor data share, data
communication, and low-level robot cooperation.
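The switching behavior described above can be sketched as follows; the sensor names, operating ranges, and priorities are hypothetical values chosen for illustration, not taken from the system:

```python
# Hypothetical sketch of the task-oriented fusion model: P is taken from
# the highest-priority sensor currently inside its valid operating range,
# and the controller shifts to that sensor's task section.

SENSORS = [
    # (name, (low, high) valid operating range, priority: lower = higher)
    ("fiber_distance", (0.005, 0.10), 0),   # metres to workpiece
    ("wrist_force",    (0.0, 20.0),   1),   # newtons
    ("fingertip",      (0.0, 5.0),    2),
]

def fuse(readings):
    """Return (P, active_sensor): P from the highest-priority sensor whose
    reading lies inside its operating range; (None, None) if all are out."""
    in_range = [(prio, name, readings[name])
                for name, (lo, hi), prio in SENSORS
                if lo <= readings[name] <= hi]
    if not in_range:
        return None, None          # no sensor valid: stop / ask the operator
    prio, name, value = min(in_range)
    return value, name

P, active = fuse({"fiber_distance": 0.02, "wrist_force": 3.0, "fingertip": 0.1})
```

When the distance reading leaves its range (e.g. the hand has reached the workpiece), the same call falls through to the force sensor and the task section shifts accordingly.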
In the teleoperated system, we adopt a multi-level
sensor integration and data fusion module; the
functions of the different sensors are:
Distance sensor: By using three fiber sensors, we can
obtain the distance from the fingertip of the dexterous
hand to the workpiece. This is necessary to perform
the operating tasks accurately; on the other hand, it is
the key to avoiding unexpected collisions between the
hand and the operated object.
Force/torque sensor: The 6-D wrist force/torque
sensor provides force and torque values to perform
precise axle-hole assembly and workpiece access, and
ensures experimental safety through compliant control.
Fingertip force sensor: With the fingertip force
sensors, we can obtain the touch force between finger
and object in the process of operation, and can
compute the force values of the 3 fingers. If the value
exceeds the threshold in some direction, the sub-
system immediately sends a command to stop the
movement in that direction, and the whole
manipulation does not proceed until the force in every
direction is acceptable. By fusing the finger force data,
the force acting on the fingers is limited to a proper
scope.
Angular sensor: During dexterous hand movement,
the sub-system obtains every joint's angle in every
action cycle (25 ms) to judge whether the joint angle is
in the normal range. Once an angle exceeds the
normal limit, the operation is stopped at once.
Visual treatment: The vision sensors provide the
information used to calibrate the system and
accurately locate the workpiece's position for real-
time visual tracking and search.
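The fingertip-force and joint-angle checks above amount to a simple per-cycle safety monitor. A minimal sketch, with an assumed force threshold (the paper gives only the -20° to 90° glove joint range and the 25 ms cycle):

```python
# Hypothetical per-cycle (25 ms) safety checks: stop motion along a
# direction when its fingertip force exceeds the threshold, and abort the
# whole operation when any joint angle leaves its normal range.
# The force threshold is an assumed value, not from the paper.

FORCE_THRESHOLD = 2.0            # N, per direction (assumed)
JOINT_RANGE = (-20.0, 90.0)      # degrees, as for the data-glove joints

def check_cycle(finger_forces, joint_angles):
    """Return (stopped_directions, abort): indices of directions whose
    force exceeds the threshold, and whether any joint is out of range."""
    stopped = [i for i, f in enumerate(finger_forces)
               if abs(f) > FORCE_THRESHOLD]
    abort = any(not (JOINT_RANGE[0] <= a <= JOINT_RANGE[1])
                for a in joint_angles)
    return stopped, abort

stopped, abort = check_cycle([0.5, 2.5, 1.0], [10.0, 45.0, 95.0])
```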
3.4 Man/machine interactive share
We also develop a real-time control program using
the 6-DOF mechanical arm and the 5-finger 11-DOF
data glove made by us. It is a very useful platform for
research on space robotics, teleoperated share control
technology and so on.
Data glove: By using the BHG-3 data glove, we can
control the dexterous hand to perform the teleoperated
tasks coordinately.
Tele-operating: By using the 6-D BH-TMA, we can
control the PUMA560 robot arm to perform the
teleoperated tasks coordinately.
3.5 5-finger 11-joint data glove
Fig.4 Data glove full view
The BHG-3 data glove consists of mechanical parts,
electrical parts, an A/D data collecting board and
simulation software. It suits different types of adult
hand, and detects small movements of 11 DOF
distributed on five fingers. The mechanical part has
193 parts, and the net weight is 300 g. It can detect tiny
movements from -20° to 90°, and the resolution is up to
0.6°. With the graphic simulation software, it performs
real-time man/machine interactive control.
BHG-3 adopts a mechanical connecting-rod
mechanism to detect joint movement, including a 2-DOF
spatial 6-rod structure and a 1-DOF planar 4-rod
structure. When worn, BHG-3 can be fixed to the hand
by a leather belt. The theoretical detection analysis of
the 2-DOF mechanism is shown in Fig. 5.
Fig. 5 Detecting theoretical analysis
where α is the joint angle and β is the potentiometer
angle. If there is a subtle change Δα in the joint angle,
the potentiometer can detect a subtle change Δβ. We
can compute the non-concentric circular movement to
obtain the finger joint position. Based on the sine
theorem, we work out Δα:

    sin β / sin γ = r / d                                  (1)

    sin γ = (sin β · d) / r                                (2)

Considering that Δγ is a subtle change:

    sin(γ + Δγ) ≈ sin γ + Δγ · cos γ                       (3)

Solving:

    Δγ = (d · sin(β + Δβ) − r · sin γ) / (r · cos γ)       (4)

From (2) and (4), Δα is the following:

    Δα = Δβ + Δγ
       = Δβ + (d · sin(β + Δβ) − d · sin β) / √(r² − d² · sin²β)   (5)
Because the function between Δα and Δβ is stable,
it is very convenient and quick to compute the joint
change. Meanwhile, the data glove structure is simple
and well suited to teleoperation by an operator.
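Eq. (5) can be evaluated directly. A minimal sketch, with assumed link lengths r and d (the paper does not list the mechanism's dimensions):

```python
import math

# Sketch of Eq. (5): joint-angle change Delta-alpha from the measured
# potentiometer change Delta-beta. r and d are link lengths of the
# detecting mechanism; the values below are assumed for illustration.

def joint_change(beta, delta_beta, r, d):
    """Δα = Δβ + (d·sin(β+Δβ) − d·sinβ) / sqrt(r² − d²·sin²β)."""
    num = d * math.sin(beta + delta_beta) - d * math.sin(beta)
    den = math.sqrt(r * r - (d * math.sin(beta)) ** 2)
    return delta_beta + num / den

da = joint_change(beta=0.3, delta_beta=0.01, r=30.0, d=10.0)  # radians
```

Because the denominator depends only on the current β, the mapping from Δβ to Δα is smooth and cheap to evaluate once per 25 ms cycle.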
3.6 Visual servo calibration
We adopt vision sensors in the hand/arm teleoperated
system to resolve uncertainty, calibrate the workpiece
position and posture, and identify and locate objects in
the external environment. Meanwhile, vision can
improve the system's autonomous ability and enable
dexterous and accurate manipulation.
The system has two CCD cameras: one is mounted
above the workspace as a global sensor, and the other
is on the back of the dexterous hand as a local sensor.
The images collected from the two CCDs are
transmitted to a high-level PC, then pre-processed,
computed and analyzed by the visual algorithm.
In a visual servo system, to locate accurately, we
must build the image relation between the 2-D plane
and 3-D space to define the target's 3-D position. The
usual method is to calibrate first to obtain the inner
(focal distance, proportional factor, distortion
coefficient, etc.) and outer (direction and position in
world coordinates, etc.) parameters of the vision
sensor, and then compute the precise target position.
Considering that these complex calibrating and
locating algorithms are limited to special
environments, we propose a new calibrating and
locating method for our hand/arm integrated
teleoperated system. We use a self-designed 3-D
two-layer template with markers to simply resolve
vision sensor calibration and correct image distortion.
From two images grabbed while the robot moves, we
model the two projection lines through the target
point, and work out their point of intersection to locate
the real position of the target point, i.e. the target point
coordinates. Because the projection line through the
target point has intersection points with the template,
we can get its coordinates in accordance with the
projection relation.
There are two stages in the whole locating process:
calibration and location. The calibrating principle
diagram is given in Fig. 6.
Fig. 6 Calibrating principle
where A is the image plane; B is the first position of
the template; C is the moved position of the template;
OXYZ represents the image coordinate axes; Ouvw
represents the template coordinate axes; d represents
the sliding distance of the template; P is the target
point; P′ is its image point; P1 and P2 are the
intersection points between the projection line and the
template. Suppose that a marker point (Xi, Yi) in the
template has different coordinates (X1i, Y1i) and
(X2i, Y2i), i = 1, 2, ..., m (m is the quantity of marker
points), in the two images:
    Xi = f1x(X1i, Y1i),  Yi = f1y(X1i, Y1i)
    Xi = f2x(X2i, Y2i),  Yi = f2y(X2i, Y2i)                (6)

In terms of (6), we can calibrate the visual system
coefficients (fx, fy). According to the projection
relation preserved between the images and the
template, we can get the intersection point between the
projection line and the virtual template (i.e. define the
projection line through the target point). In turn we
can define the two projection lines in the two images,
and finally find their intersection point to complete a
location.
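The final locating step, intersecting the two projection lines, can be sketched as follows. Since two measured 3-D lines rarely meet exactly, the sketch returns the midpoint of their common perpendicular; the line data here are illustrative, not from the system:

```python
# Hypothetical sketch of the locating stage: the target is recovered as
# the (near-)intersection of the two projection lines obtained from the
# two grabbed images. Each line is given as a point plus a direction.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def intersect_lines(p1, u, p2, v):
    """Midpoint of the closest points on lines p1 + t*u and p2 + s*v."""
    w0 = [x - y for x, y in zip(p1, p2)]
    a, b, c = dot(u, u), dot(u, v), dot(v, v)
    d, e = dot(u, w0), dot(v, w0)
    denom = a * c - b * b            # zero only for parallel lines
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    c1 = [p + t * x for p, x in zip(p1, u)]
    c2 = [p + s * x for p, x in zip(p2, v)]
    return [(x + y) / 2.0 for x, y in zip(c1, c2)]

# Two lines that meet exactly at (1, 0, 0):
target = intersect_lines((0, 0, 0), (1, 0, 0), (1, -1, 0), (0, 1, 0))
```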
4. Experiments
4.1 Autonomous operation
A) Switch button
In the limited operating environment, the system
can autonomously use local vision to track and locate
the button on the worktable. At the same time, the arm
and hand approach the button in a human-like posture
to push it. Once the force threshold detected in real
time by the force sensor is exceeded, the system stops
immediately, as shown in Fig. 7.
Fig. 7 Push button
B) Twist the bulb and the valve
By the guide of local vision, the system can locate
accurately the bulb, valve in the operating table, and
control the dexterous hand to take hold of them. After
grasping the bulb and the valve, the robot and BH-3
hand can harmonically perform a series of
again-loosing again etc as shown in Fig.8.
Fig. 8 Twist bulb
4.2 Plug-in hole for assembly
To perform the plug-in hole, degenerated grasp
method is explored to ensure grasping reliability,
concentricity and strength. To ensure plllg-in
precision, vision tracking is Llsed for locating the
workpiece in operation table. To ensure system’s
reliable and safety. 6D wrist force sensor is Llsed for
detecting the threshold of the tactile force real time
dLlring inserting hole.
Fig. 9 Plug in the hole
For example, in Fig. 9, the system controls the
arm and hand to move above the workpiece position,
and immediately uses local vision to locate the
workpiece accurately on the operation table. After that,
the BH-3 hand firmly grasps the workpiece with the
degenerated method, pulls the workpiece up from the
hole in the operation table, and then carries it to the
next work area. The system autonomously switches
the local vision to search for the hole and locate its
position accurately. Under the guidance of the 6-D
wrist force sensor, the system inserts the workpiece
slowly. Once the threshold of the inserting force is
exceeded, it stops the operation autonomously to
protect the arm and hand.
4.3 Grab the cup and pour a cup of tea
The pouring-water experiment is shown in Fig. 10.
Fig. 10 Pour a cup of tea
We design a set of human-like tasks to demonstrate
dexterous manipulation. Guided by vision, the system
grabs a cup full of tea from the worktable, moves to
another position, and slowly and accurately pours the
water into an empty cup. Moreover, the system can
also use the optical-fiber distance sensors to protect
against collisions in case of damage during movement.
4.4 Remote control with teleoperated mechanical
arm and data glove
We also develop an approach to control the PUMA560
arm using BH-TMA and to control the BH-3 dexterous
hand using BHG-3 over a long distance, to perform the
relevant obstacle-avoiding and cup-grabbing operations.
It is useful for research on space robotics and the
teleoperated shared control technique. Our research
work is shown in Fig. 11 and Fig. 12.
Fig. 11 Control arm/hand to grab cup by
BH-TMA and BH-DG
Fig. 12 Real avoiding obstacle operation
5. Conclusion
This paper proposes a multi-sensor integrated share
model for hand/arm teleoperation. We perform local
autonomy, teleoperated control and simulation
planning experiments. Based on our experimental
research, the multi-agent-based share and man/machine
interactive share are our main contributions. Our
experimental verifications show that the methods used
in the operating tasks are highly efficient and simple,
including grabbing the cup and pouring a cup of tea,
plugging into the hole for assembly, twisting the bulb
(valve), and teleoperated control.
In our next research, we will continuously improve
the hardware environment to increase the integrated
system's real-time performance and accuracy, and will
add a set of VR equipment for teleoperated dexterity.
We will further develop web-based teleoperation
theory, large-delay control, graphic simulation,
intelligent local autonomy, force compliant control
and so on.
References
[1] DLR Institute for Robotics and System Dynamics, Prof. Dr.-Ing. G. Hirzinger. "Space Robot Activities - A Survey." 1987-1992 Scientific Report.
[2] K. Machida, Y. Toda, Y. Murase, and S. Komada. "Precise Space Telerobotic System Using 3-Finger Multisensory Hand." Proc. of the IEEE Int. Conf. on Robotics and Automation, 1995.
[3] M. Nohmi, D. N. Nenchev and M. Uchiyama. "Momentum Control of a Tethered Space Robot Through Tether Tension Control." Proc. of the IEEE Int. Conf. on Robotics and Automation, 1998, pp. 920-925.
[4] Rafael Kelly, Paul Shirkey and Mark W. Spong. "Fixed-Camera Visual Servo Control for Planar Robots." Proc. of the IEEE Int. Conf. on Robotics and Automation, 1996, pp. 2643-2649.
[5] Andersen U. "Design of a Data Glove Input System." M.Sc. Thesis, University of Salford, 1994.
[6] Durrant-Whyte H. F. "Sensor Models and Multisensor Integration." Int. J. Robot. Res., 1988, 7(6): 97-113.
[7] Bogler P. L. "Shafer-Dempster Reasoning with Applications to Multisensor Target Identification Systems." IEEE Trans. Syst. Man Cybern., 1987, SMC-17(6): 968-977.
[8] Green F. C. A., et al. "Multisensory robot assembly station." Robotics, 1986, 2: 205-214.
2494

More Related Content

What's hot

EGT10 DESIGN AND APPLICATION FOR POSITION GPS TRACKER WITH VISUAL BASIC
EGT10 DESIGN AND APPLICATION FOR POSITION GPS TRACKER WITH VISUAL BASICEGT10 DESIGN AND APPLICATION FOR POSITION GPS TRACKER WITH VISUAL BASIC
EGT10 DESIGN AND APPLICATION FOR POSITION GPS TRACKER WITH VISUAL BASICijmnct
 
Design and Development of Arm-Based Control System for Nursing Bed
Design and Development of Arm-Based Control System for Nursing Bed Design and Development of Arm-Based Control System for Nursing Bed
Design and Development of Arm-Based Control System for Nursing Bed IJCSES Journal
 
Design, Develop and Implement an Efficient Polynomial Divider
Design, Develop and Implement an Efficient Polynomial DividerDesign, Develop and Implement an Efficient Polynomial Divider
Design, Develop and Implement an Efficient Polynomial DividerIJLT EMAS
 
IRJET- Wearable Sensor based Fall Detection System
IRJET- Wearable Sensor based Fall Detection SystemIRJET- Wearable Sensor based Fall Detection System
IRJET- Wearable Sensor based Fall Detection SystemIRJET Journal
 
EMBED SYSTEM FOR ROBOTIC ARM WITH 3 DEGREE OF FREEDOM CONTROLLER USING COMPUT...
EMBED SYSTEM FOR ROBOTIC ARM WITH 3 DEGREE OF FREEDOM CONTROLLER USING COMPUT...EMBED SYSTEM FOR ROBOTIC ARM WITH 3 DEGREE OF FREEDOM CONTROLLER USING COMPUT...
EMBED SYSTEM FOR ROBOTIC ARM WITH 3 DEGREE OF FREEDOM CONTROLLER USING COMPUT...csandit
 

What's hot (8)

Ijcatr04061003
Ijcatr04061003Ijcatr04061003
Ijcatr04061003
 
EGT10 DESIGN AND APPLICATION FOR POSITION GPS TRACKER WITH VISUAL BASIC
EGT10 DESIGN AND APPLICATION FOR POSITION GPS TRACKER WITH VISUAL BASICEGT10 DESIGN AND APPLICATION FOR POSITION GPS TRACKER WITH VISUAL BASIC
EGT10 DESIGN AND APPLICATION FOR POSITION GPS TRACKER WITH VISUAL BASIC
 
L10 fkbc
L10 fkbcL10 fkbc
L10 fkbc
 
Design and Development of Arm-Based Control System for Nursing Bed
Design and Development of Arm-Based Control System for Nursing Bed Design and Development of Arm-Based Control System for Nursing Bed
Design and Development of Arm-Based Control System for Nursing Bed
 
Design, Develop and Implement an Efficient Polynomial Divider
Design, Develop and Implement an Efficient Polynomial DividerDesign, Develop and Implement an Efficient Polynomial Divider
Design, Develop and Implement an Efficient Polynomial Divider
 
IRJET- Wearable Sensor based Fall Detection System
IRJET- Wearable Sensor based Fall Detection SystemIRJET- Wearable Sensor based Fall Detection System
IRJET- Wearable Sensor based Fall Detection System
 
EMBED SYSTEM FOR ROBOTIC ARM WITH 3 DEGREE OF FREEDOM CONTROLLER USING COMPUT...
EMBED SYSTEM FOR ROBOTIC ARM WITH 3 DEGREE OF FREEDOM CONTROLLER USING COMPUT...EMBED SYSTEM FOR ROBOTIC ARM WITH 3 DEGREE OF FREEDOM CONTROLLER USING COMPUT...
EMBED SYSTEM FOR ROBOTIC ARM WITH 3 DEGREE OF FREEDOM CONTROLLER USING COMPUT...
 
08764396
0876439608764396
08764396
 

Similar to ieee

Design and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasksDesign and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasksEcwayt
 
Design and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasksDesign and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasksEcwaytech
 
Design and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasksDesign and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasksEcwayt
 
Design and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasksDesign and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasksecwayerode
 
IRJET- Real Time Implementation of Air Writing
IRJET- Real Time Implementation of  Air WritingIRJET- Real Time Implementation of  Air Writing
IRJET- Real Time Implementation of Air WritingIRJET Journal
 
A haptic feedback system based on leap motion controller for prosthetic hand ...
A haptic feedback system based on leap motion controller for prosthetic hand ...A haptic feedback system based on leap motion controller for prosthetic hand ...
A haptic feedback system based on leap motion controller for prosthetic hand ...IJECEIAES
 
fmelleHumanActivityRecognitionWithMobileSensors
fmelleHumanActivityRecognitionWithMobileSensorsfmelleHumanActivityRecognitionWithMobileSensors
fmelleHumanActivityRecognitionWithMobileSensorsFridtjof Melle
 
The Design of Multi-Platforms Rail Intelligence Flatness Detection System
The Design of Multi-Platforms Rail Intelligence Flatness Detection SystemThe Design of Multi-Platforms Rail Intelligence Flatness Detection System
The Design of Multi-Platforms Rail Intelligence Flatness Detection SystemIJRESJOURNAL
 
Unobtrusive hand gesture recognition using ultra-wide band radar and deep lea...
Unobtrusive hand gesture recognition using ultra-wide band radar and deep lea...Unobtrusive hand gesture recognition using ultra-wide band radar and deep lea...
Unobtrusive hand gesture recognition using ultra-wide band radar and deep lea...IJECEIAES
 
Surface Electromyography (SEMG) Based Fuzzy Logic Controller for Footballer b...
Surface Electromyography (SEMG) Based Fuzzy Logic Controller for Footballer b...Surface Electromyography (SEMG) Based Fuzzy Logic Controller for Footballer b...
Surface Electromyography (SEMG) Based Fuzzy Logic Controller for Footballer b...IRJET Journal
 
Hand gesture recognition using ultrasonic sensor and atmega128 microcontroller
Hand gesture recognition using ultrasonic sensor and atmega128 microcontrollerHand gesture recognition using ultrasonic sensor and atmega128 microcontroller
Hand gesture recognition using ultrasonic sensor and atmega128 microcontrollereSAT Publishing House
 
Magnetic levitation system
Magnetic levitation systemMagnetic levitation system
Magnetic levitation systemrahul bhambri
 
Media Control Using Hand Gesture Moments
Media Control Using Hand Gesture MomentsMedia Control Using Hand Gesture Moments
Media Control Using Hand Gesture MomentsIRJET Journal
 
IRJET- Offline Location Detection and Accident Indication using Mobile Sensors
IRJET- Offline Location Detection and Accident Indication using Mobile SensorsIRJET- Offline Location Detection and Accident Indication using Mobile Sensors
IRJET- Offline Location Detection and Accident Indication using Mobile SensorsIRJET Journal
 
Optimal Control of a Teleoperation System via LMI- based Robust PID Controllers
Optimal Control of a Teleoperation System via LMI- based Robust PID ControllersOptimal Control of a Teleoperation System via LMI- based Robust PID Controllers
Optimal Control of a Teleoperation System via LMI- based Robust PID Controllersidescitation
 
Wearable sensor network for lower limb angle estimation in robotics applications
Wearable sensor network for lower limb angle estimation in robotics applicationsWearable sensor network for lower limb angle estimation in robotics applications
Wearable sensor network for lower limb angle estimation in robotics applicationsTELKOMNIKA JOURNAL
 
A Survey on Mobile Sensing Technology and its Platform
A Survey on Mobile Sensing Technology and its PlatformA Survey on Mobile Sensing Technology and its Platform
A Survey on Mobile Sensing Technology and its PlatformEswar Publications
 
Draft activity recognition from accelerometer data
Draft activity recognition from accelerometer dataDraft activity recognition from accelerometer data
Draft activity recognition from accelerometer dataRaghu Palakodety
 
Improving Posture Accuracy of Non-Holonomic Mobile Robot System with Variable...
Improving Posture Accuracy of Non-Holonomic Mobile Robot System with Variable...Improving Posture Accuracy of Non-Holonomic Mobile Robot System with Variable...
Improving Posture Accuracy of Non-Holonomic Mobile Robot System with Variable...TELKOMNIKA JOURNAL
 

Similar to ieee (20)

Design and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasksDesign and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasks
 
Design and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasksDesign and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasks
 
Design and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasksDesign and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasks
 
Design and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasksDesign and evaluation of a haptic computer assistant for tele-manipulation tasks
Design and evaluation of a haptic computer assistant for tele-manipulation tasks
 
IRJET- Real Time Implementation of Air Writing
IRJET- Real Time Implementation of  Air WritingIRJET- Real Time Implementation of  Air Writing
IRJET- Real Time Implementation of Air Writing
 
Epma 010
Epma 010Epma 010
Epma 010
 
With the development of robot applications and research, a teleoperated robot system necessarily depends on various sensors and external instruments to obtain the relevant environmental information, such as vision, force, distance, tactile data, and so on. Furthermore, as the quantity and variety of sensors increase, each sensor type has its own characteristics and functions. Therefore, it is not feasible to find one general model for different sensors that is independent of the physical sensors. Sensor integration, fusion, and sharing techniques are thus becoming increasingly important for improving performance and robustness in such systems. In this arm/hand system, because of the disequilibrium and uncertainty of time, space, and position, a single control model for the various tasks is impossible. According to past references [8], share control with multi-sensor integration and data fusion is still a difficult task. Traditional share control often adopts the Bayes decision approach and the Dempster-Shafer theory of evidence [7], but these two methods have their respective shortcomings. The Bayes decision approach cannot strictly distinguish between the uncertain and the unknown. Dempster-Shafer evidence theory can make up for this shortcoming, but it lacks rigor in its axiomatic mathematical definition. Here, we propose the task-oriented multi-agent share architecture that was put forward for this system.

2. Architecture

This teleoperated system is a platform for research and application.
It comprises four main modules: 1) graphic simulation for task and trajectory planning using BH-TMA and BHG-3; 2) local autonomous control for tracking, grabbing, and manipulating workpieces in the workspace based on multiple sensors, such as global and local vision, wrist force/torque, optical fiber, and fingertip force; 3) telemanipulation from global simulation to local planning, coordinating the robotic arm/hand via remote data communication; 4) remote control of robotic arm/hand manipulation by BH-TMA and BHG-3. The system's physical diagram is shown in Fig. 1. There are 15 DOF in the autonomous control sub-system (6 for the arm and 9 for the hand) and 17 DOF in the teleoperated sub-system (6 for the teleoperated mechanical arm and 11 for the data glove). With so many degrees of freedom, an effective approach is to decompose the search space into lower-dimensional subsets that can be explored using heuristic search techniques. Even then, well-chosen sensing and reconstruction strategies are essential to reduce the geometric complexity of the planning problem.

0-7803-5180-0-5/99 $10.00 © 1999 IEEE
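As a rough illustration of this decomposition idea (the cost functions, joint ranges, and sampling search below are our own illustrative stand-ins, not the system's actual planner), the 6-DOF arm subspace and the 9-DOF hand subspace can be searched separately and the results composed, instead of searching the joint 15-dimensional space directly:

```python
import random

random.seed(0)  # deterministic for illustration

def heuristic_search(dim, cost, samples=200):
    """Pick the lowest-cost joint vector among random samples of ONE subspace."""
    best, best_cost = None, float("inf")
    for _ in range(samples):
        q = [random.uniform(-1.0, 1.0) for _ in range(dim)]
        c = cost(q)
        if c < best_cost:
            best, best_cost = q, c
    return best

# Hypothetical costs: squared distance of the arm configuration to a goal,
# and a simple "relaxed grasp" score for the hand.
arm_goal = [0.5, -0.2, 0.3, 0.0, 0.1, 0.0]
arm_cost = lambda q: sum((a - b) ** 2 for a, b in zip(q, arm_goal))
hand_cost = lambda q: sum(abs(a) for a in q)

# Decompose: search the 6-DOF arm subspace, then the 9-DOF hand subspace.
arm_q = heuristic_search(6, arm_cost)
hand_q = heuristic_search(9, hand_cost)
plan = arm_q + hand_q
print(len(plan))  # → 15
```

Searching each subspace separately keeps the sample cost linear in the number of subspaces, whereas sampling the joint 15-dimensional space to comparable density grows exponentially with dimension.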
Fig. 1 System physical diagram

3. Implementing Techniques

The hand/arm teleoperated system can control, make decisions, and execute based on multi-sensor fusion information. It can adapt to environmental change, track and locate objects, modify the planning module, remotely manipulate workpieces, receive simulation data, and harmoniously perform dexterous assembly tasks. In this system, share control is mainly composed of three modules: autonomous control, teleoperation, and simulation. In this article, our share control emphasizes three different contexts:

• Sensor data share
• Multi-agent-based share
• Man/machine interactive share

Sensor data share is a basic share mode and the foundation of low-level local autonomous control. Multi-agent-based share is a behavior-based and task-oriented share mode; it is important for the hand/arm to perform dexterous and precise tasks. Man/machine interactive share is a system-level share mode that coordinates high-level planning with low-level autonomous control and guarantees that various manipulations can be fulfilled remotely under safe conditions.

3.1 Autonomous control

The autonomous control module fuses the environmental data obtained from the sensors, then compares and filters them against an optimized model. According to the planning result of the high-level simulation system, it determines action and task sequences for path, trajectory, and grasp optimization, and in turn drives the low-level controller and mechanical base to perform the respective task. The autonomous control architecture implementing these functions is shown in Fig. 2.

Fig. 2 Autonomous control architecture (levels: motion and task planning; module and protocol share control; a priori image processing)

3.2 Sensor data share

The integrated system consists of many kinds of sensors. Two CCD cameras provide the location parameters to control the motion of the arm/hand and to calibrate the environment between the robot and the worktable in the workspace.
The 6-D wrist force/torque sensor mounted on the end-effector of the PUMA560 and the 3-D tactile sensors on the fingertips of the BH-3 dexterous hand are used to measure forces and to perform compliant control. Nine angular potentiometers in the finger joints provide the grasp-space information. Three optical fiber sensors are used to avoid obstacles and collisions. To handle so many sensor data, we propose the low-level sensor data share architecture shown in Fig. 3.
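As a minimal sketch of how such heterogeneous readings can be pooled (the sensor names, values, and thresholds below are illustrative stand-ins, not the system's real driver interface), each action cycle collects every reading into one shared structure that all control modules consult:

```python
# Illustrative safety limits (not the paper's actual values)
FORCE_LIMIT_N = 10.0             # per-finger contact-force threshold
JOINT_RANGE_DEG = (-20.0, 90.0)  # normal joint-angle range

def read_sensors():
    """Stand-in for the real drivers: wrist F/T, fingertip force,
    fiber-optic distance, and the nine joint potentiometers."""
    return {
        "wrist_ft": [0.0] * 6,                 # Fx, Fy, Fz, Mx, My, Mz
        "finger_force": [1.2, 0.8, 1.0],       # three fingertips, N
        "fiber_distance": [0.05, 0.06, 0.05],  # m from fingertip to workpiece
        "joint_angles": [10.0] * 9,            # deg
    }

def safety_check(shared):
    """Because the data are shared, every module applies the same guards."""
    if any(f > FORCE_LIMIT_N for f in shared["finger_force"]):
        return "stop: fingertip force over threshold"
    lo, hi = JOINT_RANGE_DEG
    if any(not (lo <= a <= hi) for a in shared["joint_angles"]):
        return "stop: joint angle out of range"
    return "ok"

shared = read_sensors()      # one action cycle's snapshot
status = safety_check(shared)
print(status)                # → ok
```

The point of the shared structure is that the fusion module, the planner, and the safety guards all read one consistent snapshot per cycle instead of polling each physical sensor independently.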
Fig. 3 Sensor data share architecture

3.3 Multi-agent-based share

As the figure above shows, there are many sensors and much control hardware in this system. Sharing so many external information resources well, and making full use of them, is the key to implementing manipulation tasks properly. Task-oriented sensor fusion and sharing is the chief strategy. The fusion module in the manipulating process can be expressed by the following function:

    P = P(s_i)  when s_i satisfies (condition s_i),  i = 1, 2, 3, ...

where P is the logic control parameter based on the multi-sensor data, T is the task decision threshold used to switch between different sub-task operations, and s_i represents the sensor status that assists fusion and decision. The model can be illustrated simply as follows: when s1 is within its relevant operating range during the work process, the fused data P are mainly acquired from it, and the operation enters the corresponding task section. When s1 is outside its relevant operating range, the fused data are mainly acquired from s2, s3, or other sensors, and the control system switches to a different task section. The priority level of each sensor is defined by the task, and the sensors are brought into action under different sensor conditions. A modular program with level control performs sensor data share, data communication, and low-level robot cooperation. In the teleoperated system, we adopt a multi-level sensor integration and data fusion module; the functions of the different sensors are:

Distance sensors: Using the three fiber sensors, we obtain the distance from the fingertips of the dexterous hand to the workpiece.
This is necessary for performing the operating tasks accurately; it is also the key to avoiding unexpected collisions between the hand and the object being operated on.

Force/torque sensor: The 6-D wrist force/torque sensor provides force and torque values to perform precise axle-hole assembly and workpiece access, and ensures experimental safety through compliant control.

Fingertip force sensors: The fingertip force sensors measure the contact force between finger and object during operation, and the forces of the three fingers can be computed. If the value exceeds the threshold in some direction, the sub-system immediately sends a command to stop movement in that direction, and the manipulation does not continue until the force value in every direction is acceptable. By fusing the finger force data, the force acting on the fingers is limited to a proper range.

Angular sensors: During dexterous hand movement, the sub-system reads every joint angle in every action cycle (25 ms) to judge whether the joint angle is in the normal range. Once an angle exceeds the normal limit, the operation is stopped at once.

Visual processing: The vision sensors provide the information used to calibrate the system and to accurately locate the workpiece's position for real-time visual tracking and search.

3.4 Man/machine interactive share

We have also developed a real-time control program using the 6-DOF mechanical arm and the 5-fingered 11-DOF data glove made by us. It is a very useful platform for researching space robotics, teleoperated share control technology, and so on.

Data glove: Using the BHG-3 data glove, we can control the dexterous hand to coordinately perform the teleoperated tasks.

Tele-operating: Using the 6-D BH-TMA, we can control the PUMA560 robot arm to coordinately perform the teleoperated tasks.

3.5 5-Finger 11-Joint data glove

Fig. 4 Data glove full view

The BHG-3 data glove consists of mechanical parts, electrical parts, an A/D data collection board, and simulation software. It suits different types of adult
hands, and detects small movements of the 11 DOF distributed over the five fingers. The mechanical part has 193 components, and its net weight is 300 g. It can detect tiny movements from -20° to 90°, with a resolution up to 0.6°. Through the graphic simulation software, it performs real-time man/machine interactive control. BHG-3 adopts a mechanical linkage to detect joint movement, comprising a 2-DOF spatial 6-bar structure and a 1-DOF planar 4-bar structure. When worn, BHG-3 is fixed to the hand by a leather belt. The detection principle of the 2-DOF mechanism is analyzed in Fig. 5.

Fig. 5 Detection principle of the 2-DOF mechanism

Here α is the joint angle and β is the potentiometer angle. If there is a subtle change Δα in the joint angle, the potentiometer detects a subtle change Δβ. We compute the non-concentric circular movement to obtain the finger joint position. Based on the sine theorem, we work out Δα:

    sin β / sin γ = r / d                                   (1)
    sin γ = (d · sin β) / r                                 (2)

Considering that Δγ is a subtle change:

    sin(γ + Δγ) ≈ sin γ + Δγ · cos γ                        (3)

Solving:

    Δγ = [d · sin(β + Δβ) − r · sin γ] / (r · cos γ)        (4)

From (2) and (4), Δα is the following:

    Δα = Δβ + Δγ = Δβ + [d · sin(β + Δβ) − d · sin β] / √(r² − d² · sin²β)   (5)

Because the function between Δα and Δβ is stable, it is very convenient and quick to compute the joint change. Meanwhile, the data glove structure is simple and well suited to teleoperation by an operator.

3.6 Visual servo calibration

We adopt vision sensors in the hand/arm teleoperated system to resolve uncertainty, calibrate the workpiece position and posture, and identify and locate objects in the external environment. Meanwhile, vision improves the system's autonomous ability and supports dexterous, accurate manipulation. The system has two CCD cameras: one is mounted above the workspace for global sensing, and the other is on the back of the dexterous hand for local sensing. The images collected from the two CCDs are transmitted to the high-level PC, where they are pre-processed, computed, and analyzed by the vision algorithms.
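Looking back at the linkage model of Section 3.5: equation (5) maps a potentiometer increment Δβ to the joint increment Δα. A quick numeric sketch (the values of r, d, and β here are illustrative, not the BHG-3's real link dimensions):

```python
import math

# Illustrative linkage parameters (not the BHG-3's real dimensions)
r = 30.0  # mm, potentiometer lever radius
d = 20.0  # mm, offset between the two rotation centres

def delta_alpha(beta_deg, dbeta_deg):
    """Joint increment from a potentiometer increment, per Eq. (5):
    da = db + [d*sin(b+db) - d*sin(b)] / sqrt(r^2 - d^2*sin(b)^2)."""
    b = math.radians(beta_deg)
    db = math.radians(dbeta_deg)
    da = db + d * (math.sin(b + db) - math.sin(b)) / math.sqrt(r**2 - d**2 * math.sin(b)**2)
    return math.degrees(da)

print(round(delta_alpha(30.0, 1.0), 3))  # → 1.609
```

Since Δα depends on Δβ through a closed-form, well-behaved expression, the glove controller can convert potentiometer readings to joint angles at every sampling cycle without iteration.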
In a visual servo system, accurate location requires building the image relation between the 2-D plane and 3-D space to define the target's 3-D position. The usual method is to calibrate first, obtaining the intrinsic parameters (focal distance, scale factor, distortion coefficients, etc.) and extrinsic parameters (direction and position in the world coordinate frame, etc.) of the vision sensor, and then compute the precise target. Considering that such complex calibration and location algorithms are limited to special environments, we propose a new calibration and location method aimed at our hand/arm integrated teleoperated system. We use a self-designed 3-D two-layer template with markers to simply resolve the vision sensor calibration and correct image distortion. From two images grabbed while the robot moves, we model the two projection-line functions through the target point and work out their point of intersection to locate the real position of the target point, that is, the target point's coordinates. Because the projection line through the target point intersects the template, we can get its coordinates in accordance with the projection relation. There are two stages in the whole locating process: calibration and location. The calibration principle diagram is given in Fig. 6.

Fig. 6 Calibration principle

Here A is the image plane; B is the first
position of the template; C is the moved position of the template; OXYZ is the image coordinate frame; Ouvw is the template coordinate frame; d is the sliding distance of the template; P is the target point; P′ is its image point; and P1 and P2 are the intersection points between the projection line and the template. Suppose that a marker point (x_i, y_i) in the template has the different image coordinates (X_1i, Y_1i) and (X_2i, Y_2i), i = 1, 2, ..., m (m is the number of marker points):

    x_i = f_1x(X_1i, Y_1i),  y_i = f_1y(X_1i, Y_1i)
    x_i = f_2x(X_2i, Y_2i),  y_i = f_2y(X_2i, Y_2i)         (6)

In terms of (6), we can calibrate the visual system's mapping functions (f_x, f_y). According to the projection relation preserved between images and template, we can get the intersection point between the projection line and the virtual template (i.e., define the projection line through the target point). In turn we can define the two projection lines in the two images, and finally find their intersection point to complete a location.

4. Experiments

4.1 Autonomous operation

A) Push the button

In the limited operating environment, the system can autonomously use local vision to track and locate the button on the worktable. At the same time, the arm and hand approach the button in a human-like posture to push it. Once the force threshold monitored in real time by the force sensor is exceeded, the system stops immediately, as shown in Fig. 7.

Fig. 7 Push button

B) Twist the bulb and the valve

Guided by local vision, the system can accurately locate the bulb and the valve on the operating table and control the dexterous hand to take hold of them. After grasping the bulb or the valve, the robot and the BH-3 hand can harmoniously perform a series of manipulations, such as twisting, loosing, twisting again, loosing again, and so on, as shown in Fig. 8.

Fig. 8 Twist the bulb

4.2 Plug-in-hole assembly

To perform the plug-in-hole task, a degenerated grasp method is employed to ensure grasping reliability, concentricity, and strength.
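The stop-on-threshold guard used throughout these experiments can be sketched as a simple monitoring loop (the robot and sensor interfaces below are hypothetical, and the threshold value is illustrative, not the system's actual setting):

```python
INSERT_FORCE_LIMIT_N = 15.0  # illustrative threshold, not the system's value

def guarded_insert(step_mm, read_wrist_force, move_down):
    """Step the workpiece into the hole until the contact force exceeds the limit."""
    depth = 0.0
    while depth < 30.0:                 # target insertion depth, mm
        f = read_wrist_force()          # axial component of the 6-D wrist sensor
        if abs(f) > INSERT_FORCE_LIMIT_N:
            return depth, "stopped: force threshold exceeded"
        move_down(step_mm)
        depth += step_mm
    return depth, "inserted"

# Simulated run: the contact force jumps when the peg jams at 12 mm.
state = {"z": 0.0}
def fake_force():  return 0.0 if state["z"] < 12.0 else 20.0
def fake_move(dz): state["z"] += dz

depth, result = guarded_insert(1.0, fake_force, fake_move)
print(depth, result)  # → 12.0 stopped: force threshold exceeded
```

Checking the force before every motion increment is what lets the system abort within one step of contact, protecting both the arm and the hand.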
To ensure plug-in precision, vision tracking is used to locate the workpiece on the operation table. To ensure system reliability and safety, the 6-D wrist force sensor is used to detect the tactile force threshold in real time during hole insertion.

Fig. 9 Plug in the hole

For example, in Fig. 9 the system controls the arm and hand to move above the workpiece position and immediately uses local vision to accurately locate the workpiece on the operation table. After that, the BH-3 hand firmly grasps the workpiece with the degenerated method, pulls the workpiece up from the hole in the operation table, and then carries it to the next work area. The system autonomously switches the local vision to search for the hole and accurately locate its position. Guided by the 6-D wrist force sensor, the system inserts the workpiece slowly. Once the insertion force threshold is exceeded, it stops the operation autonomously to protect the arm and hand.

4.3 Grab the cup and pour a cup of tea

The pouring-water experiment is shown in Fig. 10.
Fig. 10 Pour a cup of tea

We designed a set of human-like tasks to demonstrate dexterous manipulation. The system grabs a cup full of tea from the worktable; guided by vision, it moves to another position and slowly and accurately pours the water into an empty cup. Moreover, the system can also use the optical-fiber distance sensors to protect against collision in case of damage during the move.

4.4 Remote control with teleoperated mechanical arm and data glove

We have also developed an approach to control the PUMA560 arm using BH-TMA and the BH-3 dexterous hand using BHG-3 over a long distance, to perform the relevant obstacle-avoidance and cup-grabbing operations. It is useful for researching space robotics and teleoperated share control techniques. Our research work is shown in Fig. 11 and Fig. 12.

Fig. 11 Control arm/hand to grab a cup by BH-TMA and BH-DG

Fig. 12 Real obstacle-avoidance operation

5. Conclusion

This paper proposes a multi-sensor integrated share model for hand/arm teleoperation. We performed local autonomy, teleoperated control, and simulation planning experiments. Through our experimental research, the multi-agent-based share and the man/machine interactive share are our main contributions. Our experimental verifications show that the methods used in the operating tasks, including grabbing a cup and pouring a cup of tea, plug-in-hole assembly, twisting the bulb (valve), and teleoperated control, are highly efficient and simple. In our next research, we will continue to improve the hardware environment to increase the integrated system's real-time performance and accuracy, and we will add a set of VR equipment for teleoperated dexterity. We will further develop web-based teleoperation theory, large-delay control, graphic simulation, intelligent local autonomy, force compliant control, and so on.

References

[1] DLR - Institute for Robotics and System Dynamics, Prof. Dr.-Ing. G. Hirzinger.
"Space Robot Activities - A Survey". 1987-1992 Scientific Report.

[2] K. Machida, Y. Toda, Y. Murase, and S. Komada. "Precise Space Telerobotic System Using 3-Finger Multisensory Hand". Proc. of the IEEE Int. Conf. on Robotics and Automation, 1995.

[3] M. Nohmi, D. N. Nenchev, and M. Uchiyama. "Momentum Control of a Tethered Space Robot Through Tether Tension Control". Proc. of the IEEE Int. Conf. on Robotics and Automation, 1998, pp. 920-925.

[4] Rafael Kelly, Paul Shirkey, and Mark W. Spong. "Fixed-Camera Visual Servo Control for Planar Robots". Proc. of the IEEE Int. Conf. on Robotics and Automation, 1996, pp. 2643-2649.

[5] Andersen U. "Design of a Data Glove Input System". M.Sc. Thesis, University of Salford, 1994.

[6] Durrant-Whyte H. F. "Sensor Models and Multisensor Integration". Int. J. Robot. Res., 1988, 7(6): 97-113.

[7] Bogler P. L. "Shafer-Dempster Reasoning with Applications to Multisensor Target Identification Systems". IEEE Trans. Syst. Man Cybern., 1987, SMC-17(6): 968-977.

[8] Green F. C. A., et al. "Multisensor robot assembly station". Robotics, 1986, 2: 205-214.