Applications of GRID in Clinical Neurophysiology
and Electrical Impedance Tomography of brain
Fritschy, J. a, Horesh, L. a, Bayford, R. a,b, Holder, D. a
a Department of Medical Physics & Bioengineering, University College London
b School of Health and Social Sciences, Middlesex University
Abstract. The computational requirements in Neurophysiology are increasing
with the development of new analysis methods. The resources the GRID has to
offer are ideally suited for this complex processing. A practical implementation of
the GRID, Condor, has been assessed using a local cluster of 920 PCs. The
reduction in processing time was assessed in spike recognition of the
Electroencephalogram (EEG) in epilepsy using wavelets and the computationally
demanding task of non-linear image reconstruction with Electrical Impedance
Tomography (EIT). Processing times were decreased by factors of 25 and 40
respectively. This represents a substantial improvement in processing time, but is
still suboptimal due to factors such as shared access to resources and the lack of
checkpoints, which meant that interrupted jobs had to be restarted. Future work will be to use
these methods in non-linear EIT image reconstruction of brain function and in
methods for automated EEG analysis, if possible with a further optimized GRID procedure.
Keywords: Electrical Impedance Tomography, GRID, Condor and Matlab
1. Introduction
Clinical Neurophysiology is the clinical discipline of investigation of the nervous
system by electronic methods. The principal methods are EEG
(Electroencephalography) - the recording of voltages on the scalp produced by the
brain, EMG (Electromyography) - the recording of voltages produced by muscles or
nerves, usually in the limbs, with skin or intramuscular electrodes and Evoked
Potentials - the recording of voltages produced on the scalp by evoked physiological
stimulation. The clinical practice of these methods has remained largely unchanged for
half a century; although PCs are now widely used to record digitized data, they do little
processing and function as data recorders. However, this is likely to change in the near
future. New methods of data analysis, for example chaos theory analysis to predict the
likelihood of epileptic seizures, require substantial computing resources. In our
research group at University College London (UCL), we are working on new methods
such as these that will enable the power of modern computers to improve the
information that can be obtained from these methods. Many of them can be performed
on a local PC, but some may take hours, days or even weeks on a well-specified PC
(512 MB of memory, Pentium 4, 3 GHz), while the data is required more
urgently for diagnosis. We have therefore been investigating the use of the GRID to
perform this processing. The concept is to develop user-friendly software, which could
be used in an acute clinical setting, such as a Casualty department or investigative
outpatient clinic. The doctor or technician would acquire the data and transparently
send it for processing, which would be performed in real time or at least over a few
minutes, at remote resources over the GRID.
One potential application lies in the automated analysis of EEG records for
epileptic activity. The particular application is in investigation of subjects with severe
epilepsy with a view to performing curative surgery, in which the abnormal part of the
brain, which causes epilepsy, is removed. The position of the abnormal part of the
brain must be ascertained first. One of the methods is to admit the subject to hospital
for several days onto a “telemetry” ward where EEG and video are recorded
continuously. After several epileptic seizures are captured, the traces are analysed to
indicate the likely region of origin of the epileptic activity. It is possible for epileptic
activity to occur in a subject without the occurrence of a physical seizure. All the
several days of EEG are therefore analyzed in case there are some attacks or other
activity not noticed by the ward staff or patient. This is usually done manually by
technicians and is very time consuming. Software methods have therefore been
developed to automate this, so that only suspicious parts of the days of recordings are
inspected. As the majority of epileptic activity seen in an EEG comprises a negative
spike-shaped voltage deflection lasting less than 70 msec, this is termed “automated
spike detection”.
Another application is Electrical Impedance Tomography (EIT), a recently
developed medical imaging method in which tomographic “slice”
images are rapidly produced using electrodes placed around the body. The equipment
comprises a box of electronics about the size of a video recorder, attached to a PC. It is
safe, portable and inexpensive and has unique potential for recording brain images
continuously at the bedside or acutely in casualty departments where it is not
practicable to use conventional large brain scanners such as MRI or X-ray CT. In EIT,
multiple measurements of electrical impedance are made using four electrodes for
each individual measurement.
Typically, several hundred samples are collected in a fraction of a second,
using different combinations of 16 to 64 ECG type electrodes applied to the body part
of interest. These are transformed into a tomographic image using a reconstruction
algorithm. In the early development of EIT, this could be achieved by a relatively
simple algorithm, which employed back-projection or a matrix sensitivity
method in which a single matrix multiplication was performed. These were
not computationally demanding because they employed the assumption that voltages
recorded on the exterior were linearly related to changes in impedance properties in the
subject. More recently, non-linear algorithms have been developed, which reflect the
true non-linear relation between external voltages and internal impedance properties.
In general, these require multiple iterations for solution. Our group has recently
implemented such a non-linear algorithm, which has the additional demand that a
fine finite element mesh of the head is employed. This has become necessary because a
potentially important application lies in using EIT to image urgently on arrival in
casualty departments in acute stroke subjects. It has recently been shown that brain
damage can be minimized by the use of thrombolytic (clot-dissolving) drugs, but these
must be administered within 3 hours, and neuroimaging must be performed first, in order
to exclude a brain hemorrhage, as the thrombolytic drugs cause these to extend. Linear
reconstruction algorithms have been shown to work well in images where there is a
change over time so that data can be normalized, but this is not the case for stroke,
where no prior and posterior data sequences are available. Producing images from data
not normalized over time is much more demanding, and we have shown that a non-linear
method is needed. Whereas previous linear solutions with simple head models
could be reconstructed on a PC in a few seconds, the current non-linear method
would take about 100 hours on a well-specified PC for each image. In some
applications for EIT, such as for imaging epileptic seizures, many images need to be
acquired, so this poses an unacceptable delay.
1.2 Purpose and experimental design.
The purpose of this paper is to assess whether a GRID resource, the
Condor platform, which distributes computational tasks over heterogeneous
computing resources, presents a significant improvement in this timing bottleneck. For
this work, a cluster of 920 PCs available in different departments at UCL was used.
For this study, we present two examples: wavelet analysis of epileptic activity in the
EEG and a non-linear EIT reconstruction algorithm. Both were written in Matlab,
proprietary software; this posed a practical problem, as specific Matlab libraries had to
be installed on the remote PC cluster before use.
Apart from the quantitative analysis, we evaluated the advantages and
disadvantages of all the practicalities related to this work.
1.3 Explanation of Condor GRID middleware.
1.3.1 Physical UCL-Condor architecture
Condor architecture can have many different configurations but there are three
essential roles: the user’s machine, the central submitter and the executing nodes. The
submitter machine is a node in the cluster from which the input data is sent to the
nodes. The executing nodes are computers on the cluster that are configured for
running jobs. Once the files are transferred from the user’s machine to the central
submitter, the execution is launched and the Condor system sends the jobs to
different nodes on the cluster. Once the jobs are finished, the central submitter retrieves
them and finally the results are transferred to the user’s machine.
1.3.2 Logical Condor architecture
The three upper logical components (figure 1), the Application, Application Agent and
Customer Agent, are on the client side, in the central submitter, and the four lower
components (Owner Agent, Remote Execution Agent, Local Resource Agent and
Resource) are on the resource side. In the middle, between the two sides, the
Matchmaker marries up the components on either side. On the resource-side, the
information about available resources is constantly updated in the four corresponding
components, which have different levels of abstraction. At the end of this chain of
information is the Owner Agent, which contains the knowledge regarding the amount
of available nodes, their technical characteristics such as memory, operating systems
and processor speed. It regularly transmits this information to the Matchmaker. (action
1 in figure 1). Once the client sends tasks to the pool, the Customer Agent sends
appropriate information such as number of tasks, amount of memory and operating
system needed to the Matchmaker (action 1 in figure 1). The Matchmaker then makes a
decision as to which tasks will be executed, selects the resources, and allows direct
communication between the two (action 2). Afterwards, a chain reaction is started, which
allows the communication between peers on different sides of the Matchmaker. At the
end (action 6) the user of the application contacts the resource.
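The matchmaking step described above can be sketched with a toy example. This is only an illustration in Python: Condor's real ClassAd matchmaking language is far richer, and the dictionary-based adverts and field names used here are purely hypothetical.

```python
# Toy sketch of matchmaking: resource-side adverts are checked against the
# job's requirements, and matching nodes are returned for direct contact.

def matchmake(job, resources):
    """Return the names of resources whose advert satisfies the job's needs."""
    return [
        r["name"] for r in resources
        if r["memory_mb"] >= job["memory_mb"] and r["os"] == job["os"]
    ]

# Owner Agents keep adverts like these up to date (action 1 in figure 1).
resources = [
    {"name": "node01", "memory_mb": 256, "os": "WINDOWS2000"},
    {"name": "node02", "memory_mb": 128, "os": "WINDOWS2000"},
    {"name": "node03", "memory_mb": 512, "os": "LINUX"},
]

# The Customer Agent sends the job's requirements (also action 1).
job = {"memory_mb": 256, "os": "WINDOWS2000"}

print(matchmake(job, resources))  # only node01 meets both constraints
```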
1.3.3 Relevant concepts about Condor related to our work
In this study, Condor performance was assessed against a well-specified PC, in
two different conditions: Full execution and satisfactory execution. Full execution is
defined as when all the jobs sent to the Condor pool are completed. Satisfactory
execution is when 70% of the jobs are completed. Since interpolation techniques can be
applied to the algorithm that assembles all the finalized tasks, successful EIT
reconstructions can be achieved with 70% of the information.
Figure 1. Key logical objects in Condor architecture
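The interpolation applied when assembling a satisfactory (70%) execution can be sketched as follows. This is a minimal pure-Python illustration, assuming each missing task result sits between completed neighbours; it is not the authors' actual assembly code.

```python
# Sketch: tasks that never returned (None) are filled in by linear
# interpolation from completed neighbours, so an incomplete execution
# can still yield a usable reconstruction.

def fill_missing(values):
    """Linearly interpolate None entries between completed neighbours."""
    out = list(values)
    known = [i for i, v in enumerate(out) if v is not None]
    for i, v in enumerate(out):
        if v is None:
            left = max(k for k in known if k < i)
            right = min(k for k in known if k > i)
            frac = (i - left) / (right - left)
            out[i] = out[left] + frac * (out[right] - out[left])
    return out

print(fill_missing([1.0, None, 3.0, None, None, 6.0]))  # fills the two gaps
```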
2.1 UCL-Condor pool specifications
The UCL-Condor pool has 920 nodes (Pentium III, 1 GHz CPU, 256 MB RAM and
750 MB of free disk space; operating system: Windows 2000). These machines are
distributed around the UCL campus and are used by the students.
The tests presented in this work included executable files coded in Matlab.
Unfortunately, none of the nodes in the pool had Matlab installed. To be able to run a
Matlab executable file on a machine that did not have Matlab installed, it was necessary
that some DLL libraries were installed in a local directory. Matlab’s executable file
(mglinstaller.exe) deploys these libraries into the local machine, and the directory in
which these were deployed had to be added to the PATH variable of the machine.
2.3 First test
In this test, a set of 100 tasks, each of which normally took 74 minutes on a
well-specified PC, such as an Intel Pentium 4, 3 GHz, 512 MB RAM, was sent to the
Condor pool. Each was a Matlab® algorithm, which performed wavelet analysis on
EEG data for epileptic spike detection. The algorithm implemented a Daubechies 4
wavelet analysis with a window of 1024 samples. This was run on a single channel of
EEG, lasting 22 min and sampled at 200 Hz, acquired from a patient with epilepsy. To run this
algorithm on the executing nodes, a batch file executed five steps: 1) deploy the
necessary Matlab® libraries in the node (mglinstaller.exe), 2) set up the PATH
variable, 3) run the desired executable file, 4) retrieve the processed data and, finally, 5)
clean up and restore the nodes to their initial condition.
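The detail-coefficient extraction underlying this spike detection can be sketched as follows: a minimal pure-Python single-level Daubechies-4 transform with periodic boundaries, not the authors' Matlab implementation.

```python
import math

# Daubechies-4 scaling (low-pass) filter and its quadrature-mirror
# wavelet (high-pass) filter.
S3 = math.sqrt(3)
H = [(1 + S3) / (4 * math.sqrt(2)), (3 + S3) / (4 * math.sqrt(2)),
     (3 - S3) / (4 * math.sqrt(2)), (1 - S3) / (4 * math.sqrt(2))]
G = [H[3], -H[2], H[1], -H[0]]

def dwt_level(x):
    """One level of the D4 transform: (approximation, detail) coefficients."""
    n = len(x)
    approx = [sum(H[k] * x[(2 * i + k) % n] for k in range(4)) for i in range(n // 2)]
    detail = [sum(G[k] * x[(2 * i + k) % n] for k in range(4)) for i in range(n // 2)]
    return approx, detail

# A sharp deflection in an otherwise smooth trace produces large detail
# coefficients -- the basis of automated spike detection.
signal = [0.0] * 64
signal[32] = -5.0                       # a crude negative "spike"
a, d = dwt_level(signal)
print(max(abs(c) for c in d) > 1.0)     # the spike dominates the details
```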
2.4 Second test
This was a reconstruction of multi-frequency EIT tank data using a non-linear
absolute imaging reconstruction method. A set of 300 processes, each normally taking
2.5 hours on the above well-specified PC, was sent to the Condor pool. The
reconstruction algorithm is a regularized search direction non-linear Polak-Ribière
Conjugate Gradients solution for production of EIT images of the head. It
employed a forward solution in which the head is modeled as a finite element mesh of
25000 elements with three layers, representing the scalp, skull and brain.
The conductivity values in this mesh were iteratively calculated, using a non-linear
conjugate gradients method, in order to minimize the cost function with respect to the
boundary voltages measured with 32 electrodes in a saline-filled tank. 300
electrode combinations were used and computation was terminated at 25 iterations.
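The Polak-Ribière update at the heart of such a solver can be sketched on a toy cost function. This is a minimal illustration with a fixed step size and the standard PR+ safeguard; the cost function and parameters are hypothetical, not the regularized EIT solver itself.

```python
# Toy two-variable cost: f(x) = (x0 - 3)^2 + 10*(x1 + 1)^2, minimum at (3, -1).
def grad(x):
    return [2 * (x[0] - 3), 20 * (x[1] + 1)]

def pr_cg(x, iters=25, step=0.04):
    """Polak-Ribiere non-linear conjugate gradients with a fixed step size."""
    g = grad(x)
    d = [-gi for gi in g]              # initial direction: steepest descent
    for _ in range(iters):
        x = [xi + step * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # Polak-Ribiere coefficient: beta = g_new . (g_new - g) / (g . g)
        beta = sum(gn * (gn - go) for gn, go in zip(g_new, g)) / sum(go * go for go in g)
        beta = max(beta, 0.0)          # PR+ safeguard against negative beta
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x = pr_cg([0.0, 0.0])
print(x)  # approaches the minimum at (3, -1)
```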
In the first test, EEG analysis, while the execution of the 100 files (74 minutes
each) would have required 7400 minutes (5.1 days) on a well-specified PC, Condor
finished a full execution in 297 minutes. This was therefore 4 times the execution time
for a single file, but still 25 times faster than a serial execution in a well-specified PC.
In the second test, EIT image reconstruction, the 300 tasks would have required
approximately 750 hours (31.2 days) on a well-specified PC, whereas Condor completed
a full execution in 1128 minutes (18.8 hours), which is 39.9 times faster. A
satisfactory execution was completed in 346 minutes, which is 130 times faster than on
a well-specified PC.
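The reported speed-ups follow directly from the figures above:

```python
# Serial processing times, in minutes, on a single well-specified PC.
eeg_serial = 100 * 74          # 100 EEG tasks at 74 min each = 7400 min
eit_serial = 300 * 2.5 * 60    # 300 EIT tasks at 2.5 h each = 45000 min

print(round(eeg_serial / 297))      # 25x: full EEG execution in 297 min
print(round(eit_serial / 1128, 1))  # 39.9x: full EIT execution in 1128 min
print(round(eit_serial / 346))      # 130x: satisfactory EIT execution in 346 min
```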
Overall, use of the Condor pool substantially reduced the time taken to process
these applications. Although the speed-up was between 25 and 130 times, Condor’s
performance was slower than expected.
The limitation in speed was probably mainly due to: a) Load of the pool.
Since the pool was used simultaneously by other jobs, in practice this clearly diluted
processing time available compared to individual processing of one job by one PC. b)
Absence of checkpoints. The Condor procedure lacked checkpoints, so that if processor
sharing interrupted a job before completion, it had to be restarted. This occurred in
about 5% of jobs. Although this figure is low, it has a disproportionately large effect on
total processing time because of parallel execution. c) Job allocation. Condor normally
randomly allocates jobs irrespective of size, so that demanding ones may be allocated
to a loaded machine.
Implementation proved surprisingly time-consuming. There were four technical
issues: a) We had to obtain an account on the central submitter and a security certificate.
b) We were operating under Matlab, which was not installed on any available cluster.
We therefore had to obtain permission and establish a procedure to deploy Matlab
libraries dynamically and delete them once the jobs were finished, c) Usually, the
prototyping stage is carried out inside the Matlab environment. Condor required
independent executable files, so time had to be spent in producing these with all
appropriate libraries and variables inserted. d) Specifying the details of job execution for
the procedures used here was not straightforward, as some continual updating of output
files was needed for the iterative procedures in the EIT reconstruction. Although the
Condor Job Description Language is easy to implement for simple strategies, this
adaptation proved time consuming. The support for Condor issues, through the
condor-users list, was excellent, and the documentation is complete and detailed.
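As an illustration of the simple strategies mentioned above, a Condor submit description file for the EIT test might look as follows; all file names and values here are hypothetical, not the project's actual submission files.

```
# Hypothetical submit description file: names and values are illustrative
universe                = vanilla
executable              = run_eit_task.bat
arguments               = $(Process)
transfer_input_files    = mglinstaller.exe, eit_task.exe
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
# match only Windows 2000 nodes with at least 256 MB of memory
requirements            = (OpSys == "WINNT50") && (Memory >= 256)
output                  = eit_$(Process).out
log                     = eit_$(Process).log
queue 300
```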
A Matlab wrapper has recently become available, which should allow us to submit
Condor jobs from inside the Matlab environment without paying attention to the
pre-submission tasks. Since the pre-submission tasks, such as compilation of the
Matlab file, its transfer to the central submitter and preparation of the special
submission file, are time consuming and prone to errors, it is likely that this will confer
a considerable improvement in overall turnaround time. Another planned technical improvement
is that Condor permits the user to implement special criteria in job submission such as
looking for nodes with specific memory, processor speed and low probability of being
diverted during execution. The tasks employed in this initial study were relatively
simple, but this important improvement in processing power will allow us to test
more complicated meshes, which had previously been replaced by simpler ones because
of the processing-time bottleneck.
We plan to use non-linear methods, both for reconstruction of EIT images of
brain function in stroke and epilepsy, and for automated analysis of the EEG in
epilepsy and other neurological conditions. We anticipate that use of the GRID will
greatly enhance this work, as it will give us the opportunity to test more sophisticated
and powerful analysis algorithms.
The authors wish to thank Clovis Chapman, the UCL-Condor pool’s administrator,
for providing support with the submission of the Matlab® code over the pool, and Dr.
Andrea Romsauerova for the EIT data.
This work was supported by the BBSRC under an e-science grant.
 Litt B, Echauz J.- Prediction of epileptic seizures.- Lancet Neurol. 2002 May;1(1):22-30.
 Qu H. et al.- A patient-specific algorithm for the detection of seizure onset in long-term EEG monitoring:
possible use as a warning device.- IEEE Trans Biomed Eng. 1997 Feb;44(2):115-22.
 Holder, D.- Electrical Impedance Tomography: Methods, history and applications.- Institute of Physics
Publishing: London 2004. ISBN: 0750309520.
 Barber D., Brown B., Freeston I.- Imaging Spatial Distributions of Resistivity Using Applied Potential
Tomography(1983).- Electron. Lett. 19: 933 – 935.
 Barber D. - Image Reconstruction in Applied Potential Tomography, Electrical Impedance Tomography
(1990). - Internal Report, Department of Medical Physics and Clinical Engineering, University of
 Geselowitz D. - An Application of Electrocardiographic Lead Theory to Impedance Plethysmography
(1971). - IEEE Trans. Biomed. Eng. 18: 38 – 41.
 Kotre C. - A Sensitivity Coefficient Method for the reconstruction of Electrical Impedance Tomograms
(1989). - Physiological Measurements 10: 275 – 281.
 Kotre C. - EIT Image Reconstruction Using Sensitivity Weighted Filtered Back projection (1994). -
Physiological Measurements 15 (Suppl. 2A): 125 - 136.
 Horesh et al.- Beyond the linear Domain. The way forward in MFEIT Image Reconstruction of the
Human Head. - Proceedings ICEBI XII. Gdansk 2004. Page 499-502.
 Yerworth R. et al.- Robustness of linear and non-linear reconstruction algorithms for brain EIT.
Non-linear – is it worth the effort? - Proceedings ICEBI XII. Gdansk 2004. Page 683-686.
 Tidswell T. - Three-Dimensional Electrical Impedance Tomography of Human Brain Activity.-
NeuroImage. Volume13, issue 2, February 2001.
 Foster I. - “What is the GRID?”- GridToday. July 22, 2002: Vol 1. No. 6.
 Foster I., Kesselman C. et al - “The Anatomy of the Grid“- Globus project. Technical papers.
 Foster I. et al -“The Physiology of the Grid”. - Globus Project. Technical papers. http://www.globus.org/
 Condor Project, http://www.cs.wisc.edu/condor/
 Matlab® site. www.mathworks.com. “Building Stand-Alone Applications”
 Aboufadel E., Schlicker S - Discovering wavelets.- ISBN: 0471331937.
 Burrus S. et al - Introduction to wavelets and wavelet transforms. - ISBN: 0134896009.
 Liston et al. - A multi-shell algorithm to reconstruct EIT images of brain function. - Physiological
Measurements. 23.1 (2002): 105-19.
 Eres M.- “User Deployment of Grid Toolkits to Engineers”. All Hands Meeting, Nottingham, September