Aspect-Oriented: a Candidate for the
Biologically Inspired Programming Paradigm
for Neural Networks and Evolvable Software
Linchuan Wang 1,2
Xuefei Tang 3
Lijia Zhang 4
School of Computer Science and Engineering
University of Electronic Science and Technology of China
Chengdu, China
Abstract
We observe that biological nervous systems handle “non-orthogonal concerns” more effectively than the Object-Oriented paradigm does. This natural phenomenon inspires a programming paradigm for handling “cross-cutting”. We argue that the Aspect-Oriented paradigm is a candidate for such a biologically-inspired programming paradigm. To support this point, the Aspect-Oriented paradigm is used to implement a simple Artificial Neural Network (ANN), and the preliminary experiment shows good results. In addition, we propose a biologically-inspired framework of evolvable software which could be implemented using the Aspect-Oriented paradigm combined with neurocomputing and genetic computing.
Key words: biologically-inspired computing, programming paradigm, aspect-oriented programming, object-oriented, evolvable software
1 Introduction
Neural networks are massively parallel processing systems that require expensive and often unavailable hardware in order to be realized. Instead of hardware implementations, software simulation is widely used in research and industry for its low cost and flexibility. Moreover, an Object-Oriented Programming (OOP) language is the usual choice for implementing artificial neural networks.
1 This project is partly funded by UESTC Comsys Info. Inc. We thank the users of the aosd-discussion mailing list who have given very good comments on this paper.
2 Email: l.c.wang@std.uestc.edu.cn
3 Email: xftang@uestc.edu.cn
4 Email: zhanglijia@std.uestc.edu.cn
However, artificial neural networks implemented with Object-Oriented (OO) technology provide little support for changing the size and topology of the network at runtime [1,2]. Furthermore, the parallelism of artificial neural networks is not friendly to programmers, who must manually transform the parallel neural network into serially executed code, since an OO program is in essence a collection of dialogues among a group of objects. This manual transformation is a complex process and the resulting implementations are often hard to reuse. We argue that these problems in the implementation of artificial neural networks can be solved by using Aspect-Oriented Programming (AOP).
Our research reveals that biological nervous systems handle “non-orthogonal concerns” (cross-cutting) effectively. This natural phenomenon inspires a programming paradigm for neural networks and evolvable software that can handle non-orthogonal concerns as effectively as nervous systems do. As far as we know, this phenomenon has not been discussed before.
Moreover, we believe that the AO paradigm is a candidate for the desired paradigm, since it partly satisfies the considerations for implementing the desired paradigm and bears a natural analogy with nervous systems. To verify this idea, we implement a simple artificial neural network which solves the MONK's problem I using AOP.
Nevertheless, the simulation of neural networks is not the only topic here. The more interesting question in this paper is how to apply artificial neural networks to evolvable software development, though this idea is still very young.
The next section gives a brief review of the separation of concerns principle and AOP. The biologically-inspired programming paradigm is discussed in detail in Section 3. The implementation of an ANN using the AO paradigm is presented in Section 4. We propose a biologically-inspired framework of evolvable software in Section 5. Section 6 concludes.
2 Reviews on the Principle of Separation of Concerns and AOP
One of the approaches to solving complex problems is to divide the problem into smaller, simpler and loosely coupled sub-problems; this is the so-called separation of concerns principle [3]. The principle reminds us that we cannot handle many problems at a time and should deal with them one by one, and that an important concern should be represented intentionally (clearly and declaratively) and should be localized. In this way we gain intelligibility, adaptability, reusability and many other important qualities of software systems, since the degree to which the software requirements are satisfied can be conveniently verified against such an intentional and localized representation. Nevertheless, the separation of concerns principle gives no guidance on how to separate concerns. Many methods have been proposed in the past decades, such as the procedure-oriented and OO methods, to handle orthogonal concerns effectively. Unfortunately, many concerns are non-orthogonal. As a result, there is massive redundancy when we represent all of these concerns intentionally and locally with these methods.
Aspect-Oriented Programming (AOP) [4] provides a mechanism to handle non-orthogonal concerns by modularizing cross-cutting, augmenting the Object-Oriented programming paradigm with Aspect, join point, PointCut and Advice [5]. AOP is based on the idea that computer systems are better programmed by separately specifying the various concerns and properties or areas of interest of a system and some description of their relationships, and then relying on mechanisms in the AOP environment to weave or compose them together into a coherent program [6]. AOP does what OOP cannot do effectively: it clearly and cleanly modularizes functional system code, i.e. source code, by separating concerns into well-localized units, called aspects, to eliminate code tangling.
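As a minimal illustration of this mechanism, in the AspectJ-like notation used later in this paper (the Account class and the Logging aspect below are hypothetical examples introduced here, not part of the systems discussed later), a concern such as logging, which would otherwise be scattered over every business method, can be localized in a single aspect and woven onto the objects it cross-cuts:

class Account {
    public void deposit(float amount)  { /* business logic only */ }
    public void withdraw(float amount) { /* business logic only */ }
}

aspect Logging {
    // join points: every call to a public Account method taking a float
    pointcut banking() : call(public void Account.*(float));

    // advice: the cross-cutting behaviour executed before each such call
    before() : banking() {
        System.out.println("entering " + thisJoinPointStaticPart.getSignature());
    }
}

The business code contains no trace of the logging concern; the weaver composes the two at the join points selected by the pointcut.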
The OO technology became popular because it synthesizes three important factors of software development: the computer platform, the way humans think, and the features of the problems at hand. The basic elements of the OO paradigm are objects and the interactions between them. This paradigm is in harmony with human ways of thinking and with the rules of the natural world, and hence we can map the problem space directly to the design space, and then to the programming space [7]. It should be pointed out that the rules of nature that inspired the OO paradigm are the rules of the mechanistic nature. We can create new software technology, drawing inspiration from the organic nature, that handles non-orthogonal concerns more effectively than OO does [12].
3 The Biologically-inspired Programming Paradigm and its Candidate
3.1 The biologically-inspired paradigm to handle cross-cutting
We observe how nervous systems handle “cross-cutting” in the body, which inspires a programming paradigm to handle cross-cutting in software systems.
3.1.1 How does nature handle “cross-cutting”?
The communications between objects in the Object-Oriented paradigm are very close to the intercellular communications in primitive multicellular animals without a nervous system, such as sponges, in which cells communicate directly via gap junctions and each cell knows its partners exactly. 5 Sponges have no nervous system (only a kind of neuron without synapses). Their responses to stimulation are regional and slow, and their intensity depends on the strength of the stimulation. The information-carrying materials are transmitted by diffusion of gelatinous materials, by dissociating amoebocytes and by contact between fixed cells. Sponges can perform only very simple behaviors with this mechanism.

5 The biological knowledge in this paper refers mainly to [8,9].
Although, from the viewpoint of nature, Object-Oriented technology is quite primitive, we have used it to build systems that are among the most complex artifacts and are often the hardest for humans to understand and maintain. How to lower complexity becomes a prominent problem as the scale of software systems increases, and we therefore need new methods to address it. One contributor to this problem is the non-orthogonal concerns (i.e. cross-cutting) that the OO paradigm does not handle effectively.
Nature handles them effectively. The process of natural evolution did not stop at sponges; more complex animals with nervous systems appeared. The first nervous system is found in coelenterates (such as Hydra). Neurons partially separated from effectors and formed a neural net without a nerve center, in which nerve impulses are broadcast throughout the body to stimulate all of the muscle cells and produce simple behaviors. Since the neurons conduct impulses without direction, such a neural net is called a diffuse nervous system. But animals need better maneuverability to obtain energy and survive in a competitive world. Therefore, a real nervous system appeared in the Annelida (such as the earthworm). The earthworm is made up of segments formed by subdivisions that partially transect the body cavity. Each segment contains elements of body systems such as the circulatory, nervous, and excretory tracts. Metamerism increases the efficiency of body movement by allowing the effect of muscle contraction to be extremely localized, and it makes possible the development of greater complexity in general body organization. It is very interesting that the “concerns” of movement are separated by nature! As a result, the segments are cohesive and loosely coupled.
The most important inspiration from the process of natural evolution described above is that complex behaviors are not achieved by direct communication among the muscle cells of the segments but by the central control of the ganglia (nerve centers). Obviously, this reduces the complexity of the animal; otherwise, for example, each muscle cell of a segment would have to contain the same “movement logic”. We could say that nature encapsulates the “cross-cutting” into the ganglion (neurons). This phenomenon is even more common in vertebrates (Figure 1). The extreme example is the cortex, in which the most complex “cross-cuttings”, such as emotion and the ability to learn, are handled very well.
3.1.2 Inspirations from the nervous system
How can we improve the object-oriented paradigm to implement the “nervous systems” of large-scale software systems that deal with cross-cutting? Let us investigate the nervous system in more detail. Nervous systems are made of neurons. Although nervous systems vary enormously in structure and function, neurons function similarly in animals and humans. A typical neuron has four morphologically defined regions: the cell body, dendrites, the axon, and the presynaptic terminals (Figure 2).
[Figure 1 sketches a Limb class hierarchy (UpperLimb, LowerLimb, LeftArm, RightArm, LeftLeg, RightLeg), each with forward(range : integer) and backward(range : integer) operations, together with a Central Nervous System aspect named Synchronization whose after() advice on call(void UpperLimb.*ward(int)) || call(void LowerLimb.*ward(int)) synchronizes the limbs.]
Fig. 1. An example of “cross-cutting” in the nervous system: synchronization of the limbs in walking.
[Figure 2 shows the structure of a neuron: the soma (cell body) with its nucleus, the dendrites, the axon along which the impulse travels, and the axon terminal whose synaptic knob, containing mitochondria and synaptic vesicles filled with neurotransmitter molecules, faces the receptors of the postsynaptic membrane across the synaptic cleft.]
Fig. 2. The structure of a neuron.
The axon is the main conducting unit of the neuron, capable of conveying electrical signals along distances that range from as short as 0.1 mm to as long as 2 m, thereby conveying information to different targets. Near its end the axon divides into fine branches that make contact with other neurons or effectors. The point of contact is known as the synapse. The cell transmitting a signal is the presynaptic cell, and the cell receiving the signal is the postsynaptic cell.
Ramón y Cajal pointed out in his principle of connectional specificity that the connections between neurons and between neurons and effectors are specified. The principle states that each cell communicates with certain postsynaptic target cells but not with others, and always at specialized points of synaptic contact. Such a specified connection is similar to direct intercellular communication (by gap junctions or by plasma-membrane-bound molecules). However, the specified connections can change owing to synaptic plasticity. For example, long-term habituation and sensitization involve structural changes in the presynaptic terminals of sensory neurons: long-term habituation leads to a loss of synapses, and long-term sensitization to an increase (Figure 3). Thus, nervous systems, in which “cross-cutting” is modularized, can learn and evolve in a changing environment.
[Figure 3 compares three conditions for a sensory neuron synapsing onto a motor neuron: control, long-term habituation and long-term sensitization.]
Fig. 3. Long-term habituation leads to a loss of synapses, and long-term sensitization to an increase [10].
If we draw an analogy between objects in the object-oriented paradigm and effectors in an animal's body, it can be seen that the features of intercellular communication through synapses, which are the key to handling “cross-cutting”, do not exist in the object-oriented paradigm, where communication is done by point-to-point method calls similar to the gap-junction communication mentioned above.
From the discussion above, we draw several considerations for augmenting the object-oriented paradigm to implement the “nervous system” of a large-scale software system that deals with cross-cutting:
1) add a new communication mechanism similar to synapse communication;
2) differentiate “neurons” from “effectors” (objects);
3) add a mechanism supporting “synaptic plasticity”.
We observe that the Aspect-Oriented paradigm is a candidate for the desired paradigm. We discuss this point in the next section.
3.2 A candidate for the desired paradigm: Aspect-Oriented
As an extension of the Object-Oriented paradigm, AOP adds Aspect, PointCut and join point to the Object-Oriented paradigm [11]. The Aspect has a natural analogy with the nerve cell in the nervous systems of animals, and an Aspect can be modeled with a simplified neuron model (Figure 4).
[Figure 4 maps AOP constructs onto the nervous system: Aspect corresponds to the neuron, PointCut to the axon, join point to the synapse, Advice to signal transduction, Object to the effector, the executing points of a program to membrane receptors, and the join point expression to the presynapse.]
Fig. 4. Metaphor between AOP and the nervous system. Note that join points defined in an object or aspect correspond to the membrane receptors of a cell; they can be method calls, method executions, etc. [11], by which a PointCut synapses with the object or aspect.
Aspect (Neuron)
An Aspect is a “neuron” differentiated from the “effectors” (objects); it is a modularized cross-cutting, i.e. a module in which cross-cutting is encapsulated. It consists of join points, PointCuts and Advice. Join points connect to an Advice through the PointCut they belong to and expose the logic of the cross-cutting implemented in the Advice. In other words, the Advice is surrounded by a “wall” of join points (Figure 5), and hence the only way to make the Advice function is to activate one of the join points in the wall. Thus, cross-cutting that cannot be handled effectively by the OO paradigm is encapsulated in the Aspect. An Aspect has some features similar to those of a Class. First, an Aspect is cohesive because its join points can only activate certain Advice. Second, an Aspect is loosely coupled with the other parts of the system because it communicates with other elements only through the join points in the wall [12].

[Figure 5 depicts an Aspect, whose Advice is enclosed by join points and reached through a PointCut, next to a Class made up of Operations and Attributes.]
Fig. 5. Another representation of Aspect and Class [13].
PointCut (Axon)
Together with join points, the PointCut adds a new communication mechanism, similar to synapse communication, to the OO paradigm. The PointCut is the main conducting unit of the Aspect, capable of getting the context of the target objects synapsed by its own join points (we also call it the context of the join point). A PointCut, connected to at least one Advice, can have several join points, which are program execution points, similar to the membrane receptors of a cell, defined in an object or aspect; they can be method calls, method executions, etc., by which the PointCut synapses with the object or aspect. Whereas the direction of impulse transmission in the axon of most neurons is always from the base of the axon to its terminals, information in a PointCut is transmitted bi-directionally, i.e. an Aspect can pass information both from and to its targets. We call a PointCut that passes information from the target to the Aspect it belongs to a dendrite.
Advice (Signal Transduction)
The Advice, similar to the signal-transduction logic in a neuron, is the logic implementing the cross-cutting. It can change the behavior of the target objects synapsed by the join points of its own PointCut, and it is activated by those join points. The activated Advice performs operations such as before(), around() and after() in the context of the join point to implement the cross-cutting. For any object the Advice is inaccessible, in other words an object cannot call the operations of the Advice, which guarantees the modularity of the cross-cutting.
3.3 Sample: a Simplest Nervous Net
The simplest neuronal network consists of three cells: a sensory neuron connected to a motor neuron connected to a muscle cell, which can be mapped onto the neuroscience-inspired model described above (Figure 6).

Fig. 6. A simplest nervous net: a sensory neuron connected to a motor neuron connected to a muscle cell (effector).
1  class Effector {
2      boolean state, transmitter;
3      private boolean receptor(){
4          boolean transmitter_received = false;  //stub; MotorNeuron's around() advice supplies the real value
5          return transmitter_received;
6      }
7      public boolean emit(){
8          transmitter = state;
9          return transmitter;
10     }
11     public void contract(){
12         //to contract and change the state...
13     }
14     public void relax(){
15         //to relax and change the state...
16     }
17     public void go(){
18         if (receptor()) contract();
19         else relax();
20     }
21 }
The sensory neuron of the neuronal network is implemented as an aspect which has two pointcuts, axon() and dendrite(), analogous to those parts of a neuron. These pointcuts differ in the direction in which information is transmitted: axon() transmits information (the transmitter to release) from SensoryNeuron to another aspect or object (in this case, MotorNeuron), while dendrite() carries information in the opposite direction (here, from Effector). The Advice bound to dendrite(), similar to signal transduction in a cell, receives the information emitted by Effector (line 9), and then the Advice of axon() decides what information is transmitted to the target; in this simple case transmitter_received is transmitted directly.
1  aspect SensoryNeuron {
2      boolean transmitter_to_release, transmitter_received;
3      pointcut axon() : within(MotorNeuron) && execution( * receptor(..));
4      pointcut dendrite() :
5          within(Effector) && execution( * emit(..));
6      boolean around() : axon() {
7          return transmitter_received;
8      }
9      after() returning(boolean x) : dendrite() {
10         transmitter_received = x;
11     }
12 }
MotorNeuron is an aspect similar to SensoryNeuron. However, there are two differences between them. First, MotorNeuron has only one pointcut, axon(), which transmits orders to the Effector according to its own logic (line 8). Second, MotorNeuron has a membrane receptor() (line 4), similar to that of Effector, which receives the transmitter emitted by SensoryNeuron.
1  aspect MotorNeuron {
2      boolean transmitter_to_release, transmitter_received;
3      pointcut axon() : within(Effector) && execution( * receptor(..));
4      private boolean receptor(){
5          boolean transmitter_received = false;  //stub; SensoryNeuron's around() advice supplies the real value
6          return transmitter_received;
7      }
8      boolean around() : axon() {
9          transmitter_received = receptor();
10         //MotorNeuron receives signal = transmitter_received
11         if (transmitter_received) transmitter_to_release = true;
12         else transmitter_to_release = false;
13         //MotorNeuron releases transmitter = transmitter_to_release
14         return transmitter_to_release;
15     }
16 }
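To make the sample concrete, a minimal driver might look as follows. It is a hypothetical sketch (the class SimplestNetDemo is not part of the listings above) and assumes the types above are compiled and woven with an AspectJ compiler such as ajc.

public class SimplestNetDemo {
    public static void main(String[] args) {
        Effector muscle = new Effector();
        muscle.emit();   // execution picked up by SensoryNeuron.dendrite()
        muscle.go();     // go() calls receptor(), whose execution is advised by
                         // MotorNeuron, which in turn is fed by SensoryNeuron
    }
}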
4 Applying the AO Paradigm in ANN
We use AOP to implement an ANN to support one of our points, i.e. that the way AOP deals with cross-cutting is similar to that of real neural nets, which implies that we could apply ANN theory to large-scale AO systems, in which an ANN is a by-product. An ANN could be implemented with other programming paradigms, but such implementations have the shortcomings mentioned in Section 1.

With the paradigm described above, artificial neural networks can be simulated easily. To illustrate this, we present a simple example: a three-layer BP neural network which solves the MONK's problem I [14].
The MONK's problems rely on an artificial domain in which robots are described by six different attributes:
x1: head shape ∈ {round, square, octagon}
x2: body shape ∈ {round, square, octagon}
x3: is smiling ∈ {yes, no}
x4: holding ∈ {sword, balloon, flag}
x5: jacket color ∈ {red, yellow, green, blue}
x6: has tie ∈ {yes, no}
The learning task is a binary classification task. Each problem is given by a logical description of a class: a robot either belongs to this class or not. Instead of providing a complete class description to the learner, only a subset of all 432 possible robots, with their classifications, is given. The learning task is then to generalize over these examples and, if the particular learning technique at hand allows it, to derive a simple class description. The MONK's problems are composed of three tasks; one of them is the MONK's problem I, described below. Figure 7 illustrates the network architecture used for the MONK's problem I. Each unit in the Hidden1 layer represents one of the attributes.
Problem 4.1 (Monk1): (head shape = body shape) or (jacket color = red).
From the 432 possible examples, 124 were randomly selected for the training set. No noise is present.
[Figure 7 shows a feed-forward architecture with an Input layer of attribute units (head-shape, body-shape, smiling, holding, jacket-color, has-tie), two hidden layers (Hidden1 and Hidden2) and an Output layer.]
Fig. 7. The network architecture for the MONK's Problem I [15].
[Figure 8 shows the network of Fig. 7 mapped onto aspect groups Group0, Group1, ..., Group5, Group6, Group7 and Group8.]
Fig. 8. The aspect-oriented design model of the network in Fig. 7.
Each aspect of the network represents a neuron (we call the result an Aspect-Oriented Neural Network, AONN). The learning algorithm is a typical cross-cutting concern, which is modeled as an aspect whose pointcuts cross-cut every aspect that needs to learn (in this case, every aspect in all layers except the Input layer). For simplicity it is not shown in the figure.
The architecture of Figure 7 is mapped into the AO design space as shown in Figure 8. Using the programming technique described in Section 3.3, the design model can be implemented easily. We implement the neuron as an abstract aspect with one pointcut, dendrite().
1  public abstract aspect Neuron {
2      //some definitions
3      ...
4      //n is the number of inputs of the neuron
5      protected int n;
6      protected float[] input;
7      protected float[] weight;
8      protected Params params;   //e.g. weights and threshold of the neuron
9      protected ID id;            //identity of the neuron
10     protected OutPut output;    //transmitter
11     abstract pointcut dendrite();
12     after() returning(OutPut out) : dendrite() {
13         if (!init) init();
14         input[out.getId().getNumber()] = out.getValue();
15         counter++;
16         if (!(counter < n)) {
17             counter = 0;
18             getOutPutWrapper();
19         }
20     }
21     public ID getID() {
22         return id;
23     }
24     /* The *Wrapper() methods just wrap the corresponding *OutPut(float net)
25        methods so that the dendrite() pointcut picks out join points within
26        the correct aspect (see their implementations). This is a small trick. */
27     public abstract void init();
28     public abstract OutPut getOutPutWrapper();
29     public abstract OutPut getOutPut(float net);
30     public abstract float getNetInput();
31     public abstract Params getParamsWrapper();
32     public abstract Params getParams(ID id);
33 }
The abstract aspect SigmoidNeuron partially implements Neuron with a sigmoid activation function (line 6).
1  public abstract aspect SigmoidNeuron
2          extends Neuron {
3      //some definitions
4      ...
5      //sigmoid activation function
6      public OutPut getOutPut(float net) {
7          //calculate the output with the sigmoid function and return it
8          ...
9      }
10     public float getNetInput() {
11         //this executing point will be captured by the learning aspect
12         Params tempParams = getParamsWrapper();
13         //compute and return the net value
14         ...
15     }
16     public Params getParams(ID id) {
17         return params;
18     }
19 }
We implement the neurons as aspects and name them according to a simple rule. The name of a neuron is composed of two parts, the order name and the group name. We group neurons by the target they synapse onto: neurons that have the same target are in the same group; for example, the neurons belonging to hidden layer 2 are in Group7. So every neuron has a group id. Second, the position of the neuron in the group (from left to right) gives the order name of the neuron, such as Neuron0, Neuron1, etc. (see also Figure 8).
1  aspect Neuron0Group7
2          extends SigmoidNeuron {
3      pointcut dendrite() : within(*Group0) && call( * getOutPut(..));
4      public void init() {
5          //initialize this neuron and set the flag
6          ...
7      }
8      public OutPut getOutPutWrapper(){
9          return getOutPut(getNetInput());
10     }
11     public Params getParamsWrapper(){
12         return getParams(id);
13     }
14 }
There are some differences among the neurons. First, the neurons in the input layer are the sensory neurons mentioned in Section 3.3: their dendrite() pointcuts pick up join points in the effectors, which in this case are pattern generators (plain Java objects, POJOs). Second, we add a new method, finalOutPut(ID id), to the neurons in the output layer, by which the learning aspect obtains the final output without knowing the scale of the net. Once the pattern generators have produced all the data, the output aspect calls finalOutPut(ID id) to output the result. Thus the net is ready to run together with a simple main class, and if the parameters are assigned correctly it will return the right result for all the patterns. But we want it to obtain the right parameters by itself, i.e. to learn.
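The output-layer neuron itself is not listed above; the following is a hypothetical sketch of what it might look like, assuming (as Figure 8 suggests) that the single output neuron forms Group8 and receives its input from the Hidden2 group (Group7).

aspect Neuron0Group8 extends SigmoidNeuron {
    //synapse onto the Hidden2 group
    pointcut dendrite() : within(*Group7) && call( * getOutPut(..));

    public void init() {
        //initialize weights and threshold and set the flag
    }
    public OutPut getOutPutWrapper() {
        OutPut out = getOutPut(getNetInput());
        output = out;
        finalOutPut(id);   //join point picked out by Bp's checkErr()/checkPass()
        return out;
    }
    public Params getParamsWrapper() {
        return getParams(id);
    }
    public float finalOutPut(ID id) {
        return output.getValue();
    }
}

The extra call to finalOutPut(id) is what exposes the join point that the learning aspect below selects with its checkErr() and checkPass() pointcuts.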
Learning is a typical cross-cutting concern, and it is implemented as the aspect Bp with several pointcuts that cross-cut every aspect that needs to learn (in this case, every aspect in all layers except the input layer). The key issue in implementing the learning aspect is the parameter update. Once the output layer produces an output, the pointcut checkErr() captures it and checks whether the net needs to learn (i.e. whether the result is wrong or right), and the pointcut checkPass() triggers the calculation of the update using the learning algorithm (the BP algorithm in this case; we can easily replace it with another learning algorithm). While the net is running, if some parameters need to be refreshed, the pointcut paramsUpdate() updates them.
1  public aspect Bp {
2      ...
3      //we do not care about the firing order of the neurons in the net,
4      //so data is stored in hash tables
5
6      pointcut paramsUpdate() : if (learningFlag && hasUpdate)
7          && call( * getParams(..)) && within(Neuron*);
8      Params around() : paramsUpdate() {
9          //return the refreshed parameters in place of the stored ones (body elided)
10     }
11
12     pointcut checkErr() : within(Neuron*) && call( * finalOutPut(..));
13     after() returning(float v) : checkErr() {
14         //check whether the result is wrong, i.e. whether the net needs to learn (body elided)
15     }
16
17     //the end of one pass: the output of the net has been produced
18     pointcut checkPass() : if (learningFlag && isInputsReady && isParamsReady)
19         && within(Neuron*) && call( * finalOutPut(..));
20     after() : checkPass() {
21         ...
22         calculate();
23     }
24
25     //get all outputs of the neurons in the net
26     pointcut getAllOutPut() : if (learningFlag) && within(Neuron*) &&
27         call( * getOutPut(..));
28     after() returning(OutPut out) : getAllOutPut() {
29         //store the output into a hash table (body elided)
30     }
31
32     pointcut getAllParams() : if ( (!onePass) && learningFlag) &&
33         within(Neuron*) && call( * getParams(..));
34     after() returning(Params tempParams) : getAllParams() {
35         //store the params into a hash table and set the flag
36         ...
37     }
38     public void calculate() {
39         //calculate the weight updates using the BP algorithm
40         ...
41     }
42 }
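For completeness, a training driver in the spirit of the “simple main class” mentioned above might look like the following. It is purely hypothetical: PatternGenerator and its methods are invented here to stand for the POJO pattern generators, and the real driver may differ.

public class MonkTrainer {
    public static void main(String[] args) {
        //hypothetical POJO pattern generator feeding the input-layer aspects
        PatternGenerator patterns = new PatternGenerator("monks-1.train");
        for (int epoch = 0; epoch < 1000; epoch++) {
            while (patterns.hasNext()) {
                //emitting a pattern is enough: the input neurons' dendrite()
                //pointcuts pick it up, the hidden and output aspects fire as
                //their inputs complete, and Bp learns at its own join points
                patterns.emitNext();
            }
            patterns.reset();
        }
    }
}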
Our preliminary experiment shows that the AONN has good traits. First, the parallelism of the neural network is handled automatically. Neural networks are parallel processing systems in which the neurons of the same layer fire in parallel, so the programmer is usually required to transform this parallelism manually into a serial procedure or into dialogues between objects; that is error-prone and makes large-scale neural networks hard to implement. In the AONN, rather than caring about the details of the interactions between neurons (the transformation), we just focus on the synapses (join points); the transformation is completed automatically by weaving. (Using process algebras [16,17] we can prove the correctness of the weaving, but for space reasons this will be discussed in another paper.) Second, the approach used in the AONN has high traceability: the problem space is mapped directly into the design space, and then into the programming space. Moreover, the AONN has low change impact: we can add or delete neurons and neuron layers, or change the topology, with little impact on the whole system. Furthermore, by using dynamic AOP (whose implementations are still immature), such as JAC, JBoss AOP or Axon [18,19,20], the topology of the NN can be changed at runtime. Fourth, the implementation can be reused easily: the neurons implemented as aspects are cohesive and loosely coupled with the other parts of the system, so they can be used in another system with light modification. Last but not least, the learning algorithm can be changed on demand: the learning algorithm of the neural network is localized in a single aspect, so it can be replaced easily.
On the other hand, we define an aspect for each neuron in the network, and although we have carefully tried to avoid it, this requires a reasonable amount of effort to write. It would seem natural to create multiple objects from one neuron class definition, but AspectJ's pointcut and advice mechanism does not allow arbitrary aspect instances to be instantiated and requires advised objects to be distinguished. It might be an interesting idea to apply instance-level aspects to represent the connections between neurons; this is one of our future works [31,32].
The Aspect-Oriented paradigm, a candidate for the biologically-inspired programming paradigm discussed in Section 3.1, is not dedicated to implementing ANNs; it has been used to develop more general applications and can be used to develop biologically-inspired software, which is discussed at an abstract level in the next section.
5 Biologically-Inspired Framework of Evolvable Software
Concepts of biologically-inspired computing have existed in computer science for decades. Early on, W. McCulloch and W. Pitts proposed artificial neural networks [21], John von Neumann presented cellular automata [22], Alan Turing studied morphogenesis [23] and John H. Holland invented genetic algorithms [24]. More recently, new biologically-inspired computing models have been presented, such as [25,26], and a large number of applications using biologically-inspired computing have been developed, such as [27,28,29,30].
We argue that although most applications of biologically-inspired computing are implemented as software systems, software technology itself has not benefited from the ideas of biologically-inspired computing, owing to the incongruity of the software paradigm with the structure and mechanisms of biology (particularly nervous systems), such as the absence of a synapse-like communication mechanism in the Object-Oriented paradigm. As discussed earlier, the Aspect-Oriented paradigm is close to the desired paradigm. We believe that it can be used to develop the biologically-inspired evolvable software proposed below (Figure 9).
The biologically-inspired framework of evolvable software consists of three layers embedded in an environment (Figure 9). The Core Application Layer (CAL), similar to the organs of an organism, includes only the orthogonal functions of the system; it is constructed with the OO paradigm and runs independently. The systemic functions, such as security, transactions, persistence and distribution, are implemented with the Aspect-Oriented paradigm in the Reflex Aspects Layer (RAL), similar to the spinal cord of animals. The RAL controls the functions of the core application in the CAL, and its effectiveness is adjusted according to the changing environment; the RAL is what the Aspect-Oriented community focuses on now. At the top of the architecture, the Knowledge-Based AONN Layer (KBAL) serves as the “cortex” of the software system, controlling the core applications indirectly via the RAL. Owing to the combination of different biologically-inspired computing technologies, such as knowledge-based neurocomputing, genetic computing and synapse (join point) plasticity, the KBAL learns and adapts to a changing environment (new requirements), thus providing us with evolvable software.
[Figure 9 shows the layered architecture and its biological analogs: the Core Application Layer (organs), the Reflex Aspects Layer (spinal cord) and the Knowledge-Based AONN Layer (cortex), all interacting with the environment.]
Fig. 9. The architecture of Aspect-Oriented evolvable and adaptive software and the biological analog.
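As a rough illustration of how the layers relate, the following sketch is purely hypothetical: the class Account, the aspect Security and the aspect Adaptation are invented names, and the KBAL logic shown here is an empty placeholder rather than the AONN-plus-genetic-computing machinery envisaged above.

//CAL: an orthogonal business function, plain OO, runnable on its own
class Account {
    public void transfer(float amount) { /* core business logic only */ }
}

//RAL: a systemic ("reflex") concern woven onto the CAL
aspect Security {
    static boolean enabled = true;   //effectiveness adjustable while running
    pointcut guarded() : call(void Account.transfer(float));
    before() : guarded() {
        if (enabled) { /* authenticate and authorize the caller */ }
    }
}

//KBAL: a "cortex" aspect observing the reflex layer and tuning it; in the
//full framework this would be an AONN combined with genetic computing
aspect Adaptation {
    pointcut reflex() : adviceexecution() && within(Security);
    after() : reflex() {
        //learn from the observed behaviour and adjust the reflex layer,
        //e.g. enable or disable Security according to the environment
    }
}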
6 Conclusions
To handle the non-orthogonal concerns (cross-cutting) that are not handled effectively by the Object-Oriented paradigm, we have proposed a biologically-inspired programming paradigm drawing lessons from nervous systems. There are three considerations for augmenting the OO paradigm to handle cross-cutting: 1) add a new communication mechanism similar to synapse communication; 2) differentiate “neurons” from “effectors” (objects); 3) add a mechanism supporting “synaptic plasticity”.
The Aspect-Oriented paradigm, which at present satisfies considerations 1) and 2), is a candidate for the desired paradigm. It has been used for ANN simulation, named Aspect-Oriented Neural Networks (AONN), to support our ideas, and the preliminary experiment shows that the AONN has many good traits. Although attention to consideration 3) (i.e. dynamic AOP) is increasing, it has not yet been well implemented. Research on synaptic plasticity in neuroscience may give us more inspiration for achieving it.
In addition, we have proposed a preliminary biologically-inspired evolvable software framework which can be implemented using the AO paradigm combined with biologically-inspired computing technologies, such as neurocomputing and genetic computing.
References
[1] George D. Manioudakis and Spiridon D. Likothanassis, An Object-Oriented Toolbox for Adaptive Neural Networks' Implementation, International Journal on Artificial Intelligence Tools, Vol. 10, No. 3 (2001), 345–371.
[2] Ellingsen B.K. An object-oriented approach to neural networks, Technical
Report No.45, ISSN 0803-6489, UIB-IFI,
URL:http://citeseer.ist.psu.edu/ellingsen95objectoriented.html.
[3] Dijkstra, Edsger W., “A Discipline of Programming,” Englewood Cliffs, NJ: Prentice-Hall Inc., 1976.
[4] Kiczales G., Lamping J., Mendhekar A., Maeda C., Lopes C.V., Loingtier J.M., Irwin J., Aspect-Oriented Programming, In: Proceedings of the European Conference on Object-Oriented Programming (ECOOP), LNCS 1241, Springer-Verlag, 1997, 220–242.
URL: http://citeseer.nj.nec.com/63210.html.
[5] Soares, S., and Borba, P., (2002), “Progressive implementation with aspect
oriented programming”, in Springer Verlag, editor, The 12th Workshop for PhD
Students in Object–Oriented Systems, ECOOP 02.
[6] Elrad, T., Filman, R.E., and Bader, A., “Aspect oriented programming”,
Communications of the ACM, vol. 44, no. 10(2001).
[7] F.Q. Yang, H. Mei, J. Lu and Z. Jin, Some Discussion on the Development of Software Technology (in Chinese), Chinese Journal of Electronics, Vol. 30, No. 12A, 1901–1906, 2002.
[8] William K. Purves, David Sadava, Gordon H. Orians, Craig Heller. “Life: The
Science of Biology,” 7th Ed., W H Freeman, Bedford, 2004
[9] E. R. Kandel, J. H. Schwartz, T. M. Jessell, “Essentials of Neural Science and
Behavior,” McGraw-Hill, 1996
[10] Bailey, C.H., and Chen, M. Morphological basis of long-term habituation and
sensitization in Aplysia. Science 220:91–93.
[11] The AspectJ Team, The AspectJ Programming Guide,With associated web site
http://eclipse.org/aspectj.
[12] L.C. Wang, X.F. Tang, NBO: An Approach for Aspect-Oriented Software
Development in Light of Neurobionics, submitted.
[13] Taylor, D.A., “Object-Oriented Technology: A Manager’s Guide,” New
York:Addison Wesley, 1990
[14] Thrun, S.B., Bala, J., Bloedorn, E., Bratko, I., Cestnik, B., Cheng, J., De Jong, K., Dzeroski, S., Fahlmann, S.E., Fisher, D., Hamann, R., Kaufman, K., Keller, S., Kononenko, I., Kreuziger, J., Michalski, R.S., Mitchell, T., Pachowicz, P., Reich, Y., Vafaie, H., Van de Welde, W., Wenzel, W., Wnek, J., Zhang, J. (1991), The Monk's Problems: A Performance Comparison of Different Learning Algorithms, Technical Report CMU-CS-91-197, Carnegie Mellon University.
[15] Masumi Ishikawa, Structural learning and rule discovery from data, in S.
Amari and N. Kasabov Eds., Brain-Like Computing and Intelligent Information
Systems, Chapter 16, pp.396-415, Springer (1998)
[16] C.A.R. Hoare, “Communicating Sequential Processes,” Prentice-Hall, Englewood Cliffs, NJ, 1985.
[17] James H. Andrews. Process-Algebraic Foundations of Aspect-Oriented
Programming. In Proceedings of the Third International Conference on
Metalevel Architectures and Separation of Crosscutting Concerns Pages: 187
– 209, 2001
[18] Pawlak, R., L. Seinturier, L. Duchien, and G. Florin, “JAC: A Flexible Solution
for Aspect-Oriented Programming in Java,” in Metalevel Architectures and
Separation of Crosscutting Concerns (Reflection 2001), LNCS 2192, pp. 1–24,
Springer, 2001
[19] http://www.jboss.org/developers/projects/jboss/aop .
[20] Swen Aussmann, Michael Haupt, Axon - Dynamic AOP through Runtime Inspection and Monitoring, ECOOP'03 Workshop on Advancing the State of the Art in Runtime Inspection (ASARTI), 2003.
[21] W. McCulloch and W. Pitts, A Logical Calculus of the Ideas Immanent in Nervous Activity, Bulletin of Mathematical Biophysics, Vol. 5, 1943, 115–133.
[22] John von Neumann, Theory of Self-Reproducing Automata. University of
Illinois Press. 1966 (Originally published in 1953)
[23] Alan Turing, The Chemical Basis of Morphogenesis, Philosophical Transactions of the Royal Society B (London), 1952.
[24] John H. Holland, Genetic algorithms and the optimal allocation of trials, SIAM Journal on Computing, 2:88–105, 1973.
[25] Gh. Paun, Computing with Membranes, J. Comput. System Sci. 61(1) (2000)
108–143. (See also Turku Center for Computer Science-TUCS Report No. 208,
1998, www.tucs.fi)
[26] Abelson et al., Amorphous Computing, Communications of the ACM, Volume 43, Number 5, May 2000.
[27] M. Wang and T. Suda, “The bio-networking architecture: A biologically inspired
approach to the design of scalable, adaptive, and survivable/available network
applications,” in Proceedings of the 1st IEEE Symposium on Applications and
the Internet (SAINT), (San Diego, CA), IEEE, 8–12 January 2001.
[28] Dario Floreano and Joseba Urzelai. Neural Morphogenesis, Synaptic Plasticity,
and Evolution, “Theory in Biosciences,” vol. 120, no. 3-4, pp. 225–240(16),
Urban & Fischer, 2001.
[29] Dan C. Marinescu, Ladislau Boloni, Biological Metaphor in the Design of Complex Software Systems, Future Generation Computer Systems, Vol. 17 (2001), 345–360.
[30] Eduardo Sanchez, Daniel Mange, Moshe Sipper, Marco Tomassini, Andrés Pérez-Uribe, André Stauffer, Phylogeny, Ontogeny, and Epigenesis: Three Sources of Biological Inspiration for Softening Hardware, Proceedings of the First International Conference on Evolvable Systems: From Biology to Hardware, pp. 35–54, October 7-8, 1996.
[31] Hridesh Rajan and Kevin Sullivan, Eos: Instance-Level Aspects for Integrated
System Design, In 9th European Software Engineering Conference (ESEC) and
11th ACM SIGSOFT International Symposium on the Foundations of Software
Engineering (FSE-11), pp.297–306, 2003.
[32] Kouhei Sakurai, Hidehiko Masuhara, Naoyasu Ubayashi, Saeko Matsuura and
Seiichi Komiya, Association Aspects, In Proceedings of the 3rd International
Conference on Aspect-Oriented Software Development (AOSD’04), 2004.
STOPPED FLOW METHOD & APPLICATION MURUGAVENI B.pptx
 
Transposable elements in prokaryotes.ppt
Transposable elements in prokaryotes.pptTransposable elements in prokaryotes.ppt
Transposable elements in prokaryotes.ppt
 
Behavioral Disorder: Schizophrenia & it's Case Study.pdf
Behavioral Disorder: Schizophrenia & it's Case Study.pdfBehavioral Disorder: Schizophrenia & it's Case Study.pdf
Behavioral Disorder: Schizophrenia & it's Case Study.pdf
 

The biologically-inspired programming paradigm is discussed in detail in Section 3. The implementation of ANN using the AO paradigm is presented in Section 4. We propose a biologically-inspired framework of evolvable software in Section 5. Section 6 concludes the paper.

2 Reviews on the Principle of Separation of Concerns and AOP

One approach to solving complex problems is to divide the problem into smaller, simpler and loosely coupled sub-problems; this is the so-called principle of separation of concerns [3]. The principle reminds us that we cannot handle many problems at once and should deal with them one by one, and that each important concern should be represented intentionally (clearly and declaratively) and should be localized. In this way we obtain intelligibility, adaptability, reusability and many other important qualities of software systems, since the degree to which the requirements are satisfied can be conveniently verified against such an intentional, localized representation. Nevertheless, the separation of concerns principle gives no guidance on how to separate concerns. Many methods have been proposed in the past decades, such as the procedure-oriented and Object-Oriented methods, which handle orthogonal concerns effectively. Unfortunately, many concerns are non-orthogonal. As a result, massive redundancies appear when we try to represent all of these concerns intentionally and locally with those methods.

Aspect-Oriented Programming (AOP) [4] provides a mechanism to handle non-orthogonal concerns by modularizing cross-cutting, augmenting the Object-Oriented programming paradigm with Aspect, join point, PointCut and Advice [5]. AOP is based on the idea that computer systems are better programmed by separately specifying the various concerns and properties, or areas of interest, of a system together with some description of their relationships, and then relying on the AOP environment to weave or compose them into a coherent program [6]. AOP does what OOP cannot do effectively: it clearly and cleanly modularizes functional system code, i.e. source code, by separating concerns into well-localized units, called aspects, to eliminate code tangling.
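To make these constructs concrete, here is a minimal AspectJ sketch (ours, not from the paper) of a classic cross-cutting concern, method tracing, modularized as an aspect; the Account class and its methods are hypothetical names chosen only for illustration.

    class Account {
        private double balance;
        public void deposit(double amount)  { balance += amount; }
        public void withdraw(double amount) { balance -= amount; }
    }

    public aspect Tracing {
        // The pointcut picks out the join points of interest:
        // every call to a public method of Account.
        pointcut accountOps(): call(public * Account.*(..));

        // The advice holds the cross-cutting logic; it runs before each
        // picked-out join point, so no tracing code is tangled into Account.
        before(): accountOps() {
            System.out.println("entering " + thisJoinPointStaticPart.getSignature());
        }
    }

Without the aspect, the tracing statements would be scattered through every business class; with it, the concern lives in one module and is woven in by the AspectJ compiler.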
The OO technology became popular because it synthesizes three important factors of software development: the computer platform, the way humans think, and the features of the problems at hand. The basic elements of the OO paradigm are objects and the interactions between them. This paradigm is in harmony with human thinking and with the rules of the natural world, so we can map the problem space to the design space directly, and then to the programming space [7]. It should be pointed out that the rules of nature that inspired the OO paradigm are the rules of the mechanistic nature. We can create new software technology, inspired by the organic nature, that handles non-orthogonal concerns more effectively than OO does [12].

3 The Biologically-Inspired Programming Paradigm and its Candidate

3.1 The biologically-inspired paradigm to handle cross-cutting

We observe that nervous systems handle "cross-cutting" in the body, which inspires a programming paradigm to handle cross-cutting in software systems.

3.1.1 How does nature handle "cross-cutting"?

The communications between objects in the Object-Oriented paradigm are very close to the intercellular communications in primitive multicellular animals without a nervous system, such as sponges, in which cells communicate directly via gap junctions, each knowing exactly which cells it talks to. (The biological background in this paper mainly follows [8,9].) Sponges have no nervous system (only a kind of neuron without synapses). Their responses to stimulation are regional and slow, and their intensity depends on the strength of the stimulation.
The information-carrying materials are transmitted by diffusion of gelatinous materials, by dissociating amoebocytes, and by contact between fixed cells. With this mechanism sponges can perform only very simple behaviors.

Although, from the viewpoint of nature, Object-Oriented technology is quite primitive, we have already built with it systems that are among the most complex and often the hardest for humans to understand and maintain. How to lower complexity becomes a prominent problem as the scale of software systems increases, so we should find new methods to address it. One contributor to this problem is non-orthogonal concerns (i.e. cross-cutting), which the OO paradigm does not deal with effectively. Nature does.

The process of natural evolution did not stop at the sponge; more complex animals with nervous systems appeared. The first nervous system is found in the Coelenterata (such as Hydra). Neurons partially separated from effectors and formed a neural net without a nerve center, in which nerve impulses are broadcast throughout the body to stimulate all of the muscle cells and produce simple behaviors. Since the neurons conduct impulses without direction, such a neural net is called a diffuse nervous system.

But animals need better maneuverability to obtain energy and survive in a competitive world. Therefore a real nervous system appeared in the Annelida (such as the earthworm). The earthworm is made up of segments formed by subdivisions that partially transect the body cavity. Each segment contains elements of body systems such as the circulatory, nervous and excretory tracts. Metamerism increases the efficiency of body movement by allowing the effect of muscle contraction to be extremely localized, and it makes possible the development of greater complexity in the general body organization. It is very interesting that the "concerns" of movement are separated by nature! As a result, the segments are cohesive and loosely coupled.

The most important inspiration from this evolutionary process is that complex behaviors are not completed by direct communication among the muscle cells of the segments but by central control of the ganglia (nerve centers). Obviously, this reduces the complexity of the animal; otherwise, for example, each muscle cell of a segment would have to carry the same "movement logic". We could say that nature encapsulates the "cross-cutting" into the ganglion (neuron). This phenomenon is even more common in vertebrates (Figure 1). The extreme example is the cortex, in which the most complex "cross-cuttings", such as emotion and the ability to learn, are handled very well.

3.1.2 Inspirations from the nervous system

How can the Object-Oriented paradigm be improved to implement "nervous systems" for large-scale software systems that deal with cross-cutting? Let us look at the nervous system in more detail. Nervous systems are made of neurons. Although nervous systems vary enormously in structure and function, neurons function similarly in animals and humans. A typical neuron has four morphologically defined regions: the cell body, dendrites, axon, and presynaptic terminals (Figure 2).
Fig. 1. An example of "cross-cutting" in the nervous system: synchronization of the limbs in walking. (The figure shows Limb subclasses — UpperLimb, LowerLimb, LeftArm, RightArm, LeftLeg and RightLeg — each with forward(range : integer) and backward(range : integer) operations, together with a "Center Nervous System" aspect whose after() advice on call(void UpperLimb.*ward(int)) || call(void LowerLimb.*ward(int)) synchronizes them.)

Fig. 2. The structure of a neuron. (The figure labels the soma (cell body), nucleus, dendrites, axon, axon terminals and the direction of the impulse, and details the synapse: presynaptic membrane, synaptic knob, synaptic vesicles containing neurotransmitter molecules, mitochondrion, synaptic cleft, postsynaptic membrane and its receptors, and the postsynaptic dendrite.)

The axon is the main conducting unit of the neuron, capable of conveying electrical signals over distances that range from as short as 0.1 mm to as long as 2 m, thereby carrying information to different targets. Near its end the axon divides into fine branches that make contact with other neurons or with effectors. The point of contact is known as a synapse. The cell transmitting a signal is the presynaptic cell, and the cell receiving the signal is the postsynaptic cell.
Ramón y Cajal pointed out, in his principle of connectional specificity, that the connections between neuron and neuron and between neuron and effector are specified. The principle states that each cell communicates with certain postsynaptic target cells but not with others, and always at specialized points of synaptic contact. Such a specified connection is similar to direct intercellular communication (by gap junctions or by plasma-membrane-bound molecules). But the specified connections can change, owing to synaptic plasticity. For example, long-term habituation and sensitization involve structural changes in the presynaptic terminals of sensory neurons: long-term habituation leads to a loss of synapses, and long-term sensitization to an increase (Figure 3). Thus, nervous systems in which "cross-cutting" is modularized can learn and evolve in a changing environment.

Fig. 3. Long-term habituation leads to a loss of synapses, and long-term sensitization to an increase. [10] (Panels: 1. control; 2. long-term habituation; 3. long-term sensitization; each shows a sensory neuron synapsing onto a motor neuron.)

If we draw an analogy between objects in the Object-Oriented paradigm and effectors in an animal's body, it can be seen that the features of intercellular communication through synapses, which are the key to handling "cross-cutting", do not exist in the Object-Oriented paradigm, where communication is done by point-to-point method calls similar to the gap-junction communication mentioned above.

From the discussion above, we conclude three considerations for augmenting the Object-Oriented paradigm to implement a "nervous system" for large-scale software systems that deals with cross-cutting:
1) add a new communication mechanism similar to synaptic communication;
2) differentiate the "neuron" from the "effector" (object);
3) add a mechanism supporting "synaptic plasticity".

We observe that the Aspect-Oriented paradigm is a candidate for the desired paradigm. We discuss this point in the next section.

3.2 A candidate for the desired paradigm: Aspect-Oriented

As an extension of the Object-Oriented paradigm, AOP adds Aspect, PointCut and JoinPoint to the Object-Oriented paradigm [11]. The Aspect has a natural analogy with the nerve cell in the nervous systems of animals, and an Aspect can be modeled with a simplified neural model (Figure 4).
Fig. 4. Metaphor between AOP and the nervous system: Aspect — neuron; PointCut — axon; join point — synapse; Advice — signal transduction; Object — effector; executing point of a program — membrane receptor; join point expression — presynapse. Note that join points defined in an object or aspect correspond to the membrane receptors of a cell; they can be a method call, a method execution, etc. [11], by which a PointCut synapses with the object or aspect.

Aspect (Neuron)

An Aspect is a "neuron" differentiated from the "effector" (object): it is a modularized cross-cutting concern, i.e. a module in which cross-cutting is encapsulated. It consists of join points, PointCuts and Advice. Join points connect to an Advice through the PointCut they belong to and expose the logic of the cross-cutting implemented in the Advice. In other words, the Advice is surrounded by a "wall" of join points (Figure 5), and hence the only way to make the Advice function is to activate one of the join points in the wall. Thus cross-cutting, which cannot be handled effectively by the OO paradigm, is encapsulated into the Aspect. The Aspect has features similar to those of a Class. First, an Aspect is cohesive because its join points can only activate certain Advice. Second, an Aspect is loosely coupled with the other parts of the system because it communicates with other elements only through the join points in the wall [12].

PointCut (Axon)

Together with join points, the PointCut adds a new communication mechanism, similar to synaptic communication, to the OO paradigm. The PointCut is the main conducting unit of the Aspect, capable of obtaining the context of the target objects synapsed by its own join points (we also call this the context of the join point).
Fig. 5. Another representation of Aspect and Class [13]. (The figure contrasts a Class, with its Attributes and Operations, against an Aspect, whose PointCuts and Advice are surrounded by a wall of join points.)

A PointCut, connected to at least one Advice, can have several join points, which are program execution points, similar to the membrane receptors of a cell, defined in an object or aspect; they can be a method call, a method execution, etc., by which the PointCut synapses with the object or aspect. Whereas the direction of impulse transmission in the axon of most neurons is always from the base of the axon to its terminals, information in a PointCut is transmitted in both directions, i.e. an Aspect can pass information both to and from its targets. We call a PointCut that passes information from the target to the Aspect it belongs to a dendrite.

Advice (Signal Transduction)

Advice, similar to the signal-transduction logic in a neuron, is the logic implementing the cross-cutting. It can change the behavior of the target objects synapsed by the join points of its own PointCut, and it is activated by join points. The activated Advice performs operations such as before(), around() and after() in the context of the join point to implement the cross-cutting. For any object the Advice is inaccessible; in other words, an object cannot call the operations of the Advice, which guarantees the modularity of the cross-cutting.

3.3 Sample: a Simplest Nervous Net

The simplest neuronal network consists of three cells: a sensory neuron connected to a motor neuron connected to a muscle cell. It can be mapped onto the neuroscience-inspired model described above (Figure 6).

1  class Effector {
2      boolean state, transmitter;
3      private boolean receptor() {
4          boolean transmitter_received = false;
5          return transmitter_received;
6      }
7      public boolean emit() {
8          transmitter = state;
9          return transmitter;
10     }
11     public void contract() {
12         //to contract and change the state...
13     }
14     public void relax() {
15         //to relax and change the state...
16     }
17     public void go() {
18         if (receptor()) contract();
19         else relax();
20     }
21 }
Fig. 6. A simplest nervous net: a sensory neuron connected to a motor neuron connected to an effector (muscle cell).

The sensory neuron of the neuronal network is implemented as an aspect with two pointcuts, axon() and dendrite(), analogous to those parts of a neuron. The pointcuts differ in the direction in which information is transmitted: axon() transmits information (the transmitter to release) from SensoryNeuron to another aspect or object (in this case, Effector), and dendrite() does the reverse. The Advice bound to dendrite(), similar to signal transduction in a cell, receives the information emitted by Effector (line 9), and the Advice of axon() then decides what information is transmitted to the target; in this simple case transmitter_received is transmitted directly.

1  aspect SensoryNeuron {
2      boolean transmitter_to_release, transmitter_received;
3      pointcut axon(): within(MotorNeuron) && execution(* receptor(..));
4      pointcut dendrite():
5          within(Effector) && execution(* emit(..));
6      boolean around(): axon() {
7          return transmitter_received;
8      }
9      after() returning(boolean x): dendrite() {
10         transmitter_received = x;
11     }
12 }

MotorNeuron is an aspect similar to SensoryNeuron, with two differences. First, MotorNeuron has only one pointcut, axon(), which transmits orders to the Effector according to some logic (line 8). Second, MotorNeuron has a membrane receptor() (line 4), similar to that of Effector, which receives the transmitter released by SensoryNeuron.
1  aspect MotorNeuron {
2      boolean transmitter_to_release, transmitter_received;
3      pointcut axon(): within(Effector) && execution(* receptor(..));
4      private boolean receptor() {
5          boolean transmitter_received = false;
6          return transmitter_received;
7      }
8      boolean around(): axon() {
9          transmitter_received = receptor();
10         //Motor Neuron received signal = transmitter_received
11         if (transmitter_received) transmitter_to_release = true;
12         else transmitter_to_release = false;
13         //Motor Neuron releases transmitter = transmitter_to_release
14         return transmitter_to_release;
15     }
16 }

4 Applying the AO Paradigm to ANN

We use AOP to implement an ANN to support one of our points, namely that the way AOP deals with cross-cutting is similar to that of real neural nets, which implies that we could apply ANN theory to large-scale AO systems, with the ANN as a by-product. An ANN can be implemented with other programming paradigms, but those implementations have the shortcomings mentioned in Section 1. With the paradigm described above, an artificial neural network can be simulated easily. To illustrate this, we present a simple example of a three-tier BP neural network that solves the MONK's problem I [14].

The MONK's problems rely on an artificial domain in which robots are described by six attributes:
x1: head shape ∈ {round, square, octagon}
x2: body shape ∈ {round, square, octagon}
x3: is smiling ∈ {yes, no}
x4: holding ∈ {sword, balloon, flag}
x5: jacket color ∈ {red, yellow, green, blue}
x6: has tie ∈ {yes, no}

The learning task is a binary classification task. Each problem is given by a logical description of a class. A robot belongs either to this class or not, but instead of providing a complete class description to the learner, only a subset of all 432 possible robots, with their classifications, is given. The learning task is then to generalize over these examples and, if the particular learning technique at hand allows it, to derive a simple class description. The MONK's problems comprise three tasks; one of them is the MONK's problem I, described below. Figure 7 illustrates the network architecture used for the MONK's problem I. Each unit in the Hidden1 layer represents one of the attributes.
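For orientation, the target concept of this first task (stated formally as Problem 4.1 below) is (head_shape = body_shape) or (jacket_color = red). A minimal Java sketch of that concept as a labeling predicate — not part of the paper's implementation, and assuming the usual integer encoding of the MONK's data files, in which attribute values are small integers and jacket_color = 1 denotes red:

    // Hypothetical helper illustrating how training examples for Monk1 are labeled.
    final class Monk1Concept {
        // head/body shape in {1,2,3}, smiling/hasTie in {1,2},
        // holding in {1,2,3}, jacketColor in {1,2,3,4} with 1 = red.
        static boolean isInClass(int headShape, int bodyShape, int smiling,
                                 int holding, int jacketColor, int hasTie) {
            return headShape == bodyShape || jacketColor == 1;
        }
    }

Of the 432 robots this predicate ranges over, 124 labeled examples form the training set used below.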
Problem 4.1 (Monk1): (head shape = body shape) or (jacket color = red). From the 432 possible examples, 124 were randomly selected for the training set. No noise is present.

Fig. 7. The network architecture for the MONK's Problem I. [15] (Layers: Input — head-shape, body-shape, smiling, holding, jacket-color, has-tie; Hidden1 with units labeled 0–5; Hidden2; Output.)

Fig. 8. The aspect-oriented design model of the network in Fig. 7. (The neurons are collected into Group0 … Group8 according to the target they synapse onto.)

Each aspect of the network represents a neuron (we call this an Aspect-Oriented Neural Network, AONN). The learning algorithm is a typical cross-cutting concern, which is modeled as an aspect with pointcuts that cross-cut every aspect that needs to learn (in this case, every aspect in all layers except the input layer); for simplicity it is not shown in the figure. The architecture of Figure 7 is mapped into the AO design space as shown in Figure 8. Using the programming technique described in Section 3.3, the design model can be implemented easily. We implement the neuron as an abstract aspect with one pointcut, dendrite().
1  public abstract aspect Neuron {
2      //some definitions
3      ...
4      //n is the number of inputs of the neuron
5      protected int n;
6      protected float[] input;
7      protected float[] weight;
8      protected Params params;  //e.g. weights and threshold of the neuron
9      protected ID id;          //identity of the neuron
10     protected OutPut output;  //transmitter
11     abstract pointcut dendrite();
12     after() returning(OutPut out): dendrite() {
13         if (!init) init();
14         input[out.getId().getNumber()] = out.getValue();
15         counter++;
16         if (!(counter < n)) {
17             counter = 0;
18             getOutPutWrapper();
19         }
20     }
21     public ID getID() {
22         return id;
23     }
24     /* The *Wrapper() methods just wrap the corresponding *OutPut(float net)
25        and *Params(ID id) methods so that the dendrite() pointcut picks out
26        join points within the correct aspect (see their implementations).
27        This is something of a trick. */
28     public abstract void init();
29     public abstract OutPut getOutPutWrapper();
30     public abstract OutPut getOutPut(float net);
31     public abstract float getNetInput();
32     public abstract Params getParamsWrapper();
33     public abstract Params getParams(ID id);
34 }

The abstract aspect SigmoidNeuron partly implements Neuron with a sigmoid activation function (line 6).

1  public abstract aspect SigmoidNeuron
2      extends Neuron {
3      //some definitions
4      ...
5      //sigmoid activation function
6      public OutPut getOutPut(float net) {
7          //calculate the output with the sigmoid function and return it
8          ...
9      }
10     public float getNetInput() {
11         //this executing point will be captured by the learning aspect
12         Params tempParams = getParamsWrapper();
13         //compute and return the net value
14         ...
15     }
16     public Params getParams(ID id) {
17         return params;
18     }
19 }
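The bodies elided above compute the standard weighted net input and logistic activation. A minimal sketch of that arithmetic, written as plain static helpers independent of the paper's OutPut and Params types (the threshold parameter is our assumption about what Params carries):

    final class SigmoidMath {
        // Net input: weighted sum of the inputs minus the neuron's threshold.
        static float netInput(float[] weight, float[] input, float threshold) {
            float net = 0f;
            for (int i = 0; i < input.length; i++) {
                net += weight[i] * input[i];
            }
            return net - threshold;
        }

        // Logistic (sigmoid) activation applied to the net input.
        static float sigmoid(float net) {
            return 1.0f / (1.0f + (float) Math.exp(-net));
        }
    }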
The concrete neurons are implemented as aspects and named by a simple rule. The name of a neuron is composed of two parts, the order name and the group name. We group neurons by the target they synapse onto: neurons with the same target are in the same group; for example, the neurons that project to hidden layer 2 are in Group 7. So every neuron has a group id. The position of the neuron within its group (from left to right) gives the order name, such as Neuron0, Neuron1, etc. (see also Figure 8).

1  aspect Neuron0Group7
2      extends SigmoidNeuron {
3      pointcut dendrite(): within(*Group0) && call(* getOutPut(..));
4      public void init() {
5          //initialize this neuron and set the flag
6          ...
7      }
8      public OutPut getOutPutWrapper() {
9          return getOutPut(getNetInput());
10     }
11     public Params getParamsWrapper() {
12         return getParams(id);
13     }
14 }

There are some differences among the neurons. The neurons in the input layer are the sensory neurons mentioned in Section 3.3: their dendrite() pointcut picks up join points in the effectors, which in this case are pattern generators (POJOs). In the neurons of the output layer we add a new method, finalOutPut(ID id), by which the learning aspect obtains the final output without knowing the scale of the net. Once the pattern generators have produced all the data, the output aspect calls finalOutPut(ID id) to output the result. Thus the net is ready to run together with a simple main class, and if the parameters are assigned correctly it returns the right results for all the patterns. But we want it to obtain the right parameters by itself, i.e. to learn.

Learning is a typical cross-cutting concern. It is implemented as the aspect Bp, with several pointcuts that cross-cut every aspect that needs to learn (in this case, every aspect in all layers except the input layer). The key to implementing the learning aspect is the parameter update. Once the output layer produces an output, the pointcut checkErr() captures it and checks whether the net needs to learn (i.e. whether the result is wrong or right), and the pointcut checkPass() triggers the calculation of the update using the learning algorithm (the BP algorithm in this case; it could easily be replaced by another learning algorithm). While the net is running, whenever parameters need to be refreshed the pointcut paramsUpdate() updates them.
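For reference, the update that calculate() in the Bp aspect is expected to produce is the standard error back-propagation rule for sigmoid units; in the usual notation (ours, not the paper's), with learning rate $\eta$, unit outputs $o$, targets $t$, and $w_{ij}$ the weight from unit $i$ to unit $j$:

\[
\Delta w_{ij} = \eta\,\delta_j\,o_i,\qquad
\delta_j =
\begin{cases}
(t_j - o_j)\,o_j(1-o_j) & \text{for an output unit } j,\\
o_j(1-o_j)\sum_k \delta_k\,w_{jk} & \text{for a hidden unit } j.
\end{cases}
\]

Threshold updates follow the same rule with a constant input of $-1$, assuming the net-input convention sketched earlier.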
1  public aspect Bp {
2      ...
3      //we do not care about the firing order of the neurons in the net,
4      //so the data are stored in hash tables
5
6      pointcut paramsUpdate(): if (learningFlag && hasUpdate)
7          && call(* getParams(..)) && within(Neuron*);
8      Params around(): paramsUpdate() {
9      }
10
11
12     pointcut checkErr(): within(Neuron*) && call(* finalOutPut(..));
13     after() returning(float v): checkErr() {
14     }
15
16     //check whether the input neuron fires, i.e. the end of one pass
17     pointcut checkPass(): if (learningFlag && isInputsReady && isParamsReady)
18         && within(Neuron*) && call(* finalOutPut(..));
19     after(): checkPass() {
20         ...
21         calculate();
22     }
23
24     //get all outputs of the neurons in the net
25     pointcut getAllOutPut(): if (learningFlag) && within(Neuron*) &&
26         call(* getOutPut(..));
27     after() returning(OutPut out): getAllOutPut() {
28     }
29
30     pointcut getAllParams(): if ((!onePass) && learningFlag) &&
31         within(Neuron*) && call(* getParams(..));
32     after() returning(Params tempParams): getAllParams() {
33         //store the params in a hash table and set the flag
34         ...
35     }
36     public void calculate() {
37         //calculate the weight updates using the BP algorithm
38         ...
39     }
40 }

Our preliminary experiment shows that AONN has good traits. First, the parallelism of the neural network is transformed automatically. Neural networks are parallel processing systems in which the neurons of the same layer fire in parallel, so the programmer is usually required to transform this parallelism manually into a serial procedure or a dialogue between objects. That is error-prone and makes large-scale neural networks hard to implement. In AONN, rather than caring about the details of the interactions between neurons (the transformation), we focus only on the synapses (join points); the transformation is completed automatically by weaving. (Using process algebras [16,17] we can prove the correctness of the weaving, but for space limitations this will be discussed in another paper.) Second, the approach used in AONN has high traceability: the problem space is mapped directly into the design space, and then into the programming space. Third, AONN has low change impact. We can add or delete neurons and neuron layers, or change the topology, with no impact on the rest of the system. Furthermore, by using dynamic AOP (whose implementations are still immature), such as JAC, JBoss AOP and Axon [18,19,20], the topology of the NN can even be changed at runtime.
Fourth, the implementation can be reused easily. The neurons implemented as aspects are cohesive and loosely coupled with the other parts of the system, so they can be used in another system with light modification. Last but not least, the learning algorithm can be changed on demand: the learning algorithm of the neural network is localized in one aspect, so it can be replaced easily.

On the other hand, we define an aspect for each neuron in the network, which requires a fair amount of effort to write even though we have tried carefully to reduce it. It would seem reasonable to create multiple instances from one neuron definition, but AspectJ's pointcut and advice mechanism does not allow arbitrary aspect instances to be instantiated and requires advised objects to be distinguished. Applying instance-level aspects [31,32] to represent the connections between neurons might be an interesting idea, and this is one of our future works.

The Aspect-Oriented paradigm, the candidate for the biologically-inspired programming paradigm discussed in Section 3.1, is not dedicated to implementing ANNs; it has been used to develop more general applications and can be used to develop biologically-inspired software, which is discussed, at an abstract level, in the next section.

5 A Biologically-Inspired Framework of Evolvable Software

The concepts of biologically-inspired computing have existed in computer science for decades. Early on, W. McCulloch and W. Pitts proposed artificial neural networks [21], John von Neumann presented cellular automata [22], Alan Turing did research on morphogenesis [23], and John H. Holland invented genetic algorithms [24]. More recently, new kinds of biologically-inspired computing have been presented, such as [25,26], and a mass of applications using biologically-inspired computing have been developed, such as [27,28,29,30].

We argue that, although most applications of biologically-inspired computing are implemented as software systems, software technology itself has not benefited from the ideas of biologically-inspired computing, owing to the incongruity between the software paradigm and the structure and mechanisms of biology (particularly nervous systems), such as the absence of a synapse-like communication mechanism in the Object-Oriented paradigm. As discussed earlier, the Aspect-Oriented paradigm is close to the desired paradigm. We believe it can be used to develop the biologically-inspired evolvable software proposed below (Figure 9).

The biologically-inspired framework of evolvable software consists of four layers. The Core Application Layer (CAL), analogous to the organs of an animal, includes only the orthogonal functions of the system; it is constructed with the OO paradigm and runs independently. The systemic functions, such as security, transactions, persistence and distribution, are implemented in the Reflex Aspects Layer (RAL), analogous to the spinal cord, using the Aspect-Oriented paradigm.
The RAL controls the functions of the core application in the CAL, and the effectiveness of its aspects is adjusted according to the changing environment. The RAL is what the Aspect-Oriented community focuses on now. At the top of the architecture, the Knowledge-Based AONN Layer (KBAL) serves as the "cortex" of the software system, controlling the core applications indirectly via the RAL. Owing to the combination of different biologically-inspired computing technologies, such as knowledge-based neurocomputing, genetic computing and synaptic (join point) plasticity, the KBAL learns and adapts to the changing environment (new requirements), and thus provides us with evolvable software.

Fig. 9. The architecture of Aspect-Oriented evolvable and adaptive software and its biological analog. (Layers and their analogs: Core Application Layer — organs; Reflex Aspects Layer — spinal cord; Knowledge-Based AONN Layer — cortex; all embedded in the Environment.)
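To make the RAL idea concrete, here is a minimal AspectJ sketch (ours, not from the paper) of a reflex aspect adding one systemic concern, persistence, around core-application operations; CoreEntity, its setter and the enabled flag are hypothetical names, and in the proposed framework the KBAL, rather than a person, would adjust the flag:

    class CoreEntity {
        private String name;
        public void setName(String name) { this.name = name; }
        public String getName() { return name; }
    }

    public aspect PersistenceReflex {
        // Effectiveness switch that an upper layer (the KBAL) could adjust.
        public static volatile boolean enabled = true;

        // Reflex: react to every state change in the core application layer.
        pointcut stateChange(CoreEntity target):
            target(target) && execution(public void CoreEntity.set*(..));

        after(CoreEntity target): stateChange(target) {
            if (enabled) {
                // Stand-in for the real persistence action.
                System.out.println("persisting " + target.getName());
            }
        }
    }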
6 Conclusions

To handle the non-orthogonal concerns (cross-cutting) that the Object-Oriented paradigm does not deal with effectively, we have proposed a biologically-inspired programming paradigm drawing lessons from nervous systems. There are three considerations for augmenting the OO paradigm to handle cross-cutting: 1) add a new communication mechanism similar to synaptic communication; 2) differentiate the "neuron" from the "effector" (object); 3) add a mechanism supporting "synaptic plasticity". The Aspect-Oriented paradigm, which at present satisfies considerations 1) and 2), is a candidate for the desired paradigm. It has been used here for ANN simulation, named Aspect-Oriented Neural Networks (AONN), to support our ideas, and the preliminary experiment shows that AONN has many good traits. Although attention to consideration 3) (i.e. dynamic AOP) is increasing, it has not yet been well implemented; research on synaptic plasticity in neuroscience may give us more inspiration for achieving it. In addition, we have proposed a preliminary biologically-inspired evolvable software framework which can be implemented using the AO paradigm combined with biologically-inspired computing technologies such as neurocomputing and genetic computing.

References

[1] George D. Manioudakis and Spiridon D. Likothanassis, An Object-Oriented Toolbox for Adaptive Neural Networks' Implementation, International Journal on Artificial Intelligence Tools 10, No. 3 (2001), 345–371.

[2] Ellingsen, B.K., An Object-Oriented Approach to Neural Networks, Technical Report No. 45, ISSN 0803-6489, UIB-IFI. URL: http://citeseer.ist.psu.edu/ellingsen95objectoriented.html.

[3] Dijkstra, Edsger W., "A Discipline of Programming," Englewood Cliffs, NJ: Prentice-Hall Inc., 1976.

[4] Kiczales, G., Lamping, J., Mendhekar, A., Maeda, C., Lopes, C.V., Loingtier, J.M., Irwin, J., Aspect-Oriented Programming, in: Proceedings of the European Conference on Object-Oriented Programming (ECOOP), LNCS 1241, Springer-Verlag, 1997, 220–242. URL: http://citeseer.nj.nec.com/63210.html.

[5] Soares, S., and Borba, P. (2002), "Progressive implementation with aspect-oriented programming," in: The 12th Workshop for PhD Students in Object-Oriented Systems, ECOOP 02, Springer-Verlag.

[6] Elrad, T., Filman, R.E., and Bader, A., "Aspect-oriented programming," Communications of the ACM, vol. 44, no. 10 (2001).

[7] F.Q. Yang, H. Mei, J. Lu and Z. Jin, Some Discussion on the Development of Software Technology (in Chinese), Chinese Journal of Electronics, Vol. 30, No. 12A, 1901–1906, 2002.

[8] William K. Purves, David Sadava, Gordon H. Orians, Craig Heller, "Life: The Science of Biology," 7th Ed., W. H. Freeman, Bedford, 2004.

[9] E. R. Kandel, J. H. Schwartz, T. M. Jessell, "Essentials of Neural Science and Behavior," McGraw-Hill, 1996.

[10] Bailey, C.H., and Chen, M., Morphological basis of long-term habituation and sensitization in Aplysia, Science 220:91–93.

[11] The AspectJ Team, The AspectJ Programming Guide, with associated web site http://eclipse.org/aspectj.

[12] L.C. Wang, X.F. Tang, NBO: An Approach for Aspect-Oriented Software Development in Light of Neurobionics, submitted.

[13] Taylor, D.A., "Object-Oriented Technology: A Manager's Guide," New York: Addison Wesley, 1990.
[14] Thrun, S.B., Bala, J., Bloedorn, E., Bratko, I., Cestnik, B., Cheng, J., De Jong, K., Dzeroski, S., Fahlman, S.E., Fisher, D., Hamann, R., Kaufman, K., Keller, S., Kononenko, I., Kreuziger, J., Michalski, R.S., Mitchell, T., Pachowicz, P., Reich, Y., Vafaie, H., Van de Welde, W., Wenzel, W., Wnek, J., Zhang, J. (1991), The MONK's Problems: A Performance Comparison of Different Learning Algorithms, Technical Report CMU-CS-91-197, Carnegie Mellon University.

[15] Masumi Ishikawa, Structural learning and rule discovery from data, in: S. Amari and N. Kasabov, Eds., Brain-Like Computing and Intelligent Information Systems, Chapter 16, pp. 396–415, Springer (1998).

[16] C.A.R. Hoare, "Communicating Sequential Processes," Prentice-Hall, Englewood Cliffs, NJ, 1985.

[17] James H. Andrews, Process-Algebraic Foundations of Aspect-Oriented Programming, in: Proceedings of the Third International Conference on Metalevel Architectures and Separation of Crosscutting Concerns, pp. 187–209, 2001.

[18] Pawlak, R., L. Seinturier, L. Duchien, and G. Florin, "JAC: A Flexible Solution for Aspect-Oriented Programming in Java," in: Metalevel Architectures and Separation of Crosscutting Concerns (Reflection 2001), LNCS 2192, pp. 1–24, Springer, 2001.

[19] http://www.jboss.org/developers/projects/jboss/aop.

[20] Swen Aussmann, Michael Haupt, Axon — Dynamic AOP through Runtime Inspection and Monitoring, ECOOP'03 Workshop on Advancing the State of the Art in Runtime Inspection (ASARTI), 2003.

[21] W. McCulloch and W. Pitts, A Logical Calculus of the Ideas Immanent in Nervous Activity, Bulletin of Mathematical Biophysics, Vol. 5, 1943: 115–133.

[22] John von Neumann, Theory of Self-Reproducing Automata, University of Illinois Press, 1966 (originally published in 1953).

[23] Alan Turing, The Chemical Basis of Morphogenesis, Philosophical Transactions of the Royal Society B (London), 1952.

[24] John H. Holland, Genetic algorithms and the optimal allocation of trials, SIAM Journal on Computing, 2:88–105, 1973.

[25] Gh. Paun, Computing with Membranes, J. Comput. System Sci. 61(1) (2000), 108–143. (See also Turku Center for Computer Science TUCS Report No. 208, 1998, www.tucs.fi.)

[26] Abelson et al., Amorphous Computing, Communications of the ACM, Volume 43, Number 5, May 2001.

[27] M. Wang and T. Suda, "The bio-networking architecture: A biologically inspired approach to the design of scalable, adaptive, and survivable/available network applications," in: Proceedings of the 1st IEEE Symposium on Applications and the Internet (SAINT), San Diego, CA, IEEE, 8–12 January 2001.
[28] Dario Floreano and Joseba Urzelai, Neural Morphogenesis, Synaptic Plasticity, and Evolution, Theory in Biosciences, vol. 120, no. 3–4, pp. 225–240(16), Urban & Fischer, 2001.

[29] Dan C. Marinescu, Ladislau Boloni, Biological Metaphors in the Design of Complex Software Systems, Future Generation Computer Systems, Vol. 17 (2001), 345–360.

[30] Eduardo Sanchez, Daniel Mange, Moshe Sipper, Marco Tomassini, Andrés Pérez-Uribe, André Stauffer, Phylogeny, Ontogeny, and Epigenesis: Three Sources of Biological Inspiration for Softening Hardware, in: Proceedings of the First International Conference on Evolvable Systems: From Biology to Hardware, pp. 35–54, October 7–8, 1996.

[31] Hridesh Rajan and Kevin Sullivan, Eos: Instance-Level Aspects for Integrated System Design, in: 9th European Software Engineering Conference (ESEC) and 11th ACM SIGSOFT International Symposium on the Foundations of Software Engineering (FSE-11), pp. 297–306, 2003.

[32] Kouhei Sakurai, Hidehiko Masuhara, Naoyasu Ubayashi, Saeko Matsuura and Seiichi Komiya, Association Aspects, in: Proceedings of the 3rd International Conference on Aspect-Oriented Software Development (AOSD'04), 2004.