Neural Concept Network
Ver 0.3
1
Index
1. Introduction
2. Overview
3. Conception
4. Concept Network function
5. Neural Network function
6. Neural Concept Network function
7. Reference materials
2
1. Introduction
3
Precaution
• This document is a work in progress.
• Some slides are development notes and may be removed in
the official version.
• This document uses animations. Slides marked with the
following icon contain animation; use the PowerPoint
version, because the PDF version cannot play animations.
animation
4
Self-introduction
• Name
– Akihiro Yamamoto
• Twitter
– A_Ym
• Focus on
– AI, Artificial Consciousness (AC), Neural Network, Concept
theory, Brain, Quantum Info, OpenCL, C#, MS Azure…
And BOOM BOOM SATELLITES!!
5
2. Overview
6
What’s Neural Concept Network
• Neural Concept Network is a directed network for
representing, searching, analyzing, learning, and
forming concepts.
• It consists of the following two functions:
– Concept Network:
a function for representing concepts
– Neural Network:
a function for searching, analyzing, learning, and forming
concepts
It is abbreviated as NCN where needed.
7
Features of Neural Concept Network
• NCN can represent concepts in a form that can be
understood by those who do not have knowledge of
neural networks or mathematics.
• It can also represent “relative, hierarchical, context-
sensitive, self-inclusive, self-referential” concepts, which
are complicated to express in conventional concept representations.
• It has the functionality of a simplified Spiking Neural
Network (SNN) and is used for concept search, (top-
down, bottom-up) analysis, learning and formation.
8
3. Conception
9
Conception-1
• NCN is one step toward realizing HLAI (Human-Level
Artificial Intelligence), with the ability to think like human
beings.
• At present, the gap between the top-down approach and
the bottom-up approach to realizing HLAI is too wide,
so I thought a middle-out approach was needed.
• I thought it necessary to implement the representation
and the processing of concepts as the function at that
middle point.
10
HLAI Roadmap
Human Intelligence Artificial Intelligence
Human Neural Network / Artificial Neural Network
Concept
Semantics
Language
Thought
Planning
Recognition
Perception
Sense
Cognition
11
HLAI Roadmap points
• The point is that Concept is located below Language and
Semantics.
• At present, the common approach is to move on to
higher-order concept processing only after language
processing is realized.
• However, I think language processing cannot be achieved
unless concept processing, and then semantic processing,
are realized first.
12
Conception-2
• When I considered a directed network structure for
concept representation, what I arrived at resembled a
neural network.
• When I built SNN functionality on it and ran it, the
results proved useful for analyzing concepts.
13
4. Concept Network function
14
Define the concept
• It is difficult to define the concept of "concept" itself.
• Therefore, it is defined by the following circular
representation:
A concept represents a certain concept through its
relations to other concepts.
• The following sections describe the relation
representation (RR) of concepts in NCN.
15
RR1 – basic-relation
• NCN represents that concept "A" relates to concept "B"
using a node and a directed edge, as follows.
• A node is called a concept, and a directed edge is called
a relation.
• The concept at the starting/ending point of a relation is
called the source/destination concept.
(source) concept A ──relation (origin)──▶ (destination) concept B
16
Objects of concept
• Here, "A" and "B" are just labels for clarity; in fact, a
concept's content may be text, voice, an image, or
another neural network.
• Internally, a concept is identified by a UUID.
17
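As a rough illustration, the structures above might be sketched as data types. This is a minimal sketch; the class and field names are my assumptions, not the actual NCN implementation.

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class Concept:
    """A node; the payload may be text, voice, an image, or another network."""
    payload: object
    id: uuid.UUID = field(default_factory=uuid.uuid4)  # internal identity

@dataclass
class Relation:
    """A directed edge from a source concept to a destination concept."""
    source: Concept
    destination: Concept

human, hand = Concept("human"), Concept("hand")
r = Relation(human, hand)  # "A human relates to a hand."
print(r.source.payload, "->", r.destination.payload)
```

The UUID keeps identity independent of the payload, so two concepts with the same label remain distinct.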
• Self-relations, mutual relations, and circular relations are
represented as follows.
A B A B
C
A
18
Example of basic-relation:
A human relates to a hand.
human hand
19
RR2 – multiple-relations
• Multiple relations are represented as follows.
A
B
C
C
B
A
C
A
B
C
A
B
20
Example of multiple-relations:
A human relates to a hand and a foot.
human
hand
foot
foot
hand
human
or
21
RR3 – sub-relation
• Further qualifying a relation with another concept is called
a sub-relation.
• The following represents concept "A", which has a
relation of "C" to "B". If relation1 is the origin,
relation2 is a sub-relation.
A B
C
relation2
(sub-relation)
relation1
(origin)
22
• Conversely, if relation2 is the origin, relation1 is called
the super-relation.
A B
C
relation2
(origin)
relation1
(super-relation)
23
Example of sub-relation:
human relates to hand, that "have".
→ (A) human has a hand.
human hand
have
24
Sentence type and order of relation (1/2)
• In this example, the relation is defined in Japanese word
order, but in NCN the sequence of relations carries no
grammatical meaning, so you may define the relation in
English word order instead.
• What meaning is derived from these conceptual structures
depends on the interpreting side.
human hand
have
human have
hand
Japanese sentence type English sentence type 25
Sentence type and order of relation (2/2)
• When the active/passive distinction matters, relative
representation (described later) is easier with English
word order.
• For intransitive verbs, Japanese word order is easier to
express relatively.
26
RR4 – nested-relation
• A nested relation, which further qualifies a sub-relation
with another sub-relation, is represented as follows.
A B
C
D
27
A B
C
D
• The top-level relation is called the "owner-relation"
when relation3 is the origin.
relation3
(origin)
relation2
(super-relation)
relation1
(owner-relation)
28
Example1 of nested-relation:
human relates to hand, that "have", qualified by "two".
→ (A) human has two hand(s).
human hand
have
two
29
Example2 of nested-relation:
human relates to hand and foot, that "have", qualified by
"two".
→ (A) human has two hands and feet.
human hand
have
two
foot
30
Example3 of nested-relation:
The network of Example2 can be simplified as follows.
human
hand
have
two
foot
31
Limits of three-term expression
• The relation representation of concepts is similar to RDF
(Resource Description Framework) or a graph database.
– ex: RDF represents a relation by a triple (subject, predicate, object).
• When dealing with real-world information, the three-term
expression is not enough.
• NCN also represents the relation itself, which
corresponds to the RDF predicate, as a concept, and can
combine it with sub-relations to create a more realistic
representation of a concept.
32
RR5 - relativity
• The degree of relationship (relativity) of a sub-relation
is represented by 0.0 < relativity < 1.0.
• The closer to 0.0, the closer the attachment point is to
the source concept; the closer to 1.0, the closer it is to
the destination concept of the relation.
A B
C
D
relativity = 0.25
relativity = 0.5 E
relativity = 0.75
33
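A sub-relation with its relativity might be modeled as an attachment point on the parent relation. This is a hypothetical sketch; the class names and the `attach` helper are my own, not NCN's API.

```python
from dataclasses import dataclass, field

@dataclass
class Relation:
    source: str
    destination: str
    subs: list = field(default_factory=list)  # (relativity, Relation) pairs

    def attach(self, relativity: float, sub: "Relation") -> None:
        # RR5: values near 0.0 sit near the source concept,
        # values near 1.0 near the destination concept.
        if not 0.0 < relativity < 1.0:
            raise ValueError("relativity must be strictly between 0.0 and 1.0")
        self.subs.append((relativity, sub))

origin = Relation("A", "B")
origin.attach(0.25, Relation("A", "C"))  # qualifies near the source side
origin.attach(0.50, Relation("A", "D"))
origin.attach(0.75, Relation("A", "E"))  # qualifies near the destination side
```

The strict bounds mirror the slide's 0.0 < relativity < 1.0 condition, so a sub-relation can never coincide with an endpoint concept.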
Example of relativity:
(A) human (have)has two hand(s) with five fingers and
foot(feet).
human hand
have
two
finger
five
34
• The relativity and the order of sub-relations carry no
grammatical meaning.
• As shown below, the order of emphasis may differ even
for the same fact.
Mr. Smith
yesterday
this road
passed through yesterday Mr. Smith
this road
passed through
yesterday this road
Mr. Smith
passed through
Mr. Smith passed through this road yesterday. / Yesterday, Mr. Smith passed through this road. / Yesterday and this road, Mr. Smith passed through.
Mr. Smith が昨日この道を通った。 / 昨日 Mr. Smith がこの道を通った。 / 昨日この道を Mr. Smith が通った。
35
• Since English and Japanese grammar may not be able to
reproduce the intended order of emphasis of the
concepts, it must be supplemented by text decoration in
writing, or by inflection, gestures, etc. in an actual
conversation.
Mr. Smith "passed through" this road yesterday.
Mr. Smith が昨日この道を ”通った” 。
Mr. Smith
yesterday
this road
passed through
36
• On the other hand, a sentence with constraints, such as
poetry, might evoke multiple complex conceptual
structures.
37
Sliding vector representation of
conceptual networks
• If a concept network is interpreted as sliding (or line)
vectors, could it be handled by computational-graph
neural networks?
co1
co3
co4
co2
𝑎
𝑏
𝑐
𝑎 = 𝑐𝑜2 − 𝑐𝑜1
𝑏 = 𝑐𝑜3 − 𝑎 ∗ 0.5
𝑐 = 𝑐𝑜4 − 𝑏 ∗ 0.5
38
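Taking the slide's formulas literally, the sliding vectors can be computed from coordinates. The 2-D coordinates for co1..co4 below are made up for illustration; how concepts would actually map to an embedding space is an open question in the slide itself.

```python
# Literal reading of: a = co2 - co1, b = co3 - a * 0.5, c = co4 - b * 0.5
def sub(p, q):
    return (p[0] - q[0], p[1] - q[1])

def scale(p, s):
    return (p[0] * s, p[1] * s)

co1, co2, co3, co4 = (0.0, 0.0), (2.0, 0.0), (1.5, 1.0), (2.0, 2.0)
a = sub(co2, co1)            # a = co2 - co1
b = sub(co3, scale(a, 0.5))  # b = co3 - a * 0.5
c = sub(co4, scale(b, 0.5))  # c = co4 - b * 0.5
print(a, b, c)  # → (2.0, 0.0) (0.5, 1.0) (1.75, 1.5)
```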
Grammatical representation
• The idea here is that grammatical information is added
explicitly and separately.
Mr. Smith
passed through
yesterday
this road
S
V
O
M
grammar
39
• A sub-relation can be represented relatively, as follows.
• (In the context of A,) B is related to C.
A
RR6 - relative-representation
A
C
C
B
B
40
A
CB
D
• When a nested relation is represented relatively, it
becomes as follows.
• (In the context of A,) B is related to C, that D.
A
C
B
D
41
• When a sub-relation inside a relative representation is
itself represented relatively, it becomes a hierarchical
representation, as follows.
• (In the context of A-B,) C is related to D.
A
C
B
D
A
B
C D
42
relativity-representation
in relative representation
• In relative representation, the relativity information
disappears, so its representation needs to be
considered.
A
A
C
C
B
B
D
E
D
E
Idea1. Color representation
A
CB
D
E
Idea2. 3D representation
43
Examples of relative-representation:
• A human has two legs with five fingers.
• An insect has six legs with fingers.
human foot
have
two
finger
five
insect
six
human
foot
havetwo
finger
five
insect
foot
havesix
finger
44
Example1 of complex relative-representations:
Self-reference, self-inclusion representation
(In me,) He may think I'm a delicate man, but I’m
bold.
I
(me)
he
bold
delicate
I (me)
he
I
delicate
bold
45
Example2 of complex relative-representations:
Self-inclusion representation + time representation
WIP
46
Example3 of complex relative-representations:
Triangular relationships
WIP
47
Example4 of complex relative-representations:
Write a love letter in NCN
WIP
48
Exception representation
WIP
• Generally, crows are black and swans are white.
• However, there are exceptions such as albinism and
melanism: white crows and black swans exist.
• Such exceptions must be represented while preventing
the catastrophic forgetting (interference) they could
cause.
49
Examples of existing logical
representations
NCN can support existing logical representations, such as:
• Top-down analysis
– Fishbone diagram
– Mind Map
• Bottom-up analysis
– KJ method
• Meaning description
– RDF/OWL
• Structure description
– UML
– ER diagram
– Graph Database
50
Comparing association representations in
UML
UML (Class Diagram):
ParentClass 1 ──food── 0..* ChildClass
NCN:
Parent ──▶ Child, with "food", "1", "0..*" attached as sub-relations
51
Chomsky’s Generative grammar
• When grammatical information is added to the nested-
relation representation of a concept, the grammatical
structure can be represented.
WIP
52
Summary of relation representation of
concepts
• Frame representation is incorporated into the network
structure itself, enabling recursive, relative
representations of concepts that were difficult in
conventional logical representations.
53
5. Neural Network function
54
Neural Network function
• An SNN uses the temporal change of neuronal potential
to represent and process information. It is closer to
biological neurons than conventional computational-graph
neural networks, allowing more flexible information
representation and processing.
• NCN also gains processing power from branched
structures equivalent to neurites (or nerves), such as
axons and dendrites.
55
Pros & Cons of Neural Network function
• Pros
– Dynamic networks can be formed.
– Signals and processing can be superposed.
• Cons
– Computation is costly.
– There may be restrictions similar to those of humans.
56
Spiking Neural Network Function
NCN has parameters for spiking neural networks in
addition to the parameters of conventional computational-
graph neural networks.
• conventional computational-graph NN parameters
– Weight: synaptic weight; signed real number
– Potential: signed real number (mV)
– Threshold: positive real number (mV)
• spiking NN parameters
– Attenuation rate:
time decay rate of the potential; positive real number
(mV/msec)
– Refractory period: positive real number (msec)
57
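The parameters above could combine into a single leaky-integrate-and-fire-style unit like the following. This is a sketch under assumed semantics (linear decay, reset to zero after a spike, decay applied per received event); the names follow the slide but the update rule is my own, not the actual NCN code.

```python
class SpikingNeuron:
    def __init__(self, threshold=1.0, attenuation=0.008, refractory=2.0):
        self.potential = 0.0            # mV
        self.threshold = threshold      # mV
        self.attenuation = attenuation  # mV/msec potential decay
        self.refractory = refractory    # msec with no response after a spike
        self.last_spike = None          # msec timestamp of the last firing
        self.last_input = 0.0           # msec timestamp of the last input

    def receive(self, t, weight):
        """Decay since the last input, add the weighted input, and fire
        when the threshold is crossed outside the refractory period."""
        if self.last_spike is not None and t - self.last_spike < self.refractory:
            return False
        self.potential = max(0.0, self.potential - self.attenuation * (t - self.last_input))
        self.last_input = t
        self.potential += weight
        if self.potential >= self.threshold:
            self.potential = 0.0
            self.last_spike = t
            return True
        return False

n = SpikingNeuron()
print(n.receive(0.0, 0.9), n.receive(10.0, 0.9))  # → False True
```

One sub-threshold input leaves the potential at 0.9; a second input 10 ms later arrives before much decay, crosses 1.0, and fires.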
• The Neural Network function renames the structures used
by the Concept Network function as follows.
neuron
neurite
synapse
58
Firing function
• NCN has no input or output layer; any neuron can be
treated as an input or output neuron.
• The directed edges and relations used for I/O are
represented separately.
input/output
relation
animation
59
Combination with other NN
• It is also assumed that inputs and outputs can be
connected to other types of neural networks, such as
DNNs.
60
• When the input frequency and signal amount exceed the
attenuation of the potential, the firing frequency
becomes 1/n of the input frequency.
• This is because even when a single input is below the
threshold, the second or third input can push the
potential beyond it.
Figure (animation): input 13.0 pps, threshold = 1.0
– weight = 1.0 → 13.0 pps (firing rate 1/1)
– weight = 0.9 → 6.5 pps (firing rate 1/2)
– weight = 0.9 → 0.0 pps (no further firing due to the
attenuation characteristics of the potential)
pps: pulses per second
61
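The 1/n division can be checked with a small event-driven simulation. The leak value and the reset-to-zero rule are my assumptions, chosen so the chain reproduces the slide's 13.0 → 6.5 → 0.0 pps example; 6 spikes in the one-second window corresponds to the steady 6.5 pps rate.

```python
# Three-stage chain fed at 13 pps for one second. Potential decays linearly
# at `leak` units/ms, clamps at 0, and resets to 0 after a spike (assumptions).
def simulate_chain(input_times, weights, threshold=1.0, leak=0.008):
    spikes = list(input_times)
    counts = []
    for w in weights:
        v, last_t, out = 0.0, None, []
        for t in spikes:
            if last_t is not None:
                v = max(0.0, v - leak * (t - last_t) * 1000)  # decay in ms
            last_t = t
            v += w  # weighted input spike
            if v >= threshold:
                out.append(t)
                v = 0.0
        counts.append(len(out))
        spikes = out  # this stage's spikes feed the next stage
    return counts

inputs = [n / 13 for n in range(13)]  # 13 pps over one second
print(simulate_chain(inputs, [1.0, 0.9, 0.9]))  # → [13, 6, 0]
```

Stage 2 needs two 0.9 inputs to cross 1.0, halving the rate; stage 3's slower input decays away entirely before the next spike arrives, so it never fires.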
• Due to the attenuation characteristics of the potential, a
neuron stops responding when its distance (number of
stages) from the input neuron becomes large.
• This is an important feature that prevents infinite firing
loops in NCN, whose networks may contain cycles.
• Similarly, a higher input frequency causes a wider
propagation range.
• This feature is used to control the range affected by the
input signal.
62
Harmonic sound
• This may be related to the mechanism of harmonics,
whose frequency components are integer multiples, or
unit fractions, of the fundamental tone.
63
Frequency Coding
• There are various theories about how information is
represented in the brain; here are two:
– rate coding theory
– temporal coding theory
• NCN uses frequency coding, which combines features of both.
• Frequency coding measures how strongly a neuron fires in
sync with the frequency of an input signal, and treats that
as the degree of the relation.
• By using coprime input frequencies with a sufficiently large
least common multiple, it is possible to determine how
strongly the network reacts to each input frequency,
regardless of the propagation path.
64
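The "coprime with a large least common multiple" condition can be illustrated directly. `lcm` is computed from `math.gcd` for compatibility with older Python versions.

```python
from math import gcd
from functools import reduce
from itertools import combinations

def lcm(*ns):
    # lcm(a, b) = a*b / gcd(a, b), folded over all arguments
    return reduce(lambda a, b: a * b // gcd(a, b), ns)

freqs = (7, 11, 13)  # pairwise coprime input frequencies (pps)
assert all(gcd(a, b) == 1 for a, b in combinations(freqs, 2))
print(lcm(*freqs))  # → 1001: the pulse trains only realign every 1001 time units
```

Because the trains coincide so rarely, each channel's contribution to a downstream firing pattern stays separable.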
Information expression using frequency
• Information can be expressed with frequency in the
following ways:
– PM: Phase modulation
– FM: Frequency modulation
– AM: Amplitude modulation
• Of these, AM seems to require a population code over
multiple neurons rather than a single neuron, because of
the all-or-none law of neural firing.
65
Image of the firing cycle (13.0pps)
Time
13.00 pps
→ 13/1
6.50 pps
→ 13/2
4.33 pps
→ 13/3
3.25 pps
→ 13/4
2.60 pps
→ 13/5
2.16 pps
→ 13/6
1.0s
animation
66
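The subharmonic rates in these firing-cycle figures are simply the input frequency divided by successive integers (note the slides truncate rather than round, e.g. 13/6 appears as 2.16):

```python
# Subharmonics of a 13.0 pps input: each deeper stage fires at 1/n of it.
base = 13.0
rates = [round(base / n, 2) for n in range(1, 7)]
print(rates)  # → [13.0, 6.5, 4.33, 3.25, 2.6, 2.17]
```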
Image of the firing cycle (11.0pps)
Time
11.0 pps
→ 11/1
5.5 pps
→ 11/2
3.66 pps
→ 11/3
2.75 pps
→ 11/4
2.2 pps
→ 11/5
1.83 pps
→ 11/6
1.0s
animation
67
Image of the firing cycle (7.0pps)
Time
7.0 pps
→ 7/1
3.5 pps
→ 7/2
2.33 pps
→ 7/3
1.75 pps
→ 7/4
1.4 pps
→ 7/5
1.16 pps
→ 7/6
1.0s
animation
68
Image of firing cycle when using three coprime frequencies
Time
1.0s
animation
69
Image of firing cycle when using three coprime frequencies
(Monochrome version)
Time
1.0s
animation
70
• The following is an example of the reaction when
coprime frequencies are colored in RGB as a channel
and input from different neurons.
Input 13.0 pps
channel A
Input 11.0 pps
channel B
Input 7.0 pps
channel C
6.5 pps
channel A 50%
3.25 pps
channel A 25%
channel A 25%
channel B 25%
channel B 25%
channel C 25%
5.5 pps
channel B 50%
3.5 pps
channel C 50%
1.75 pps
channel C 25%
channel A
12.5%
channel B 12.5%
channel C 12.5%
channel A 25%
channel C 25%
3.25 pps
channel B 25%
71
Summary of frequency coding
• By using coprime frequencies for the input channels, the
probability that the frequencies interfere within a unit time
is very low.
• Using this characteristic, multiple channels can be
separated or superposed in the input, output, and
processing of information.
• By analyzing the output signal per frequency channel, it
is possible to determine which input frequency channel
each neuron is reacting to, regardless of the path.
• Therefore, compared to computational-graph NNs, the
network is less likely to become a black box.
72
Actual behavior when frequency is
superposition
• In practice it does not work so well, because when nearby
frequencies are superposed, each is affected by the
potential raised by the others.
• There are ways to eliminate this effect, but the
interference itself might serve as an information
representation; which is better needs verification.
73
Frequency Channel Combinations
• The requirements for combining frequency channels are
that they are coprime and that the ratio of their differences is small.
• Any three consecutive integers starting from an odd number are
pairwise coprime.
• Examples:
(1, 2, 3), (3, 4, 5), (5, 6, 7), (7, 8, 9), (9, 10, 11), (11, 12,
13), (13, 14, 15), …, (41, 42, 43), (43, 44, 45)
• Higher frequencies reduce the ratio of the frequency differences.
• However, at a time resolution of 0.1 ms, interference
occurred at the combination (43, 44, 45) and above.
74
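The claim that three consecutive integers starting from an odd number are pairwise coprime can be verified mechanically:

```python
from math import gcd
from itertools import combinations

def pairwise_coprime(nums):
    return all(gcd(a, b) == 1 for a, b in combinations(nums, 2))

# Consecutive integers are always coprime, and n and n+2 are both odd
# when n is odd, so their gcd divides 2 and must be 1.
triples = [(n, n + 1, n + 2) for n in range(1, 100, 2)]
print(all(pairwise_coprime(t) for t in triples))  # → True

# An even start can fail: 6 and 8 share the factor 2.
print(pairwise_coprime((6, 7, 8)))  # → False
```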
Relationship with Gödel Number
• Expressing conceptual structure by combining coprime
integers at 1/n or n-times frequencies resembles the
Gödel numbering used in "Gödel's incompleteness theorems".
75
Periodic firing, consciousness,
concentration
Is the control of the frequency channel related to:
• consciousness, selection, and concentration?
• gamma rhythms and burst firing?
• the capacity of short-term memory (about 4±1 chunks*)?
• Some say it has nothing to do with the binding problem.
• The boundary between consciousness and the unconscious
may be as simple as sound: when a pattern of sound-pressure
changes repeats more than a certain number of times, with a
period below a certain length, humans perceive it as a single
tone. Consciousness may work the same way.
* It used to be 7±2 chunks, known as the magic number. 76
Frequency Representation and
Parallelism
• The same processing achieved by frequency-based range
control and channel representation could be achieved by
attaching the range and channel information directly to
the signal.
• But that makes parallel processing difficult, and
performance may degrade as the scale increases.
• I think it is better to express everything by "wave
superposition and time", as in quantum mechanics, and
collapse to particles only when reading out information.
77
Back-firing
• NCN has a back-firing function: in addition to forward
propagation, signals can propagate backward, against the
direction of relations, for top-down and bottom-up
analysis of concepts represented by the Concept Network
function.
• The following slides show the processing order of
forward firing and back-firing.
78
• (forward) firing
animation
1. Input signal
2. Increase/decrease in potential
3. propagate potential
4. increase/decrease
in potential by weight
79
• back-firing
animation
1. Input backward signal
2. Increase/decrease in potential
3. back propagate potential
4. increase/decrease in potential
80
Types of ions
• Virtual ions and ion channels are used to achieve forward
firing and back-firing.
• In addition, three frequency channels, phase, and input
amount are combined to form an input signal.

              forward         backward
excitatory    fo-ex (+1eV)    ba-ex (+1eV)
inhibitory    fo-in (-1eV)    ba-in (-1eV)
81
Types of Ions and Quantum
chromodynamics
• The combination of frequency channel, forward/backward
direction, and excitatory/inhibitory property may be
analogous to the color charge and the top/bottom,
strange/charm, up/down flavors that characterize
quarks.
82
Reproduction of back firing function
in biological neurons
• Back-propagation by electrical synapses
– In a chemical synapse, signals propagate forward only.
– In an electrical synapse, however, signals propagate both forward
and backward.
– Electrical synapses are present in inhibitory neurons of the
hippocampus and cerebral cortex.
• Back-propagation on dendrites
– Potential changes may propagate backward along dendrites.
→ This seems difficult to reproduce with a simple network.
83
Closer to biological neurons
• To reproduce the propagation velocity of axons and
dendrites, a parameter called width (thickness) is introduced.
width = 0.5
width = 1.0
width = 2.0
animation
84
Axons and dendrites
• The propagation velocity of the axon (myelinated nerve)
is fast.
• The propagation velocity of the dendrites is slow.
• Activity potential may also occur on the dendrites, called
dendritic-spike.
85
Propagation velocity of biological neurite
• A thicker biological neurite propagates action potentials
faster than a thinner one.
• A potential below the threshold propagates while
attenuating; in that case, a thicker neurite may
propagate the potential more slowly than a thinner one.
86
Collision between forward, and back
propagation signal
• NCN uses backward propagation not only for learning but
also for analyzing concepts.
• When the forward signal from the source neuron
(concept) and the backward signal from the destination
neuron (concept) are input at the same frequency (or at
1/n or n times it), the collision point on the neurite can
be moved by changing the phase of the input signals.
87
• It is possible to estimate the structure of the neural
(concept) network by analyzing the reactions of each
neuron.
Input forward signal Input backward signal
animation
88
Signal collisions less than the threshold
• When a sub-threshold forward-propagating signal collides
with a sub-threshold back-propagating signal and their
combined potential exceeds the threshold, an action
potential occurs in the middle of the neurite.
• This state can also be used to analyze the network
structure.
89
Relative representation of concepts
by phase control
• Controlling the collision point by controlling the phase of
the forward and backward signals corresponds to moving
the position of the relative representation within the
conceptual network.
90
Structural representation by reaction
timing
• By changing the frequency and phase of the top-down
and bottom-up signals and analyzing the firing reactions,
it is possible to infer the structure of the network.
• This means the network structure can be encoded in
firing timing and rate.
• Of course, there is no point in inferring the structure of a
network that is already defined.
• But if this signal can itself be processed by a neural network,
a meta neural network that dynamically represents and
processes a virtual neural network can be realized.
91
Integration of information by periodic
firing and detection
• WIP
92
6. Neural Concept Network function
93
6-1. Searching
94
WIP
95
Similarity between consciousness,
concentration and radar
• The similarity between radar functions and consideration
and concentration may be exploitable.
• Radar scanning modes
– Lock-on mode
• TWS (Track While Scan): multiple targets can be tracked at the same time.
• STT (Single Target Track): only a single target is tracked. This may be
related to the concentration of consciousness.
• Phased-array radar
– Directivity of the synthetic wave is obtained by shifting
the oscillation timing of many radar elements.
– The brain might transmit directed signals by the same
mechanism.
96
6-2. Analyzing
97
WIP
98
6-3. Learning
99
WIP
100
6-4. Formation
101
Forming new concept from concept loop
WIP
102
A
B
C
DF
E
G
H
A
B
C
DF
E
G
H
“I”
Concept, relation and superstring
theory
WIP
• A concept has a size.
• A concept and a relation can be converted into each other.
• A relation becomes a concept when rolled up.
• A concept becomes a relation when made smaller and
stretched.
103
7. Reference materials
104
Reference materials
No. Title
1 I Am a Strange Loop
2 The Neural Code of Pitch and Harmony
105

 
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...
James Anderson
 
How world-class product teams are winning in the AI era by CEO and Founder, P...
How world-class product teams are winning in the AI era by CEO and Founder, P...How world-class product teams are winning in the AI era by CEO and Founder, P...
How world-class product teams are winning in the AI era by CEO and Founder, P...
Product School
 
GraphRAG is All You need? LLM & Knowledge Graph
GraphRAG is All You need? LLM & Knowledge GraphGraphRAG is All You need? LLM & Knowledge Graph
GraphRAG is All You need? LLM & Knowledge Graph
Guy Korland
 
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdfFIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance
 
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
Tobias Schneck
 
Epistemic Interaction - tuning interfaces to provide information for AI support
Epistemic Interaction - tuning interfaces to provide information for AI supportEpistemic Interaction - tuning interfaces to provide information for AI support
Epistemic Interaction - tuning interfaces to provide information for AI support
Alan Dix
 
Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !
KatiaHIMEUR1
 
UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4
DianaGray10
 
UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3
DianaGray10
 
Connector Corner: Automate dynamic content and events by pushing a button
Connector Corner: Automate dynamic content and events by pushing a buttonConnector Corner: Automate dynamic content and events by pushing a button
Connector Corner: Automate dynamic content and events by pushing a button
DianaGray10
 
Designing Great Products: The Power of Design and Leadership by Chief Designe...
Designing Great Products: The Power of Design and Leadership by Chief Designe...Designing Great Products: The Power of Design and Leadership by Chief Designe...
Designing Great Products: The Power of Design and Leadership by Chief Designe...
Product School
 

Recently uploaded (20)

From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
 
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...
 
Unsubscribed: Combat Subscription Fatigue With a Membership Mentality by Head...
Unsubscribed: Combat Subscription Fatigue With a Membership Mentality by Head...Unsubscribed: Combat Subscription Fatigue With a Membership Mentality by Head...
Unsubscribed: Combat Subscription Fatigue With a Membership Mentality by Head...
 
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdfSmart TV Buyer Insights Survey 2024 by 91mobiles.pdf
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf
 
DevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA ConnectDevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA Connect
 
Leading Change strategies and insights for effective change management pdf 1.pdf
Leading Change strategies and insights for effective change management pdf 1.pdfLeading Change strategies and insights for effective change management pdf 1.pdf
Leading Change strategies and insights for effective change management pdf 1.pdf
 
GenAISummit 2024 May 28 Sri Ambati Keynote: AGI Belongs to The Community in O...
GenAISummit 2024 May 28 Sri Ambati Keynote: AGI Belongs to The Community in O...GenAISummit 2024 May 28 Sri Ambati Keynote: AGI Belongs to The Community in O...
GenAISummit 2024 May 28 Sri Ambati Keynote: AGI Belongs to The Community in O...
 
When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...
 
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
 
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...
 
How world-class product teams are winning in the AI era by CEO and Founder, P...
How world-class product teams are winning in the AI era by CEO and Founder, P...How world-class product teams are winning in the AI era by CEO and Founder, P...
How world-class product teams are winning in the AI era by CEO and Founder, P...
 
GraphRAG is All You need? LLM & Knowledge Graph
GraphRAG is All You need? LLM & Knowledge GraphGraphRAG is All You need? LLM & Knowledge Graph
GraphRAG is All You need? LLM & Knowledge Graph
 
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdfFIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
 
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
 
Epistemic Interaction - tuning interfaces to provide information for AI support
Epistemic Interaction - tuning interfaces to provide information for AI supportEpistemic Interaction - tuning interfaces to provide information for AI support
Epistemic Interaction - tuning interfaces to provide information for AI support
 
Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !
 
UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4
 
UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3
 
Connector Corner: Automate dynamic content and events by pushing a button
Connector Corner: Automate dynamic content and events by pushing a buttonConnector Corner: Automate dynamic content and events by pushing a button
Connector Corner: Automate dynamic content and events by pushing a button
 
Designing Great Products: The Power of Design and Leadership by Chief Designe...
Designing Great Products: The Power of Design and Leadership by Chief Designe...Designing Great Products: The Power of Design and Leadership by Chief Designe...
Designing Great Products: The Power of Design and Leadership by Chief Designe...
 

representation and the processing of the concept as a function of that middle point.
10
HLAI Roadmap
(Diagram: Human Intelligence / Artificial Intelligence, Human Neural Network / Artificial Neural Network, with the layers Sense, Perception, Recognition, Cognition, Concept, Semantics, Language, Thought, Planning.)
11
HLAI Roadmap points
• The point is that Concept is located below Language and Semantics.
• At present, the prevailing approach is to tackle higher-order concept processing after language processing has been realized.
• However, I think that language processing cannot be achieved unless concept processing, and then semantic processing, are realized first.
12
Conception-2
• While designing a directed network structure for concept representation, I arrived at something resembling a neural network.
• When I built SNN functionality on top of it and ran it, I obtained useful results for the analysis of concepts.
13
  • 14. 4. Concept Network function 14
  • 15. Define the concept • It is difficult to define the concept of the concept. • Therefore, it is defined by the following circulation representation. A concept is to represent a certain concept by relation to other concepts. • The following sections describe the relation representation (RR) of concepts in NCN. 15
  • 16. RR1 – basic-relation • NCN represent that the concept "A" relate to the concept "B" in the node and the directed edge as follows. • A node is called concept, and a directed edge is called relation. • A concept of starting/ending point of relation is called source/destination concept. A B (source) concept (destination) concept relation (origin) 16
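The basic-relation above can be sketched as a small data structure. This is a minimal illustration, not the actual NCN implementation; the names `Concept` and `Relation` simply mirror the slide's terminology, and the UUID identity anticipates the next slide.

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class Concept:
    """A node. The label is only for clarity; identity is the UUID."""
    label: str
    id: uuid.UUID = field(default_factory=uuid.uuid4)

@dataclass
class Relation:
    """A directed edge from a source concept to a destination concept."""
    source: Concept
    destination: Concept

# basic-relation: concept "A" relates to concept "B"
a, b = Concept("A"), Concept("B")
r = Relation(source=a, destination=b)
print(r.source.label, "->", r.destination.label)  # A -> B
```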
Objects of concept
• Here, "A" and "B" are just labels for clarity; in fact, the object of a concept may be text, voice, an image, or another neural network.
• Internally, each concept is identified by a UUID.
17
• Self-relation, mutual relation, and circular relation are represented as follows.
(Diagrams: A → A; A ⇄ B; A → B → C → A.)
18
Example of basic-relation:
A human relates to a hand.
(Diagram: human → hand.)
19
RR2 – multiple-relations
• Multiple relations are represented as follows.
(Diagrams: variations of A, B, C connected by multiple relations.)
20
Example of multiple-relations:
A human relates to a hand and a foot.
(Diagrams: human → hand and human → foot, drawn in either layout.)
21
RR3 – sub-relation
• Further qualifying a relation with another concept is called a sub-relation.
• The following represents concept "A", which has a relation of "C" to "B". If relation1 is the origin, relation2 is a sub-relation.
(Diagram: A → B via relation1 (origin); relation1 → C via relation2 (sub-relation).)
22
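A sub-relation requires relations to be first-class objects that can themselves be the endpoint of another relation. A minimal sketch, with hypothetical names, of how that might look:

```python
import uuid
from dataclasses import dataclass, field
from typing import Union

@dataclass
class Concept:
    label: str
    id: uuid.UUID = field(default_factory=uuid.uuid4)

@dataclass
class Relation:
    # A relation may start from (or point at) another relation; this is
    # how a sub-relation qualifies its super-relation.
    source: Union["Concept", "Relation"]
    destination: Union["Concept", "Relation"]

human, hand, have = Concept("human"), Concept("hand"), Concept("have")
relation1 = Relation(human, hand)      # origin: human -> hand
relation2 = Relation(relation1, have)  # sub-relation: qualifies relation1 with "have"
```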
• Conversely, if relation2 is the origin, relation1 is called the super-relation.
(Diagram: relation2 (origin) attached to relation1 (super-relation).)
23
Example of sub-relation:
human relates to hand, qualified by have.
→ (A) human has a hand.
(Diagram: human → hand, with the relation qualified by "have".)
24
Sentence type and order of relation (1/2)
• In this example, the relation is defined in Japanese word order, but in NCN the sequence of relations has no grammatical meaning, so you may just as well define it in English word order.
• What meaning is read out of these conceptual structures depends on the interpreting side.
(Diagrams: human, hand, have (Japanese sentence type); human, have, hand (English sentence type).)
25
Sentence type and order of relation (2/2)
• When distinguishing active and passive voice matters, the relative representation (described later) is easier in English word order.
• For intransitive verbs, Japanese word order is easier to express in relative terms.
26
RR4 – nested-relation
• A nested relation, which further qualifies a sub-relation with another sub-relation, is represented as follows.
(Diagram: A → B, qualified by C, which is in turn qualified by D.)
27
• When relation3 is the origin, the top-level relation (relation1) is called the owner-relation.
(Diagram: relation3 (origin) → relation2 (super-relation) → relation1 (owner-relation).)
28
Example 1 of nested-relation:
human relates to hand via have, qualified by two.
→ (A) human has two hand(s).
(Diagram: human → hand, qualified by "have", which is qualified by "two".)
29
Example 2 of nested-relation:
human relates to hand and foot via have, qualified by two.
→ (A) human has two hand(s) and feet.
(Diagram: human → hand and human → foot, each qualified by "have" and "two".)
30
Example 3 of nested-relation:
The network of Example 2 can be simplified as follows.
(Diagram: human → hand and human → foot, sharing the "have"/"two" qualification.)
31
Limits of the three-term expression
• Relation representation of concepts is similar to RDF (Resource Description Framework) or a graph database.
  – ex: RDF represents a relation by a triple (subject, predicate, object).
• When dealing with real-world information, the three-term expression is not enough.
• NCN also represents the relation itself, which corresponds to the RDF predicate, and can combine it with sub-relations to create a more realistic representation of a concept.
32
RR5 – relativity
• The degree of relationship (relativity) of a sub-relation is represented by 0.0 < relativity < 1.0.
• The closer to 0.0, the closer the sub-relation attaches to the source concept; the closer to 1.0, the closer it attaches to the destination concept.
(Diagram: A → B with sub-relations C, D, E at relativity 0.25, 0.5, 0.75.)
33
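One way relativity might attach to such a structure, as an illustrative sketch (the validation simply mirrors the slide's 0.0 < relativity < 1.0 constraint; all names are hypothetical):

```python
import uuid
from dataclasses import dataclass, field
from typing import Union

@dataclass
class Concept:
    label: str
    id: uuid.UUID = field(default_factory=uuid.uuid4)

@dataclass
class Relation:
    source: Union["Concept", "Relation"]
    destination: Union["Concept", "Relation"]
    # Position of this sub-relation along its super-relation:
    # near 0.0 = near the source concept, near 1.0 = near the destination.
    relativity: float = 0.5

    def __post_init__(self):
        if not 0.0 < self.relativity < 1.0:
            raise ValueError("relativity must satisfy 0.0 < r < 1.0")

a, b = Concept("A"), Concept("B")
origin = Relation(a, b)
sub_c = Relation(origin, Concept("C"), relativity=0.25)  # attaches near A
sub_e = Relation(origin, Concept("E"), relativity=0.75)  # attaches near B
```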
Example of relativity:
(A) human has two hand(s) with five fingers, and feet.
(Diagram: human → hand qualified by "have"/"two", hand → finger qualified by "five", placed along the relations by relativity.)
34
• The relativity and the order of sub-relations have no grammatical meaning.
• As shown below, the order of the things you want to emphasize may differ even for the same fact:
  – Mr. Smith passed through this road yesterday.
  – Yesterday, Mr. Smith passed through this road.
  – Yesterday, on this road, Mr. Smith passed through.
(Diagrams: "Mr. Smith / yesterday / this road / passed through", with the sub-relations reordered to match each emphasis.)
35
• Since English and Japanese grammar may not be able to reproduce the order of emphasis among concepts, it has to be supplemented with text decoration in written sentences, and with inflection, gesture, etc. in actual conversation.
(Example: Mr. Smith "passed through" this road yesterday.)
36
• On the other hand, a constrained form of sentence, such as poetry, might evoke multiple complex conceptual structures.
37
Sliding vector representation of concept networks
• If a concept network is interpreted as sliding (or line) vectors, could it be handled by computational-graph neural networks?
(Diagram: concepts co1..co4 with vectors
  a = co2 − co1
  b = co3 − a * 0.5
  c = co4 − b * 0.5)
38
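The slide's formulas can be transcribed literally. Purely for illustration, each concept is assumed here to be a 2-D embedding vector:

```python
# A literal transcription of the slide's formulas, assuming each concept is
# an embedding vector (2-D tuples here for illustration).

def sub(u, v):    # u - v, component-wise
    return tuple(ui - vi for ui, vi in zip(u, v))

def scale(u, s):  # u * s, component-wise
    return tuple(ui * s for ui in u)

co1, co2, co3, co4 = (0.0, 0.0), (2.0, 0.0), (1.0, 2.0), (3.0, 1.0)

# Each sub-relation vector is measured from the point at relativity 0.5
# along the vector it qualifies (the slide's a*0.5 and b*0.5 terms).
a = sub(co2, co1)            # a = co2 - co1
b = sub(co3, scale(a, 0.5))  # b = co3 - a * 0.5
c = sub(co4, scale(b, 0.5))  # c = co4 - b * 0.5
print(a, b, c)  # (2.0, 0.0) (0.0, 2.0) (3.0, 0.0)
```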
Grammatical representation
• The idea of adding grammatical information explicitly and separately.
(Diagram: "Mr. Smith / passed through / yesterday / this road" annotated with the grammar labels S, V, O, M.)
39
RR6 – relative-representation
• A sub-relation can be represented relatively as follows.
• (In the context of A,) B is related to C.
(Diagram: the sub-relation redrawn with A as the surrounding context of B → C.)
40
• When a nested relation is represented relatively, it becomes as follows.
• (In the context of A,) B is related to C, qualified by D.
(Diagram: A as the context around B → C with D attached.)
41
• When the sub-relation inside a relative representation is itself represented relatively, the result is a hierarchical representation, as follows.
• (In the context of A-B,) C is related to D.
(Diagram: nested contexts A, then B, around C → D.)
42
Relativity representation in relative representation
• In relative representation, the relativity information disappears, so its representation needs consideration.
(Diagrams: Idea 1, color representation; Idea 2, 3D representation.)
43
Examples of relative-representation:
• A human has two legs with five fingers.
• An insect has six legs with fingers.
(Diagrams: human/insect → foot via "have", qualified by two/six; foot → finger, qualified by five.)
44
Example 1 of complex relative-representations:
Self-reference, self-inclusion representation.
(In me:) He may think I'm a delicate man, but I'm bold.
(Diagram: "I (me)" containing "he", who contains an "I" related to "delicate", while the outer "I" relates to "bold".)
45
Example 2 of complex relative-representations:
Self-inclusion representation + time representation (WIP)
46
Example 3 of complex relative-representations:
Triangular relationships (WIP)
47
Example 4 of complex relative-representations:
Writing a love letter in NCN (WIP)
48
Exception representation (WIP)
• Generally, crows are black and swans are white.
• However, there are exceptions such as albinism and melanism: white crows and black swans exist.
• Such exceptions must be represented while preventing the catastrophic forgetting (interference) they could cause.
49
Examples of existing logical representations
NCN can support existing logical representations, such as:
• Top-down analysis
  – Fishbone diagram
  – Mind map
• Bottom-up analysis
  – KJ method
• Meaning description
  – RDF/OWL
• Structure description
  – UML
  – ER diagram
  – Graph database
50
Comparing association representations in UML
(Diagram: a UML class-diagram association, ParentClass 1 – 0..* ChildClass, and the equivalent NCN network, Parent – Child, with the same multiplicities.)
51
Chomsky's generative grammar (WIP)
• When grammatical information is added to the nested-relation representation of a concept, the grammatical structure can be represented.
52
Summary of relation representation of concepts
• Frame representation is incorporated into the network structure itself, enabling recursive, relative representations of concepts that were difficult in conventional logical representations.
53
5. Neural Network function
54
Neural Network function
• An SNN uses the temporal change of the neural potential to express and process information. It is closer to biological neurons than conventional computational-graph neural networks, allowing more flexible information representation and processing.
• NCN also gains information-processing power from branched structures equivalent to neurites (nerve fibers), such as axons and dendrites.
55
Pros & cons of the Neural Network function
• Pros
  – Dynamic networks can be formed.
  – Signals and processing can be superposed.
• Cons
  – It is computationally costly.
  – It may suffer restrictions similar to those of humans.
56
Spiking Neural Network function
NCN has parameters for the spiking neural network in addition to the parameters of conventional computational-graph neural networks.
• Conventional computational-graph NN parameters
  – Weight: synaptic weight; positive or negative real number
  – Potential: positive or negative real number (mV)
  – Threshold: positive real number (mV)
• Spiking NN parameters
  – Attenuation rate: time attenuation rate of the potential; positive real number (mV/msec)
  – Refractory period: positive real number (msec)
57
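A minimal leaky integrate-and-fire sketch of how these parameters could interact, assuming linear attenuation per time step. The class name and the discrete `step` interface are illustrative, not NCN's actual API:

```python
class SpikingNeuron:
    """Toy leaky integrate-and-fire neuron using the slide's parameters:
    threshold (mV), attenuation rate (mV/msec), refractory period (msec)."""

    def __init__(self, threshold=1.0, attenuation=0.01, refractory=2.0):
        self.threshold = threshold      # mV
        self.attenuation = attenuation  # mV per msec, linear decay
        self.refractory = refractory    # msec
        self.potential = 0.0
        self.refractory_left = 0.0

    def step(self, dt_msec, weighted_input=0.0):
        """Advance the neuron by dt_msec; return True if it fires."""
        if self.refractory_left > 0.0:
            self.refractory_left -= dt_msec
            return False
        # Potential decays toward 0 at the attenuation rate, then integrates input.
        self.potential = max(0.0, self.potential - self.attenuation * dt_msec)
        self.potential += weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0
            self.refractory_left = self.refractory
            return True
        return False

n = SpikingNeuron()
# Two sub-threshold inputs accumulate into one spike; then refractory blocks.
print([n.step(1.0, 0.6) for _ in range(3)])  # [False, True, False]
```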
• The Neural Network function renames the structures used in the Concept Network function as follows.
(Diagram: the corresponding structures labeled neuron, neurite, and synapse.)
58
Firing function
• NCN has no input or output layer; any neuron can be treated as an input or output neuron.
• The directed edges used for I/O are represented separately from relations.
(Diagram: input/output edges vs. relation edges; animated in the slides.)
59
Combination with other NNs
• It is also assumed that the inputs and outputs can be combined with other types of neural networks, such as DNNs.
60
• When the input frequency and signal amount exceed the attenuation of the potential, the firing frequency becomes 1/n of the input frequency.
• This is because even when a single input is below the threshold, the potential can exceed the threshold after two or three accumulated inputs.
(Diagram: input 13.0 pps, threshold = 1.0; with weight = 1.0 the firing rate is 1/1 (13.0 pps); with weight = 0.9 it is 1/2 (6.5 pps); further downstream, 0.0 pps, as no further firing occurs due to the attenuation of the potential. pps: pulses per second. Animated in the slides.)
61
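The 1/n effect can be reproduced with a toy simulation. The linear decay rate of 5.0 mV/s is an assumed value, chosen so that the residual of one input survives until the next pulse:

```python
# Reproduce the slide's 1/n firing-rate effect: a 13.0 pps input through a
# weight of 0.9 against threshold 1.0 fires on every second pulse (6.5 pps).
# Assumed: linear potential attenuation of 5.0 mV/s, no refractory period.

def fires_per_second(input_pps=13.0, weight=0.9, threshold=1.0, decay_per_s=5.0):
    interval = 1.0 / input_pps          # seconds between input pulses
    potential, fired = 0.0, 0
    for _ in range(int(input_pps)):     # simulate one second of pulses
        potential = max(0.0, potential - decay_per_s * interval) + weight
        if potential >= threshold:
            fired += 1
            potential = 0.0
    return fired

print(fires_per_second(weight=1.0))  # 13 -> firing rate 1/1
print(fires_per_second(weight=0.9))  # 6  -> firing rate ~1/2
```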
• Due to the attenuation characteristics of the potential, neurons stop responding as the distance (number of stages) from the input neuron grows.
• This is an important feature for preventing infinite firing loops in NCN, whose networks can contain cycles.
• Similarly, a higher input frequency causes a wider range of propagation.
• This feature is used to control the affected range of an input signal.
62
Harmonic sound
• This may be related to the mechanism of harmonics, whose frequency components are integer multiples, or integer fractions, of the fundamental tone.
63
Frequency coding
• There are various theories about how information is represented in the brain; here are two:
  – rate coding theory
  – temporal coding theory
• NCN uses a frequency coding that combines features of both.
• Frequency coding measures how strongly a neuron fires in sync with the frequency of an input signal, and treats this as the degree of relation.
• By using coprime input frequencies with a sufficiently large least common multiple, it is possible to determine how strongly the network reacts to each input frequency, regardless of the propagation path.
64
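One way such channel detection might be scored, as a hedged sketch: a neuron's spike times are tested for phase-locking against each candidate channel frequency (the function name, tolerance, and scoring rule are all assumptions):

```python
def channel_score(spike_times, freq, tol=0.005):
    """Fraction of spikes landing on (a multiple of) the channel's period."""
    period = 1.0 / freq
    hits = sum(1 for t in spike_times
               if abs(t / period - round(t / period)) * period < tol)
    return hits / len(spike_times)

# A neuron firing on every 2nd cycle of a 13 pps input (a 1/2 subharmonic)
# still scores highest on the 13 pps channel, regardless of path.
spikes = [n * (2 / 13.0) for n in range(1, 14)]
scores = {f: channel_score(spikes, f) for f in (13.0, 11.0, 7.0)}
print(scores)  # 13.0 scores 1.0; 11.0 and 7.0 score near zero
```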
Information expression using frequency
• Information can be expressed with frequency in the following ways:
  – PM: phase modulation
  – FM: frequency modulation
  – AM: amplitude modulation
• Of these, AM seems to require a population code over multiple neurons rather than a single neuron, given the firing characteristics of neurons (the all-or-none law).
65
Image of the firing cycle (13.0 pps)
(Diagram: subharmonics over 1.0 s: 13.00 pps (13/1), 6.50 pps (13/2), 4.33 pps (13/3), 3.25 pps (13/4), 2.60 pps (13/5), 2.16 pps (13/6). Animated in the slides.)
66
Image of the firing cycle (11.0 pps)
(Diagram: subharmonics over 1.0 s: 11.0 pps (11/1), 5.5 pps (11/2), 3.66 pps (11/3), 2.75 pps (11/4), 2.2 pps (11/5), 1.83 pps (11/6). Animated in the slides.)
67
Image of the firing cycle (7.0 pps)
(Diagram: subharmonics over 1.0 s: 7.0 pps (7/1), 3.5 pps (7/2), 2.33 pps (7/3), 1.75 pps (7/4), 1.4 pps (7/5), 1.16 pps (7/6). Animated in the slides.)
68
Image of the firing cycle when using three coprime frequencies
(Animated diagram over 1.0 s.)
69
Image of the firing cycle when using three coprime frequencies (monochrome version)
(Animated diagram over 1.0 s.)
70
• The following is an example of the reaction when coprime frequencies, colored in RGB as channels, are input at different neurons.
(Diagram: inputs 13.0 pps (channel A), 11.0 pps (channel B), 7.0 pps (channel C); downstream neurons respond at, e.g., 6.5 pps = 50% channel A, 3.25 pps = 25% channel A, 5.5 pps = 50% channel B, 3.5 pps = 50% channel C, 1.75 pps = 25% channel C, and mixtures such as 25% or 12.5% each of channels A, B, and C.)
71
Summary of frequency coding
• When coprime frequencies are used for the input channels, the probability that the frequencies interfere within a unit time is very low.
• Using this characteristic, multiple channels can be separated or superposed in the input, output, and processing of information.
• By analyzing the output signal per frequency channel, it is possible to determine which input frequency channel a neuron is reacting to, regardless of the path.
• Therefore, compared to computational-graph NNs, NCN can be kept from becoming a black box.
72
Actual behavior when frequencies are superposed
• In practice, it does not work so cleanly, because a neuron is affected by potential raised by another frequency when frequencies are superposed nearby.
• There are ways to eliminate this effect, but the interference itself might be usable as a form of information expression, so which is better needs verification.
73
Frequency channel combinations
• The requirement for a combination of frequency channels is that they are "coprime, with a small ratio of differences".
• Three consecutive numbers starting from any odd number are pairwise coprime.
• Examples: (1, 2, 3), (3, 4, 5), (5, 6, 7), (7, 8, 9), (9, 10, 11), (11, 12, 13), (13, 14, 15), …, (41, 42, 43), (43, 44, 45)
• Higher frequencies reduce the ratio of the frequency differences.
• However, at a time resolution of 0.1 ms, interference occurred at the combination (43, 44, 45) and above.
74
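The coprimality claim above is easy to verify:

```python
# Check the slide's claim: three consecutive integers starting from an odd
# number are pairwise coprime.
from itertools import combinations
from math import gcd

def pairwise_coprime(nums):
    return all(gcd(x, y) == 1 for x, y in combinations(nums, 2))

odd_triples = [(n, n + 1, n + 2) for n in range(1, 44, 2)]
print(all(pairwise_coprime(t) for t in odd_triples))  # True

# Starting from an even number fails: the two even members share a factor of 2.
print(pairwise_coprime((2, 3, 4)))  # False
```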
Relationship with Gödel numbers
• Expressing the conceptual structure by combining coprime integers with frequencies of 1/n or n times resembles the idea of Gödel numbering, used in Gödel's incompleteness theorems.
75
Periodic firing, consciousness, concentration
Is the control of frequency channels related to:
• consciousness, selection, and concentration?
• gamma rhythm and burst firing?
• the capacity of short-term memory, about 4±1 chunks*?
• Some say it has nothing to do with the binding problem.
• The boundary between consciousness and the unconscious may be as simple as that of sound: when a pattern of sound-pressure changes repeats more than a certain number of times, within periods below a certain length, humans perceive it as a single tone. Consciousness may work the same way.
* It used to be quoted as 7±2 chunks and was called the magical number.
76
Frequency representation and parallelism
• The same processing as affected-range control and channel representation by frequency could be achieved by attaching the range and channel information directly to the signal.
• But that makes parallel processing difficult, and performance may drop as the scale increases.
• I think it is better to express everything by "wave superposition and time", as in quantum mechanics, and to collapse it into particles only when reading out information.
77
Back-firing
• In addition to forward propagation, NCN has a back-firing function that propagates a signal backward, against the direction of the relation, for top-down and bottom-up analysis of the concepts represented by the Concept Network function.
• The following slides show the processing order of forward firing and back-firing.
78
• (forward) firing
  1. Input signal
  2. Increase/decrease in potential
  3. Propagate potential
  4. Increase/decrease in potential by weight
(Animated in the slides.)
79
• back-firing
  1. Input backward signal
  2. Increase/decrease in potential
  3. Back-propagate potential
  4. Increase/decrease in potential
(Animated in the slides.)
80
Types of ions
• Virtual ions and ion channels are used to achieve forward and back firing.
• In addition, three frequency channels, phase, and input amount are combined to form an input signal.

              forward        backward
  excitatory  fo-ex (+1 eV)  ba-ex (+1 eV)
  inhibitory  fo-in (-1 eV)  ba-in (-1 eV)
81
Types of ions and quantum chromodynamics
• The combination of frequency channels, forward/backward direction, and excitatory/inhibitory properties may be analogous to the quark's color charge and its top/bottom, strange/charm, and up/down flavors.
82
Reproduction of the back-firing function in biological neurons
• Back-propagation via electrical synapses
  – In a chemical synapse, signals propagate forward only.
  – In an electrical synapse, however, the signal propagates both forward and backward.
  – Electrical synapses are present in inhibitory neurons of the hippocampus and cerebral cortex.
• Back-propagation on dendrites
  – Potential changes may propagate backward along dendrites.
→ It seems difficult to reproduce this with a simple network.
83
  • 84. Closer to biological neurons • To reproduce the propagation velocity of axons and dendrites, introduce a parameter called width (thickness?). width = 0.5 width = 1.0 width = 2.0 animation 84
  • 85. Axons and dendrites • The propagation velocity of axons (myelinated nerves) is fast. • The propagation velocity of dendrites is slow. • Action potentials, called dendritic spikes, may also occur on dendrites. 85
  • 86. Propagation velocity of biological neurites • A thicker biological neurite propagates action potentials faster than a thinner one. • A potential below the threshold propagates while attenuating, and in that case a thicker neurite may propagate the potential more slowly than a thinner one. 86
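A minimal sketch of the `width` parameter from slides 84-86, under the two assumed relations stated there: above the threshold a thicker neurite conducts faster, below it a thicker neurite is taken to conduct more slowly. The threshold, constants, and the exact velocity formulas are illustrative assumptions.

```python
THRESHOLD = 1.0  # assumed firing threshold

def propagation_delay(length, width, potential):
    """Time for a potential to traverse a neurite of the given length and width."""
    if potential >= THRESHOLD:
        velocity = 2.0 * width   # action potential: faster when thicker
    else:
        velocity = 2.0 / width   # attenuating sub-threshold potential: slower when thicker
    return length / velocity

for w in (0.5, 1.0, 2.0):        # the three widths shown on slide 84
    print(w,
          propagation_delay(10.0, w, potential=1.5),   # supra-threshold
          propagation_delay(10.0, w, potential=0.5))   # sub-threshold
```

Making velocity a function of width (rather than a per-edge constant) keeps the network description compact: one geometric parameter controls both regimes.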
  • 87. Collision between forward- and back-propagating signals • NCN uses back propagation not only for learning but also for analyzing concepts. • When the forward signal from the source neuron (concept) and the backward signal from the destination neuron (concept) are input at the same (or 1/n, or n times) frequency, the collision point on the neurite changes as the phase of the input signals is changed. 87
  • 88. • It is possible to estimate the structure of the neural (concept) network by analyzing the reactions of each neuron. Input forward signal Input backward signal animation 88
  • 89. Signal collisions below the threshold • When a forward-propagating signal and a back-propagating signal that are each below the threshold collide, and their combined potential exceeds the threshold, an action potential occurs in the middle of the neurite. • This state can also be utilized to analyze the network structure. 89
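The geometry behind slides 87-89 reduces to one line of algebra. If the forward signal starts at position 0 and the backward signal at position L, both moving at speed v, and the backward signal is injected dt later, then the meeting point satisfies v*t = L - v*(t - dt), giving x = (L + v*dt)/2. Phase control of the inputs is exactly control of dt, so sweeping the phase sweeps the collision point along the neurite. L and v below are illustrative.

```python
def collision_point(L, v, dt):
    """Position (measured from the source end) where the two signals meet.

    dt > 0: the backward signal is injected dt later than the forward one,
    shifting the collision point toward the destination end.
    """
    return (L + v * dt) / 2

L, v = 10.0, 1.0
print(collision_point(L, v, 0.0))   # in phase: collision at the midpoint
print(collision_point(L, v, 4.0))   # delayed backward signal: shifted toward destination
print(collision_point(L, v, -4.0))  # delayed forward signal: shifted toward source
```

With the sub-threshold collision rule of slide 89, this means a chosen point on the neurite, and only that point, can be pushed over the threshold, which is what makes the structure probing possible.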
  • 90. Relative representation of concepts by phase control • Controlling the collision point of the signals through phase control of the forward and backward signals corresponds to changing the position of a relative representation within the concept network. 90
  • 91. Structural representation by reaction timing • By changing the frequency and phase of the top-down and bottom-up signals and analyzing the resulting firing reactions, it is possible to infer the structure of the network. • This means that the network structure can be encoded in firing timing and rate. • Of course, there is no point in inferring the structure of a predefined network. • If this signal can itself be processed by a neural network, it means that a meta neural network, one that dynamically represents and processes a virtual neural network, can be realized. 91
  • 92. Integration of information by periodic firing and detection • WIP 92
  • 93. 6. Neural Concept Network function 93
  • 96. Similarity between consciousness, concentration and radar • The functional similarity between radar and thinking/concentration may be exploitable. • Radar scanning modes – Lock-on modes • TWS (Track While Scan): multiple targets can be tracked at the same time. • STT (Single Target Track): only a single target is tracked. This may be related to the concentration of consciousness. • Phased-array radar – Directivity can be given to the synthetic wave by shifting the oscillation timing of many radar elements. – The brain may be able to transmit directed signals by the same mechanism. 96
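The phased-array point above has a standard quantitative form: for an array with element spacing d in a medium with propagation speed v, a per-element firing delay dt steers the combined wavefront to an angle theta with sin(theta) = v*dt/d. The sketch below uses sound-in-air numbers purely as an illustration; nothing in the deck fixes these values.

```python
import math

def steering_angle(v, d, dt):
    """Beam angle in degrees produced by a per-element delay dt.

    v: wave propagation speed, d: element spacing, dt: firing delay
    between adjacent elements (sin(theta) = v * dt / d).
    """
    return math.degrees(math.asin(v * dt / d))

v = 343.0   # m/s, e.g. sound in air (illustrative)
d = 0.05    # m, element spacing (illustrative)
print(round(steering_angle(v, d, 0.0), 1))      # no delay: broadside beam, 0 degrees
print(round(steering_angle(v, d, 7.29e-5), 1))  # ~73 us delay steers to ~30 degrees
```

If the brain used the analogous trick, "shifting the oscillation timing" of a population of neurons would select which downstream target the synthesized signal favors, which is the directed-signal speculation on the slide.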
  • 102. Forming new concept from concept loop • WIP • (diagram: a loop over concepts A-H closing into a single new concept, "I") 102
  • 103. Concept, relation and superstring theory WIP • A Concept has a size. • Concept and Relation can be converted into each other. • A Relation becomes a Concept when it is rounded up. • A Concept becomes a Relation when it is made smaller and stretched. 103
  • 105. Reference materials No. Title 1 I Am a Strange Loop 2 The Neural Code of Pitch and Harmony 105