1. DIRECT KING OPPOSITION:
1.1. Kings aligned parallel on squares separated by one file/rank, Rook and black King in the same file: the white King prevents the black King from moving to the three outer squares in the adjoining file, giving the six-square-rectangle Anatomy of Mate mating pattern.
1.2. Black King on a corner square: the white King blocks the black King from moving to the two outer squares of the four-square Anatomy of Mate mating pattern, while the Rook controls either the back rank or the file ending in the corner square.
2. MISALIGNED KING OPPOSITION:
2.1. Kings are on squares that are not aligned parallel to each other, but separated by one square: the white King prevents the black King from moving to the three outer squares in the adjoining file, giving the six-square-rectangle Anatomy of Mate mating pattern.
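The direct-opposition pattern of 1.1 can be sketched as a predicate on piece coordinates. This is a minimal illustration under assumed conventions (0-based files and ranks, black King on the back rank, vertical opposition); the function name is hypothetical and it does not check rook safety or every mate detail.

```python
# Sketch of the direct-opposition back-rank mate pattern (assumed
# 0-based (file, rank) coordinates; illustrative only, not a full
# legality check).

def is_direct_opposition_mate(bk, wk, wr):
    """bk, wk, wr are (file, rank) tuples for the black King,
    white King and white Rook respectively."""
    bk_f, bk_r = bk
    wk_f, wk_r = wk
    wr_f, wr_r = wr
    on_edge = bk_r == 0                          # black King on the back rank
    opposition = wk_f == bk_f and wk_r == bk_r + 2   # Kings in direct opposition
    rook_checks = wr_r == bk_r and wr_f != bk_f      # Rook controls the back rank
    return on_edge and opposition and rook_checks

# Example: BK on a1 = (0, 0), WK on a3 = (0, 2), WR on h1 = (7, 0)
print(is_direct_opposition_mate((0, 0), (0, 2), (7, 0)))  # -> True
```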
3 neurons for the file of the piece (000 to 111, i.e. a to h)
3 neurons for the rank of the piece (000 to 111, i.e. 1 to 8)
Thus, the 18 input neurons indicate:

Neuron number in the i/p layer    Representation
1-3                               File of Black King
4-6                               Rank of Black King
7-9                               File of White King
10-12                             Rank of White King
13-15                             File of White Rook
16-18                             Rank of White Rook
INPUT REPRESENTATION …contd.
Board position and its i/p representation for the given board. Thus, the input representation for the above board position is 001101011010101110:

Parameter             Value    Representation
File of Black King    b        001
Rank of Black King    6        101
File of White King    d        011
Rank of White King    3        010
File of White Rook    f        101
Rank of White Rook    7        110
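The encoding above can be sketched in a few lines of Python. The helper names are assumptions, not from the original; each piece contributes 3 bits for its file (a = 000 … h = 111) and 3 bits for its rank (1 = 000 … 8 = 111), concatenated in the order black King, white King, white Rook.

```python
# Sketch of the 18-neuron input encoding (helper names are illustrative).

def encode_square(file_char, rank):
    """Encode one square as 6 bits: 3 for the file, 3 for the rank."""
    file_bits = format(ord(file_char) - ord('a'), '03b')   # a=000 .. h=111
    rank_bits = format(rank - 1, '03b')                    # 1=000 .. 8=111
    return file_bits + rank_bits

def encode_position(bk, wk, wr):
    """Each piece is a (file, rank) tuple, e.g. ('b', 6)."""
    return encode_square(*bk) + encode_square(*wk) + encode_square(*wr)

# Worked example from the slide: BK on b6, WK on d3, WR on f7
print(encode_position(('b', 6), ('d', 3), ('f', 7)))
# -> 001101011010101110
```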
OUTPUT REPRESENTATION …contd.
(a) White King to move: initial board position and the position after the move. The output representation of the above move is 001001:

Parameter         Representation
K/R index         0
No. of squares    01
Direction         001
(b) White Rook to move.

OUTPUT REPRESENTATION …contd.

Neuron in the o/p layer    Representation
1                          K/R index
2-3                        Direction of the move
4-6                        Number of squares the piece has moved
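Note that the two output slides disagree on field widths: the neuron table gives the direction 2 bits and the square count 3 bits, while the worked example 001001 uses 2 bits for the square count and 3 bits for the direction. A minimal Python decoder sketch, assuming the worked example's layout (the function name is illustrative):

```python
# Sketch decoder for the 6-neuron output, assuming the field layout of
# the worked example: 1 bit K/R index, 2 bits number of squares,
# 3 bits direction. The neuron-table slide lists different widths, so
# treat this layout as an assumption.

def decode_move(bits):
    assert len(bits) == 6
    piece = 'K' if bits[0] == '0' else 'R'
    squares = int(bits[1:3], 2)        # number of squares moved
    direction = int(bits[3:6], 2)      # index into the move directions
    return piece, squares, direction

print(decode_move('001001'))  # -> ('K', 1, 1)
```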
Every adjustable network parameter of the cost function should have its own individual learning-rate parameter
Every learning-rate parameter should be allowed to vary from one iteration to the next.
When the derivative of the cost function with respect to a synaptic weight has the same algebraic sign for several consecutive iterations of the algorithm, the learning-rate parameter for that weight should be increased.
When the algebraic sign of the derivative of the cost function with respect to a particular synaptic weight alternates for several consecutive iterations of the algorithm, the learning-rate parameter for that weight should be decreased.
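The heuristics above can be sketched as a per-weight, sign-based learning-rate update (a delta-bar-delta-style rule). The increase/decrease factors 1.2 and 0.5 are illustrative assumptions, not values from the text, and this simplified version compares only the current and previous gradient signs rather than tracking several iterations.

```python
# Minimal sketch of sign-based per-weight learning-rate adaptation
# (delta-bar-delta style; factors are illustrative assumptions).
import numpy as np

def adapt_rates(rates, grad, prev_grad, up=1.2, down=0.5):
    """One adaptation step: same gradient sign as the previous iteration
    -> increase that weight's learning rate; alternating sign -> decrease it."""
    same_sign = np.sign(grad) == np.sign(prev_grad)
    return np.where(same_sign, rates * up, rates * down)

rates = np.full(3, 0.01)                 # one learning rate per weight
prev_grad = np.array([0.5, -0.3, 0.2])
grad = np.array([0.4, 0.3, 0.1])         # sign flipped for the 2nd weight
print(adapt_rates(rates, grad, prev_grad))
```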
EFFECT OF CLUSTERING AND BIAS

Clustered samples on d1; hidden nodes = 80
Total number of testing samples = 39354
Total number of training samples = ????
Total number of training epochs = ?????

Criteria       Accuracy for biased NN    Accuracy for unbiased NN
Valid moves    54.609%                   19.822%
Ideal moves    3.295%                    4.200%