Transcript of "Blondie24 (round robin) cig09 seminar"

  1. INTRODUCING A ROUND ROBIN TOURNAMENT INTO BLONDIE24. Belal Al-Khateeb (bxk@cs.nott.ac.uk) and Graham Kendall (gxk@cs.nott.ac.uk), School of Computer Science (ASAP Group), University of Nottingham.
  2. Outline
     - Introduction: Checkers, Samuel's Checkers Program, Chinook, Deep Blue
     - Blondie24
     - Blondie24-R
     - Blondie24-RR
     - Results and Discussion
     - Conclusion
     - Future Work
  3. Checkers. Opening board of Checkers (Black moves first).
  4. Checkers. Black is forced to make a jump move.
  5. Checkers. Black gets a king.
  6. Samuel's Checkers Program
     - In 1959, Arthur Samuel started to look at Checkers.
     - Weights were determined through self-play.
     - 39 features.
     - Included look-ahead via minimax (alpha-beta).
     - Defeated Robert Nealy.
  7. Chinook
     - Produced by Jonathan Schaeffer in 1989.
     - 40,000 openings.
     - 8-piece endgame database in 1994.
     - Won the 1989 Computer Olympiad.
     - Chinook became the world champion, the first automated game player to achieve this.
  8. Deep Blue
     - Developed by IBM in the mid-1990s.
     - An attempt to create a Chess program capable of beating the world champion of the time.
     - 30 processors with parallel search; could evaluate up to 200 million chess positions per second.
     - 8,000 different features.
     - The opening database in Deep Blue consisted of 4,000 positions.
  9. Deep Blue
     - The endgame database of Deep Blue consists of all positions with five or fewer chess pieces on the board.
     - Defeated Garry Kasparov in a six-game match in 1997 to become the first computer program to defeat a world Chess champion.
  10. Blondie24
     - Produced by Fogel in 1999-2000.
     - Neural network as an evaluation function.
     - Values for input nodes: Red (Black) positive, White negative, Empty zero.
     - Piece differential.
     - Subsections (sub-boards).
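The input encoding on this slide can be sketched in code. This is an illustrative reconstruction, not Fogel's implementation: the board-representation details (a dict keyed by the 32 playable squares, the letter codes for pieces) are assumptions for the sketch, while the signed values and the fixed king value of 2 come from the slides.

```python
# Hedged sketch: encoding a checkers position as neural-network inputs,
# in the spirit of Blondie24. Checkers uses only 32 playable squares;
# each gets a signed value: +1 red man, -1 white man, 0 empty,
# +K / -K for kings (Blondie24-R fixes K = 2, as on a later slide).

K = 2.0  # king value, fixed at 2 in Blondie24-R

def encode_board(board):
    """board: dict mapping playable-square index (0..31) to a piece code:
    'r'/'R' red man/king, 'w'/'W' white man/king; absent means empty.
    Returns the 32 input values."""
    values = {'r': 1.0, 'R': K, 'w': -1.0, 'W': -K}
    return [values.get(board.get(sq), 0.0) for sq in range(32)]

def piece_differential(board):
    # Piece differential is a separate input in Blondie24's network.
    return sum(1 if p in 'rR' else -1 for p in board.values())

example = {0: 'r', 5: 'R', 20: 'w', 31: 'W'}
print(len(encode_board(example)))          # 32 inputs
print(encode_board(example)[5])            # 2.0 (red king)
print(piece_differential(example))         # 0 (two pieces each side)
```

The sub-board inputs mentioned on the slide would be built the same way, by applying this encoding to each overlapping sub-square of the board.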
  11. Blondie24. Blondie24's EANN architecture.
  12. Blondie24
     - Initial population of 30 neural networks (players).
     - Each neural network plays 5 games (as red) against 5 randomly chosen players: +1 for a win, 0 for a draw, -2 for a loss.
     - The best 15 players are retained; the other 15 players are eliminated.
     - The best 15 players are copied and mutated, replacing the worst 15.
  13. Blondie24
     - Repeat the process for 840 generations; the best player after these generations is retained.
     - Played 165 games at zone.com.
     - Rating: 2045.85 at that time.
     - In the top 500 of over 120,000 players on zone.com at that time.
     - Better than 99.61% of registered players on zone.com.
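The evolutionary phase described on the two slides above can be sketched as follows. `play_game` and `mutate` are placeholders standing in for the real game engine and weight perturbation; only the population size, the 5-games-as-red scheme, the +1/0/-2 scoring, and the keep-best-15 selection are taken from the slides.

```python
import random

def play_game(red, white):
    # Placeholder for a real minimax game between two networks;
    # returns 'red', 'white', or 'draw'.
    return random.choice(['red', 'white', 'draw'])

def mutate(net):
    # Placeholder for perturbing a network's weights.
    return dict(net)

def one_generation(players):
    scores = [0] * len(players)
    for i, net in enumerate(players):
        # Each network plays 5 games as red against random opponents.
        opponents = random.sample([j for j in range(len(players)) if j != i], 5)
        for j in opponents:
            result = play_game(net, players[j])
            if result == 'red':
                scores[i] += 1    # +1 for a win
            elif result == 'white':
                scores[i] -= 2    # -2 for a loss; draws score 0
    # Keep the best 15; mutated copies of them replace the worst 15.
    ranked = sorted(range(len(players)), key=lambda i: scores[i], reverse=True)
    best = [players[i] for i in ranked[:15]]
    return best + [mutate(net) for net in best]

population = [{'id': i} for i in range(30)]
for _ in range(3):  # Blondie24 ran 840 generations
    population = one_generation(population)
print(len(population))  # 30
```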
  14. Blondie24. Blondie24's performance after 165 games on zone.com.
  15. Blondie24-R
     - Has the same structure and architecture that Fogel utilised in Blondie24.
     - The only exception is that the value of the king is fixed at 2.
     - The king is more valuable than an ordinary piece, and this is well known, even to novice players.
  16. Blondie24-RR
     - Eliminates the randomness in the evolutionary phase of Blondie24-R.
     - A league competition between all 30 neural networks.
     - All the neural networks play against each other.
     - The total number of matches per generation becomes 870 (30 * 29) rather than 150 (30 * 5).
     - This increase in matches per generation decreases the number of generations needed, compared with the 840 used in Blondie24.
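The match count on this slide follows directly from enumerating ordered (red, white) pairs: each network plays every other network twice, once with each colour. A one-line check:

```python
from itertools import permutations

# League schedule for Blondie24-RR: every ordered pair (red, white)
# of distinct networks plays one match per generation.
players = list(range(30))
schedule = list(permutations(players, 2))
print(len(schedule))  # 870, i.e. 30 * 29, versus 30 * 5 = 150 before
```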
  17. Results and Discussion. Results of playing against selected programs:

                     Blondie24-R   Blondie24-RR   Online   WinCheck3D        SX Checkers
      Blondie24-R    -             Draw           Win      Lose (7 pieces)   Lose (8 pieces)
      Blondie24-RR   Win           -              Win      Lose (2 pieces)   Lose (4 pieces)
  18. Results and Discussion
     - Blondie24-RR plays two matches (one as red and one as white) against Blondie24-R.
     - Wins as red against Blondie24-R.
     - The result is a draw when Blondie24-RR moves second.
     - This reflects a success for our hypothesis, given that both players are end products.
  19. Results and Discussion
     - Blondie24-R and Blondie24-RR win against an online program, which can be considered another success.
     - Both play against two strong programs.
     - Against the first, Blondie24-RR lost with a two-piece difference, while Blondie24-R lost with a seven-piece difference.
     - Against the second, Blondie24-RR lost with a four-piece difference, while Blondie24-R lost with an eight-piece difference.
  20. Conclusion
     - The results show that Blondie24-RR performs better than Blondie24-R.
     - Based on these results, it would seem appropriate to use the league structure instead of choosing only five random opponents to play against during the evolutionary phase.
  21. Future Work
     - Investigate whether other changes are possible.
     - Investigate using individual and social learning methods in order to enhance the ability of Blondie24-RR to overcome the problem of being an end product.
  22. References
     1. Samuel, A. L., "Some Studies in Machine Learning Using the Game of Checkers," 1959, 1967.
     2. Fogel, D. B., Blondie24: Playing at the Edge of AI, Academic Press, 2002.
     3. Chellapilla, K. and Fogel, D. B., "Anaconda Defeats Hoyle 6-0: A Case Study Competing an Evolved Checkers Program Against Commercially Available Software," 2000.
     4. Fogel, D. B. and Chellapilla, K., "Verifying Anaconda's Expert Rating by Competing Against Chinook: Experiments in Co-evolving a Neural Checkers Player."
     5. Chellapilla, K. and Fogel, D. B., "Evolution, Neural Networks, Games, and Intelligence," 1999.
     6. Chellapilla, K. and Fogel, D. B., "Evolving an Expert Checkers Playing Program Without Using Human Expertise."
     7. Chellapilla, K. and Fogel, D. B., "Evolving Neural Networks to Play Checkers Without Relying on Expert Knowledge," 1999.
  23. Questions/Discussions. Thank you.