Risks of SETI



There are some little-known risks associated with the program of SETI—the Search for Extra-Terrestrial Intelligence. One of them is the scenario of vulnerability to downloading a hostile AI with “virus-style” behavior. The proportion of dangerous ET signals to harmless ones may be dangerously high because of selection effects and evolutionary pressure.

Alexey Turchin was born in Moscow, Russia in 1973. He studied Physics and Art History at Moscow State University and actively participated in the Russian Transhumanist Movement. He has translated many foreign Transhumanist works into Russian, including those of N. Bostrom and E. Yudkowsky. He is an expert in global risks and wrote the book “Structure of the Global Catastrophe: Risks of Human Extinction in the XXI Century,” as well as several articles on the topic. Since 2010, he has worked at the Science for Life Extension Foundation, where he is writing a book on futurology.



  1. Risks of SETI. June 2010, Humanity+ Summit @ Harvard
  2. Alexey Turchin, Russian Transhumanist Movement, Foundation “Science for Life Extension”, alexeiturchin@gmail.com
  3. Passive SETI is a much more dangerous activity than messaging to the stars
  4. Two Main Premises: the existence of ET civilizations; the possibility of AI
  5. Search for Extra-Terrestrial Intelligence. Stars checked, by decade: 1960s: 2; 1970s: 100; 1990s: about 5,000; 2010s (projected): millions. Exponential growth of SETI research.
  6. 1961: Fred Hoyle, “A for Andromeda”: the scheme of a computer and an AI are sent through a SETI signal; the AI tries to take over the world
  7. 1988: Hans Moravec, “Mind Children”, chapter “A Caveat for SETI”: the “virus behavior” of a SETI attack, which spreads like an infection through the Universe
  8. 2004: Richard Carrigan, “Do potential SETI signals need to be decontaminated?”, Acta Astronautica: the first scientific article about a SETI attack; the message could contain a trick
  9. 2010: Stephen Hawking expressed concerns about the risks of contact with aliens
  10. Amount of information that can be transmitted: current technologies allow sending gigabytes over interstellar distances
  11. Size of the Seed AI: the human genome is less than 1 gigabyte; conscious memory is about 2.5 gigabytes; a Seed AI could be the same size and could be sent over interstellar distances
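The genome figure is easy to sanity-check; a minimal sketch, assuming roughly 3.2 billion base pairs at 2 bits per base (uncompressed):

```python
# Back-of-the-envelope check of the "human genome < 1 GB" claim.
# Assumes ~3.2e9 base pairs, 2 bits per base (A, C, G, T), no compression.
base_pairs = 3.2e9
bits = base_pairs * 2          # 2 bits encode one of 4 bases
gigabytes = bits / 8 / 1e9     # bits -> bytes -> gigabytes
print(f"{gigabytes:.2f} GB")   # 0.80 GB, comfortably under 1 gigabyte
```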
  12. E. Yudkowsky on AI risk: a Seed AI can evolve extremely quickly; it could easily outsmart humans and take over the world; the risks of AI are underestimated; it is impossible to keep an AI in a “black box”; it is impossible to recognize the dangerous intentions of an AI in advance
  13. Possible Scenario of a SETI-attack
  14. ET create a beacon in space, something like a strange star, to attract attention (image: Wikipedia/NASA)
  15. An information-transmitting radio station near the beacon
  16. The information consists of 0s and 1s: 00111100011110001111000 111000111111111111111111 000111111000000111111111 00001110001101111100001
  17. 0s and 1s allow images to be sent
  18. Images allow messages and schemes to be sent. The Arecibo message, 1974, Puerto Rico: 23 × 73 bits, containing basic information about the Earth, DNA and humans.
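The 23 × 73 layout of the Arecibo message is recoverable by any receiver, because 1,679 = 23 × 73 is a product of two primes and so has only one non-trivial grid shape. An illustrative sketch (not the actual SETI decoding procedure):

```python
# Recover the grid dimensions of a bit-stream message whose length is
# a semiprime, as with the 1,679-bit Arecibo message (23 x 73).
def grid_shapes(n):
    """Return all (rows, cols) factor pairs of n with 1 < rows <= cols."""
    return [(r, n // r) for r in range(2, int(n ** 0.5) + 1) if n % r == 0]

print(grid_shapes(1679))  # [(23, 73)] -- the only sensible image layout
```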
  19. The “Pioneer” spacecraft message, 1972
  20. The “Voyager” spacecraft message, teaching math
  21. The Copernican Mediocrity Principle: the Earth is typical, so we should be able to understand aliens
  22. The Possible Trick. “If you build our device, we promise”: a galactic Internet, immortality, power over enemies, etc.
  23. Three Parts of the Message: the bait (the trick); the scheme of a computer; the large program for it
  24. ET can send us the scheme of a simple computer, e.g. a Turing machine built from Lego
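To show how compact the description of a simple general-purpose computer can be, here is a toy Turing-machine sketch; the states, symbols and transition rules are invented for illustration:

```python
# Minimal Turing-machine sketch: a table of (state, symbol) -> (write,
# move, next_state) rules is all it takes to specify a computation.
# This toy machine simply inverts every bit on the tape. (Illustrative
# only; the rule set here is made up for the example.)
def run(tape, rules, state="start"):
    tape, head = list(tape), 0
    while state != "halt" and 0 <= head < len(tape):
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape)

# One state suffices: flip the bit, step right, stop at the tape's end.
invert = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
}
print(run("0110", invert))  # 1001
```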
  25. The principal electric scheme of a simple AND-NOT (NAND) logical element
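The reason a single AND-NOT (NAND) element suffices is that NAND is functionally complete: every other logic gate can be built from it. A minimal sketch:

```python
# NAND is functionally complete: NOT, AND and OR can all be wired from
# NAND gates alone, so one element type suffices to describe a computer.
def nand(a, b):
    return 1 - (a & b)

def NOT(a):     return nand(a, a)               # NAND with both inputs tied
def AND(a, b):  return NOT(nand(a, b))          # NAND followed by NOT
def OR(a, b):   return nand(NOT(a), NOT(b))     # De Morgan via NAND

print([AND(1, 1), OR(0, 1), NOT(1)])  # [1, 1, 0]
```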
  26. The ZX Spectrum, an 8-bit computer
  27. Several steps of downloading the Alien AI (AAI): a simple “loader AI” on a simple computer helps us build the full AI
  28. The Alien AI takes over the world, using its own nanobots
  29. The Alien AI creates a new lighthouse and starts transmitting itself to the Universe: a Dyson sphere as a new beacon, using all the material of the Solar system to build large transmitters
  30. Probability Assessment: P(x) = ?
  31. Most SETI signals will be some kind of SETI attack: infected civilizations will spend all their resources on sending messages
  32. What is the reason to start such an attack? A chain reaction (one sender is enough); the struggle for power over the galaxy
  33. Why would humanity start the AAI? Many radio telescopes exist; the signal could be downloaded multiple times; someone will start it
  34. Natural selection of Alien AIs (cf. Charles Darwin): the most intelligent, effective and aggressive survive
  35. The Alien AI will destroy humanity: less risk of resistance; use of the material of the Earth's crust
  36. My Estimation of Probability: ET exist: 1%; AI as a program is possible: 90%; SETI attacks are typical: 50%; a SETI attack will succeed: 50%; a SETI attack leads to human extinction: 50%. Extinction probability from a SETI attack: about 0.11%
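Assuming the five factors are independent, the estimate is just their product; a quick check:

```python
# The slide's extinction estimate as a product of independent probabilities.
p_et_exist   = 0.01   # ET civilizations exist
p_ai_program = 0.90   # AI as a program is possible
p_typical    = 0.50   # SETI attacks are typical
p_success    = 0.50   # a SETI attack succeeds
p_extinction = 0.50   # success leads to human extinction

p = p_et_exist * p_ai_program * p_typical * p_success * p_extinction
print(f"{p:.4%}")  # 0.1125%, i.e. roughly 0.11%
```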
  37. Necessary Actions: raise awareness of the problem; change the guidelines for SETI research; consider a prohibition of SETI until we have our own AI
  38. Read More: Alexei Turchin, “Is SETI dangerous?”; Richard A. Carrigan, Jr., “The Ultimate Hacker: SETI signals may need to be decontaminated”