Risks of SETI

There are some little-known risks associated with the SETI program, the Search for Extra-Terrestrial Intelligence. One of them is the scenario of possible vulnerability from downloading a hostile AI with “virus-style” behavior. The proportion of dangerous ET signals to harmless ones may be dangerously high because of selection effects and evolutionary pressure.

Alexey Turchin was born in Moscow, Russia in 1973. He studied Physics and Art History at Moscow State University and actively participated in the Russian Transhumanist Movement. He has translated many foreign Transhumanist works into Russian, including works by N. Bostrom and E. Yudkowsky. He is an expert in global risks and wrote the book “Structure of the Global Catastrophe: Risks of Human Extinction in the XXI Century,” as well as several articles on the topic. Since 2010, he has worked at Science Longer Life, where he is writing a book on futurology.


Transcript

  • 1. Risks of SETI. June 2010, Humanity+ Summit @ Harvard
  • 2. Alexey Turchin, Russian Transhumanist Movement, Foundation “Science for Life Extension”, alexeiturchin@gmail.com
  • 3. Passive SETI is a much more dangerous activity than sending messages to the stars
  • 4. Two Main Premises: the existence of ET civilizations; the possibility of AI
  • 5. Exponential Growth of SETI Research. Stars checked by the Search for Extra-Terrestrial Intelligence: 1960s: 2; 1970s: 100; 1990s: nearly 5,000; 2010s (projected): millions
  • 6. 1961: Fred Hoyle, “A for Andromeda”. The scheme of a computer and an AI are sent through a SETI signal; the AI tries to take over the world
  • 7. 1988: Hans Moravec, “Mind Children”, chapter “A Caveat for SETI”. The “virus behavior” of a SETI-attack: it spreads like an infection through the Universe
  • 8. 2004: Richard Carrigan, “Do potential SETI signals need to be decontaminated?”, Acta Astronautica. The first scientific article about a SETI-attack: a SETI-attack message could contain a trick
  • 9. 2010: Stephen Hawking expressed concerns about the risks of contact with aliens
  • 10. Amount of information that can be transmitted: current technologies allow sending gigabytes over interstellar distances
  • 11. Size of the Seed AI: the human genome is less than 1 gigabyte; conscious memory is about 2.5 gigabytes; a Seed AI could have the same size and could be sent over interstellar distances (a rough transmission-time estimate appears after the transcript)
  • 12. E. Yudkowsky: a Seed AI can evolve extremely quickly; it could easily outsmart humans and take over the world; the risks of AI are underestimated; it is impossible to keep an AI in a “black box”; it is impossible to recognize the dangerous intentions of an AI in advance
  • 13. Possible Scenario of SETI-attack
  • 14. ET Create a Beacon in Space: something like a strange star to attract attention (image: Wikipedia/NASA)
  • 15. Information-transmitting Radio Station near the Beacon
  • 16. Information Consists of 0s and 1s: 00111100011110001111000 111000111111111111111111 000111111000000111111111 00001110001101111100001
  • 17. 0s and 1s Allow Sending Images
  • 18. Images Allow Sending Messages and Schemes. The Arecibo message, 1974, Puerto Rico: 23 x 73 bits, containing basic information about the Earth, DNA and humans (a decoding sketch appears after the transcript)
  • 19. The “Pioneer” Spacecraft message, 1972
  • 20. The “Voyager” spacecraft message teaching math
  • 21. Copernican Mediocrity Principle: the Earth is typical, so we shall be able to understand aliens
  • 22. The Possible Trick: “If you build our device, we promise”: the Galactic Internet, immortality, power over enemies, etc.
  • 23. Three Parts of the Message: the bait or trick; the scheme of a computer; the large program for it
  • 24. ET can Send Us a Scheme of a Simple Computer: a Turing machine made from Lego
  • 25. Circuit diagram of a simple AND-NOT (NAND) logic element (a NAND-universality sketch appears after the transcript)
  • 26. The ZX Spectrum, an 8-bit computer
  • 27. Several Steps of Downloading the Alien AI (AAI): a simple “loader AI” on a simple computer helps us to build the full AI
  • 28. The Alien AI Takes over the World: it uses its own nanobots
  • 29. The Alien AI creates a new beacon and starts to transmit itself across the Universe: a Dyson sphere as the new beacon; it uses all the material of the Solar System to build large transmitters
  • 30. Probability Assessment P(x)=?
  • 31. Most SETI signals will be some kind of SETI-attack: infected civilizations will spend all their resources on sending messages
  • 32. What is the Reason to Start such an Attack? A chain reaction: one attacker is enough; the struggle for power over the galaxy
  • 33. Why would humanity start the AAI? Many radio telescopes exist; the signal could be downloaded multiple times; someone will start it
  • 34. Natural selection of Alien AI: intelligent, effective, aggressive (Charles Darwin)
  • 35. The Alien AI will destroy humanity: less risk of resistance; use of the material of the Earth’s crust
  • 36. My Estimation of Probability: ET exist: 1%; AI as a program is possible: 90%; SETI-attacks are typical: 50%; a SETI-attack will succeed: 50%; a SETI-attack leads to human extinction: 50%; extinction probability from a SETI-attack: 0.12% (the worked product appears after the transcript)
  • 37. Necessary Actions: raise awareness of the problem; change the guidelines for SETI research; consider prohibiting SETI until we have created our own AI
  • 38. Read More: Alexei Turchin, “Is SETI dangerous?”; Richard A. Carrigan Jr., “The Ultimate Hacker: SETI signals may need to be decontaminated”
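
A rough illustration of slide 11's claim about message size, a minimal back-of-the-envelope calculation assuming a purely hypothetical link rate of 1 Mbit/s (the slides do not specify a data rate):

    t = \frac{8 \times 10^{9}\ \text{bits}}{10^{6}\ \text{bits/s}} = 8 \times 10^{3}\ \text{s} \approx 2.2\ \text{hours}

So a gigabyte-scale Seed AI could be broadcast in a matter of hours; the light-travel delay over interstellar distances adds years of latency but does not change the sending time, which depends only on the data rate.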
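
A minimal sketch of the bit-image idea from slide 18: a message whose length is the product of two primes (1679 = 23 x 73 for the Arecibo message) has essentially one natural rectangular layout, so a receiver can recover the picture by reshaping the bit string. The 35-bit payload below is a made-up toy example, not the real Arecibo data.

    # Sketch: recover a 2-D picture from a prime-length bit string (slide 18).
    # The real Arecibo message is 1679 bits, arranged as 73 rows x 23 columns.

    def rectangular_layouts(n):
        """Return all (rows, cols) pairs whose product is n."""
        return [(r, n // r) for r in range(1, n + 1) if n % r == 0]

    def render(bits, rows, cols):
        """Reshape a flat bit string into rows of '#' (1) and '.' (0)."""
        assert len(bits) == rows * cols
        return "\n".join(
            "".join("#" if b == "1" else "." for b in bits[r * cols:(r + 1) * cols])
            for r in range(rows)
        )

    if __name__ == "__main__":
        bits = "1111111" + "0001000" * 4        # 35 bits = 5 x 7, a toy letter "T"
        print(rectangular_layouts(len(bits)))   # only 5 x 7 / 7 x 5 are non-trivial
        print(render(bits, 5, 7))

Because 35 = 5 x 7, only the 5 x 7 layout (or its transpose) produces a coherent picture, which is the same trick the Arecibo designers relied on.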
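
Slide 25 shows a single AND-NOT (NAND) element because NAND is functionally complete: any digital circuit, and therefore the “scheme of a simple computer”, can be built from it alone. A minimal Python sketch of that universality (an illustration, not taken from the slides):

    # Sketch: NOT, AND, OR and XOR composed from the single NAND element of slide 25.

    def nand(a: int, b: int) -> int:
        return 0 if (a and b) else 1

    def not_(a: int) -> int:
        return nand(a, a)

    def and_(a: int, b: int) -> int:
        return not_(nand(a, b))

    def or_(a: int, b: int) -> int:
        return nand(not_(a), not_(b))

    def xor_(a: int, b: int) -> int:
        t = nand(a, b)
        return nand(nand(a, t), nand(b, t))

    if __name__ == "__main__":
        for a in (0, 1):
            for b in (0, 1):
                print(a, b, "AND:", and_(a, b), "OR:", or_(a, b), "XOR:", xor_(a, b))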
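
The worked product behind slide 36, multiplying the stated estimates:

    P(\text{extinction}) = 0.01 \times 0.9 \times 0.5 \times 0.5 \times 0.5 = 0.001125 \approx 0.11\%

which is of the same order as the 0.12% quoted on the slide.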
