Risks of SETI
Humanity+ Summit @ Harvard
June 2010

Alexey Turchin
Russian Transhumanist Movement
Foundation “Science for Life Extension”
alexeiturchin@gmail.com

Passive SETI is a much more dangerous activity than messaging to the stars

Two Main Premises
The existence of ET civilizations
The possibility of AI

Exponential Growth of SETI Research
Search for Extra-Terrestrial Intelligence
Stars checked, by decade:
  1960s: 2
  1970s: 100
  1990s: about 5,000
  2010s (projected): millions

1961: Fred Hoyle
“A for Andromeda”
A scheme of a computer is sent through SETI
The resulting AI tries to take over the world

1988: Hans Moravec
“Mind Children”, chapter “A Caveat for SETI”
The “virus behavior” of a SETI-attack
It spreads like an infection through the Universe

2004: Richard Carrigan
“Do potential SETI signals need to be decontaminated?”, Acta Astronautica
The first scientific article about a SETI-attack
A SETI-attack message could contain a trick

2010: Stephen Hawking
Expressed concerns about the risks of contact with aliens

Amount of Information That Can Be Transmitted
Current technologies allow gigabytes of data to be sent over interstellar distances

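For scale, a minimal worked figure (the 1 Mbit/s channel rate is an assumption for illustration, not a number from the talk, and it sets aside the power and antenna requirements of an interstellar link): a gigabyte-sized payload takes only hours of transmission time.

$8 \times 10^{9}\ \text{bits} \;/\; 10^{6}\ \text{bits/s} = 8{,}000\ \text{s} \approx 2.2\ \text{hours}$
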
Size of the Seed AI
 Human genome is less then
 1 gigabyte



 Conscious memory is
 2.5 gigabytes

 Seed AI could have the...
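A rough check of the genome figure, assuming about 3.2 billion base pairs at 2 bits per base:

$3.2 \times 10^{9}\ \text{bp} \times 2\ \text{bits/bp} = 6.4 \times 10^{9}\ \text{bits} \approx 0.8\ \text{GB}$
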
E. Yudkowsky
A Seed AI can evolve extremely quickly
It could easily outsmart humans and take over the world
The risks of AI are underestimated
It is impossible to keep an AI in a “black box”
It is impossible to recognize the dangerous intentions of an AI in advance

A Possible Scenario of a SETI-Attack

ET Create a Beacon in Space
Something like a strange star, to attract attention
(Image: Wikipedia/NASA)

An Information-Transmitting Radio Station near the Beacon

The Information Consists of 0s and 1s
00111100011110001111000
111000111111111111111111
000111111000000111111111
00001110001101111100001

0s and 1s Allow Images to Be Sent

Images Allow Messages and Schemes to Be Sent
The Arecibo message, 1974, Puerto Rico: 23 x 73 bits
It contains basic information about the Earth, DNA, and humans

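As an illustration of why the bit count matters: 1,679 bits factor only as 23 x 73, so a recipient who tries the few possible rectangular arrangements recovers the intended picture. A minimal sketch in Python (the bit string below is a made-up placeholder pattern, not the real Arecibo data):

# Sketch: recover a picture from a raw bit string whose length is a
# product of two primes (as with the 1679-bit Arecibo message, 23 x 73).
def candidate_shapes(n):
    """Return all (rows, cols) pairs whose product is n."""
    return [(r, n // r) for r in range(1, n + 1) if n % r == 0]

def render(bits, rows, cols):
    """Print the bit string as a rows x cols grid of '#' and '.'."""
    for r in range(rows):
        row = bits[r * cols:(r + 1) * cols]
        print(''.join('#' if b == '1' else '.' for b in row))

if __name__ == '__main__':
    bits = ('0110' * 420)[:1679]        # placeholder bit string, length 1679
    print(candidate_shapes(len(bits)))  # -> [(1, 1679), (23, 73), (73, 23), (1679, 1)]
    render(bits, 23, 73)                # one of the two non-trivial layouts
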
The “Pioneer” spacecraft message, 1972

The “Voyager” spacecraft message, teaching math

Copernican Mediocrity Principle: The Earth Is Typical
Therefore, we shall be able to understand aliens

The Possible Trick
“If you make our device, we promise”:
  a Galactic Internet
  immortality
  power over enemies
  etc.

Three Parts of the Message
The bait, or trick
The scheme of a computer
The large program for it

ET Can Send Us a Scheme of a Simple Computer
A Turing machine made from Lego

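The point of the Lego Turing machine is that a universal computer can be specified with almost no assumptions about the recipient's hardware. A toy sketch of how little a machine description needs; the two-state "bit-flipper" rules here are an invented example:

# Sketch of a minimal Turing machine interpreter: a machine is just a
# transition table {(state, symbol): (write, move, next_state)}.
def run(rules, tape, state='A', head=0, halt='HALT', max_steps=1000):
    tape = dict(enumerate(tape))            # sparse tape, default symbol '0'
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = tape.get(head, '0')
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == 'R' else -1
    return ''.join(tape[i] for i in sorted(tape))

# Example machine: flip every bit of the input, halt at the first blank.
flipper = {
    ('A', '0'): ('1', 'R', 'A'),
    ('A', '1'): ('0', 'R', 'A'),
    ('A', ' '): (' ', 'R', 'HALT'),
}
print(run(flipper, '10110 '))   # -> '01001 '
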
The principal electrical scheme of a simple AND-NOT (NAND) logic element

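AND-NOT (NAND) is singled out because it is functionally complete: every other Boolean gate, and therefore an entire processor, can be wired from it alone. A small sketch of the standard constructions:

# NAND is universal: NOT, AND, OR and XOR all reduce to it.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b):
    # the classic four-NAND construction
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, '->', and_(a, b), or_(a, b), xor_(a, b))
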
The ZX Spectrum, an 8-bit computer

Several Steps of Downloading the Alien AI (AAI)
A simple “loader AI” on a simple computer helps us to build the full AI

The Alien AI Takes Over the World
It uses its own nanobots

The Alien AI Creates a New Lighthouse and Starts to Transmit Itself across the Universe
A Dyson sphere serves as the new beacon
It uses all the material of the Solar System to build large transmitters

Probability Assessment
P(x) = ?

Most SETI Signals Will Be Some Kind of SETI-Attack
Infected civilizations will spend all their resources on sending messages

What Is the Reason to Start Such an Attack?
Chain reaction: one attacker is enough
Struggle for power over the galaxy

Why Would Humanity Start the AAI?
Many radio telescopes exist
The signal could be downloaded multiple times
Someone will start it

Natural Selection of Alien AI
Selection favors AIs that are intelligent, effective, and aggressive
(Image: Charles Darwin)

The Alien AI Will Destroy Humanity
Less risk of resistance
Use of the material of the Earth's crust

My Estimation of Probability
ET exist: 1%
AI as a program is possible: 90%
SETI-attacks are typical: 50%
A SETI-attack will succeed: 50%
A SETI-attack leads to human extinction: 50%
Extinction probability from a SETI-attack: 0.12%

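Treating the factors above as independent and simply multiplying them (a simplification) gives a value close to the 0.12% quoted on the slide:

$0.01 \times 0.9 \times 0.5 \times 0.5 \times 0.5 = 0.001125 \approx 0.11\%$
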
Necessary Actions
Raise awareness of the problem
Change the guidelines for SETI research
Consider prohibiting SETI until we have our own AI

Read More
Alexey Turchin, “Is SETI dangerous?”
Richard A. Carrigan, Jr., “The Ultimate Hacker: SETI signals may need to be decontaminated”

SETI Risks - Alexey Turchin - H+ Summit @ Harvard

There are some little-known risks associated with the SETI program (the Search for Extra-Terrestrial Intelligence). One of them is the scenario of a possible vulnerability: downloading a hostile AI with “virus-style” behavior. The proportion of dangerous ET signals to harmless ones may be high because of selection effects and evolutionary pressure.

Alexey Turchin was born in Moscow, Russia in 1973. He studied Physics and Art History at Moscow State University and actively participated in the Russian Transhumanist Movement. He has translated many foreign Transhumanist works into Russian, including works by N. Bostrom and E. Yudkowsky. He is an expert in global risks and wrote the book “Structure of the Global Catastrophe: Risks of Human Extinction in the XXI Century,” as well as several articles on the topic. Since 2010 he has worked at Science Longer Life, where he is writing a book on futurology.

