Kim Solez Intro to Tech&Future of Medicine course 5 Sept 2013
Introductory lecture for Technology and Future of Medicine course LABMP 590 on September 5, 2013 at the University of Alberta in Edmonton, Canada. http://www.singularitycourse.com

Presentation Transcript

  • Kim Solez, MD
  • 23andMe: founded by Linda Avey and Anne Wojcicki. Aims to teach people about themselves: carrier status for inheritable diseases, health risks, and drug response.
  • Offers genetic testing for 240+ genetically controlled health conditions and traits, plus ancestral composition testing, DNA relatives, and the 23andMe family tree tool (DNA Spit Kit).
  • Non-mandatory! Participation is completely voluntary. Completely anonymous! No data is shared with the course, and there is no pressure to share your results with anyone. 23andMe is offering an academic discount to anyone who is interested; we estimate 20% off the regular $99 price tag.
  • 1. To outline the basic plan of the course and its expectations, due dates, evaluation criteria, and feedback mechanisms. 2. To begin to introduce the basic concepts of the course, including the technological singularity, exponential change, and existential risk. 3. To outline procedures for the student presentation evenings on November 5th and December 3rd. 4. To outline expectations for the 600-1000 word past-lecture critique due October 15th, the mentor and topic selection by November 14th, and the 3,000-4,000 word final paper due November 26th.
  • 1. Each 80-minute class period on Tuesdays and Thursdays will be divided into a 60-minute lecture and a 20-minute whole-class discussion. Each student will take on a special project of their own, with guidance from the faculty, and present the results of that project in the latter portion of the course. 2. To avoid consuming regular class time with student presentations, the presentations are mostly given on special student presentation evenings scheduled at a time convenient for the students and mixed with food, entertainment, and social events. One such evening is scheduled approximately midway through the semester, on November 5th, so students who wish can give their presentation early. 3. The second student presentation evening is on the last day of class, December 3rd.
  • 1. Students will be evaluated on a presentation on their chosen project in the course (30%), a paper on that project (40%), a critique and analysis of the strengths and weaknesses of a previous lecture in the course (20%), and class participation (10%). The critique is due October 15th and should be 2-3 pages in length (600-1000 words). 2. Students should pick a mentor and final paper topic by November 14th.
  • The technological singularity occurs as artificial intelligences surpass human beings as the smartest and most capable life forms on the Earth. Technological development is taken over by the machines, who can think, act and communicate so quickly that normal humans cannot even comprehend what is going on. The machines enter into a "runaway reaction" of self-improvement cycles, with each new generation of A.I.s appearing faster and faster. From this point onwards, technological advancement is explosive, under the control of the machines, and thus cannot be accurately predicted (hence the term "Singularity"). – Ray Kurzweil
  • Course conceptualized in March 2011, tested with focus groups in May for its suitability as a course for both undergraduate and graduate students.
  • Ten-minute introduction, fifty-minute lecture, twenty-minute discussion. In the course we talk about machines replacing many of the functions of human beings. This picture was taken by a machine without human intervention, as were many of the best still images from the course. The video camera constantly compares the scene against its algorithms and takes still pictures when the requirements of the interesting-picture algorithm are satisfied.
  • First teaching session, 2011; recent teaching session, 2012. Hot-linked tables of contents in the YouTube video descriptions allow one to jump right to the content of interest.
  • Heather Graves, from the Department of English and Film Studies in the Faculty of Arts.
  • CCIS is on the other side of campus for many of us, but it is good exercise to go there, and one often has excellent company making the trip!
  • We endeavor to shoot broadcast-quality video of each lecture and discussion. Previous lectures are on YouTube.com at /user/KimSolez and /user/avoca99. Students are asked to critique one past lecture and suggest improvements to its presentation and hot-linked table of contents (20% of grade). Students write one 3,000-word paper (40%) and give a 20-minute presentation on the same subject (30%); class participation is also graded (10%). There is no required reading; a suggested reading list, distributed by email, is constantly updated.
  • The technological Singularity; existential risks, AI, genomics, and nanotech; ways to optimize a positive outcome for humanity in the co-evolution of humans and machines; and the influence of these considerations on the medicine of the future. Speakers include the Dean of Science and internationally prominent people. Most lectures are not very “medical” and are easily understood. A balanced view is provided by incorporating both technology skeptics and technology advocates.
  • Student numbers more than double every six months? The maximum is 40. A class of 12-14 people seems ideal; with more than that, not all students can hear all presentations, and there is not enough time for questions! [Chart: registered students by term – Winter ’12: 2, Fall ’12: 5, Winter ’13: 12, Fall ’13: 27, Winter ’14: 40, Fall ’14: 40.] (A quick check of the doubling claim appears after the transcript.)
  • Some already know what the technological Singularity is; others don’t and are finding out now. However, if the people and ideas I present are genuinely new and interesting, I should be able to satisfy both groups. The most interesting aspects have to do with the impact on young people today. There is a considerable youth orientation in the course, and the faculty are getting younger and younger.
  • Run like Autodesk Design Night: Best. Salon. Ever., March 1, 2014, hosted by media professional Dr. Julielynn Wong of Singularity U. Analogous to the Paris salons of a century ago, which moved Western thought and culture forward: music, art, good conversation, something unique, innovative, and memorable!
  • How being “inside” or “outside” the Singularity impacts young people is among the important concepts discussed in the course. Nova program on PBS Television (7 million viewers); The Big Bang Theory (the TV show; 20 million viewers); Singularity Summit (9,000 views per video); Kim Solez – Technology and Future of Medicine Course LABMP 590 (1,700 views per video).
  • However, Marcus Hutter suggests that there is an element of human insignificance that makes the whole scenario much more challenging. Hutter has also created a model general AI, which makes the challenge seem more immediate!
  • Outside the Singularity looking in, it will be white noise. Inside the Singularity, if everything speeds up at the same rate, we may not notice anything; it may seem like normal life to us. Even if our biological brains initially count for something in our mental processes, very soon the processing power of machine implants will vastly outstrip them, and our biological brains become insignificant regardless of the friendliness, or lack thereof, of the AI.
  • Extreme risk-taking, because we can restore ourselves from backups if something bad happens. Insignificance and loss of identity: why wait to create backups when we have the processing power to run several lives at once? We can replicate ourselves endlessly in seconds; no more waiting nine months! The world has little incentive to keep identities straight when biological brains contribute so little to mental processes. Bigger, not better. Aimlessness and lack of a sense of purpose.
  • The challenge of producing a friendly AI becomes just a small part of the much larger challenge of creating a friendly world in which humans still have lives of significance and human history is retained and extended. A positive outcome is possible; let’s make it likely. We all need to be engaged in ensuring a positive outcome for humanity. The future is ours to shape, and we need to get busy doing that! A simple approach is needed to engage the general public on these matters; this course is a beginning attempt at achieving that.
  • The challenge of friendly AI becomes just a small part of the much larger challenge of creating a friendly world in which humans still have lives of significance and human history is retained and extended. We all need to be engaged in ensuring a positive outcome for humanity; the future is ours to shape, and we need to get busy doing that! Part of the imagined future could be one where all disease was eliminated but life was intolerable; another, where the only diseases come from bioterrorism.
  • All natural disease may be eliminated, leaving only man-made diseases, but that may leave as much for physicians to do as there is today! Challenging responses to bioterrorism and stem cell technologies. The focus of medicine will no longer be disease but enhancement, which will extend beyond the physical to the moral. Social responsibility is an important aspect of medicine and one of the focuses of the course.
  • “It is the curse of humanity that it learns to tolerate even the most horrible situations by habituation. Physicians are the natural attorneys of the poor, and the social problems should largely be solved by them.” – Rudolf Virchow
  • It became apparent that the best way to make this happen was for me to create a novel course of new design. Thus, this course. Presently, we know of no similar courses being presented elsewhere. Eventually it is our hope that hundreds of similar courses will begin appearing at universities all over the world.
  • Regulatory oversight that is completely focused on compliance discourages risk-taking and innovation. Health care doesn't have the same financial reward system: Facebook isn't about to pay $1 billion for the latest hot-ticket item in imaging and informatics. Security always trumps information sharing, so better, faster linkages are constrained by security concerns, most of which are bogus.
  • [Chart: health apps not used.]
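
As a quick check of the “more than double every six months” claim on the enrollment slide above (not part of the original presentation), here is a minimal Python sketch using the figures transcribed from the chart:

```python
# Registered students per term, taken from the enrollment chart in the transcript.
enrollment = {
    "Winter '12": 2,
    "Fall '12": 5,
    "Winter '13": 12,
    "Fall '13": 27,
    "Winter '14": 40,  # the 40-student maximum is reached here
    "Fall '14": 40,    # still capped, so growth flattens
}

# Compute the term-over-term growth ratio.
terms = list(enrollment)
for prev, curr in zip(terms, terms[1:]):
    ratio = enrollment[curr] / enrollment[prev]
    print(f"{prev} -> {curr}: x{ratio:.2f}")
```

The steps before the cap takes effect grow by factors of 2.50, 2.40, and 2.25, consistent with the slide's claim; the last two steps are limited by the 40-student maximum.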