Advanced Cyberinfrastructure Enabled Services and Applications in 2021
1. “Advanced Cyberinfrastructure Enabled
Services and Applications
in 2021”
Keynote Presentation
NSF Workshop on Applications and Services in 2021
Washington, DC
January 28, 2016
Dr. Larry Smarr
Director, California Institute for Telecommunications and
Information Technology
Harry E. Gruber Professor,
Dept. of Computer Science and Engineering
Jacobs School of Engineering, UCSD
2. The Cyberinfrastructure of 2021
Will Be Radically Different from 2016
• Wired
• Wireless
• Sensors
• Computer Architecture
• Visualization
3. NSF Has Funded Over 100 Campuses to Build
Local “Big Data” Optical Fiber Freeways (1000x Shared Internet Speed)
Red 2012 CC-NIE Awardees
Yellow 2013 CC-NIE Awardees
Green 2014 CC*IIE Awardees
Blue 2015 CC*DNI Awardees
Purple Multiple Time Awardees
Source: NSF
4. Optical Fibers Linking Big Data Researchers at 10-100Gbps
in U.S., Australia, Korea, Japan, and the Netherlands
5. Global Scientific Instruments Will Produce Ultralarge Datasets Continuously:
New Services and Applications Needed for Data Scientific Discovery
Square Kilometer Array • Large Synoptic Survey Telescope
https://tnc15.terena.org/getfile/1939 • www.lsst.org/sites/default/files/documents/DM%20Introduction%20-%20Kantor.pdf
Tracks ~40B Objects, Creates 10M Alerts/Night Within 1 Minute of Observing
2x40Gb/s
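To put these rates in perspective, here is a back-of-the-envelope sketch in Python; the 10-hour observing night and full utilization of the 2x40Gb/s link are illustrative assumptions, not figures from the slide:

# Rough scale of a sustained 2 x 40 Gb/s instrument link (illustrative assumptions).
link_gbps = 2 * 40                       # the 2x40Gb/s figure above
night_seconds = 10 * 3600                # assumed 10-hour observing night
terabytes_per_night = link_gbps * 1e9 * night_seconds / 8 / 1e12
print(f"~{terabytes_per_night:.0f} TB per night")               # ~360 TB

alerts_per_night = 10e6                  # "10M Alerts/Night" from the slide
print(f"~{alerts_per_night / night_seconds:.0f} alerts/second sustained")  # ~278/s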
6. 5G Will Enable a Wide Range of New Wireless Applications and Services
Connecting People and Things
7. The Jump from Wireless 4G to 5G Will Be Transformative:
ITU’s International Mobile Telecommunication (IMT) System
www.netmanias.com/en/post/blog/7335/5g-kt/key-parameters-for-5g-mobile-communications-itu-r-wp-5d-standardization-status
8. Massive Changes in Our Ability to Quantify the World
Simultaneous Explosive Decrease in Sensor Cost and Increase in Use
www.genome.gov/images/content/costperMb2015_4.jpg
9. New Computing Architectures Are Developing Rapidly
From the End of Scaling Which Drove Moore’s Law
Quantum Realm
• Nanoelectronic Computing
• Approximate Computing
• Quantum Computing
• Brain-Inspired Computing
Graph source: www.iue.tuwien.ac.at/phd/filipovic/node20.html
10. The Future of Supercomputing
“High Performance Computing Will Evolve
Towards a Hybrid Model,
Integrating Emerging Non-von Neumann Architectures,
with Huge Potential in Pattern Recognition,
Streaming Data Analysis,
and Unpredictable New Applications.”
Horst Simon, Deputy Director,
U.S. Department of Energy’s
Lawrence Berkeley National Laboratory
11. Left & Right Brain Computing:
Arithmetic vs. Pattern Recognition
Adapted from D-Wave
12. Massive Public Private Partnership
to Accelerate Brain-Inspired Computers
Jan/Feb 2014
Over $100 Million
13. Brain-Inspired Processors
Are Accelerating the non-von Neumann Architecture Era
“On the drawing board are collections of 64, 256, 1024, and 4096 chips.
‘It’s only limited by money, not imagination,’ Modha says.”
Source: Dr. Dharmendra Modha
Founding Director, IBM Cognitive Computing Group
August 8, 2014
14. Contextual Robots With Neuromorphic Processors That
Can See and Learn Will Tie Into the Planetary Computer
April 2014
15. If You Are Planning New Applications and Services On a Ten Year Horizon,
It Helps to See What Unexpected Change Can Happen In a Decade…
One Decade
www.benphoster.com/wp-content/uploads/2011/07/Facebook-User-Growth-Chart.png
From One Million to One Billion Users
In Less Than 8 Years!
www.thesocialnetwork-movie.com
DEC 2004
16. Collaborating in Virtual Reality at 10Gbps
University Research Frontier Today-What About in Five Years?
EVL
Calit2
Source: NTT Sponsored ON*VECTOR Workshop at Calit2 March 6, 2013
17. Why Would a Social Network Company
Buy a Consumer Virtual Reality Company?
18. One Year Later…
“We're working on VR because
I think it's the next major
computing and communication platform
after phones…”
-Mark Zuckerberg, Facebook CEO
July 1, 2015
19. Examples of Massive Markets That Are Being Disrupted
by a Combination of These Cyberinfrastructure Advances
• Quantified Machines Lead to the Industrial Internet
• Quantified Cars and Drones Lead to Self-Driving Vehicles
• Quantified Houses Lead to the Smart Electric Grid
• Quantified Selves Lead to Personalized Preventive Healthcare
20. The Planetary-Scale Computer Fed by a Trillion Sensors
Will Drive a Global Industrial Internet
www-bsac.eecs.berkeley.edu/frontpagefiles/BSACGrowingMEMS_Markets_%20SEMI.ORG.html
Next Decade: One Trillion Sensors
“Within the next 20 years the Industrial Internet will have added to the global economy an additional $15 trillion.” --General Electric
www.ge.com/docs/chapters/Industrial_Internet.pdf
24. From One to a Trillion Data Points Defining Me in 15 Years:
The Exponential Rise in Body Data
Weight • Blood Variables • Human Genome SNPs • Microbial Genome Time Series
Improving Body • Discovering Disease
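A quick back-of-the-envelope check on what growing from one to a trillion data points in 15 years implies, assuming smooth exponential growth (the smoothness is an assumption; the endpoints are from the slide title):

import math

points, years = 1e12, 15.0
annual_factor = points ** (1 / years)                 # ~6.3x growth per year
doubling_months = 12 * years / math.log2(points)      # data volume doubles every ~4.5 months
print(f"~{annual_factor:.1f}x per year, doubling roughly every {doubling_months:.1f} months")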
25. I Decided to Track My Internal Biomarkers
Just As I Did My External Body
One Blood Draw For Me
Calit2 64 Megapixel VROOM
26. A New Generation of Human Body Sensors
Will Provide Continuous Readouts
Startup MC10 Working With UIUC
27. Consumer Internal Self-Tracking Tools
Are Growing Rapidly
Blood Variable Time Series • Stool Variable Time Series
Human Genetic Variations
Integrated Wellness
Human Microbiome
28. The Emergence of P4 Medicine --
Predictive, Preventive, Personalized, Participatory
Systems Biology & Systems Medicine
Consumer-Driven Social Networks
Digital Revolution & Big Data
P4 MEDICINE
How Will the Quantified Consumer Be Integrated into Healthcare Systems?
Source: Lee Hood, Director, Institute for Systems Biology (ISB)
29. A Vision for Healthcare
in the Coming Decades
Using this data, the planetary computer will be able to build a computational model of your body and compare your sensor stream with millions of others. Besides providing early detection of internal changes that could lead to disease, cloud-powered voice-recognition wellness coaches could provide continual personalized support on lifestyle choices, potentially staving off disease and making health care affordable for everyone.
ESSAY: An Evolution Toward a Programmable Universe
By LARRY SMARR, Published: December 5, 2011
30. Deep Learning Will Provide
Personalized Assistants to Each of Us
Where Personalized Coaching is Now
Where Personalized Coaching is Going
January 10, 2014
31. Is the Release of Google’s TensorFlow
as Transformative as the Release of C?
https://exponential.singularityu.org/medicine/big-data-machine-learning-with-jeremy-howard/
From Programming Computers Step by Step To Achieve a Goal
To Showing the Computer Some Examples of What You Want It to Achieve
and Then Letting the Computer Figure It Out On Its Own
--Jeremy Howard, Singularity Univ., 2015
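A minimal sketch of that shift, written against today's TensorFlow/Keras API rather than the 2015-era graph API the slide refers to; the toy task of recovering y = 3x + 1 from examples is purely illustrative:

import numpy as np
import tensorflow as tf

# Instead of coding the rule y = 3x + 1 step by step, we only show examples of it.
x = np.array([[-1.0], [0.0], [1.0], [2.0], [3.0], [4.0]], dtype=np.float32)
y = 3.0 * x + 1.0

model = tf.keras.Sequential([tf.keras.Input(shape=(1,)),
                             tf.keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")
model.fit(x, y, epochs=500, verbose=0)        # the model infers the rule from the data

print(model.predict(np.array([[10.0]])))      # close to 31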
32. AI is Advancing at an Amazing Pace:
Deep Learning Algorithms Working on Massive Datasets
1.5 Years!
Training on 30M Moves,
Then Playing Against Itself
33. Reverse Engineering of the Brain
Is Accelerating Under the Federal Brain Initiative
www.whitehouse.gov/infographics/brain-initiative
34. Kurzweil’s Theory of Mind:
The Human Neocortex is a Self-Organizing Hierarchical System of Pattern Recognizers
“There are ~300M Pattern Recognizers in the Human Neocortex.”
In the Emerging Synthetic Neocortex, “Why Not a Billion? Or a Trillion?”
November 13, 2012
35. The Defining Issue in IT for the Coming Decades
May 5, 2015 • August 25, 2015
36. This Next Decade’s Computing Transition
Will Not Be Just About Technology
"Those disposed to dismiss
an 'AI takeover' as science
fiction may think again after
reading this original and well-
argued book." —Martin Rees,
Past President, Royal Society
If our own extinction is
a likely, or even
possible, outcome of
our technological
development, shouldn't
we proceed with great
Success in creating AI would be
the biggest event in human
history. Unfortunately, it might
also be the last, unless we learn
how to avoid the risks.
– Steven Hawking