Robert Rankin, Professor of Physics and Astronomy at the University of Alberta, presented these slides as part of the Cybera Summit 2010 session "The Evolution of Collaborative Science." For more information please visit http://www.cybera.ca/evolution-collaborative-science
Computational Training and Data Literacy for Domain Scientists - Joshua Bloom
This document discusses training domain scientists in computational and data skills. It notes the increasing amount of data in fields like astronomy and challenges of traditional approaches. It advocates teaching skills like statistics, machine learning, and programming. Examples are given of bootcamps, seminars and degree programs in these areas at UC Berkeley taught by CS and statistics faculty. Challenges discussed include fitting such training into formal curricula and ensuring participation from underrepresented groups. The creation of collaborative spaces is proposed to better connect domain scientists with methodological experts to help scientists address the growing role of data in their fields.
Cybera's mission is to spur innovation in Alberta through the use of cyberinfrastructure. It supports this mission by providing peering connections between large ISPs, universities, and companies. Cybera also operates cyberports in several Alberta cities to facilitate collaboration and provides commercial supercomputing and cloud services. It aims to support very big data projects involving oil and gas, genomics, finance, brain research, and more through specialized tools and infrastructure.
Cybera - Leading Edge of Commercialization - A Case Study - CCAT2010 - Cybera Inc.
The document discusses the commercialization of digital radiography technology by IDC from the late 1990s to the present. It describes the technology, timeline of key events from regulatory approvals to rapid growth, struggles with management changes and liquidity issues, and lessons learned around building the right team, securing financing, patents, and knowing when to expand internationally. The technology has seen an 8000x increase in computing power and a shift to easier market entry and regulatory processes over time.
GeoCENS Water and Environmental Hub September 23, 2010 Workshop Presentation Cybera Inc.
The document discusses the creation of the Water and Environmental Hub, which aims to be an open source web platform that connects water and environmental data. It seeks to make data more accessible online to reduce the time and money spent acquiring it, and to enable more integrated management, research, innovation, and economic diversification. The hub would aggregate data from various sources related to surface water, groundwater, climate, and the environment. It would use distributed, scalable cloud computing and standards-based approaches to provide seamless access to the data through multiple portals and tools for target audiences like governments, industry, academia and the public.
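To make the "standards-based, seamless access" idea concrete, a hub like this needs an adapter layer that maps each source's native record shape into one common observation schema before the data is served through shared portals. The sketch below is purely illustrative: the field names, adapter functions, and record shapes are my own assumptions, not part of the Hub's actual design.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One normalized environmental observation record."""
    station: str
    parameter: str   # e.g. "streamflow", "air_temperature"
    value: float
    unit: str

def from_source_a(row):
    """Adapter for a (hypothetical) source that ships dicts."""
    return Observation(row["site"], row["param"], float(row["val"]), row["units"])

def from_source_b(row):
    """Adapter for a (hypothetical) positional CSV-style source."""
    return Observation(row[0], row[1], float(row[2]), row[3])

def aggregate(sources):
    """Merge per-source (adapter, rows) pairs into one uniform list."""
    out = []
    for adapter, rows in sources:
        out.extend(adapter(r) for r in rows)
    return out
```

Once every source passes through such an adapter, downstream portals and tools only ever see one schema, which is the practical payoff of a standards-based approach.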
Wide Area Virtualization: From the Enterprise to the Cloud - Cybera Inc.
DataGardens
Presented at the Cybera/CANARIE National Summit 2009, as part of the session "From iPods to Data Centres: How the Cloud Impacts You." This session showcased a selection of Cybera pilot projects that leverage and demonstrate cloud computing concepts.
Nexus of Science Policy and ICT Policy: Implications and Outcomes - Susan Ba... - Cybera Inc.
Susan Baldwin, Executive Director of Compute Canada, presented these slides as part of the Cybera Summit 2010 session "The Nexus of Science Policy and ICT Policy: Implications and Outcomes". For more information, visit http://www.cybera.ca/nexus-science-policy-and-ict-policy-implications-and-outcomes
Cyberinfrastructure to Support Ocean Observatories - Larry Smarr
05.03.18
Invited Talk to the Ocean Studies Board
National Research Council
Title: Cyberinfrastructure to Support Ocean Observatories
University of California San Diego
The Pacific Research Platform Two Years In - Larry Smarr
This document provides an overview of the Pacific Research Platform (PRP) after two years of operation. It describes several science drivers that are using the PRP, including biomedical research on cancer genomics and microbiomes, earth sciences like earthquake modeling, and astronomy. It highlights how the PRP is connecting sites like UC San Diego, UC Santa Cruz, UC Berkeley to share and analyze large datasets using high-speed networks. The PRP is expanding to support new areas like deep learning, cultural heritage projects, and connecting additional UC campuses through network upgrades.
The document discusses the evolving landscape of semantic technologies and their applications to scientific domains like eScience. It introduces the Tetherless World Constellation, a research group applying semantic web techniques. Examples are given of projects applying semantics to areas like virtual observatories and provenance capture. The value of semantic technologies is discussed for integration, discovery, and validation of scientific data and models. Modular ontologies and semantically-enabled frameworks are presented as important directions for reuse and collaboration.
This document discusses the field of astroinformatics, which applies machine learning algorithms and computational tools to large astronomy datasets. It notes that quasars are extremely luminous celestial objects, located far from Earth, that emit unusually large amounts of energy, and that analyzing the detailed spectra of thousands of quasars with machine learning could help identify anomalous emission patterns. Finally, it predicts that astroinformatics will be crucial for making sense of the huge volumes of data that next-generation telescopes will produce.
LambdaGrids--Earth and Planetary Sciences Driving High Performance Networks a... - Larry Smarr
05.02.04
Invited Talk to the NASA Jet Propulsion Laboratory
Title: LambdaGrids--Earth and Planetary Sciences Driving High Performance Networks and High Resolution Visualizations
Pasadena, CA
The Pacific Research Platform (PRP) is a multi-institutional cyberinfrastructure project that connects researchers across California and beyond to share large datasets. It spans the 10 University of California campuses, major private research universities, supercomputer centers, and some out-of-state universities. Fifteen multi-campus research teams in fields like physics, astronomy, earth sciences, biomedicine, and multimedia will drive the technical needs of the PRP over five years. The goal is to create a "big data freeway" to allow high-speed sharing of data between research labs, supercomputers, and repositories across multiple networks without performance loss over long distances.
LSST/DM: Building a Next Generation Survey Data Processing System - Mario Juric
The document describes the Large Synoptic Survey Telescope (LSST) project. Key points:
- LSST will be an 8.4-meter telescope that will image the entire visible sky every few nights over 10 years to conduct a wide, deep, and fast optical survey.
- The goal is to produce catalogs and images of tens of billions of celestial objects that will be made freely available to the public and scientific community.
- The science goals include mapping the Milky Way, discovering transient objects like supernovae, studying dark energy and dark matter, and conducting a census of the solar system.
- Construction is set to begin in mid-2014 after receiving funding approval, with first light expected around 20
Toward a Global Interactive Earth Observing Cyberinfrastructure - Larry Smarr
The document discusses the need for a new generation of cyberinfrastructure to support interactive global earth observation. It outlines several prototyping projects that are building examples of systems enabling real-time control of remote instruments, remote data access and analysis. These projects are driving the development of an emerging cyber-architecture using web and grid services to link distributed data repositories and simulations.
Astronomical Data Processing on the LSST Scale with Apache Spark - Databricks
The next decade promises to be exciting for both astronomy and computer science, with a number of large-scale astronomical surveys in preparation. One of the most important is the Large Synoptic Survey Telescope, or LSST. LSST will produce the first ‘video’ of the deep sky in history by continually scanning the visible sky, taking one 3.2-gigapixel image every 20 seconds. In this talk we will describe LSST’s unique design and how its image processing pipeline produces catalogs of astronomical objects. To process and quickly cross-match catalog data we built AXS (Astronomy Extensions for Spark), a system based on Apache Spark. We will explain its design and what is behind its cross-matching performance.
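The cross-matching idea behind a system like AXS can be sketched in miniature: partition the sky into fixed-height declination zones so that, for a given match radius much smaller than the zone height, a candidate counterpart can only lie in an object's own zone or an adjacent one. The pure-Python sketch below illustrates the idea only; the function names, the one-arcminute zone height, and the one-arcsecond default radius are my own assumptions, not AXS's actual implementation (which runs the equivalent join distributed over Spark).

```python
import math

ZONE_HEIGHT_DEG = 1.0 / 60.0  # one-arcminute zones (an assumption)

def zone_of(dec_deg):
    """Map a declination (-90..+90 deg) to an integer zone index."""
    return int(math.floor((dec_deg + 90.0) / ZONE_HEIGHT_DEG))

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine formula)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    a = (math.sin((d2 - d1) / 2) ** 2
         + math.cos(d1) * math.cos(d2) * math.sin((r2 - r1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(a)))

def cross_match(cat_a, cat_b, radius_deg=1.0 / 3600.0):
    """Return (i, j) index pairs of objects within radius_deg.

    Each catalog is a list of (ra_deg, dec_deg) tuples. cat_b is
    bucketed by zone, so each cat_a object scans only nearby zones
    instead of the whole catalog. RA wrap-around at 0/360 is
    ignored for brevity.
    """
    buckets = {}
    for j, (_, dec) in enumerate(cat_b):
        buckets.setdefault(zone_of(dec), []).append(j)
    matches = []
    for i, (ra, dec) in enumerate(cat_a):
        z = zone_of(dec)
        for dz in (-1, 0, 1):  # own zone plus neighbours
            for j in buckets.get(z + dz, []):
                rb, db = cat_b[j]
                if angular_sep_deg(ra, dec, rb, db) <= radius_deg:
                    matches.append((i, j))
    return matches
```

The zone bucketing is what turns an all-pairs comparison into a near-linear scan, and the same partitioning maps naturally onto a distributed join.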
The document provides an overview of NASA Ames Research Center, including its history, missions, programs, facilities, and educational activities. It notes that NASA Ames conducts applied research and develops critical technologies to enable NASA missions. Some key areas of focus include space, Earth, and life sciences; astrobiology; small satellites; aviation and aeronautics; exploration systems; and educational outreach. It also summarizes several past and current NASA Ames missions.
This document summarizes NASA's research on small satellites and nanosatellites. It discusses how smaller spacecraft can enable more science missions with lower costs through increased numbers of missions. Smallsats allow for a faster learning cycle and development of new technologies. NASA's Ames Research Center has developed several smallsat platforms and payloads over the past decade for applications in Earth science, heliophysics, planetary science, and astrophysics. These include gene expression, pharmaceutical, and spectroscopy experiments. Ames is working to mature technologies for smallsat missions like advanced components, autonomous operations, and formation flying.
Pacific Research Platform Application Drivers - Larry Smarr
The document summarizes several science driver teams that use the Pacific Research Platform (PRP) for high-speed data transfers between California universities. It discusses projects in biomedical research, earth sciences, particle physics, astronomy, and other fields. Specific examples highlighted include using the PRP to share cancer genomics data between multiple institutions, connect a supercomputer to telescope data, enable virtual reality transfers between universities, and link laboratories studying earthquakes. The PRP is also being expanded to support additional uses like cryo-electron microscopy, cultural heritage databases, and networking in southern California.
The Pacific Research Platform: A Regional-Scale Big Data Analytics Cyberinfra... - Larry Smarr
National Ocean Exploration Forum 2017
Ocean Exploration in a Sea of Data
Calit2’s Qualcomm Institute
University of California, San Diego
October 21, 2017
The document describes Project ELSA, which aims to design and construct a low-cost spherical probe called the NeoPod to collect and transmit scientific data from the surface of Europa. A team of undergraduate students designed the NeoPod over 8 months to fit within a 25cm diameter sphere and weigh less than 10kg. The NeoPod integrates two sensors (a magnetometer and Geiger counter), an avionics package with an FPGA, a communications system, and a power system to operate for 100 hours. Simulation results show the NeoPod could transmit up to 301MB of data if located at Europa's poles. The project aims to demonstrate the feasibility of using low-cost probes to explore Europa and other planetary
This document summarizes the proceedings of the 3rd annual meeting of the NASA Institute for Advanced Concepts (NIAC) held on June 5-6, 2001 at NASA Ames Research Center. It provides an overview of proposals received and awards given by NIAC, as well as summaries of the status reports presented on innovative advanced aerospace concepts. The status reports covered concepts such as a space elevator, robotic planetary explorers, very large space telescopes, and in-situ resource utilization for Mars missions. Keynote speakers discussed visions for the future of aeronautics and space.
This document summarizes new frontiers in astronomy with the use of space telescopes like Hubble and beyond. It discusses the growth of data collection over the past 25 years through improved detectors and instruments. Various space telescopes that collect data across the electromagnetic spectrum are highlighted. Challenges of large astronomical surveys producing terabytes and petabytes of multi-wavelength data over time are discussed. Standards for data formats, coordinate systems, protocols and virtual observatory initiatives are outlined to help address these challenges.
Cyber Summit 2016: Technology, Education, and Democracy - Cybera Inc.
What are the opportunities and the challenges offered by emerging modes of technologically-inflected communication and decision-making? What is our role and responsibility as educators and as developers of research and teaching digital infrastructures? What do students need in the 21st century? As education institutions and providers struggle to respond to the first two questions, are we abrogating our responsibility to the last?
In this talk, Matt Ratto will describe some of the opportunities and challenges we currently face, laying out a model of action for how to address the questions raised above. Core to his thinking are two related points: first, that we must help students develop a greater sense of how the informational world and its attendant infrastructures help shape how and what we think; and second, that a good way to do this is to give students the space to engage in reflexive acts of technological production – what Matt has termed ‘critical making.’ He will provide concrete examples from both his research and his teaching that demonstrate the value and importance of reflexive, hands-on work with digital technologies in helping students develop the critical digital literacy skills they need to function in today’s society.
Matt Ratto is an Associate Professor in the Faculty of Information at the University of Toronto and directs the Semaphore Research cluster on Inclusive Design, Mobile and Pervasive Computing and, as part of Semaphore, the Critical Making lab.
Cyber Summit 2016: Understanding Users' (In)Secure Behaviour - Cybera Inc.
1) The document summarizes a user study on phishing detection that found users still struggle to accurately identify phishing sites, being successful only 53% of the time on phishing sites and 78% on legitimate sites.
2) The study also found that users have only a shallow understanding of security indicators and place more attention on page content than security cues. Nearly half did not recognize a phishing version of their own bank site.
3) The presentation argues that common password policies may do more harm than good by placing unreasonable demands on human memory and behavior. It suggests rethinking such policies and advice to consider human capabilities and provide more practical and beneficial guidance.
More Related Content
Similar to Virtual Observatories as the Drivers of Space Science - Robert Rankin, University of Alberta
Cyber Summit 2016: Insider Threat Indicators: Human Behaviour - Cybera Inc.
Serious threats to private and governmental organizations come not only from the outside world, but also from within. Some employees and contractors with legitimate access to buildings, networks, assets and information deliberately misuse their privileged access to cause harm to their organization. What are the reasons behind their actions? Is it debt, greed, ideology, disgruntlement, or divided loyalty?
Regardless of their motivations or vulnerabilities, insiders who turn against their organizations tend to share similar personality traits and display a recognizable pattern of behaviours before committing an insider incident. As a prevention measure, it is vital that organizations and employees understand, recognize and detect the common indicators of insider threat. Would you recognize the signs?
Mario Vachon is an Insider Threat Security Specialist with the RCMP Departmental Security Branch.
Cyber Summit 2016: Research Data and the Canadian Innovation Challenge - Cybera Inc.
Canada allocates a substantial amount of public funding to research, which is a critical factor in ensuring we remain innovative and competitive. Increasingly this funding is geared to the support and development of digital research infrastructure (DRI), including the underlying networks and the associated data acquisition, storage, analysis and visualization. In order to maximize the benefits of increasingly complex DRI and the research it facilitates, it is important to make sure data is properly stewarded, accessible and reusable. By adopting appropriate approaches to research data management we are better positioned to respond to challenges, such as effectively measuring research impacts, and ensuring the reproducibility, privacy, and security of research outputs.
Research Data Canada (RDC) is a member-driven organization committed to developing a sustainable approach to research data management, one based on interoperability and best practices. This session will provide an update on the efforts of RDC and partner organizations, including CANARIE, Compute Canada, the CARL Portage Network, CASRAI, the Tri-Agencies, and the Leadership Council for Digital Infrastructure. Intersections with international activities and projects will also be highlighted. These efforts are ultimately designed to facilitate a cohesive national approach to research data management, based on a clearly articulated vision for supporting innovation and discovery in Canada.
Mark Leggott is the Executive Director of Research Data Canada.
Cyber Summit 2016: Knowing More and Understanding Less in the Age of Big DataCybera Inc.
The Internet has revolutionized how — and how much — each of us can know. Our digital tools put the knowledge of the world at our fingertips — and soon, maybe, right into our heads. But what kinds of knowledge do our devices give us, and how are they reshaping and challenging the role that education and libraries should play in our lives?
This talk was delivered by Michael Patrick Lynch, professor of philosophy at the University of Connecticut, where he directs the university’s Humanities Institute.
Cyber Summit 2016: Privacy Issues in Big Data Sharing and ReuseCybera Inc.
This document summarizes a presentation on big data and data reuse given by Bart Custers. It discusses:
1) The Eudeco project which examines big data and data reuse from legal, societal, economic, and technological perspectives across multiple European countries.
2) Issues with data sharing and reuse, including potential privacy violations, discrimination, lack of transparency, and unintended consequences from new uses of data or placing it in new contexts.
3) Potential solutions discussed, including privacy impact assessments, privacy by design, and new approaches focusing more on transparency and responsibility than restricting data access and use.
Cyber Summit 2016: Establishing an Ethics Framework for Predictive Analytics ...Cybera Inc.
This document summarizes a presentation about establishing an ethics framework for predictive analytics using student data in higher education. It discusses how technology has enabled more data collection and predictive modeling of student behavior. However, few guidelines exist for these practices. The presentation advocates developing an ethics framework that safeguards student privacy, promotes transparency, considers unintended consequences, and involves consultation. It also examines existing principles and discusses challenges like opaque predictive models that work against students' interests. The presenter argues universities should internalize norms of respecting trust and serving students, not just avoiding legal issues.
Cyber Summit 2016: The Data Tsunami vs The Network: How More Data Changes Eve...Cybera Inc.
Canada’s National Research and Education Network, like other ultra-speed research networks, has evolved to transfer massive amounts of data at 100Gbps and beyond. But with the volume of data traffic growing at more than 50% per year, moving ever-increasing volumes of data is an ongoing challenge. What kinds of applications in research and education are driving this growth? What are the implications of the coming data tsunami for our communication networks? And how must network economics evolve to keep up with demand? CANARIE’s Chief Technology Officer, Mark Wolff, explores these topics and offers insights into how the NREN will evolve to continue to meet the unique needs of Canada’s research and education community.
Cyber Summit 2016: Issues and Challenges Facing Municipalities In Securing DataCybera Inc.
The City of Calgary provides municipal services to 1.1 million people, with 16,000 employees and more than 700 sites and critical infrastructure units. These services represent a $60B asset base, including water and wastewater treatment plants, light rapid transit, emergency services, roads and recreation facilities, with revenue and procurement streams of $4.0B annually. During his tenure, Owen Key, Chief Security Officer and Chief Information Officer for the City, has implemented enterprise systems for CCTV, access and ID control, and physical security information management, and has responsibility for information security.
Cyber Summit 2016: Using Law Responsibly: What Happens When Law Meets Technol...Cybera Inc.
This document summarizes issues at the intersection of law and technology in Canada over the next five years. It discusses debates around lawful access to data, encryption, data retention, and network interception capabilities. Other issues addressed include internet taxes, linking and payments between platforms, VPN use, global orders for content removal, localization requirements, and website blocking. The document argues that as these issues are addressed through law and policy, responsibilities must be met to use law responsibly and consider matters like privacy, oversight, safeguards, and technological implications.
This document summarizes a presentation given by Brian Hamilton on privacy, security, and access to data. It discusses the role of the Office of the Information and Privacy Commissioner of Alberta in overseeing privacy laws and reviewing research proposals. It outlines how the office analyzes information sharing and big data initiatives to ensure privacy is protected. Tips are provided for developing privacy controls and gaining approval, including conducting a privacy impact assessment and developing expertise in privacy principles.
Historically, the University of Alberta lacked a centrally managed repository for reporting data, resulting in inconsistency and disparity in access for units across campus. Meaningful and actionable reports were limited, and only focused on the interests and goals of the few units with data analysts who could synthesize the information.
Over the last couple of years, the University of Alberta has undertaken major changes in how information is managed and utilized. At the forefront of this change has been an increased interest in supporting the development of analytics and supporting tools. Beginning with the implementation of a centrally managed data warehouse with self-service capabilities, and the introduction of cloud services with business process analysis tools, the University is just starting down the road of big data.
This presentation explores opportunities and challenges for the University of Alberta in utilizing big data.
Predicting the Future With Microsoft BingCybera Inc.
The next generation of data scientists will be asked to build predictive models that extract, from very large datasets, inferences that are unobservable at the surface, even to the best domain experts. Microsoft has access to some truly large data sets: web and search data from the Bing search engine, and social data through collaborations with Twitter. In this talk, we show you how a small team of data scientists used this data to build the Bing Predicts engine — a collection of machine-learned predictive models that are beating industry experts at predicting the outcome of events like the Super Bowl, the Oscars, elections and referendums, and even breakthroughs in health sciences. The talk will also give a preview of how organizations can adopt a big data mindset to generate and experiment with large data sets and to make amazing predictions using their own data.
Analytics 101: How to not fail at analyticsCybera Inc.
"Data Scientist" is perhaps the hottest job title of recent years. But what is a data scientist? What does a data scientist actually do? And where can they be found? In this talk, presenter Daniel Haight describes the benefits of analytics to decision-making, and explores the characteristics of successful organizations that have fostered their own team of data scientists.
The MOOC movement is only four years old, but has already had a tremendous impact on teaching and learning. While some of the original hype surrounding MOOCs has not been realized, the reality is that they are here for good and are influencing institutional thinking. This talk will discuss the past, present and future of MOOCs.
While the use of online instructional technologies allows the presentation of theoretical science materials, how do we deal with the fact that such courses often include hands-on labs? Laboratory simulations can only provide a solution for online students in a limited and often artificial way. Nearly 20 years ago, Athabasca University developed a solution to the problem of students having to travel to complete their lab work. Emerging technologies at the time allowed for quantitative physics labs to be sent to students as a small kit. The physics initiative was so successful, with over 5,000 students served, that it was picked up in other fields at Athabasca University.
Over the years, such material has become cheaper, easier to use, and more integrated with modern computers. Athabasca is now pioneering ways to put real labs directly onto the internet. In this session, the methods used to make real lab experiences available to online students will be discussed, and some of them demonstrated.
Canadian municipalities are making great strides when it comes to sharing their data in fun, interactive ways. In this session, presenters will look at cities that are using their data to create useful apps and services for citizens; and describe how all community leaders can get involved to make their municipality more open and accessible.
Data science and the use of big data in healthcare delivery could revolutionize the field by decreasing costs and vastly improving efficiency and outcomes. There is an abundance of healthcare data in Canada, but it is mostly siloed and difficult to access due to privacy and security challenges. This session will offer insights into best practices for healthcare analytics programs, as well as use cases that demonstrate the potential benefits that can be realized through this work.
Checking in on Healthcare Data AnalyticsCybera Inc.
Data science and the use of big data in healthcare delivery could revolutionize the field by decreasing costs and vastly improving efficiency and outcomes. There is an abundance of healthcare data in Canada, but it is mostly siloed and difficult to access due to privacy and security challenges.
Open access and open data: international trends and strategic contextCybera Inc.
Governments around the world fund billions of dollars in research every year. Ensuring that the results of research are available to the public, other researchers and industry has become an important underlying value in order to maximize the impact of our publicly funded research. This session will discuss what’s driving the trend towards greater openness and provide an overview of international developments that will help put Canada’s activities into context.
AppSec PNW: Android and iOS Application Security with MobSFAjin Abraham
Mobile Security Framework - MobSF is a free and open source automated mobile application security testing environment designed to help security engineers, researchers, developers, and penetration testers identify security vulnerabilities, malicious behaviours and privacy concerns in mobile applications using static and dynamic analysis. It supports all the popular mobile application binaries and source code formats built for Android and iOS devices. In addition to automated security assessment, it also offers an interactive testing environment to build and execute scenario-based test/fuzz cases against the application.
This talk covers:
Using MobSF for static analysis of mobile applications.
Interactive dynamic security assessment of Android and iOS applications.
Solving Mobile app CTF challenges.
Reverse engineering and runtime analysis of Mobile malware.
How to shift left and integrate MobSF/mobsfscan SAST and DAST in your build pipeline.
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU and licensing under the CCB and CCX models have been a hot topic in the HCL community since last year. As a Notes or Domino customer, you may be struggling with unexpectedly high user counts and license fees. You may be wondering how this new kind of licensing works and what benefits it brings you. Above all, you surely want to stay within budget and save costs wherever possible. We understand that, and we want to help!
We explain how to resolve common configuration problems that can cause more users to be counted than necessary, and how to identify and remove superfluous or unused accounts to save money. Some practices can also lead to unnecessary expenses, for example using a person document instead of a mail-in database for shared mailboxes. We show you such cases and their solutions. And of course we explain the new license model.
Join this webinar, in which HCL Ambassador Marc Thomas and guest speaker Franz Walder introduce you to this new world. It gives you the tools and know-how to maintain an overview. You will be able to reduce your costs through an optimized Domino configuration and keep them low in the future.
Topics covered:
- Reducing license costs by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how best to use it
- Tips for common problem areas, such as team mailboxes, functional/test users, etc.
- Real-world examples and best practices you can apply immediately
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...Alex Pruden
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol, based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty is ensuring that extracted witnesses are low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
Dandelion Hashtable: beyond billion requests per second on a commodity serverAntonios Katsarakis
This slide deck presents DLHT, a concurrent in-memory hashtable. Despite optimization efforts that go as far as sacrificing core functionality, state-of-the-art hashtable designs still incur multiple memory accesses per request and block request processing in three cases. First, most hashtables block while waiting for data to be retrieved from memory. Second, open-addressing designs, which represent the current state-of-the-art, either cannot free index slots on deletes or must block all requests to do so. Third, index resizes block every request until all objects are copied to the new index. Defying folklore wisdom, DLHT forgoes open-addressing and adopts a fully-featured and memory-aware closed-addressing design based on bounded cache-line-chaining. This design (1) offers lock-free index operations and deletes that free slots instantly, (2) completes most requests with a single memory access, (3) utilizes software prefetching to hide memory latencies, and (4) employs a novel non-blocking and parallel resizing. On a commodity server with a memory-resident workload, DLHT surpasses 1.6B requests per second and provides 3.5x (12x) the throughput of the state-of-the-art closed-addressing (open-addressing) resizable hashtable on Gets (Deletes).
Discover top-tier mobile app development services, offering innovative solutions for iOS and Android. Enhance your business with custom, user-friendly mobile applications.
Skybuffer SAM4U tool for SAP license adoptionTatiana Kojar
Manage and optimize your license adoption and consumption with SAM4U, an SAP free customer software asset management tool.
SAM4U, an SAP complimentary software asset management tool for customers, delivers a detailed and well-structured overview of license inventory and usage with a user-friendly interface. We offer a hosted, cost-effective, and performance-optimized SAM4U setup in the Skybuffer Cloud environment. You retain ownership of the system and data, while we manage the ABAP 7.58 infrastructure, ensuring fixed Total Cost of Ownership (TCO) and exceptional services through the SAP Fiori interface.
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
Northern Engraving | Nameplate Manufacturing Process - 2024Northern Engraving
Manufacturing custom quality metal nameplates and badges involves several standard operations. Processes include sheet prep, lithography, screening, coating, punch press and inspection. All decoration is completed in the flat sheet with adhesive and tooling operations following. The possibilities for creating unique durable nameplates are endless. How will you create your brand identity? We can help!
Fueling AI with Great Data with Airbyte WebinarZilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
Your One-Stop Shop for Python Success: Top 10 US Python Development Providersakankshawande
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
Introduction of Cybersecurity with OSS at Code Europe 2024Hiroshi SHIBATA
I develop the Ruby programming language, RubyGems, and Bundler, which are package managers for Ruby. Today, I will introduce how to enhance the security of your application using open-source software (OSS) examples from Ruby and RubyGems.
The first topic is CVE (Common Vulnerabilities and Exposures). I have published CVEs many times. But what exactly is a CVE? I'll provide a basic understanding of CVEs and explain how to detect and handle vulnerabilities in OSS.
Next, let's discuss package managers. Package managers play a critical role in the OSS ecosystem. I'll explain how to manage library dependencies in your application.
I'll share insights into how the Ruby and RubyGems core team works to keep our ecosystem safe. By the end of this talk, you'll have a better understanding of how to safeguard your code.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/temporal-event-neural-networks-a-more-efficient-alternative-to-the-transformer-a-presentation-from-brainchip/
Chris Jones, Director of Product Management at BrainChip , presents the “Temporal Event Neural Networks: A More Efficient Alternative to the Transformer” tutorial at the May 2024 Embedded Vision Summit.
The expansion of AI services necessitates enhanced computational capabilities on edge devices. Temporal Event Neural Networks (TENNs), developed by BrainChip, represent a novel and highly efficient state-space network. TENNs demonstrate exceptional proficiency in handling multi-dimensional streaming data, facilitating advancements in object detection, action recognition, speech enhancement and language model/sequence generation. Through the utilization of polynomial-based continuous convolutions, TENNs streamline models, expedite training processes and significantly diminish memory requirements, achieving notable reductions of up to 50x in parameters and 5,000x in energy consumption compared to prevailing methodologies like transformers.
Integration with BrainChip’s Akida neuromorphic hardware IP further enhances TENNs’ capabilities, enabling the realization of highly capable, portable and passively cooled edge devices. This presentation delves into the technical innovations underlying TENNs, presents real-world benchmarks, and elucidates how this cutting-edge approach is positioned to revolutionize edge AI across diverse applications.
"Frontline Battles with DDoS: Best practices and Lessons Learned", Igor IvaniukFwdays
In this talk we will discuss DDoS protection tools and best practices, network architectures, and what AWS has to offer. We will also look into one of the largest DDoS attacks on Ukrainian infrastructure, which happened in February 2022, and see what techniques helped to keep web resources available for Ukrainians, and how AWS improved DDoS protection for all customers based on the Ukraine experience.
2. CI projects in Space Science…
CANARIE Network Enabled Platforms (NEP) for Space Science:
- CSSDP (NEP-I): Canadian Space Science Data Portal (www.cssdp.ca)
- CESWP (NEP-II): Cloud Enabled Space Weather Data Assimilation and Modelling Platform (www.ceswp.ca)
Cybera provides overall project management.
3. Project Involvement
Institutions involved in the CSSDP/CESWP projects:
- CANARIE (Network Enabled Platform)
- Cybera (Project Lead)
- CSA (CGSM and e-POP)
- Universities: Alberta, Calgary, Saskatchewan, New Brunswick, Michigan, UCLA, Colorado, Augsburg College, Peking
- Missions: NASA THEMIS, CSA e-POP, CSA ORBITALS
4. Definition…
A Virtual Observatory (VO) encompasses all forms of network tools, databases and websites that are utilized for collaborative research.
From October 2010, NSF will require data management plans as part of all NSF funding proposals.
"This addresses the need for data from publicly-funded research to be made public" (NSF Deputy Director)
5. Space Data challenges…
- Technical innovation means increasingly sophisticated instruments are being proposed and deployed
- Data volumes are growing exponentially: future experiments are expected to generate upwards of 10^15 bytes of data!
- Data management challenges are numerous: data is stored in different formats across heterogeneous computer environments
- Standards, where they exist, are still rapidly evolving
- Appropriately defined metadata is needed to find and access relevant "physical" data (e.g., SPASE)
- Collaboration is key to making advances in space science
6. CSSDP is…
- A "one-stop shop" to discover, gather and visualize relevant data (using CANARIE's high-speed network)
- A gateway to make data available to other researchers
- An environment to host common analysis tools
- A place to collaborate with research teams
- A workflow engine to simplify research tasks
www.cssdp.ca
7.
8. Metadata…
Metadata is a description of data sets or other resources; it allows cataloguing and searching of data.
- CSSDP follows the NASA/SPASE XML standard
- Usually generated from the data file path/name
- Metadata includes: date/time; project, instrument, observatory; data stream
- As a SPASE XML resource, it can then be shared over the internet
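Since the slide notes that metadata is usually generated from the data file path/name, that derivation can be sketched roughly as follows. The directory layout and field names here are hypothetical illustrations, not CSSDP's actual conventions:

```python
from datetime import datetime

def metadata_from_path(path: str) -> dict:
    """Derive SPASE-style metadata fields from a data file path.

    Assumes a hypothetical layout:
    <project>/<observatory>/<instrument>/<stream>/YYYYMMDD_HHMM.dat
    """
    parts = path.strip("/").split("/")
    project, observatory, instrument, stream, filename = parts[-5:]
    stamp = filename.split(".")[0]  # e.g. "20101025_0000"
    return {
        "Project": project,
        "Observatory": observatory,
        "Instrument": instrument,
        "DataStream": stream,
        "StartDate": datetime.strptime(stamp, "%Y%m%d_%H%M").isoformat(),
    }

print(metadata_from_path("cgsm/gillam/magnetometer/fluxgate/20101025_0000.dat"))
```

The appeal of this approach is that providers get cataloguing "for free" as long as they keep their archive layout consistent.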
9. Canada’s Geospace Monitoring Array (CGSM): a Window into the Magnetosphere
The CGSM Array monitors the ionospheric footprint of space weather.
10/25/10
10. CSSDP Data Sources…
(Diagram: data sources feeding the CSSDP Data Store over SFTP/FTP, including e-POP, CHAIN, VMO, CARISMA, SuperDARN, NRC F10.7, THEMIS, CANMOS, NORSTAR, MACCS, and GAIA, provided by e-SOC, UofC, UofA, UCLA, UNB, UofSask, UC Berkeley, the Geological Survey of Canada, and Augsburg College.)
11. Who uses CSSDP?
- Data Providers: make data available to others to use and study
- Researchers: discover, view, download and analyse data from multiple sources
- Collaborators: teams who want to collaborate online in a common, data-integrated environment
12. Researchers
- One-stop shop to discover and download data from multiple sources
- Data availability reports
- Quick-looks and online parameterized plots
- Annotate data
- Automate repetitive tasks (workflows)
- Access data directly from desktop analytics: integration with IDL tools, web services
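The "web services" access path above can be illustrated with a small sketch. The endpoint path and parameter names are hypothetical, since the slides do not document CSSDP's actual web-service API:

```python
from urllib.parse import urlencode

def build_data_query(base: str, instrument: str, start: str, end: str) -> str:
    """Compose a data-download URL for a (hypothetical) web-service endpoint.

    A desktop analysis tool (e.g. an IDL routine) could fetch the resulting
    URL to pull data directly into a local session.
    """
    params = {"instrument": instrument, "start": start, "end": end}
    return f"{base}/data?{urlencode(params)}"

url = build_data_query("https://www.cssdp.ca/ws", "CARISMA", "2010-10-01", "2010-10-02")
print(url)
```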
13. Data Providers
- Make data available when you want, how you want
- Control data access
- Track usage
- Determine how you want your data presented: quick-looks, user-defined graphics, on-demand plots
- Share other analytic tools
14. Collaborators…
CSSDP features an integrated collaboration environment:
- Workspaces: notices, calendars, discussion boards, document uploads, version control
- Public workspaces: project notices, RSS feeds
- Private workspaces: sharable, team collaboration
- Data integration (planned enhancements)
15. Where are we going?…
Sputnik 1 (October 4, 1957 to January 4, 1958)
- No instruments
- Caught everyone by surprise
- The "space race" was on: a battle of political ideologies
16. (the space age)
Sputnik 2 (November 3, 1957 to April 14, 1958)
- Many scientific instruments
- Carried Laika; thermal insulation failed and Laika died after a few hours
- The satellite was enormous and easy to track
17. (the space age)
Explorer 1 (January 31, 1958 to March 19, 1970)
- Several science instruments
- Discovered the radiation belts (confirmed by Explorer 3)
- Established that micrometeorites were not a threat at LEO (100-1000 km, e.g., Space Shuttle)
- William Pickering, James Van Allen, and Wernher von Braun
18. (the space age)
- Yuri Gagarin (1934-1968): April 12, 1961, first human to orbit Earth
- John Glenn (1921-): February 20, 1962, first American to orbit Earth (3 times)
- Neil Armstrong (1930-): July 21, 1969, first human to walk on the Moon
19. Living With a Star…
Living With a Star (LWS): understanding the effects of the Sun on Earth and the solar system.
The Sun is coupled to planetary systems and space through:
- Radiation
- Charged particles
- Electric and magnetic fields
20. The Plasma Universe
99% of visible matter in the universe is in the plasma state.
Plasma: an ionized gas of equal densities of ions and electrons.
21. Who Cares?…
Solar wind-magnetosphere-ionosphere coupling drives "space weather".
Space weather affects space- and ground-based assets in numerous ways.
22. Satellite damage…
Geostationary satellites are affected by space weather:
- Surface charging by keV electrons
- Internal charging by relativistic "killer" electrons (>2 MeV energy)
- Solar flare protons cause phantom commands
23. Radiation Belt Storm Probes…
RBSP: two spacecraft to understand relativistic particle acceleration, transport, and loss. Implemented as the 2nd mission in Living With a Star. Launch: 2012.
- Perigee: ~700 km altitude
- Apogee: ~5.5 Re geocentric
- Inclination: ~10 degrees
- Sun pointing, spin stabilized
- Duration: 2 years (expendables 4 years)
Old view: STATIC. New view: DYNAMIC.
24. UofA ORBITALS Satellite…
- Planned launch 2011-12. Will examine wave-particle interactions in the Van Allen Radiation Belts (cf. NASA RBSP)
- Partnered with NASA "MORE"; will contribute spacecraft instruments
- 12-hour orbit with very long-lasting CGSM-ground and GEO conjunctions
Canada's contribution to LWS and NASA's RBSP Mission.
25. CESWP is…
- An environment to share, run and collaborate on simulation and analysis work
- Involves the creation of a compute cloud that spans Canada and several countries
- Involves moving computer models into the cloud, and making them available
- Not intended to replace entities such as WestGrid
www.ceswp.ca
26.
27. Integration of data and models…
- Simulations using the Space Weather Modeling Framework (SWMF)
- Polar satellite observations of the auroral oval in UVI; the poleward boundary is called the OCFLB
41. CSSDP does the rest…
- CSSDP nightly processes will automatically run and catalogue your data (consume SPASE metadata)
- As new data appears on your site, CSSDP will automatically generate new SPASE XML metadata and register it
- New data streams can be added at any time
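The generate-and-register step could look roughly like the following sketch, which emits a minimal SPASE-like XML record. This is illustrative only: the real SPASE schema (spase-group.org) defines many more required elements, and the element names beyond the general SPASE style are assumptions here:

```python
import xml.etree.ElementTree as ET

def spase_stub(resource_id: str, description: str) -> str:
    """Build a minimal SPASE-like XML record for a newly discovered data file.

    A nightly job would call this for each new file found on a provider's
    site, then register the record with the portal's catalogue.
    """
    spase = ET.Element("Spase")
    num = ET.SubElement(spase, "NumericalData")
    ET.SubElement(num, "ResourceID").text = resource_id
    ET.SubElement(num, "Description").text = description
    return ET.tostring(spase, encoding="unicode")

record = spase_stub("spase://CSSDP/NumericalData/Example", "Demo record")
print(record)
```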
42. NASA THEMIS…
Mission science goals:
Primary: "How do substorms operate?"
- One of the oldest and most important questions in geoscience
- A turning point in our understanding of the dynamic magnetosphere
First bonus science: "What accelerates storm-time 'killer' electrons?"
- A significant contribution to space weather science
Second bonus science: "What controls the efficiency of solar wind-magnetosphere coupling?"
- Provides global context of the solar wind-magnetosphere interaction
(Figure captions: resolving the physics of onset and evolution of substorms; five probes line up to time onset and track energy flow in the tail.)