10.02.19
Invited talk
Symposium #1816, Managing the Exaflood: Enhancing the Value of Networked Data for Science and Society
Title: Advancing the Metagenomics Revolution
San Diego, CA
Viral metagenomics is the study of viral genetic material sourced directly from the environment rather than from a host or natural reservoir. The goal is to ascertain the viral diversity in the environment that is often missed in studies targeting specific potential reservoirs.
High-Throughput Universal Digital High-Resolution Melt Platform for Bacterial... - Benjamin Yang
This document describes the development of a diagnostic device to rapidly identify bacterial strains responsible for sepsis. Key points:
- The device uses droplet digital PCR to amplify and quantify bacterial DNA signatures from patient blood samples. High resolution melt analysis is then used to generate strain-specific melt curves of the 16S rRNA gene.
- A microfluidic chip is engineered to generate stable droplets and thermal gradients allowing acquisition of melt curves. Machine learning algorithms classify melt curves to identify bacterial strains.
- The goal is to identify bacteria in patient blood samples within clinically relevant timeframes, shortening diagnostic time from 12 days to 3-4 hours for targeted antibiotic treatment. Results show that a stable thermal gradient is achieved within the chip.
- The study provides a cost analysis of three next-generation sequencing applications: targeted gene panels (TGP), whole exome sequencing (WES), and whole genome sequencing (WGS). It finds per-sample costs of €333 for TGP, €792 for WES, and €1,669 for WGS.
- Costs are mainly driven by consumables such as sequencing and sample preparation. The estimated $1,000 genome has not been achieved, though it may be approached under best-case assumptions of long-term, efficient use of equipment and considerable cost reductions.
- The choice of sequencing approach in clinical practice should consider both costs and clinical effectiveness, not just costs alone.
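The melt-curve classification step from the sepsis-diagnostic summary above can be sketched as a nearest-reference comparison: an unknown curve is assigned to the strain whose library curve it most closely matches. The reference curves and strain names below are hypothetical placeholders, not data from the study, and a real pipeline would use a trained classifier rather than raw distance:

```python
# Nearest-reference classification of a high-resolution melt (HRM) curve.
# Each curve is a list of fluorescence values sampled along the thermal
# gradient; the reference library maps (hypothetical) strains to curves.

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify_melt_curve(curve, references):
    """Return the strain whose reference melt curve is closest to `curve`."""
    return min(references, key=lambda strain: euclidean(curve, references[strain]))

# Hypothetical reference melt curves (normalized fluorescence vs. temperature).
references = {
    "E. coli":       [1.0, 0.90, 0.60, 0.20, 0.05],
    "S. aureus":     [1.0, 0.95, 0.80, 0.50, 0.10],
    "K. pneumoniae": [1.0, 0.85, 0.40, 0.10, 0.02],
}

sample = [1.0, 0.92, 0.62, 0.22, 0.06]
print(classify_melt_curve(sample, references))  # closest reference: E. coli
```

A machine-learning classifier, as described in the summary, generalizes this idea by learning curve features instead of comparing whole curves directly.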
The document summarizes activities at the California Institute for Telecommunications and Information Technology (Calit2). It describes two new buildings that provide laboratories and support over 1000 researchers working on projects including nanotechnology, virtual reality, and digital cinema. It highlights several projects Calit2 is involved in, such as prototyping extremely high bandwidth applications, borderless collaboration between global research centers, and wireless networks for disaster response.
Metagenomics Over Lambdas: Update on the CAMERA Project - Larry Smarr
07.02.27
Invited Talk
6th Annual ON*VECTOR International Photonics Workshop
Title: Metagenomics Over Lambdas: Update on the CAMERA Project
La Jolla, CA
Overview on Next-Generation Sequencing in Breast Cancer - Seham Al-Shehri
Next-generation sequencing (NGS) and its application to breast cancer research are discussed. NGS allows for comprehensive profiling of microRNA (miRNA) expression, which can provide insights into tumorigenesis pathways. The document outlines the NGS process, including template preparation, sequencing/imaging, and complex data analysis using statistical methods and software. Key applications are identifying miRNA involvement in cancer driver genes and exploring miRNA expression patterns to discover potential diagnostic or prognostic biomarkers for breast cancer subtypes.
Viral Metagenomics (CABBIO 20150629 Buenos Aires) - bedutilh
This is a one-hour lecture about metagenomics, focusing on the discovery of viruses and unknown sequence elements. It is part of a one-day workshop on metagenome assembly of crAssphage, a bacteriophage found in the human gut. The hands-on workflow can be found at http://tbb.bio.uu.nl/dutilh/CABBIO/ and should be doable in one afternoon with supervision. There is also an IPython notebook about this at https://github.com/linsalrob/CrAPy
Application of Whole Genome Sequencing in the infectious disease in vitro di... - ExternalEvents
This document discusses the application of whole genome sequencing in infectious disease diagnostics. It provides examples of how genome sequencing has been used to identify bacterial species, detect antibiotic resistance genes, and study outbreaks. The document also discusses challenges around regulatory approval of genomic tests, data sharing policies, and database management. Overall, it argues that whole genome sequencing is a valuable tool but that standards must be developed to ensure high quality data.
The Global Virome Project is a 10-year global effort to identify and characterize naturally occurring viruses with pandemic potential. It aims to build a comprehensive database of the estimated 1.6 million viral species circulating in mammals and waterfowl. This will allow researchers to develop broad-spectrum countermeasures against future zoonotic viruses and identify high-risk viruses to prevent spillover. The project will sample viruses in 108 sites across 63 countries over 10 years, prioritizing countries and species based on viral discovery rates and zoonotic risk prediction models. The goal is to capture over 85% of the global mammalian virome to transform virology and pandemic preparedness.
Metagenomics is the study of metagenomes, genetic material recovered directly from environmental samples. The broader field has also been referred to as environmental genomics, ecogenomics, or community genomics. Recent studies use "shotgun" Sanger sequencing or next-generation sequencing (NGS) to obtain largely unbiased samples of all genes from all members of the sampled communities.
This study demonstrates the utility of using Next Generation Sequencing (NGS) technology and DNA analysis to identify and analyze closely related insect species and populations. The researchers sequenced DNA from two mitochondrial genes and a nuclear gene from individuals of two closely related fly species, Bactrocera philippinensis and B. occipitalis. They obtained overlapping sequences from these genes that could be assembled into full gene sequences. Their goal is to ultimately sequence the entire genome of multiple individuals to better characterize populations and species through comparative genomic analysis. DNA-based methods provide advantages over traditional taxonomy by requiring less material and being consistent across life stages.
This document discusses various approaches for spatial genomics including direct measurement techniques like RNA fluorescence in situ hybridization (RNA FISH) and multiplexed protein localization, as well as in situ sequencing methods like seqFISH, MERFISH, and fluorescent in situ sequencing (FISSEQ). It also covers computational inference methods that impute spatial gene expression patterns from bulk tissue sequencing data, and the commercial "Spatial Transcriptomics" approach that involves barcoding tissue sections on a slide and sequencing the resulting cDNA library.
Next Generation Sequencing for Identification and Subtyping of Foodborne Pat... - Nathan Olson
"Next Generation Sequencing for Identification and Subtyping of Foodborne Pathogens" presentation at the Standards for Pathogen Identification via NGS (SPIN) workshop hosted by the National Institute of Standards and Technology in October 2014, by Rebecca Lindsey, PhD, of the Enteric Diseases Laboratory Branch of the CDC.
Plant Phenotyping, a new scientific discipline to quantify plant traits - NetNexusBrasil
The document summarizes research on plant phenotyping conducted at the Forschungszentrum Jülich. It describes phenotyping as quantifying plant traits in space and time, including effects of environment and genetics. Methods discussed include automated measurements of shoots and roots, field phenotyping using mini-plots and aerial sensors, and 3D reconstruction of canopies. Examples demonstrate quantifying photosynthesis and measuring various plant traits from airborne platforms to better understand crop responses and gene-environment interactions.
K-mers in metagenomics
K-mers play a critical role in the exploration of metagenomic data. They have been widely used to assign taxonomic attributions to the short genomic fragments characteristic of shotgun (metagenomic) sequencing. These approaches provide an assembly-free method for profiling microbial communities, and have helped elucidate the factors driving microbial community composition across biogeochemical gradients. Advances in sequencing technology are now making it cost-effective to sequence microbial communities at sufficient depths to allow for the assembly of high-quality contigs. This has made it possible to adopt k-mer based approaches to enable reliable binning of contigs originating from a single microbial population within a community. In this session, I will present both an overview of how k-mers can be used to assign taxonomic attributions to short metagenomic reads, and discuss how these approaches have advanced to a point where population genomes can be recovered en masse from even complex microbial communities.
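The k-mer based taxonomic assignment described above can be sketched in a few lines: decompose reference genomes into k-mers, index which taxa contain each k-mer, then let a read's k-mers vote on its origin. The reference fragments and taxon names below are tiny hypothetical stand-ins; production tools use exact-match indexes over millions of genomes:

```python
# Minimal sketch of k-mer based taxonomic assignment: build a k-mer index
# from reference sequences, then assign a read to the taxon sharing the
# most k-mers with it.

def kmers(seq, k):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def build_index(references, k):
    """Map each k-mer to the set of taxa whose reference contains it."""
    index = {}
    for taxon, seq in references.items():
        for km in kmers(seq, k):
            index.setdefault(km, set()).add(taxon)
    return index

def assign(read, index, k):
    """Vote: the taxon hit by the most of the read's k-mers wins."""
    votes = {}
    for km in kmers(read, k):
        for taxon in index.get(km, ()):
            votes[taxon] = votes.get(taxon, 0) + 1
    return max(votes, key=votes.get) if votes else None

references = {  # hypothetical reference fragments
    "Taxon_A": "ATCGATCGATTACA",
    "Taxon_B": "GGCATGCATGGCAT",
}
index = build_index(references, k=4)
print(assign("TCGATCGATT", index, k=4))  # all of this read's 4-mers hit Taxon_A
```

The same k-mer profiles, computed per contig rather than per read, underlie the composition-based binning of population genomes mentioned at the end of the abstract.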
Errors and Limitations of Next Generation Sequencing - Nixon Mendez
This document discusses some key errors and limitations of next-generation sequencing (NGS). It notes that while NGS has significantly reduced costs and improved throughput, it also has some drawbacks compared to previous sequencing technologies. Specifically, it outlines issues related to low quality bases, PCR errors during amplification, and high error rates that can make rare mutations difficult to detect. Limitations include short read lengths that hamper assembly of repetitive regions, contamination risks, incomplete representation of repeats, difficulties assembling segmental duplications and genes fragmented across scaffolds. The document emphasizes the need for validation of genome assemblies and development of hybrid approaches combining long and short reads to overcome these challenges.
The presentation opens with background on big data, mainly metagenomic data, and the hurdles of analyzing it with conventional approaches. It then gives a brief introduction to machine learning approaches, with a biological example for each. It closes with work focused on the implementation of one machine learning approach, Random Forest, for the functional annotation and taxonomic classification of metagenomic data.
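The Random Forest idea mentioned above can be illustrated with a toy from-scratch version: an ensemble of decision stumps, each trained on a bootstrap sample with a randomly chosen feature, voting on a label. The feature vectors here stand in for hypothetical k-mer frequency profiles; a real analysis would use a library implementation such as scikit-learn's `RandomForestClassifier`:

```python
import random

# Toy random forest: each "tree" is a one-feature threshold stump fitted on
# a bootstrap sample; prediction is a majority vote over all stumps.

def train_stump(X, y, rng):
    f = rng.randrange(len(X[0]))               # randomly selected feature
    idx = [rng.randrange(len(X)) for _ in X]   # bootstrap sample (with replacement)
    thresh = sum(X[i][f] for i in idx) / len(idx)
    above = [y[i] for i in idx if X[i][f] > thresh]
    below = [y[i] for i in idx if X[i][f] <= thresh]
    label = lambda ys: max(set(ys), key=ys.count) if ys else y[0]
    return f, thresh, label(above), label(below)

def predict(forest, x):
    votes = [(hi if x[f] > t else lo) for f, t, hi, lo in forest]
    return max(set(votes), key=votes.count)

rng = random.Random(0)
# Hypothetical k-mer frequency profiles and taxon labels.
X = [[0.9, 0.1], [0.8, 0.2], [0.2, 0.9], [0.1, 0.8]]
y = ["Firmicutes", "Firmicutes", "Bacteroidetes", "Bacteroidetes"]
forest = [train_stump(X, y, rng) for _ in range(25)]
print(predict(forest, [0.85, 0.15]))
```

The bootstrap sampling and per-tree feature randomization are what make the ensemble's majority vote more robust than any single stump.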
Biochemistry: A Pivotal Aspect in Forensic Science - VanshikaVarshney5
This presentation covers the importance of biochemistry in forensic science. Biochemistry is not only about chemicals; it is about your life and your environment.
It also covers the biochemical techniques most widely used in forensic science and current research in the field that relates to biochemistry.
This document presents the results of a study analyzing greenhouse gas observations from the ICOS atmospheric network to detect potential signals from reduced emissions during COVID-19 lockdowns in 2020. The study used a transport model (STILT) to simulate CO2, CH4, and CO concentrations and compare them to observations. No significant changes were detected between observed and simulated concentrations that could be attributed to COVID-19. The authors conclude that while the emission drop provided an opportunity to test detection capabilities, signal sizes were small and improved emission models are needed to better represent greenhouse gas observations.
Microarrays can be used for gene expression profiling, comparative genomics, disease diagnosis, drug discovery, and toxicological research. They allow researchers to examine thousands of genes simultaneously and to see changes in gene expression patterns. Microarrays have applications in areas like cancer classification, pharmacogenomics, and toxicogenomics. While a powerful tool, microarrays also have limitations, such as being expensive to create and requiring time to develop.
CD Genomics, experts in DNA and genome sequencing services, genotyping, health diagnostics, bioinformatics, and custom cDNA library development.
Spatial transcriptome profiling by MERFISH reveals sub-cellular RNA compartme... - Jean Fan
The document summarizes research using Multiplexed Error-robust FISH (MERFISH), a technique that uses error-correcting barcodes to enable simultaneous spatial transcriptomic profiling of thousands of genes. MERFISH allows identification of subcellular localization of RNAs and spatial organization of single cell clusters. It also enables derivation of "RNA velocity" to predict the future transcriptional state of cells by discriminating between nuclear and cytoplasmic mRNA. When combined with computational analyses, MERFISH provides high-resolution insights into gene expression patterns within tissues at both the cellular and subcellular levels.
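The error-robust barcoding behind MERFISH can be sketched as nearest-codeword decoding: each gene is assigned a binary barcode read out across imaging rounds, and a measured bit pattern is accepted only if it lies within a small Hamming distance of exactly one codeword. The toy 8-bit codebook below is hypothetical, not the actual code used in MERFISH experiments:

```python
# Toy nearest-codeword decoding for error-robust barcodes: a measured
# on/off pattern across imaging rounds is matched to the gene whose
# codeword is within Hamming distance 1; otherwise it is discarded.

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def decode(measured, codebook, max_dist=1):
    best = min(codebook, key=lambda gene: hamming(measured, codebook[gene]))
    return best if hamming(measured, codebook[best]) <= max_dist else None

codebook = {  # hypothetical 8-bit barcodes, pairwise Hamming distance >= 4
    "GeneA": "11110000",
    "GeneB": "00001111",
    "GeneC": "11001100",
}

print(decode("11110000", codebook))  # exact match -> GeneA
print(decode("11110001", codebook))  # one bit flipped, still decodes to GeneA
print(decode("11011010", codebook))  # too many errors -> None
```

Because the codewords are kept far apart, single imaging errors can be corrected rather than silently misassigning an RNA to the wrong gene.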
Next Generation Sequencing for Identification and Subtyping of Foodborne Pat... - nist-spin
"Next Generation Sequencing for Identification and Subtyping of Foodborne Pathogens" presentation at the Standards for Pathogen Identification via NGS (SPIN) workshop hosted by the National Institute of Standards and Technology in October 2014, by Rebecca Lindsey, PhD, of the Enteric Diseases Laboratory Branch of the CDC.
Cyberinfrastructure for Advanced Marine Microbial Ecology Research and Analys... - Larry Smarr
06.03.13
Invited Keynote
Annual Meeting CENIC 2006
Title: Cyberinfrastructure for Advanced Marine Microbial Ecology Research and Analysis (CAMERA)
Oakland, CA
Creating a Cyberinfrastructure for Advanced Marine Microbial Ecology Research... - Larry Smarr
The document discusses the creation of the Cyberinfrastructure for Advanced Marine Microbial Ecology Research and Analysis (CAMERA) project. CAMERA aims to provide metagenomic sequencing and analysis of marine microbes at high speeds. It will include data from the Sorcerer II expedition and other projects. The document outlines how CAMERA will utilize Calit2's infrastructure including high-performance computing resources and optical networks to enable remote interactive analysis of large-scale genomic and environmental data sets.
The Singularity: Toward a Post-Human Reality - Larry Smarr
06.02.13
Talk to UCSD's Sixth College
Honors Course on Kurzweil's The Singularity is Near
Title: The Singularity: Toward a Post-Human Reality
La Jolla, CA
08.04.14
Invited Talk
National Astrobiology Institute Executive Council Meeting
Astrobiology Science Conference 2008
Santa Clara Convention Center
Title: High Performance Collaboration
Santa Clara, CA
Calit2 - CSE's Living Laboratory for Applications - Larry Smarr
08.05.27
UCSD CSE 91 - Perspectives in Computer Science (Spring 2008)
Calit2@UCSD
Title: Calit2 - CSE's Living Laboratory for Applications
La Jolla, CA
The Emerging Global Collaboratory for Microbial Metagenomics Researchers - Larry Smarr
08.07.30
Invited Talk
Delivered From Calit2@UCSD
Monash University MURPA Lecture
Title: The Emerging Global Collaboratory for Microbial Metagenomics Researchers
Melbourne, Australia
The document discusses emerging trends at the convergence of engineering, biology, physics, and information technology including:
1) LifeChips that merge microelectronics and life sciences allowing medical devices to interface with living systems at the nanoscale.
2) Nanotechnology applications such as nano-structured porous silicon for cancer treatment and nanosensors integrated on a single chip.
3) Building computational models of organisms like E. coli and using optical networks to interactively view genomic data at high resolutions.
4) Global research collaborations enabled by dedicated high-speed optical networks supporting applications like marine metagenomics and digital cinema telepresence.
High Performance Cyberinfrastructure to Support Data-Intensive Biomedical Res... - Larry Smarr
08.06.16
Invited Talk
Association of University Research Parks BioParks 2008
"From Discovery to Innovation"
Salk Institute
Title: High Performance Cyberinfrastructure to Support Data-Intensive Biomedical Research Instruments
La Jolla, CA
The Molecular Imaging Laboratory at Howard University provides state-of-the-art imaging equipment including high resolution MRI systems for small animal and clinical research. The lab aims to train students and foster multidisciplinary research using imaging to study disease processes and investigate new treatments. Areas of research include in vivo MRI and optical imaging of disease models in small animals, as well as molecular imaging of biological processes and developing new imaging probes and nanoparticles.
Global Telepresence in Support of Global Public Health - Larry Smarr
The document discusses Calit2's efforts to develop global telepresence technologies to support public health initiatives. It describes Calit2's work in building a multidisciplinary research network across UC campuses, developing telemedicine systems, and applying technologies like optical networks to enable real-time collaboration and data sharing in fields like genomics, metagenomics, and cellular imaging.
Making Sense of Information Through Planetary Scale Computing - Larry Smarr
Larry Smarr discusses how planetary-scale computing and high-speed networks enable data-intensive research through optical portals. This infrastructure allows remote visualization and analysis of large datasets across multiple sites in real-time. Examples include viewing microbial genomes, cosmological simulations, and remote instrument control. The infrastructure also aims to reduce carbon emissions through more efficient computing.
Calit2 has formed two divisional councils to provide leadership and strategic direction. It has also developed multiple communication channels like brochures and websites. Two new Calit2 buildings at UC San Diego and UC Irvine will provide major new laboratories linked by dedicated optical networks for over 1000 researchers working across disciplines like nanotechnology, biomedicine, and digital arts. Calit2 is also working on applications of high-speed networks for areas like telemedicine, disaster response, and digital cinema.
Calit2: a SoCal UC Infrastructure for Innovation - Larry Smarr
- Calit2 is a research institution established by the University of California to explore how emerging technologies can transform applications and improve quality of life.
- It provides state-of-the-art laboratory facilities for over 1000 researchers at UC San Diego and UC Irvine to conduct collaborative, multidisciplinary research.
- Calit2 partners with over 200 companies on joint research projects, commercialization efforts, and workforce development through internships and fellowships.
My Remembrances of Mike Norman Over The Last 45 Years - Larry Smarr
Mike Norman has been a leader in computational astrophysics for over 45 years. Some of his influential work includes:
- Cosmic jet simulations in the early 1980s, which helped explain the jets observed emanating from galactic centers.
- Pioneering the use of adaptive mesh refinement in the 1990s to achieve dynamic load balancing on supercomputers.
- Massive cosmology simulations in the late 2000s with over 100 trillion particles using thousands of processors across multiple supercomputing sites, producing petabytes of data.
- Developing end-to-end workflows in the 2000s to couple supercomputers, high-speed networks, and large visualization systems to enable real-time analysis of extremely large astrophysics simulations.
Metagenics: How Do I Quantify My Body and Try to Improve Its Health? (June 18, 2019) - Larry Smarr
Larry Smarr discusses quantifying his body and health over time through extensive self-tracking. He measures various biomarkers through regular blood tests and analyzes his gut microbiome by sequencing stool samples. This revealed issues like chronic inflammation and an unhealthy microbiome. Smarr then took steps like a restricted eating window and increasing plant diversity in his diet, which reversed metabolic syndrome issues and correlated with shifts in his microbiome ecology. His goal is to continue precisely measuring factors like toxins, hormones, gut permeability and food/supplement impacts to further optimize his health.
Panel: Reaching More Minority Serving Institutions - Larry Smarr
This document discusses engaging more minority serving institutions (MSIs) in cyberinfrastructure development through regional networks. It provides data showing the importance of MSIs like historically black colleges and universities (HBCUs) in educating underrepresented minority students in STEM fields. Regional networks can help equalize opportunities by assisting MSIs in overcoming barriers to resources through training, networking infrastructure support, and helping institutions obtain necessary staffing and funding. Strategies mentioned include collaborating with MSIs on grants and addressing issues identified in surveys like lack of vision for data use beyond compliance. The goal is to broaden participation in STEAM fields by leveraging the success MSIs have shown in supporting underrepresented students.
Global Network Advancement Group - Next Generation Network-Integrated Systems - Larry Smarr
This document summarizes a presentation on global petascale to exascale workflows for data intensive sciences. It discusses a partnership convened by the GNA-G Data Intensive Sciences Working Group with the mission of meeting challenges faced by data-intensive science programs. Cornerstone concepts that will be demonstrated include integrated network and site resource management, model-driven frameworks for resource orchestration, end-to-end monitoring with machine learning-optimized data transfers, and integrating Qualcomm's GradientGraph with network services to optimize applications and science workflows.
Wireless FasterData and Distributed Open Compute Opportunities and (some) Us... - Larry Smarr
This document discusses opportunities for ESnet to support wireless edge computing through developing a strategy around self-guided field laboratories (SGFL). It outlines several potential science use cases that could benefit from wireless and distributed computing capabilities, both in the short term through technologies like 5G, LoRa and Starlink, and longer term through the vision of automated SGFL. The document proposes some initial ideas for deploying and testing wireless edge computing technologies through existing projects to help enable the SGFL vision and further scientific opportunities. It emphasizes that exploring these emerging areas could help drive new science possibilities if done at a reasonable scale.
The Asia Pacific and Korea Research Platforms: An Overview - Jeonghoon Moon - Larry Smarr
This document provides an overview of Asia Pacific and Korea research platforms. It discusses the Asia Pacific Research Platform working group in APAN, including its objectives to promote HPC ecosystems and engage members. It describes the Asi@Connect project which provides high-capacity internet connectivity for research across Asia-Pacific. It also discusses the Korea Research Platform and efforts to expand it to 25 national research institutes in Korea. New related projects on smart hospitals, agriculture, and environment are mentioned. The conclusion discusses enhancing APAN and the Korea Research Platform and expanding into new areas like disaster and AI education.
Panel: Reaching More Minority Serving Institutions - Larry Smarr
This document discusses engaging more minority serving institutions (MSIs) in the National Research Platform (NRP). It provides data showing that MSIs serve a disproportionate number of underrepresented minority students and are important producers of STEM graduates from these groups. The NRP can help broaden participation in STEAM fields by providing MSIs access to advanced cyberinfrastructure resources, new learning modalities, and opportunities for collaborative research between MSIs and other institutions. Regional networks also have a role to play in helping MSIs overcome barriers and attracting them to collaborative grants. The goal is to tear down walls between research and teaching and reinvent the university experience for more inclusive learning and innovation.
Panel: The Global Research Platform: An Overview - Larry Smarr
The document provides an overview of the Global Research Platform (GRP), an international collaborative partnership creating a distributed environment for data-intensive global science. The GRP facilitates high-performance data gathering, analytics, transport up to terabits per second, computing, and storage to support large-scale global science cyberinfrastructure ecosystems. It aims to orchestrate research across multiple domains using international testbeds for investigating new technologies related to data-intensive science. Examples of instruments generating exabytes of data that would benefit include the Korea Superconducting Tokamak, the High Luminosity LHC, genomics, the SKA radio telescope, and the Vera Rubin Observatory.
Panel: Future Wireless Extensions of Regional Optical Networks - Larry Smarr
CENIC is a non-profit organization that operates an 8,000+ mile fiber optic network connecting over 12,000 sites across California, including K-12 schools, universities, libraries, and research organizations. It has over 750 private sector partners and contributes over $100 million annually to the California economy. CENIC's network enables research and education collaborations, innovation, and economic growth statewide. It also operates a wireless research network called PRP that connects wireless sensors to supercomputers, supporting applications like wildfire modeling.
Global Research Platform Workshops - Maxine Brown - Larry Smarr
The document announces a workshop on global research platforms that will be held virtually in 2021 and in Salt Lake City in 2022, with topics including large-scale science, next-generation platforms, data transport, and international testbeds. It also announces the 4th Global Research Platform Workshop to be held in October 2023 in Limassol, Cyprus co-located with the IEEE eScience 2023 conference.
EPOC and NetSage provide engagement and network monitoring services to support research and education. NetSage collects anonymized network flow data to help understand traffic patterns and troubleshoot performance issues. It provides dashboards and analysis to answer common questions from network engineers and end users. Examples of NetSage deployments and use cases were shown for the CENIC network, including top sources and destinations of traffic, debugging slow flows, and analyzing international traffic patterns by country over time.
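The "top sources and destinations of traffic" question that NetSage dashboards answer can be sketched in a few lines of Python. The flow records and organization names below are hypothetical stand-ins for NetSage's anonymized flow exports, not its actual data model:

```python
from collections import Counter

# Hypothetical anonymized flow records: (src_org, dst_org, bytes_transferred).
# NetSage works from de-identified netflow/sflow exports; these values are made up.
flows = [
    ("Univ A", "NCAR", 9_000_000_000),
    ("Univ A", "CERN", 2_500_000_000),
    ("Univ B", "NCAR", 4_000_000_000),
    ("Univ B", "Univ A", 750_000_000),
]

def top_talkers(flows, n=3):
    """Rank source organizations by total bytes sent."""
    totals = Counter()
    for src, _dst, nbytes in flows:
        totals[src] += nbytes
    return totals.most_common(n)

print(top_talkers(flows))  # [('Univ A', 11500000000), ('Univ B', 4750000000)]
```

The same aggregation, keyed on destination or on (src, dst) pairs, answers the "where is my traffic going" and "debugging slow flows" questions mentioned above.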
The document discusses accelerating science discovery with AI inference-as-a-service. It describes showcases using this approach for high energy physics and gravitational wave experiments. It outlines the vision of the A3D3 institute to unite domain scientists, computer scientists, and engineers to achieve real-time AI and transform science. Examples are provided of using AI inference-as-a-service to accelerate workflows for CMS, ProtoDUNE, LIGO, and other experiments.
Democratizing Science through Cyberinfrastructure - Manish Parashar - Larry Smarr
This document summarizes a presentation by Manish Parashar on democratizing science through cyberinfrastructure. The key points are:
1) Broad, fair, and equitable access to advanced cyberinfrastructure is essential for democratizing 21st century science, but there are significant barriers related to knowledge, technical issues, social factors, and balancing capabilities.
2) An advanced cyberinfrastructure ecosystem for all requires integrated portals, access to local and national resources through high-speed networks, diverse allocation modes, embedded expertise networks, and broad training.
3) Realizing this vision will require a scalable federated ecosystem with diverse capabilities, and incentives for partnerships to meet growing needs for cyberinfrastructure.
Panel: Building the NRP Ecosystem with the Regional Networks on their Campuses - Larry Smarr
This document summarizes a panel discussion on building the National Research Platform ecosystem with regional networks. The panelists discussed how their regional networks are connecting to and using the Nautilus nodes of the NRP. Examples included using NRP for deep learning and computer vision research at the University of Missouri, challenges of adoption in Nevada and potential solutions, and Georgia Tech's new involvement through the Southern Crossroads regional network. The regional networks see opportunities to expand NRP access and training to enable more researchers in their regions to take advantage of the platform.
Open Force Field: Scavenging pre-emptible CPU hours* in the age of COVID - Je... - Larry Smarr
The document discusses Open Force Field (OpenFF), an open-source project that enables rapid development of molecular force fields through automated infrastructure, open data and software, and an open science approach. OpenFF provides access to large quantum chemical datasets, runs quantum chemistry calculations on pre-emptible cloud resources with minimal human intervention, and facilitates easy iteration and testing of new force field hypotheses through an open development model.
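The "pre-emptible cloud resources with minimal human intervention" pattern can be sketched as a checkpoint-and-resume loop. The task structure and in-memory checkpoint below are hypothetical stand-ins, not OpenFF's actual infrastructure:

```python
# Sketch of running an interruptible computation on pre-emptible workers:
# progress is checkpointed so an evicted job resumes instead of restarting.
# Illustration only; the task format and checkpoint store are invented.

class Preempted(Exception):
    """Raised when the cloud provider reclaims the worker."""

def run_steps(task, checkpoint, preempt_at=None):
    """Advance `task` step by step, persisting progress into `checkpoint`."""
    start = checkpoint.get("step", 0)
    for step in range(start, task["n_steps"]):
        if preempt_at is not None and step == preempt_at:
            raise Preempted
        checkpoint["result"] = checkpoint.get("result", 0) + task["work"][step]
        checkpoint["step"] = step + 1
    return checkpoint["result"]

def run_with_retries(task, max_attempts=5):
    checkpoint = {}          # would live in durable storage in practice
    preemptions = iter([2])  # simulate one eviction after step 2
    for _ in range(max_attempts):
        try:
            return run_steps(task, checkpoint, preempt_at=next(preemptions, None))
        except Preempted:
            continue         # a fresh worker picks up from the checkpoint
    raise RuntimeError("exceeded retry budget")

task = {"n_steps": 5, "work": [1, 2, 3, 4, 5]}
print(run_with_retries(task))  # 15: every step done exactly once despite eviction
```

The point of the sketch is that each unit of work is committed before eviction can lose it, which is what makes cheap, reclaimable CPU hours usable without human babysitting.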
Panel: Open Infrastructure for an Open Society: OSG, Commercial Clouds, and B... - Larry Smarr
The document discusses open infrastructure for an open society and the role of commercial clouds. It describes how the National Research Platform (NRP), Open Science Grid (OSG), and Open Science Data Federation (OSDF) provide open infrastructure through open source components that anyone can contribute to and use. It then discusses how Southwestern Oklahoma State University leveraged NRP resources on their campus and engaged students and local teachers. Finally, it outlines the pros and cons of commercial clouds, when they may be suitable to use, and how tools like CloudBank and Kubernetes can help facilitate science users' access to cloud resources.
Frank Würthwein - NRP and the Path forward - Larry Smarr
NRP will replace PRP and aims to democratize access to national research cyberinfrastructure. The long term vision is to create an open national cyberinfrastructure by federating resources across research institutions. Key innovations include an innovative network fabric, application libraries for FPGAs, a "bring your own resource" model, and innovative scheduling and data infrastructure. The NSF has funded the Prototype National Research Platform project to support NRP for the next 5 years. NRP aims to grow resources, introduce new capabilities, and be driven by the research community.
Connector Corner: Seamlessly power UiPath Apps, GenAI with prebuilt connectors - DianaGray10
Join us to learn how UiPath Apps can directly and easily interact with prebuilt connectors via Integration Service--including Salesforce, ServiceNow, Open GenAI, and more.
The best part is you can achieve this without building a custom workflow! Say goodbye to the hassle of using separate automations to call APIs. By seamlessly integrating within App Studio, you can now easily streamline your workflow, while gaining direct access to our Connector Catalog of popular applications.
We’ll discuss and demo the benefits of UiPath Apps and connectors including:
Creating a compelling user experience for any software, without the limitations of APIs.
Accelerating the app creation process, saving time and effort
Enjoying high-performance CRUD (create, read, update, delete) operations for seamless data management.
Speakers:
Russell Alfeche, Technology Leader, RPA at qBotic and UiPath MVP
Charlie Greenberg, host
Northern Engraving | Nameplate Manufacturing Process - 2024 - Northern Engraving
Manufacturing custom quality metal nameplates and badges involves several standard operations. Processes include sheet prep, lithography, screening, coating, punch press and inspection. All decoration is completed in the flat sheet with adhesive and tooling operations following. The possibilities for creating unique durable nameplates are endless. How will you create your brand identity? We can help!
Your One-Stop Shop for Python Success: Top 10 US Python Development Providers - akankshawande
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
5th LF Energy Power Grid Model Meet-up Slides - DanBrown980551
5th Power Grid Model Meet-up
It is with great pleasure that we extend to you an invitation to the 5th Power Grid Model Meet-up, scheduled for 6th June 2024. This event will adopt a hybrid format, allowing participants to join us either through an online Microsoft Teams session or in person at TU/e, located at Den Dolech 2, Eindhoven, Netherlands. The meet-up will be hosted by Eindhoven University of Technology (TU/e), a research university specializing in engineering science & technology.
Power Grid Model
The global energy transition is placing new and unprecedented demands on Distribution System Operators (DSOs). Alongside upgrades to grid capacity, processes such as digitization, capacity optimization, and congestion management are becoming vital for delivering reliable services.
Power Grid Model is an open source project from Linux Foundation Energy and provides a calculation engine that is increasingly essential for DSOs. It offers a standards-based foundation enabling real-time power systems analysis, simulations of electrical power grids, and sophisticated what-if analysis. In addition, it enables in-depth studies and analysis of the electrical power grid’s behavior and performance. This comprehensive model incorporates essential factors such as power generation capacity, electrical losses, voltage levels, power flows, and system stability.
Power Grid Model is currently being applied in a wide variety of use cases, including grid planning, expansion, reliability, and congestion studies. It can also help in analyzing the impact of renewable energy integration, assessing the effects of disturbances or faults, and developing strategies for grid control and optimization.
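As a toy illustration of the kind of steady-state calculation such an engine performs, here is a textbook DC power-flow approximation on a three-bus network in plain Python. This is an independent sketch, not the Power Grid Model API, which has its own structured data model and solvers:

```python
# Toy DC power flow on a 3-bus network: a simplified stand-in for the
# steady-state analysis a grid calculation engine performs.
# Assumptions: lossless lines, flat voltage magnitudes, small angle differences.

lines = {          # (from_bus, to_bus): reactance in per unit
    (0, 1): 0.1,
    (1, 2): 0.1,
    (0, 2): 0.2,
}
injections = {1: -1.0, 2: -0.5}   # net power at non-slack buses (loads negative)

# Build the reduced susceptance matrix for buses 1 and 2 (bus 0 is the slack).
b = {ij: 1.0 / x for ij, x in lines.items()}
B11 = b[(0, 1)] + b[(1, 2)]
B12 = -b[(1, 2)]
B22 = b[(1, 2)] + b[(0, 2)]

# Solve the 2x2 system B * theta = P by Cramer's rule.
det = B11 * B22 - B12 * B12
theta = {0: 0.0}
theta[1] = (B22 * injections[1] - B12 * injections[2]) / det
theta[2] = (B11 * injections[2] - B12 * injections[1]) / det

# Line flows follow from the angle differences across each line.
flows = {ij: b[ij] * (theta[ij[0]] - theta[ij[1]]) for ij in lines}
print(flows)  # per-unit flows; the slack bus supplies the total load of 1.5
```

A real engine adds AC solution methods, losses, voltage levels, and what-if scenarios on top of this basic linear-algebra core.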
What to expect
For the upcoming meetup we are organizing, we have an exciting lineup of activities planned:
-Insightful presentations covering two practical applications of the Power Grid Model.
-An update on the latest advancements in Power Grid Model technology during the first and second quarters of 2024.
-An interactive brainstorming session to discuss and propose new feature requests.
-An opportunity to connect with fellow Power Grid Model enthusiasts and users.
Generating privacy-protected synthetic data using Secludy and Milvus - Zilliz
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
Driving Business Innovation: Latest Generative AI Advancements & Success Story - Safe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application... - Alex Pruden
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty is ensuring that extracted witnesses have low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
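For background on the sumcheck protocol the abstract leans on, here is the generic textbook version over a prime field, sketched as one function playing both prover and verifier. This is not LatticeFold's lattice instantiation, only an illustration of the round-by-round reduction:

```python
import random

# Minimal sumcheck: the prover claims S = sum of a multilinear polynomial f
# over the Boolean hypercube {0,1}^n; each round fixes one variable to a
# random field element, reducing the claim until a single evaluation remains.

P = 2_147_483_647  # prime modulus (2^31 - 1) for the demo field

def sumcheck(evals, rng):
    """`evals` lists the 2^n hypercube evaluations of f. Returns True iff
    every round's consistency check passes (always True for an honest prover)."""
    claim = sum(evals) % P            # the sum the prover asserts
    cur = [e % P for e in evals]
    while len(cur) > 1:
        half = len(cur) // 2
        g0 = sum(cur[:half]) % P      # g_i(0): first free variable fixed to 0
        g1 = sum(cur[half:]) % P      # g_i(1): first free variable fixed to 1
        if (g0 + g1) % P != claim:    # verifier's round check
            return False
        r = rng.randrange(P)          # verifier's random challenge
        claim = ((1 - r) * g0 + r * g1) % P          # claim for the next round
        cur = [((1 - r) * cur[j] + r * cur[j + half]) % P
               for j in range(half)]                 # fold the table at r
    return claim == cur[0] % P        # final check: f at (r_1, ..., r_n)

print(sumcheck([3, 1, 4, 1, 5, 9, 2, 6], random.Random(0)))  # True
```

LatticeFold's contribution is using this reduction machinery to enforce a *norm bound* on extracted witnesses under Ajtai commitments, which the plain field version above does not attempt.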
"Frontline Battles with DDoS: Best practices and Lessons Learned" - Igor Ivaniuk, Fwdays
In this talk we will discuss DDoS protection tools and best practices, network architectures, and what AWS has to offer. We will also look into one of the largest DDoS attacks on Ukrainian infrastructure, which happened in February 2022, and see what techniques helped keep web resources available for Ukrainians and how AWS improved DDoS protection for all customers based on the Ukraine experience.
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
In the realm of cybersecurity, offensive security practices act as a critical shield. By simulating real-world attacks in a controlled environment, these techniques expose vulnerabilities before malicious actors can exploit them. This proactive approach allows manufacturers to identify and fix weaknesses, significantly enhancing system security.
This presentation delves into the development of a system designed to mimic Galileo's Open Service signal using software-defined radio (SDR) technology. We'll begin with a foundational overview of both Global Navigation Satellite Systems (GNSS) and the intricacies of digital signal processing.
The presentation culminates in a live demonstration. We'll showcase the manipulation of Galileo's Open Service pilot signal, simulating an attack on various software and hardware systems. This practical demonstration serves to highlight the potential consequences of unaddressed vulnerabilities, emphasizing the importance of offensive security practices in safeguarding critical infrastructure.
Conversational agents, or chatbots, are increasingly used to access all sorts of services using natural language. While open-domain chatbots - like ChatGPT - can converse on any topic, task-oriented chatbots - the focus of this paper - are designed for specific tasks, like booking a flight, obtaining customer support, or setting an appointment. Like any other software, task-oriented chatbots need to be properly tested, usually by defining and executing test scenarios (i.e., sequences of user-chatbot interactions). However, there is currently a lack of methods to quantify the completeness and strength of such test scenarios, which can lead to low-quality tests, and hence to buggy chatbots.
To fill this gap, we propose adapting mutation testing (MuT) for task-oriented chatbots. To this end, we introduce a set of mutation operators that emulate faults in chatbot designs, an architecture that enables MuT on chatbots built using heterogeneous technologies, and a practical realisation as an Eclipse plugin. Moreover, we evaluate the applicability, effectiveness and efficiency of our approach on open-source chatbots, with promising results.
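To make the idea of a chatbot mutation operator concrete, here is a hedged sketch: a phrase-deletion operator applied to a hypothetical intent model with a deliberately naive matcher. The paper's actual operators and Eclipse-based tooling are more elaborate than this:

```python
import copy

# Sketch of mutation testing for a task-oriented chatbot: delete one training
# phrase per mutant, then check whether the test scenario detects the fault.
# The intent model, matcher, and scenario below are invented simplifications.

intents = {
    "book_flight": ["book a flight", "I need a plane ticket", "fly me to"],
    "cancel": ["cancel my booking", "drop the reservation"],
}

def match_intent(utterance, intents):
    """Naive matcher: first intent with a training phrase in the utterance."""
    for name, phrases in intents.items():
        if any(p.lower() in utterance.lower() for p in phrases):
            return name
    return None

def delete_phrase_mutants(intents):
    """Mutation operator: yield one chatbot variant per deleted phrase."""
    for name, phrases in intents.items():
        for i in range(len(phrases)):
            m = copy.deepcopy(intents)
            del m[name][i]
            yield m

# A scenario "kills" a mutant if the mutant's outcome differs from expected.
scenario = [("I need a plane ticket to Oslo", "book_flight")]
killed = sum(
    any(match_intent(u, m) != expected for u, expected in scenario)
    for m in delete_phrase_mutants(intents)
)
total = sum(len(p) for p in intents.values())
print(f"mutation score: {killed}/{total}")  # 1/5: this scenario is weak
```

The low score is exactly the signal mutation testing provides: this one-utterance scenario exercises a single phrase, so most injected faults go undetected and more scenarios are needed.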
Introduction of Cybersecurity with OSS at Code Europe 2024 - Hiroshi SHIBATA
I develop the Ruby programming language, RubyGems, and Bundler, which are package managers for Ruby. Today, I will introduce how to enhance the security of your application using open-source software (OSS) examples from Ruby and RubyGems.
The first topic is CVE (Common Vulnerabilities and Exposures). I have published CVEs many times. But what exactly is a CVE? I'll provide a basic understanding of CVEs and explain how to detect and handle vulnerabilities in OSS.
Next, let's discuss package managers. Package managers play a critical role in the OSS ecosystem. I'll explain how to manage library dependencies in your application.
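At its simplest, the dependency management described here amounts to computing the transitive closure of a dependency graph. The sketch below uses a made-up package graph; real resolvers such as Bundler also solve version constraints, which this omits:

```python
# Toy illustration of a package manager's core job: compute the full
# transitive dependency set for an application. The graph is invented.

deps = {
    "myapp": ["webframework", "audit-tool"],
    "webframework": ["support-lib", "rack-like"],
    "support-lib": ["tz-data"],
    "rack-like": [],
    "audit-tool": [],
    "tz-data": [],
}

def resolve(pkg, graph, seen=None):
    """Depth-first walk collecting every package `pkg` pulls in."""
    if seen is None:
        seen = set()
    for dep in graph[pkg]:
        if dep not in seen:
            seen.add(dep)
            resolve(dep, graph, seen)
    return seen

print(sorted(resolve("myapp", deps)))
```

The security relevance is the flip side of this walk: a vulnerability anywhere in the resolved set affects the application, which is why lockfiles and audit tooling operate on the transitive closure, not just direct dependencies.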
I'll share insights into how the Ruby and RubyGems core team works to keep our ecosystem safe. By the end of this talk, you'll have a better understanding of how to safeguard your code.
Calit2’s Program in Nano-science, Nano-engineering, and Nano-medicine
1. Calit2’s Program in Nano-science, Nano-engineering, and Nano-medicine. Invited Talk, Review of Nano-cancer Project, April 11, 2006. Dr. Larry Smarr, Director, California Institute for Telecommunications and Information Technologies; Harry E. Gruber Professor, Dept. of Computer Science and Engineering, Jacobs School of Engineering, UCSD.
3. Calit2 Materials and Devices Laboratory: “Nano3” – Science, Engineering, Medicine. The Nano3 facility (Calit2, UCSD) is a 10,000 sq. ft. state-of-the-art materials and devices laboratory. Source: Bernd Fruhberger, Calit2.
7. NanoMedicine Lab – Interface with Cleanrooms. Raith50 e-beam writing system, in operation since March 2006. Example: e-beam nanolithography of a nanodimensional grid in PMMA resist. Nanolithography is used for rapid prototyping of antibody arrays to optimize the size, shape, and spacing of the antibody-coated gold or TiO2 dots. Source: Bernd Fruhberger, Calit2.
9. Adapting High Resolution Displays with Live Instrument Feeds to Cancer Research. Source: David Lee, NCMIR, UCSD. Calit2 OptIPuter Project.
11. Marine Genome Sequencing Project: Measuring the Genetic Diversity of Ocean Microbes. CAMERA will include all Sorcerer II metagenomic data.
12. Calit2’s Direct Access Core Architecture Will Create Next Generation Metagenomics Server. Source: Phil Papadopoulos, SDSC, Calit2. [Architecture diagram: a web portal and web-services user environment reach the CAMERA complex over direct-access lambda connections; the complex comprises a flat-file server farm, a database farm, and a dedicated compute farm (1,000 CPUs) on a 10 GigE fabric, backed by the TeraGrid backplane (10,000s of CPUs); local clusters connect via the web.]