Presentation about our probabilistic coupled inland flood, hurricane wind, and storm surge model. We discuss Harvey losses and SpatialKat runtimes, and show for the first time industry-wide EP curves for all perils, both combined and individually. Highlights also include our climate sensitivity and climate variability modeling, another industry first. Comments welcome.
About the potential of novel, alternative rain sensors, such as telecommunication microwave links (MWL), crowdsensing, or cheap ubiquitous sensors.
On February 29 and March 1, 2016, the Fundación Ramón Areces examined the relationship between 'Big Data and climate change' in a series of sessions. Can Big Data help mitigate climate change? How will massive data analysis contribute to preventing and managing natural disasters? These are just some of the questions the speakers will try to answer. For the climate sciences, Big Data is a very promising tool for addressing the various phenomena associated with climate change.
Blue Waters Enabled Advances in the Fields of Atmospheric Science, Climate, and Weather (inside-BigData.com)
In this deck from the Blue Waters Summit, Susan Bates from NCAR presents: Blue Waters Enabled Advances in the Fields of Atmospheric Science, Climate, and Weather.
For the past five years, the Blue Waters Project has provided an invaluable platform for research in the fields of atmospheric science, climate, and weather. The computationally intensive numerical models running on Blue Waters push the limits of model resolution and/or capability in first-of-their-kind simulations. These projects span the full breadth of spatial and temporal scales, from discrete events such as tropical cyclones and tornadoes, to regional analyses of extreme events, to global-scale research on the effects of climate change. In this talk, we will explore progress in these research areas enabled by Blue Waters and demonstrate why this particular resource has been so important.
Watch the video: https://wp.me/p3RLHQ-iYR
Learn more: https://bluewaters.ncsa.illinois.edu/blue-waters-symposium-2018
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
Backscatter Working Group Software Inter-comparison Project: Requesting and Co... (Giuseppe Masetti)
Backscatter mosaics of the seafloor are now routinely produced from multibeam sonar data and used in a wide range of marine applications. However, significant differences (up to 5 dB) have been observed between the levels of mosaics produced by different software processing the same dataset. This is a major detriment to several possible uses of backscatter mosaics, including quantitative analysis, monitoring seafloor change over time, and combining mosaics. A recently concluded international Backscatter Working Group (BSWG) identified this issue and recommended that “to check the consistency of the processing results provided by various software suites, initiatives promoting comparative tests on common data sets should be encouraged […]”. However, backscatter data processing is a complex (and often proprietary) sequence of steps, so simply comparing end results between software packages provides little information about the root cause of the differences.
In order to pinpoint the source(s) of inconsistency between software packages, it is necessary to understand at which stage(s) of the data processing chain the differences become substantial. We have invited willing software developers to discuss this framework and collectively adopt a list of intermediate processing steps. We provided a small dataset consisting of various seafloor types surveyed with the same multibeam sonar system, using constant acquisition settings and sea conditions, and asked the software developers to generate these intermediate processing results, to be eventually compared. If the experiment proves fruitful, we may extend it to more datasets, software packages, and intermediate results. Eventually, software developers may consider making the results from intermediate stages a standard output, as well as adhering to a consistent terminology, as advocated by Schimel et al. (2018). To date, the developers of four software packages (Sonarscope, QPS FMGT, CARIS SIPS, MB Process) have expressed their interest in collaborating on this project.
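To make the intended comparison concrete, here is a minimal sketch in R of a per-stage level comparison between two tools; the file names, column layout, and stage labels are hypothetical, not an agreed BSWG format.

    # Hypothetical sketch: compare per-stage backscatter levels (dB) exported
    # by two tools for the same soundings, to locate where results diverge.
    stages <- c("raw_level", "tvg_removed", "calibrated", "angle_corrected")

    a <- read.csv("tool_A_stages.csv")   # hypothetical export from tool A
    b <- read.csv("tool_B_stages.csv")   # hypothetical export from tool B

    for (s in stages) {
      d <- a[[s]] - b[[s]]               # per-sounding dB difference
      cat(sprintf("%-16s mean %+.2f dB, sd %.2f dB\n", s, mean(d), sd(d)))
    }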
Open Backscatter Toolchain (OpenBST) Project - A Community-vetted Workflow fo... (Giuseppe Masetti)
Presentation given at the Canadian Hydrographic Conference 2020
Dates: Mon., Feb. 24, 2020 – Thu., Feb. 27, 2020
Location: Quebec City, Canada
Authors: M. Smith, G. Masetti, L. Mayer, M. Malik, J.-M. Augustin, C. Poncelet, I. Parnum
In this project the group members will work with daily rainfall data collected along the Gulf coast (535 stations in total) from 1949 to 2017. The purposes of this exercise are:
1) To give students an idea of a typical climate data set (spatio-temporal data) and some associated scientific questions (e.g., how rainfall extremes vary in space and time, and how that might be affected by other factors such as greenhouse gases or temperatures).
2) To get students familiar with data analysis using R, including data manipulation, data visualization, and data summary.
3) To introduce some statistical methods (e.g., time series analysis, spatial statistics, extreme value analysis) for analyzing this kind of data in order to "answer" (perform statistical inference on) the questions of interest; a minimal R sketch follows below.
Group members: Lin Ge, Jianan Jang, Jessica Robinson, Erin Song, Seth Temple, Adam Wu
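As a flavor of point 3, here is a minimal sketch of an annual-maximum extreme value analysis in R. The file name, column names, and station ID are hypothetical, and extRemes is just one of several packages that fit GEV distributions.

    # Hypothetical sketch: block-maxima (GEV) analysis of daily rainfall.
    # Assumed columns: station, date (YYYY-MM-DD), precip_mm.
    library(extRemes)

    rain <- read.csv("gulf_daily_rainfall.csv")        # hypothetical file
    rain$year <- as.integer(format(as.Date(rain$date), "%Y"))

    # Annual maxima at one (hypothetical) station, 1949-2017
    one  <- subset(rain, station == "STATION_001")
    amax <- aggregate(precip_mm ~ year, data = one, FUN = max)

    # Fit a GEV distribution and estimate return levels
    fit <- fevd(amax$precip_mm, type = "GEV")
    return.level(fit, return.period = c(10, 50, 100))  # e.g. 100-year daily rainfall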
This presentation is from the PBS User Group 2013, as presented by Greg Clifford.
"Cray scalable solutions are optimized for complex simulations, where parallel and massively concurrent applications need access to data, fast. This presentation will cover recent updates from Cray, showing how Cray continues to deliver the largest production supercomputers leveraging Altair PBS Works and HyperWorks software solutions."
Learn more: http://www.altairatc.com/page.aspx?region=na&name=agenda
Watch the presentation video: http://insidehpc.com/2013/10/04/cray-hpc-environments-leading-edge-simulation/
CloudSmartz understands that data is the lifeblood of your business. Making sure that your critical data, applications, and systems are protected and quickly recoverable in the event of a disaster or outage is paramount. CloudSmartz offers a quick and cost-effective way for you to deploy a DR solution.
Leveraging Technology in a Challenging Energy World (Advisian)
INTECSEA's Brian McShane discusses using technology in a challenging energy world, including global warming, renewable energy, commodity pricing, and technology applications for Deepwater and Arctic pipelines.
The Jump to Light Speed - Data Intensive Earth Sciences are Leading the Way to the International LambdaGrid (Larry Smarr)
05.06.14
Keynote to the 15th Federation of Earth Science Information Partners Assembly Meeting: Linking Data and Information to Decision Makers
Title: The Jump to Light Speed - Data Intensive Earth Sciences are Leading the Way to the International LambdaGrid
San Diego, CA
Towards Exascale Simulations for Regional-Scale Earthquake Hazard and Risk (inside-BigData.com)
In this deck from the HPC User Forum in Tucson, David McCallen from LBNL presents: Towards Exascale Simulations for Regional-Scale Earthquake Hazard and Risk.
"With the major advances occurring in high performance computing, the ability to accurately simulate the complex processes associated with major earthquakes is becoming a reality. High performance simulations offer a transformational approach to earthquake hazard and risk assessments that can dramatically increase our understanding of earthquake processes and provide improved estimates of the ground motions that can be expected in future earthquakes. This work will bring together a multidisciplinary team of earth scientists and earthquake engineers from the DOE national laboratory complex to develop advanced computational tools that will take full advantage of emerging, cutting-edge DOE computational platforms."
Watch the video: https://wp.me/p3RLHQ-ioE
Learn more: https://www.exascaleproject.org/advanced-simulations-for-earthquake-risk-assessment/
and
http://hpcuserforum.com
A Campus-Scale High Performance Cyberinfrastructure is Required for Data-Intensive Research (Larry Smarr)
11.12.12
Seminar Presentation
Princeton Institute for Computational Science and Engineering (PICSciE)
Princeton University
Title: A Campus-Scale High Performance Cyberinfrastructure is Required for Data-Intensive Research
Princeton, NJ
In this video from the HPC User Forum at Argonne, Dr. Brett Bode from NCSA presents: Research on Blue Waters.
"Blue Waters is one of the most powerful supercomputers in the world and is one of the fastest supercomputers on a university campus. Scientists and engineers across the country use the computing and data power of Blue Waters to tackle a wide range of challenging problems, from predicting the behavior of complex biological systems to simulating the evolution of the cosmos."
Watch the video: https://wp.me/p3RLHQ-kYx
Learn more: http://www.ncsa.illinois.edu/enabling/bluewaters
and
http://hpcuserforum.com
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
FlinkDTW: Time-series Pattern Search at Scale Using Dynamic Time Warping - Ch... (Flink Forward)
DTW: Dynamic Time Warping is a well-known method for finding patterns within a time series. It can find a pattern even if the data are distorted. It can be used to detect trends in sales, defects in machine signals in industry, in medicine for electrocardiograms, in DNA analysis, and more.
Most implementations are very slow, but a very efficient open-source implementation (best paper, SIGKDD 2012) is available in C. It can easily be ported to other languages, such as Java, so that it can then be used in Flink.
We present how we made some slight modifications so that it can be used with Flink at even greater scale to return the top-K best matches on past data or streaming data.
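For readers unfamiliar with DTW, here is a minimal textbook-style sketch of the core dynamic program in R. It illustrates the recurrence only; the optimized SIGKDD 2012 implementation referenced above adds lower bounds, early abandoning, and other tricks.

    # Classic O(n*m) DTW distance between two numeric series.
    dtw_distance <- function(x, y) {
      n <- length(x); m <- length(y)
      D <- matrix(Inf, n + 1, m + 1)   # cumulative cost matrix
      D[1, 1] <- 0
      for (i in 1:n) {
        for (j in 1:m) {
          cost <- (x[i] - y[j])^2      # local squared distance
          D[i + 1, j + 1] <- cost + min(D[i, j],      # match
                                        D[i, j + 1],  # insertion
                                        D[i + 1, j])  # deletion
        }
      }
      sqrt(D[n + 1, m + 1])
    }

    # A pattern still matches a stretched copy of itself despite different lengths
    pattern <- sin(seq(0, 2 * pi, length.out = 50))
    series  <- sin(seq(0, 2 * pi, length.out = 70))
    dtw_distance(pattern, series)      # small value indicates a match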
Data Center Lessons Learned at an Intel data center. Innovations in cost and energy savings in high-density data centers, including: air economizer, retrofit of a factory building, high-efficiency air-cooled cabinets, and a container data center proof-of-concept.
Big Data, Big Computing, AI, and Environmental Science (Ian Foster)
I presented to the Environmental Data Science group at UChicago, with the goal of getting them excited about the opportunities inherent in big data, big computing, and AI, and of getting them to think about how to collaborate with Argonne in those areas. We had a great and long conversation about Takuya Kurihana's work on unsupervised learning for cloud classification. I also mentioned our work making NASA and CMIP data accessible on AI supercomputers.
40 Powers of 10 - Simulating the Universe with the DiRAC HPC Facility (inside-BigData.com)
In this deck from the Swiss HPC Conference, Mark Wilkinson presents: 40 Powers of 10 - Simulating the Universe with the DiRAC HPC Facility.
"DiRAC is the integrated supercomputing facility for theoretical modeling and HPC-based research in particle physics, astrophysics, cosmology, and nuclear physics, all areas in which the UK is world-leading. DiRAC provides a variety of compute resources, matching machine architecture to the algorithm design and requirements of the research problems to be solved. As a single federated Facility, DiRAC allows more effective and efficient use of computing resources, supporting the delivery of the science programs across the STFC research communities. It provides a common training and consultation framework and, crucially, provides critical mass and a coordinating structure for both small- and large-scale cross-discipline science projects, the technical support needed to run and develop a distributed HPC service, and a pool of expertise to support knowledge transfer and industrial partnership projects. The on-going development and sharing of best practice for the delivery of productive, national HPC services with DiRAC enables STFC researchers to produce world-leading science across the entire STFC science theory program."
Watch the video: https://wp.me/p3RLHQ-k94
Learn more: https://dirac.ac.uk/
and
http://hpcadvisorycouncil.com/events/2019/swiss-workshop/agenda.php
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
Digital Infrastructure in a Carbon Constrained World (Larry Smarr)
09.01.15
Invited Presentation to the
West Coast Leadership Dialogue
Stanford University
Title: Digital Infrastructure in a Carbon Constrained World
Palo Alto, CA
Embracing Aerial Robotics in the Oil and Gas Sector (ManaswiMumbarkar)
Discover how aerial robotics redefines oil and gas practices, enhancing safety and precision. Explore the future of robotic applications in this transformative industry.
Nutraceutical market, scope and growth: Herbal drug technology (Lokesh Patil)
As consumer awareness of health and wellness rises, the nutraceutical market, which includes products such as functional foods, beverages, and dietary supplements that provide health benefits beyond basic nutrition, is growing significantly. The sector is expanding quickly as healthcare costs rise, the population ages, and demand for natural and preventative health solutions increases. Product formulation innovations and the use of cutting-edge technology for personalized nutrition further drive market expansion. With its worldwide reach, the nutraceutical industry is expected to keep growing, offering significant opportunities for research and investment across a number of categories, including vitamins, minerals, probiotics, and herbal supplements.
Nucleophilic Addition of carbonyl compounds.pptx (SSR02)
Nucleophilic addition is the most important reaction of carbonyls: not just aldehydes and ketones, but also carboxylic acid derivatives in general.
Carbonyls undergo addition reactions with a large range of nucleophiles.
Comparing the relative basicity of the nucleophile and the product is extremely helpful in determining how reversible the addition reaction is. Reactions with Grignards and hydrides are irreversible. Reactions with weak bases like halides and carboxylates generally don’t happen.
Electronic effects (inductive effects, electron donation) have a large impact on reactivity.
Large groups adjacent to the carbonyl will slow the rate of reaction.
Neutral nucleophiles can also add to carbonyls, although their additions are generally slower and more reversible. Acid catalysis is sometimes employed to increase the rate of addition.
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL VELOCITY... (Wasswaderrick3)
In this book, we use conservation-of-energy techniques on a fluid element to derive the modified Bernoulli equation of flow with viscous (friction) effects. We derive the general equation of flow/velocity, and from this we derive the Poiseuille flow equation, the transition flow equation, and the turbulent flow equation. Where there are no viscous effects, the equation reduces to the Bernoulli equation. From experimental results, we are able to include additional terms in the Bernoulli equation. We also examine cases where pressure gradients exist, and we use the modified Bernoulli equation to derive flow-rate equations for pipes of different cross-sectional areas connected together. We then extend our energy-conservation techniques to a sphere falling in a viscous medium under gravity, demonstrating Stokes' equation of terminal velocity and the turbulent flow equation. We also look at a way of calculating the time taken for a body to fall in a viscous medium, and at the general equation of terminal velocity.
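For orientation, the standard textbook forms that this description appears to generalize are the energy (Bernoulli) equation with a friction head-loss term and Stokes' terminal velocity; these are the conventional equations, not necessarily the book's exact modified versions:

    \[
      \frac{p_1}{\rho g} + \frac{v_1^2}{2g} + z_1
        = \frac{p_2}{\rho g} + \frac{v_2^2}{2g} + z_2 + h_f ,
      \qquad
      v_t = \frac{2 r^2 g (\rho_s - \rho_f)}{9 \mu}
    \]

where h_f is the viscous (friction) head loss, r the sphere radius, rho_s and rho_f the sphere and fluid densities, and mu the dynamic viscosity. With h_f = 0 the first expression reduces to the classical Bernoulli equation, as the description notes.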
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx (MAGOTI ERNEST)
Although Artemia has been known to man for centuries, its use as a food for the culture of larval organisms apparently began only in the 1930s, when several investigators found that it made an excellent food for newly hatched fish larvae (Litvinenko et al., 2023). As aquaculture developed in the 1960s and '70s, the use of Artemia also became more widespread, due both to its convenience and to its nutritional value for larval organisms (Arenas-Pardo et al., 2024). The fact that Artemia dormant cysts can be stored for long periods in cans, and then used as an off-the-shelf food requiring only 24 h of incubation, makes them the most convenient, least labor-intensive live food available for aquaculture (Sorgeloos & Roubach, 2021). The nutritional value of Artemia, especially for marine organisms, is not constant, but varies both geographically and temporally. During the last decade, however, both the causes of Artemia nutritional variability and methods to improve poor-quality Artemia have been identified (Loufi et al., 2024).
Brine shrimp (Artemia spp.) are used in marine aquaculture worldwide. Annually, more than 2,000 metric tons of dry cysts are used for the cultivation of fish, crustacean, and shellfish larvae. Brine shrimp are important to aquaculture because newly hatched brine shrimp nauplii (larvae) provide a food source for many fish fry (Mozanzadeh et al., 2021). Culture and harvesting of brine shrimp eggs represent another aspect of the aquaculture industry. Nauplii and metanauplii of Artemia, commonly known as brine shrimp, play a crucial role in aquaculture due to their nutritional value and suitability as live feed for many aquatic species, particularly in larval stages (Sorgeloos & Roubach, 2021).
Phenomics assisted breeding in crop improvement (IshaGoswami9)
As the population increases and approaches about 9 billion by 2050, and with climate change, it is difficult to meet the food requirements of such a large population. Facing the challenges presented by resource shortages, climate change, and a growing global population, crop yield and quality need to be improved in a sustainable way over the coming decades. Genetic improvement by breeding is the best way to increase crop productivity. With the rapid progression of functional genomics, an increasing number of crop genomes have been sequenced and dozens of genes influencing key agronomic traits have been identified. However, current genome sequence information has not been adequately exploited for understanding the complex characteristics of multiple genes, owing to a lack of crop phenotypic data. Efficient, automatic, and accurate technologies and platforms that can capture phenotypic data that can be linked to genomic information for crop improvement at all growth stages have become as important as genotyping. Thus, high-throughput phenotyping has become the major bottleneck restricting crop breeding. Plant phenomics has been defined as the high-throughput, accurate acquisition and analysis of multi-dimensional phenotypes during crop growing stages at the organism level, including the cell, tissue, organ, individual plant, plot, and field levels. With the rapid development of novel sensors, imaging technology, and analysis methods, numerous infrastructure platforms have been developed for phenotyping.
ANOMALOUS SECONDARY GROWTH IN DICOT ROOTS.pptx (RASHMI M G)
This presentation covers abnormal or anomalous secondary growth in plants. It defines secondary growth as an increase in plant girth due to vascular cambium or cork cambium. Anomalous secondary growth does not follow the normal pattern of a single vascular cambium producing xylem internally and phloem externally.
1. KatRisk US Flood Model
London, November 2017
KatRisk LLC
752 Gilman St.
Berkeley, CA 94710
510-984-0056
www.katrisk.com
KatRisk Deutschland GmbH
Wilhelmstr. 6
79098 Freiburg, Germany
0761-5146-7600
2. KatRisk USA Flood Model Highlights
Correlated flood, TC wind (Atlantic and Pacific), and storm surge models
10m inland flood and storm surge hazard resolution
2d hydraulic model with no lower threshold for catchment size (all locations can flood)
50k years of correlated simulations for all perils
Global sea surface temperature correlation of cat models across continents
Flexible model execution architecture to investigate model assumptions and sensitivities (climate change, correlations to global teleconnections)
[Slide figures: Harvey Event Footprint, Stochastic Hurricane Tracks, Storm Surge Score]
3. KatRisk USA Flood, Wind, Storm Surge AAL
Ran the test portfolio (from this workshop) and the KatRisk Industry Exposure for hurricane wind, storm surge, and inland flood.
The test portfolio represents about 1% of value compared to the KatRisk Industry Exposure, but different LOBs, building types, construction, etc. make the comparison not 1:1.
Result: the KatRisk inland flood GU AAL is about as large as the combination of Atlantic hurricane wind and storm surge losses.
About 18% of inland flood losses are caused by tropical cyclones.
Exposure / Peril       | Flood GU         | Storm Surge GU  | Hurricane Wind GU
Model Comparison AAL   | $207 Million     | $49 Million     | $192 Million
Industry Loss AAL      | $15–$20 Billion  | $4–$6 Billion   | $12–$16 Billion
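For context on these metrics: the AAL is the mean of the simulated annual losses, and an EP (exceedance probability) curve assigns each simulated annual loss the fraction of years in which it is exceeded. A minimal sketch in R, using a placeholder loss simulation rather than actual model output:

    # Sketch: AAL and an aggregate EP curve from simulated annual losses.
    # 'annual_loss' stands in for one loss per simulated year (50k years).
    set.seed(1)
    annual_loss <- rgamma(50000, shape = 0.5, scale = 2e6)  # placeholder only

    aal <- mean(annual_loss)                   # average annual loss

    # Empirical exceedance probabilities over the sorted losses
    sorted <- sort(annual_loss, decreasing = TRUE)
    ep     <- seq_along(sorted) / (length(sorted) + 1)

    # Return-period loss, e.g. the 1-in-100-year aggregate loss
    rp100 <- sorted[which.min(abs(ep - 1/100))]
    cat(sprintf("AAL: %.0f   100-year loss: %.0f\n", aal, rp100))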
4. KatRisk SpatialKat Software Features
Benchmark: 1.2 million location Ariel Re test portfolio, GU & GR, 10 samples, 50k years

Runtime                            | Inland Flood | Storm Surge | TC Wind
25 cores (2x Xeon E5-2690 v4)      | 4 minutes    | 1 minute    | 4 minutes
4 cores (Intel i7-7700k) = Desktop | 20 minutes   | 5 minutes   | 19 minutes
Enables cat modeling on laptop / desktop / server / cloud
– Open-source-based, efficient C++ and R engine with a memory-efficient kernel (< 1 GB)
– Location-aware, repeatable, sparse antithetic Latin hypercube sampling enables coherent measures of risk with fast loss convergence (2 to 1000 samples); see the sampling sketch at the end of this deck
– View and change hazard, vulnerability, and location-level and regional flood defenses
– User-specified spatial secondary uncertainty correlation with KatRisk defaults
– Easy file formats to implement OASIS-type and other cat models
True multi-peril: wind, storm surge, and inland flood. Flexible inuring order of contracts, multiple policy files (by peril), with high-performance in-memory execution
Financial contracts up to Net Pre Cat Loss
Requires > 50 MB/sec read/write speed per core; a SATA III SSD is sufficient for up to 10 cores
Low run-times allow performing multiple runs and sensitivity studies even on a local laptop / desktop. Hosted API for location loss analytics.
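As referenced in the sampling bullet above, here is a minimal sketch of antithetic Latin hypercube sampling in R. It illustrates the general variance-reduction technique only, under our own naming; it is not SpatialKat's actual sampler.

    # Antithetic Latin hypercube sample of n uniform draws.
    # Each stratum [(i-1)/n, i/n) receives one jittered draw; pairing u with
    # 1 - u gives antithetic variates, reducing variance for monotone losses.
    antithetic_lhs <- function(n, seed = 42) {
      set.seed(seed)                            # repeatable, as on the slide
      u <- (sample(n) - 1 + runif(n)) / n       # one draw per stratum, shuffled
      cbind(u = u, u_anti = 1 - u)              # antithetic pair
    }

    samples <- antithetic_lhs(10)
    # Passing these through an inverse damage-ratio CDF (not shown) would yield
    # paired secondary-uncertainty draws with fast loss convergence.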