Presented at the 2017 RTM Conference: common sampling practices skew environmental data, understating or misrepresenting the contaminant mass, nature and extent, and the remediation program. Correcting for such skew can better match a site's "true" conditions and reduce "surprises", especially environmental insurance claims.
Ohio Brownfields Conference 2016: Environmental Baselining Tools - Joseph Berlin
To supplement traditional voluntary cleanup programs, environmental baselining adds tools to mitigate environmental risk and facilitate transactions and lending.
2015 National Tanks Conference - Forensic Engineering/Environmental Forensics - Joseph Berlin
Joseph Berlin, BLDI President and chief forensic investigator, was a presenter at the New England Interstate Water Pollution Control Commission's National Tanks Conference and Expo in Phoenix, AZ, on September 15, 2015.
Joe provided a discussion on forensic engineering and environmental forensics (FE/EF) focused on their use and trends affecting underground storage tank (UST) sites. The purpose of his session was to present a different understanding of the Conceptual Site Model (CSM) and how the Component Failure Analysis (CFA) integrates into the overall site CSM. A recent case was presented involving the use of FE/EF methods to demonstrate Environmental Baselining. This case utilized CFA, amended CSM, specific analytical methods (Ratio, PIANO, Isotopic) and data analysis techniques to address environmental risk, insurance and transactional due diligence issues. Current Baselining practices and one Baselining practice in development (Arizona) were discussed.
Benefits of Comprehensive Environmental Due Diligence - paulhhayden
This document summarizes the benefits of comprehensive environmental due diligence, presented by Geo-Technology Associates, Inc. It discusses how incomplete due diligence can lead to inaccurate appraisals, unknown risks, failed deals, unhappy clients and lenders, lost revenue, and lost opportunities. It then presents two case studies highlighting how comprehensive due diligence helped mitigate risks and costs through additional assessments, regulatory coordination, and remedial actions.
The document provides guidance on managing asbestos in buildings. It discusses the health risks of asbestos exposure and the poor historical management of asbestos, and introduces new survey types to better guide asbestos management. Surveyors must be competent and surveys must be thorough, identifying all asbestos present before any refurbishment works. Asbestos management requires proper planning, record keeping, and removal of asbestos by licensed contractors to avoid the health risks and liability issues that arise from improper asbestos handling.
Presentation by: Pat Coyne
Fall 2013 DDD Tour
One key area of revision in E 1527-13 clarifies when a review of agency files should be conducted during a Phase I environmental site assessment. For some consultants, the clarifying language on AFRs may raise the bar of what is typically conducted during a Phase I ESA. Agency file reviews can bring valuable information to a Phase I ESA investigation, but are not without their challenges. Depending on the target property involved, AFRs can vary in complexity, obtainability, location of files and level of effort. When the revised E 1527 standard takes effect later this year, you need to ensure that your reports adequately reflect the new language about when an agency file review should be conducted—and what to do if one cannot be conducted. This timely track will address the concerns and areas of confusion raised about the new requirements, including:
What’s changing?
Current industry practice in performing AFRs
Differences from state to state in terms of availability of files, travel time required, responsiveness of agencies, and inconsistent/incomplete data
Determining when an agency file review is and is not “reasonably ascertainable”
Adequately pricing Phase I ESAs given the significant variability in the level of effort required
How best to educate your clients about the new language
Evolving Lifecycles with High Resolution Site Characterization (HRSC) and 3-D... - Joshua Orris
The incorporation of a 3DCSM and completion of HRSC provided a tool for enhanced, data-driven decisions to support a change in remediation closure strategies. An approved pilot study is currently in place to shut down the remediation systems (ISCO, P&T) and conduct a hydraulic study under non-pumping conditions. A separate microbiological bench-scale treatability study was completed and yielded positive results for an emerging innovative technology; as a result, a field pilot study has commenced, with results expected in nine to twelve months. The results of the hydraulic study, the field pilot studies, and an updated risk assessment are expected to lead site monitoring optimization, with lifecycle cost savings upwards of $15MM, toward an evolved best-available-technology remediation closure strategy.
Download the Latest OSHA 10 Answers PDF : oyetrade.com - Narendra Jayas
Latest OSHA 10 test questions and answers PDF for the Construction and General Industry exam.
Download the full set of 390 MCQ type question and answers - https://www.oyetrade.com/OSHA-10-Answers-2021.php
To help OSHA 10 trainees pass their pre-test and post-test, we have prepared a set of 390 questions and answers called OSHA 10 Answers in downloadable PDF format. The OSHA 10 Answers question bank was prepared by our in-house, highly experienced safety professionals and trainers. The document consists of 390 MCQ-type questions and answers, updated for the 2024 exams.
Monitor indicators of genetic diversity from space using Earth Observation data - Spatial Genetics
Genetic diversity within and among populations is essential for species persistence. While targets and indicators for genetic diversity are captured in the Kunming-Montreal Global Biodiversity Framework, assessing genetic diversity across many species at national and regional scales remains challenging. Parties to the Convention on Biological Diversity (CBD) need accessible tools for reliable and efficient monitoring at relevant scales. Here, we describe how Earth Observation satellites (EO) make essential contributions to enable, accelerate, and improve genetic diversity monitoring and preservation. Specifically, we introduce a workflow integrating EO into existing genetic diversity monitoring strategies and present a set of examples where EO data is or can be integrated to improve assessment, monitoring, and conservation. We describe how available EO data can be integrated in innovative ways to support calculation of the genetic diversity indicators of the GBF monitoring framework and to inform management and monitoring decisions, especially in areas with limited research infrastructure or access. We also describe novel, integrative approaches to improve the indicators that can be implemented with the coming generation of EO data, and new capabilities that will provide unprecedented detail to characterize the changes to Earth’s surface and their implications for biodiversity, on a global scale.
Kinetic studies on malachite green dye adsorption from aqueous solutions by A... - Open Access Research Paper
Water polluted by dyestuff compounds is a global threat to health and the environment; accordingly, we prepared a novel green sorbent system from algae, chitosan, chitosan nanoparticles, and an algae-chitosan nanocomposite for the sorption of malachite green dye from water. The algae-chitosan nanocomposite was prepared by a simple method and used as a recyclable and effective adsorbent for the removal of malachite green dye from aqueous solutions. Algae, chitosan, chitosan nanoparticles, and the algae-chitosan nanocomposite were characterized using different physicochemical methods, with their functional groups and chemical compounds identified by FTIR, SEM, and TGA-DTA/DTG techniques. The optimal adsorption conditions, including adsorbent dosage, pH, and temperature, were determined. Under optimized conditions in batch equilibrium studies, more than 99% of the dye was removed. Kinetic analysis showed that the reaction order for the dye varied between pseudo-first-order and pseudo-second-order models. Furthermore, the maximum adsorption capacity of the algae-chitosan nanocomposite toward malachite green dye reached 15.5 mg/g. Finally, reusability over multiple cycles and successful dye removal from real wastewater make it a promising and attractive option for further practical applications.
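The pseudo-first-order and pseudo-second-order models mentioned in the abstract are typically fitted in their linearized forms. The sketch below illustrates both fits on hypothetical uptake data (the paper's actual measurements are not given here; only the 15.5 mg/g capacity comes from the abstract):

```python
# Illustrative sketch, not the paper's analysis: linearized pseudo-first-order
# and pseudo-second-order kinetic fits for dye adsorption.
# qt = uptake (mg/g) at time t; qe = equilibrium uptake. Data are hypothetical.
import math

t = [5, 10, 20, 40, 60, 90]              # contact time, minutes (hypothetical)
qt = [6.2, 9.8, 12.9, 14.6, 15.1, 15.4]  # uptake, mg/g (hypothetical)
qe = 15.5                                # mg/g, capacity reported in the abstract

def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Pseudo-first-order:  ln(qe - qt) = ln(qe) - k1 * t
s1, _ = linfit(t, [math.log(qe - q) for q in qt])
k1 = -s1  # rate constant, 1/min

# Pseudo-second-order: t/qt = 1/(k2 * qe^2) + t/qe  (slope = 1/qe)
s2, i2 = linfit(t, [ti / qi for ti, qi in zip(t, qt)])
qe_fit = 1 / s2                 # fitted equilibrium capacity, mg/g
k2 = 1 / (i2 * qe_fit ** 2)     # rate constant, g/(mg*min)

print(f"k1 = {k1:.3f} 1/min, second-order qe = {qe_fit:.1f} mg/g")
```

In practice the model whose linearized plot gives the higher R² (and a fitted qe closest to the measured value) is taken as the governing kinetics, which is how a study can report that the order "varied" between the two models across conditions.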
Product development includes both the modification of an existing product and the formulation of a new product to fill a newly identified market niche or customer need. This study developed and evaluated formulations of aramang baked products enriched with malunggay. Specifically, it assessed the acceptability level in terms of taste, texture, flavor, odor, and color, as well as the overall acceptability of the enriched aramang baked products. The researchers used frequency distributions of evaluator ratings to determine acceptability. The sensory evaluation showed that the aramang baked products enriched with malunggay were acceptable in terms of odor, taste, flavor, color, and texture, and that all three treatments were highly acceptable across those variables.
2024 State of Marketing Report – by Hubspot - Marius Sescu
https://www.hubspot.com/state-of-marketing
· Scaling relationships and proving ROI
· Social media is the place for search, sales, and service
· Authentic influencer partnerships fuel brand growth
· The strongest connections happen via call, click, chat, and camera.
· Time saved with AI leads to more creative work
· Seeking: A single source of truth
· TLDR; Get on social, try AI, and align your systems.
· More human marketing, powered by robots
ChatGPT has been a revolutionary addition to the world since its introduction in 2022, driving a major shift in how information is gathered and processed. What is the story of ChatGPT? How does the bot respond to prompts and generate content? Swipe through these slides, prepared by web development company Expeed Software, on the development and technical intricacies of ChatGPT!
Product Design Trends in 2024 | Teenage Engineerings - Pixeldarts
The realm of product design is a constantly changing environment where technology and style intersect. Every year introduces fresh challenges and exciting trends that mold the future of this captivating art form. In this piece, we delve into the significant trends set to influence the look and functionality of product design in the year 2024.
How Race, Age and Gender Shape Attitudes Towards Mental Health - ThinkNow
Mental health has been in the news quite a bit lately. Dozens of U.S. states are currently suing Meta for contributing to the youth mental health crisis by inserting addictive features into their products, while the U.S. Surgeon General is touring the nation to bring awareness to the growing epidemic of loneliness and isolation. The country has endured periods of low national morale, such as in the 1970s when high inflation and the energy crisis worsened public sentiment following the Vietnam War. The current mood, however, feels different. Gallup recently reported that national mental health is at an all-time low, with few bright spots to lift spirits.
To better understand how Americans are feeling and their attitudes towards mental health in general, ThinkNow conducted a nationally representative quantitative survey of 1,500 respondents and found some interesting differences among ethnic, age and gender groups.
Technology
For example, 52% agree that technology and social media have a negative impact on mental health, but when broken out by race, 61% of Whites felt technology had a negative effect, and only 48% of Hispanics thought it did.
While technology has helped us keep in touch with friends and family in faraway places, it appears to have degraded our ability to connect in person. Staying connected online is a double-edged sword since the same news feed that brings us pictures of the grandkids and fluffy kittens also feeds us news about the wars in Israel and Ukraine, the dysfunction in Washington, the latest mass shooting and the climate crisis.
Hispanics may have a built-in defense against the isolation technology breeds, owing to their large, multigenerational households, strong social support systems, and tendency to use social media to stay connected with relatives abroad.
Age and Gender
When asked to rate their own mental health, men rate it higher than women by 11 percentage points, and Baby Boomers rank it highest, with 83% saying it's good or excellent vs. 57% of Gen Z saying the same.
Gen Z spends the most amount of time on social media, so the notion that social media negatively affects mental health appears to be correlated. Unfortunately, Gen Z is also the generation that’s least comfortable discussing mental health concerns with healthcare professionals. Only 40% of them state they’re comfortable discussing their issues with a professional compared to 60% of Millennials and 65% of Boomers.
Race Affects Attitudes
As seen in previous research conducted by ThinkNow, Asian Americans lag other groups when it comes to awareness of mental health issues. Twenty-four percent of Asian Americans believe that having a mental health issue is a sign of weakness compared to the 16% average for all groups. Asians are also considerably less likely to be aware of mental health services in their communities (42% vs. 55%) and most likely to seek out information on social media (51% vs. 35%).
AI Trends in Creative Operations 2024 by Artwork Flow.pdf - marketingartwork
Creative operations teams expect increased AI use in 2024. Currently, over half of tasks are not AI-enabled, but this is expected to decrease in the coming year. ChatGPT is the most popular AI tool currently. Business leaders are more actively exploring AI benefits than individual contributors. Most respondents do not believe AI will impact workforce size in 2024. However, some inhibitions still exist around AI accuracy and lack of understanding. Creatives primarily want to use AI to save time on mundane tasks and boost productivity.
Organizational culture includes values, norms, systems, symbols, language, assumptions, beliefs, and habits that influence employee behaviors and how people interpret those behaviors. It is important because culture can help or hinder a company's success. Some key aspects of Netflix's culture that help it achieve results include hiring smartly so every position has stars, focusing on attitude over just aptitude, and having a strict policy against peacocks, whiners, and jerks.
PEPSICO Presentation to CAGNY Conference Feb 2024 - Neil Kimberley
PepsiCo provided a safe harbor statement noting that any forward-looking statements are based on currently available information and are subject to risks and uncertainties. It also provided information on non-GAAP measures and directed readers to its website for disclosures and reconciliations. The document then discussed PepsiCo's business overview, including that it is a global beverage and convenient food company with iconic brands, $91 billion in net revenue in 2023, and nearly $14 billion in core operating profit. It operates through a divisional structure with a focus on local consumers.
Content Methodology: A Best Practices Report (Webinar) - contently
This document provides an overview of content methodology best practices. It defines content methodology as establishing objectives, KPIs, and a culture of continuous learning and iteration. An effective methodology focuses on connecting with audiences, creating optimal content, and optimizing processes. It also discusses why a methodology is needed due to the competitive landscape, proliferation of channels, and opportunities for improvement. Components of an effective methodology include defining objectives and KPIs, audience analysis, identifying opportunities, and evaluating resources. The document concludes with recommendations around creating a content plan, testing and optimizing content over 90 days.
How to Prepare for a Successful Job Search for 2024 - Albert Qian
The document provides guidance on preparing a job search for 2024. It discusses the state of the job market, focusing on growth in AI and healthcare but also continued layoffs. It recommends figuring out what you want to do by researching interests and skills, then conducting informational interviews. The job search should involve building a personal brand on LinkedIn, actively applying to jobs, tailoring resumes and interviews, maintaining job hunting as a habit, and continuing self-improvement. Once hired, the document advises setting new goals and keeping skills and networking active in case of future opportunities.
A report by thenetworkone and Kurio.
The contributing experts and agencies are (in an alphabetical order): Sylwia Rytel, Social Media Supervisor, 180heartbeats + JUNG v MATT (PL), Sharlene Jenner, Vice President - Director of Engagement Strategy, Abelson Taylor (USA), Alex Casanovas, Digital Director, Atrevia (ES), Dora Beilin, Senior Social Strategist, Barrett Hoffher (USA), Min Seo, Campaign Director, Brand New Agency (KR), Deshé M. Gully, Associate Strategist, Day One Agency (USA), Francesca Trevisan, Strategist, Different (IT), Trevor Crossman, CX and Digital Transformation Director; Olivia Hussey, Strategic Planner; Simi Srinarula, Social Media Manager, The Hallway (AUS), James Hebbert, Managing Director, Hylink (CN / UK), Mundy Álvarez, Planning Director; Pedro Rojas, Social Media Manager; Pancho González, CCO, Inbrax (CH), Oana Oprea, Head of Digital Planning, Jam Session Agency (RO), Amy Bottrill, Social Account Director, Launch (UK), Gaby Arriaga, Founder, Leonardo1452 (MX), Shantesh S Row, Creative Director, Liwa (UAE), Rajesh Mehta, Chief Strategy Officer; Dhruv Gaur, Digital Planning Lead; Leonie Mergulhao, Account Supervisor - Social Media & PR, Medulla (IN), Aurelija Plioplytė, Head of Digital & Social, Not Perfect (LI), Daiana Khaidargaliyeva, Account Manager, Osaka Labs (UK / USA), Stefanie Söhnchen, Vice President Digital, PIABO Communications (DE), Elisabeth Winiartati, Managing Consultant, Head of Global Integrated Communications; Lydia Aprina, Account Manager, Integrated Marketing and Communications; Nita Prabowo, Account Manager, Integrated Marketing and Communications; Okhi, Web Developer, PNTR Group (ID), Kei Obusan, Insights Director; Daffi Ranandi, Insights Manager, Radarr (SG), Gautam Reghunath, Co-founder & CEO, Talented (IN), Donagh Humphreys, Head of Social and Digital Innovation, THINKHOUSE (IRE), Sarah Yim, Strategy Director, Zulu Alpha Kilo (CA).
Trends in Paid Search: Navigating the Digital Landscape in 2024 - Search Engine Journal
The search marketing landscape is evolving rapidly with new technologies, and professionals, like you, rely on innovative paid search strategies to meet changing demands.
It’s important that you’re ready to implement new strategies in 2024.
Check this out and learn the top trends in paid search advertising that are expected to gain traction, so you can drive higher ROI more efficiently in 2024.
You’ll learn:
- The latest trends in AI and automation, and what this means for an evolving paid search ecosystem.
- New developments in privacy and data regulation.
- Emerging ad formats that are expected to make an impact next year.
Watch Sreekant Lanka from iQuanti and Irina Klein from OneMain Financial as they dive into the future of paid search and explore the trends, strategies, and technologies that will shape the search marketing landscape.
If you’re looking to assess your paid search strategy and design an industry-aligned plan for 2024, then this webinar is for you.
5 Public Speaking Tips from TED - Visualized Summary - SpeakerHub
From their humble beginnings in 1984, TED has grown into the world’s most powerful amplifier for speakers and thought-leaders to share their ideas. They have over 2,400 filmed talks (not including the 30,000+ TEDx videos) freely available online, and have hosted over 17,500 events around the world.
With over one billion views in a year, it’s no wonder that so many speakers are looking to TED for ideas on how to share their message more effectively.
The article “5 Public-Speaking Tips TED Gives Its Speakers”, by Carmine Gallo for Forbes, gives speakers five practical ways to connect with their audience, and effectively share their ideas on stage.
Whether you are gearing up to get on a TED stage yourself, or just want to master the skills that so many of their speakers possess, these tips and quotes from Chris Anderson, the TED Talks Curator, will encourage you to make the most impactful impression on your audience.
See the full article and more summaries like this on SpeakerHub here: https://speakerhub.com/blog/5-presentation-tips-ted-gives-its-speakers
See the original article on Forbes here:
http://www.forbes.com/forbes/welcome/?toURL=http://www.forbes.com/sites/carminegallo/2016/05/06/5-public-speaking-tips-ted-gives-its-speakers/&refURL=&referrer=#5c07a8221d9b
ChatGPT and the Future of Work - Clark Boyd Clark Boyd
Everyone is in agreement that ChatGPT (and other generative AI tools) will shape the future of work. Yet there is little consensus on exactly how, when, and to what extent this technology will change our world.
Businesses that extract maximum value from ChatGPT will use it as a collaborative tool for everything from brainstorming to technical maintenance.
For individuals, now is the time to pinpoint the skills the future professional will need to thrive in the AI age.
Check out this presentation to understand what ChatGPT is, how it will shape the future of work, and how you can prepare to take advantage.
The document provides career advice for getting into the tech field, including:
- Doing projects and internships in college to build a portfolio.
- Learning about different roles and technologies through industry research.
- Contributing to open source projects to build experience and network.
- Developing a personal brand through a website and social media presence.
- Networking through events, communities, and finding a mentor.
- Practicing interviews through mock interviews and whiteboarding coding questions.
Google's Just Not That Into You: Understanding Core Updates & Search IntentLily Ray
1. Core updates from Google periodically change how its algorithms assess and rank websites and pages. This can impact rankings through shifts in user intent, site quality issues being caught up to, world events influencing queries, and overhauls to search like the E-A-T framework.
2. There are many possible user intents beyond just transactional, navigational and informational. Identifying intent shifts is important during core updates. Sites may need to optimize for new intents through different content types and sections.
3. Responding effectively to core updates requires analyzing "before and after" data to understand changes, identifying new intents or page types, and ensuring content matches appropriate intents across video, images, knowledge graphs and more.
A brief introduction to DataScience with explaining of the concepts, algorithms, machine learning, supervised and unsupervised learning, clustering, statistics, data preprocessing, real-world applications etc.
It's part of a Data Science Corner Campaign where I will be discussing the fundamentals of DataScience, AIML, Statistics etc.
Time Management & Productivity - Best PracticesVit Horky
Here's my presentation on by proven best practices how to manage your work time effectively and how to improve your productivity. It includes practical tips and how to use tools such as Slack, Google Apps, Hubspot, Google Calendar, Gmail and others.
The six step guide to practical project managementMindGenius
The six step guide to practical project management
If you think managing projects is too difficult, think again.
We’ve stripped back project management processes to the
basics – to make it quicker and easier, without sacrificing
the vital ingredients for success.
“If you’re looking for some real-world guidance, then The Six Step Guide to Practical Project Management will help.”
Dr Andrew Makar, Tactical Project Management
Beginners Guide to TikTok for Search - Rachel Pearson - We are Tilt __ Bright...
Data Lost in Translation - Skewed Data Collection Prejudices Outcome
1. Grand Rapids, MI
Joseph Berlin, PE, QC, CP
Data Lost in Translation – Data Variance, Environmental
Forensics and Sampling Practices
Presented to:
2. Speaker Background – Joseph Berlin, PE, QC, CP
2
Mr. Berlin’s work focuses on the application of scientific and
forensic principles to better understand and mitigate
environmental contamination and the associated financial
implications. Mr. Berlin’s practice focuses on the development and
implementation of forensic environmental investigations, including
source and origin, migration scenarios, data validity
determinations and environmental liability and reserve estimates.
Mr. Berlin is a registered Professional Engineer in several states, a
Certified Professional (Ohio VAP#351) and a member of the
National Academy of Forensic Engineers. Mr. Berlin has BS and MS
degrees in Civil Engineering, MBA, plus further graduate work in
industrial hygiene and accounting. Mr. Berlin’s career spans over
25 years and has run from DOD “secret” projects to cases with
Fortune 500 oil companies to insurance claims for mom-and-pop
gas stations.
3. Soil, groundwater and vapor sampling may not reflect site conditions… Guesswork or predictable?
Let’s take a look at a simple example
3
4. 4
Prejudiced data defines outcome
Method “B” – Prejudiced Method (a variance on THE Method)
OR
Method “A” – Proper Method
6. What if…
6
Sample results were actually
prejudiced to reflect only 5% to 50% of
the actual impact
The volume of impacted material and
contaminant mass were 2 to 10 times
greater than estimated
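The arithmetic behind these figures can be sketched as a simple back-calculation: if a sampling method captures only a fraction of the actual impact, the true mass is the reported mass divided by that recovery fraction. A minimal sketch, assuming the 5% to 50% range above; the reported mass below is a hypothetical value for illustration only.

```python
# Back-calculate a plausible range of true contaminant mass from a
# reported (possibly prejudiced) result. A method capturing only 5% to
# 50% of the actual impact understates the true mass accordingly.
def true_mass_range(reported_mass_lbs, low_recovery=0.05, high_recovery=0.50):
    """Return (best case, worst case) true mass given recovery fractions."""
    return reported_mass_lbs / high_recovery, reported_mass_lbs / low_recovery

low, high = true_mass_range(100.0)  # hypothetical reported mass: 100 lbs
print(f"True mass likely between {low:.0f} and {high:.0f} lbs")
```

The point of the bounded output is that the CSM can carry a range rather than a single, possibly skewed, number.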
7. Data dazed and confused
7
Since I only really care about my bottom
line why does this matter?
8. Risk Management – Data Review
• Assessment of prior data
• Validation of Areas of Concern
• Exclusions/Carve Outs
• Deductible Escrow Size/Structure
• Voluntary Program Impact
• Remediation Method/Structure/Time
• Remediation Cost Estimates
8
Defer no time, delays have dangerous ends.
- William Shakespeare
9. How are Datasets Prejudiced
• Sample collection skewed to
“clean” areas
• Non-preserved samples (5035 use)
• Field sample handling
• Use of Method “B” sample
• Many others
9
10. Original Case - Revisited
10
• Tanker spill along a highway adjacent to a Great Lake
• Loss of over 4,000 gallons of gasoline
• Residential area
• Shallow groundwater (8’ deep)
• Closest home - 100 feet from spill area
• Samples collected during source removal
• Samples split: Methods “A” and “B”
• Modeled data for “A” and “B” samples
Included in full version
(see last slide to obtain full presentation)
12. Original Example – Data Variance
12
Method | Benzene (ug/kg) | Total Contaminant Mass (TCM) | Vapor Threshold Exceeded?
A      | 0 to 81,000     | 7,200 lbs.                   | Yes
B      | 0 to 1,500      | 86 lbs.                      | No
Data Parsing- Method “A”/Method “B”:
• Factor (Total Contaminant Mass): 2-6
• Factor (Benzene): 5-54
Included in full version
(see last slide to obtain full presentation)
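One way to quantify the variance between split datasets is to compute the per-location ratio of co-located Method “A” and Method “B” results. The sketch below is a minimal illustration: the location IDs and paired benzene values are hypothetical (only the upper value mirrors the slide's range), and real data parsing would also have to handle non-detects and preservation flags.

```python
# Compare co-located split samples: ratio of Method "A" (proper method)
# to Method "B" (prejudiced method) results at each location.
# Hypothetical paired benzene results (ug/kg); non-detects are skipped.
method_a = {"SB-1": 81000, "SB-2": 12000, "SB-3": 450}
method_b = {"SB-1": 1500,  "SB-2": 2400,  "SB-3": 90}

ratios = {loc: method_a[loc] / method_b[loc]
          for loc in method_a if method_b.get(loc, 0) > 0}
print(ratios)  # per-location A/B understatement factors
print(f"Factor range: {min(ratios.values()):.0f} to {max(ratios.values()):.0f}")
```

With these illustrative numbers the factor range spans roughly 5 to 54, consistent in spirit with the benzene parsing factor shown on the slide.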
13. 13
Method “B” Sample
Method “A” Sample
Laboratory Chromatograms
(For a single data point in population)
Note:
1. Lighter ends not present in
Method “B” sample
2. These are the same sample
locations only the collection
method varied
Included in full version
(see last slide to obtain full presentation)
14. 14
Method “B” sampling
Leads to wrong/inefficient CSM &
remedy
NOT
Method “A”
Actual CSM
Method “B”
Skewed CSM
Conceptual Site Models
15. Preference Skews Timeline and Cost
15
Method | Volume of Impact (CY) | Remedial Comments                           | Projected Cost ($)
A      | Over 3,000            | In-situ treatment likely; large source area | > $250,000
B      | Less than 300         | Spot removal                                | < $25,000
• Method “A” v. “B” – significant variance
• Method “B” – WRONG results/plan
Included in full version
16. How might I know? Some indicators…
• What was clean is now dirty
• What was dirty is now clean
• A small issue becomes bigger
• A big issue becomes smaller
• Result variance over time? Firm? Field crew?
16
“Commonly, the variance is best explained by sample
collection methods.”
17. 17
• Understand risk of data prejudice
• Become aware of skewing methods
• Request certification & reliance of reports
• Consider use of normalized CSM
• Call “Skewed” consultants to task
In case I care…how do I protect
myself?
“You know my method.
It is founded upon the observation of trifles.”
― Sir Arthur Conan Doyle
18. In Closing
• Consider the reliability/validity of ALL VOC data
• Do not underestimate the importance of methods
defining outcome (the trifles of data collection)
• Consider terms to include select documentation
(SOPs, chromatograms, training logs)
• Ensure contractual Reliance of Consultant work
• Consider use of normalized data CSM/RCE
18
Included in full version
(see last slide to obtain full presentation)
19. Follow Up
If you wish to have the complete version of this presentation or
have any questions or comments please contact Rich Spehar,
PE, Joseph “Joe” Berlin, PE, EP, CP or Marty Janowiak
(bldi@bldi.com) at our main office at 616-459-3737.
Also refer to www.ohioenvironmentalblog.com or
www.michiganenvironmentalblog.com
19
Copyright, BLDI, Inc. 2015, all rights reserved.
Editor's Notes
Thank you Jeff. Good afternoon. My name is Joe Berlin.
I am a professional engineer in several states, member of National Academy of Forensic Engineers.
Our work includes transactional due diligence, investigation/remediation and forensic engagements.
A large part of my project portfolio includes agency negotiations, program development and forensic reviews.
I’m a data guy. When I review data, I assess whether it makes sense.
Was it possibly skewed toward an opinion or outcome?
This presentation builds on a research program we have been undertaking for the past two years
The goal, at the end, is for us to understand and appreciate that datasets, especially soil data, are often skewed, both knowingly and unknowingly.
There are methods to assess such data, and ways, especially related to environmental risk assessment, redevelopment and insurance, to better evaluate it.
Many understand or assume that all sampling is conducted the same way, with little variance. By the end of this presentation, I hope you understand that such assumptions are limited at best, and appreciate the significant impact one field method can have on an environmental program.
For example, during our research we found several large national firms use Method B as standard practice.
“So what,” many would say. Well, let’s look at some actual data.
Today, we are only going to discuss soil sampling practices, a little history behind current methods and how that influences financial models.
There is an inherent assumption in most environmental investigations that unanticipated outcomes may be encountered.
Although sometimes true, such negative outcomes are much more predictable than many think.
Consider, in many environmental Conceptual Site Models there is an inherent probability of “negative unanticipated” outcomes built into the model. What if that probability (i.e. risk) can be better understood and predicted?
OK, if we are willing, can we do a simple show of hands: who among us, upon receiving a data package, has questioned the sample results and, thereby, the entire environmental program?
How about having “unexpected” or “unforeseen” environmental surprises?
These are generally negative surprises aren’t they.
How many know what 5035 field preservation is?
Thank you. So it seems that many of us have had such an experience.
The model shown is REAL data. These are split samples (only the locations with detectable concentrations are shown here for simplicity).
We are looking at the same site with only one difference…the method of collection of the samples.
Dataset or Method “B” was collected per a common but WRONG industry practice by sampling from the field screening (headspace sample)
Now, we understand that many would say, “Hey, I want the Method ‘B’ sample results.” However, it’s not like the contamination magically disappeared. Voilà! Gone.
The problem is that the Method B results often grossly underestimate the volume, concentration and location of contamination.
As an engineer I tend to look at the work in numbers and grids. If my number of affected grids is 10% of reality and the values in the grids are certainly less than 50% of reality I know I have a serious data discrepancy. In our world that means money, time and risk not included in the model (we call it the Conceptual Site Model or CSM). Using common models there are relatively simple methods to correct or enhance the CSM.
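The grid arithmetic in this note can be sketched as follows. The grid counts and average concentrations are hypothetical, chosen only to mirror the "10% of reality" and "less than 50% of reality" figures above, and the product of count and average is used as a crude mass proxy rather than a real mass calculation.

```python
# Grid-based discrepancy check: compare a reported dataset against an
# assumed "reality" in which 10x more grid cells are affected and the
# per-cell values are 2x higher. All numbers are hypothetical.
reported_cells, reported_avg = 4, 500.0   # affected cells, avg ug/kg
actual_cells, actual_avg = 40, 1000.0     # 10x the cells, 2x the values

reported_mass_proxy = reported_cells * reported_avg
actual_mass_proxy = actual_cells * actual_avg
print(f"Understatement factor: {actual_mass_proxy / reported_mass_proxy:.0f}x")
```

A discrepancy of this magnitude is what signals the money, time and risk not yet captured in the CSM.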
So…we now come to examples of bad situations caused by contamination not previously assessed, missed or avoided.
How does this happen? Mistake? OR…prejudicial data collection methods.
Were these events truly unavoidable?
We are really talking about prejudicial data collection methods. Whether a lot of consultants know it or not…they collect samples in a manner to prejudice the results
And ALWAYS to minimize the issue.
Here’s what we know:
Vapor concentrations in soil are temporal and can vary significantly over a year, month, week or even day
Initial VI screening often uses soil results as a baseline or starting point
In fact, PM is now defaulting to soil v. vapor because of the stricter vapor standards.
Everything isn’t just about VAPOR today, but vapor does drive the preponderance of environmental mitigation programs today.
Scary Things
Headlines of bad things
Delays in development
Increase Costs
Stigma
Dramatic adverse impact on ROI
I submit that after we finish this discussion you may be able to identify a project or two where you thought something was odd (sample results didn’t make sense).
I submit that a lot of consultants collect soil samples (we’ll get to vapor and groundwater another time) in a way that skews results, and are likely unaware that they inherently do so. In fact, many regulators do not understand it or do not care (Ohio EPA).
What happens when I run into more “bad stuff” that could have been found earlier? Would I have walked away?
Written different terms? Filed suit?
We submit that many such cases of finding much greater impact than estimated stem from insufficient data parsing (excluding certain data or using select multipliers) and improperly collected samples. If you are the lender, developer or underwriter, ensure your agreement allows you to request any and all documentation. Why? Because you want this information to ensure their CSM is accurate.
OK, prior to the deal, start by asking for:
SOPs
resumes of the staff who were on-site collecting data
AND, especially,
the CSM, as this should tell you how they have looked at the data.
Later, if there is an issue, you can also ask for:
field notes,
field logs,
Laboratory chromatograms
This initial scan of information can often identify the reason(s) for the difference in impacted media volume, concentration, location and contaminant mass.
Point: the various data inputs, along with Environmental Sample data, are assumed to be reliable and collected in a consistent manner.
If the Environmental data is collected to skew as “Clean” how does that influence the Financial Model? It certainly affects the exposure assessment.
Success factors and points over view:
Differing points of view and data qualification
Developer
Lender
Regulator/Agency
Seller
Consultant A
Consultant B
Risk Managers
Underwriter
And
Workout and
Claims
Forensic methods, including Data Normalization, can be used to better assess the various risk management issues prior to, during or on the back end of environmental programs.
We often use data normalization, which should probably be defined, as it can be a powerful tool.
Data is grouped based on a number of factors (date collected, field preservation or not, lab hold time, consultant group, media)
Based on the data group the data can then be normalized (or corrected) based on data group and factors likely influencing the output
The range of normalization factors can be input to the model to provide various outputs depending on the engagement
For example: if we are looking at benzene as the cleanup driver, we might look at sample results from 1998 and utilize a range of factors of 4 to 10 in developing a more rational CSM. Remember, there may be other factors, including is the newest data skewed as well.
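The normalization workflow described above can be sketched as grouping results by collection attributes and applying a range of correction factors to produce bounding inputs for the CSM. A minimal sketch: the grouping keys are illustrative, and only the 4-to-10 factor for 1998-era data comes from the example above.

```python
# Normalize legacy soil results before CSM input: group each result by
# its collection attributes, then apply a low/high correction-factor
# range so the CSM carries a bounded estimate rather than a single
# (possibly skewed) value. Factor ranges are illustrative assumptions.
CORRECTION_FACTORS = {
    ("1998", "non-preserved"): (4.0, 10.0),   # legacy, pre-preservation data
    ("2015", "5035-preserved"): (1.0, 1.0),   # assumed reliable as reported
}

def normalize(result_ug_kg, year, preservation):
    """Return (low, high) corrected concentrations for the data group."""
    low_f, high_f = CORRECTION_FACTORS[(year, preservation)]
    return result_ug_kg * low_f, result_ug_kg * high_f

# Hypothetical 1998 benzene result of 200 ug/kg becomes an 800-2,000 range.
print(normalize(200.0, "1998", "non-preserved"))
```

Feeding the low and high bounds through the same CSM then yields the range of volume, mass and cost outputs mentioned above.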
So, if you better understood the data (normalized, if you will), you could better structure the development, cost out items, and provide a more accurate (not blue-sky) timeline.
Money is one thing but time, time, is something we can never get more of.
Many of us here have questioned or question datasets based on various, often, gut feelings.
The data prejudice, especially in old data, may not necessarily have been introduced on purpose. But assessing that data today without normalizing it provides an inaccurate model.
Ultimately a disciplined approach to data parsing for input to a CSM can better align data with outcomes.
The real world impact of environmental problems
Delays
Increased cost
Environmental agreements/indemnities
Stigma
Since I only really care about my bottom line why does this matter?
Unforeseen Costs and Delays…or Not?
“You know my method. It is founded upon the observation of trifles.” ― Arthur Conan Doyle, The Boscombe Valley Mystery
You can see the description.
Soil conditions: a very nice medium sand
The initial source removal started within 12 hours of the accident
The uniformity of the soil and application/spill of gasoline provided very good comparisons for the Method A/B co-located samples
The results speak for themselves.
The data for this case is consistent with data from the dozen or so sites for which we have collected data on Method “A” v. Method “B”
Let’s remember this isn’t a choice of which data to select. The conditions are the same regardless of which sample set used. One set (Method “A”) better reflects the conditions. The other set (Method “B”), although possibly preferred for transactional purposes, will have issues generally only exposed well after the transaction when the risk/cost had already been transferred to purchaser, developer, lender, insurer and tenants.
Notes:
Remember these are “split” co-located samples.
You can see the description.
Soil conditions: a very nice medium sand
The uniformity of the soil and application/spill of gasoline provided very good comparisons for the Method A/B co-located samples
Comparing the chromatograms, and by extension the CSMs (on the next slide), between Method A and Method B shapes, ultimately, the remedy in many cases.
Some consultants/firms are well known for giving “better” (lower) results. Today, especially with vapor intrusion, a developer, risk manager or underwriter, takes on greater downside risk with the use of “prejudicial” data.
Since a case can be made, even more strongly today, that a person should have known about such “prejudicial” data, how can you say you didn’t know?
Areas of contamination are documented using sampling…however, the sampling specifics can dictate the results
Does the data drive the results?
Consider?
(see: single dataset table below)
You are often provided reports (Phase II ESAs, etc.) so doing your own Phase II or investigation may not be feasible.
However, you can request specific information to plug holes in the CSM.
I was asked about Forensic engagements and cost, especially to assess a CSM or RCE.
Get the basic information during underwriting/risk review
If there is a real concern (e.g. blown RCE budget) then get the second set of information, as available
We have another set of questions to identify transactional and consultant grouping
With this information the data prejudice, gaps and a simple updated, more rational, data-driven CSM can be developed
Cost: less than $5,000
Time: in as little as 2 weeks
Consider pre-transaction Baselining
Environmental forensic lab work as low as $1,000
More commonly add: $5,000 to $10,000
Time: usually 4 weeks
Get the basic information during underwriting/risk review process.
Many of us have people/firms we prefer to work or associate with. If this type of information is important, and you want reliable data, you can do some simple things to increase the reliability.
In the engagement or underwriting process, provide (much like lenders do) that, under separate cover, you may obtain some non-standard information.
With this information the data prejudice, gaps and a simple updated, more rational, data-driven CSM can be developed