The document provides an overview of cold fusion research from its announcement in 1989 to the present day. It summarizes the initial excitement around the claims of excess heat production through cold fusion, the widespread rejection by the scientific community due to lack of reproducibility and contradictions with accepted theory, and the continued research by a small group of supporters. While mainstream science remains skeptical, some ongoing work on low-energy nuclear reactions and metal-hydrogen systems could provide new insights, even if cold fusion itself is ultimately disproven.
What is science? What isn't? A look at cold fusion
1. What is Science? What isn't?
A Look at Cold Fusion
Dennis Miller
Presented to the Arbeitskreis Philosophie Kelkheim, 26.10.2020. Originally in German as Die kalte Fusion: Wissenschaft, Pseudowissenschaft, Pathologische Wissenschaft?
2. The cold fusion story: outline
• An important and unexpected discovery is announced
Great media interest
Hectic attempts to replicate experiments
• Errors and inconsistencies soon found
Widespread rejection
Reputations ruined
Criticised as “pathological science”, “delusion”, “fiasco”, “fraud”
• Small group of scientists support the original claims and continue this
research
3. What does it tell us about science?
• Scientific procedure
Verification
Peer review
Research ethics
Open discussion vs. interests of individuals and institutions
Ideas that fail to be accepted
• History / philosophy of science
Pathological science
4. Nuclear fusion
• In principle almost unlimited source of power on earth
• Challenging technology
Plasma at approx. 100 million °C
Enormous machines required
• No practical fusion power station yet (in spite of progress)
5. Fusion of deuterium according to conventional nuclear physics
2H + 2H → 4He* (excited state), which decays via three channels:
4He* → 3H + p (~ 50 %)
4He* → 3He + n (~ 50 %)
4He* → 4He + γ (~ 0.00001 %)
• Deuterium content in seawater: 0.015 % (molar fraction of total hydrogen)
• Energy of nuclear fusion about 4×10⁸ times heat of combustion
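To make the fuel-abundance point concrete, here is a rough back-of-the-envelope check in Python. The 0.015 % deuterium fraction comes from the slide above; the average D + D reaction energy (~3.65 MeV) and the gasoline comparison are standard figures supplied here purely for illustration, not taken from the deck.

```python
# Back-of-the-envelope check of the deuterium fuel supply (illustrative only;
# the Q-value and the gasoline energy density are standard figures, not from the slides).

AVOGADRO = 6.022e23          # atoms per mole
MEV_TO_J = 1.602e-13         # joules per MeV
Q_DD_AVG_MEV = 3.65          # average energy per D + D reaction (3H+p and 3He+n branches)

mol_h2o_per_litre = 1000.0 / 18.015          # ~55.5 mol of water per litre
mol_h_per_litre = 2 * mol_h2o_per_litre      # two hydrogen atoms per molecule
mol_d_per_litre = 0.00015 * mol_h_per_litre  # 0.015 % deuterium (molar fraction, slide 5)

d_atoms = mol_d_per_litre * AVOGADRO
dd_reactions = d_atoms / 2                   # each reaction consumes two deuterons
energy_j = dd_reactions * Q_DD_AVG_MEV * MEV_TO_J

print(f"Deuterium atoms per litre of water: {d_atoms:.2e}")
print(f"D-D fusion energy per litre:        {energy_j / 1e9:.1f} GJ")
print(f"Litres of gasoline (assumed 34 MJ/L) with the same energy: {energy_j / 34e6:.0f}")
```

The result is of the order of a few gigajoules per litre of ordinary water, which is the sense in which deuterium is an "almost unlimited" fuel.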
6. Hot fusion research: big and expensive
[Figures: ITER model and cross-section, with a person shown for size comparison – https://de.wikipedia.org/wiki/ITER]
• ITER – International project under construction
Cost approx 20 billion €
Plasma temperature for ignition: 270 million °C
7. The Fleischmann-Pons experiment: principle
• Background
Palladium absorbs large amounts of hydrogen/deuterium
Can the palladium metal lattice help the deuterium to fuse?
• Setup
Force lots of deuterium into the palladium through electrolysis
Analyse temperature measurements to determine if excess heat is produced (a minimal calorimetry sketch follows this slide)
→ March 1989 claim in press conference
It works: fusion is possible without extreme temperature!
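The central claim rests on calorimetry: comparing the electrical power going into the cell with the heat coming out. Below is a minimal sketch of that bookkeeping with made-up numbers; the function, the example values and the ~1.54 V thermoneutral voltage for heavy-water electrolysis are illustrative assumptions, not Fleischmann and Pons' actual procedure or data.

```python
# Minimal sketch of excess-heat bookkeeping in an electrolysis calorimetry run.
# All numbers are hypothetical; this is not Fleischmann and Pons' method or data.

def excess_heat_w(cell_voltage_v: float, current_a: float,
                  heat_out_w: float, thermoneutral_v: float = 1.54) -> float:
    """Excess power = measured heat output minus electrical power dissipated in the cell.
    Part of the input voltage goes into splitting D2O rather than into heat; the
    'thermoneutral' voltage (~1.54 V for heavy water, a commonly quoted value)
    accounts for that chemical energy leaving the cell as gas."""
    power_dissipated_w = (cell_voltage_v - thermoneutral_v) * current_a
    return heat_out_w - power_dissipated_w

# Hypothetical run: 4.5 V across the cell, 0.5 A, 1.6 W of heat measured.
print(f"Apparent excess power: {excess_heat_w(4.5, 0.5, 1.6):.2f} W")
```

Whether a small positive number here is real excess heat or a calibration artefact is exactly the point on which the 1989 controversy turned.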
8. The Fleischmann-Pons experiment: equipment
[Schematic: electrolysis cell with a platinum anode and palladium cathode in a constant-temperature bath, monitored by a thermometer; at the cathode D2O + e⁻ → DO⁻ + D, the D being absorbed by the palladium]
[Photo: Stanley Pons with a cold fusion cell – en.wikipedia.org]
9. The 1989 Press Conference:
Major Discovery or Errors and Delusions?
10. The press conference and its aftermath
[Timeline figure legend: green = +ve / acceptance, red = –ve / criticism]
ACS: American Chemical Society
APS: American Physical Society
DOE: Dept. of Energy
11. After the press conference – other labs' results
• High speed work!
• Many groups involved
• Various techniques: excess heat, neutrons, gas phase/metal
• Positive results trend:
· Early reports
· Less well equipped labs
No. of reports:
Period            +ve   -ve
23 Mar – 2 May     26    20
3 May – 24 May      8    27
12. Objections and criticism - consensus view
• Experimental problems
Determination of excess heat
Measurements of neutrons
• Theory
Implausibly large effect of Pd/D lattice (~10⁵⁷ !!)
Enormous mismatch between excess heat and fusion products (a rough estimate follows this slide)
• Scientific method
Conflict between pressure to publish and scientific standards
Bypassing peer review
Many experimental details not published
Lack of cooperation between electrochemists and nuclear physicists
Lack of control tests (e.g. H2O instead of D2O)
Failure to recognize that a theory fails
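The "enormous mismatch" objection flagged above can be put in numbers. The sketch below uses standard D + D reaction data, not figures from the slides, to estimate the neutron emission one watt of excess heat would imply if it came from conventional deuterium fusion.

```python
# Rough version of the heat-vs-fusion-products mismatch argument.
# Standard D + D reaction data; the 1 W figure is a hypothetical example.

MEV_TO_J = 1.602e-13
Q_DD_AVG_MEV = 3.65        # average energy released per D + D reaction
NEUTRON_BRANCH = 0.5       # ~50 % of D + D reactions go via 3He + n

excess_heat_w = 1.0        # suppose 1 W of excess heat were of D + D origin
reactions_per_s = excess_heat_w / (Q_DD_AVG_MEV * MEV_TO_J)
neutrons_per_s = reactions_per_s * NEUTRON_BRANCH

print(f"D + D reactions needed for 1 W: {reactions_per_s:.1e} per second")
print(f"Expected neutron emission:      {neutrons_per_s:.1e} per second")
```

That is of order 10¹² reactions per second, with close to 10¹² neutrons per second, many orders of magnitude above anything reported in 1989, which is the mismatch the slide refers to.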
13. Pathological Science
“People are tricked into false results ... by subjective effects, wishful thinking or threshold interactions.”
Irving Langmuir, 1953
• Unexpected claim: contradicts accepted theory and experience
• Effects that are difficult to detect
• Objections explained away by ad hoc assumptions
• Ratio of supporters to critics rises to ~ 50 %, and then falls gradually to near zero
→ Cold fusion often considered an example of pathological science
16. Cold fusion / LENR: Some of the topics
Gas phase reactors
- Special surface layers
- Metallic nanoparticles
Transmutation
- Neutrons react with nuclei:
• Heavier elements formed
- Transforming nuclear waste to non-radioactive products ?
1H instead of D as fuel
- Contradicts original cold fusion concept
Theory (deuterium fusion)
- Metal lattice has major effects on nuclear reactions
- Electrons in metal lattice reduce D + D repulsion
- Large change in ratio of fusion products:
• Main channel: D + D → 4He
• Energy of fusion: reaction as heat not gamma ray
Electrochemical cells
- Heat measurements
- Reproducibility
- Fusion products, He, neutrons, gamma rays
Theory (LENR with 1H)
- Metal lattice effects:
• Ultracold neutrons formed from proton + electron (energy balance sketched after this slide)
Theory
- Electron deep orbits
• Electrons can be much closer to the nucleus than in conventional theory
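One way to see why "ultracold neutrons formed from proton + electron" requires special assumptions is a simple mass-balance check, sketched below. The particle masses are standard values; the mention of mass-enhanced ("heavy") electrons reflects the Widom-Larsen line of theory rather than anything stated explicitly on this slide.

```python
# Energy balance for e + p -> n + neutrino (standard rest masses in MeV/c^2).
# Illustrative only; added here to show why such theories invoke heavy electrons.

M_NEUTRON_MEV = 939.565
M_PROTON_MEV = 938.272
M_ELECTRON_MEV = 0.511

deficit_mev = M_NEUTRON_MEV - (M_PROTON_MEV + M_ELECTRON_MEV)
print(f"e + p -> n + nu is endothermic by about {deficit_mev:.2f} MeV,")
print("so the electron must somehow carry that extra energy (e.g. a greatly")
print("increased effective mass) before neutron formation becomes possible.")
```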
17. LENR / Cold Fusion: examples of active supporters
July 2020: These have all been active in the area within the last five years.
However, some may no longer be working on LENR / Cold Fusion.
- LENR-CANR – online cold fusion/LENR library
- SRI International – research (Michael McKubre, Francis Tanzella)
- Peter Hagelstein – MIT, theory
- Mitchell Schwartz
- Brillouin Energy
- Lattice Energy LLC
- Kiva Labs – Edmund Storms
- LENR Cars
- New Energy Times – Steven Krivit
- Leonardo Corp – Andrea Rossi, E-cat
- J-P. Biberian – Univ. Aix-Marseille
- Clean Nuclear Power
- Allan Widom – Northeastern Univ., theory
- Clean Planet Inc. – Tohoku University, Sendai
18. Cold Fusion / LENR: Conclusion
• 1989 press conference and the following year:
Premature publication
Lack of controls
Poor reproducibility
Secrecy (patent considerations)
Self-delusion
• Work continues in small community outside mainstream science
• 2019 Google project:
- failed to confirm the cold fusion claims
- but LENR research may reveal something interesting
19. Literature
1 John Huizenga, Cold Fusion – The Scientific Fiasco of the Century, Oxford Univ. Press (1993)
2 Frank Close, Too Hot to Handle – The Race for Cold Fusion, Penguin Books (1992)
3 C. P. Berlinguette et al., Revisiting the Case of Cold Fusion, Nature, https://doi.org/10.1038/s41586-019-1256-6 (2019)
4 Philip Ball, Lessons from Cold Fusion, 30 Years on, Nature, 569, 601 (2019)
5 Bruce Lewenstein, Fax to Facts: Communication in the Cold Fusion Saga, Social Studies of Science, 25, 403-436 (1995)
6 Jean-Paul Biberian, Cold Fusion, Advances in Condensed Matter Nuclear Science, Elsevier (2020)
21. Fleischmann-Pons Experiment: Background
Pons and Fleischmann were both chemists. Fleischmann, an electrochemist, was the better known of the two
and enjoyed an excellent reputation in the field. After early retirement from the Univ. of Southampton, he
continued research with Stanley Pons in Utah. Pons had done his PhD at Southampton and was now Professor
at the University of Utah. Some time in the mid-1980's they discussed the cold fusion idea and decided to test
it. They were well aware that this was a bit like a lottery: big prize, but very small chance of winning it. For the
next couple of years this exotic idea was not their main project, however. But in 1988 they had some
encouraging preliminary results and applied for a grant to do more intensive research on this topic.
The discussions on the research grant led to contact with a colleague working on a related project. Prof.
Steven Jones, physicist at another university in Utah, was interested in the possibility of fusion in geochemical
processes. His work suggested that such processes do occur, but the rate was exceedingly small – much larger
than conventional theory predicted, but much too small to be of any use for energy production. Should they
collaborate with Jones or regard him as a competitor?
22. Pressure for Premature Publication
In the summer of 1988, Fleischmann and Pons believed they had evidence for cold fusion and hoped to study
this in detail before making a public statement. That would probably be a project lasting about 18 months.
However, in early 1989 they came under pressure – partly from fears of competition by Jones and also
because the University of Utah wanted to make sure it profited from a major discovery made in its labs. This
led to the famous press conference of March 23rd* at which they announced what they believed was a major
discovery: contrary to accepted physical theory, deuterium could undergo fusion at room temperature using
quite simple apparatus!
* https://www.youtube.com/watch?v=6CfHaeQo6oU
23. Aftermath of the 1989 Press Conference
After the press conference there was intense activity. Many groups tried to replicate the results. The first few
weeks were hectic. In particular, much was outside the normal rules of scientific communication. Those trying
to replicate the experiment had no scientific publication to go on and obtained information from whatever
sources they could find.
Over the next few weeks a variety of serious problems with the cold fusion claims became clear. At the
beginning of May there was strong criticism at an American Physical Society meeting. The society's weekly
members' bulletin regarded this as a refutation but added "The corpse of cold fusion will probably continue
to twitch for a while ...”. A very careful study was conducted at Harwell, one of the leading nuclear physics labs.
The Harwell scientists waited for clear, scientifically valid results and did not make any premature
announcements. By the 15th of June they were ready and gave a press conference; no effect was found and the
lab. would discontinue its work on this topic.
The Harwell results confirmed the doubts of many experts on the validity of the cold fusion claims. There was
still a band of supporters, but they were becoming increasingly separated from mainstream science.
Cold fusion was soon seen by many as a fiasco. Much money had been wasted checking claims for which there
was little evidence. It was bad for the public perception of science. Yes, an erroneous claim was corrected, but
not by the normal procedure – it's not an example of how science is supposed to work.
24. Cold Fusion Research Continues
Soon, mainstream science considered cold fusion to be discredited. A small group of scientists disagreed and
continued to work on it. Their results did not attract a great deal of comment; after the initial excitement many
of the critics had lost interest in the subject.
The cold fusion community became separated from mainstream science. It was a community with views not
accepted by most scientists and it mistrusted the scientific establishment. Its research was not published by
the major scientific journals. It had its own conferences at which reports of cold fusion phenomena could
expect a warm reception that they would not get in the wider world.
However, over 30 years after the initial announcement, cold fusion research has not disappeared. The area is
now often referred to as LENR (Low energy nuclear reactions). There is quite a variety of methods and theories.
25. 2019: Cold Fusion in the News Again
Hidden from public view, a group funded by Google had been working on the topic for the last couple of years.
Now they published a report in Nature. Nature, a leading scientific journal, is selective about what it publishes,
so anything on cold fusion appearing in its pages is a bit of a sensation. Google's approach had a lot going for it.
They saw a research area that was controversial, dogged by poor reproducibility and not generally accepted,
but one that had kept going for 30 years and claimed to offer a solution to the world's energy problems, or at
least useful technology. Had it been prematurely dismissed? Were at least some of the claims true? Google
assembled a group from several institutions: scientists from various relevant disciplines, but not part of the
cold fusion community. They were to work according to principles that had been sorely neglected at the start
of the cold fusion saga: scientific rigour, cooperation between different areas of expertise, and peer-reviewed
publication (not sensational press conferences). At the simple yes/no level the result was negative – there was no evidence
for cold fusion. On the other hand, the project led to the study of interesting materials (metals with a very high
content of hydrogen). That could be valuable both scientifically and technologically. Quite apart from fusion
(hot or cold), hydrogen will have an increasing role to play in the energy supply.
26. Where's Cold Fusion Today? – The Community
There seems to be an active cold fusion community: research by small, rather exotic firms and a few institutes.
It's a scientific world that runs parallel to, but largely disconnected from, mainstream science. There's a scientific
society and a peer-reviewed journal. An international conference is held about every one or two years
(attendance reported to be 100 – 200). An industrial association was founded to represent the various firms in
this area (they too are in a parallel world).
Mainstream science is wary of any involvement with cold fusion, but it does not boycott the field completely. In
2009 ACS published a book on gas phase cold fusion by Jean-Paul Biberian. In Jan 2020 a review of the whole
field (again edited by Biberian) appeared – published by Elsevier, a major scientific publisher.
Most of the cold fusion researchers have been in the field for decades and many have reached retiring age. Will
younger people take their place in this unpopular field?
27. Where's Cold Fusion Today? – Results
Lack of reproducibility has been a constant theme. Some experiments produce excess heat, often only after
running for a considerable time. Some electrodes are inactive. In spite of many attempts the reason has not
been found, though there are a variety of ideas. New areas of science often have reproducibility problems, but
one can normally find out why and gradually learn to control the experimental setup. However, this has not
happened in cold fusion – there are lab. experiments that claim to produce excess heat, but no-one has yet
shown a small prototype reactor that reliably produces power (as opposed to a lab. experiment that is
designed to accumulate data).
28. Where's Cold Fusion Today? – Theory
A major objection to cold fusion is that it contradicts accepted theories. That applies not only to the occurrence
of fusion at all at low temperatures, but also to the massive discrepancies between heating and expected
fusion products (neutrons, gamma rays, helium, tritium). After 30 years, there are lots of results, but no
coherent picture. An experimental result should not automatically be rejected because it conflicts with
accepted theories. But the stronger the conflict with theory, the higher will be our standard of evidence before
accepting the new result. And if it is accepted, we need a new theory. This theory should put everything into
perspective, not just suggest an explanation for an isolated phenomenon. Physics already has a comprehensive
theoretical framework and the new theory must fit into that. If the theory requires ad hoc assumptions to
avoid contradictions, that is a bad sign. There are plenty of theories favoured by cold fusion proponents but
they are not generally accepted. The preface to Biberian's 2020 book probably gives the general feeling among
the community: current nuclear theories are based on two-body interactions, but in condensed matter we
need concepts based on multibody interactions – an under-researched area. Well, perhaps there's something
interesting there, but I don't take the objection at face value: theoretical physicists have done a lot of work on
the properties of solids.
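To put a rough number on the heat-versus-products discrepancy mentioned above, here is a back-of-the-envelope check using standard values (an illustration, not a result from the cold fusion literature): if 1 W of claimed excess heat came from conventional D + D fusion, roughly half of the reactions would go through the neutron branch.

\begin{align*}
1\ \mathrm{W} &= 6.2\times 10^{18}\ \mathrm{eV\,s^{-1}}, \qquad \bar{Q}_{\mathrm{DD}} \approx 3.7\ \mathrm{MeV}\ \text{per reaction (mean of the two main branches)},\\
\text{rate} &\approx \frac{6.2\times 10^{18}\ \mathrm{eV\,s^{-1}}}{3.7\times 10^{6}\ \mathrm{eV}} \approx 1.7\times 10^{12}\ \mathrm{reactions\,s^{-1}},\\
\text{neutron flux} &\approx 0.5 \times 1.7\times 10^{12} \approx 10^{12}\ \mathrm{neutrons\,s^{-1}}.
\end{align*}

A flux of that order would be trivially detectable (and hazardous to the experimenters); the neutron rates actually reported in cold fusion experiments are many orders of magnitude lower. This is the quantitative core of the objection that the claimed heat is not accompanied by commensurate nuclear products.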