Everything you wanted to know about Thermodynamics
but were afraid to ask!
Synopsis
This ‘out of hours’ tutorial has been configured for students, staff, industry professionals, high school teachers and pupils, and anyone else struggling with the subject's concepts and their application at a fundamental level. It will include live demonstrations, experiments, animations and videos designed to reinforce the detail.
Apart from correcting any earlier education system failures and/or memory fade, a primary objective is to get attendees to ‘first base’ so they may pick up books and papers, or comb through web pages, with confidence. Hopefully, this will also be a precursor to practical application.
Mathematical formulations are approached from several different angles as an aid to rapid assimilation and a deeper understanding, and to demonstrate specific advantages in different fields of application.
Looking across the spectrum of human understanding there are very few things that are derived from, proven, and supported by the most fundamental of considerations and observations spanning the infinitely small to the infinitely large.
“Across the entire spectrum of physics, Einstein liked General Relativity and Thermodynamics best because they are derived from the most fundamental of considerations and see wide applicability”
Unfortunately, both appear to be taught badly and students often arrive at college and university with a partial or confused picture of the mechanisms at work, the basic principles, mathematical formulations, and general applicability. And perhaps worse, this often sees them confounded by books and papers.
This tutorial on Thermodynamics is therefore tailored to correct these failings in support of the wider lecture and research programs spanning technology, engineering, and management theory and practice at The University of Suffolk.
It is hard to overstate the importance of ‘Thermodynamics’ in providing an almost complete (Grand Unified Theory) picture of the inner physics of energy transfer, spanning machines and chemistry through to information.
Apparently, Einstein had two favourite theories: General Relativity and Thermodynamics! He championed both because of their ‘beauty’, completeness, and emergent properties purely derived from the fundamental consideration of how the universe works.
The origins of this topic mainly reside in the Industrial Revolution and the realisation that the early machinery was grossly inefficient, e.g. engines were converting only ~2% of the energy consumed into useful work output. This drew the attention of Savery (1698), Newcomen (1712), and Carnot (1824), and for the next 200 years the conundrum of lost energy occupied many of the greatest scientific minds. This culminated in Rudolf Clausius (~1850) publishing his theory of Thermodynamics, with further refinement by Boltzmann (1872).
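As a quick sanity check on the efficiency figures above, the Carnot limit η = 1 − Tc/Th bounds what any heat engine can achieve between two temperature reservoirs. The reservoir temperatures below are illustrative assumptions (roughly boiler steam vs. condenser water), not figures from the lecture:

```python
# Carnot efficiency: the theoretical upper bound eta = 1 - Tc/Th,
# with temperatures in kelvin.

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of heat convertible to work between two reservoirs."""
    return 1.0 - t_cold_k / t_hot_k

# Assumed temperatures: ~100 C boiler steam, ~20 C condenser.
eta = carnot_efficiency(373.0, 293.0)
print(f"Carnot limit: {eta:.1%}")  # ~21.4%
```

Even this modest theoretical ceiling sits an order of magnitude above the ~2% the early engines actually achieved, which is precisely the gap that drove the field.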
Why was all this so important? In the 1700s a ‘beam engine’ weighing in at >20 tons consumed vast amounts of coal to deliver an output of ~10 hp. Today a turbofan jet engine can deliver >30k hp at a weight of ~6 tons. This is the difference between working with little understanding and working today, where our knowledge is far more complete. Our latest challenges tend to centre on non-linear loss mechanisms associated with turbulent air and fuel flow. And like many other fields, we have to step beyond our generalised mathematical models and turn to the power of our computers for deeper insights.
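The figures quoted above translate into a striking power-to-weight comparison (all numbers are taken directly from the text; the arithmetic is just a back-of-envelope illustration):

```python
# Power-to-weight comparison, 1700s beam engine vs. modern turbofan.
beam_engine_hp, beam_engine_tons = 10, 20      # 1700s beam engine
turbofan_hp, turbofan_tons = 30_000, 6         # modern turbofan (>30k hp)

beam_ratio = beam_engine_hp / beam_engine_tons  # 0.5 hp/ton
fan_ratio = turbofan_hp / turbofan_tons         # 5000 hp/ton
print(f"Improvement: x{fan_ratio / beam_ratio:,.0f}")  # x10,000
```

Roughly four orders of magnitude in specific power: that is what two centuries of thermodynamic understanding bought us.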
Ultimately all machines, mechanisms, computing processes, and information itself involve the transformation of matter and/or bits, and thus they are entropic and subject to the theory of Thermodynamics. This lecture therefore presents a foundation spanning the history and progress to date in preparation for embracing other science and engineering disciplines.
Sublime Émilie - Insights into science and art through Kaija Saariaho’s opera.
Kaija Saariaho’s monodrama received its Finnish premiere April 2nd, 2015 at the Finnish National Opera. The title character Émilie du Châtelet (1706–1749) was a significant French Enlightenment mathematician, physicist and philosopher whose love of knowledge and science was equally matched by a passion for men, jewellery and gambling. Marquise du Châtelet is known as the first woman in the history of science to achieve significant results in mathematics and physics.
The scientific community and general audiences had a chance to learn about Émilie’s unique life and work on the eve of the premiere of the opera. A group of international researchers and artists who share an interest in her story came together for a series of lectures, discussions and music performances in Helsinki on 1–2 April 2015.
The event was prepared by the AvaraOpera collective, operating at University of the Arts Helsinki, and it is produced in collaboration with the Finnish National Opera. The event is jointly funded by University of the Arts and the Finnish Cultural Foundation.
http://bit.ly/sublimeemilie
Information Medicine Presentation at SAND 2015 by Nisha Manek, M.D., and Bill Tiller
This presentation was given at the Science & Non Duality Conference (SAND) in San Jose, CA.
SAND is dedicated to bringing together the complementary pathways of human inquiry: the scientific method and spiritual practices. In this sense, the mission of SAND is to bring about a consciousness-inclusive science.
It should be no surprise that AI is treading a similar path to computing which began with single-purpose machines tasked for payroll calculations, banking transactions, or weapons targeting et al, but nothing more! It took decades for General Purpose Computing to emerge in the form of the now ubiquitous PC. Today, AI is still in a single-purpose/task-specific phase, and we have no general-purpose platforms, but their emergence is only a matter of time!
Recent AI progress has seen a repeat of the media debate and alarmist warnings from our computing past, compounded by consequential advances in robotics. In turn, this has prompted numerous attempts to draw biological equivalences defining the time when machines will overtake humans. But without any workable definitions or framework, these tend to be little more than (un)educated guesses. Recourse to IQ measures and the Turing test has proved irrelevant, and without a reference framework or formal characterisation, continued discussion and debate remain futile.
We therefore approach this AI problem from the bottom up by defining the simplest of machines and lifeforms to derive clues, pointers, and basic boundary conditions. This sees a fundamental entropic description emerge that is applicable to both machines and lifeforms.
This presentation is suitable for professionals and the public alike, and is fully illustrated by high-quality graphics, animations, and movies. Inevitably, it contains some mathematics that non-practitioners will have to take on trust, but the focus is on defining the key characteristics, parameters, and important features of AI, our total dependence, and the future!
Note: This was a 40 min session for a predominantly lay audience, and not all the slides presented here were used on the day. Their inclusion here is in response to those audience members requesting more detail during and at the end of the event.
Past civilisations have nurtured small populations of those trying to understand and manipulate nature to some advantage in materials, tools, weapons, food, and wealth. However, they never formed communities and lacked the means of recording, communicating, and sharing successes and failures. They also lacked a common framework/philosophy to qualify them as scientists, but that all began to change in the 16th Century. In this lecture we consider the progression to a philosophy of science, and the underlying principles and assumptions that now guide scientific inquiry. We also examine the nature of scientific knowledge, the methods of its acquisition, its evolution and significance over past centuries, and reflect on its value to society.
In the struggle to solve problems, deliver understanding, and reveal the truth about our universe, science had to suffer and survive: ignorance, bigotry, established superstitions, and the ‘diktats’ of religions and politics, and latterly, falling education standards mired by social media. We chart that ‘scientific’ journey emphasising the importance of observation, experimentation, and the search for universal laws. Ultimately, this essentially Aristotelian perspective was challenged and overtaken by the rise of empiricism, which emphasised the importance of sensory experience and the limitations of human knowledge.
Science continues to evolve and provide us with the best truths attainable with our leading-edge technologies of observation and experimentation. Today, it stands as the greatest and richest contributor to human knowledge, understanding, progress, and wellbeing. In turn, debates and controversies are ongoing, shaping the field and its philosophy, which remains essential for understanding the nature of scientific knowledge and the models it creates. But unlike any belief system, the answers and models furnished by science are not certain and invariant; they tend to be stochastic and incomplete - ‘the best we can do’ at a given time.
In this workshop session we identify ageing technology design concepts, old business and operating models, plus energy supply limits as the prime constraints on 6G and beyond. We also identify the notion of an erroneous spectrum shortage, born of a bands-and-channels mode of operation that is fundamentally unsuited to 6G and IoT demands in the near and far future.
We strongly link optical fibre in the local loop with future wireless systems and the need for very low-energy ‘tower-less’ systems. We also postulate a future demanding UWB and HWB (Hyper Wide Band) with transmission energies of ~μW and signals below the ambient noise level. This will be necessary to power an IoT of >2.4Tn ‘Things’, which we estimate to be necessary for Industry 4/5 and sustainable societies.
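That useful signalling is possible with the signal below the ambient noise level can be checked against the Shannon-Hartley capacity C = B·log2(1 + S/N). The 500 MHz bandwidth and −10 dB SNR below are purely illustrative assumptions, not figures from the workshop:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley channel capacity C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed UWB channel: 500 MHz bandwidth, signal 10 dB BELOW the noise floor.
c = shannon_capacity_bps(500e6, -10.0)
print(f"Capacity: {c / 1e6:.0f} Mbit/s")  # ~69 Mbit/s despite the sub-noise signal
```

Wide bandwidth buys back what low SNR takes away, which is why UWB/HWB at μW transmit levels remains viable for IoT-scale data rates.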
Engineering might be defined as the judicious application of science and scientific knowledge, but with the rider that, unlike science and scientific studies, engineering always has to deliver a solution and a result. There are therefore aspects of engineering that stretch and challenge the accepted wisdom and knowledge of science. To purists this might appear outrageous, but it is no more so than the works of Erwin Schrödinger or Leonhard Euler et al.
In this lecture we examine many of the established engineering basics whilst being mindful that most of our education, techniques, and working solutions are founded on the assumption of well-behaved linear environments. As our entire universe, and everything in it, is inherently complex and non-linear, we have to salute the powers of approximation and iteration for our many engineering successes to date. However, we are increasingly being challenged by the fundamentally non-linear nature of the problems confronting us (e.g. politics, conflict, global warming, sustainability, medicine, fusion power, logistics, networks, depletion of resources, accelerating tech-driven change, and more).
We start by tracing history from the foundations up to the present day, including modern analytical nomenclature and techniques, system reliability, resilience, and costs, and we highlight the basic human limitations that necessitate multi-disciplinary teams that include AI and vast computing power.
The overall treatment includes our analogue past, digital today, and the analogue/digital hybrid future of computing, robots, networks, and systems of all kinds. It also includes animations, movies, and sound files to demonstrate the realities of modern system design, including the inherent complexities. To further highlight and exemplify this projected future, we examine a real engineering project concerned with acoustic sniper spotting under battlefield conditions and extreme noise. Here digital modelling, combined with analogue acoustic filter arrays, analogue signal amplification, and digital signal processing, doubles the range of sniper detection and location.
IoT growth forecasts currently tend to span 30 – 60Bn ‘Things’ by 2030. However, this ignores the central IoT role in realising sustainable societies, where raw material and component use have to see very high levels of reuse, repurposing, and recycling. In such a world almost everything we possess and use will have to be tagged and be electronically addressable as a part of the IoT. Such a need immediately sees growth estimates of 2Tn or more over the span of Industry 4 and 5. On the basis of energy demands alone, it is inconceivable that the technologies of Bluetooth, WiFi, 4, 5, and 6G could support such demand, and nor are the signalling and security protocols viable on such a scale.
The evolution of the IoT will therefore most likely see a new form of dynamic network requiring new lightweight protocols employing very little signal processing, together with very low energy wireless technologies (in the micro-Watt range) operating over extremely short distances (~10m). This need might be best satisfied by a new form of ‘Zero Infrastructure Mesh Networks’ that engage in active resource sharing, lossy probabilistic routing, and cyber security realised through an integrated ‘auto-immunity’ system. Ultimately, we might also envisage data amalgamation at key nodes that have a direct connection into the internet along with an additional layer of cyber checks and protection.
We justify the above assertions by illustrating the energy and network limitations of today’s 5G networks and those already obvious in current 6G proposals. We then go on to detail how a suitable IoT MeshNet might be configured and realised, along with a few solutions and emergent outcomes on the way.
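A hedged sketch of how ‘lossy probabilistic routing’ over short-range (~10m) links might behave, simulated as probabilistic flooding on a random geometric mesh. All parameters (node count, area, forwarding probability) are illustrative assumptions, not a proposed design:

```python
import math
import random

def gossip_reach(n_nodes: int, side_m: float, radio_range_m: float,
                 forward_p: float, layout_seed: int, fwd_seed: int) -> int:
    """Count nodes reached by probabilistic flooding from node 0."""
    rng = random.Random(layout_seed)
    pts = [(rng.uniform(0, side_m), rng.uniform(0, side_m))
           for _ in range(n_nodes)]
    # Neighbours = nodes within short-hop radio range (~10m links).
    nbrs = [[j for j in range(n_nodes) if j != i
             and math.dist(pts[i], pts[j]) <= radio_range_m]
            for i in range(n_nodes)]
    fwd = random.Random(fwd_seed)
    informed, frontier = {0}, [0]
    while frontier:
        nxt = []
        for i in frontier:
            for j in nbrs[i]:
                # Lossy probabilistic routing: forward with probability p only.
                if j not in informed and fwd.random() < forward_p:
                    informed.add(j)
                    nxt.append(j)
        frontier = nxt
    return len(informed)

# 300 nodes in a 100m x 100m area, 10m radio range.
full = gossip_reach(300, 100.0, 10.0, 1.0, layout_seed=1, fwd_seed=2)
lossy = gossip_reach(300, 100.0, 10.0, 0.5, layout_seed=1, fwd_seed=2)
print(full, lossy)  # p=0.5 flooding reaches a subset of the p=1.0 flood
```

The design point is that with enough neighbour density, far fewer transmissions still reach most of the mesh, trading determinism for energy, exactly the lightweight, low-processing regime argued for above.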
Recently, it has become increasingly evident that we have engineers and scientists reaching a professional level of practice without a clear understanding of the scientific method, its origins, and its fundamental workings. There also appears to be a lack of appreciation of our total dependence on the truths that science continually reveals. How this situation ensued appears to vary from country to country and with the flavour of education system encountered by students. But a common complaint is the progressive dumbing down of the science curriculum, along with a dire shortage of qualified teachers. This also seems to be compounded by the increasing speciation of science and engineering into narrower and narrower disciplines. This situation (crisis?) prompted a request for a corrective series of foundation lectures focussed on healing these educational flaws across the relevant disciplines at graduate and practising levels. This, then, is the first in that foundation series.
The Uncanny Valley addresses our reactions to humanoid objects, such as robots, video game characters, or dolls, and how they look and act ‘almost’ like a real human. Feelings of uneasiness or disgust in the observer are addressed directly, rather than familiarity or attraction. The theory was proposed by Japanese roboticist Masahiro Mori in 1970 and has been explored by many researchers and artists since. It has application in AI, robotics, MMI, and human-computer interaction, and helps designers to create more appealing devices that can interact with people in various domains, such as industry, education, entertainment, defence, health care, et al.
In this lecture we explain and demonstrate the fundamentals before extending the principle to sound, motion, actions, and eyes as an output mechanism. We also note that all this poses some challenges and risks in the potential for reduced emotional connection, empathy, acceptance, and trust between humans and machines. On a further dimension, the potential to create threat and terror can be a useful opportunity in the military domain. It is thus important to understand the causes and effects of the uncanny valley in the wider sense in order to meet the needs of each application space.
Only 40 years ago, the rate of technologically driven change was such that companies could re-organize efficiently and economically over considerable periods of time, but about 30 years ago this changed as the arrival of new technologies accelerated. We effectively moved from a world of slow periodic changes to one where change became a continuum. The leading-edge sectors were fast to recognize and adopt this new mode of continual adaptation driven by new technologies, and these ever more efficient and expansive companies came to dominate some sectors. For the majority, however, it seems that this transition was not recognized until relatively recently, and so a new movement was born under the banner of digitalization. This not only impacts the way people work, it affects company operations and changes markets, and it does so suddenly!
Perhaps the most impactful recent driver of change in this regard has been COVID, which saw the adoption of video conferencing and remote working as a survival imperative in much less than a month. This now stands as a beacon of proof that companies, organizations, and society can indeed change and adapt to the new at a rate previously considered impossible. The big danger for digitalization programmes now is the simple-minded view that there are singular (magic) solutions that fit every company and organization, but this is not the case. The reality is that the needs and culture of organizations are not the same and may not be uniform from top to bottom.
Manufacturing necessitates very steep hierarchical management structures and tight control to ensure consistency in the quality of products. On the other hand, a research laboratory or design company requires a low, flat management hierarchy and an apparently relaxed level of control. This is absolutely necessary to foster creativity, innovation, and invention. This presentation gives practical examples of these management and organizational extremes. We then go on to highlight the need to embrace AI and Quantum Computing over the coming decade to deal with future technologies, operating, and market complexity.
The aspirational visions of Society 5.0 coined by many nations around 2015/16 have now been eclipsed by technological progress and world events including another European war, global warming, climate change and resource shortages. In this new context, the published 5.0 documents now seem naive and simplistic, high on aspiration, and very short on ‘the how’. The stark reality is that the present situation has been induced by our species and our inability to understand and cope with complexity.
“There are no simple solutions to complex problems”
What is now clear is that our route to survival and Society 5.0 will be born of Industry 4.0/5.0 and a symbiosis between Mother Nature, Machines, and Mankind. Today we consume and destroy nearly 50% more resources than the planet might reasonably support, and merely improving the efficiency of all our processes and what we do will only delay the end point. And so I4.0 is founded on new materials and new processes that are far less damaging, inherently sustainable, and, most importantly, readily deployable across the planet.
“Reversing global warming will not see a climatic reversal to some previously stable state”
In this presentation, we start with the nature of climate change, move on to the technology changes that might save the day, the impact of Industry 4.0/5.0, and then postulate what Society 5.0 might actually look like.
In a world of accelerating innovation and increasingly complex digital services, applications, appliances, and devices, it seems unreasonable to expect customers to understand and maintain their own cyber security. We are way past the point where even the well educated can cope with the compounded complexity of an ‘on-line-life’. The reality is, today's products and services are incomplete and sport wholly inadequate cyber defence applications.
Perhaps the single biggest problem is that defenders have never been professional attackers - they don't share the same level of thinking and deviousness, or indeed the inventiveness, of their enemies. Apart from an education embracing the attack techniques and, in some cases, engaging in war games, the defenders remain on the back foot. However, there are a number of new, and potentially significant, approaches yet to be addressed, and here we look at the problem from a new direction.
In the maintenance of high-tech equipment and systems across many industries, identifiable precursors are employed to flag impending outages and failures. This realisation prompted a series of experiments to see if it was possible to presage pending cyber attacks. And indeed it was found to be the case!
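The maintenance-style precursor idea can be illustrated as a rolling-baseline anomaly flag over a monitored metric (e.g. failed logins or probe packets per minute). The metric, window size, and threshold below are hypothetical choices for illustration, not the method used in the experiments:

```python
from collections import deque

def precursor_alerts(samples, window=10, k=3.0):
    """Flag sample indices sitting k std-devs above a rolling baseline."""
    hist, alerts = deque(maxlen=window), []
    for i, x in enumerate(samples):
        if len(hist) == window:
            mean = sum(hist) / window
            var = sum((v - mean) ** 2 for v in hist) / window
            # Floor the std-dev at 1.0 so a flat baseline isn't over-sensitive.
            if x > mean + k * max(var ** 0.5, 1.0):
                alerts.append(i)
        hist.append(x)
    return alerts

# Quiet baseline, then an anomalous burst -- the kind of precursor
# signature that might presage a pending attack.
trace = [2, 3, 2, 4, 3, 2, 3, 4, 2, 3, 3, 2, 40, 55]
print(precursor_alerts(trace))  # -> [12, 13]
```

The point is not the specific statistic but that deviations from an established baseline are detectable before the damaging event itself, exactly as with mechanical failure precursors.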
In this presentation we give an overview of our early experimental and observational results, along with our current thinking spanning networks through to individual hackers and inside actors.
When people are exposed to the new for the first time their reaction, quite rightly, is generally one of caution and perhaps a degree of suspicion. And, when that ‘new born’ is a novel technology, reactions can quickly become amplified and biased toward the dystopian by the sensationalism of media and mis-information of social networks. In this modern era I think we can also safely assume that Hollywood has more than a ‘bit part’ in nurturing extreme reactions with movies such as Terminator, AI and Ex-Machina.
Our purpose here is to dispel the modern myth that technology is, or can be, inherently evil and a direct threat to humanity. We do so by positing three basic axioms:
“Without technology we would know and understand
almost nothing”
“The greatest threat to humanity is humanity”
“If technology progress and societal advance stall, then civilisations collapse”
Having briefly established these in the context of our wider history, we focus on the Industrial Revolutions and their beneficial upside and consequential negatives. We then move on to examine Robotics, Artificial Intelligence, Artificial Life, and Quantum Computing in the context of our current needs, realising sustainable futures, and the survival of our civilisation.
Connecting Everything Vital to Sustainability
Mobile network evolution has followed a reasonably predictable path almost entirely focused on the needs of human communication. The transition from 1G to 2G was dictated by the economics of reliability, performance, and scale, whilst 3, 4, and 5G saw the transition to mobile computing with full internet access, AI, and an ever-expanding plethora of applications. But 5G could be the end of the line, as cell-site energy demands have become excessive at ~10kW.
Midway through the migration from 4G to 5G, M2M and IoT machines overtook the human population of 8Bn people, with an estimated population of near 20Bn devices. Current IoT growth rates suggest a 40 – 60Bn population by 2030 to 2050. However, we present evidence that it could be far more, ~1,000Bn ‘Things’. This is based on observation of the number of IoT components populating modern vehicles, homes, offices, factories, and plants, along with smart ‘human implants’ and ‘smart bolts’, plus the instrumentation of civil structures.
The bold assumption that 5G would be a dominant player in the IoT is now patently one of naivety, and the world has become far more complex, with over 10 wireless standards currently in use. So this poses the question: will 6G rise to the challenge? We see this as highly unlikely, as the diversity of need is extremely broad, and we propose that this could be the end of tower-based networks for a lot of applications. A migration to mesh-nets, UWB, and HWB (Hyper Wide Band) for the IoT at frequencies above 100GHz seems the most obvious engineering choice, as it allows for far simpler designs with extremely low power at sub $0.01/device cost. 5G is already on the margins of being sustainable, and a ‘more-of-the-same’ 6G can only be far worse!
Seventy years on from AI appearing on the public scene, all the optimistic projections have been largely overtaken, with systems outgunning humans at all board, card, and computer games, including Chess, Poker, and GO. Of course, general knowledge, medical diagnosis, genetics and proteomics, and image and pattern recognition are now all firmly in the grasp of AI.
Interestingly, AI is treading a similar path to computing in that it began with single-purpose/task machines that could only deal with a company's payroll calculations or banking transactions and nothing more! General-purpose computing emerged over further decades to give us the PCs and devices we now enjoy. So, AI currently runs as task-specific applications on these general-purpose platforms, and no doubt general-purpose AI will also become tractable in a few decades too!
Recent progress has promoted a deal of debate and discussion along with hundreds of published papers and definitions that attempt to characterise biological and artificial intelligence. But they all suffer the same futility and fail! Without reference to any formal characterisation, all discussion and debate remains relatively meaningless.
Somewhat ironically, it was the defence industry that triggered the analysis work here. Two of the key steps to success were: the abandonment of all performance comparisons between biological and machine entities; and the avoidance of using the human brain as some ‘golden’ intelligence reference.
This presentation is suitable for professionals and public alike, and comes fully illustrated by high-quality graphics, animations, and movies. Inevitably, it contains (engineering) mathematics that non-practitioners will have to take on trust, whilst professionals may wish to challenge it on the basis that the focus is on getting a solution rather than on the purity of the process!
For millennia we have crafted artifacts from bulk materials that we have progressively refined to produce ever more precise tools and products. Latterly, we have crossed a critical threshold where our abilities now eclipse Mother Nature's. For example, the smallest transistors in production today have feature sizes down to 2nm, which is smaller than a biological virus at ~20 - 200nm. The implications for ICT, AI, Robotics, and Production are ever more profound as we approach, and most likely undercut, the scale of the atom at ~0.1 - 0.4nm. Not only does this open the door to new technologies, it sees new and remarkable capabilities. So, in this presentation we look at this new Tech Horizon spanning robotics to quantum computing and sensory technologies, and how they will help us realise sustainable futures germane to Industry 4.0, 5.0, and beyond.
More Related Content
Similar to Thermodynamics Tutorial - The Fundamentals
It should be no surprise that AI is treading a similar path to computing which began with single-purpose machines tasked for payroll calculations, banking transactions, or weapons targeting et al, but nothing more! It took decades for General Purpose Computing to emerge in the form of the now ubiquitous PC. Today, AI is still in a single-purpose/task-specific phase, and we have no general-purpose platforms, but their emergence is only a matter of time!
Recent AI progress has seen a repeat of the media debate and alarmist warnings for our computing past, compounded by consequential advances in robotics. In turn, this has promoted numerous attempts to draw biological equivalences defining the time when machines will overtake humans. But without any workable definitions or framework that tend to little more than un/educated guesses. Recourse to IQ measures and the Touring test have proved to be irrelevant, and without a reference framework or formal characterisation, continued discussion and debate remain futile
We therefore approach this AI problem from the bottom up by defining the simplest of machines and lifeforms to derive clues, pointers and basic boundary conditions . This sees a fundamental Entropic description emerge that is applicable to both machine and lifeforms.
This presentation is suitable for professionals and the public alike, and is fully illustrated by high-quality graphics, animations and, movies. Inevitably, it contains some mathematics that non-practitioners will have to take on trust, but the focus is on defining the key characteristics, parameters, and important features of AI, our total dependence, and the future!
Note: A 40 min session for a predominantly lay audience; not all the slides presented here were used on the day. Their inclusion is in response to audience members requesting more detail during, and at the end of, the event.
Past civilisations have nurtured small populations of those trying to understand and manipulate nature to some advantage in materials, tools, weapons, food, and wealth. However, they never formed communities and lacked the means of recording, communicating, and sharing successes and failures. They also lacked a common framework/philosophy to qualify them as scientists, but that all began to change in the 16th Century. In this lecture we consider the progression to a philosophy of science, and the underlying principles and assumptions that now guide scientific inquiry. We also examine the nature of scientific knowledge; the methods of its acquisition, evolution, and significance over past centuries; and reflect on its value to society.
In the struggle to solve problems, deliver understanding, and reveal the truth about our universe, science had to suffer and survive ignorance, bigotry, established superstitions, the ‘diktats’ of religion and politics and, latterly, falling education standards mired by social media. We chart that ‘scientific’ journey, emphasising the importance of observation, experimentation, and the search for universal laws. Ultimately, the essentially Aristotelian perspective was challenged and overtaken by the rise of empiricism, which emphasised the importance of sensory experience and the limitations of human knowledge.
Science continues to evolve and provide us with the best truths attainable with our leading-edge technologies of observation and experimentation. Today, it stands as the greatest and richest contributor to human knowledge, understanding, progress, and wellbeing. In turn, debates and controversies are ongoing, shaping the field and its philosophy, which remains essential for understanding the nature of scientific knowledge and the models it creates. But unlike any belief system, the answers and models furnished by science are not certain and invariant; they tend to be stochastic and incomplete - ‘the best we can do’ at a given time.
In this workshop session we identify aging technology design concepts, old business and operating models, plus energy supply limits as the prime constraints on 6G and beyond. We also identify the erroneous notion of a spectrum shortage, born of a bands-and-channels mode of operation that is fundamentally unsuited to 6G and IoT demands in the near and far future.
We strongly link optical fibre in the local loop with future wireless systems and the need for very low-energy ‘tower-less’ systems. We also postulate a future demanding UWB and HWB (Hyper Wide Band) with transmission energies ~μW and signals below the ambient noise level. This will be needed to support an IoT of >2.4Tn ‘Things’, which we estimate to be necessary for Industry 4/5 and sustainable societies.
Engineering might be defined as the judicious application of science and scientific knowledge, but with the rider that, unlike science and scientific studies, engineering always has to deliver a solution and a result. There are therefore aspects of engineering that stretch and challenge the accepted wisdom and knowledge of science. To purists this might appear outrageous, but it is no more so than the works of Erwin Schrödinger or Leonhard Euler et al.
In this lecture we examine many of the established engineering basics whilst being mindful that most of our education, techniques, and working solutions are founded on the assumption of well-behaved linear environments. As our entire universe, and everything in it, is inherently complex and non-linear, we have to salute the powers of approximation and iteration for our many engineering successes to date. However, we are increasingly being challenged by the fundamental non-linear nature of the problems confronting us (e.g. politics, conflict, global warming, sustainability, medicine, fusion power, logistics, networks, depletion of resources, accelerating tech-driven change, and more).
We start by tracing history from the foundations up to the present day, including modern analytical nomenclature and techniques, system reliability, resilience, and costs. We highlight the basic human limitations that necessitate multi-disciplinary teams that include AI and vast computing power.
The overall treatment includes our analogue past, digital today, and analogue/digital hybrid future of computing, robots, networks, and systems of all kinds. It also includes animations, movies, and sound files to demonstrate the realities of modern system design, including the inherent complexities. To further highlight and exemplify this projected future, we examine a real engineering project concerned with acoustic sniper spotting under battlefield conditions and extreme noise. Here, digital modelling combined with analogue acoustic filter arrays, analogue signal amplification, and digital signal processing doubled the range of sniper detection and location.
IoT growth forecasts currently tend to span 30 – 60 Bn ‘Things’ by 2030. However, this ignores the central IoT role in realising sustainable societies where raw materials and component use have to see very high levels of reuse, repurposing, and recycling. In such a world almost everything we possess and use will have to be tagged and be electronically addressable as a part of the IoT. Such a need immediately sees growth estimates of 2Tn or more over the span of Industry 4 and 5. On the basis of energy demands alone, it is inconceivable that the technologies of BlueTooth, WiFi, 4, 5, and 6G could support such demand, and nor are the signaling and security protocols viable on such a scale.
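A back-of-envelope sketch illustrates the energy argument above; the per-device power figures below are illustrative assumptions (not measurements), contrasting a WiFi/Bluetooth-class radio with the micro-Watt-class radios the text envisages:

```python
# Back-of-envelope: aggregate radio power for a trillion-scale IoT.
# Per-device figures are assumptions for illustration only:
#   WiFi/Bluetooth-class radio  ~ 0.1 W average
#   micro-Watt-class radio      ~ 1e-6 W average

DEVICES = 2.0e12  # ~2 Tn 'Things' (figure from the text)

def total_power_watts(per_device_w: float, n: float = DEVICES) -> float:
    """Aggregate power if every device averaged per_device_w."""
    return per_device_w * n

wifi_class = total_power_watts(0.1)   # 2e11 W = 200 GW
micro_watt = total_power_watts(1e-6)  # 2e6 W  = 2 MW

print(f"WiFi-class radios:       {wifi_class / 1e9:.0f} GW")
print(f"micro-Watt-class radios: {micro_watt / 1e6:.0f} MW")
```

Under these assumed figures, the conventional-radio case demands the output of hundreds of large power stations, whilst the micro-Watt case needs only a few megawatts - the five-orders-of-magnitude gap behind the "inconceivable" claim.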
The evolution of the IoT will therefore most likely see a new form of dynamic network requiring new lightweight protocols employing very little signal processing, together with very low energy wireless technologies (in the micro-Watt range) operating over extremely short distances (~10m). This need might be best satisfied by a new form of ‘Zero Infrastructure Mesh Networks’ that engage in active resource sharing, lossy probabilistic routing, and cyber security realised through an integrated ‘auto-immunity’ system. Ultimately, we might also envisage data amalgamation at key nodes that have a direct connection into the internet along with an additional layer of cyber checks and protection.
We justify the above assertions by illustrating the energy and network limitations of today’s 5G networks and those already obvious in current 6G proposals. We then go on to detail how a suitable IoT MeshNet might be configured and realised, along with a few solutions and emergent outcomes on the way.
Recently, it has become increasingly evident that we have engineers and scientists reaching a professional level of practice without a clear understanding of the scientific method, its origins, and its fundamental workings. There also appears to be a lack of appreciation of our total dependence on the truths that science continually reveals. How this situation ensued appears to vary from country to country and with the flavour of education system encountered by students. But a common complaint is the progressive dumbing down of the science curriculum along with a dire shortage of qualified teachers. This also seems to be compounded by the increasing speciation of science and engineering into narrower and narrower disciplines. This situation (crisis?) prompted a request for a corrective series of foundation lectures focussed on healing these educational flaws across relevant disciplines, at graduating and practicing levels. This then is the first in that foundation series.
Uncanny Valley addresses our reactions to humanoid objects, such as robots, video game characters, or dolls, that look and act ‘almost’ like a real human. Feelings of uneasiness or disgust in the observer are addressed directly, rather than familiarity or attraction. The theory was proposed by Japanese roboticist Masahiro Mori in 1970 and has been explored by many researchers and artists since. It has applications in AI, robotics, MMI, and human-computer interaction, and helps designers to create more appealing devices that can interact with people in various domains, such as industry, education, entertainment, defence, health care, et al.
In this lecture we explain and demonstrate the fundamentals before extending the principle to sound, motion, actions, and eyes as an output mechanism. We also note that all this poses some challenges and risks in the potential to reduce the emotional connections, empathy, acceptance, and trust between humans and machines. On a further dimension, the potential to create threat and terror can be a useful opportunity in the military domain. It is thus important to understand the causes and effects of the uncanny valley in the wider sense in order to meet the needs of each application space.
Only 40 years ago, the rate of technologically driven change was such that companies could re-organize efficiently and economically over considerable periods of time, but about 30 years ago this changed as the arrival of new technologies accelerated. We effectively moved from a world of slow periodic changes to one where change became a continuum. The leading-edge sectors were fast to recognize and adopt this new mode of continual adaptation driven by new technologies. This saw these ever more efficient and expansive companies dominating some sectors. For the majority, however, it seems that this transition was not recognized until relatively recently, and so a new movement was born under the banner of digitalization. This not only impacts the way people work, it affects company operations and changes markets, and it does so suddenly!
Perhaps the most impactful recent driver of change in this regard has been COVID, which saw the adoption of video conferencing and remote working as a survival imperative in much less than a month. This now stands as a beacon of proof that companies, organizations, and society can indeed change and adapt to the new at a rate previously considered impossible. The big danger for digitalization programmes now is the simple-minded view that there are singular (magic) solutions that fit every company and organization, but this is not the case. The reality is that the needs and cultures of organizations are not the same and may not be uniform from top to bottom.
Manufacturing necessitates very steep hierarchical management structures and tight control to ensure consistency in the quality of products. On the other hand, a research laboratory or design company requires a low, flat management hierarchy and an apparently relaxed level of control. This is absolutely necessary to foster creativity, innovation, and invention. This presentation gives practical examples of management and organizational extremes. We then go on to highlight the need to embrace AI and Quantum Computing over the coming decade to deal with future technologies, operating and market complexity.
The aspirational visions of Society 5.0 coined by many nations around 2015/16 have now been eclipsed by technological progress and world events including another European war, global warming, climate change and resource shortages. In this new context, the published 5.0 documents now seem naive and simplistic, high on aspiration, and very short on ‘the how’. The stark reality is that the present situation has been induced by our species and our inability to understand and cope with complexity.
“There are no simple solutions to complex problems”
What is now clear is that our route to survival and Society 5.0 will be born of Industry 4.0/5.0 and a symbiosis between Mother Nature, Machines, and Mankind. Today we consume and destroy nearly 50% more resources than the planet might reasonably support, and merely improving the efficiency of all our processes and what we do will only delay the end point. And so I4.0 is founded on new materials and new processes that are far less damaging, inherently sustainable, and most importantly, readily deployable across the planet.
“Reversing global warming will not see a climatic reversal to some previously stable state”
In this presentation, we start with the nature of climate change, move on to the technology changes that might save the day, the impact of Industry 4.0/5.0, and then postulate what Society 5.0 might actually look like.
In a world of accelerating innovation and increasingly complex digital services, applications, appliances, and devices, it seems unreasonable to expect customers to understand and maintain their own cyber security. We are way past the point where even the well educated can cope with the compounded complexity of an ‘on-line-life’. The reality is, today's products and services are incomplete and sport wholly inadequate cyber defence applications.
Perhaps the single biggest problem is that defenders have never been professional attackers - and they don’t share the same level of thinking and deviousness, or indeed the inventiveness, of their enemies. Apart from an education embracing attack techniques, and in some cases engaging in war games, the defenders remain on the back foot. However, there are a number of new, and potentially significant, approaches yet to be addressed, and here we look at the problem from a new direction.
In the maintenance of high-tech equipment and systems across many industries, identifiable precursors are employed to flag impending outages and failures. This realisation prompted a series of experiments to see if it was possible to presage pending cyber attacks. And indeed it was found to be the case!
In this presentation we give an overview of our early experimental and observational results, along with our current thinking spanning networks through to individual hackers and inside actors.
When people are exposed to the new for the first time their reaction, quite rightly, is generally one of caution and perhaps a degree of suspicion. And, when that ‘new born’ is a novel technology, reactions can quickly become amplified and biased toward the dystopian by the sensationalism of media and the misinformation of social networks. In this modern era I think we can also safely assume that Hollywood has more than a ‘bit part’ in nurturing extreme reactions with movies such as Terminator, AI, and Ex Machina.
Our purpose here is to dispel the modern myth that technology is, or can be, inherently evil and a direct threat to humanity. We do so by positing three basic axioms:
“Without technology we would know and understand
almost nothing”
“The greatest threat to humanity is humanity”
“If technology progress and societal advance stall, then civilisations collapse”
Having briefly established these in the context of our wider history, we focus on the Industrial Revolutions and their beneficial upside and consequential negatives. We then move on to examine Robotics, Artificial Intelligence, Artificial Life, and Quantum Computing in the context of our current needs, realising sustainable futures, and the survival of our civilisation.
Connecting Everything Vital to Sustainability
Mobile network evolution has followed a reasonably predictable path almost entirely focused on the needs of human communication. The transition from 1 to 2G was dictated by the economics of reliability, performance, and scale, whilst 3, 4, and 5G saw the transition to mobile computing with full internet access, AI and an ever-expanding plethora of applications. But 5G could be the end of the line as cell-site energy demands have become excessive at ~10kW.
Midway through the migration from 4G to 5G, M2M and IoT machines overtook the human population of 8Bn people, with an estimated 20Bn devices. Current IoT growth rates suggest a 40 - 60Bn population by 2030 to 2050. However, we present evidence that it could be far more: ~1,000Bn ‘Things’. This is based on observation of the number of IoT components populating modern vehicles, homes, offices, factories and plants, along with smart ‘human implants’ and ‘smart bolts’, plus the instrumentation of civil structures.
The bold assumption that 5G would be a dominant player in the IoT is now patently one of naivety, and the world has become far more complex, with over 10 wireless standards currently in use. So this poses the question: will 6G rise to the challenge? We see this as highly unlikely, as the diversity of need is extremely broad, and we propose that it could be the end of tower-based networks for a lot of applications. A migration to mesh-nets, UWB, and HWB (Hyper Wide Band) for the IoT at frequencies above 100GHz seems the most obvious engineering choice as it allows for far simpler designs with extremely low power at sub $0.01/device cost. 5G is already on the margins of being sustainable, and a ‘more-of-the-same’ thinking 6G can only be far worse!
Seventy years on from AI appearing on the public scene, all the optimistic projections have been largely overtaken, with systems outgunning humans at all board, card, and computer games including Chess, Poker, and Go. Of course, general knowledge, medical diagnosis, genetics and proteomics, and image and pattern recognition are now all firmly in the grasp of AI.
Interestingly, AI is treading a similar path to computing in that it began with single-purpose/task machines that could only deal with a company’s payroll calculations or banking transactions and nothing more! General-purpose computing emerged over subsequent decades to give us the PCs and devices we now enjoy. So, AI currently runs as task-specific applications on these general-purpose platforms, and no doubt general-purpose AI will also become tractable in a few decades too!
Recent progress has promoted a deal of debate and discussion along with hundreds of published papers and definitions that attempt to characterise biological and artificial intelligence. But they all suffer the same futility and fail! Without reference to any formal characterisation, all discussion and debate remains relatively meaningless.
Somewhat ironically, it was the defence industry that triggered the analysis work here. Two of the key steps to success were: the abandonment of all performance comparisons between biological and machine entities; and the avoidance of using the human brain as some ‘golden’ intelligence reference.
This presentation is suitable for professionals and public alike, and comes fully illustrated by high-quality graphics, animations, and movies. Inevitably, it contains (engineering) mathematics that non-practitioners will have to take on trust, whilst professionals may wish to challenge the focus on getting a solution rather than the purity of the process!
We are engaged in a war the like of which we have never seen or experienced before. Our enemies are invisible and relentless; with globally dispersed forces working at all levels and in all sectors of our societies. They are better organised, resourced, motivated, and adaptive than any of our organisations or institutions, and they are winning. This war is also one of paradox!
“The cost to many nations is now on a par with their GDP”
“No previous war has seen so many suffer so much to (almost) never retaliate”
“We are up against attackers who operate as a virtual (ghost-like) guerrilla army”
“No state can defend its population and organisations, and they stand alone - isolated and exposed”
“A real army/defence force would rehearse and play all day and very occasionally engage in warfare. We, on the other hand, are at war every day but never play, war-game, or anticipate new forms of attack”
To turn this situation around we need to understand our enemies and adopt their tactics and tools as a part of our defence strategy. We also have to be united and organised so that no one, and no organisation, stands alone. We also have to engage in sharing attack data, experiences, and solutions.
All this has to be supported by wargaming, and anticipatory solutions creation.
The good news is: we have better, and more, people, machines, networks, facilities, and expertise than our enemies. All it requires is the embracing of advanced R&D, leadership, sharing, and orchestration on a global scale.
In 2015/16 a number of bodies/nations set about defining the societies they would aspire to in the near future. Each vision document similarly described some idealistic, egalitarian, super-smart, human-centred state providing a near uniformity of living conditions and opportunity. At the same time, each society would be free of adversity, with economic development guided by ecological and human need. Of course, economic growth was defined to continue in line with the past. Very nice, but a product of old linear thinking and modelling!
It is now approaching 2022, and in the past 5-7 years our base silicon technology has advanced to enjoy a >30-fold increase in computing power. Our top-end mobile devices would now challenge a supercomputer of the 1996/7 era, whilst AI systems now pervade our homes, offices, vehicles, professions, and all our on-line services. At the same time, information overload has started to rival some medical conditions!
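As a sanity check on the ">30 fold" figure, simple doubling arithmetic suffices; the ~1.5-year doubling period assumed below is in the spirit of Moore's law, not a figure from the text:

```python
# Growth factor under exponential doubling:
#   factor = 2 ** (years / doubling_time)
# A ~1.5-year doubling period is an assumption for illustration.

def growth_factor(years: float, doubling_years: float = 1.5) -> float:
    """How many times capability multiplies over `years`."""
    return 2 ** (years / doubling_years)

print(growth_factor(5))    # ~10x over 5 years
print(growth_factor(7.5))  # 32x over 7.5 years
```

So a 5-7 year window at that cadence spans roughly a 10x to 30x improvement, consistent with the claim.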
All of this has also been compounded by two years of COVID-19 lockdowns and restrictions that have seen the normalisation of social isolation, limited travel, working and education from home, and virtualised medicine and care, support services, shopping, and meetings. In turn, this has resulted in empty offices, towns, and cities. Concurrently, climate change, global warming, pollution, finite resources, a stressed planetary system, and social unrest have suddenly become urgent issues. Against this backdrop it really seems to be time to revisit those Society 5.0 Visions and the limited linear thinking that contrived them!
In this presentation we examine many of the core parameters and assumptions to highlight existing, or soon to be realised, solutions and remedies. In doing so, a different picture of Society 5.0 emerges.
The biggest force for social change since the first industrial revolution has been adjusting to, and taking advantage of, the new and accelerating capabilities of our advancing technologies. And in our entire history, the dominant technology driver has been silicon-based electronics. It has prompted revolutions in Computing, Telecoms, Automation, AI, and Robotics that radically changed the human condition. Today, that same exponential revolution is accelerating us into Industry 4.0 and onto Industry 5.0.
The consequential transformation of medicine, industrial design and production, farming, food processing, and supply and demand has seen living standards improve and life expectancy extend. Many of our institutions have also seen tech-driven transformations in line with industry. If there has been a downside to this progression, it has been our inability to transform the workforce ahead of new demands. Unemployment has persisted whilst re-education and retraining have been on the back foot, even though the net creation of new jobs has always exceeded the demise of the old. As a result, leading countries in the first world now have labour shortages at all levels right across the spectrum.
Recently, COVID-19 has demonstrated that we have the technology and we can rapidly reorganise and change society if we have to. So in this presentation, we examine ‘the force functions’ and changes engineered to date, and then peer over the horizon to sample what is to come in terms of technologies and working practices…
Throughout my career in science, engineering, and management I attended numerous meetings where many misconceptions and misinterpretations were evident. Perhaps the most expansive and expensive were the probabilities assumed and calculated for system reliability and/or product manufacturing quality. Eventually, I began to refer to this as the ‘five nines’ problem!
Without fully understanding the origins of the reliability measures, it is all too easy to demand 99.999% instead of 99.99% up-time for an electronic system. What could be easier? At face value it appears trivial and straightforward! Likewise, taking a manufacturing plant from a 5σ up to a 6σ defect level turns out to be a monumental engineering challenge - and at the time of writing, 6σ has never been achieved!
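The arithmetic behind both measures can be sketched directly; the 1.5σ long-term mean shift used below is the conventional Six Sigma assumption:

```python
# 'Five nines' availability and Six Sigma defect arithmetic.
from math import erf, sqrt

def downtime_minutes_per_year(availability: float) -> float:
    """Permitted downtime per year for a given availability."""
    return (1.0 - availability) * 365.25 * 24 * 60

def dpmo(sigma_level: float, shift: float = 1.5) -> float:
    """Defects per million opportunities at a sigma level, using
    the conventional 1.5-sigma long-term mean shift."""
    # Standard normal CDF via the error function.
    phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    return (1.0 - phi(sigma_level - shift)) * 1e6

print(downtime_minutes_per_year(0.9999))   # 'four nines'  ~52.6 min/yr
print(downtime_minutes_per_year(0.99999))  # 'five nines'   ~5.3 min/yr
print(dpmo(5))  # ~233 defects per million
print(dpmo(6))  # ~3.4 defects per million
```

Each extra nine cuts the permitted downtime tenfold, and the step from 5σ to 6σ demands nearly a 70-fold reduction in defects - which is why both look trivial on paper and are monumental in practice.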
It appears that too few engineering and management courses address this topic, and if they do, it is as a scant reference of insufficient depth. So we see far too many students who fail to understand it in any depth, if at all! And when they become managers, they just ‘don’t get it’!
This presentation and the associated lecture have been specifically created to address this problem with relevance to BSc, BA, MSc and MBA students along with anyone needing a refresher or explicit introduction to the topic. In addition to the graphics, animations and movies, the lecture is also littered with practical examples and the outcomes of case studies.
Industries 1.0, 2.0 (and most of) 3.0, saw manufacturing and construction using natural materials readily extracted, refined, amalgamated, machined, and molded. In general, these exhibited fixed mechanical, electrical, and chemical properties. However, the latter stages of Industry 3.0 embraced synthetics exhibiting superior properties to afford new degrees of freedom in the design of structures and products.
Today Industry 4.0 sees further advances with metamaterials, dynamic coatings, controllable properties, and additive manufacturing. Embedded smarts have also made communication between components, products and structures possible under the guise of the IoT. Adaptable materials with a degree of self-repair are also opening the door to further freedoms and less material use. In combination, these represent a big step toward sustainable societies with highly efficient ReUse, RePurposing, and Recycling (3R).
At the leading edge, we are now realising active surfaces that can reflect, absorb, or amplify wireless signals, offer programmable colour, and integral energy storage. But amongst a growing list of possibilities, it is integral sensing & communication that may define this new era. In this presentation, we look at these advances in the context of smart design, cities & societies.
We are engaged in an exponentially growing cyber war that we are visibly losing. Within the next 3 years it has been estimated that the global cost will equal, or overtake, the UK GDP, and it is clear that our defences are inadequate and often ineffective. Malware and ransomware continue to extort more money, and cause damage and inconvenience to individuals, organisations, and society, whilst hacker groups, criminals, and rogue states continue to innovate and maintain their advantage. At the same time, our defences are subverted and rendered ineffective as we operate in a reactive and prescriptive, after-the-fact mode with no foresight or anticipation.
In any war it is essential to know and understand as much about the enemy as possible; it is also necessary to establish the truth and validity of any situation or development. Doing this in the cyber domain is orders of magnitude more difficult than in the real world, but some of the relevant tools are now available or at an advanced stage of development. For example, fully automated fact checkers and truth engines have been demonstrated, whilst situational awareness technologies are commercially available. However, what is missing is some level of context assessment on a continual basis. Without this we will continue to be ‘blind-sided’ by the actions and developments of the attackers as they maintain their element of surprise along every line of innovation.
What do we need? In short: a Context Engine that continually monitors networks, servers, routers, machines, devices, and people for anomalous behaviours, flagging pending attacks as behavioural deviations that are generally easy to detect. In the case of attacker groups we have observed precursor events and trends in network activity days ahead of some big offensive. However, this requires a shift in the defenders’ thinking and operations away from the reactive and short term, toward long-term continual monitoring, data collection, and analysis in order to establish threat assessments in real time.
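As a purely illustrative sketch (not the detection system described here), flagging a behavioural deviation can be as simple as a z-score against a rolling baseline of network activity:

```python
# Illustrative only: flag samples that deviate from a trailing
# baseline by more than `threshold` standard deviations.
from statistics import mean, stdev

def flag_anomalies(series, window=10, threshold=3.0):
    """Return indices where a sample's z-score against the
    preceding `window` samples exceeds `threshold`."""
    flagged = []
    for i in range(window, len(series)):
        base = series[i - window:i]
        mu, sd = mean(base), stdev(base)
        if sd > 0 and abs(series[i] - mu) / sd > threshold:
            flagged.append(i)
    return flagged

# Steady traffic with one sudden surge (a hypothetical precursor):
traffic = [100, 102, 99, 101, 100, 98, 103, 100, 101, 99, 100, 250, 101]
print(flag_anomalies(traffic))  # -> [11]
```

Real precursor detection of course demands far richer features and longer horizons, but the principle - a continually updated baseline against which deviations stand out - is the same.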
The behavioural analysis of people, networks, and ICT is at the core of our ‘Context Engine’ solution, which completes the triangle of Truth, Situation, and Context Awareness to provide defenders with a fuller and transformative picture. Most of the known precursor elements of this undertaking have been studied in some depth, with some behavioural elements identified on real networks and in some physical situations. The unknown can only add more accuracy!
Slides from talk:
Aleš Zamuda: Remote Sensing and Computational, Evolutionary, Supercomputing, and Intelligent Systems.
11th International Conference on Electrical, Electronics and Computer Engineering (IcETRAN), Niš, 3-6 June 2024
Inter-Society Networking Panel GRSS/MTT-S/CIS Panel Session: Promoting Connection and Cooperation
https://www.etran.rs/2024/en/home-english/
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...Ana Luísa Pinho
Functional Magnetic Resonance Imaging (fMRI) provides means to characterize brain activations in response to behavior. However, cognitive neuroscience has been limited to group-level effects referring to the performance of specific tasks. To obtain the functional profile of elementary cognitive mechanisms, the combination of brain responses to many tasks is required. Yet, to date, both structural atlases and parcellation-based activations do not fully account for cognitive function and still present several limitations. Further, they do not adapt overall to individual characteristics. In this talk, I will give an account of deep-behavioral phenotyping strategies, namely data-driven methods in large task-fMRI datasets, to optimize functional brain-data collection and improve inference of effects-of-interest related to mental processes. Key to this approach is the employment of fast multi-functional paradigms rich on features that can be well parametrized and, consequently, facilitate the creation of psycho-physiological constructs to be modelled with imaging data. Particular emphasis will be given to music stimuli when studying high-order cognitive mechanisms, due to their ecological nature and quality to enable complex behavior compounded by discrete entities. I will also discuss how deep-behavioral phenotyping and individualized models applied to neuroimaging data can better account for the subject-specific organization of domain-general cognitive systems in the human brain. Finally, the accumulation of functional brain signatures brings the possibility to clarify relationships among tasks and create a univocal link between brain systems and mental functions through: (1) the development of ontologies proposing an organization of cognitive processes; and (2) brain-network taxonomies describing functional specialization. 
To this end, tools to improve commensurability in cognitive science are necessary, such as public repositories, ontology-based platforms and automated meta-analysis tools. I will thus discuss some brain-atlasing resources currently under development, and their applicability in cognitive as well as clinical neuroscience.
Richard's aventures in two entangled wonderlandsRichard Gill
Since the loophole-free Bell experiments of 2020 and the Nobel prizes in physics of 2022, critics of Bell's work have retreated to the fortress of super-determinism. Now, super-determinism is a derogatory word - it just means "determinism". Palmer, Hance and Hossenfelder argue that quantum mechanics and determinism are not incompatible, using a sophisticated mathematical construction based on a subtle thinning of allowed states and measurements in quantum mechanics, such that what is left appears to make Bell's argument fail, without altering the empirical predictions of quantum mechanics. I think however that it is a smoke screen, and the slogan "lost in math" comes to my mind. I will discuss some other recent disproofs of Bell's theorem using the language of causality based on causal graphs. Causal thinking is also central to law and justice. I will mention surprising connections to my work on serial killer nurse cases, in particular the Dutch case of Lucia de Berk and the current UK case of Lucy Letby.
Phenomics assisted breeding in crop improvementIshaGoswami9
As the population is increasing and will reach about 9 billion upto 2050. Also due to climate change, it is difficult to meet the food requirement of such a large population. Facing the challenges presented by resource shortages, climate
change, and increasing global population, crop yield and quality need to be improved in a sustainable way over the coming decades. Genetic improvement by breeding is the best way to increase crop productivity. With the rapid progression of functional
genomics, an increasing number of crop genomes have been sequenced and dozens of genes influencing key agronomic traits have been identified. However, current genome sequence information has not been adequately exploited for understanding
the complex characteristics of multiple gene, owing to a lack of crop phenotypic data. Efficient, automatic, and accurate technologies and platforms that can capture phenotypic data that can
be linked to genomics information for crop improvement at all growth stages have become as important as genotyping. Thus,
high-throughput phenotyping has become the major bottleneck restricting crop breeding. Plant phenomics has been defined as the high-throughput, accurate acquisition and analysis of multi-dimensional phenotypes
during crop growing stages at the organism level, including the cell, tissue, organ, individual plant, plot, and field levels. With the rapid development of novel sensors, imaging technology,
and analysis methods, numerous infrastructure platforms have been developed for phenotyping.
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...Wasswaderrick3
In this book, we use conservation of energy techniques on a fluid element to derive the Modified Bernoulli equation of flow with viscous or friction effects. We derive the general equation of flow/ velocity and then from this we derive the Pouiselle flow equation, the transition flow equation and the turbulent flow equation. In the situations where there are no viscous effects , the equation reduces to the Bernoulli equation. From experimental results, we are able to include other terms in the Bernoulli equation. We also look at cases where pressure gradients exist. We use the Modified Bernoulli equation to derive equations of flow rate for pipes of different cross sectional areas connected together. We also extend our techniques of energy conservation to a sphere falling in a viscous medium under the effect of gravity. We demonstrate Stokes equation of terminal velocity and turbulent flow equation. We look at a way of calculating the time taken for a body to fall in a viscous medium. We also look at the general equation of terminal velocity.
hematic appreciation test is a psychological assessment tool used to measure an individual's appreciation and understanding of specific themes or topics. This test helps to evaluate an individual's ability to connect different ideas and concepts within a given theme, as well as their overall comprehension and interpretation skills. The results of the test can provide valuable insights into an individual's cognitive abilities, creativity, and critical thinking skills
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...University of Maribor
Slides from:
11th International Conference on Electrical, Electronics and Computer Engineering (IcETRAN), Niš, 3-6 June 2024
Track: Artificial Intelligence
https://www.etran.rs/2024/en/home-english/
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...Travis Hills MN
Travis Hills of Minnesota developed a method to convert waste into high-value dry fertilizer, significantly enriching soil quality. By providing farmers with a valuable resource derived from waste, Travis Hills helps enhance farm profitability while promoting environmental stewardship. Travis Hills' sustainable practices lead to cost savings and increased revenue for farmers by improving resource efficiency and reducing waste.
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...
Thermodynamics Tutorial - The Fundamentals
1. Thermodynamics
The basic mechanisms and laws
that govern, life, the universe,
science, engineering,
and everything!
Everything you wanted to ask about,
But were afraid to ask!
Wed 7 March 2018 16:00 – 18:00
Out of hours public lecture
Presented by Professor
Peter Cochrane OBE
Ipswich Waterfront Building
Animated tutorial style with
demonstrations, videos &
provoking propositions
Organised and hosted by the
UoS Innovation Centre
2. A story just for students or what? Entropic formula by
Ludwig Boltzmann
between 1872 - 1875
As a student I was taught that Boltzmann got so frustrated
that people would not believe him and would not accept
his ideas and theories that he committed suicide whilst on
a family holiday - 5 September 1906, Duino, Italy
(I have never been able to verify this! )
Member of the Austrian Academy of Sciences and in 1887
President of the University of Graz
Member of Royal Swedish Academy of Sciences 1888
Member of Royal Society 1899
Named in his honour:
• Boltzmann's Energy Equipartition theorem
• Boltzmann brain
• Boltzmann constant
• Boltzmann machine
• Boltzmann equation
• Lattice Boltzmann methods (fluid dynamics)
• Ludwig Boltzmann Gesellschaft
• Boltzmann Medal
• Boltzmann (crater)
3. To Understand!
Art
Life
Death
Physics
Theory
Biology
Science
Practice
Industry
Education
Chemistry
Philosophy
Experience
Experiment
Manufacturing
Mathematics
Observation
Engineering
Agriculture
Technology
Cosmology
Computing
Production
Economics
Transport
Telecoms
Design
Construction
Management
Information
Sociology
Pharmacy
Medicine
Logistics
Security
Systems
Energy
Health
Cyber
AI/AL
++++
We like to put things in boxes
We teach and learn in boxes
It is fast, efficient, & effective up
to a point
A product of an education system
forged to meet the needs of the
industrial revolution
This subject cannot be ‘boxed’
as it is so widely applicable and
the closest we have to a GUT !
4. THE 30 sec VERSION
There is a game
We are all in the game
No one can opt out !
BAD NEWS: You can’t win the game
GOOD NEWS: You can break even
LIMITATION: But only on a very cold day
BAD NEWS: It never gets that cold !
A fundamental realisation simplified
5. Forgotten or disconnected
We do things
We see things
We experience things
We are taught lots of things
We actually understand lots of things
AND THEN we forget or do not link them
DEMO TIME
6. People are wasting their lives and $$$M
- Fools
- The Ignorant
- Tricksters & Charlatans
FREE ENERGY IS ALIVE
AND WELL - BUT STILL
NOT DELIVERING !!
THIS CAUSE IS YET
ANOTHER IRRATIONAL
BELIEF SYSTEM !!
7. It all boils down to this
And it is this obvious!
8. Everything is connected
Energy is not magic!
There is no free energy!
There is always a price to pay!
100% efficiency is unattainable!
To levitate a 100 kg human @ 1 m/s:
@ a range of 0 m ~ 1 kW
@ a range of 1 m ~ 10 kW
@ a range of 20 m ~ 4 MW
Harry definitely needs a thicker arm and wand !!
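The ~1 kW figure at zero range can be sanity-checked from first principles: lifting a mass at constant speed needs power P = m·g·v. A minimal Python sketch, assuming g = 9.81 m/s² and ignoring the field-coupling losses that presumably lie behind the larger at-range figures:

```python
# Baseline mechanical power to raise a mass at constant speed: P = m * g * v.
# The slide's 10 kW and 4 MW at-range figures include coupling losses,
# which are deliberately not modelled here.
def lift_power_watts(mass_kg: float, speed_m_s: float, g: float = 9.81) -> float:
    """Mechanical power needed to raise `mass_kg` at `speed_m_s` against gravity."""
    return mass_kg * g * speed_m_s

print(lift_power_watts(100, 1.0))  # 981.0 W, i.e. ~1 kW as on the slide
```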
9. Mysticism does not solve our problems!
Energy is not magic!
There is no free energy!
There is always a price to pay!
100% efficiency is unattainable!
Human brain ~ 60-100 W
Human body ~ 100-1,000 W
Lifting power required ~ 400 MW
If Luke is projecting the force he needs a thicker arm !!
That energy did not pass through his head or body
10. THERMO DYNAMICS
Thermo: relating to heat. Dynamics: relating to change/movement
No one could have
guessed that the study
of heat would reveal
laws and principles
pertinent to all systems
in general - biological -
chemical - physical and
man made
The closest we have come to a
‘one theory fits all’ - a unified
set of observations - a generally
applicable law, & comprehensive
understandings
11. RANKING
Across the entire spectrum of physics,
Einstein liked General Relativity and
Thermodynamics best because they
are both derived from fundamental
considerations of how the universe
works; their ‘completeness’ and as
fundamentally emergent properties
12. THE NATURE OF HEAT
A complete mystery for millennia
Used and exploited with
rules of thumb based on
feel, colour, melting
point of metals and
malleability etc
13. ASSOCIATED WITH THE GODS
Ancient Egyptians related heat to mythology
Heat was a component of the “primordial
forces” from which all was formed, and of
the elements of chaos that existed before
the creation of the sun
14. A TOOL FOR PEACE AND WAR
Ancient Greeks created an early flame thrower
First theory, Heraclitus ~500 BC:
Elements in nature: fire, earth, & water
“All things are flowing - and all things are
an exchange for fire”
15. INDUSTRIAL REVOLUTION 1
Water transport & limited green energy ~1770
Water and water flow sufficiently understood and
characterised by Archimedes, and adaptations to
Newtonian Mechanics
~5 bhp @ <10% efficiency of energy extraction
BIG Unknowns
How big a wheel ?
What shape the blades ?
Overflow or underflow of water ?
How high the head and volume of water ?
16. CONCATENATED MILLS
Quickly ran out of energy as more built
The fight for efficiency was driven by
building mill after mill along rivers
only to run out of energy….a victory
for science, engineering, innovation…
and a foundation for the rationale of
thermodynamics
A story to be repeated with the
steam engine and the belt drive…
with energy distribution, friction and
efficiency…the next big hurdles
17. Water transport & limited green energy ~1770
DIGGING INTO THE DETAIL
Experiments, intuition, guessing, theory, understanding
Mathematician Leonhard Euler and his son Johann Albrecht (1750s) experimented with and characterised waterwheels
Mathematician Jean-Victor Poncelet (1826) proposed the inward-flowing radial turbine - the modern turbine precursor
Engineer Samuel B. Howd (1838) patented the enclosed vertical spindle & curved blades
Hydraulic Engineer James B. Francis (1848) added guide vanes and shaped the
blades to the correct angle
The Francis turbine is still the most widely used for medium-high heads/pressures
Engineer James Thomson added pivoted curved guide vanes to assure optimum
flow even at part load
18. INDUSTRIAL REVOLUTION 2
Steam power and ‘engine design’ were very
rudimentary with many accidents and huge
inefficiencies apparent due to the lack of
scientific knowledge and sound
engineering principles
Steam transport & energy from coal ~1850
BIG Unknowns
What fuel ?
What fluid ?
What pressure ?
What size piston ?
How efficient the design ?
~30 bhp @ <15% efficiency
19. THE LEARNING EXPERIENCE
Big accidents
People get hurt/killed
Understanding is essential
Without knowledge and understanding……
Any fool can build a water mill
and/or a steam engine, but it
takes science and engineering
to build a good/efficient/safe
machine
~500 bhp @ <20% efficiency
20. COAL, FIRE, WATER, STEAM, BELT
A long path to understanding and efficiency realisation
~300 bhp @ <20% efficiency
~100k bhp @ >95% efficiency
~200 bhp @ <15% efficiency
21. COAL, FIRE, WATER, STEAM, BELT
A long path to understanding and efficiency realisation
~200 bhp @ <20% efficiency
~100k bhp @ >95% efficiency
~30 bhp @ <15% efficiency
23. SCIENCE COMES TO THE RESCUE
Deep knowledge & understanding creates acceleration
All our transport, energy, industrial,
civil, military, information & network
systems hinge on this knowledge
Riding a horse to walking on the moon < 200 years
Riding on a train to riding on a rocket <100 years
24. THE RUN UP
Perspective thinking
~1662 Robert Boyle - Gas Law
P.V = k (Volume and Pressure are inversely related when Temperature is constant)
More generally: P.V = mRT
~1666 Isaac Newton - Laws of Motion
F = d(mv)/dt
s = ut + ½ft²
Force, Motion, Inertia, Momentum:
1) A body remains at rest or in uniform linear motion
until acted upon by some external force
2) Force is the time rate of change of momentum
3) For every action in nature there is
always an equal and opposite reaction
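Both 'run up' results are easy to check numerically; a minimal Python sketch in SI units (the particular pressures, volumes and times are illustrative):

```python
def boyle_pressure(p1: float, v1: float, v2: float) -> float:
    """Boyle's law at constant temperature: p1 * v1 = p2 * v2."""
    return p1 * v1 / v2

def displacement(u: float, f: float, t: float) -> float:
    """Newtonian kinematics under constant acceleration f: s = u*t + f*t**2 / 2."""
    return u * t + 0.5 * f * t ** 2

# Halve the volume of a gas at constant temperature and the pressure doubles
print(boyle_pressure(100_000.0, 2.0, 1.0))  # 200000.0 Pa
# Two seconds of free fall from rest (f = 9.81 m/s^2) covers 19.62 m
print(displacement(0.0, 9.81, 2.0))  # 19.62 m
```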
26. THE RUN UP
Step by step thinking
~1850 Rudolf Clausius,
William Thomson (Lord Kelvin),
and Max Planck
All proved and stated the First and Second Laws of
Thermodynamics:
- the total energy in a closed system is conserved
- there really is no free lunch
- 100% efficiency can never be realised
- perpetual motion machines are impossible
Heat converted to work analysis;
work converted to heat analysis
Clausius gave ‘Entropy’ its name and was the first to
lecture and teach the subject
Entropy: coined from the Greek tropē (a turning, transformation), deliberately echoing ‘energy’
27. THEORY FORMULATION
It all starts with the nature of heat
The physics of and understanding of heat…followed by a realisation of
a general and universal applicability…
Biology
Systems
Chemistry
Cosmology
Information
Communications
Artificial Intelligence
+++++ almost everything else
…with each discipline taking a different
route, applying a different emphasis,
interpretation and application set
Sad to say the conservationists,
sustainability enthusiasts, and
politicians still have to discover
thermodynamics and entropy!
28. INITIAL FORMULATION
Derived from the kinetic theory of gases
Law 0: If two systems are in thermal equilibrium with a third then all are in equilibrium
Law 1: Conservation - energy cannot be created or destroyed in an isolated system
Law 2: Celestial Ratchet - entropy of any isolated system always increases
Law 3: System Entropy - approaches a constant as temperature approaches absolute zero
…theories, experiments, trials and observations repeated many times,
in many ways, by hundreds of teams across the planet over many lifetimes.
The most complete view of the way (our) universe works and why!
All based on what we observe, what we can test, what we can prove:
the ‘truths’ established by standing on the shoulders of giants over
millennia…
29. If two systems are in thermal equilibrium with a third then all are in equilibrium
LAW 0 Defines temperature and heat flow
Cold Water + Hot Water + Dish } All want to be at the same temperature
Flow is hot to cold and never the reverse direction
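The dish-of-water demonstration can be sketched numerically: with equal specific heats and no losses to the surroundings (an idealising assumption), the shared equilibrium temperature is just the mass-weighted mean. The masses and temperatures below are illustrative:

```python
# Zeroth-law sketch: hot water, cold water and the dish all relax to one
# common temperature; heat flows hot -> cold until they agree.
def equilibrium_temp(masses_kg, temps_c):
    """Common final temperature for bodies of equal specific heat, mixed losslessly."""
    return sum(m * t for m, t in zip(masses_kg, temps_c)) / sum(masses_kg)

# 1 kg of hot water (80 C) + 1 kg of cold water (20 C) settle at 50 C
print(equilibrium_temp([1.0, 1.0], [80.0, 20.0]))  # 50.0
```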
30. LAW 1: Conservation of Energy
Energy cannot be created or destroyed in an isolated system
Energy can only change in form: potential, kinetic, heat
Perhaps obvious from Einstein’s E = mc²
In a universe, or system, of constant mass the energy is also constant
But this is partially a circular argument
Internal energy change = Heat input - Work done by the system: ∆U = Q - W
Equivalently, counting W as work done on the system: ∆U = Q + W
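The first-law bookkeeping is a one-liner once the sign convention is fixed; a minimal sketch with Q counted as heat into the system and W as work done by it (the joule values are illustrative):

```python
def delta_u(q_in: float, w_by_system: float) -> float:
    """First law: change in internal energy = heat in minus work done by the system."""
    return q_in - w_by_system

# 500 J of heat in, 200 J of work extracted: internal energy rises by 300 J
print(delta_u(500.0, 200.0))  # 300.0
# Adiabatic compression: no heat, 100 J of work done ON the gas (w_by_system = -100)
print(delta_u(0.0, -100.0))  # 100.0
```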
31. AS FAR AS WE KNOW
The total mass/energy of the universe is constant
Dark Energy ?
Quantum Dynamics ?
Black Holes ?
String Theory ?
Worm Holes ?
Multi-Dimensionality ?
Relativity
Quantum
Mechanics
Experiments
Observations
Experiences
Measurements
Theories
32. LAW 2: Full energy and state accounting
Celestial Ratchet - entropy of any isolated system always increases
Entropy can be thought of in terms of order and disorder
All systems tend toward a disordered state
One way processes
Drop a cup and it smashes, but the reverse never happens
A battery always discharges; it never decides to recharge itself
We live and then die and not the other way round - there are no zombies
The more energy is dispersed - the greater the entropy
33. LAW 2: Full energy and state accounting
Celestial Ratchet - entropy of any isolated system always increases
Hot coffee in a mug is concentrated - but as the heat dissipates into the surroundings
the heat flow sees entropy fall in the mug and rise in the environment
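The coffee-mug bookkeeping can be made quantitative: heat Q leaving the mug at Th and arriving in the room at Tc changes the total entropy by Q/Tc - Q/Th, which is always positive when Th > Tc. A minimal sketch (temperatures in kelvin; the values are illustrative):

```python
# Law-2 sketch for the coffee mug: the mug's entropy falls by Q/Th, the
# room's rises by Q/Tc, and the total change is positive whenever Th > Tc.
def total_entropy_change(q_joules: float, t_hot_k: float, t_cold_k: float) -> float:
    """Net entropy change when heat q flows from a hot body to a cold one."""
    return q_joules / t_cold_k - q_joules / t_hot_k

# 1 kJ flowing from coffee at 353 K into a room at 293 K
print(total_entropy_change(1000.0, 353.0, 293.0))  # ~ +0.58 J/K
```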
34. LAW 2: Entropy
The nature of heat/energy flow/spread/dissipation
The ‘purists’ always ‘retreat’ to this founding conceptualisation/formulation, applicable to:
- Turbo charger and turbo booster in an internal combustion cycle
- Activation energy necessary to initiate a chemical reaction
- Photosynthesis as a form of energy transformation
- Energy conversion of aviation fuel in a jet engine
- Chlorophyll as a mechanism of energy storage
- Spread of heat from the sun across the earth
- Efficiency of a modern power station
- After-burner cycle of a jet turbine
These, and many more, are the ‘qualified/recognised’ boundaries of the science founded
on heat, energy, and reactions. Here, formulations are directly related to the original
roots of thermodynamics without abstraction, modification or mutation!
35. LAW 2: Entropy
Order/disorder is a popular visualisation
The ‘purists’ tend to wince at this visual perspective as, for many of them, it strays too far
from the thermodynamic origins of the property and its heat-based formulations:
- A measure/number of possible arrangements the atoms in a system can have
- Measure of how (dis)organised energy is in a system of atoms or molecules
- The level of (dis)organisation of characters on a page or bits in a message
- The level of social activity and/or physical movement in a crowd
- The effectiveness of passwords, coding and encryption
- The level of cohesion in a fighting force
The level of ‘bastardisation’ of the original thermodynamic concepts and formulations
increases as we come down this list - however, they, and many more, prove to be useful
across many fields - BUT we should note that, as useful as they may be, they are not the
same as the original starting point, but can be statistically justified
37. LAW 2: Entropy
Order/disorder information
Order vs Disorder:
“I need to talk with you soon” vs “. , : ; i if iff”
The concept of information
entropy was introduced by
Claude Shannon in his 1948
paper "A Mathematical Theory
of Communication"
Information
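Shannon's measure can be computed directly for character strings like those above; a minimal Python sketch giving entropy in bits per symbol (the example strings are illustrative):

```python
import math
from collections import Counter

# Shannon's information entropy H = sum(p_i * log2(1/p_i)), in bits per symbol.
# Repetitive (ordered) text scores low; varied (disordered) text scores higher.
def shannon_entropy_bits(text: str) -> float:
    counts = Counter(text)
    n = len(text)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(shannon_entropy_bits("aaaaaaaa"))  # 0.0 bits - perfect order
print(shannon_entropy_bits("abcdefgh"))  # 3.0 bits - 8 equiprobable symbols
```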
38. ENGINEERING MAXIMS : Does it work ?
“ Whilst it is permissible for the mathematicians, physicists, chemists and biologists
to declare that there is no solution to a problem - WE in engineering enjoy no such
luxury and WE always have to find an answer”
Order/disorder is more than a popular visualisation - it is a useful tool
The primary question for engineers is: Does it work ?
Refinement, efficiency, reliability, resilience, and functionality often have to follow!
“Engineers have to drink from the ‘well of human knowledge’ and experience; they
are obliged to utilise and/or bend any likely discovery/result to their advantage -
and do so without regret or limitation”
In this sense engineers and engineering ride the boundary between science and alchemy in their
search for practical solutions, often ahead of reliable results or any workable science.
39. One way processes
From order to disorder
Entropy always increases
This is the universe we live in - our experiences
Order disorder
LAW 2 : Time’s Arrow
40. LAW 2 : Time’s Arrow
Doesn’t do reverse gear
Order from disorder isn’t simple
Entropy never decreases
41. LAW 2 : Time’s Arrow
We have never witnessed any reverse-order processes - and whilst theoretically feasible,
their probability is so close to zero we can say they will never happen… and we have
no evidence of time travellers either !!
Order disorder
44. EXCEPTION ALERT : LIFE!!
Time’s arrow isn’t in reverse !
-ve Entropy partly defines life systems
Only possible (and true) in small pockets
Life is an insignificant element of a bigger system
Entropy change is locally negative as order emerges from disorder
Does not detract significantly from the universal trend toward total disorder
WE ALWAYS have to consider any system
in the context of the whole environment…
…the Entropy of a closed/isolated/
constrained system can see a +ve
or -ve change, but the whole only ever
sees a +ve change
45. ENTROPY : Basic System Form
Energy Source ⇒ Engine ⇒ Energy Sink
The engine transforms the form of energy
Work Output: movement, chemical reaction, temperature change
47. - No free energy
- No free material
- No free processes
- No perpetual motion machines
NOTHING IS FREE : Everything has a cost !!
Efficiency Always < 100%
Energy Out < Energy In
48. ENTROPY : Basic System Form
Energy Source ⇒ Engine ⇒ Energy Sink
The engine transforms the form of energy
Energy Supplied = E1 ∝ T1
Energy Dissipated = E2 ∝ T2
Work Output ⇒ Energy Output = E1 - E2 = ∆E ∝ T1 - T2
(movement, chemical reaction, temperature change)
NOTICE : The work/energy output is dictated by the temperature differential
and the efficiency of the machine. For a given efficiency the output = Z(T1 - T2)
NOTICE : Entropy defines the energy available to do useful work in a
thermodynamic process
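One concrete reading of "output = Z(T1 - T2)" is the Carnot bound, which is also the "only on a very cold day" limitation of the 30-second version: 100% efficiency would need a sink at absolute zero. A minimal sketch (temperatures in kelvin; the 800 K / 300 K pair is illustrative):

```python
# Upper bound on any heat engine working between a hot source and a cold sink:
# eta = 1 - T_sink / T_source; eta -> 1 only as T_sink -> 0 K.
def carnot_efficiency(t_source_k: float, t_sink_k: float) -> float:
    """Carnot limit on the fraction of supplied heat convertible to work."""
    return 1.0 - t_sink_k / t_source_k

# A plant with a source at ~800 K rejecting heat at ~300 K can never beat 62.5%
print(carnot_efficiency(800.0, 300.0))  # 0.625
```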
49. ENTROPY : Heat v Statistical View
Relative change in Entropy ≈ relative change in system energy at a given temperature:
∆S ≈ ∆E / T
Many formulation variants - some more convenient or easier to deal with than
others depending on educational background/mode of thinking, but the
outcome/form is always the same and so are all the conclusions - there are
no conflicts or exceptions…
Integrating over the entire space/system is then trivial:
∂S ≈ ∂E / T ⇒ S = ∫ ∂E / T, and in statistical form S = kß log W
50. GENERAL FORMULATION
Derived from the kinetic theory of gases
Historically; Thermodynamics and nearly
all the early thinking emerged from
considerations of a number of
industrial problems related to heat
generation, flow and exploitation in the
transformation into motion
Development; It was soon realised and
shown that all the laws could be derived
from the study of molecular movement at
an individual and fundamental level - and
therefore, be based on a statistical model
concerned with the probabilities
associated with movement and location
53. ENTROPY : Thermal Derivation ~1840
Relative change in Entropy: ∆S ≈ ∆E / T ≈ k∆T / T
Many formulation variants - some more convenient or easier to deal with than others
depending on educational background/mode of thinking, but the outcome and form is
always the same and so are all the conclusions - there are no conflicts or exceptions…
Integrating over the entire space/system is then trivial:
∫ ∂S ≈ ∫ ∂E / T ≈ k ∫ ∂T / T ⇒ S = kß log W
kß = Boltzmann’s constant = 1.38065 × 10⁻²³ Joules/Kelvin
This W nomenclature was ‘standardised’ from later probabilistic derivations:
Boltzmann’s original formulation did not use W = Wahrscheinlichkeit (German
for probability) - it was introduced by Max Planck in 1900
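The integration step can be verified numerically: summing k·∂T/T over small steps reproduces k·ln(T2/T1). A minimal sketch using the midpoint rule (the 200 K to 400 K range is illustrative):

```python
import math

K_B = 1.38065e-23  # Boltzmann's constant, J/K (value as on the slide)

def delta_s_numeric(t1: float, t2: float, steps: int = 100_000) -> float:
    """Midpoint-rule sum of dS = K_B * dT / T from t1 to t2 (kelvin)."""
    dt = (t2 - t1) / steps
    return sum(K_B * dt / (t1 + (i + 0.5) * dt) for i in range(steps))

analytic = K_B * math.log(400.0 / 200.0)  # K_B * ln(T2 / T1)
print(abs(delta_s_numeric(200.0, 400.0) - analytic) < 1e-28)  # True
```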
54. ENTROPY : Statistical Derivation ~1900
A measure of the number of possible micro-states of a system in thermodynamic
equilibrium, consistent with its macro-state
A full formulation is rendered impossibly large unless all the micro-states are statistically
independent and all the probabilities are the same for the whole macro-state
W = Wahrscheinlichkeit (probability) of a macrostate for some probability distribution of microstates - positions and
momenta of all molecules - the most general expression of the thermodynamic entropy
N = the total number of molecules/components
Ni = the number of molecules/components in each individual state
W = N! / ∏i Ni!
= total number of arrangements of the total population ÷ the number of
arrangements of the individual molecules within each state
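The multiplicity formula can be evaluated directly for a toy system; a minimal sketch counting arrangements of molecules between the two halves of a box (the group sizes are illustrative):

```python
from math import factorial, prod

# Multiplicity of a macrostate: W = N! / (N1! * N2! * ...), the number of
# distinct arrangements of N molecules split into groups of Ni identical ones.
def multiplicity(counts) -> int:
    n = sum(counts)
    return factorial(n) // prod(factorial(c) for c in counts)

# 4 molecules split 2-and-2 between the halves of a box: 6 arrangements
print(multiplicity([2, 2]))  # 6
# All 4 crowded into one half: a single arrangement - the most ordered macrostate
print(multiplicity([4, 0]))  # 1
```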
55. ENTROPY : Statistical Thermodynamics
In many practical cases a system’s thermodynamic micro-states are not equally probable: eg,
high energy states are less probable than low energy at a fixed temperature
And so the equal probabilities assumption does not always obtain, but a well established
generalisation is given by Gibbs:
This formulation is the most useful and most cited
in engineering and information science…and there
are many similar forms including Shannon’s Bound
S = -kß ∑i pi log pi
Iff all the probabilities are equal, then this reduces to:
S = kß log W
56. ENTROPY : Commonly cited forms
The springboard for
information theory
and info systems
understanding
57. EXCEPTION ALERT: New Dimensions
Our reality of 4 dimensions appears to be a fraction of an 11 dimensional universe.
But all of the above is based on millennia of evidential understanding of our 4D
‘reality’ and the Laws of Physics appear immutable. Though challenged and tested
continually they remain steadfast and the foundation of our understandings.
There is always room for new discoveries, but unless there is another reality of
different and/or more dimensions, the Laws of Thermodynamics remain our
most complete model, at the core of our base understanding of the universe in
which we live.
58. POSITIONING: Human knowledge
If this is mankind’s most prophetic equation: E = mc²
Then this is a very close second: S = kß log W
kß = 1.38065 × 10⁻²³ J/K
Profound consequences: Time travel is impossible and nothing lives or lasts forever
59. LAW 3: Entropy → 0 as T → 0 K
The only law founded on unique measurement trends
“The entropy of a perfect crystal at absolute
zero is exactly equal to zero”
In a sense this law is more hypothetical than any of the other
three as it cannot be directly demonstrated - ie we cannot
create perfect crystals or a temperature of absolute zero !
“Perfect order and thus zero entropy is only
possible at absolute zero”
https://arxiv.org/abs/1412.3828
60. TO BE AWARE: ENTHALPY
A measure of the total energy of a system:
the internal energy plus that required to create the system.
TO BE CLEAR
Entropy S (joules/kelvin) - a measure of how energy is distributed in a system
Enthalpy H (joules) - a system’s internal energy U plus pV
Enthalpy is more generally applicable in chemistry and chemical engineering
et al., and not in information systems and theory.
OTHER TERMS
Endothermic = absorbing energy
Exothermic = releasing energy
Adiabatic = no heat exchange (work may still be done)
These terms are commonly employed in many other fields but are not in general
use in information systems and theory.
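The enthalpy definition above can be sketched numerically. A minimal Python example; the function name and the U, p, V figures are illustrative assumptions, not values from the slides:

```python
def enthalpy(u_joules, p_pascals, v_cubic_metres):
    """H = U + pV: internal energy plus the pV work needed to make room
    for the system against its surroundings at pressure p."""
    return u_joules + p_pascals * v_cubic_metres

# Illustrative values: a gas at standard atmospheric pressure
U = 3718.0      # internal energy, J (assumed for the example)
p = 101325.0    # standard atmospheric pressure, Pa
V = 0.0248      # volume, m^3 (roughly one mole of ideal gas near 298 K)
H = enthalpy(U, p, V)
assert H > U    # the pV term is positive, so H exceeds U here
```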
61. IS HELL ENDOTHERMIC OR EXOTHERMIC ?
1) We postulate that if souls exist, then they must have some mass. If they do; a mole of souls can also have a mass
2) So, at what rate are souls moving into and exiting hell? I think we can safely assume that once in hell souls do not leave
3) Many/most religions state that if you are not a member, then you will go to hell. Since there are so many of these religions and people
do not belong to more than one religion, we can project that all people and souls go to hell
4) With birth and death rates as they are, we can expect the number of souls in hell to increase exponentially.
5) NOW; Boyle’s Law states that in order for the temperature and pressure in hell to stay the same, the ratio of the mass of souls and
volume needs to stay constant. Two options exist:
a)If hell is expanding at a slower rate than the rate at which souls enter hell, then the temperature and pressure in hell will increase until
all hell breaks loose
b)If hell is expanding at a rate faster than the increase of souls in hell, then the temperature and pressure will drop until hell freezes over
So which is it? If we accept the quote given to me by Theresa Manyan during Freshman year, "that it will be a cold night in hell before I
sleep with you", and take into account the fact that I still have NOT succeeded in having sexual relations with her, then option (b) cannot be
true... Thus, hell is exothermic.
A fun read from the internet often falsely attributed to Dr Schambaugh of the
Oklahoma School of Chemical Engineering
62. TO CONTEMPLATE - THE FRIVOLOUS ?
Analysing and making sense of the extreme including the non-sensical
1) How much energy flows down Harry Potter’s wand when he casts a spell?
2) AND where does that energy come from, and what kind of energy is it?
3) What are the limitations to building a matter transporter - aka Star Trek?
4) Is it possible for any life form to survive in its own waste materials?
5) Does a pregnant mother see an increase or decrease in her entropy?
6) When someone/something dies, does its entropy stop increasing?
7) What is the ultimate limit to our information storage capacity?
8) In the movie ‘The Martian’ would his survival strategy actually work?
9) Where are heaven and hell located and how much energy do they consume?
10)…..
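Question 7 at least has a known physical floor: Landauer’s principle puts the minimum energy cost of erasing one bit at kB·T·ln 2. A back-of-envelope sketch in Python (the temperature and storage figures are illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit_joules(temperature_kelvin):
    """Minimum energy dissipated to erase one bit: k_B * T * ln 2."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K) each erased bit costs at least ~2.87e-21 J.
e_bit = landauer_limit_joules(300.0)

# Erasing a terabyte (8e12 bits) therefore dissipates at least ~2.3e-8 J -
# real hardware sits many orders of magnitude above this floor.
e_terabyte = e_bit * 8e12
```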
63. TO CONTEMPLATE: SERIOUS PROBLEMS
Fashion, fad, political and scientific correctness are almost never in alignment
1) Do our waste recycling programs actually work ?
2) Should we be burning plastic waste instead of recycling?
3) Are electric vehicles really green ?
4) Do wind farms come at an ecological cost?
5) Can wave power save us?
6) Is tidal power a better option?
7) Can we actually live ‘off grid’ and benefit the planet?
8) Do solar cells create more pollutants than they save?
9) Can ‘natural farming’ feed the planet?
10)Could we actually freeze technologically driven change?
64. MORE EXPLANATIONS / VIEWS
Need to know even more?
https://en.wikipedia.org/wiki/History_of_entropy
http://entropysimple.oxy.edu/content.htm
https://www.khanacademy.org/science/biology/energy-and-enzymes/the-laws-of-thermodynamics/a/the-laws-of-thermodynamics
http://physicsforidiots.com/physics/thermodynamics/
Brian Cox explains why time travels in one direction - Wonders of the Universe - BBC Two
https://www.youtube.com/watch?v=uQSoaiubuA0
A derivation (and quantification) of the third law of thermodynamics
Masanes & Oppenheim (Quantum Physics 11 Dec 2014 (v1), revised 7 Apr 2016)
https://arxiv.org/abs/1412.3828
65. “It can be argued that civilisation and its technology enabler IS the sustainability problem”
This axiom has a brilliant/controversial thermodynamic proof by Tim Denton in a 2007/9 paper
suggesting that civilisation itself is a heat engine - producing 9.6 milliwatts of heat for every dollar of
GDP, normalised to 1990 value.
The insight is quite brilliant, and the implications terrifying.
FURTHER FOOD FOR THOUGHT !
Make it your practice to read wider than your lecture
notes and printed books. Search out the radical, the
deep thinkers, and those who posit the challenging!
“you don't solve problems from within the system that created those same problems”
Axiom - Einstein
66. Josiah Willard Gibbs (February 11, 1839 – April 28,
1903) was an American scientist who made important
theoretical contributions to physics, chemistry, and
mathematics. His work on the applications of
thermodynamics was instrumental in transforming
physical chemistry into a rigorous inductive science.
Gibbs’ formula combining enthalpy and entropy - the free
energy G = H − TS - defines the point of a chemical phase change/reaction
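The Gibbs formula referred to here is the free energy G = H − TS: a reaction or phase change sits at equilibrium when ΔG = ΔH − TΔS = 0 and proceeds spontaneously when ΔG < 0. A numerical sketch in Python, using approximate ice-melting values (ΔH ≈ 6010 J/mol, ΔS ≈ 22.0 J/(mol·K)) purely as an illustration:

```python
def delta_g(delta_h, temperature, delta_s):
    """Gibbs free energy change: dG = dH - T * dS."""
    return delta_h - temperature * delta_s

# Ice -> water, approximate values:
dH = 6010.0                 # enthalpy of fusion, J/mol
dS = 22.0                   # entropy of fusion, J/(mol K)

T_eq = dH / dS              # equilibrium temperature, ~273 K (the melting point)
assert abs(delta_g(dH, T_eq, dS)) < 1e-9   # dG = 0 at the phase change
assert delta_g(dH, 300.0, dS) < 0          # above T_eq, melting is spontaneous
assert delta_g(dH, 250.0, dS) > 0          # below T_eq, it is not
```

The sign of ΔG, not of ΔH alone, decides spontaneity: an endothermic process can still proceed if the TΔS term is large enough.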