Karl Popper, as a critical rationalist, opposed all forms of skepticism, conventionalism, and relativism in science. A central element of Popper's position is Hume's critique of induction: he argued that induction should never be used in science. Yet he rejected the skepticism associated with Hume, as well as Bacon's and Newton's appeal to pure "observation" as the starting point for forming theories, since there are no pure observations that do not presuppose some theory. Instead, Popper proposed falsifiability as a method of scientific investigation.
DOI: 10.13140/RG.2.2.11481.36967
Thomas Kuhn criticized falsifiability because it characterizes "the entire scientific enterprise in terms that apply only to its occasional revolutionary parts" and cannot be generalized. In Kuhn's view, a demarcation criterion must refer to the functioning of normal science. Kuhn's objection targets Popper's theory as a whole and excludes any possibility of a rational reconstruction of the development of science. Imre Lakatos argued that whether a theory is scientific or non-scientific can be determined independently of the facts. He proposed a modification of Popper's criterion, which he called "sophisticated (methodological) falsificationism".
DOI: 10.13140/RG.2.2.30572.82568
Feyerabend, Pluralism and Progress in Science in Against Method (1993) and The Tyranny of Science (2011) (ijtsrd)
The epistemological problem associated with Karl Paul Feyerabend as a philosopher of science resides in the fact that different critics of his works give diverse interpretations of them. His works and the accounts they present have no common structure. This plurality of conflicting interpretations makes it difficult, if not impossible, to pin him to a particular tradition in the philosophy of science. For this reason, while some of his critics consider him a relativist, to some he is a Dadaist, a confusionist, and an anarchist, and yet others think of Feyerabend as the worst enemy of science. This diversity of interpretation of Feyerabend, in my opinion, only goes to reassure us of our reading of him: that is, Feyerabend is more closely associated with pluralism than with anything else. My aim in this paper is thus to propose a thesis and attempt a justification. The thesis is that my reading of Against Method (1993) and The Tyranny of Science (2011) justifies the claim above. This perspective, unlike the others, is more holistic and inclusive. Without agreeing with his posits about science and its method, I contend that his pluralist claims in the philosophy of science are not hard to find. My examples stem, first, from the diversity of interpretations and the conflicting views of his critics. Second, I consider the titles of the two works under consideration, to illustrate his criticism of the scientism and methodism of modern science on the one hand, and his defence of a plurality of methods and theories on the other. Finally, I conclude that, contrary to critics who label him the worst enemy of science, an anarchist, or a confusionist, I think that Feyerabend exaggerated his criticism of modern science and his defence of pluralism when he claimed to see no difference between science, myths, and religion. However, I go further to contend that this comparison does not eclipse his pluralist position. It rather exaggerates it.
That is why I term him an extreme pluralist, to say the least. Nyuykongi John Paul, "Feyerabend, Pluralism and Progress in Science in Against Method (1993) and The Tyranny of Science (2011)", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-4, Issue-2, February 2020,
URL: https://www.ijtsrd.com/papers/ijtsrd30060.pdf
Paper Url : https://www.ijtsrd.com/humanities-and-the-arts/education/30060/feyerabend-pluralism-and-progress-in-science-in-against-method-1993-and-the-tyranny-of-science-2011/nyuykongi-john-paul
The contemporary philosophy of science (epistemology), featuring K. Popper, T. Kuhn, I. Lakatos, P. Feyerabend, and Hanson, among others, has mounted a decisive critique of the dominant positivist and neo-positivist model of knowledge and has in fact undermined its credibility.
Lecture 29 from a college-level introduction to psychology course taught in Fall 2011 by Brian J. Piper, Ph.D. (psy391@gmail.com) at Willamette University; includes parapsychology and Freudian psychology.
Evolutionary epistemology versus faith and justified true belief: Does scien... (William Hall)
This presentation explores the basis for scientific rationality by testing our claims about the world against nature as described by Karl Popper's evolutionary epistemology versus accepting claims based on justified true belief. The presentation is particularly concerned to show the philosophical problems with religious fundamentalism.
What is science? Science, pseudoscience, non-science (Dennis Miller)
Science plays a fundamental role in modern society. But what exactly is science? In philosophy this question is known as the demarcation problem (Popper, Kuhn, Laudan and others).
Internet Encyclopedia of Philosophy
Karl Popper: Philosophy of Science
Karl Popper (1902-1994) was one of the most influential philosophers of science of the 20th century. He made significant contributions to debates concerning general scientific methodology and theory choice, the demarcation of science from non-science, the nature of probability and quantum mechanics, and the methodology of the social sciences. His work is notable for its wide influence both within the philosophy of science, within science itself, and within a broader social context.
Popper’s early work attempts to solve the problem of demarcation and to offer a clear criterion that distinguishes scientific theories from metaphysical or mythological claims. Popper’s falsificationist methodology holds that scientific theories are characterized by entailing predictions that future observations might reveal to be false. When theories are falsified by such observations, scientists can respond by revising the theory, by rejecting the theory in favor of a rival, or by maintaining the theory as is and changing an auxiliary hypothesis. In any of these cases, however, the process must aim at the production of new, falsifiable predictions. While Popper recognizes that scientists can and do hold onto theories in the face of failed predictions when there are no predictively superior rivals to turn to, he holds that scientific practice is characterized by its continual effort to test theories against experience and to make revisions based on the outcomes of these tests. By contrast, theories that are permanently immunized from falsification by the introduction of untestable ad hoc hypotheses can no longer be classified as scientific. Among other things, Popper argues that his falsificationist proposal allows for a solution of the problem of induction, since inductive reasoning plays no role in his account of theory choice.
Along with his general proposals regarding falsification and scientific methodology, Popper is notable for his work on probability and quantum mechanics and on the methodology of the social sciences. Popper defends a propensity theory of probability, according to which probabilities are interpreted as objective, mind-independent properties of experimental setups. Popper then uses this theory to provide a realist interpretation of quantum mechanics, though its applicability goes beyond this specific case. With respect to the social sciences, Popper argued against the historicist attempt to formulate universal laws covering the whole of human history and instead argued in favor of methodological individualism and situational logic.
A scientific theory, according to Popper, can be legitimately saved from falsification by introducing an auxiliary hypothesis that generates new, falsifiable predictions. Also, if there are suspicions of bias or error, researchers might introduce an auxiliary falsifiable hypothesis that would allow testing. But this technique cannot solve the problem in general, because any auxiliary hypothesis can be challenged in the same way, ad infinitum. To halt this regress, Popper introduces the idea of a basic statement: an empirical statement that can be used both to determine whether a given theory is falsifiable and, if necessary, to corroborate falsifying hypotheses.
DOI: 10.13140/RG.2.2.22162.09923
Relativism
Epistemic Relativism
We have now presented a philosophical argument concerning the basis of accepted scientific truth.
Let's introduce another philosophical term important in that debate:
Epistemic Relativism: the position that knowledge is valid only relative to a specific context, society, culture, or individual.
In the following video, Duncan Pritchard, from the University of Edinburgh, introduces the concept of Epistemic Relativism. You will learn about the well-known Bellarmine–Galileo controversy over the validity of Ptolemy’s geocentric system versus Copernicus’s heliocentric system. This historical episode is well documented, and it has been the battleground of important discussions about what epistemologists call epistemic relativism, namely the view that norms of reasoning and justification for our knowledge claims seem to be relative.
From "The Little Thinker's Blog"
This historical example illustrates the epistemic relativist’s ‘no neutral ground’ argument, and the difficulty of identifying a common ground or a common measure to assess and evaluate knowledge claims in their historical and social context.
** Content from Online Course: Philosophy and the Sciences: Introduction to the Philosophy of Physical Sciences by The University of Edinburgh
https://youtu.be/MYnZgJeOqqg
Popper's Falsification
From inductivism to Popper’s falsification
From: Philosophy and the Science for Everyone by Michela Massimi. ISBN: 9781138785434
Karl Popper
Philosophers of science are interested in understanding the nature of scientific knowledge and its distinctive features. For a very long time, they strove to find what they thought might be the distinctive method of science, the method that would allow scientists to make informed decisions about what counts as a scientific theory.
The importance of demarcating good science from pseudo-science is neither otiose nor a mere philosophical exercise. It is at the very heart of social policy, when decisions are taken at the governmental level about how to spend taxpayers’ money.
Karl Popper (28 July 1902 – 17 September 1994) was, undoubtedly, one of the most influential philosophers of the early twentieth century to have contributed to the debate about demarcating good science from pseudo-science. In this section we very briefly review some of his seminal ideas.
Popper’s battleground was the social sciences. At the beginning of the twentieth century, in the German-speaking world, a lively debate took place between the so-called Naturwissenschaften (the natural sciences, including mathematics, physics, and chemistry) and the Geisteswissenschaften (the human sciences, including psychology and the emergent psychoanalysis), and whether the latter could rise to the status of proper sciences on a par with the natural sciences.
This is the historical context in which Popper began his philosophical reflections in the 1920s. His reflections were influenced by the Vienna Circle, a group of young int.
Science v Pseudoscience: What’s the Difference? - Kevin Korb (Adam Ford)
Science has a certain common core, especially a reliance on empirical methods of assessing hypotheses. Pseudosciences have little in common but their negation: they are not science.
They reject meaningful empirical assessment in some way or another. Popper proposed a clear demarcation criterion for Science v Rubbish: Falsifiability. However, his criterion has not stood the test of time. There are no definitive arguments against any pseudoscience, any more than against extreme skepticism in general, but there are clear indicators of phoniness.
Post: http://www.scifuture.org/science-vs-pseudoscience
International Journal of Humanities and Social Science Invention (IJHSSI) is an international journal intended for professionals and researchers in all fields of Humanities and Social Science. IJHSSI publishes research articles and reviews across the whole field of Humanities and Social Science, covering new teaching methods, assessment, validation, and the impact of new technologies, and it will continue to provide information on the latest trends and developments in this ever-expanding subject. Papers are selected through double peer review to ensure originality, relevance, and readability. The articles published in our journal can be accessed online.
There are different paths to reality, determined by the knower: instrumental, methodological, by object of study, by epistemological axis, among others. Reality presents several faces: what is observable, and what is perceived through sensory empirical data, correspond to the visible; the main task is to discover the hidden side, which lies behind the perceptible data. Epistemology covers the whole process of obtaining scientific knowledge, ranging from pre-knowledge to coming to know the hidden side: one thing is what is seen, another what is not seen, and what is not seen is what really is.
Knowledge vs Belief, Justification and Truth: What Scientific Knowledge Is a... (William Hall)
For better or worse, science works. It has given humans dominion over the earth.
It is now within humanity’s capacity to either destroy the ecosystems we depend on for our survival or make the world a better place to live.
What is it about the epistemological foundations of science that makes it so much more effective in helping us to understand the world than philosophical systems based on faith and belief?
This talk introduces the epistemological work of Sir Karl Popper, focusing on his later work on evolutionary epistemology. I argue that Popper's evolutionary approach to epistemology explains how science works, and why it is a better system for building reliable knowledge about the world than systems ultimately based in justification and belief as guides to "truth".
Caveats: I am not a student of philosophy and am not familiar with much of its literature. From my background as an evolutionary biologist, I have extended Popper’s ideas in a number of areas as I have explored the biological roles and nature of knowledge in living systems.
The presentation (if downloaded) includes links to my publications expanding on ideas presented here.
Risks and challenges in artificial intelligence: Black boxes and threat actors (Nicolae Sfetcu)
Artificial intelligence has created unprecedented opportunities, but also new risks. The exponential growth in the capabilities of artificial intelligence models makes it possible to reach levels of value and generalization never attained before. At the same time, the opacity of these models has also increased, and their black-box nature makes it difficult, even for experts, to explain the justification of their conclusions. This can represent a critical point both technologically and socially, because the risk is real, as demonstrated by recent episodes of training systems compromised by biases and discriminatory prejudices learned from the training data. It is therefore possible that learning from the digital traces of past decisions embeds existing invisible biases into the resulting models, perpetuating them.
IT & C, Volume 2, Number 1, March 2023, pp. 41-47
ISSN 2821-8469, ISSN-L 2821-8469, DOI: 10.58679/IT21269
URL: https://www.internetmobile.ro/riscuri-si-provocari-in-inteligenta-artificiala-cutii-negre-si-actorii-de-amenintare/
EU Clear English Tips for Translators - eBook (Nicolae Sfetcu)
Here are some tips to help translators avoid copying structure and wording from other languages that would be awkward in English.
They should be useful to non-native speakers, but may serve as handy reminders for native speakers too.
User-defined PHP functions in WordPress development (Nicolae Sfetcu)
PHP defines a wide range of functions as reusable blocks of statements in the core language, and many more are available in various extensions. These functions are well documented in the online PHP documentation. Custom functions can be defined by the developer. A function does not execute automatically when a page loads; it can be called from anywhere and at any time within the program. PHP supports type declarations on function parameters, which are enforced at run time.
IT & C, Volume 2, Number 1, March 2023, pp. 37-40
ISSN 2821-8469, ISSN-L 2821-8469, DOI: 10.58679/IT37237
URL: https://www.internetmobile.ro/functii-php-definite-de-utilizator-in-dezvoltarea-wordpress/
Common practices for the C programming language (Nicolae Sfetcu)
With widespread use, a number of common practices and conventions have evolved to help avoid errors in C programs. They are at once a demonstration of applying good software engineering principles within one language and an indication of C's limitations. Although few are used universally, and some are controversial, each of them enjoys wide use.
IT & C, Volume 2, Number 1, March 2023, pp. 30-36
ISSN 2821-8469, ISSN-L 2821-8469, DOI: 10.58679/IT80750
URL: https://www.internetmobile.ro/practici-comune-pentru-programarea-in-c/
IT & C, Volume 2, Number 3, September 2023 - Abstracts (Nicolae Sfetcu)
The IT & C journal is a quarterly publication in the fields of information technology and communications, and related areas of study and practice.
Contents:
EDITORIAL
Challenges in artificial intelligence
INFORMATION TECHNOLOGY
Blockchain Design and Modelling
TELECOMMUNICATIONS
The basic architecture of 5G networks
INTERNET
Big Data Ethics in Education and Research
SOFTWARE
Tableau Software: Data visualization and analysis
PROGRAMMING
Automatic summarization in artificial intelligence through unsupervised learning: TextRank
WEB DEVELOPMENT
PHP function arguments - Passing arguments by reference
CYBERSECURITY
Cryptocurrencies and cryptosecurities - Smart contracts
ISSN 2821-8469, ISSN-L 2821-8469, DOI: 10.58679/IT55267
EAN , Code IT23, ID 32330
IT & C (PDF, EPUB, MOBI for Kindle): https://www.internetmobile.ro/revista/revista-it-c-volumul-2-numarul-3-septembrie-2023/
Data visualization with Tableau Software applications (Nicolae Sfetcu)
Tableau is a data analysis and visualization tool that can connect to many data sources, creating interactive dashboards. Tableau uses application-integration innovations, such as JavaScript APIs and single sign-on, to consistently embed Tableau analytics into core business applications. Tableau queries relational databases, online analytical processing cubes, cloud databases, and spreadsheets to generate graph-type data visualizations. The software can also extract, store, and retrieve data from an in-memory data engine.
IT & C, Volume 2, Number 1, March 2023, pp. 23-29
ISSN 2821-8469, ISSN-L 2821-8469, DOI: 10.58679/IT10117
URL: https://www.internetmobile.ro/vizualizarea-datelor-cu-aplicatiile-tableau-software/
Hooke's claim on the law of gravity (Nicolae Sfetcu)
In a note entitled "A True State of the Case and Controversy between Sr Isaak Newton and Dr Robert Hooke as to the Priority of that Noble Hypothesis of the Motion of the Planets about the Sun as their Centers", unpublished during his lifetime, Hooke described his theory of gravity. To support his "priority", Hooke cited his lectures on planetary motions of 23 May 1666, "An Attempt to Prove the Motion of the Earth from Observations", published in 1674, and his correspondence with Isaac Newton in 1679.
DOI: 10.13140/RG.2.2.26375.24485
Big Data processing with advanced tools (Nicolae Sfetcu)
Data must be processed with advanced collection and analysis tools, based on predefined algorithms, in order to obtain relevant information. The algorithms must also take into account aspects invisible to direct perception. Big Data in government processes increases cost efficiency, productivity, and innovation. Civil registries are one source of Big Data. The processed data help in critical development areas such as health care, employment, economic productivity, crime, security, and the management of natural disasters and resources.
IT & C, Volume 2, Number 1, March 2023, pp. 18-22
ISSN 2821-8469, ISSN-L 2821-8469, DOI: 10.58679/IT91785
URL: https://www.internetmobile.ro/procesarea-big-data/
Corruption, Globalization and Neocolonialism (Nicolae Sfetcu)
An introduction to the interdependent concepts of corruption, globalization through international financial institutions, and neocolonialism understood as the exploitation of the resources and raw materials of poor and developing countries by some large multinational corporations.
Corruption is both a major cause and a result of poverty around the world. It appears at all levels of society, from local and national authorities to civil society, the judicial system, large and small businesses, military units, and so on. Systemic (or endemic) corruption is corruption due primarily to the weaknesses of an organization or process; it can be contrasted with individual corrupt officials or agents within the system. Factors that encourage systemic corruption include conflicting incentives, discretionary powers, monopoly powers, lack of transparency, low salaries, and a culture of impunity. Specific acts of corruption include bribery, extortion, and embezzlement, in a system in which corruption becomes the rule rather than the exception.
Neocolonialism is the practice of using capitalism, globalization, and cultural forces to control a country in place of direct military or political control. Such control can be economic, cultural, or linguistic. Corporations belonging to the imposed culture can then penetrate the markets of these countries much more easily. Thus, neocolonialism is the end result of business or geopolitical interests achieved by deforming the culture of the colonized countries.
Following an ideology known as neoliberalism, spread by similar financial institutions and known as the "Washington Consensus", structural adjustment policies were imposed to ensure debt repayment and economic restructuring. In reality, poor countries were asked to reduce their spending on health, education, and development, making a priority of debt repayment and of other economic policies favorable to developed countries. In practice, the IMF and the World Bank asked poor countries to lower their populations' standard of living.
Corruption, corporate state crimes, and organized crime are in any case considered both international crimes and state crimes at the national level. In most cases, state crime is considered applicable when the state engages directly in excessive secrecy and the cover-up of illegal activities, disinformation, and superficial or even incorrect financial records (which encourage tax evasion by some government officials), often reflecting the interests of only certain social classes and groups, and thereby violating human rights.
Corruption requires at least one person who corrupts, one who is corrupted, and a large mass of inert victims. The blame belongs to everyone!
Performanța și standardele rețelelor de telecomunicații 5GNicolae Sfetcu
Pentru măsurarea precisă a performanței 5G se utilizează simulatoare și teste specifice. Inițial, termenul a fost asociat cu standardul IMT-2020 al Uniunii Internaționale de Telecomunicații, care necesita o viteză maximă teoretică de descărcare de 20 gigabiți pe secundă și 10 gigabiți pe secundă viteza de încărcare, împreună cu alte cerințe. Apoi, grupul de standarde industriale 3GPP a ales standardul 5G NR (New Radio) împreună cu LTE ca propunere pentru transmitere la standardul IMT-2020.
IT & C, Volumul 2, Numărul 1, Martie 2023, pp. 13-17
ISSN 2821 – 8469, ISSN – L 2821 – 8469, DOI: 10.58679/IT52354
URL: https://www.internetmobile.ro/performanta-si-standardele-retelelor-de-telecomunicatii-5g/
Intelligence Info, Volumul 2, Numărul 3, Septembrie 2023Nicolae Sfetcu
Revista Intelligence Info este o publicație trimestrială din domeniile intelligence, geopolitică și securitate, și domenii conexe de studiu și practică.
Cuprins:
EDITORIAL
Rolul serviciilor de informații în război, de Nicolae Sfetcu
INTELLIGENCE
Intelligence Analysis, de Nicolae Sfetcu
ISTORIA
Reformele serviciilor secrete de la Mihail Moruzov la Eugen Cristescu, între inovație și decadență, de Rodica Liseanu
Take Ionescu – O biografie vectorială în istoria partidelor politice și în semantica diplomației, de Rodica Liseanu
GEOPOLITICA
Apusul universalismului european, de Lisa-Maria Achimescu
Chinese Hegemony in the Production of Rare Earths, de Emilian M. Dobrescu
România. Între ‘gura de rai’ geografică şi răspântia geopolitică, de Radu Carp
SECURITATE
Schimbarea paradigmelor în mediul internațional de securitate, de Alexandru Cristian
Adevăr și dezinformare în fenomenul OZN, de Dan D. Farcaș
ȘTIINȚA INFORMAȚIEI
Utilizarea analiticii rețelelor sociale în intelligence, de Nicolae Sfetcu
ISSN 2821-8159 ISSN-L 2821-8159, DOI: 10.58679/II30199
EAN , Cod II23 , ID 22330
Intelligence Info (PDF, EPUB, MOBI pentru Kindle) https://www.intelligenceinfo.org/revista/revista-intelligence-info-volumul-2-numarul-3-septembrie-2023/
Ontologii narative în tehnologia blockchainNicolae Sfetcu
Paul Ricoeur a examinat o serie de forme diferite de discurs extins, începând cu discursul metaforic. Discursul narativ este una din formele investigate, configurând concepte eterogene care identifică acțiunile într-un moment în care un lucru se întâmplă nu numai după altceva, ci și din cauza altui lucru dintr-o poveste sau istorie care poate fi urmată. Reformează evenimentele fizice ca evenimente narative, care au sens deoarece spun ceea ce se întâmplă într-o poveste sau într-o istorie. Narațiunile sunt întotdeauna o sinteză a conceptelor eterogene care configurează episoadele povestirii.
IT & C, Volumul 2, Numărul 1, Martie 2023, pp. 7-12
ISSN 2821 – 8469, ISSN – L 2821 – 8469, DOI: 10.58679/IT70323
URL: https://www.internetmobile.ro/ontologii-narative-in-tehnologia-blockchain/
Philosophy of Blockchain Technology - OntologiesNicolae Sfetcu
About the necessity and usefulness of developing a philosophy specific to the blockchain technology, emphasizing on the ontological aspects. After an Introduction that highlights the main philosophical directions for this emerging technology, in Blockchain Technology I explain the way the blockchain works, discussing ontological development directions of this technology in Designing and Modeling. The next section is dedicated to the main application of blockchain technology, Bitcoin, with the social implications of this cryptocurrency. There follows a section of Philosophy in which I identify the blockchain technology with the concept of heterotopia developed by Michel Foucault and I interpret it in the light of the notational technology developed by Nelson Goodman as a notational system. In the Ontology section, I present two developmental paths that I consider important: Narrative Ontology, based on the idea of order and structure of history transmitted through Paul Ricoeur's narrative history, and the Enterprise Ontology system based on concepts and models of an enterprise, specific to the semantic web, and which I consider to be the most well developed and which will probably become the formal ontological system, at least in terms of the economic and legal aspects of blockchain technology. In Conclusions I am talking about the future directions of developing the blockchain technology philosophy in general as an explanatory and robust theory from a phenomenologically consistent point of view, which allows testability and ontologies in particular, arguing for the need of a global adoption of an ontological system for develop cross-cutting solutions and to make this technology profitable.
Inteligența artificială, o provocare esențialăNicolae Sfetcu
Inteligența artificială a progresat până la punctul în care este o componentă esențială în aproape toate sectoarele economiei moderne actuale, cu un impact semnificativ asupra vieții noastre private, sociale și politice. Ea a fost întemeiată pe presupunerea că inteligența umană poate fi descrisă atât de precis încât să poată fi făcută o mașină să o simuleze. Acest lucru ridică argumente filozofice despre minte și etica creării de ființe artificiale înzestrate cu inteligență asemănătoare omului. Inteligența artificială sunt o sursă a unui set complet nou de probleme de explicabilitate, responsabilitate și încredere.
IT & C, Volumul 2, Numărul 1, Martie 2023, pp. 3-6
ISSN 2821 – 8469, ISSN – L 2821 – 8469, DOI: 10.58679/IT30677
URL: https://www.internetmobile.ro/inteligenta-artificiala-o-provocare-esentiala/
Ghidul de faţă se bazează în general pe ghidul în limba engleză „How to write clearly”, aducând o serie de recomandări specifi ce redactării textelor în limba română.
https://www.telework.ro/ro/e-books/ghid-ue-pentru-traduceri/
Activitatea de intelligence – Ciclul intelligenceNicolae Sfetcu
David Singer afirmă că, în prezent, amenințarea constituie principalul obiectiv al agențiilor de informații. Activitatea de informații poate fi considerată ca fiind procesul prin care anumite tipuri de informații sunt solicitate, colectate, analizate și diseminate, și modul în care sunt concepute și desfășurate anumite tipuri de acțiuni secrete. Ciclul intelligence reprezintă un set de procese utilizate pentru a furniza informații utile în luarea deciziilor. Ciclul constă din mai multe procese. Domeniul conex al contrainformațiilor este însărcinat cu împiedicarea eforturilor informative ale altora.
INTELLIGENCE INFO, Volumul 2, Numărul 1, Martie 2023, pp. 34-40
ISSN 2821 – 8159, ISSN – L 2821 – 8159, DOI: 10.58679/II18551
URL: https://www.intelligenceinfo.org/activitatea-de-intelligence-ciclul-intelligence/
Cunoașterea Științifică, Volumul 2, Numărul 3, Septembrie 2023Nicolae Sfetcu
Revista Cunoașterea Științifică este o publicație trimestrială din domeniile științei și filosofiei, și domenii conexe de studiu și practică.
Cuprins:
EDITORIAL
Știința schimbărilor climatice, de Nicolae Sfetcu
ȘTIINȚE NATURALE
Interactions between the brain, the biofields, and the physical, de Adrian Klein și Robert Neil Boyd
The Psycho-Neural Connectivity, de Adrian Klein
ȘTIINȚE SOCIALE
School dropout rate in Romania, de Alexandra Mocanu
Ipoteza hipercivilizațiilor, de Dan D. Farcas
The Importance of the Rare Earths for the World Economy, de Emilian M. Dobrescu
Puteri emergente şi noile paradigme în mediul internațional de securitate, de Alexandru Cristian
Models of Emotional Intelligence in Research and Education, de Nicolae Sfetcu
Cultura anime în România, de Alexandra Mocanu
ȘTIINȚE FORMALE
Etica în inteligența artificială: provocări și perspective, de Sebastian Bidașcă
FILOSOFIE
Portretul biblic al unicornilor și paranoia fără fundamente privind seria „My Little Pony”, de Valentina-Andrada Minea
Materialism şi realitatea esteticii: Argumente şi contraargumente la scrisorile lui Friedrich Schiller privind educaţia estetică a omului (1/3), de Petru Ababii
Unele aspecte ale filosofiei lui Albert Einstein și Henri Bergson, timpul şi paradoxul lui Zenon în viziune critică nonsofisticată, de Petru Ababii
RECENZII CĂRȚI
Humanism, Becoming and the Demiurge in The Adventures of Pinocchio, de Nicolae Sfetcu
ISSN 2821-8086 ISSN-L 2821-8086, DOI: 10.58679/CS90773
EAN 725657205492, Cod CS23P , ID 12330
Cunoașterea Științifică (PDF, EPUB, MOBI pentru Kindle) https://www.cunoasterea.ro/revista/revista-cunoasterea-stiintifica-volumul-2-numarul-3-septembrie-2023/
Manual pentru începători pentru întreţinerea şi depanarea calculatoarelor, cu o introducere în noţiuni de calculatoare, hardware, software (inclusiv sisteme de operare) şi securitatea pe Internet.
Un calculator de uz general are patru componente principale: unitatea logică aritmetică (ALU), unitatea de control, memoria, şi dispozitive de intrare şi ieşire (denumite colectiv I/O). Aceste piese sunt interconectate prin bus-uri, de multe ori făcut din grupuri de fire.
Caracteristica definitorie a computerelor moderne, care le distinge de toate celelalte maşini, este că acestea pot fi programate. Asta presupune că un anumit tip de instrucţiuni (program) poate fi implementat în calculator, care le va procesa. Calculatoare moderne, bazate pe arhitectura von Neumann, au adesea codul maşină în forma unui limbaj de programare imperativ.
Drobeta Turnu Severin Heavy Water Plant: ConstructionNicolae Sfetcu
The heavy water plant was established under the name of Combinatul Chimic Drobeta, by Decree 400/16.11.1979, under the Inorganic Products Industrial Center (CIPA) Râmnicu Vâlcea. The thermo-electric plant for supplying the heavy water factory with steam was decided to be located in Halânga village, three kilometers from the factory. The process water required for the factory was brought from the Danube, and the hydrogen sulphide used in the process was produced in the plant, through a specific technology, and then compressed, liquefied and stored in special tanks. The works on the heavy water factory at Drobeta Turnu Severin started in 1979, based on a derogatory HCM. The equipment for the heavy water plant was purchased through the Industrial Center for Chemical and Refinery Equipment (CIUTCR). All equipment and facilities that transported hydrogen sulphide had to meet strict quality assurance conditions.
CUNOAȘTEREA ȘTIINȚIFICĂ, Volumul 2, Numărul 1, Martie 2023, pp. 39-44
ISSN 2821 – 8086, ISSN – L 2821 – 8086, DOI: 10.58679/CS96723
URL: https://www.cunoasterea.ro/drobeta-turnu-severin-heavy-water-plant-construction/
Din punct de vedere metodologic, atât Newton cât și Einstein, și ulterior Dirac, au susținut fără rezerve principiul simplității matematice în descoperirea noilor legi fizice ale naturii. Lor li s-au alăturat și Poincaré și Weyl. Eduard Prugovecki afirmă că gravitația cuantică a impus luarea în considerare a unor întrebări epistemologice fundamentale, care pot fi identificate în filosofie cu problema minții-corp și cu problema liberului arbitru. Aceste întrebări au influențat epistemologia mecanicii cuantice sub forma “paralelismului psiho-fizic” al lui von Neumann și analiza ulterioară a tezei de către Wigner că “colapsul pachetului de unde” are loc în mintea “observatorului”.
CUNOAȘTEREA ȘTIINȚIFICĂ, Volumul 2, Numărul 1, Martie 2023, pp. 20-38
ISSN 2821 – 8086, ISSN – L 2821 – 8086, DOI: 10.58679/CS96800
URL: https://www.cunoasterea.ro/epistemologia-gravitatiei-cuantice/
This presentation explores a brief idea about the structural and functional attributes of nucleotides, the structure and function of genetic materials along with the impact of UV rays and pH upon them.
Observation of Io’s Resurfacing via Plume Deposition Using Ground-based Adapt...Sérgio Sacani
Since volcanic activity was first discovered on Io from Voyager images in 1979, changes
on Io’s surface have been monitored from both spacecraft and ground-based telescopes.
Here, we present the highest spatial resolution images of Io ever obtained from a groundbased telescope. These images, acquired by the SHARK-VIS instrument on the Large
Binocular Telescope, show evidence of a major resurfacing event on Io’s trailing hemisphere. When compared to the most recent spacecraft images, the SHARK-VIS images
show that a plume deposit from a powerful eruption at Pillan Patera has covered part
of the long-lived Pele plume deposit. Although this type of resurfacing event may be common on Io, few have been detected due to the rarity of spacecraft visits and the previously low spatial resolution available from Earth-based telescopes. The SHARK-VIS instrument ushers in a new era of high resolution imaging of Io’s surface using adaptive
optics at visible wavelengths.
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...Ana Luísa Pinho
Functional Magnetic Resonance Imaging (fMRI) provides means to characterize brain activations in response to behavior. However, cognitive neuroscience has been limited to group-level effects referring to the performance of specific tasks. To obtain the functional profile of elementary cognitive mechanisms, the combination of brain responses to many tasks is required. Yet, to date, both structural atlases and parcellation-based activations do not fully account for cognitive function and still present several limitations. Further, they do not adapt overall to individual characteristics. In this talk, I will give an account of deep-behavioral phenotyping strategies, namely data-driven methods in large task-fMRI datasets, to optimize functional brain-data collection and improve inference of effects-of-interest related to mental processes. Key to this approach is the employment of fast multi-functional paradigms rich on features that can be well parametrized and, consequently, facilitate the creation of psycho-physiological constructs to be modelled with imaging data. Particular emphasis will be given to music stimuli when studying high-order cognitive mechanisms, due to their ecological nature and quality to enable complex behavior compounded by discrete entities. I will also discuss how deep-behavioral phenotyping and individualized models applied to neuroimaging data can better account for the subject-specific organization of domain-general cognitive systems in the human brain. Finally, the accumulation of functional brain signatures brings the possibility to clarify relationships among tasks and create a univocal link between brain systems and mental functions through: (1) the development of ontologies proposing an organization of cognitive processes; and (2) brain-network taxonomies describing functional specialization. 
To this end, tools to improve commensurability in cognitive science are necessary, such as public repositories, ontology-based platforms and automated meta-analysis tools. I will thus discuss some brain-atlasing resources currently under development, and their applicability in cognitive as well as clinical neuroscience.
(May 29th, 2024) Advancements in Intravital Microscopy- Insights for Preclini...Scintica Instrumentation
Intravital microscopy (IVM) is a powerful tool utilized to study cellular behavior over time and space in vivo. Much of our understanding of cell biology has been accomplished using various in vitro and ex vivo methods; however, these studies do not necessarily reflect the natural dynamics of biological processes. Unlike traditional cell culture or fixed tissue imaging, IVM allows for the ultra-fast high-resolution imaging of cellular processes over time and space and were studied in its natural environment. Real-time visualization of biological processes in the context of an intact organism helps maintain physiological relevance and provide insights into the progression of disease, response to treatments or developmental processes.
In this webinar we give an overview of advanced applications of the IVM system in preclinical research. IVIM technology is a provider of all-in-one intravital microscopy systems and solutions optimized for in vivo imaging of live animal models at sub-micron resolution. The system’s unique features and user-friendly software enables researchers to probe fast dynamic biological processes such as immune cell tracking, cell-cell interaction as well as vascularization and tumor metastasis with exceptional detail. This webinar will also give an overview of IVM being utilized in drug development, offering a view into the intricate interaction between drugs/nanoparticles and tissues in vivo and allows for the evaluation of therapeutic intervention in a variety of tissues and organs. This interdisciplinary collaboration continues to drive the advancements of novel therapeutic strategies.
Richard's aventures in two entangled wonderlandsRichard Gill
Since the loophole-free Bell experiments of 2020 and the Nobel prizes in physics of 2022, critics of Bell's work have retreated to the fortress of super-determinism. Now, super-determinism is a derogatory word - it just means "determinism". Palmer, Hance and Hossenfelder argue that quantum mechanics and determinism are not incompatible, using a sophisticated mathematical construction based on a subtle thinning of allowed states and measurements in quantum mechanics, such that what is left appears to make Bell's argument fail, without altering the empirical predictions of quantum mechanics. I think however that it is a smoke screen, and the slogan "lost in math" comes to my mind. I will discuss some other recent disproofs of Bell's theorem using the language of causality based on causal graphs. Causal thinking is also central to law and justice. I will mention surprising connections to my work on serial killer nurse cases, in particular the Dutch case of Lucia de Berk and the current UK case of Lucy Letby.
Earliest Galaxies in the JADES Origins Field: Luminosity Function and Cosmic ...Sérgio Sacani
We characterize the earliest galaxy population in the JADES Origins Field (JOF), the deepest
imaging field observed with JWST. We make use of the ancillary Hubble optical images (5 filters
spanning 0.4−0.9µm) and novel JWST images with 14 filters spanning 0.8−5µm, including 7 mediumband filters, and reaching total exposure times of up to 46 hours per filter. We combine all our data
at > 2.3µm to construct an ultradeep image, reaching as deep as ≈ 31.4 AB mag in the stack and
30.3-31.0 AB mag (5σ, r = 0.1” circular aperture) in individual filters. We measure photometric
redshifts and use robust selection criteria to identify a sample of eight galaxy candidates at redshifts
z = 11.5 − 15. These objects show compact half-light radii of R1/2 ∼ 50 − 200pc, stellar masses of
M⋆ ∼ 107−108M⊙, and star-formation rates of SFR ∼ 0.1−1 M⊙ yr−1
. Our search finds no candidates
at 15 < z < 20, placing upper limits at these redshifts. We develop a forward modeling approach to
infer the properties of the evolving luminosity function without binning in redshift or luminosity that
marginalizes over the photometric redshift uncertainty of our candidate galaxies and incorporates the
impact of non-detections. We find a z = 12 luminosity function in good agreement with prior results,
and that the luminosity function normalization and UV luminosity density decline by a factor of ∼ 2.5
from z = 12 to z = 14. We discuss the possible implications of our results in the context of theoretical
models for evolution of the dark matter halo mass function.
Seminar of U.V. Spectroscopy by SAMIR PANDASAMIR PANDA
Spectroscopy is a branch of science dealing the study of interaction of electromagnetic radiation with matter.
Ultraviolet-visible spectroscopy refers to absorption spectroscopy or reflect spectroscopy in the UV-VIS spectral region.
Ultraviolet-visible spectroscopy is an analytical method that can measure the amount of light received by the analyte.
The ability to recreate computational results with minimal effort and actionable metrics provides a solid foundation for scientific research and software development. When people can replicate an analysis at the touch of a button using open-source software, open data, and methods to assess and compare proposals, it significantly eases verification of results, engagement with a diverse range of contributors, and progress. However, we have yet to fully achieve this; there are still many sociotechnical frictions.
Inspired by David Donoho's vision, this talk aims to revisit the three crucial pillars of frictionless reproducibility (data sharing, code sharing, and competitive challenges) with the perspective of deep software variability.
Our observation is that multiple layers — hardware, operating systems, third-party libraries, software versions, input data, compile-time options, and parameters — are subject to variability that exacerbates frictions but is also essential for achieving robust, generalizable results and fostering innovation. I will first review the literature, providing evidence of how the complex variability interactions across these layers affect qualitative and quantitative software properties, thereby complicating the reproduction and replication of scientific studies in various fields.
I will then present some software engineering and AI techniques that can support the strategic exploration of variability spaces. These include the use of abstractions and models (e.g., feature models), sampling strategies (e.g., uniform, random), cost-effective measurements (e.g., incremental build of software configurations), and dimensionality reduction methods (e.g., transfer learning, feature selection, software debloating).
I will finally argue that deep variability is both the problem and solution of frictionless reproducibility, calling the software science community to develop new methods and tools to manage variability and foster reproducibility in software systems.
Exposé invité Journées Nationales du GDR GPL 2024
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...University of Maribor
Slides from:
11th International Conference on Electrical, Electronics and Computer Engineering (IcETRAN), Niš, 3-6 June 2024
Track: Artificial Intelligence
https://www.etran.rs/2024/en/home-english/
Toxic effects of heavy metals : Lead and Arsenicsanjana502982
Heavy metals are naturally occuring metallic chemical elements that have relatively high density, and are toxic at even low concentrations. All toxic metals are termed as heavy metals irrespective of their atomic mass and density, eg. arsenic, lead, mercury, cadmium, thallium, chromium, etc.
Mammalian Pineal Body Structure and Also Functions
Karl Popper’s demarcation problem
Nicolae Sfetcu
24.01.2019
Sfetcu, Nicolae, "Karl Popper's demarcation problem", SetThings (January 24, 2019), MultiMedia Publishing (ed.), DOI: 10.13140/RG.2.2.11481.36967, URL = https://www.telework.ro/en/karl-poppers-demarcation-problem/
Email: nicolae@sfetcu.com
This book is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International license. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/4.0/.
Extract from:
Sfetcu, Nicolae, "The distinction between falsification and refutation in the demarcation problem of Karl Popper", SetThings (2019), MultiMedia Publishing (ed.), DOI: 10.13140/RG.2.2.22522.54725, ISBN 978-606-033-207-7, URL = https://www.telework.ro/ro/e-books/the-distinction-between-falsification-and-refutation-in-the-demarcation-problem-of-karl-popper/
Karl Popper, as a critical rationalist, was an opponent of all forms of skepticism, conventionalism and relativism in science. In 1935 he wrote Logik der Forschung. Zur Erkenntnistheorie der modernen Naturwissenschaft, later translating the book into English and publishing it under the title The Logic of Scientific Discovery (Karl Raimund Popper 2002b), considered a pioneering work in its field. Many of the arguments in this book are directed against the members of the "Vienna Circle", such as Moritz Schlick, Otto Neurath, Rudolf Carnap, Hans Reichenbach, Carl Hempel and Herbert Feigl. Popper agrees with them on the general aspects of scientific methodology and on their mistrust of traditional philosophical methodology, but his solutions were significantly different. Popper contributed significantly to the debates on general scientific methodology, the demarcation of science from pseudoscience, the nature of probability, and the methodology of the social sciences.
Popper was deeply impressed by the differences between the supposedly "scientific" theories of Freud and Adler and the revolution triggered by Einstein's theory of relativity in physics during the first two decades of the 20th century. While Einstein's theory was extremely "risky", in the sense that it was possible to deduce consequences from it which, had they proved false, would have falsified the whole theory, nothing could, in principle, falsify the psychoanalytic theories, which are not predictive. (Thornton 2017)
Popper was criticized for his prescriptive approach to science and his focus on the logic of falsifiability. His theory was opposed by Thomas Kuhn's socio-historical approach developed in The Structure of Scientific Revolutions, (T. S. Kuhn 1996) which reintroduced the idea that scientific change is essentially dialectical and depends on the establishment of a consensus within communities of researchers.
There have been attempts to demarcate science from non-science since antiquity: "To be scientific," Aristotle said, "one must deal with causes, one must use logical demonstration, and one must identify the universals which 'inhere' in the particulars of sense." (Laudan 1983)
The demarcation of science from pseudoscience has both theoretical reasons (the problem of demarcation is an illuminating perspective that contributes to the philosophy of science in the same way that error analysis contributes to the study of informal logic and rational reasoning) and practical reasons (the demarcation is important for decision-making in both private and public life). (Mahner 2007)
Logical positivism, through the theory of the verifiability of meaning (verificationism), considered that only statements about factual matters or about logical relationships between concepts are meaningful. (Grayling 2001) But "the verificationist proposals had the aim of solving a distinctly different demarcation problem, namely that between science and metaphysics." (Hansson 2017)
According to Popper, the central issue of the philosophy of science is demarcation, the distinction between science and what he calls "non-science" (including logic, metaphysics, psychoanalysis, etc.).
"Any demarcation in my sense must be rough. (This is one of the great differences from
any formal meaning criterion of any artificial 'language of science'.) For the transition
between metaphysics and science is not a sharp one: what was a metaphysical idea
yesterday can become a testable scientific theory tomorrow; and this happens frequently."
(Miller 1985)
"There will be well-testable theories, hardly testable theories, and non-testable theories.
Those which are non-testable are of no interest to empirical scientists. They may be
described as metaphysical. Here I must again stress a point which has often been
misunderstood. Perhaps I can avoid these misunderstandings if I put my point now in this
way. Take a square to represent the class of all statements of a language in which we intend
to formulate a science; draw a broad horizontal line, dividing it into an upper and lower
half; write 'science' and 'testable' into the upper half, and 'metaphysics' and 'non-testable'
into the lower: then, I hope, you will realize that I do not propose to draw the line of
demarcation in such a way that it coincides with the limits of a language, leaving science
inside, and banning metaphysics by excluding it from the class of meaningful statements."
(Karl Raimund Popper 2002a)
A major argument of Popper's is Hume's critique of induction, (Hume 1738) arguing that induction should never be used in science. But he disagrees with the skepticism associated with Hume, and also with the support of Bacon and Newton for pure "observation" as a starting point in the formation of theories, since there are no pure observations that do not imply certain theories. Popper argues that there is no unique methodology for science. It is necessary to solve the problem of the demarcation of metaphysics from science, but we should recognize that many metaphysical systems have led to important scientific results. He recalls the system of Democritus, and that of Schopenhauer, which is very similar to that of Freud. And some, for example those of Plato or Malebranche or Schopenhauer, are wonderful constructions of thought. But at the same time, we should oppose those metaphysical systems that tend merely to charm and dazzle, and obviously we should do the same with non-metaphysical or anti-metaphysical systems if they display this dangerous tendency. And Popper thinks we cannot do this in one move. Rather, we must make the effort to analyze the systems in detail; we must show that we understand what the author wants to say, but that what he says does not deserve the effort of understanding. (Miller 1985)
Instead, Popper proposes falsifiability as a method of scientific investigation. For him, a theory is scientific only if it is falsifiable by a conceivable event. Popper's theory of demarcation is based on his perception of the logical asymmetry between verification and falsification: it is logically impossible to definitively verify a universal proposition by reference to experience (as Hume says), but a single counter-example definitively refutes the corresponding universal law. In a word, an exception, far from "proving" the rule, definitively refutes it. (Thornton 2017)
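The logical asymmetry at the heart of this argument can be put schematically; the following is a standard first-order rendering, not Popper's own notation. No finite conjunction of positive instances entails a universal law, while a single counter-instance refutes it:

```latex
% Verification fails: finitely many confirming instances do not
% entail the universal statement.
\[
P(a_1) \wedge P(a_2) \wedge \cdots \wedge P(a_n) \;\nvdash\; \forall x\, P(x)
\]
% Falsification succeeds: one counter-example suffices.
\[
\neg P(a) \;\vdash\; \neg\, \forall x\, P(x)
\]
```

This is why, for Popper, universal theories can be decisively refuted by experience but never decisively verified by it.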
Popper says that scientists are people of courageous ideas, though very critical of their own ideas; they try to find out whether their ideas are correct by trying to find out first whether they are wrong.
They operate with courageous conjectures and severe attempts to refute their own conjectures. The criterion of demarcation between science and non-science that he proposes is a simple logical analysis of this image. Whether it is good or bad will be shown by its fertility. Courageous ideas are new and bold hypotheses or conjectures, and severe refutation attempts are critical discussions and severe empirical tests. But when is a conjecture bold in the sense proposed here, and when not? It is bold if and only if it takes a great risk of being false - if things could be different, and seem at the moment to be different. (Miller 1985)
A true scientific theory is restrictive, and can therefore be tested and falsified, but never logically verified. Thus, if a theory has withstood a test, this does not mean it has been verified; it only has a greater degree of corroboration, and can be replaced at any time by a better theory.
Popper uses falsifiability as a demarcation criterion to evaluate theories. Popper's criterion does not exclude from the field of science statements that cannot be falsified, but only theories that contain no falsifiable statement; yet it is not clear what constitutes a "whole theory" and what makes a statement "significant".
Verificationism¹ was an essential feature of the logical positivism of the so-called Vienna Circle. Popper observed that the philosophers of the Vienna Circle had run together two distinct problems, significance and demarcation, and had proposed a single verificationist solution to both. Against this, Popper argued that there are significant non-scientific theories, so that a criterion of significance cannot coincide with a criterion of demarcation, and he proposed replacing verifiability with falsifiability as the demarcation criterion. At the same time, he firmly opposed the view that statements which are not falsifiable are therefore meaningless or wrong. (Karl Raimund Popper 2002b)
¹ Verificationism claims that a statement must, in principle, be empirically verifiable in order to be both meaningful and scientific.
Popper argues that the only logical technique that is an integral part of the scientific method is that of deductive testing: conclusions are deduced from a hypothesis and then compared with one another and with other relevant statements to determine whether they falsify or corroborate the hypothesis. Such conclusions are not compared directly with the facts, simply because there are no "pure" facts available; all observation-statements are theory-laden, and are as much a function of purely subjective factors (interests, expectations, desires, etc.) as they are of what is truly objective. (Thornton 2017)
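Schematically, the logical asymmetry on which deductive testing rests can be put as follows (a standard textbook reconstruction, not Popper's own notation):

```latex
% Falsification is a deductively valid inference (modus tollens):
% if theory T entails prediction O, then a failed prediction refutes T.
(T \rightarrow O),\; \neg O \;\vdash\; \neg T
% Verification is not valid: inferring T from a successful prediction
% commits the fallacy of affirming the consequent.
(T \rightarrow O),\; O \;\nvdash\; T
```

This asymmetry is why, on Popper's account, a theory that survives a test is corroborated but never verified.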
Popper specifies four steps for the deductive procedure²:
"I proposed (though years elapsed before I published this proposal) that the refutability or
falsifiability of a theoretical system should be taken as the criterion of its demarcation.
According to this view, which I still uphold, a system is to be considered as scientific only
if it makes assertions which may clash with observations; and a system is, in fact, tested
by attempts to produce such clashes, that is to say by attempts to refute it. Thus testability
is the same as refutability, and can therefore likewise be taken as a criterion of demarcation.
There are, moreover (as I found later), degrees of testability: some theories expose
themselves to possible refutations more boldly than others." (Karl Raimund Popper 2002a)
Popper believes that Hume's philosophy demonstrates an implicit contradiction in traditional empiricism, which holds both that all knowledge derives from experience and that universal statements (including scientific laws) are verifiable by reference to experience. The contradiction arises from the attempt to show that, despite the open-endedness of experience, scientific laws can be construed as empirical generalizations that some "positive" experience could finally confirm. Popper dissolves the contradiction by rejecting the first of these principles and by weakening the demand for empirical verification in the second into a demand for empirical falsifiability. He states that scientific theories are not inductively inferred from experience, nor are scientific
² The steps of the deductive procedure, according to Popper: (1) a test of the internal consistency of the theory, to detect possible contradictions; (2) an axiomatization of the theory, to distinguish between its empirical and its logical elements; (3) a comparison of the new theory with existing ones; (4) an empirical testing of the conclusions derived from the theory, to see whether it is corroborated (but not verified). (Karl Raimund Popper 2002b)
experiments conducted to verify or establish their truth; all knowledge is provisional, conjectural, hypothetical: we can never prove theories definitively, we can only provisionally corroborate or refute them. We must therefore choose among the theories that explain the phenomena under investigation by eliminating those that have been falsified, and rationally select, from the remaining unfalsified theories, the one with the greatest explanatory and predictive power. Popper emphasizes the importance of the critical spirit of science: critical thinking is the very essence of rationality. (Thornton 2017)
Various proposals have been made as to the unit to which a demarcation criterion should apply: a research program (Lakatos 1974); an epistemic field or cognitive discipline, understood as a community with common aims of knowledge and practice (Bunge 1982; Mahner 2007); a theory (K. Popper 1963); a practice (Lugg 1992; Morris 1987); a problem or line of scientific inquiry (Siitonen 1984); or a particular investigation (T. Kuhn 1970; Mayo 1996). The difficulty is to select the method of demarcation. Derksen (1993) places the emphasis instead on the pseudoscientist, the person who promotes pseudoscience, on the grounds that the pretension to scientific status attaches to a person, not to a theory.
Bibliography
Bunge, Mario. 1982. “Demarcating Science from Pseudoscience.” Fundamenta Scientiae 3.
Derksen, A. A. 1993. “The Seven Sins of Pseudo-Science.” Journal for General Philosophy of
Science / Zeitschrift Für Allgemeine Wissenschaftstheorie 24 (1): 17–42.
Grayling, A. C. 2001. Wittgenstein: A Very Short Introduction. OUP Oxford.
Hansson, Sven Ove. 2017. “Science and Pseudo-Science.” In The Stanford Encyclopedia of
Philosophy, edited by Edward N. Zalta, Summer 2017. Metaphysics Research Lab,
Stanford University. https://plato.stanford.edu/archives/sum2017/entries/pseudo-science/.
Hume, David. 1738. A Treatise of Human Nature. Oxford University Press.
Kuhn, Thomas. 1970. “Logic of Discovery or Psychology of Research?” In Criticism and the Growth of Knowledge, edited by Imre Lakatos and Alan Musgrave. Cambridge: Cambridge University Press.
Kuhn, Thomas S. 1996. The Structure of Scientific Revolutions. 3rd edition. Chicago, IL:
University of Chicago Press.
Lakatos, I. 1974. “Popper on Demarcation and Induction.” In The Philosophy of Karl Popper, edited by Paul Arthur Schilpp. The Library of Living Philosophers. La Salle, IL: Open Court.
Laudan, Larry. 1983. “The Demise of the Demarcation Problem.” In Physics, Philosophy and
Psychoanalysis, 111–27. Boston Studies in the Philosophy of Science. Springer,
Dordrecht. https://doi.org/10.1007/978-94-009-7055-7_6.
Lugg, Andrew. 1992. “Pseudoscience as Nonsense.” Methodology and Science 25.
Mahner, Martin. 2007. “Demarcating Science from Non-Science.” ResearchGate. 2007.
https://www.researchgate.net/publication/286895878_Demarcating_Science_from_Non-
Science.
Mayo, Deborah G. 1996. “Ducks, Rabbits, and Normal Science: Recasting the Kuhn’s-Eye View
of Popper’s Demarcation of Science.” The British Journal for the Philosophy of Science
47 (2): 271–90. http://www.jstor.org/stable/687948.
Miller, David, ed. 1985. Popper Selections. Princeton, NJ: Princeton University Press.
Morris, Robert L. 1987. “Parapsychology and the Demarcation Problem.” Inquiry 30 (3): 241–
51. https://doi.org/10.1080/00201748708602122.
Popper, Karl Raimund. 2002a. Conjectures and Refutations: The Growth of Scientific
Knowledge. Psychology Press.
———. 2002b. The Logic of Scientific Discovery. Psychology Press.
Siitonen, Arto. 1984. “Demarcation of Science From the Point of View of Problems and
Problem-Stating.” Philosophia Naturalis 21: 339–353.
Thornton, Stephen. 2017. “Karl Popper.” In The Stanford Encyclopedia of Philosophy, edited by
Edward N. Zalta, Summer 2017. Metaphysics Research Lab, Stanford University.
https://plato.stanford.edu/archives/sum2017/entries/popper/.