Science and Social Responsibility [John Crowley, UNESCO SHS, France]



Workshop on Higher Education and Professional Responsibility in CBRN Applied Sciences and Technology across the Sub-Mediterranean Region
3-4 April 2012. Palazzo Zorzi, Venice
Session 1. Status - Culture of Safety and Security and Responsible Science

DRAFT FOR PRESENTATION AT THE WORKSHOP ON “HIGHER EDUCATION AND PROFESSIONAL RESPONSIBILITY IN CBRN APPLIED SCIENCES AND TECHNOLOGY ACROSS THE SUB-MEDITERRANEAN REGION”, VENICE, 3-4 APRIL 2012

Science and Social Responsibility
John Crowley
UNESCO Division of Ethics, Science and Society [1]

No one, I suspect, would deny that scientists have responsibilities. Their primary responsibility is to science itself – to search impartially for truth, with all the epistemological and institutional implications that such a search entails. They also have civic responsibilities, as citizens and as members of society, that are inevitably coloured by their specific status as scientists – for instance because of the technical knowledge they may alone have access to.

It is less obvious that scientists might have additional responsibilities as scientists. Indeed, the claim is often denied. Some see the idea of responsible science, or of science ethics, as a restriction or an imposition on science – one that denies the fundamental character of science as free inquiry, which implies the entitlement of scientists to follow a line of research wherever it may lead.

The issue, therefore, is to give reasons for the very existence of social responsibility.

There is in fact a simple, straightforward and consensual basis in Article 27 of the Universal Declaration of Human Rights, which declares the human right “to share in scientific advancement and its benefits”. It follows logically from this human right, and from the need to take it seriously, that scientists, and all those engaged in shaping the institutions and practices of science, have a correlative duty to help create the conditions in which the right can be realized. However, this particular human right has not been extensively developed, either conceptually or legally, and its implications are perhaps not entirely clear.
Furthermore, it requires not simply assertion: if it is to be taken seriously as a basis for the definition of responsibilities, it also needs justification.

A full justification would be beyond the scope of this presentation. However, there are two basic ideas that enable the key point to be made fairly quickly.

The first is the universal nature of science, which has two implications, one “thin” and incontrovertible, the other “thick” and debatable, but certainly plausible.

What can hardly be denied is that science is fundamentally impersonal. While many discoveries, theorems, experiments, equations etc. bear the names of their discoverers, this is purely a matter of courtesy. They are in fact inherently anonymous, radically detachable from the conditions of their invention. Any piece of scientific knowledge – valid or invalid – has the same content for anyone who can understand it. It follows that potential universal access to the results of science is written into the logic of science itself. Failure to achieve actual universality of access must therefore be ascribable to identifiable barriers – inadequate training, wilful refusal, diffusion lags, and so on.

[1] The views expressed in this paper are those of the author and, except where specifically stated otherwise, should not be regarded as official statements of a UNESCO position on the topics addressed. Contact email: j.crowley@unesco.org.

What is more debatable is whether science actually operates in this way – or whether it should. The elementary “bits” of scientific knowledge may be radically universalizable, but science is not just a collection of “bits”. It is structured not just by institutions and practices, but also by theories and by worldviews that give it meaning as well as content. The connection between the understanding produced by science and human self-understanding,
as expressed in philosophy, literature, art and everyday life, is clearly much more complex than the implications of operational method or even epistemology. Arguments are indeed widespread in contemporary philosophy that science is not in fact universal in the “thick” sense, and at a more practical level it is clear that the institutional organization of science is not universalizable in the same sense as the elementary bits of knowledge are.

This point, it should be emphasized, does not qualify the human right “to share in scientific advancement and its benefits”. What it does call into question, however, is the extent to which the right follows directly from the internal structure of science. If the right were a purely external supplement, then it clearly would require an autonomous justification. Without going into detail for present purposes, the language of the Universal Declaration of Human Rights deserves emphasis in this regard. It does not refer only to the benefits of science, important though they are, but also to sharing in “scientific advancement” – in other words in science as a process. From the fact that this process is not inherently universal (unlike the elementary “bits” of scientific knowledge), what follows is a responsibility to create the conditions in which scientific advancement can actually be shared. Science, in other words, is bundled up with worldviews, institutions and social practices that are not strictly speaking “scientific” and that are acceptable only in so far as they are, in the relevant sense, “shared” – which means, minimally, elaborated through mechanisms that are recognized by all stakeholders to be equitable.
This is a huge challenge, of course, but it expresses rather than negates the essentially universal character of science.

Alongside universality, the second basic idea that can help to establish the validity of the human rights framework for science is integrity.

It is sometimes assumed in public debate that integrity has an optional character, in the sense that science can deliver results regardless of integrity – and perhaps deliver more effectively when it ignores ethical considerations. There are clearly cases in which this can be argued – for instance in certain areas of medical research, where strong bioethics frameworks have been established precisely because violating principles such as informed consent could, plausibly, enable results to be achieved faster. Generally speaking, however, indifference to integrity tends to be positively correlated with indifference to methodology. The most serious violations of scientific ethics in the 20th century – for instance, the Mengele experiments in Nazi Germany – typically involved not just gross mistreatment of research subjects but also routine fabrication and falsification of data, precisely because they were responding to politically biased and scientifically arbitrary agendas. And of course, the results were not peer-reviewed. There are reasonable grounds to believe, therefore, that unethical science tends, for institutional and social reasons, to be bad science. Conversely, integrity, which underwrites the availability to all of the results of science, albeit not effective sharing, provides an internal justification for the established human-rights-based responsibility framework.

Assuming this framework to be adequately justified, it is important to consider what it actually means.
While the right to share in scientific advancement and its benefits is inadequately developed in legal and conceptual terms, it is nonetheless possible to draw some reasonably firm conclusions from a series of authoritative instruments that have been adopted by UNESCO to address science ethics and the institutional structures of science. Reference is made in this regard, inter alia, to the 1974 Recommendation on the Status of Scientific Researchers, the 1999 Declaration on Science and the Use of Scientific Knowledge and the 2005 Universal Declaration on Bioethics and Human Rights. Drawing on these documents, three important dimensions of the social responsibilities of scientists can be drawn out, which build on but also expand the basic principles of universality and integrity: the duty to refrain from harm; the duty to aim at social benefits; and the duty to ensure access not just to results but also to equitable opportunities to participate in scientific processes.
These duties are very generic, of course. Each of them would need to be filled in before it can apply to specific cases. For instance, the kind of harm that can be inflicted by the research process or by technological applications varies considerably between disciplines. In biomedical research, the principle of informed consent serves to protect human subjects against inappropriate protocols. A different approach is required – even if it can be related by analogy to informed consent – to deal with, for instance, weapons research or environmental impact. Similarly, the direct social benefits are typically much easier to identify if one is working on, say, vaccine development or nanofiltration than if one’s research deals with prime numbers or ancient history. [2] Furthermore, it is a tricky business to give weight to potential social benefits in decisions about priority-setting and funding, since such benefits depend not just on scientific success at the research level, but also on a range of other factors such as effective development of applications and social take-up. Finally, the kind of “access” that might be relevant in human rights terms is likely to vary considerably from case to case. In some research areas, publication in reputable journals may suffice to ensure that those who could benefit from the results have access to them. In other areas, it may be necessary to overcome barriers to access relating to factors as diverse as the cost of subscriptions, language and lack of capacity to develop applications.
To take just one example, equitable access to climate science for developing countries cannot be ensured simply by giving them free access to raw satellite observation data, if the observations do not have the right geographical focus and if the beneficiaries lack the human and numerical capacity to input the data into models and other computational tools.

Nonetheless, while generic, the three duties do offer a reasonably clear basis for what scientists and science institutions need to worry about and respond to. They also give a fairly persuasive negative picture of what irresponsible science might look like. Very simply put, indifference to harm, social benefit and equitable access is unethical. Concern does not of course exhaust ethics. But without concern, ethics has nothing to build on.

This framework for the social responsibilities of scientists – which, while established, has never been comprehensively applied – is not immediately applicable. On the contrary, it faces a series of challenges that point to the need to sharpen, to operationalize and perhaps to update it. Some are very familiar; others are recent or even emerging.

Among the familiar challenges is the indeterminate character of the social impact of any given area of scientific research. Technological development may of course be more determinate – although even then significant uncertainties are likely to remain with regard to social uses and social effects. Scientists cannot, therefore, be expected to assess their actions by reference to detailed predictions of things that would happen. Ethical concerns typically operate in the realm of what might happen.
They involve considerations of uncertainty and risk and are framed by the very old philosophical notion of prudence – and the much newer one of precaution.

Recent and emerging issues relate both to changing patterns of scientific organization and to scientific developments that modify the character of the relation between science and society.

[2] Indeed, the general question of how ethics and social responsibility apply to the social and human sciences is a complex issue that cannot be discussed in detail in this paper.

The point is not that science has suddenly acquired a capacity to change the world that it previously lacked. On the contrary, science has been world-changing ever since the notion of science was first adumbrated – not just through the application of specific technologies, important as they are, but also because science is a worldview that bears on human
consciousness and self-understanding. However, the power of science and technology to shape the very fabric of life sharply expanded in the 20th century, and was dramatized by the destructive deployment of nuclear fission. In the 21st century, we are increasingly sensitive to the implications of biotechnologies, broadly understood, including their interfaces and possible convergence with nanotechnologies and computer sciences, which may over time reshape what we mean by life and what we understand to be “human”.

What this means, in practice, is an added level of complexity embedded in the question of what scientists could or should be responsible for. The things that might happen are increasingly diffuse, and the causal connections sometimes elusive.

Let me take just two examples.

The first is so-called “dual use”, which in this context involves the malicious use by an unrelated third party of results derived from bona fide research that is intended to have beneficial applications or is not pursued with a view to application at all. The question is to what extent scientists have responsibilities with respect to “dual use research of concern” [3] – in other words, possible malicious use by hypothetical third parties of their research results.

The second is the concerns about converging technologies that have emerged in some circles, notably in Europe, as a consequence of the reception of the 2002 NSF report by Roco & Bainbridge on “enhancing human potential”.
These concerns led the European Commission, in the code of conduct for responsible nanosciences and nanotechnologies research that it published in 2008, to propose banning public funding for research in the nanosciences that could lead to applications involving the “non-therapeutic enhancement of human potential”.

In both cases, the difficult question is how such concerns might be operationalized. Clearly, it is unrealistic and even unfair simply to state that individual scientists will be held responsible for misuse of their research. At the very least, their ability to deal with such responsibilities will depend on the institutional support provided to them, on the broader regulatory context, on the actions of many others in the science and technology system, and of course on adequate education and training.

It bears emphasizing that these areas of responsibility are, in the strong sense of the word, social. First, because they fully correspond to the three core dimensions of social responsibility, as traditionally constituted: do no harm, aim at social benefit, ensure access. Secondly, because they relate to the social organization of science, and not simply to science as an abstraction (“understanding the world”) or as an individualized set of professional practices.

However, there is also a broader category of social responsibilities that concern the social organization of science as such, regardless of the specific content of scientific research and technological development. It is a very familiar point that, in this area, considerable changes have occurred and are still ongoing. Nonetheless, it remains helpful to highlight some of the dynamics that are putting pressure on a comprehensive and practically applicable approach to scientific responsibility.

In this respect, intellectual property obviously deserves special mention.
[3] The phrase “dual use research of concern” (DURC) was developed by the US NSABB in order to define the priority area for new science policy frameworks with respect to biosecurity. The objective is to operationalize the concept in such a way that DURC (i.e. research giving rise to clearly identifiable potential concerns on the basis of credible security information) can be singled out at the research funding or publication level and treated appropriately by evaluators, editors and others with, in the broad sense, a “gatekeeping” mandate.

The point is not that intellectual property as such necessarily conflicts with the orientation of science towards
public benefit – though of course science and technological innovation long predate intellectual property as elaborated in the 18th and 19th centuries. But there are certainly new uses of intellectual property that raise questions with respect both to the orientation of science and to access to its results. The relevant contrasts are very familiar. Jonas Salk famously did not seek to patent the polio vaccine, though it would undoubtedly have been admissible, because he felt it would have been inappropriate to do so. Few scientists would do the same today. Indeed, in many countries, few scientists could, given how research is funded and the institutional pressures under which scientists operate. Similarly, as far as I know, no attempt was made to patent any of the transuranic elements, although it would arguably have been possible. Certainly, technologically revealed fragments of the natural world such as genes have been successfully patented in recent years.

Intellectual property does not exhaust the significance of commercial dynamics as they bear on the social organization of science. There is also a partly separate set of issues connected to technology choices and how they relate to social needs. There is no lack of evidence that technology choices, notably for development, tend to be driven by supplier push rather than by impartial needs assessment. The difficult question, which would require further consideration, is what social responsibilities scientists might have, individually and through their institutions, to foster an improved and more participatory approach to technology assessment, especially in developing countries.

Commercial pressures are just one aspect of the context in which science and technology currently operate.
Of particular importance for the topic of this workshop is military research, which influences research agendas, restricts access to research results, and of course creates the risk that science and technology will be deployed deliberately for harmful purposes. It is well known that military funding currently dominates a number of areas of emerging research, notably in converging technologies. The implications of this situation require careful consideration. At the very least, military research programmes require more open and transparent discussion than they typically receive.

To conclude, I should like to emphasize that the various ethical or responsibility issues I have referred to, and the others that could be added in the same vein, are not exclusively, and in some respects not primarily, moral dilemmas for individual scientists. Of course, individual scientists can make a difference, and need to be adequately equipped, through education, training and institutional support, to cope with their responsibilities. But social responsibility is first and foremost a social, and therefore institutional, issue. If scientists are to act responsibly, policy-makers and institutional leaders also have duties to create the right kind of framework – one that encourages, supports and ultimately normalizes ethical science and technology.

Alongside education, awareness-raising and training, which are obviously essential, UNESCO has a major role to play in shaping international and national frameworks. There are three main levels in this regard.

First, providing a forum for discussion about the ethical challenges that are emerging and the principles that can be elaborated to deal with them. As an intellectual agency, UNESCO is naturally convinced that such discussions are valuable in themselves, regardless of the legal or institutional follow-up given to them.
We have established several expert advisory bodies with the mandate precisely to foster independent and open-ended discussion in these areas. Of particular relevance to this workshop are the International Bioethics Committee and the World Commission on the Ethics of Scientific Knowledge and Technology. They can set their own agenda with respect to emerging issues, conduct hearings, mobilize expertise, publish reports, and thereby contribute to shaping international agendas, including the institutional agenda of UNESCO itself.
Secondly, UNESCO is an intergovernmental setting within which, if our member states see fit, legal commitments and political statements can be adopted. There is a range of possible instruments, with different legal status and practical implications:

- declarations, which are statements by the General Conference, addressed to humanity as a whole, expounding in political terms shared commitments – for instance the 2005 Universal Declaration on Bioethics and Human Rights;
- recommendations, which are statements by the General Conference, addressed to the member states of UNESCO, calling on them to do certain things and to report periodically to the General Conference on implementation – for example the 1974 Recommendation on the Status of Scientific Researchers; and
- conventions, which are adopted by the General Conference but require subsequent ratification by member states and, once they have entered into force, become legally binding commitments between the states parties. There is no directly relevant example in UNESCO, but the 2005 International Convention against Doping in Sport does relate to some aspects of scientific responsibility, in so far as doping is increasingly about deliberate and sophisticated misapplications of science.

Many valuable things are done within the international community without recourse to international law, but the UN system is, essentially, a law-driven community.
Consideration of the possible merits of new legal instruments in specific areas is therefore an ongoing process.

Thirdly, on the basis of agreed principles, and within the setting either of existing legal commitments or of agreed programmes, UNESCO can provide technical support for the development in each member state, and where appropriate at institutional level, of the legislative, regulatory, advisory or aspirational frameworks that can give substance to scientific responsibility. This level is of course essential. If ethical reflection stays exclusively at the level of identification of issues and elaboration of principles, it risks being toothless. The crucial challenge is to establish the mechanisms by which ethics – in this case the social responsibilities of scientists – can be embedded in the routine practices of science and technology at all levels.

In this respect, a culture of responsibility parallels the familiar and in many ways better understood idea of a culture of safety. In the same way as safety cannot simply be added to a pre-existing institution, but requires extensive re-engineering of processes, attitudes and behaviour, a responsible institution does not simply happen to exhibit responsibility. It needs to be configured – and usually, in practice, reconfigured – to become responsible.