Sir Peter Gluckman’s introductory remarks at a joint workshop held by the International Science Council and Joint Research Centre of the European Commission in Ispra, Italy.
12 September 2024
The International Science Council – the global NGO bringing together the natural and social sciences, whose members are national academies and international scientific unions and associations – is delighted to partner with the Joint Research Centre for this workshop.
I was struck the other day by a comment made by an extreme right-wing American commentator[1] who said, “I am not a flat-earther. I am not a round-earther. Actually, what I am is somebody who has left the cult of science”. These words have many implications. They represent an extreme example of the issues we are here to discuss. They serve as a reminder that while we may see it as self-evident that science is the best way of understanding the observable world, trust in what science is, while critical, is under challenge. And we would be foolish to dismiss this kind of statement as a purely American disease or to argue that it is not a generic issue. Irrespective of the number of people who personally distrust science, their influence is such that they clearly affect how societies make decisions on many matters; the population of distrusters may be segmented, but it is growing, not diminishing, in size. As Evans and Collins pointed out in their book Why Democracies Need Science, a key role of science in democracies is to help societies make better decisions.
It seems worthwhile to start by reminding ourselves what science is: an organised system of knowledge – one based on observation and experimentation. Explanations can only be based on causal reality, logic, and past observations – sometimes called ‘shallow’ explanations. Explanations based on merely subjective and non-empirical considerations, be they from religion or belief or ‘deep’ explanations, are excluded. Claims without quality assessment by formal or informal expert peers should not be considered part of science. These principles, not methods or truths, define science, allowing iterative review and progressive modification of knowledge as new observations are made and incorporated. It is these principles that make science universal. Crucially, they apply everywhere and across all cultures.
Science is distinctive in its principles, which allow it to provide the most reliable and inclusive way to understand the universe and the world around and within us.
But there is potential danger. As Clark, Pinker and others wrote[2]:
The fundamental principle of science is that evidence — not authority, tradition, rhetorical eloquence, or social prestige — should triumph. This commitment makes science a radical force in society: challenging and disrupting sacred myths, cherished beliefs, and socially desirable narratives. Consequently, science exists in tension with other institutions, occasionally provoking hostility and censorship.
This is not exclusive to one extreme end of the political spectrum; we have seen it previously in the postmodernist and relativistic arguments about the validity of science.
We must, however, distinguish what is science from the scientific systems that evolved to produce or use science[3]. The latter vary enormously and are influenced by context, culture, and motive. They include the institutions that fund, teach, and publish science, higher education, and research institutions; they include the defence and private sectors and other components of civil society. Here, we must be honest and acknowledge that institutionalised science has contributed both good and bad and has its own power dynamics.
But science is not the only knowledge system people use. In their daily lives people apply and combine a variety of knowledge systems, including those that define their identity, values, and worldviews; these may be local, indigenous, religious, cultural, or occupational in origin. Science will be more likely to be used when scientists acknowledge its limits and understand that, for science to be trusted and best used, they must accept that other knowledge systems often play a role in how we live and in how society makes decisions.
We are concerned with a number of somewhat overlapping and interrelated elements when we talk about trust in science. Let me list my own idiosyncratic taxonomy of factors to consider.
- The production of reliable knowledge – much has been written about it. Certainly not all is well in the industry of science, but this is not our prime focus today. There are too many incentives to jump to premature conclusions, to tolerate sloppy research design, and to commit scientific fraud for such behaviour ever to be entirely eliminated. The institutions of science systems do work hard, with guidelines and processes, to limit malevolent behaviours in the scientific community as much as possible, but science is a human endeavour, and the most egregious examples make great media stories.
- The second is the communication of what we know – or, more honestly, what we think we know. There is a great tendency for scientists to ignore the differential gap, as Heather Douglas describes it[4], between what we know and what we conclude. Assumptions are often never admitted, uncertainties are ignored – as we saw so often in Covid communication. Scientific disagreements can be played out in public, hubris exudes, and jargon is overused. Scientists and their institutions are great at hyperbole. Minor molecular findings can be turned into headlines curing cancer or diabetes. A study in Australia showed how university and hospital public relations departments contribute to such hyperbole. Publics are not dumb and can sense this. Our community certainly contributes to its own challenges.
- Then there is the matter of the perception of science by the recipient. Many published studies and reviews on trust come from philosophers and psychologists who focus on the individual relationship – how it is built and sustained: how two partners in a business or in a romantic relationship sustain their trust in each other. There, some form of reciprocation is in play. But when one moves from one-on-one to system-to-society interactions, I am less certain of the extent to which we can extrapolate from that kind of study of trust to the challenges we are discussing. And too much of science ignores, or exhibits hubris in, any semblance of a relationship with society.
- Then there is the issue of anchoring biases and the underlying psychology, which we need to discuss. One form of anchoring bias of growing importance lies in identity fusion – where individuals subsume their own views to those of the group they want to be fused with. As liberal democracies have become more polarised, identity fusion plays a greater role at the extremes, as we are seeing play out in so many ways.
Clearly, in America and other so-called liberal democracies, the alignment of science with political affiliation is most acute. Climate change science has been suggested as a precipitant, but there are also deeper issues. As Scheufele was recently quoted:[5]
Science relies on the public perception that it creates knowledge objectively and in a politically neutral way. The moment we lose that aspect of trust, we just become one of the many institutions … that have suffered from rapidly eroding levels of public trust.
- And here we come to more immediate issues: the interrelated matters of affective polarisation, loss of horizontal trust within society (sometimes called social trust, where groups no longer trust each other and do not want to cooperate) and, particularly, the rapid decline in institutional trust. There has been a loss of trust in both the institutions and their actors within the liberal democracies. This is most obvious for politicians, media, financial institutions and police, but universities and the institutions of science are equally caught up in it. While trust in science tends to be high relative to other elite institutions, it has followed the same general decline.
But the question remains: can we dissociate a fall in trust in science from the general decline in institutional trust? The parallel trendlines suggest it would be hard. But given that science has maintained a higher trust level relative to other elites, it may be possible. Much of my group’s recent work is on factors affecting social and institutional trust in the context of discussing social cohesion.[6] We cannot ignore the roles of inequality and exclusion in undermining institutional trust.
- A new technology was invented that competed with an extant product. The legacy industry immediately organised, produced fake science, undertook an active disinformation campaign and recruited politicians, and the combined effort had a long-lasting legacy. It is the story of margarine versus butter, as told by the late Calestous Juma in his marvellous book, Innovation and Its Enemies.
- But beyond the obvious interests that led to margarine being undermined by the dairy industry, it is worth asking what motivates so many people to engage in undermining science. Is it always something specific and political, or is it no different from the mischief-making we see so often on social media? What is the psychology of the disinformation purveyor? Do they always have interests in play? Certainly, over millennia, shamans and priests, dictators and autocrats have used disinformation and propaganda to maintain power in multiple ways.
And now because of the ease of social media and the business model of influencers, disinformation is also used as a form of disruptive entertainment.
We seem to have gone beyond selective acceptance of science – that of green movements who would accept climate change but reject genetic modification, or the conservative right who would accept GM but not climate change – to a broad rejection of the ‘cult of science’.
So for the bulk of dis-informers, is it now simply one tool for demonstrating loyalty to the identity group (united by interests or emotion) – to undermine anything that lies outside the group? Conspiracy theories and distrust, identity fusion and polarisation go hand in hand. Social media have expedited all of these elements and magnified their effect and impact.
- One other factor can add fuel to the mix. The science community often conveniently forgets that science and technology also do harm. Thalidomide, eugenics, the Tuskegee experiment – these are examples of bad science that roll off the tongue. And of course, much of the world’s science and technology is most rapidly developed in the military context. But there are many other harms that are the unintended consequences of good science. The climate emergency is, after all, the result of the science and engineering that created fossil fuel-based engines and industry. Obesity has a lot to do with the science of industrial food production, mental health issues in young people are fuelled by the digital sciences and their application, and economic science has led to policies that fuel inequality.
As the next raft of technologies emerges at a destabilising rate and largely without any regulatory control, what will artificial intelligence, synthetic biology and quantum technologies bring, at least in raising societal fears? And fears are the fuel of affective polarisation and the shift towards autocracy.
We are here because we bring different expertise to these and other perspectives I have not considered, and because we agree that a loss of trust in modern science must limit the use of science in collective decision-making, and that this must ultimately harm society and prevent progress.
I look forward to a lively meeting and thank the JRC for their hospitality.
[1] Candace Owens https://www.catholic.com/audio/cot/should-christians-trust-the-science
[2] https://www.pnas.org/doi/pdf/10.1073/pnas.2301642120
[3] P Gluckman; https://policylabs.frontiersin.org/content/commentary-science-and-science-systems-beyond-semantics
[4] Heather Douglas, Science, Policy, and the Value-Free Ideal, 2009 University of Pittsburgh Press
[5] Taken from https://www.nytimes.com/2024/09/11/opinion/republicans-science-denial.html
[6] https://informedfutures.org/challenges-to-social-cohesion/