Uncertainty? Definitely

15 Dec 2015 | Viewpoint
Since her term as President of the European Research Council ended in 2013, Helga Nowotny has turned author. Her book on the role of uncertainty in science, The Cunning of Uncertainty, has just been published. Science|Business spoke to her at the recent Innovation Conference in Barcelona.

Q: So, Helga, you’ve written this book on uncertainty, and the cunning of uncertainty. Why now?

A: There are two answers to this. One is a very pragmatic one. After stepping down as President of the European Research Council I finally had time to write again, and I realised that this is a topic that has been at the back of my mind and has been with me for a very long time.

The other reason is that, in terms of timing, maybe this is the right moment to write such a book. I say this because we live in a time when there is also a politics of fear, I would say, of deliberately inducing anxiety in people in order to use it for political purposes.

But coming back to my own professional background, much of my work has been concerned with the tensions and relationships between science and society, and I have been struck by the fact that science copes extremely well with uncertainty. Uncertainty is a driving force for science because it pushes scientists into the unknown. There is this great excitement and moment of new discoveries, where science thrives on the cusp of uncertainty; and science and scientists also know that whatever knowledge is gained, it is provisional knowledge, because there will be new knowledge.

Just on that, when you talk to scientists they quite often say these things, but when they speak in public they very rarely say that science is provisional.

That’s the second part of the equation. In society we have this, I would almost say, craving for certainty. And there is a lot of pressure on scientists in matters that are of concern to society and to politicians: Is this substance carcinogenic, yes or no? Is the risk worth taking or not? Questions of this kind. And here, if they are honest, scientists can very, very rarely say yes or no. If they are honest they have to say “yes, under the following conditions…”, and there are many conditions needed to specify the answer. But I agree. The political pressure is there, as is the internalised feeling that we should give these clear-cut yes-or-no answers, but it’s really not possible.

So part of the message that I want to get across is that it is not only more honest but it is necessary to accommodate uncertainty better into our lives.

One of the experienced editors at Nature once said to me that almost all the papers published in Nature will at some point be shown to be wrong…or at least wrong in part…

…but here, if I may come back to the provisionality of knowledge – and this is the good thing about science – we will produce new knowledge that in the long term will supersede or complement the knowledge that we already have, and part of it will be proven wrong. That’s how science progresses and how scientific knowledge continuously renews itself.

On the other hand, and this is why we have peer review, you want to have some kind of guarantee or certification that what is published right now is up to the state of the art and meets the rigour of the methods applied. Peer review should guarantee knowledge as certain, and yet this knowledge remains provisional.

You talk about the politics of fear, and deliberately inducing anxiety in people. You’re talking here, without naming any in particular, about governments generally. You’re not just talking about terrorists, for example?

Look, we have seen a change in public mood after 9/11, so this obviously was related to terrorism. We now have a European variant of it… so for me this is the background. I don’t talk in the book about terrorism. But I think it is important to realise, and I am very adamant in the book, that if you let fear take over it eliminates the openness of the future. Because if you are afraid you retreat, or you feel constrained to limit your actions and reactions.

Now, the title of the book harks back to Hegel and his concept of the cunning of reason…

…and to the ancient Greeks. In Greek it’s metis. So it’s much older than Hegel.

Do you believe there is a sense that the more we know the greater the uncertainty?

The more we know, the more we discover what we do not yet know – I would rephrase it that way.

Surely that must raise uncertainty, because things can only be uncertain if you know you need to know about them…

Yes. But the cunning – of course it’s a metaphor – means for me that there are many contingencies in our lives, in our way of reasoning, in our way of doing research, in our way of planning or practising foresight of any kind, that turn out differently from what we had planned or imagined. The cunning is a metaphorical way of accepting that not all our plans will follow a straight line, and also of learning to appreciate the inbuilt subversive force of uncertainty. We all recognise in our lives the things that we had wanted or planned, and then it turns out we did not get them, or things turned out otherwise. But with the benefit of hindsight we often discover that in the end they had some unexpected positive sides. And I would like to get people to see this cunning as a way of letting them discover many more dimensions than we imagine when we plan or when we set ourselves goals.

Do you think politicians and research funders understand this?

I think they want to control things, and in particular when funding research it is very tempting for politicians to think they know better what the outcome should be, and that the researchers are there to bring home the bacon. But if one looks back at the major scientific, technological, medical achievements in the last 100 years, they did not come about because there was a master plan, or a politician saying we need this vaccine or this product or this particular outcome.

That’s not entirely true, though, is it? You had the Manhattan Project…

…yes, but the Manhattan Project was something different. Like putting a man on the moon, it’s an engineering problem. Engineering is different from science. With engineering you know precisely what you want, you know the constraints that you have to overcome – technical, financial, regulatory…

This is obviously a good time to be publishing your book, but do you have any specific targets in mind here?

Targets? I have several audiences, let’s put it that way. I want to encourage researchers not to hide uncertainty. For me [the French biologist and Nobel Prize winner] François Jacob is one of those who got it right a long time ago. In his autobiography he speaks about “day science” and “night science”. Day science is the glorious part: you show to the world what science can do, and it’s wonderful, magnificent. And then, he says, there is night science as the other side of the coin: when one is frustrated in the lab, one’s experiments don’t work, one has to start again, one discovers one has gone down the wrong alley and has to backtrack – and yet you persist, and you go on, because you believe in what you are doing. I think this is something that also has to be communicated better to society, in the sense of not only showing the finished product, the outcome, but also the practice and process of research. And fundamental research is inherently uncertain – if it were not uncertain you would never find anything new.

That’s one part of my audience. The other part of the audience is “society” (if I can use this global term, which is far too vague and too big). We know from a lot of empirical work what people are afraid of and where they see risks; and this work consistently shows that people are mostly afraid of the wrong risks, or, to put it very bluntly, they are afraid of the things that they already know. I also want to tell the public that uncertainty is part of life. Science has extended the range of what we can predict, but obviously there are limits even to the best models and tools that we have. There will always be uncertainties, just as everyone has to live with probabilities.

Let’s say your book is read by all the main decision makers. They read it, they change their practice accordingly. What would change?

They would gain a longer-term vision of the way in which science works, not just the short-term impact perspective. I don’t deny that it also has a place, but I would like them to accept the so-called “usefulness of useless knowledge”, as [US educator] Abraham Flexner formulated it in the 1930s.

Are there things that you think they would be funding but they aren’t funding now?

You can tell them, “We would not have GPS without Mr Einstein, who did not receive funding for his theory.” Give them a couple of examples of this kind, just to make them realise that fundamental research, especially, is often guided by serendipity and by the cunning of uncertainty. The transformation of seemingly useless knowledge sometimes needs a much longer time span. But in the end, something useful will come out of it. That awareness is lacking now.

CERN is still getting money…

Yes, but CERN works differently. CERN is open to discover what we don’t know as yet, and is working with uncertainty. Take the Higgs boson – it might well have turned out that the people running the Large Hadron Collider would have said, “Guys, we’ve not found anything.”

But budgets generally are under threat…

So politicians go for the secure part. They want to have short-term societal economic impact – as is now practised routinely in the UK, in particular. And when you cut down budgets you insist on deliverables within a short period of time. But nobody is able to foresee the more long-term results. What about the unexpected societal impact that no one can write into the impact statement of the application because nobody knows?

When you were president of the ERC, did you feel this pressure?

Getting the ERC started and functioning was a major voyage through uncertainty, because we did not know whether we would succeed, and especially in the first years there was this constant struggle between what I would call a culture of trust and a culture of control.

So you are saying you have to go with a culture of trust, because you can’t control?

The ERC was set up to do bottom-up, risky frontier research based on excellence only. So no economic impact statements. Without a culture of trust this is not possible.

The Cunning of Uncertainty by Helga Nowotny, Polity 2015, ISBN: 978-0-7456-8761-2