Risk consultant: EU R&D hostage of 'eco-religious fundamentalism'


European companies are finding it increasingly difficult to convert research into innovation as politicians turn to the precautionary principle and Europeans come to see science as a 'force of evil', argues David Zaruk, an environmental health risk consultant, in an interview with EurActiv. 

David Zaruk is an associate professor in risk and corporate communications at Vesalius College in Brussels and a senior research associate at the Institute for European Studies at the Vrije Universiteit Brussel (VUB). He is also an independent risk lobbying analyst at Risk Perception Management, a consultancy. 

Of Canadian origin, Zaruk has a PhD in philosophy and has worked for twelve years in issue management and science communications at chemicals group Solvay and the European Chemicals Industry Council (Cefic), among others. He is also one of the founders of GreenFacts, an Internet-based environmental health risk communications tool.


What role does science play in EU policy making? 

EU policymaking has two main problems. One of them is structural: you have different Commission directorates-general (DGs) doing different things within their own silos, and that affects science policy. Just look at the Lisbon Strategy for growth and jobs, or the Barcelona target of spending 3% of GDP on R&D. 

Then, look at the first reading of the REACH regulation. The chemicals industry, one of Europe's largest and most innovative industries, was being attacked by one DG while at the same time accused by another of not investing enough in research to reach the 3% target. 

Nano is another example. There are great nanotech projects going on in DG Research, while DG Sanco is putting all its effort into consultations on 'how can we control this stuff?'. Is there sufficient coordination on research policy between the silos? 

One DG thus encourages the development of an application while another attacks it, or the potential innovations that might emerge from the research. DG Sanco is mandated to keep people safe and has developed a structure for risk assessments to meet this. But how do you do a risk assessment for nanotech? Normally, risk equals hazard times exposure. But nanoparticles are so small that it is difficult to measure exposure, so Sanco's conclusion is that exposure must be assumed and precaution is needed. This means that hazard now equals risk – a nightmare for any science- or research-oriented policy approach. The chosen risk assessment approach for nano will entail taking all nano off the shelves and off the research agenda. 
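A minimal sketch of the logic described above, assuming a conventional multiplicative screening model and a hypothetical worst-case fallback when exposure cannot be measured (the numbers and function names are illustrative only, not part of any official methodology):

```python
from typing import Optional

# Illustrative sketch: conventional screening treats risk as hazard scaled by
# exposure; the precautionary shortcut assumes worst-case exposure whenever
# exposure cannot be measured, so assessed risk collapses to the hazard itself.
# All values are hypothetical.

def assessed_risk(hazard: float, exposure: Optional[float]) -> float:
    if exposure is None:
        exposure = 1.0  # precautionary assumption: worst-case exposure
    return hazard * exposure

hazard_score = 0.8       # hypothetical hazard score for a nanomaterial
measured_exposure = 0.05 # low measured exposure

print(assessed_risk(hazard_score, measured_exposure))  # 0.04: low assessed risk
print(assessed_risk(hazard_score, None))               # 0.8: hazard treated as risk
```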

However, if you are a university or a company making a big research investment, you want to make sure there is some clarity on where you are going: you need clear guidance and clear policies. If different DGs are acting against each other and we are getting mixed messages, that's one of the big problems for developing science and an innovation policy. If you invest in research, you want to get some profit at some point out of it. But there are a lot of handcuffs and conditions put on research in Europe. 

One of the EU's challenges is to turn research into innovation. You can say all you want about how wonderful research is, but if you cannot develop that research into a product that somebody can or is willing to use (that is, if you cannot create an innovation), you have removed the incentive for further research. This happened to much of the biotech research in the 1990s. Once again, America has gained from the lack of clarity in EU science policy. 

The Esko Aho report ('Creating an Innovative Europe') was worth reading, but it missed a couple of things. While it noted that European consumers are not embracing innovation enough, it failed to answer the question of why they are not embracing new innovations. This is treating the symptoms of the cold rather than the virus. 

Why are Europeans not embracing innovation? 

Historically, Europeans and Americans see nature differently. This affects the way they see the world. Americans see themselves historically as pioneers pushing westward and going out to conquer nature. Meanwhile, Europeans were building walls around their cities to protect themselves from nature. In the EU, the view is that conquering nature is not something that man should be doing. This precautionary preoccupation persists today. 

Science, from the time of the Renaissance, was originally considered the 'saviour' in the fight against nature: a way to stop nature from destroying yields and spreading disease. This was the case until after WWII, when science began to move from its focus on discovery and problem-solving to being progress-driven (more technological). 

After we more or less 'conquered' nature, industry began to come up with innovations to make life more comfortable (and marketable), and a lot of that science was technology-oriented and thus more expensive. Science was no longer perceived as a saviour (except in rare moments of public vulnerability to nature, like the emergence of AIDS or H1N1), but as a convenience. 

And nature is no longer seen as the vicious beast it used to be. It is now seen more as the polar bear adrift on a melting glacier, and it is science and its technological advances that caused all the carbon increase and global warming. Our view of science has changed from a force for good that protects us from the evils of nature to a technological machine, brought in by big business, that is destroying and polluting nature. And perhaps nature is going to take its revenge on us with global warming, which is going to be cataclysmic. 

And here we are talking about a new religion, the eco-religion. We've killed our Christian religion, but man can't live without one. So rather than God we have Gaia, and a certain sense of purpose in our lives to save the world. Ecological rituals of recycling and trying to lessen our ecological footprint are very strong religious symbols which add meaning to our lives. We have an Armageddon principle in climate change, and of course good and evil depend on whether we live our lives sustainably or not. So it is a religion. 

I don't mean to be cynical about this, as religion can be quite good. But religion also tends to fight science, especially if you are a creationist. And a lot of environmentalists look at science as a threat, as the Catholic Church did. There are interesting parallels between the two. 

Secondly, precaution was created as a policy tool, particularly by those who think science has gone too far: if we are not sure where the research is going, perhaps we should opt for precaution. 

There are different definitions of the precautionary principle, and here we see where science and policy break down. The first definition was formulated in the Rio Earth Summit Declaration, at a time when scientists were starting to talk about climate change and when the idea of sustainable development was introduced. It is referred to as the triple negative: 'If you don't have enough information on something, that does not give you a reason to not do something'. At Rio, there was not enough evidence to prove that the world was warming (or that this was due to human activities), but that was not enough of a reason not to take action to cut emissions. This definition of precaution supported science as a policy tool. 

Another definition was set out by the European Commission in its 2000 Communication, which basically said: "If you don't have enough information on something, you should withhold judgement until you get more." 

The third definition was formulated by the European Environment Agency and forcefully espoused by its science policy head David Gee, previously an environmental activist and director of Friends of the Earth UK: "Until you have enough information to be certain about something, you should take precautions". This definition is the most widely used in Europe now. This definition switched the burden of proof – science now has to prove that something is safe with certainty before we permit further research. 

Before, scientists could develop an innovation and market it, and it was then up to us to test and prove that it was dangerous. Now, you need to prove something is safe before it can be marketed. But proving something safe is rather difficult. The best example of this is GMOs: a company does tests and proves that a biotech seed is safe according to policy requirements, but once it asks for market authorisation, opponents come up with more questions, such as 'is it safe for a certain type of butterfly?' or '[are you sure] that GMO pollen won't be a threat to other strains?' They can ask for anything, really. This is good business for research labs, but not for innovation. 

It is extremely difficult to do science and scientific exploration nowadays. Scientific research is not encouraged. And on top of that, with the reversal of the burden of proof, the whole mentality has changed: you are now guilty until proven innocent. 

The REACH Regulation states at the beginning that REACH is based on the precautionary principle 'as defined in the Treaty'. But the Treaty has only half a sentence on precaution (the second half of the sentence refers to the 'polluter pays' principle). REACH is a good example of the reversal of the burden of proof. During the process, the whole point of REACH shifted from ensuring the safe use of chemicals to substitution, but how can you prove that substitutes are safe? It is impossible! So it is very interesting to see that REACH is based on something that is not very well defined. 

How do you see the relationship between science and society and how is society affecting science? 

We are afraid of science now. Therefore, we need clearer communications on science's benefits – not just the focus on the risks. 

I used to believe that if you could communicate science clearly to politicians and the public, you could get better policies and improve public perception. But I'm not that optimistic anymore. Unfortunately, I increasingly see that facts don't matter very much. Influence matters more, and the same NGOs campaigning against the influence of lobbyists are the ones with the greatest influence on the EU policies that shape science. 

You can bring the best scientific facts into the policy debate and hope that they are treated with a certain degree of understanding, but we are not in a knowledge-based society anymore. We are more and more in an influence-based society. Scientists are part of the structure, but not a major part. Scientists come in, for example in risk assessment, to present their views to SCHER [the scientific committee on health and environmental risks]. Their opinion might then be brought into the debate, but not necessarily, so why should they bother to participate? 

It has a lot to do with the separation of risk assessment and risk management, a mistake that dates from the mad cow crisis. At that time, scientists were advising on policies while the scientific community was divided over whether CJD could come from eating beef infected with BSE. 

Following this confusion, the EU decided to separate risk assessment and management, as it thought scientists should not be involved in the decision-making process but only in an impact assessment. But more and more scientists are starting to see that there is very little role for scientific evidence within the decision-making process, and many of them no longer participate in the risk assessment process. A recent consultation on the risk assessment process showed that it is becoming increasingly difficult to get good experts at all. They are getting fed up, as EU policy is being driven by elements other than science. 

There is a belief among policymakers that there is no interest in Europe in having GMO food, so the technology is not being turned into innovative products. If you don't interact with civil society, you are not going to get public acceptance for what you are doing, nor the innovations and benefits that could feed into further research. A report on the societal impact of research showed that the scientific community has not paid enough respect to civil society. 

We can see that with some emerging technologies today, such as nanotech. If the public feels that there is a threat somewhere, is somehow not convinced of the idea and is afraid of the word 'nano', then products resulting from nanotech research will have a hard time finding their way onto the market. 

Public loss of trust in science, and resistance to the innovations that follow from it, is also rooted in promises made by scientists but not kept. GM technology promised to feed the world without pesticides, develop new strains of Golden Rice and grow bumper crops in deserts, while we only ended up with Roundup Ready corn that nobody wanted or needed. A lot was promised, less was delivered, and the public felt deceived. 

Unfortunately this has a lot to do with funding requests, which promise a lot based on potential results - innovation is more pragmatic. Now, nanotech and stem cell research promise to cure all sorts of health problems such as cancer. But what we have so far is research showing that carbon nanotubes might cause cancer! You have to understand that these promises come from researchers putting out funding applications saying 'please fund us, as this might work'. The public is thus getting the message that nano will solve problems, while the only things delivered so far are nanomaterials that we can't see, that could end up in our bodies and in the environment, and that we don't know how to control. 

So, we are seeing the risks, but we are not yet seeing the promised benefits because they are just potential applications. This is an opportunity for opponents of the research-driven agenda. It has been said that GMO stands for Greenpeace Membership Opportunity. GMOs have had a big effect on where science and policy have gone. It is very difficult for policymakers to put money into research if the public is being made afraid of something. So what will 'nano' stand for: Not Another New Organism? I hope not, but once again, facts don't matter, influence does, and NGOs have a lot of influence on the direction that research takes here. 

What's the problem with precaution? Isn't it politically responsible to take precautions? 

Of course you need precaution – it is a natural impulse to take care. But you need to make sure that you really have an issue, because a lobbyist, whether an NGO or a competitor with an alternative to a substance, can create fear very easily today. The WHO is advising precaution on H1N1 and scaring the general public (it should be focusing on other, more serious threats like TB or malaria: once again, facts don't matter, even at the WHO). 

It is just that precaution needs to be used responsibly. Precaution uses a different logic from the scientific one. Today I brought an umbrella with me even though the sun was shining. I was not wrong to bring the umbrella (and will do so again tomorrow), even though I was not right either. 

Precautionary logic entails that not being right is not the same as being wrong. In other words, if you use the precautionary principle, you are never wrong - that is very attractive. Scientific logic implies that you are either right or wrong. Therefore, if you are right and can say 'this works and it is not a threat', then researchers assume you have the right to market it. According to precautionary logic, not necessarily. 

The European Environment Agency's approach is that 'we don't trust you – prove to me that this is safe'. Reversing the burden of proof is a reversal of trust. For policymakers, it is much more attractive to never be wrong than to take the risk and maybe be right. So science is paying a big price in Europe because of the precautionary principle – both in terms of lost opportunities for innovation and loss of trust in science.

Whose science can we trust? 

There are demands on science, and that is good. Science has always been pushing against the norms of society. Whether the question is whether the earth revolves around the sun or whether a substance is safe for a foetus in the womb, there has always been that push to test whether science is safe, and science has always had unbelievably strict, rigorous standards. This was not invented by the EEA. 

But now science is being pulled into eco-religious dialogue. One example is the problem of the natural-synthetic divide (seen most recently in the debate on chemicals). It is assumed that natural is good, and synthetic (man-made) is bad. We tend to forget sometimes that there are some pretty nasty natural chemicals out there and that science is there to fight them – diseases, viruses, bacteria – and that some of the medicines we take are synthetic but are actually doing good things for us. 

The problem is that science is being associated more and more with non-natural endeavours, and hence with bad. We talk about green chemistry, have a drive to use more 'natural stuff' and look up to sustainable science. We strive for more green science, and activists are beginning to portray non-green chemistry as evil. Take plastics as an example. We tend to forget the enormous benefits that plastics have brought us. The fact that we can now buy meat and not get sick the next day, the fact that we are able to package things more lightly and save energy with lighter, more efficient technologies using plastics – all these positive sides are forgotten, because plastic is synthetic (and, in our worldview, assumed to be evil). 

Our eco-religious cultural narrative dictates that the solutions to our problems should not be man-made; they have to be natural. This drives eco-labelling and green procurement decisions. But what about when hospitals opt for eco-label disinfectants, which are not necessarily the best disinfectants and enable the spread of hospital superbugs? How many people have to die each year (in Belgium, around 1,400 victims) before we put science before religion? 

Eco-oriented decisions are based on good intentions and caring, benign policymaking. Watchdog organisations like NGOs are there to protect us. If my watchdog barks at the moon 99 times out of 100, I will still accept it for that one time it barks at a mad cow. NGOs know they can present things without scientific evidence and it doesn't matter - they are trusted on the strength of their good intentions, the sense that they are there to protect us. But if a scientist makes a mistake one time out of a million, it is not the same thing. We have different standards of expectation for the scientific and technological world than we do for societal actors. People die from malaria (around 3,000 per day) or hospital superbugs because of policies based on caring, good intentions (without scientific grounding), and nobody seems to mind – trust in the influence of NGOs is not undermined. How can facts matter in this context?

Do you see any role for science in EU policymaking? 

We need a little bit of courage, political courage first of all. Precaution is a policy tool for cowards, because if you are never wrong, you don't have to take risks or be responsible for any indirect negative consequences. It is wonderful to take precautions when you have a difficult decision. But you affect people when you stop research. For example, there could be benefits that will come out of nano research that will allow a more effective chemotherapy one day, but if we run away from a difficult decision on carbon nanotubes, we may never have that benefit. We can manage risks (if we trust researchers), we don't have to run away all of the time. 

Politicians are not courageous enough, and precaution is going to affect the role of science in policy. And scientists see this when they work on a risk assessment and their opinion is not followed. We are not in a knowledge-based society, but in an influence-based society. It is not just corporate lobbyists; NGOs too are very effective in the way they affect policies, particularly on research issues. In our influence-based society, science is being hit by eco-religious fundamentalists (call them creationists, puritans) who are capable of wreaking mass havoc in pursuit of their noble intentions. 

Finally, there is one last little point – we have no trust in institutions or government experts either. Trust is a commodity and some have a surplus of it, some have a deficit. Profit-oriented industry certainly has a deficit, NGO watchdogs have a surplus. Building trust is very hard, especially with our new emerging communication tools. Whither innovation? 
