European policymakers face a difficult choice when authorising new technologies such as GMOs, as they often find themselves caught between conflicting expert safety advice and calls to respect the precautionary principle when scientific evidence is insufficient.


EU policymaking is largely based on expertise and involves handling complex technical information at different levels. While such expert-based regulatory policy is seen by some as a guarantee of rational decision-making, it is sometimes perceived as technocratic and opaque.  

Since the mid-1990s, steps have been taken to improve the quality of the science used in decision-making by establishing independent scientific committees and independent risk-assessment agencies, such as those in place for medicines and food. Since 2001, the debate on the role of scientific evidence in policymaking has been placed in the wider context of European governance and better regulation.

In its 2001 White Paper on European governance, the European Commission recognised that scientific and other expert advice was playing an increasingly significant role in EU decision-making. Expert advice particularly serves to "anticipate and identify" potential problems and uncertainties facing the EU, helping its institutions to make decisions and to communicate risks effectively. 

In 2000, the Commission published a communication on the so-called 'precautionary principle', covering "cases where scientific evidence is insufficient, inconclusive or uncertain and preliminary scientific evaluation indicates that there are reasonable grounds for concern that the potentially dangerous effects on the environment, human, animal or plant health may be inconsistent with the high level of protection chosen by the EU". 


Scientific expertise increasingly contested

Scientific experts are regularly consulted by policymakers to explain and offer advice on diverse EU policy issues. But while scientific advice is being sought more often, the practice has become a target of growing criticism. The lack of transparency in the way such expertise is selected, used and disseminated by governments is often considered a problem which could undermine the legitimacy of the decision-making process. 

There is widespread debate about the advantages and disadvantages of risk analysis as a tool for policy decisions. Industry, NGOs and academia are often at odds over how risk analysis should be used and how much influence it should have over decisions. One of the main points of contention concerns the objectivity of the experts consulted. 

The scientific community and industry often argue in favour of strictly risk-based policymaking, describing risk analysis as the only "objective scientific basis" for making more rational decisions. According to the proponents of this approach, problems and limits of risk analysis can be overcome through data collection and research, as well as strict guidelines for conducting research and presenting results in a consistent manner. 

Meanwhile, environmental NGOs and consumer interest groups point to the dangerous tendency of risk analysis to oversimplify the problems facing policymakers by focusing on one hazard and effect at a time, or on problems that are already well understood. They argue that the complexity of risk assessment methods leaves the whole process vulnerable to manipulation for political purposes, ultimately making the decision-making process less democratic. 

Whose science can be trusted?

Scientific evidence has always attracted some scepticism, particularly when provided by industry, which is often considered potentially biased. However, recent EU policy developments regarding genetically modified food, for example, show that even official expertise produced by EU scientific agencies is increasingly being contested.

While individual companies and industrial sectors are often perceived as an untrustworthy source of scientific information, data provided by environmental NGOs and consumer protection groups are more widely disseminated by the media and more easily accepted by the general public and decision-makers alike. 

Whereas NGOs accuse industry of exaggerating the benefits of its new products, industry denounces NGOs for focusing solely on those products' potential health or environmental risks.

A general lack of understanding of how advances in science and technology affect our lives further contributes to confusing the general public, and sometimes invites controversial or sensationalist reporting on issues such as food safety, GMOs, bird flu and global warming.

In addition, people can reject authoritative information and instead choose to believe information presented to them by groups or individuals with whom they share beliefs or ideologies.

Politicians as risk managers

Confronted with competing scientific evidence, extensive industry and NGO lobbies, public fears and confusion, EU politicians - who are usually non-scientists and non-specialists - have the difficult task of regulating and deciding on the authorisation of new products and technologies. 

Independent cost-benefit analysis, risk assessment and an evaluation of whether the risks can be managed are widely seen as the way forward.

In this regard, politicians can be considered as 'risk managers'. Risk management describes the process of weighing up policy options with regard to a controversial issue, like GMOs or chemicals, in consultation with all the stakeholders concerned. By looking at the risks involved, and at the risks perceived by the public, politicians decide what to do about the risk, and how to communicate their decision, implement it and evaluate the results. 
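
Purely by way of illustration, the toy sketch below shows the kind of trade-off such a risk manager weighs: the expected benefit of authorising a new technology set against the expected harm, each discounted by its estimated probability. It is not drawn from the article or any of the sources quoted in it; every option, figure and function name is a hypothetical assumption.

    # Toy sketch: comparing policy options by expected net outcome.
    # All numbers are arbitrary placeholders, not real assessments.
    def expected_value(outcomes):
        """Sum of value * probability over a list of (value, probability) outcomes."""
        return sum(value * probability for value, probability in outcomes)

    # Hypothetical options for a new technology (e.g. a GM crop).
    options = {
        "authorise": {
            "benefits": [(100, 0.80)],   # economic gain: likely, moderate
            "harms":    [(-300, 0.05)],  # environmental damage: unlikely, severe
        },
        "reject": {
            "benefits": [(0, 1.00)],     # no direct gain
            "harms":    [(-20, 0.50)],   # cost of falling behind competitors
        },
    }

    for name, option in options.items():
        net = expected_value(option["benefits"]) + expected_value(option["harms"])
        print(f"{name}: expected net outcome = {net:+.1f}")

The arithmetic is trivial; the point is that the result turns entirely on whose estimates of benefit, harm and probability are fed in, which is precisely what the competing stakeholders dispute.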

Missed opportunities to innovate? 

Some argue that the EU's cautious approach to granting market authorisation to new technology applications, such as GMOs or products derived from nanotechnology, could lead the bloc to miss out on major opportunities to improve its competitiveness. Furthermore, a generally risk-averse society and regulations that place the burden of proof on industry rather than on the authorities are seen as major hindrances to further investment in innovation. 

One example of how the burden of proof has been reversed is the EU's recent REACH regulation on chemicals. Instead of national authorities having to demonstrate that particular chemicals are a cause for concern, manufacturers are now responsible for proving that their products are safe in order to obtain market authorisation.


"We need a fundamental review of the way European institutions access and use scientific advice," said President of the European Commission, José Manuel Barroso.  "In the next Commission, I want to set up a chief scientific adviser who has the power to deliver proactive, scientific advice throughout all stages of policy development and delivery. This will reflect the central importance I attach to research and innovation," he added. 

In a recent report, the European Research Advisory Board (EURAB) asked whether research was part of the solution to social issues or part of the problem. While European policymakers are promoting research and innovation "as the saviour of Europe, bringing anticipated benefits, increased competitiveness, prosperity and better jobs," people are increasingly doubtful that a "scientific assessment of risks and benefits with decisions made solely by the experts is necessarily a guarantee of the best choice for society," it notes.

Research is viewed positively when it solves problems, is relevant to people's lives and useful to society, the report concludes. But "too often though, researchers are perceived to be addressing issues that the public may not necessarily consider as beneficial to society," and the economic benefits of research do not enjoy the same acceptance, it argues. 

"European publics are not questioning the scientific information as much as they are actually questioning the institutions generating it (lost confidence in business, government and the academe)," EURAB continued.  

In a recent paper, the European Policy Centre (EPC), a Brussels-based think-tank, emphasises that in managing risks to the environment and human health, the "best available science" and scientific evidence must be a "key knowledge input" for decision-making at all stages of the regulatory cycle. The paper identifies a series of weaknesses in the EU's current approach to using science for policy and decision-making. 

The think-tank notes that the EU treaties contain no requirement to take account of scientific data when formulating policy unless it is environmental in nature. There is no requirement to base decisions on the "best available science," nor to base legislation on evidence of risk rather than of potential hazards, it continues. Indeed, it is not necessary to produce a comprehensive statement of risk management principles either, it adds.  

The EPC argues that EU guidelines for the collection and use of scientific advice are limited, "not mandatory and do not provide a comprehensive common set of key concepts and definitions for use in the provision of scientific advice". They neglect to include a definition of the quality of information to be used, and findings from major scientific assessments used for policymaking are not subject to peer review, it notes. As a result, the guidelines "do not provide EU institutions with a coherent policy for the use of evidence in decision-making".    

EU guidelines for the selection of scientific advisors are not subject to peer review by external experts, while selection processes are opaque and restrict the use of advisors from outside the EU, the EPC continues.    

The think-tank encourages the European Commission to publish a decision on a new binding policy covering risk analysis in policymaking, and invites the EU executive to establish a new policy for the collection and use of scientific advice for decision-making to be applied by all institutions to all stages of the regulatory cycle and to all sources of scientific advice. In addition, the paper proposes the establishment of a European Academy of Sciences, whose role would consist of advising high-level politicians on the scientific dimension of policy and decision-making. 

Marie-Hélène Fandel, a policy analyst at the EPC, said it was not necessarily a bad thing that politicians were responsible for deciding whether to authorise new technologies, as science is "very rarely neutral" and depends on both the cost-benefit analysis and the risk assessment of products. 

However, she stressed that while the Union's cautious approach to granting market authorisation to new technology applications like GMOs or products derived from nanotechnology has prevented it from suffering any major backlash, it could also mean the bloc is missing out on major opportunities to improve its competitiveness.  

"If we are too cautious, we will not move forward and ultimately risk losing our competitive edge. On the other hand, if there is a significant risk to health or the environment, we should not rush into adopting new technologies. It all depends on our assessments of the benefits and costs, and on whether the risks can be managed," she added. 

Furthermore, with an increasing number of products incorporating new technologies such as nanotech or biotech, "a common approach is needed to avoid a fragmentation of the single market over product safety issues," she argued. It is also important that "decision-makers recognise the risk of inaction if the EU fails to act quickly on new technologies," she said, suggesting that the economic gains and new jobs could benefit Europe's competitors instead (see EurActiv's interview with Marie-Hélène Fandel).

Fandel also suggests that the next Commission should strengthen the advice its President receives on science and technology issues. A new structure, a 'Council of Advisors on Science and Technology', could be set up and closely linked to the Commission President "to help shape the President's vision on the challenges ahead". She notes that many of the existing EU 'knowledge pools' are "away from the centre of action and focus on very technical issues". The new structure could help pool information from these bodies for the President.

David Zaruk, who heads Risk Perception Management, an environmental health risk consultancy, said that in today's policymaking, "facts don't matter that much". "We are not in a knowledge-based society but in an influence-based society." He also said it is difficult to get good science experts to conduct risk assessments of proposed EU policies, as "they are fed up with the fact that their advice is constantly ignored". 

Zaruk also noted that while the 2006 Aho report on creating an innovative Europe stated that the lack of innovation-friendly markets in Europe is hindering investment in R&D, it did not say why society is not embracing innovation. According to Zaruk, people are reluctant to do so due to the emergence of a sort of "eco-religion", which treats precaution and the precautionary principle as a policy tool for those who think that "science has gone too far".

While science was initially seen as saving people from nature's dangers, so-called "eco-religious" people think that nature needs saving from people and science, Zaruk argued.

The European Science Foundation (ESF) said the scientific community is worried that "the inclusion of sociopolitical values in the confirmation practice of science tends to undercut the objectivity of science. For instance, in the field of expertise, science-based advice for political decision-making is in constant danger of becoming identified with one of the warring political factions. By tying its judgments too intimately to certain sociopolitical values, science runs the risk of losing its credibility". 

On the one hand, including sociopolitical values in the assessment procedure "is mandatory for a responsible science. On the other hand, a social bias of science tends to undercut the overarching authority of science which derives from its factual basis," ESF continues. 

"A science tied too intimately with social values might lose the capacity of 'speaking truth to power'. As a result, the increasing politicisation of science might undermine its credibility. To the extent that science enters the social arena and becomes part of political power play, the scientific claims to objectivity and trustworthiness tend to be sapped." 

Andreas Hensel, president of the German Federal Institute for Risk Assessment  (BfR), said: "Science enjoys considerable trust in society: as a source of risk information, it is trusted more than most of the other stakeholders, such as politicians or industry representatives. This in itself suggests one of the prerequisites for trust and credibility in scientific risk assessment: it must be perceived as coming from a neutral entity which makes its assessments independently of day-to-day politics and economic interests."

First, risk assessment should be independent of risk management and, second, transparency is a key requirement for trust and credibility, he explained.  

On GMOs, Hensel said that "food is a particularly sensitive topic because it is essential for life and many consumers believe that making the right choice is important for maintaining their own health." Potential risks from GM plants and food are "perceived as hazardous because they are not seen as being connected to products produced by traditional breeding methods and techniques, which are perceived as 'natural' and therefore safe," he added.  

Researchers at the International Life Sciences Institute  (ILSI) note that "there has been significant public debate about the susceptibility of research to biases of various kinds. The dialogue has extended to the peer-reviewed literature, scientific conferences, the mass media, government advisory bodies, and beyond. Whereas biases can come from myriad sources, the overwhelming focus of the discussion to date has been on industry-funded science". 

Highlighting the critical role that industry has played and will continue to play in the research process, ILSI researchers recently proposed "conflict-of-interest guidelines regarding industry funding to protect the integrity and credibility of the scientific record". The document is intended to prompt "ongoing discussion and refinement".

The guidelines include eight principles, or "ground rules", for industry-sponsored research. These include requirements that scientific investigators control both the study design and the research itself, and that investigators and the appropriate auditors/reviewers be guaranteed access to all data and control of the statistical analysis.

The researchers conclude that, "in the end, management of conflicts of interest, and, for that matter, management of scientific biases altogether is a matter of consensus building, not enforcement". 


  • Feb. 2000:  Commission Communication on the precautionary principle.
  • Jan. 2002: European Environment Agency report on 'Late lessons from early warnings'.
  • March 2005: Launch of SINAPSE (Scientific Information for Policy Support in Europe): a tool for the exchange of information between the scientific community and decision-makers.  
  • 25-30 May 2009: Science and values: the politicisation of science conference.
  • 15 Sept. 2009: Commission President José Manuel Barroso announced his intention to set up a chief scientific adviser to the next Commission.
  • Feb./March 2010: European Environment Agency to publish a follow-up to its 2002 report entitled 'Late lessons from early warnings'.