"We need a fundamental review of the way European institutions access and use scientific advice," said President of the European Commission, José Manuel Barroso. "In the next Commission, I want to set up a chief scientific adviser who has the power to deliver proactive, scientific advice throughout all stages of policy development and delivery. This will reflect the central importance I attach to research and innovation," he added.
In a recent report, the European Research Advisory Board (EURAB) asked whether research was part of the solution on social issues or part of the problem. While European policymakers are promoting research and innovation "as the saviour of Europe, bringing anticipated benefits, increased competitiveness, prosperity and better jobs," people are increasingly doubtful that a "scientific assessment of risks and benefits with decisions made solely by the experts is necessarily a guarantee of the best choice for society," it notes.
Research is viewed positively when it solves problems, is relevant to people's lives and useful to society, the report concludes. But "too often though, researchers are perceived to be addressing issues that the public may not necessarily consider as beneficial to society," and the economic benefits of research do not enjoy the same acceptance, it argues.
"European publics are not questioning the scientific information as much as they are actually questioning the institutions generating it (lost confidence in business, government and the academe)," EURAB continued.
The European Policy Centre (EPC), a Brussels-based think-tank, emphasises that in managing risks to the environment and human health, the "best available science" and scientific evidence must be a "key knowledge input" for decision-making at all stages of the regulatory cycle. The paper identifies a series of weaknesses in the EU's current approach to using science for policy and decision-making.
The think-tank notes that the EU treaties contain no requirement to take account of scientific data when formulating policy unless it is environmental in nature. There is no requirement to base decisions on the "best available science," nor to base legislation on evidence of risk rather than of potential hazards, it continues. Indeed, it is not necessary to produce a comprehensive statement of risk management principles either, it adds.
The EPC argues that EU guidelines for the collection and use of scientific advice are limited, "not mandatory and do not provide a comprehensive common set of key concepts and definitions for use in the provision of scientific advice". They neglect to include a definition of the quality of information to be used, and findings from major scientific assessments used for policymaking are not subject to peer review, it notes. As a result, the guidelines "do not provide EU institutions with a coherent policy for the use of evidence in decision-making".
EU guidelines for the selection of scientific advisors are not subject to peer review by external experts, while selection processes are opaque and restrict the use of advisors from outside the EU, the EPC continues.
The think-tank encourages the European Commission to publish a decision on a new binding policy covering risk analysis in policymaking, and invites the EU executive to establish a new policy for the collection and use of scientific advice for decision-making to be applied by all institutions to all stages of the regulatory cycle and to all sources of scientific advice. In addition, the paper proposes the establishment of a European Academy of Sciences, whose role would consist of advising high-level politicians on the scientific dimension of policy and decision-making.
Marie-Hélène Fandel, a policy analyst at the EPC, said it was not necessarily a bad thing that politicians were responsible for deciding whether to authorise new technologies, as science is "very rarely neutral" and depends on both the cost-benefit analysis and the risk assessment of products.
However, she stressed that while the Union's cautious approach to granting market authorisation to new technology applications like GMOs or products derived from nanotechnology has prevented it from suffering any major backlash, it could also mean the bloc is missing out on major opportunities to improve its competitiveness.
"If we are too cautious, we will not move forward and ultimately risk losing our competitive edge. On the other hand, if there is a significant risk to health or the environment, we should not rush into adopting new technologies. It all depends on our assessments of the benefits and costs, and on whether the risks can be managed," she added.
Furthermore, with an increasing number of products incorporating new technologies such as nanotech or biotech, "a common approach is needed to avoid a fragmentation of the single market over product safety issues," she argued. It is also important that "decision-makers recognise the risk of inaction if the EU fails to act quickly on new technologies," she said, suggesting that the economic gains and new jobs could otherwise benefit Europe's competitors instead (see EurActiv's interview with Marie-Hélène Fandel).
Fandel also suggests that the next Commission should strengthen the advice its President receives on science and technology issues. A new structure, a 'Council of Advisors on Science and Technology', could be set up and closely linked to the Commission President "to help shape the President's vision on the challenges ahead". She notes that many of the existing EU 'knowledge pools' are "away from the centre of action and focus on very technical issues". The new structure could help pool information from these bodies for the President.
David Zaruk, who heads Risk Perception Management, an environmental health risk consultancy, said that in today's policymaking, "facts don't matter that much". "We are not in a knowledge-based society but in an influence-based society." He also said it is difficult to get good science experts to conduct risk assessments of proposed EU policies, as "they are fed up with the fact that their advice is constantly ignored".
Zaruk also noted that while the 2006 Aho report on creating an innovative Europe stated that the lack of innovation-friendly markets in Europe is hindering investment in R&D, the report did not say why society is not embracing innovation. According to Zaruk, people are reluctant to do so due to the emergence of a sort of "eco-religion", which treats precaution and the precautionary principle as a policy tool for those who think that "science has gone too far".
While science was initially seen as saving people from nature's dangers, so-called "eco-religious" people think that nature needs saving from people and science, Zaruk argued.
The European Science Foundation (ESF) said the scientific community is worried that "the inclusion of sociopolitical values in the confirmation practice of science tends to undercut the objectivity of science. For instance, in the field of expertise, science-based advice for political decision-making is in constant danger of becoming identified with one of the warring political factions. By tying its judgments too intimately to certain sociopolitical values, science runs the risk of losing its credibility".
On the one hand, including sociopolitical values in the assessment procedure "is mandatory for a responsible science. On the other hand, a social bias of science tends to undercut the overarching authority of science which derives from its factual basis," ESF continues.
"A science tied too intimately with social values might lose the capacity of 'speaking truth to power'. As a result, the increasing politicisation of science might undermine its credibility. To the extent that science enters the social arena and becomes part of political power play, the scientific claims to objectivity and trustworthiness tend to be sapped."
Andreas Hensel, president of the German Federal Institute for Risk Assessment (BfR), said: "Science enjoys considerable trust in society: as a source of risk information, it is trusted more than most of the other stakeholders, such as politicians or industry representatives. This in itself suggests one of the prerequisites for trust and credibility in scientific risk assessment: it must be perceived as coming from a neutral entity which makes its assessments independently of day-to-day politics and economic interests."
First, risk assessment should be independent of risk management and, second, transparency is a key requirement for trust and credibility, he explained.
On GMOs, Hensel said that "food is a particularly sensitive topic because it is essential for life and many consumers believe that making the right choice is important for maintaining their own health." Potential risks from GM plants and food are "perceived as hazardous because they are not seen as being connected to products produced by traditional breeding methods and techniques, which are perceived as 'natural' and therefore safe," he added.
Researchers at the International Life Sciences Institute (ILSI) note that "there has been significant public debate about the susceptibility of research to biases of various kinds. The dialogue has extended to the peer-reviewed literature, scientific conferences, the mass media, government advisory bodies, and beyond. Whereas biases can come from myriad sources, the overwhelming focus of the discussion to date has been on industry-funded science".
Highlighting the critical role that industry has played and will continue to play in the research process, ILSI researchers recently proposed "conflict-of-interest guidelines regarding industry funding to protect the integrity and credibility of the scientific record". The document is intended to prompt "ongoing discussion and refinement".
The guidelines include eight principles or "ground rules" for industry-sponsored research. Among them are requirements that scientific investigators control both the study design and the research itself, and that investigators and appropriate auditors/reviewers be guaranteed access to all data and control of the statistical analysis.
The researchers conclude that, "in the end, management of conflicts of interest, and, for that matter, management of scientific biases altogether is a matter of consensus building, not enforcement".