In recent months, there have been several attacks on the Precautionary Principle and on how the principle has been used in the EU. For example, the scientific advisor to the President of the European Commission, Anne Glover, has accused Commissioners of having “crazy ideas” about the safety of nanotechnologies, genetically modified organisms, shale gas, endocrine disrupters, etc. (See EU twisting facts to fit political agenda, chief scientist says and EU science advisor: ‘Lots of policies are not based on evidence’.) Similarly, it has been argued that policymakers often misuse the precautionary principle.
Steffen Foss Hansen, Associate Professor, DTU Environment, Technical University of Denmark, Kgs. Lyngby, Denmark, and David Gee, retired, formerly Science, Policy and Emerging Issues, European Environment Agency, Copenhagen.
The evidence that we have helped gather for the European Environment Agency (EEA) in Copenhagen shows that cases of misuse of the precautionary principle are rare; that fear of its use is misplaced; and that precautionary actions stimulate innovation. The evidence base is robust. It rests on 34 case studies, covering a thousand years of accumulated experience with diverse chemical and technological agents that posed risks to health and the environment, which the EEA has published in its two volumes of “Late Lessons from Early Warnings” (see Late lessons from early warnings: the precautionary principle 1896–2000 and Late lessons from early warnings: science, precaution, innovation). The case studies are written by more than 80 internationally renowned scientists who analyze the growth of scientific knowledge about early and late warnings of harm and what societies did with that knowledge. They conclude with lessons that could be learned, which may help improve future decision-making. Cases include CFCs and the ozone layer, BSE, PCBs, acid rain, and antibiotics in animal feed in volume 1, and leaded petrol, mercury pollution in Minamata Bay, bisphenol A, floods, alien species, and climate change in volume 2.
The second volume of “Late Lessons” also includes an analysis of “false positives”, i.e. cases where government regulation claimed to have been based on precaution turned out to have been unnecessary or overly restrictive. Our analysis of 88 such cases identified only four robust “false positives”, and none of them had its origins in Europe. In the cases of the swine flu of 1976 and the Southern corn leaf blight of 1970, it was US regulators who took precautionary action against the potential return of a virus and a pest, respectively, that had caused huge human and economic losses in previous years. In the cases of saccharin and food irradiation, it was US regulators who required labelling of saccharin as a potential carcinogen in 1977 and who withdrew the permission to irradiate canned bacon in 1968.
But why is there such a scarcity of genuine false positives compared with the large number of ‘claimed false positives’? Part of the mismatch between the rhetoric and the reality of false positives stems from the “product defense” strategies of companies whose short-term profits are threatened by evidence of harm from their products, as documented in the Late Lessons chapter in volume 2 on the tobacco industry’s response to the evidence on passive smoking.
If research is to replace rhetoric in debates on risks and precaution, then more research into potential hazards is needed, a conclusion also supported by our analyses of research into the potential hazards of emerging chemicals and technologies, conducted for the “Late Lessons” project. (See the recent issue of the Journal of Epidemiology and Community Health.)
In an analysis of 79 environmental health (E&H) journals from 1899 to 2012, our colleague Philippe Grandjean showed that most research had focused on well-known hazards such as the heavy metals, PCBs, and DDT, whilst there was little research into emerging chemicals identified as priorities by the US EPA. In our own analysis we focused on technologies, identifying the amount of funding for environmental health and safety (EHS) research sponsored by the European Commission through its Framework Programmes between 1996 and 2013 (FP4–7), and comparing it with the funding of the research and technological development (RTD) that develops and promotes the new technologies. We found that EHS research corresponded to a mere 0.6% of the overall RTD budget.
More specifically, the overall European public funding of research into the potential hazards of the nano, bio, and information & communication (NBIC) technologies during the FP1–7 programmes was €402 million, or 1.3% of a total RTD budget of some €31 billion.
During FP7 (2007 to 2013), some €3.5 billion was provided for Nanosciences, nanotechnologies, materials and new production technologies, but by 2011 only 25 projects had been given €82 million (2.3%) to study the health and environmental impacts of nanomaterials. These absolute amounts were more than twice the total spent on nanotechnology under FP6, but the RTD/EHS ratio was similar, at 2.1%. Under FP1–7, the total funding for biotechnology was €7.5 billion, of which the EHS component was €273 million (4%); the EHS research budget for ICT was just 0.09% of the overall RTD budget of about €19 billion for ICT.
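The percentages above follow directly from the quoted budget figures. As a minimal sketch, the arithmetic can be checked as follows; the figures are those cited in the text, while the variable names and table layout are ours, introduced only for illustration (the 4% quoted for biotechnology is rounded to the nearest whole percent):

```python
# Recompute the EHS/RTD funding ratios from the figures quoted in the text.
# Amounts are in millions of euros: (EHS funding, total RTD budget).
budgets = {
    "nanotech, FP7": (82, 3_500),
    "biotech, FP1-7": (273, 7_500),
    "NBIC overall, FP1-7": (402, 31_000),
}

for area, (ehs, rtd) in budgets.items():
    ratio = 100 * ehs / rtd
    print(f"{area}: EHS research was {ratio:.1f}% of the RTD budget")
```

Running this reproduces the 2.3% nanotechnology and 1.3% NBIC figures given above, and 3.6% for biotechnology before rounding.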
The low EHS research ratio seems to be the result of a number of factors: an unintended consequence of disparate funding decisions; technological optimism; a priori assertions of safety; collective hubris; and myopia.
But what would be a prudent RTD/EHS research ratio? In order to avoid the costly consequences of inadequate anticipatory EHS research, which can include the premature decline of promising technologies, we suggest that a prudent RTD/EHS ratio for NBIC technologies would lie somewhere between 5% and 15%, depending on intrinsic hazard potential; plausible exposure scenarios; and such characteristics of the technologies as their novelty, persistence, bioaccumulation potential, and spatial range. The Netherlands, for example, has decided to devote some 15% of its research budget to the EHS of nanotechnology on the basis of such considerations.
Scientific evidence is one of the fundamental prerequisites for implementing appropriate precautionary action, but EU public research into the potential EHS impacts of emerging technologies and chemicals does not seem sufficient to inform decision-makers in time both to anticipate and avoid potential risks and to ensure the commercial longevity of promising products and technologies.
Perhaps the corporations that are developing consumer chemicals and the NBIC technologies are funding prudent quantities of EHS research? So far, however, there is little transparency about these research activities, as the UK House of Lords discovered when investigating the use of nanotechnologies in food. Because society has failed to ensure that any health or ecosystem damage is rectified by the risk creators, there is little economic incentive for corporations to research the potential hazards of their technologies. The histories of hazards described in the Late Lessons reports are likely to be repeated unless there is timely and sufficient anticipatory research into the potential hazards of emerging chemicals and technologies, research that could encourage more responsible innovation aimed at meeting the sustainability challenges of our times.