When caution trumps opportunity

Bio-safety regimes empower officials over farmers. Scarce public resources are wasted in surveillance and control.

Updated: February 24, 2014 11:30 pm
[Photo caption] With GMOs, as with climate science, there coexists a core of scientific consensus and persistence of denialism in the scientific community. (Reuters)

Ronald J. Herring


Genetic engineering in agriculture raises contentious politics unknown in other applications of molecular technology. The pivot is risk. In pharmaceuticals, medicine and industrial applications, recombinant DNA technology has been widely accepted as providing useful tools; in agriculture, products using these same tools have been coded as producing “GMOs”, evoking almost universally an aura of unique risk and special regulation. Science is both invoked and attacked over whether it provides sufficient evidence of the safety of GMOs.

Though historically associated with wealthy economies, genetically engineered crops grown in “developing countries” in 2012 exceeded, for the first time, the total acreage grown in the so-called developed countries. India was the 16th country to approve a genetically engineered crop: Bt cotton, in 2002. Controversy over India’s second Bt crop — brinjal — was intense, centred on the adequacy of science in assessing risk. Risk is an elastic and elusive concept: it is part of everyday life, but seldom formalised in common use.

In normal science, risk has a precise but deceptively simple meaning: risk equals the probability of some hazard. Anyone booking a flight, taking prescription drugs or scheduling surgery recognises potential hazards. We regularly take some risks because of expected benefits, or because the risk of doing nothing is higher. The question is always: compared to what? Ideally, regulation of any technology would reach some threshold of acceptable risk — balanced with benefits — for a whole society.

Conceptually simple, these comparisons are devilishly difficult. Often neither hazard nor probability is known or measurable. The economist Frank Knight wrote in the 1920s that this situation is one of uncertainty, not risk. In the world of uncertainty, risk is of necessity a social construction. The common cellphone is a good example: there is some evidence of hazard, no proof of hazard and no estimate of probabilities, but such obvious utility that hypothetical risk is discounted by nearly everyone.

Science cannot assess uncertainty, nor determine appropriate risk preferences; these are of necessity political decisions. For agricultural biotechnology, the precondition for risk regulation would ask of science: do transgenic plants produce more hazards than cultivars bred by other means? Though there may well be new hazards, none has been demonstrated in mainstream science to date.

The European Commission Directorate-General for Research assessed available regulatory science for environmental and food-safety risks in A Decade of EU-funded GMO Research (2001-2010): “The main conclusion to be drawn from the efforts of more than 130 research projects, covering a period of more than 25 years of research, and involving more than 500 independent research groups, is that biotechnology, and in particular GMOs, are not per se more risky than, [for example], conventional plant breeding technologies” (page 16).

Like China and Brazil, India supported biotechnology for its potential benefits and established institutions of state science to assess risks. Bt cotton utilised an insecticidal protein derived from a common soil bacterium, Bacillus thuringiensis, hence “Bt”, to control pests in cotton with less pesticide; the results dramatically demonstrated benefits. The same technology applied to brinjal raised the risk bar because it is a food crop. The Genetic Engineering Approval Committee (GEAC), after nine years of tests involving seven government agencies and departments, approved release of the crop by both the private and public sectors, based on comparative assessment of options. Hazards to both farmers and consumers were documented in current practices of heavy pesticide application — some unapproved for food crops. No hazards from the insecticidal protein were found through standard safety protocols; GEAC findings conformed to the EU’s general conclusions.

This statutory state science was not decisive, however. Then minister for environment, Jairam Ramesh, concluded that the GEAC studies were inadequate; risks to food safety and the environment were posited. Neither hypothetical risk was explicitly compared to known hazards of existing practices; uncertainty trumped demonstrable risk. Food safety was the most telling example. Only one study, not peer-reviewed and funded by an international campaigner against biotechnology, was cited as evidence of hazard: organ damage and death.

However, Professor Gilles-Eric Séralini’s claims about Bt proteins had been rejected by the GMO Panel of the European Food Safety Authority; his most recent article positing cancer risks was retracted after publication by Food and Chemical Toxicology — a rare and embarrassing step for a journal. With GMOs, as with climate science, there coexists a core of scientific consensus and persistence of denialism in the scientific community.

The minister’s logic was precautionary, consistent with one line of international practice. This is a global pattern of conflicting logics of developmental states and precautionary states. Developmental states embrace some uncertainty: continuation of the status quo likewise entails uncertainties, and often known hazards. Precautionary states, in contrast, privilege caution over opportunity; uncertainty is coded as unacceptable risk. States are divided between these approaches; the location of official science in the state then fundamentally affects outcomes. Ministries of environment tend to be preservationist, hence precautionary. Ministries of agriculture or science and technology have different missions, more in common with the logic of developmentalism. Exactly this division appeared in India over Bt brinjal; as environment held the decisive voice, the crop was not approved.

This dialectic of risk and benefit encounters the Goldilocks Paradox of all regulation; the level of cautionary restriction should be not too much, not too little, but just right. Excessive regulation is suffocating and adverse to equity. Too little precaution might produce hazards that entail unacceptable risk. The strictest regulation enables multinational life science corporations with capacity and connections to win at the expense of small firms and public science. Bio-safety regimes empower officials over farmers; scarce public resources are wasted in surveillance and control rather than innovation. Investment in both public and private sectors is depressed.

If effective technologies are blocked, agriculture is needlessly crippled. It is ethically difficult to justify depriving farmers of the same technical progress urban people take for granted. Finally, with climate change continually producing new challenges to agriculture, ruling out any tools for response is itself a risky proposition.

The writer, professor of government at Cornell University, edited the book ‘Transgenics and the Poor: Biotechnology in Development Studies’
