On the Perception of Risk and Benefit

 

B. Contestabile         admin@socrethics.com         First version 2008   Last version 2013

 

 

 

 

Table of contents

 

Abstract

 

1.   Introduction

2.   Natural Risks

      2.1  Asteroid Impacts

      2.2  Earthquakes

      2.3  Explosive Volcanic Eruptions

      2.4  Diseases and Injuries

3.   Technological Risks

      3.1  Nuclear Technology

      3.2  Bio-Technology

      3.3  Environmental Risks

      3.4  Road Traffic Accidents

4.   The Theory of Risk Perception

      4.1  Basics of Risk and Benefit

      4.2  The Psychometric Paradigm

      4.3  Cultural Theory

5.   Distorted Perceptions

      5.1  The Underestimation of Natural Risks

      5.2  The Preference for Natural Risks

      5.3  The Preference for Familiar Risks

      5.4  The Underestimation of Technological Risks

6.   Conclusion

 

References

 

 

 

 

 

Abstract

 

 

Starting point

The starting point of this paper is the observation that natural risks are systematically underestimated. There also seems to be an irrational preference for natural risks over technological risks.

 

 

Type of Problem

- How can the preference for natural risks be explained?

- Are technological risks, like natural risks, systematically underestimated?

 

 

The preference for natural risks

The preference for natural risks is rational insofar as it responds to the increasing magnitude and complexity of technological risks (whereas natural risks remain constant).

 

According to the theory of risk perception, an irrational preference for natural risks can be explained as follows:

- The technology in question is loaded with negative associations. This creates an (unconscious) negative disposition.

- The negative disposition is rationalized by assigning a low benefit and a high risk to the technology.

Sometimes an irrational preference for natural risks can also be explained by traditions which emphasize the divine origin of nature.

 

The preference for natural risks is a special case within the preference for familiar risks. As soon as technological risks become familiar, they are treated like natural risks.

 

 

The underestimation of technological risks

There are strong indications that many technological risks, like natural risks, are systematically underestimated.

 

Technologies involving unfamiliar risks are often imposed by threats of war and competition. The struggle for (economic) survival explains why technological innovation accelerates even in areas of high risk.

 

The combined effects of risk-tolerance and underestimation add up to a considerable risk of self-destruction. In this respect the relation between risk and benefit (in terms of survival) worsened in the 20th century.

 

 

 

 

 

1. Introduction

 

 

Starting point

The starting point of this paper is the observation that natural risks are systematically underestimated. There also seems to be an irrational preference for natural risks over technological risks.

 

 

Type of Problem

1.      How can the preference for natural risks be explained?

2.      Are technological risks, like natural risks, systematically underestimated?

 

 

 

2. Natural Risks

 

 

2.1 Asteroid Impacts

 

 

Definition

1.      Asteroids, also called minor planets or planetoids, are a class of astronomical objects. The term asteroid is generally used to indicate a diverse group of small celestial bodies in the solar system that orbit the Sun. Hundreds of thousands of asteroids have been discovered within the solar system; the present rate of discovery is around 5,000 per month (Asteroid, Wikipedia).

2.      Asteroids are composed of rocky material. Most of them stay at a safe distance from Earth, between the orbits of Mars and Jupiter. But some, the so-called near-Earth objects (NEOs), follow orbits that can intersect Earth's. [Rees, 90]

 

 

Example

About sixty-five million years ago Earth was hit by an object about ten kilometers across. The resultant impact released as much energy as a million H-bombs; it triggered mountain-shattering earthquakes around the world and colossal tidal waves; it threw enough debris into the upper atmosphere to block out the Sun for more than a year. This is believed to have been the event that wiped out the dinosaurs. [Rees, 90]

 

 

Probability

1.      There is a fifty percent risk of a Tunguska-scale impact (an asteroid a few tens of meters across, releasing about 1000 times the energy of the bomb dropped on Hiroshima) somewhere on Earth this century, and a far smaller chance of an impact on a densely populated region.

2.      The dominant risk is not from Tunguska-scale events, but from rarer impacts that would each devastate a larger area. There is about one chance in ten thousand that within the next fifty years an asteroid half a kilometer across will crash into the North Atlantic or Pacific, causing giant tsunamis. (…) The probability that we’ll end our lives in such an event is about the same as the average person’s risk of dying in an air crash. [Rees, 92]

3.      An asteroid measuring over 1,000 meters in diameter is potentially capable of destroying human civilization. The chance of a major asteroid impact in the 21st century is a mere 0.0002 percent, although there is a 2 percent probability of Earth colliding with a 100-meter asteroid before the year 2100 (from Deflecting Asteroids Difficult but Possible).

 

Astronomers are confident that they know all of the potentially world-destroying asteroids. There is a test mission to smash a spacecraft into a harmless 800-metre asteroid in an attempt to deflect it. Dealing with larger bodies would likely require a nuclear explosion. Concerning smaller asteroids, Martin Rees uses an analogy with the calculation of an insurance premium: we have to multiply the risk by the cost of the damage. By that measure, it would be worth spending (disputed) 25 times as much as is allocated now [Aron].
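A minimal sketch of this expected-loss reasoning in Python (all numbers are hypothetical placeholders, not figures from the cited sources):

```python
# Rees's insurance-premium analogy: justified spending is proportional to
# probability times cost of damage. All inputs are illustrative assumptions.

annual_impact_probability = 1e-4   # assumed chance of a damaging impact per year
damage_cost = 4e12                 # assumed damage in dollars
current_budget = 1.6e7             # assumed current annual spending on detection

expected_annual_loss = annual_impact_probability * damage_cost
print(f"Expected annual loss: ${expected_annual_loss:,.0f}")
print(f"Ratio to current budget: {expected_annual_loss / current_budget:.0f}x")
```

With these placeholder numbers the ratio comes out at 25; the point is the multiplication of probability by damage, not the particular inputs.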

 

 

 

2.2 Earthquakes

 

 

Definition

1.      An earthquake is the result of a sudden release of energy in the Earth's crust that creates seismic waves.

2.      The magnitude of an earthquake is conventionally reported on the moment magnitude scale or the related Richter scale, with magnitude 3 or lower earthquakes being mostly imperceptible and magnitude 7 causing serious damage over large areas.

3.      At the Earth's surface, earthquakes manifest themselves by a shaking and sometimes displacement of the ground. When a large earthquake epicenter is located offshore, the seabed sometimes suffers sufficient displacement to cause a tsunami. The shaking in earthquakes can also trigger landslides and occasionally volcanic activity (…) Earthquakes are caused mostly by rupture of geological faults, but also by volcanic activity, landslides, mine blasts, and nuclear experiments.

(Earthquake, Wikipedia)

 

 

Examples

See List of earthquakes

 

 

Probability

1)                 Large earthquakes occur less frequently than small ones, the relationship being exponential (see the sketch after this list); for example, roughly ten times as many earthquakes larger than (Richter) magnitude 4 occur in a particular time period as earthquakes larger than magnitude 5. In the (low-seismicity) United Kingdom, for example, it has been calculated that the average recurrences are:

a)      an earthquake of 3.7 - 4.6 every year

b)      an earthquake of 4.7 - 5.5 every 10 years

c)      an earthquake of 5.6 or larger every 100 years.

With the rapid growth of mega-cities such as Mexico City, Tokyo or Tehran in areas of high seismic risk, some seismologists warn that a single quake may claim the lives of up to 3 million people (Earthquake, Wikipedia).

 

2)      The worst localized natural catastrophe that could be deemed probable in this century would be an earthquake in Tokyo or Los Angeles, where the immediate devastation would have a long-term “fallout” for the world’s economy [Rees, 92]
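The frequency-magnitude relationship in point 1 is the Gutenberg-Richter law. A minimal sketch (the a-value below is a hypothetical constant, chosen only to reproduce the UK recurrence figures approximately):

```python
# Gutenberg-Richter law: log10 N(M) = a - b*M, where N(M) is the annual
# number of earthquakes of magnitude >= M. With b ~ 1, each unit of
# magnitude reduces the count roughly tenfold, as stated above.

def annual_count(magnitude: float, a: float = 3.7, b: float = 1.0) -> float:
    """Expected earthquakes per year with magnitude >= `magnitude` (a is a placeholder)."""
    return 10 ** (a - b * magnitude)

for m in (3.7, 4.7, 5.6):
    n = annual_count(m)
    print(f"M >= {m}: about {n:.2f} per year, i.e. one every ~{1 / n:.0f} years")
```

With a = 3.7 this reproduces the quoted recurrences of roughly one, ten and a hundred years.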

 

 

 

2.3 Explosive Volcanic Eruptions

 

 

Definition

1)      An explosive volcanic eruption is driven by gas, including water vapour, accumulating under great pressure. As the hot rising magma interacts with the ground water, the pressure increases until it bursts violently through the overlying mantle of rock.

2)      This is merely the beginning. In many cases the rising magma will have vast quantities of gas dispersed through it, partially dissolved, held only by the enormous pressure. Sometimes there is a lava plug blocking the conduit to the summit, and when this occurs, eruptions are even more violent. With the sudden release of pressure following the initial explosion this gas resumes its gaseous form, violently and explosively. This secondary explosion is often far more violent than the first one; the rocks, dust, gas and pyroclastic material may be blown 20 km into the atmosphere at a rate of up to 100,000 tons per second, traveling at several hundred meters per second.

3)      Sooner or later this cloud collapses, almost as violently, creating a pyroclastic flow, the killer cloud of hot volcanic matter.

(Explosive Eruption, Wikipedia)

Examples:

1)      Krakatoa in 1883

2)      Mount St. Helens in 1980

3)      Mount Pinatubo in 1991

 

 

Supervolcanoes

Volcanic eruptions include a rare class of “supereruptions”, thousands of times larger than the eruption of Krakatoa in 1883. A supereruption in northern Sumatra seventy thousand years ago left a one-hundred-kilometer crater and ejected several thousand cubic kilometers of ash, enough to have blocked out the sun for a year or more [Rees, 97].

 

The term supervolcano refers to any volcano capable of throwing out at least 300 cubic kilometers of magma during an eruption. (…) One of the most recent was the Toba eruption, 74,000 years ago. A medium-sized super-eruption, releasing 1000 cubic kilometers of magma, would wreak the same devastation as a 1-kilometre-wide asteroid smashing into the Earth. (…) Previous super-eruptions have been linked to mass extinction events [Ravilious, 32].

 

 

Probability

Eruptions of the scale of Mt Pinatubo occur roughly every 100 years. The impact of such eruptions needs to be taken into account when assessing long-term climate trends. Eruptions are sporadic, unpredictable and vary in size. Ash and gases vary in composition. These factors present a challenge to climatologists seeking to isolate volcanic effects from other influences on past global climate.

There is a reasonably high probability that the temperature record will be influenced by volcanic activity in the next decade.

(Volcanic eruptions and climate change, by CSIRO Atmospheric Research)

 

A super-eruption is 5 to 10 times more likely than an asteroid strike. At least one supervolcano explodes every 100,000 years or so, the geological record suggests [Ravilious, 32].
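Under the common simplifying assumption that such eruptions arrive as a Poisson process, a recurrence interval translates into a per-century probability as follows (a sketch; the 100,000-year interval is the only input taken from the text):

```python
import math

# Poisson model: events with mean recurrence interval T occur at least once
# in a window of length t with probability 1 - exp(-t/T).

recurrence_interval = 100_000   # years, from the geological record cited above
window = 100                    # one century

p_century = 1 - math.exp(-window / recurrence_interval)
print(f"P(super-eruption within {window} years) = {p_century:.4%}")  # about 0.10%
```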

 

 

 

2.4 Diseases and Injuries

 

 

 

Updated projections of global mortality and burden of disease, 2002-2030: data sources, methods and results (WHO, 2005)

 

 

1.      The 20 leading causes of death account for about 2/3 of the total in 2002.

2.      Injuries account for about 8% of the total in 2002.

3.      Five times more people die each year from treatment errors and infections in German hospitals than in traffic accidents (faz.net, 24.02.17).

 

 

Probability

If the 2002-2007 environment remains stable then

1.      the vast majority of people die from well-known diseases.

2.      the only technological risk that concerns the majority is road traffic accidents.

The stability of the system, however, is far from guaranteed. The following chapters outline some of the major destabilizing risks:

 

 

 

 

3. Technological Risks

 

 

 

3.1 Nuclear Technology

 

 

Nuclear reactors

Examples of nuclear disasters:

         Three Mile Island

         Windscale Fire (Sellafield)

         Fukushima

Designers of nuclear reactors aimed to reduce the probability of the worst accidents to less than one per million “reactor years”. To do such calculations, all possible combinations of mishaps and subsystem failures have to be included. Among these is the possibility that a large aircraft might crash onto the containment vessel (…) The chance that one of them would hit a particular building is reassuringly low, much less than one in a million per year. But we now know that this is not the right calculation. It overlooks the possibility that kamikaze-style terrorists could aim for just such a target. The chance of such an event cannot be assessed even by the most astute engineers [Rees, 46].
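For the accident-frequency side of such calculations (terrorism aside), the one-in-a-million-reactor-years target scales with fleet size roughly as follows (a sketch; the fleet size and time horizon are illustrative assumptions):

```python
# Probability of at least one worst-case accident across a reactor fleet,
# assuming independent failures at a constant rate per reactor-year.

p_per_reactor_year = 1e-6   # design target quoted above
reactors = 400              # assumed fleet size (illustrative)
years = 50                  # assumed time horizon (illustrative)

p_at_least_one = 1 - (1 - p_per_reactor_year) ** (reactors * years)
print(f"P(at least one accident in {years} years) = {p_at_least_one:.2%}")  # ~2%
```

The point of the Rees passage is that such calculations only cover accidents; deliberate attacks fall outside the model.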

 

 

Nuclear war

The destructive potential of existing nuclear weapons is such that life on earth could be annihilated several times over. Smoke generated by even a “small” nuclear war would provoke deadly and widespread climatic disruption (see Climate catastrophe).

The Cuban Missile Crisis in 1962 was not only the most dangerous moment of the Cold War, it was the most dangerous moment in human history [Rees, 26]. Robert McNamara, former US Secretary of Defense, estimated that the probability of nuclear war was substantially higher than one in six [Rees, 28], a probability reminiscent of Russian roulette.
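To see why one in six recalls Russian roulette: at that risk per crisis, the chance of avoiding war falls off quickly over repeated crises (a sketch; the crisis counts are illustrative assumptions):

```python
# Cumulative probability of avoiding nuclear war under repeated
# one-in-six risks, as in Russian roulette. Crisis counts are hypothetical.

p_war_per_crisis = 1 / 6
for crises in (1, 3, 6, 10):
    p_survival = (1 - p_war_per_crisis) ** crises
    print(f"after {crises} crises: P(no war) = {p_survival:.0%}")
```

After six such crises the survival probability is already down to about one third.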

 

The two controversial ethical positions in the Cold War were characterized by the following slogans:

1)      Rather red (communist) than dead, a slogan which addresses the willingness to survive at any price. From this point of view nuclear armament is an indefensible risk.

2)      Rather be dead than live in slavery (Schiller, William Tell), a slogan which addresses the willingness to pay any price for freedom. In the Cold War, the term freedom was closely tied to the terms capitalism and human rights. Cultural values cannot be preserved without economic and military power. From this point of view nuclear armament is a moral obligation.

Nuclear deterrence is a fragile equilibrium, and the above moral dilemma may soon be reactivated by a new nuclear power or the emergence of nuclear terror.

 

Although the Cold War ended in the early 1990s, the doctrine of Mutually Assured Destruction (MAD) remains in force, although it has receded from public discourse. The payoff of this doctrine is expected to be a tense but stable peace.

Critics of the MAD doctrine note the similarity between the acronym and the common word for mental illness. The doctrine of nuclear deterrence depends on several challengeable assumptions (Mutually Assured Destruction, Wikipedia).

The following example illustrates the risks of the MAD doctrine:

On May 23, 1967, the US Air Force prepared aircraft for war, thinking the nation’s surveillance radars in polar regions were being jammed by the Soviet Union. Just in time, military space weather forecasters conveyed information about the solar storm’s potential to disrupt radar and radio communications (1967 Solar Storm, American Geophysical Union). If the aircraft had been launched, the Soviets, who constantly monitored such movements, would have acted in the same way. And since the solar storm also jammed radio communication, the jets could not have been recalled.

 

 

Nuclear terror

The proliferation of nuclear know-how increases the risk of nuclear terror (see Nuclear proliferation, Wikipedia).

Example: Between 1993 and 2008 more than 1300 illegal deals in radioactive material were detected. In 16 cases weapons-grade uranium or plutonium was found [Tages-Anzeiger, March 1, 2008, p.40].

The increasing risk of terror requires a reconsideration of most risk calculations. Events like the September 11 attack have shown that it is not sufficient to calculate the worst case on the basis of possible accidents.

 

 

Particle physics

The world could theoretically be destroyed by experiments in elementary particle physics [Leslie]. Physicists at CERN

1.      calculate the risk of a complete destruction (see The experiment that could blow up the planet) on the basis of current theory

2.      at the same time expect fundamental new insights which could overturn the current theory.

The possibility of the Large Hadron Collider (LHC) destroying the earth could yet be debated in court (…)

One question a court can investigate is how likely it is that the theoretical underpinnings of the scientific work are defective. Those seeking an injunction could, for example, ask a court to consider the history of shifting arguments for why the LHC is safe. In 1999 physicists said no particle accelerator for the foreseeable future would have the power to create a black hole. But theoretical work published in 2001 showed that if hidden extra dimensions in space-time did exist, the LHC might create black holes after all. Thereafter, the argument for safety was changed. In 2003, it said that any black holes created would instantly evaporate. But when subsequent theoretical work suggested otherwise, the argument changed again [Johnson].

 

 

 

3.2 Bio-Technology

 

 

Bio-attacks

For decades several nations have had substantial and largely secret programs to develop chemical and biological weapons. There is ever-growing expertise in designing and dispersing lethal pathogens.

Example: Sverdlovsk anthrax leak (1979)

 

The problem of detecting illicit fabrication of nuclear weapons is easy compared with the task of verifying national compliance with treaties on chemical and biological weapons. And even that is easy compared with the challenge of monitoring subnational groups and individuals. Biological and chemical warfare were long regarded as cheap options for states without nuclear weapons. But it no longer requires a state, or even a large organization, to mount a catastrophic attack: the resources needed could be acquired by private individuals.

Examples:

1.      Sarin gas attack on the Tokyo subway (1995)

2.      American anthrax attacks of 2001

 

The manufacture of lethal chemicals or toxins requires modest-scale equipment that is, moreover, essentially the same as is needed for medical or agricultural programs. The knowledge and techniques for making biological superweapons will become dispersed among hospital laboratories, agricultural research institutes and peaceful factories everywhere. Only an oppressive police state could assure total government control over such novel tools for mass destruction (…). Thousands of individuals, perhaps even millions, may someday acquire the capability to disseminate weapons that could cause widespread (even worldwide) epidemics. A few adherents of a death-seeking cult, or even a single embittered individual, could unleash an attack [Rees, 47-49].

 

For infectious diseases, initial dispersal is less crucial than for anthrax (which cannot be passed on from person to person); even a localized release, especially in a mobile population, could trigger a widespread epidemic [Rees, 51].

One feature common to all biological attacks is that they cannot be detected until it is too late, perhaps not before the effects have diffused worldwide (…). This delay is an attraction to the lone dissident or terrorist, because the provenance of an attack can be readily camouflaged [Rees, 53].

 

 

Engineered viruses

Four factors determine the severity of any disease outbreak:

1.      How deadly it is

2.      How easily it spreads from person to person

3.      If and how long a person is infectious before symptoms appear

4.      Whether it can be prevented by vaccines, treatments or both

[Young, 30]

There is one virus that might be able to acquire all four traits almost instantaneously: flu. The question is whether mutation, RNA reassortment or both could create a form of flu that combines the deadliness of some bird flu strains with the infectiousness of the strains that circulate in people [Young, 32]. But it is not only wild viruses we need to worry about. Engineered viruses might start a pandemic if there was a slip-up in the lab. The 1977 flu pandemic was probably caused by the escape of an old human flu strain from a lab (…). In 2001 biologists accidentally created an extremely deadly version of a rabbit virus. In 2003 “biodefense” researchers in the US deliberately altered a mouse virus in the same way [Young, 33].

 

All pre-2000 epidemics (with the possible exception of the Russian anthrax release) were caused by naturally occurring pathogens. But the bio-threat has been aggravated by the advance of biotechnology. Just a few individuals with specialized skills and access to a laboratory could inexpensively and easily produce a panoply of lethal biological weapons that might seriously threaten the US population. Moreover, they could manufacture such biological agents with commercially available equipment and therefore remain inconspicuous (…). A skilled “loner” could perpetrate a catastrophic epidemic, even though the focus is now on terrorist groups.

Example: The American mathematician Theodore Kaczynski (known as the Unabomber)

 

All over the world there are people with the expertise to undertake genetic manipulations and cultivate microorganisms [Rees, 54].

Within a few years, the genetic blueprints of vast numbers of viruses will be archived in laboratory databases accessible to other scientists via the Internet. The blueprint of the Ebola virus, for example, is already archived; there are thousands of people with the skills to assemble it, using strands of DNA that are available commercially (…). Creation of “designer viruses” is a burgeoning technology. And a better understanding of the human immune system, though of crucial medical benefit, will also make it easier for those who wish to suppress immunity (…). Strains of bacteria can be developed that are immune to antibiotics (…). We may not have to wait long before new kinds of synthetic microbes are being genetically engineered. If this technique works, it opens up the prospect of designing new forms of life that could feed off other materials in our environment [Rees, 55-57].

 

 

Conclusion

The Cold War exposed us to graver risks than most would knowingly have accepted. The danger of nuclear devastation still looms, but threats stemming from new science are even more intractable [Rees, 25].

 

 

 

3.3 Environmental Risks

 

The following diagnoses are taken from Earth’s nine lives [Pearce]:

 

Acid oceans

Diagnosis: Safe for now, but some oceans will cross the threshold by mid-century

 

Fresh water

Diagnosis: Boundary will be approached by mid-century

 

Nitrogen and phosphorus cycles

Diagnosis: Boundary not yet exceeded

 

Land use

Diagnosis: Boundary will be approached by mid-century

 

Aerosol loading

Diagnosis: Risk unknown

 

Chemical pollution

Diagnosis: Risk unknown

 

Biodiversity

Diagnosis: Boundary far exceeded

Individual species may not matter much on their own, but collectively they form ecosystems that provide a range of vital services, such as recycling waste, cleaning water, absorbing carbon and maintaining the chemistry of the oceans [Pearce]

 

Climate change caused by carbon-dioxide-emitting technology

Diagnosis: Boundary exceeded

 

In contrast to ozone depletion, global warming due to the so-called greenhouse effect is an environmental problem for which there is no quick fix [Rees, 108].

Some publications concerning the probability and consequences of global warming:

1.      Intergovernmental Panel on Climatic Change (IPCC)

2.      Climate Catastrophe? by Richard S. Lindzen

3.      A few reviews of Lomborg’s “Cool It”

 

Interactions with other risks

1.      Even if global warming occurs at the slower end of the likely range, its consequences – competition for water supplies and large-scale migration – could engender tensions that trigger international and regional conflicts, especially if these are further fuelled by continuing population growth. Moreover, such conflict could be aggravated, perhaps catastrophically, by the increasingly effective disruptive techniques with which novel technology is empowering even small groups [Rees, 110].

 

2.      The interaction between the atmosphere and oceans is so complex and uncertain that we can’t discount the risk of something much more drastic than the “best guess” rate of global warming. The temperature change may not be just in direct (linear) proportion to the rise in the carbon dioxide concentration. When some threshold level is reached, there could be a sudden and drastic flip to a new pattern of wind and ocean circulation (…). We know that changes of this kind happened in the past. Many times during the last hundred thousand years there seem to have been drastic cooling-offs within decades or less (…). The small chance of something really catastrophic is more worrying than the greater chance of less extreme events (…). Even a one-percent chance that human-induced atmospheric changes could trigger an extreme and sudden climatic transition is a disquieting enough prospect to justify precautionary measures more drastic than those already proposed by the Kyoto agreements. Such a threat would be a hundred times larger than the baseline risk of environmental catastrophe that the Earth is exposed to from asteroid impacts and extreme volcanic events [Rees, 110-112].

 

The strategic threats posed by global environment and development problems are the most complex, interwoven and potentially devastating of all the challenges to our security. Scientists…do not fully understand the consequences of our many-faceted assault on the interwoven fabric of atmosphere, water, land and life in all its biological diversity [Rees, 112].

 

For more information on the above-mentioned risks see Earth’s nine lives [Pearce].

 

 

 

3.4 Road Traffic Accidents

 

While we investigate and discuss a possible destruction of the biosphere, the daily victims of technology are almost forgotten.

Deaths caused by road traffic injuries are expected to rise by 40% by 2030. These accidents will probably kill twice as many people as violence and war.

 

 

 

 

Updated projections of global mortality and burden of disease, 2002-2030: data sources, methods and results (WHO, 2005)

 

 

 

 

4. The Theory of Risk Perception

 

 

4.1 Basics of Risk and Benefit

 

 

Risk

The meaning of the term risk depends on the field of knowledge.

Examples:

         Toxicology: exposure to a dangerous substance

         Engineering: failure of a system

         Politics:  failure of a political decision

         Economy: failure of an investment decision

All these meanings form the background for the perception of risk on the psychological and sociological level.

(Sozialpsychologische Risikoforschung).

 

Obviously, in order to define risk, one has to define situations like loss, catastrophe or undesirable outcome. Risk can be expressed in terms of suffering, number of deaths, financial loss, etc. This makes clear that risk can only be evaluated relative to a goal.

 

 

Benefit/risk

         A benefit is a contribution to the pursuit of a goal.

         A benefit is related to a risk and is measured relative to this risk.

         The risk-benefit ratio is a measure of the quality of a decision (see the sketch after this list).
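A minimal formalization under an expected-value reading of the ratio (the function and all numbers below are illustrative assumptions, not the author’s definitions):

```python
# Expected-value reading of the risk-benefit ratio: a decision looks better
# the more expected benefit it buys per unit of expected loss.

def risk_benefit_ratio(p_benefit, benefit, p_harm, harm):
    """Expected benefit divided by expected loss; all inputs are placeholders."""
    return (p_benefit * benefit) / (p_harm * harm)

# A technology with a high chance of moderate benefit and a small chance
# of large harm:
print(risk_benefit_ratio(p_benefit=0.9, benefit=10.0, p_harm=0.01, harm=100.0))  # 9.0
```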

In primitive cultures natural risks can be seen as the “price” that has to be paid for survival. With a few exceptions like Buddhism, cultures do not question this price. Similarly in advanced civilizations, technological risks can be seen as the “price” that has to be paid for survival in the cultural competition.

         The present paper concentrates on the risk of death; the term benefit relates to survival.

         The paper The Cultural Evolution of Suffering concentrates on the risk of suffering.

 

 

Theoretical and empirical risk

The new nature of technological risks was not understood when the first protests against nuclear power plants were staged and the first activist groups were formed. Chauncey Starr, an engineer, was the first expert to acknowledge that the acceptance of voluntarily entered risks, such as smoking, was much higher than that of risks that had to be suffered involuntarily (Starr, 1969). But Starr calculated the public acceptance of risks on the basis of a theoretical model instead of studying risk perception empirically, which led him to conclude that nuclear power plants would be acceptable (Science in a Political Environment).

Empirical studies have shown that the perception of risk is a psychological and social construct which does not necessarily correspond to objectively measurable risks. The peculiarities of risk perception have been investigated on two levels:

1)      On the individual level by means of factor analysis (Psychometric paradigm)

2)      On the society level by means of structural analysis (Cultural theory)

 

 

 

4.2 The Psychometric Paradigm

 

 

Factor analysis

The following factors in the perception of risk were investigated by factor analysis:

1)      The familiarity with the risk

2)      The presumed controllability of the risk

3)      The potential for catastrophic consequences

4)      The immediacy of the consequences (short-term perspective)

5)      The extent to which the risk is known in the sciences and in the public.

 

Typical results were the following:

1)      The risk of a technology is overestimated if it has the potential to kill many people within a short period of time, as compared to killing the same number of people over a long period of time.

2)      Risks are overestimated if they concern the interviewee him-/herself.

3)      Risks are underestimated if they are taken voluntarily.

4)      Risks are underestimated if people think they can influence the outcome by their own action. In those cases most people are unrealistically optimistic and think they are safer than the average.

5)      Natural risks are underestimated relative to technological risks. People tend to accept inevitable risks, but they overestimate risks where a culprit can be found (more on this in chapter 4.3).

(Sozialpsychologische Risikoforschung).

 

There are some strong correlations between the variables mentioned above.

It was found that two main factors could explain why lay people saw some risks as more dangerous than others. These factors are referred to as "dread" and "unknown."

1)      A dread risk elicits a visceral feeling of dread, is uncontrollable, is catastrophic, is fatal, is inequitable, and is involuntary.

2)      An unknown risk is delayed, new, and unknown to science.

(Risk perception, Wikipedia)
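A hedged sketch of how such a two-factor structure is extracted from rating data (the matrix below is randomly generated stand-in data, not actual psychometric results):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Stand-in survey data: 200 respondents rate 6 risk characteristics
# (dread, controllability, catastrophic potential, voluntariness,
# novelty, known-to-science). Random numbers only demonstrate the mechanics.
rng = np.random.default_rng(0)
ratings = rng.normal(size=(200, 6))

# Extract two latent factors, analogous to "dread" and "unknown".
fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(ratings)

# Loadings: how strongly each characteristic contributes to each factor.
print(fa.components_.shape)  # (2, 6)
```

In the actual studies it is the loadings, estimated from real ratings rather than noise, that reveal the clustering into the two factors described above.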

 

 

Combination of factors (“dread” and “unknown”)

1.      The risk of a nuclear war combines magnitude and complexity of risk. Mutual nuclear deterrence successfully prevented a third world war, but the risk of human or technical failure in the control of high-tech weapons remains. The new dimension of risk introduced by nuclear weapons (mutually assured destruction), combined with distrust of complex technology, contributed to the back-to-nature movements during the Cold War.

2.      Similarly, the phenomenon of global warming combines magnitude and complexity of risk. The fear of an apocalypse induced by global warming contributes to present-day cases of technophobia.

 

 

Affective factors

The factors mentioned so far are subsumed under the term cognitive because they concern knowledge about risks.

Research within the psychometric paradigm has more recently turned to focus on the roles of affect, emotion, and stigma in influencing risk perception.

1.      The basic premise of the turn toward affective theories is that affect -- a positive or negative feeling toward an object -- causes evaluations of an object's riskiness, rather than the other way round (see affect heuristic). A key finding in support of this theory is the strong negative correlation between people's judgments of the risk and benefit of an activity. Activities judged to have a high risk are nearly always seen as having low benefit, and vice-versa. It is assumed that these judgments help to rationalize a negative disposition toward hazardous activities (or rationalize a positive disposition toward low-risk activities).

2.      Stigma refers to a metaphorical mark of disgrace attached to certain risky activities. Stigmatized activities are seen as morally objectionable, completely unacceptable, and polluting to anyone or anything associated with them.

(Risk perception, Wikipedia)

 

Example 1: The affective factor trust:

It is widely agreed that trust is a key factor in influencing people's perceptions of risk. There are two main ways trust is said to shape risk perceptions:

1)      An activity is perceived as more risky if the people or agencies managing it are perceived as untrustworthy.

2)      Information presented by trusted sources is given more credence than information from untrusted sources.

(Risk perception, Wikipedia)

         If an American does not trust the US Department of Energy, he/she is likely to exaggerate the danger of nuclear power.

         Due to the Bhopal disaster (and others) chemical plants have a questionable reputation. The risk created by a chemical plant is therefore overweighted compared to the risk created by car drivers [Birnbacher, 23].

 

Example 2: The affective factors greed and fear.

The majority tends to pursue short-term interests. The short-term success of a technology distracts from its long-term risks. The benefit is visible and accompanied by applause (fast cars, cheap energy); the risk grows silently (e.g. climate change). Once the danger becomes drastic, the perception of risk swings to the other extreme. Risk-taking in culture works like risk-taking in the stock market; it is governed by the affective factors greed and fear. Technology acts like an amplifier.

 

 

Combination of cognitive and affective factors

Cognitive and affective factors work together in creating a distorted perception of risk.

Example:

1)      Cognitive: The factor “unknown” creates a negative disposition towards genetic engineering.

2)      Affective: The negative disposition is rationalized by assigning a low benefit to genetically modified plants.

 

 

 

4.3 Cultural Theory

 

 

The cultural theory of risk

The most influential cultural theory is called simply "The Cultural Theory of risk". Cultural Theory is based on the work of anthropologist Mary Douglas and political scientist Aaron Wildavsky.

Cultural Theory makes two basic claims.

1)      It argues that views of risk are produced by, and support, social structures. Fear of certain types of risks serves to uphold the social structure.

2)      Cultural Theory proposes that there are four basic "ways of life," each corresponding to a particular social structure and a particular outlook on risk (…) Attempts have been made to validate Cultural Theory with survey research, but controversy remains over whether to interpret their results as supporting or refuting the theory.

(Risk perception, Wikipedia)

In this paper we concentrate on the first basic claim of cultural theory.  

 

 

Preservation of social structures

There is a connection between the factor “unknown” (chapter 4.2) and the attempt to preserve social structures (chapter 4.3). Technological change causes social change. If the technological risk is unknown, then the social risk is unknown as well. Distrust of technology often expresses an unwillingness to change social structures.

 

 

The divinity of nature

Many people think that nature benefits people [Birnbacher, 27], whereas technical attempts to improve this benefit are suspect. An explanation for attributing “goodness” or even “divinity” to nature may be found in the dominance of the Stoic and Christian traditions within occidental history [Birnbacher 16, 20]. These traditions strive to adapt humans to “nature as it is” and are at the root of many back-to-nature movements.

Example:

Most Amish do not practice any form of birth control and are against abortion. They are among the fastest-growing populations in the world and do not worry about overpopulation (similar to the Catholic church, see Population Control).

 

The Transhumanist vision of a technological salvation is seen as hubris by Christian doctrine.

After the explosion of the nuclear plant in Chernobyl, various analysts reported an unusual number of apocalyptic dreams among their patients. The nightmares’ common denominator was a superhuman nemesis in the act of scourging a technology that represents human hubris. None of these patients bore any personal guilt for that explosion, but all of them experienced it as a sort of collective guilt (…) The unconscious continues to experience the hubris of technology as ridden with guilt (Growth and Guilt, 183).

 

 

Contemporary visions of justice

The perception of risk is influenced by the distribution of risks [Birnbacher, 23].

1)      Just distribution: In the case of car accidents the risks are well tolerated because the majority uses cars and the risks are almost evenly distributed among the drivers

2)      Unjust distribution: The majority rebels if wealthy citizens get a better medical treatment. The risk of an incurable illness is tolerated as long as it is incurable for everybody.

 

 

 

 

5. Distorted Perceptions

 

 

5.1 The Underestimation of Natural Risks

 

The underestimation of natural risks can be explained by the psychological factors discussed in chapter 4.2:

 

1.      Familiarity.

People tend to accept inevitable natural risks. In daily life many risks are suppressed, because imagining a constant horrible threat (like earthquakes, strokes, etc.) would paralyze all activity. Opinion surveys about happiness [Frey] rely on this suppression. But the seemingly scientific regularity of daily life is an illusion; the power of contingency is omnipresent [Hampe]. Experiences with contingency might have contributed to the concept of the Hindu Maya, the idea that we live in an illusory world and that our perception is distorted. The unconscious part of the psyche ignores all risks which are not accessible to the senses. Optimism improves Darwinian fitness.

a.      Observers regularly note how quickly a destroyed region is repopulated after a volcanic eruption. People behave as if the probability of an eruption decreased after the destruction [Birnbacher, 27].

b.      The purchase of earthquake insurance increases immediately after an earthquake and then declines continuously [Matuschek]. Areas with a high probability of earthquakes are not less populated than areas with a low probability [Schneider].

c.      In Bangladesh large areas are flooded almost every year, houses are destroyed and hundreds of people are killed (example: Bangladesh Flood 2007). But shortly after such an event the same region is farmed again, the houses are rebuilt and the population continues to expand. Seen from this perspective it is not surprising that a majority does not take the risks of climate change seriously.

 

2.      Presumed Controllability.

a.      Risks are underestimated if people think they can influence the outcome by their own action. People tend to think that they are safer than the average. The underestimation of risks (e.g. in mountain climbing) can be measured by comparing survey data with insurance statistics. In some instances the risks are plainly denied by the interviewees.

b.      Risks are underestimated if they are taken voluntarily. Statistically dangerous lifestyles (e.g. a diet too fatty and too rich in calories) do not scare most people. The risks that kill you are not necessarily the risks that anger and frighten you [Sandman].

 

3.      The immediacy of the consequences (short-term perspective)

a.      In daily life our perception of risks (e.g. unprotected intercourse) and chances is distorted by short-term interests. This may be explained by the fact that the interests of old people are biologically irrelevant (except for activities serving the next generation).

b.      In procreation the perception of risk is systematically distorted, because the decision to have children is usually taken at an emotional peak of one’s life. From a biological point of view it makes sense to couple procreation with a spontaneous conviction of doing the right thing and (at the same time) with a loss of reason and realism.

 

 

 

5.2 The Preference for Natural Risks

 

According to [Birnbacher, 23] risk is better tolerated if it stems from natural sources rather than technological sources, even if it can be demonstrated that the risk is equivalent.

Example: Natural radioactivity is preferred to technical radioactivity, even if the exposure is the same [Hansson].

 

 

Thesis [Birnbacher, 24]:

The preference for natural sources of risk can be explained by

1)      an evolutionary (unconscious) knowledge shaping intuition

2)      the assumption that natural risks are easier to avoid than technological risks

3)      the increasing magnitude and complexity of technological risks (whereas natural risks remain constant)

In the following these explanations are investigated in more detail:

 

 

Evolutionary adaptation

The lack of evolutionary adaptation to technological risks could explain why people are more cautious in dealing with artificial factors than with natural factors in their environment.

 

Examples:

1)      The kinetic energy of a moving car is systematically underestimated, because cars can be controlled with practically no effort.

2)      We are less prepared for synthetic substances than for natural contaminants [Birnbacher, 24]

3)      Radioactivity cannot be perceived at all. Natural doses are far below the limit which causes damage (see Radiation Exposure Examples), so there was no need for evolutionary adaptation. There is no human warning system against the dangerous doses produced by nuclear disasters.

4)      The destructive energy generated by high-technology weapons bears no relation to the human senses. A bomber pilot can kill thousands of civilians with the tip of a finger.

 

Counter-examples:

1)      For risks after the period of procreation (e.g. the increasing risk of cancer) there is no evolutionary pressure for adaptation [Hansson]. This argument is not quite correct, because there is a biological interest in bringing up the offspring. The period exposed to evolutionary pressure has to be extended accordingly [Birnbacher, 54].

2)      In many respects humans are no longer adapted to nature, because the environment has changed considerably relative to the one which shaped prehistoric man. Most plants used in our daily diet are new [Hansson].

3)      With regard to floods, earthquakes, volcanic eruptions and asteroid collisions there was no learning process and no evolutionary adaptation, because these events occur erratically and at long intervals.

 

Conclusion: In most cases evolutionary adaptation is lacking for both technological and natural risks.

 

 

Avoidance of risk

1.      In many cases the cost of preparing for natural risks (e.g. floods, earthquakes) exceeds the cost of reducing technological risks, e.g. car accidents [Birnbacher, 25].

2.      If the 2002-2007 environment remains stable, then the vast majority of people die from well-known diseases (chapter 2.4). A stroke usually occurs as unexpectedly as a terror attack, but with a much higher probability. The chance of becoming a victim of cancer is much higher than that of being harmed by global warming.

 

Conclusion: Natural risks are not easier to avoid than technological risks.

 

 

Increasing magnitude of risk (factor “dread”)

Thesis: Technological risks grow continuously, whereas natural risks remain constant [Birnbacher, 25].

Examples:

1)      The increasing globalization of financial markets increases the magnitude of risk in case of a breakdown.

2)      The destructive energy and precision of high-technology weapons increases with each generation.

 

Counter-examples:

1)      Many new technologies imply lower risks than old ones (e.g. solar power).

2)      In contrast to natural risks, technological risks are related to a benefit [Birnbacher, 25]. (This statement has to be put into perspective, though: the weapons industry theoretically produces a benefit for the winners, but there are wars without winners.)

 

Conclusion: Although there are areas where risk decreases, the new dimension of certain technological risks (in particular nuclear warfare) cannot be denied. Natural risks of a comparable dimension (like asteroid collisions) have a much lower probability.

 

 

Increasing complexity (factor “unknown”)

Thesis: Technical resources are better understood than natural ones. Knowledge of the production process and the test procedures goes into the risk estimation [Birnbacher, 26].

Example: The composition of medicaments is better known than that of natural remedies.

 

Counter-examples:

1)      Complex software systems (like Microsoft’s Windows) cannot guarantee freedom from errors. The cases to be tested far exceed the available testing resources.

2)      New risks, such as rapid climate change, are without precedent in recorded human history. Likewise, it is uncertain what it means to burden future generations with radioactive waste, and there is no way of reducing that uncertainty (Science in a political environment).

3)      Single medicaments are better tested than natural remedies, but not combinations of medicaments. Unexplored combinations often lead to dangerous situations (e.g. complex allergic reactions).

4)      The risk of failure through an unforeseen combination of hazards increases with the complexity of a system. Example: the Chernobyl disaster. Concerning the power of contingency see [Hampe].

 

Conclusion: Although there are areas where technological risks are better understood than natural ones, the new dimension of complexity cannot be denied.

 

 

Cultural factors

Among the successful traditions are Stoicism and Christianity. Both emphasize the divine origin of nature, as compared to the human origin of technology. Both developed in times when little could be done to influence the course of nature, and a positive attitude towards natural risks therefore improved survival value [Birnbacher, 37].

 

 

 

5.3 The Preference for Familiar Risks

 

As long as the negative impacts of a new technology concern only a minority, they are tolerated and become “familiar”. Car accidents have become so common that they are considered “familiar” or “natural” (although technology is involved).

         The estimated cost of road accidents exceeds total spending on development aid [WHO, 2007]. For people between 15 and 19 years of age, road traffic injuries are the leading cause of death; for those aged 10-14 and 20-24 they rank second [WHO, 2004, Youth and road safety].

         Global injury mortality caused by road traffic injuries exceeds that caused by war and violence (see the following graph), but the majority does not consider this fact a matter of special concern.

         The predicted worldwide road traffic fatalities correspond to the impact of a monthly tsunami [WHO, 2002].

 

 

 

 

 

WHO (2002), The injury chart book: A graphical overview of the global burden of disease, Geneva

 

 

In contrast to car accidents, radiation poisoning is unfamiliar. Radioactive sources and the corresponding radiation exist in nature (e.g. cosmic rays), but many people consider radioactivity to be unnatural. The press systematically overreacts when it comes to radiation exposure.

Example: Until 1983 the New York Times counted

         200 entries on radiation; in none of these events was a casualty reported.

         120 entries on car accidents; a total of 50,000 casualties were reported in these events.

The radon exposure in one’s own home is probably greater than the radiation exposures which have regularly been trumpeted in headlines (Radiation exposure examples).

 

 

 

 

5.4 The Underestimation of Technological Risks

 

The disagreement between lay and expert estimations of risk was decided in favor of the experts in the case of natural risks, but led to a continuing controversy in the case of technological risks (Sozialpsychologische Risikoforschung). There are indications, however, that many technological risks, like natural risks, are systematically underestimated.

 

 

Counterproductive effects and undesirable side effects

The counterproductive effects and undesirable side effects of a new technology are often hard to estimate, because they show up with a delay:

Examples:

         Nuclear and bio-chemical weapons were thought to increase the survival value of their developers, but eventually created a potential for terror and self-destruction

         The massive deployment of antibiotics promotes resistance in bacteria.

         During the Vietnam War the United States military sprayed the chemical herbicide Agent Orange in Vietnam, eastern Laos and parts of Cambodia in order to defoliate forested and rural land, depriving guerrillas of cover. Vietnam estimates that 400,000 people were killed or maimed and 500,000 children were born with birth defects as a result of its use (Agent Orange, Wikipedia).

         Only after widespread application was it discovered that prolonged inhalation of asbestos fibers can cause malignant lung cancer, mesothelioma and asbestosis (Asbestos, Wikipedia).

         Power plants burning natural gas and coal improve a nation’s economic competitiveness but contribute to a possible environmental catastrophe. Global warming is underestimated or repressed because it shows up with a delay. When people have a comfortable lifestyle, their tendency not to rock the boat grows [Gifford].

 

 

The underestimation of complexity

It is often impossible to become “familiar” with technological risks, because the power of contingency only reveals itself in a unique event:

         The worst case, i.e. the interaction of several areas of risk (including natural risks) may be incalculable, especially if human error, irrationality and destructive potential is involved.

         According to chaos theory complex systems can break down in an unpredictable way [Leslie, 6]

Examples:

         Bhopal disaster

         Space Shuttle Challenger disaster

Interestingly, the immense efforts to control complexity by means of systems theory create a new source of risk, because the (imperfect or deficient) indicators and computer models are mistaken for reality [Gugerli].

 

 

The underestimation of leverage

Access to modern technology can exert huge “leverage”:

We are entering an era when a single person can, by one clandestine act, cause millions of deaths or render a city uninhabitable for years (…). These threats are growing for three reasons:

1.      The destructive and disruptive capabilities available to an individual trained in genetics, bacteriology or computer networks will grow as science advances.

2.      Society is becoming more integrated and interdependent (internationally as well as nationally)

3.      Instant communications mean that the psychological impact of even a local disaster has worldwide repercussions. This can inspire equally fanatical acts by sectarian groups or even “loners”.

[Rees, 61-62]

 

Some optimists imagine that scientific or technical education reduces the propensity towards extreme irrationality and delinquency. But many examples belie this (…). Although modern technology allows instant worldwide communication, it actually makes it easier to survive within an intellectual cocoon. Being economically self-sufficient via the Internet, it is possible to cut oneself off from contact with physical neighbors, indeed from any “normal” people. Beliefs can be reinforced by selective electronic contact with other adherents of a cult on other continents [Rees, 63].

 

 

Self-destruction

         Technologies involving unfamiliar risks (like new weapons and genetic mutations) are often imposed by threats of war and competition. The struggle for (economic) survival explains why technological innovation accelerates even in areas of high risk.

         In an ancestral environment risk-tolerance improves biological fitness [Birnbacher, 37]; in a modern environment it might be fatal. The combined effects of risk-tolerance and underestimation add up to a considerable probability of human self-destruction. For the likelihood of self-destruction see Human extinction and Global catastrophic risk.

         Self-destruction is not a new phenomenon in socio-biological systems. Animals can’t consider the long-term consequences of their actions and blindly attempt to maximize the propagation of their genes. Natural selection does not invariably favor tendencies toward self-preservation, either on the species level (see e.g. Autotoxicity in Plants) or on the individual level [De Catanzaro].

         A collection of ideas how the trend for self-destruction could be broken was published by Nick Bostrom in Existential Risks, chapter 9. But efficient countermeasures require an excessive reinforcement of control mechanisms. The price for preventing technological disasters and self-destruction could itself be disastrous [Rees 73-88].

 

 

 

 

“Mr. Gandhi, what do you think of Western civilization?”

 

 “I think it would be a good idea.”

 

 

 

 

 

 

6. Conclusion

 

 

The preference for natural risks

         The preference for natural risks is rational insofar as it responds to the increasing magnitude and complexity of technological risks (whereas natural risks remain constant).

         According to the theory of risk perception, an irrational preference for natural risks can be explained as follows:

o   The technology in question is loaded with negative associations. This creates an (unconscious) negative disposition.

o   The negative disposition is rationalized by assigning a low benefit and a high risk to the technology.

Sometimes an irrational preference for natural risks can also be explained by traditions which emphasize the divine origin of nature.

The preference for natural risks is a special case within the preference for familiar risks. As soon as technological risks become familiar, they are treated like natural risks.

 

 

The underestimation of technological risks

         There are strong indications that many technological risks, like natural risks, are systematically underestimated.

         Technologies involving unfamiliar risks are often imposed by threats of war and (economic) competition. The struggle for (economic) survival explains why technological innovation accelerates even in areas of high risk.

         The combined effects of risk-tolerance and underestimation add up to a considerable risk of self-destruction. In this respect the relation between risk and benefit (in terms of survival) worsened in the 20th century.

 

 

 

 

 

Acknowledgment

 

I would like to thank Michael Hampe for the inspiring conversations in the context of this paper.

 

 

 

 

 

References

 

1.      Aron Jacob (2015), Don’t fear apocalyptic asteroids

2.      Birnbacher Dieter (2006), Natürlichkeit, Verlag de Gruyter, Berlin

3.      Blume Michael, Ramsel Carsten, Graupner Sven (2006), Religiosity as a Demographic Factor, Marburg Journal of Religion No.11

4.      Broome John (2004), Weighing Lives, Oxford University Press

5.      De Catanzaro Denys, A Mathematical Model of Evolutionary Pressures Regulating Self-Preservation and Self-Destruction.

6.      Frey Bruno, Stutzer Alois (2001), Happiness and Economics, University Presses of CA

7.      Gifford Robert (2015), The Road to Climate Hell, New Scientist, 11 July, 28-33.

8.      Gugerli David (2013), Das Risiko der Risikogesellschaft, Neue Zürcher Zeitung vom 26.3. Feuilleton, p.45

9.      Hampe Michael (2006), Die Macht des Zufalls, Wolf Jobst Siedler Verlag, Berlin

10.  Hansson Sven Ove (2003), Are natural risks less dangerous than technological risks?, Philosophia naturalis 40, Issue 1, p.43-54

11.  Johnson Eric (2010), CERN on trial, New Scientist, 20 February, 24-25, referring to Tennessee Law Review, vol.76, p.819

12.  Leslie John (1996), The End of the World, Routledge

13.  Matuschek Milosz (2016), Geld oder Liebe, Neue Zürcher Zeitung, 4 Nov, p.11

14.  Pearce Fred (2010), Earth’s nine lives, New Scientist, 27 February, 31-35

15.  Ravilious Kate (2010), Exodus on the exploding Earth, New Scientist, 14 April, 28-33

16.  Rees Martin (2003), Our Final Hour, Basic Books, New York

17.  Rifkin Jeremy (2010), The third industrial revolution, in New Scientist, 13 February, p.46, referring to The Empathic Civilization, Penguin book

18.  Sandman P.M. (1987), Risk Communication, EPA Journal, Nov., 21-22

19.  Schneider Wolf (2011), Vor Katastrophen fürchten wir uns nicht, NZZ Folio 06, Zürich

20.  Young Emma (2015), Contagion, New Scientist, 9 May, 30-32

21.  WHO, World report on road traffic injury prevention