
What you need to know ...

Available under Creative Commons-ShareAlike 4.0 International License. Download for free at http://cnx.org/contents/3d8499e9-08c0-47dd-9482-7e8131ce99bc@11.15

Working responsibly with risk requires careful integration of substantive ethical issues, distinguishing different senses of risk, and mastering the skills required in morally responsible risk communication. In other words, it is more than just implementing a mechanical process that imposes unwanted consensus on disparate groups and individuals. (See Sandel for an argument that past ethical controversies such as slavery had to be settled by means of substantive debates rather than procedural maneuvers.) Ethics is important to risk because scientific risk assessment is value-laden. Values permeate decisions such as choice of method as well as decisions on how to distribute the burden implied by the uncertainty involved in risk assessment and management. This section will introduce you to basic moral concepts involved in risk and offer information on how risk is assessed, managed, perceived, and communicated.

Responsible Risk Management: Associated Basic Moral Concepts

  1. Right: A capacity of action that others are obliged to recognize and respect. A key right in the context of risk is free and informed consent. (See below)
  2. Duty: The obligation to recognize and respect the essential capacities of actions of others. Duties are correlative to rights. For example, the duty to avoid paternalism in the management and communication of risk is correlative to the right of free and informed consent.
  3. Virtue: Responsible risk management can also be formulated as a virtue. Virtues are traits that extend "deep down" into an individual's character. They include an orientation toward excellence in decision and execution, perceptual sensitivities that help to uncover moral relevance, and emotions and attitudes that help motivate decisions and actions oriented toward achieving excellence. For example, a responsible risk communicator has curiosity that drives understanding and appreciating risk, a concern for the well-being of the risk bearer, and a strong desire to communicate risk information truthfully and clearly.
  4. Justice: Justice can be generally defined as giving each his or her due. Distributive justice, in the context of risk, prescribes a fair distribution of the benefits and harms associated with taking a certain risk. Ideal pattern approaches argue that distribution should conform to a pattern such as equality (equal shares to everyone), need (greatest share to those with the greatest needs), or merit (greatest share to those who demonstrate superior merit). Ideal pattern approaches require continual redistribution by government through measures such as a progressive income tax. Historical process approaches prefer maintaining current patterns of distribution provided the historical process leading to them has been free of force or fraud. Justice in the context of risk lies in determining how the benefits and harms associated with risk are distributed, and how the uncertainty that permeates the risk assessment and management process is distributed among those involved.
  5. Responsibility: Herbert Fingarette defines responsibility (in the context of criminal insanity) as (moral) response to (moral) relevance. Different senses of responsibility include causal, legal (vs. moral), role, capacity, and blame. Responsibility can be reactive when it focuses on the past and the assigning of praise and blame, or proactive when it turns to preventing harm (minimizing risk) and realizing value.
  6. Trust: The expectation of moral behavior on the part of others. Trust is built out of the social capital accumulated through successful interactions with others. It is consumed or undermined by those who choose to free ride on social cooperation, i.e., to compete while others are cooperating. The prisoner's dilemma provides a simplified model of the fragility of trust; a payoff sketch follows this list.
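The payoff table below is a minimal sketch in Python, not part of the original module, of the standard prisoner's dilemma. The payoff numbers are the conventional textbook values; the point is only to make visible why free riding is individually tempting while mutual defection leaves both parties worse off than mutual cooperation, which is how trust erodes.

```python
# Minimal prisoner's dilemma sketch (illustrative payoffs, not from the source module).
# Defecting always pays more than cooperating against a fixed opponent move,
# yet mutual defection (1, 1) is worse for both than mutual cooperation (3, 3).

PAYOFFS = {
    # (player_a_move, player_b_move): (player_a_payoff, player_b_payoff)
    ("cooperate", "cooperate"): (3, 3),  # mutual cooperation: both do well
    ("cooperate", "defect"):    (0, 5),  # the cooperator is exploited
    ("defect",    "cooperate"): (5, 0),  # the free rider gains at the other's expense
    ("defect",    "defect"):    (1, 1),  # mutual defection: trust has collapsed
}

def outcome(a_move, b_move):
    """Return the payoff pair for one round of the dilemma."""
    return PAYOFFS[(a_move, b_move)]

if __name__ == "__main__":
    for (a, b), payoff in PAYOFFS.items():
        print(f"{a:>9} vs {b:<9} -> payoffs {payoff}")
```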

Key Terms in Risk Practices

  1. Safety: "A thing is safe if, were its risks fully known, those risks would be judged acceptable in light of settled value principles." (IEE 108)
  2. Risk: "A risk is the potential that something unwanted and harmful may occur." (IEE 108)
  3. NIMBY: This acronym stands for "Not in my backyard." Citizens often find the risks associated with a project or product acceptable only if these are located somewhere else, i.e., in another person's backyard. NIMBY has made it next to impossible for the U.S. DOE (Department of Energy) to find an acceptable permanent storage facility for nuclear waste.
  4. Free and Informed Consent: The right to decide if a risk is acceptable based on access to pertinent information and absence of compulsion. The Belmont Report defines informed consent in the following way: "[that] subjects, to the degree that they are capable, be given the opportunity to choose what shall or shall not happen to them. This opportunity is provided when adequate standards for informed consent are satisfied." The Online Ethics Center spells out the conditions necessary for fulfilling informed consent: (a) disclosure (of information to the patient/subject); (b) comprehension (by the patient/subject of the information being disclosed); (c) voluntariness (of the patient/subject in making his/her choice); (d) competence (of the patient/subject to make a decision); and (e) consent (by the patient/subject).
  5. Paternalism: Experts are often tempted to act as overly concerned parents and take over the decision-making prerogatives of the public because they (the experts) "know better." Paternalism, while well motivated, rests on the misconception that the public does not understand risk simply because it often reaches different conclusions about the acceptability of a given risk than the expert does. But the public often appreciates risk from a broader, richer standpoint, especially if the expert has communicated it properly and clearly. As will be seen below, the public perception of risk is rational because it is predictable.

Dimensions of Risk

  • Risk Assessment: The process of determining the degree of risk associated with a certain product or process using scientific methods such as epidemiological studies or animal bioassays. While scientific procedures lend it a measure of exactness, risk assessment still carries a remainder of uncertainty that cannot be eliminated. A risk assessment yields two kinds of uncertainty: uncertainty as to whether the harm will occur, and uncertainty as to who (out of the many exposed) will be harmed. Ethics enters the picture as stakeholders negotiate how to deal with and distribute this uncertainty. Responsible risk practice requires integrating the conflicting values and interests of the involved stakeholders in assessing, communicating, perceiving, and managing risk. It also requires a basis of trust that is difficult to build given the diverse players that make up the risk-taking and risk-bearing situation.
  • Risk Management: The political/social/ethical process of determining whether a risk of a certain degree is acceptable given the settled value principles generally held in the community of the risk bearers. Responsible risk management requires (a) assessing harm through the responsible exercise of scientific method and (b) communicating the assessed risk to those likely to bear it. Responsible risk management (i) honors rights such as free and informed consent and due process, (ii) avoids conflicts of interest in determining and communicating risk, (iii) conscientiously works toward a just distribution of risks and benefits, and (iv) avoids paternalism.
  • Risk Perception: How people perceive risk differs from the strict, scientifically determined degree of risk. For example, risk perception factors in voluntariness, control, expected benefits, lack of knowledge, and dread of adverse consequences in working toward a judgment on the acceptability of a given risk by the community of risk bearers. Because the public perceives risk over this broad background of scientific, social, political, and ethical factors, it frequently arrives at conclusions at odds with judgments reached using strictly scientific methods. Those taking a paternalistic attitude toward the public take this difference as evidence of the irrationality of the public and of the need for experts to take things into their own hands. However, the public attitude toward risk is intelligible and rational when this broader, risk perception perspective is taken into account.
  • Risk Communication: This dimension focuses on how to communicate risk information to risk bearers in order to facilitate distributive justice, free and informed consent, and due process. Responsible risk communication requires translating scientifically determined information into a non-technical vocabulary. Analogies and comparisons help, as does the use of concrete language and commonly understood images. But improper use of comparisons and analogies confuses the public and undermines trust.
  • Public: "those persons whose lack of information, technical knowledge, or time for deliberation renders them more or less vulnerable to the powers an engineer wields on behalf of his client or employer" (Davis).

Assessing Risk

  • Epidemiological Studies: We are constantly exposed to different risks that have become inherent in our socio-technical circumstances. These ongoing, unintentional experiments are exploited through epidemiological studies, which are designed to measure the correlation between exposure to risk factors and the occurrence of harm. For example, are those living close to EMFs (electromagnetic fields generated by technologies like electrical power lines) susceptible to certain harms like leukemia? An epidemiological study would compare the incidence of this disease in a population exposed to EMFs with its incidence in a population not exposed to EMFs. A significant risk ratio (usually set at three times the incidence of the harm in the unexposed control group) provides evidence that exposure to EMFs somehow causes leukemia; further study would be required to confirm this hypothesis and uncover the causal mechanism by which exposure produces the harm. (A worked example of the risk-ratio arithmetic follows this list.) Epidemiological studies are difficult to carry out and are always accompanied by uncertainty due to the limitations of the methods employed. The harm may take years to become manifest after exposure, and finding a population stable enough to determine the effects of long-term exposure is difficult because individuals frequently move from place to place. Such natural experiments also bring with them a great deal of "noise": factors other than EMFs could be causing leukemia, or EMFs could be interacting with other elements in the environment to cause the harm. Finally, there is the Tuskegee factor. In the notorious Tuskegee experiment, doctors refused to treat African Americans for syphilis in order to study the long-term progression of the disease. Exposing a population to a risk factor without informing them of the potential harm in order to gain scientific information violates the right of free and informed consent and the duty not to harm.
  • Animal Bioassays: Risk information can often be obtained by exposing animals to the risk factor and checking for emerging harms. While useful, animal bioassays are subject to several problems. First, experimenting on animals raises many of the same ethical concerns as experimenting on humans. Utilitarians argue that animals merit moral consideration because they are sentient and can suffer. Animal experiments are thus subject to the three Rs: reduction, refinement, and replacement. (See Bernard Rollin.) Second, these experiments create two kinds of uncertainty. (a) Projections from animal to human physiology can lead researchers astray because of the differences between the two; for example, animals are more sensitive to certain harms than humans. (b) Projecting the results of intensive short-term animal exposure onto long-term exposure can also introduce errors and uncertainty. Thus, as with epidemiological studies, there are uncertainties inherent in animal bioassays.
  • Risk assessment, while useful, is burdened with uncertainty due to the limits of what we know, what we can know, and what we are able to learn within the ethical parameters of human and animal experimentation. Crucial ethical issues arise as we decide how to distribute this uncertainty. Do we place its burden on the risk taker by continuing with a project until it is proven unsafe and harmful? Or do we suspend the activity until it is proven safe and harm-free? The first gives priority to advancing risky activities; the second gives priority to public safety and health, even to the point of suspending the new activities under question.
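The following sketch, not part of the original module, works through the risk-ratio arithmetic described under Epidemiological Studies above. The case counts and population sizes are invented for illustration; only the rule-of-thumb threshold of three comes from the text.

```python
# Risk-ratio sketch (hypothetical numbers): compare the incidence of harm in an
# exposed population with the incidence in an unexposed control population.

def risk_ratio(cases_exposed, n_exposed, cases_control, n_control):
    """Incidence among the exposed divided by incidence among the unexposed."""
    incidence_exposed = cases_exposed / n_exposed
    incidence_control = cases_control / n_control
    return incidence_exposed / incidence_control

if __name__ == "__main__":
    # Invented counts: 12 cases of leukemia among 10,000 people living near power
    # lines vs. 3 cases among 10,000 people in an unexposed control group.
    rr = risk_ratio(cases_exposed=12, n_exposed=10_000,
                    cases_control=3, n_control=10_000)
    print(f"Risk ratio: {rr:.1f}")  # -> 4.0
    # Threshold of three taken from the rule of thumb mentioned in the text above.
    print("Evidence of elevated risk" if rr >= 3 else "Below the threshold")
```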

Risk Perception

  • The framework from which the public perceives risk is broader and richer than that of risk assessment. The following five factors influence how the public judges the acceptability of a risk assessed at a given magnitude.
  • Voluntariness: A risk that is voluntarily taken is more acceptable than a risk of the same magnitude that is taken involuntarily. Thus, driving one's car to a public hearing on the risks of a proposed nuclear power plant may be riskier than living next to the plant; but the drive to the hearing is undertaken voluntarily, while living next to the plant is suffered involuntarily. According to studies, a voluntary risk is as much as 1000 times more acceptable than an involuntary risk of the same magnitude.
  • Control: Closely related to voluntariness is control. A risk under one's control (or under the control of someone trusted) is more acceptable than a risk of the same magnitude that is not under control. Charles Perrow, in Normal Accidents, argues against nuclear energy technology because its design allows for components that are tightly coupled and interact through nonlinear patterns of causality. These two characteristics make it possible for small events to start chain reactions that issue in large-scale disasters. Because these small events cannot be isolated (they are "tightly coupled") and because they interact unpredictably (they display nonlinear causality), they escape control and lead to unacceptable risks.
  • Perceived/Expected Benefits: A risk of a given magnitude is more acceptable if it is accompanied by substantial expected benefits. One takes the risk of driving to the hearings on the proposed nuclear plant because the benefits of getting crucial information on this project outweigh the risks of having a car accident. Riding a motorcycle is a risky venture, but the enjoyment it provides makes its risk more acceptable than a risk of the same magnitude accompanied by fewer benefits.
  • Unknown Factors: A risk that is not understood is less acceptable than one that is well understood. Riding a bicycle is a risky venture but, because its risks are well known, it is more acceptable than other activities accompanied by risks of similar magnitudes. This factor is highly pertinent to EMFs (electromagnetic fields). While EMFs are associated with certain illnesses like leukemia, their effects are not well known and are not understood by the public. This unknown element makes living near EMF-producing technologies less acceptable.
  • Dread Factors: A risk may be known and its causal relation to certain illnesses well understood. Nevertheless it may be less acceptable because the condition it causes is one that is highly dreaded. EMFs, because they have been associated with leukemia in children, are much less acceptable because of this "dread factor." The causes of radiation sickness are well known as are the stages of the illness. But because this kind of illness is highly dreaded, accompanying risks are less acceptable than other risks of the same magnitude with less of the dread factor. Again, compare crashing on a bicycle with coming down with cancer to get an idea of how dread permeates the perception of risk.
  • Against Paternalism: Consider the possibility that predictability is one component of rationality. Then test this hypothesis in the cases presented at the beginning of this module. Can the risks posed by each project be examined in terms of voluntariness, susceptibility to control, expected benefits, unknown factors, and dread factors? If so, then the public perception of this risk is rational because it can be predicted and understood. Thus, even though members of the public might find other risks of the same or even greater magnitude more acceptable, these perceptual factors would render the public's judgment intelligible and predictable. If all of this is so (and you will be testing this hypothesis in the exercises below), then paternalism on the part of the expert would not be justified. Furthermore, these insights into how risk is perceived by the public should provide you with valuable insight into how to communicate risk to the public. (A toy numerical sketch of how these perception factors might offset an assessed risk magnitude follows this list.)
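The sketch below is purely an invented illustration, not a model proposed by the source, of how the five perception factors might scale a scientifically assessed risk magnitude into a "perceived" risk. The only number grounded in the text is the observation that a voluntary risk can be roughly 1000 times more acceptable than an involuntary one; every other weight is a placeholder chosen for the example.

```python
# Toy illustration (invented weights): scale an assessed risk magnitude by rough
# perception multipliers drawn from the five factors discussed above.

def perceived_risk(assessed_magnitude, *, voluntary, controllable,
                   high_benefit, well_understood, dreaded):
    """Return a toy 'perceived' risk given the assessed magnitude and factors."""
    scale = 1.0
    if voluntary:
        scale /= 1000.0   # voluntariness: up to ~1000x more acceptable (from the text)
    if controllable:
        scale /= 10.0     # placeholder weight, not from the source
    if high_benefit:
        scale /= 10.0     # placeholder weight, not from the source
    if not well_understood:
        scale *= 10.0     # unknown risks feel larger; placeholder weight
    if dreaded:
        scale *= 10.0     # dread inflates perceived risk; placeholder weight
    return assessed_magnitude * scale

# Same assessed magnitude, very different perceived risk:
# driving to the hearing vs. living next to the plant.
print(perceived_risk(1.0, voluntary=True, controllable=True, high_benefit=True,
                     well_understood=True, dreaded=False))   # tiny perceived risk
print(perceived_risk(1.0, voluntary=False, controllable=False, high_benefit=False,
                     well_understood=False, dreaded=True))   # much larger perceived risk
```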

Responsible Risk Communication

  • Telling the Truth: Certainly, responsible risk communication should start with the commitment to tell the truth. But the virtue of truthfulness is more complicated than it might seem at first glance. For example, were an expert to tell nonexperts the whole truth, this might confuse them, especially if the account is loaded with complex technical explanations and jargon. Truthfulness might require some simplification (holding some things back or putting them in different terms), judicious comparisons, and the use of concrete images. Thus, the virtue of truthfulness requires (a) understanding the audience and (b) outlining their perceptions, concerns, feelings, and needs. With this in mind, here are some factors that are useful in communicating risk responsibly and truthfully.
  • Know the audience: What is their level of understanding, what are their needs, and how do they perceive the risk? For example, do they perceive the risk as voluntary, under control, accompanied by substantial benefits, accompanied by effects that are well known, and carrying a low dread factor? The risk perception framework described above will help you to communicate risk in a helpful and responsible manner.
  • Take measures to avoid deceiving the audience: The gap between the experts (those in the know) and the public is sometimes quite large. This creates the temptation to fill that gap with less than truthful content. Avoiding deception requires more than just refraining from telling outright lies. It also requires taking measures to avoid subtle manipulation and unintentional deception.
  • Guard against unintentional deception: (a) Be careful when using rhetorical devices. (b) Use risk comparisons and analogies to provide the public with benchmarks, not to persuade them that because they accept risk X they should accept risk Y. (c) Be sure to point out the limits of comparisons and analogies. (Driving to the public hearing may be a risk of greater magnitude than living next to a nuclear plant, but this comparison does not include key factors such as voluntariness, control, and expected benefits.) (d) Avoid conflicts of interest. In exercise one below, you will be looking at an example of risk communication taken from the movie Silkwood. Think about whether this communication is responsible and honest. Do the interests of the risk communicators coincide with those of the audience? Do the interests of the communicators bias the content of the communication in any way? (For example, does the upcoming vote to keep the union play a role in this risk communication act?)