Risk Analysis: An International Journal

Table of Contents for Risk Analysis. List of articles from both the latest and EarlyView issues.

Null Hypothesis Testing ≠ Scientific Inference: A Critique of the Shaky Premise at the Heart of the Science and Values Debate, and a Defense of Value‐Neutral Risk Assessment

5 July 2019 - 2:26pm
Abstract

Many philosophers and statisticians argue that risk assessors are morally obligated to evaluate the probabilities and consequences of methodological error, and to base their decisions about whether to adopt a given parameter value, model, or hypothesis on those considerations. This argument is couched within the rubric of null hypothesis testing, which I suggest is a poor descriptive and normative model for risk assessment. Risk regulation is not primarily concerned with evaluating the probability of data conditional upon the null hypothesis, but rather with measuring risks, estimating the consequences of available courses of action and inaction, formally characterizing uncertainty, and deciding what to do based upon explicit values and decision criteria. In turn, I defend an ideal of value‐neutrality, whereby the core inferential tasks of risk assessment—such as weighing evidence, estimating parameters, and selecting models—should be guided by the aim of correspondence to reality. This is not to say that value judgments have no place, but rather that they should be accounted for within a structured approach to decision analysis rather than embedded within risk assessment in an informal manner.
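To make the contrast concrete, here is a minimal Python sketch (not from the article; all probabilities and losses are invented) of the decision-analytic framing the abstract defends: a value-neutral assessment supplies a probability, and a separate decision stage applies explicit losses and a decision criterion.

```python
# Hypothetical illustration: the (value-neutral) assessment stage supplies a
# probability, and the (value-laden) decision stage supplies explicit losses.
# All numbers below are invented for the sketch.

p_hazardous = 0.15  # assessed probability that the substance is hazardous

loss = {  # losses per (action, state of the world), arbitrary units
    ("regulate", "hazardous"): 1.0,   # regulation cost, harm averted
    ("regulate", "safe"): 1.0,        # regulation cost, no harm anyway
    ("ignore", "hazardous"): 20.0,    # unmitigated harm
    ("ignore", "safe"): 0.0,          # nothing happens
}

def expected_loss(action: str) -> float:
    return (p_hazardous * loss[(action, "hazardous")]
            + (1 - p_hazardous) * loss[(action, "safe")])

best = min(("regulate", "ignore"), key=expected_loss)
for a in ("regulate", "ignore"):
    print(f"{a}: expected loss = {expected_loss(a):.2f}")
print("chosen action:", best)
```

The decision flips with the stated losses, not with a p-value threshold, which is the separation of tasks the abstract argues for.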

Effect of Providing the Uncertainty Information About a Tornado Occurrence on the Weather Recipients’ Cognition and Protective Action: Probabilistic Hazard Information Versus Deterministic Warnings

5 July 2019 - 2:26pm
Abstract

Currently, a binary alarm system is used in the United States to issue deterministic warning polygons in case of tornado events. To enhance the effectiveness of the weather information, a likelihood alarm system, which uses a tool called probabilistic hazard information (PHI), is being developed at the National Severe Storms Laboratory to issue probabilistic information about the threat. This study investigates the effects of providing uncertainty information about a tornado occurrence through the PHI's graphical swath on laypeople's concern, fear, and protective action, as compared with providing warning information with the deterministic polygon. Displays of color‐coded swaths and deterministic polygons were shown to subjects. Some displays had a blue background denoting the probability of any tornado formation in the general area. Participants were asked to report their levels of concern, fear, and protective action at randomly chosen locations within each of seven designated levels on each display. Analysis of a three‐stage nested design showed that providing uncertainty information via the PHI appropriately increased recipients' levels of concern, fear, and protective action in highly dangerous scenarios (those with a more than 60% chance of being affected by the threat), as compared with deterministic polygons. Neither the blue background nor the color‐coding type had a significant effect on people's cognition of and reaction to the threat. This study shows that using a likelihood alarm system leads to more conscious decision making by weather information recipients and enhances system safety.
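As a rough illustration of the two alarm systems being compared, the sketch below contrasts a binary warning polygon with a graded PHI-style swath. The probability thresholds and colors are invented for illustration and are not the NSSL's actual PHI design.

```python
# Hypothetical contrast: a deterministic polygon is binary (inside/outside),
# while probabilistic hazard information (PHI) grades the threat level.

PHI_LEVELS = [  # (lower probability bound, swath color) -- illustrative only
    (0.0, "blue"), (0.1, "green"), (0.3, "yellow"),
    (0.5, "orange"), (0.7, "red"), (0.9, "magenta"),
]

def deterministic_warning(inside_polygon: bool) -> str:
    """Binary alarm: all-or-nothing, no gradation of threat."""
    return "WARNING" if inside_polygon else "no warning"

def phi_swath(p_tornado: float) -> str:
    """Likelihood alarm: pick the highest level whose bound is reached."""
    color = PHI_LEVELS[0][1]
    for bound, c in PHI_LEVELS:
        if p_tornado >= bound:
            color = c
    return f"{color} swath (p = {p_tornado:.0%})"

print(deterministic_warning(True))  # -> WARNING
print(phi_swath(0.65))              # -> orange swath (p = 65%)
```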

Recalibration of the Grunow–Finke Assessment Tool to Improve Performance in Detecting Unnatural Epidemics

5 July 2019 - 2:26pm
Abstract

Successful identification of unnatural epidemics relies on a sensitive risk assessment tool designed to differentiate between unnatural and natural epidemics. The Grunow–Finke tool (GFT) has been the most widely used, but it has low sensitivity for this differentiation. We aimed to recalibrate the GFT to improve its performance in detecting unnatural epidemics. The comparator was the original GFT and its application to 11 historical outbreaks, including eight confirmed unnatural outbreaks and three natural outbreaks. Three steps were involved: (i) removing criteria, (ii) changing weighting factors, and (iii) adding and refining criteria. We created a series of alternative models and examined how these changes affected the estimated likelihood that an outbreak was unnatural, until we found a model that correctly identified all of the unnatural and natural outbreaks. Finally, the recalibrated GFT was tested and validated with data from one further unnatural outbreak and one natural outbreak. A total of 238 models were tested. By removing criteria, increasing or decreasing the weighting factors of other criteria, adding a new criterion titled "special insights," and setting a new threshold for likelihood, we increased the sensitivity of the GFT from 38% to 100% and retained its specificity at 100% in detecting unnatural epidemics. On the test data, the recalibrated GFT correctly classified the etiology of both outbreaks. The recalibrated GFT could be integrated into routine outbreak investigation by public health institutions and agencies responsible for biosecurity.
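Mechanically, a GFT-style assessment is a weighted criterion score compared against a threshold. The sketch below is a hypothetical illustration of that structure: apart from the "special insights" criterion the abstract introduces, the criterion names, weights, scoring scale, and threshold are placeholders, not the recalibrated values from the article.

```python
# Hypothetical Grunow-Finke-style weighted scoring scheme. Criteria are
# scored (here 0-3), multiplied by weights, normalized, and compared to a
# threshold; sensitivity is the fraction of unnatural outbreaks scoring
# above it. All names and numbers below are placeholders.

CRITERIA_WEIGHTS = {
    "biorisk_of_agent": 3,
    "unusual_strain": 2,
    "epidemic_intensity": 1,
    "special_insights": 3,  # the new criterion added by the recalibration
}

def gft_score(assessment: dict) -> float:
    """Weighted sum of criterion scores (0-3 each), normalized to 0-1."""
    raw = sum(CRITERIA_WEIGHTS[c] * s for c, s in assessment.items())
    max_raw = sum(3 * w for w in CRITERIA_WEIGHTS.values())
    return raw / max_raw

THRESHOLD = 0.5  # hypothetical "likely unnatural" cutoff

outbreak = {"biorisk_of_agent": 3, "unusual_strain": 2,
            "epidemic_intensity": 1, "special_insights": 3}
score = gft_score(outbreak)
print(f"likelihood score = {score:.2f} ->",
      "unnatural" if score >= THRESHOLD else "natural")
```

Recalibration then amounts to searching over weight/threshold combinations (238 candidate models in the article) for one that classifies all reference outbreaks correctly.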

Ortwin Renn: Risk Governance Maven

5 July 2019 - 2:26pm
Risk Analysis, Volume 39, Issue 7, Page 1435-1440, July 2019.

From the Editors

5 July 2019 - 2:26pm
Risk Analysis, Volume 39, Issue 7, Page 1433-1434, July 2019.

Issue Information ‐ TOC

5 July 2019 - 2:26pm
Risk Analysis, Volume 39, Issue 7, July 2019.

Health Risk Assessment of Photoresists Used in an Optoelectronic Semiconductor Factory

28 June 2019 - 1:24pm
Abstract

Photoresist materials are indispensable in photolithography, a process used in semiconductor fabrication. The work processes and potential hazards in semiconductor production have raised concerns about adverse health effects. We therefore performed a health risk assessment of occupational exposure to positive photoresists, which are widely used for photolithography in the optoelectronic semiconductor industry, in a single optoelectronic semiconductor factory in Taiwan. Occupational exposure was estimated using the Stoffenmanager® model, with Bayesian modeling incorporating the available personal air sampling data. We examined the composition and by‐products of the photoresists according to descriptions published in the literature and patents; the main constituents assessed were propylene glycol methyl ether acetate (PGMEA), novolac resin, photoactive compound, phenol, cresol, benzene, toluene, and xylene. Reference concentrations for each compound were reassessed and updated where necessary. Calculated hazard quotients were greater than 1 for benzene, phenol, xylene, and PGMEA, indicating the potential for exposures that exceed reference levels. Our health risk assessment suggests that benzene and phenol pose a higher level of risk than is currently acknowledged. Undertaking this form of risk assessment in the workplace design phase could identify compounds of major concern, allow for the early implementation of control measures and monitoring strategies, and thereby reduce the health risks that workers face throughout their careers.
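The screening step described here reduces to computing hazard quotients, HQ = estimated exposure concentration / reference concentration, and flagging compounds with HQ > 1. A minimal sketch with invented concentrations (not the study's measured or modeled values):

```python
# Hazard-quotient screening: HQ = exposure / reference concentration.
# All concentrations below are invented for illustration.

reference_conc = {   # mg/m^3, hypothetical reference concentrations
    "benzene": 0.1, "phenol": 2.0, "xylene": 50.0, "PGMEA": 100.0,
}
estimated_exposure = {  # mg/m^3, hypothetical modeled exposures
    "benzene": 0.25, "phenol": 3.1, "xylene": 60.0, "PGMEA": 130.0,
}

for compound, ref in reference_conc.items():
    hq = estimated_exposure[compound] / ref
    flag = "exceeds reference level" if hq > 1 else "below reference level"
    print(f"{compound}: HQ = {hq:.2f} ({flag})")
```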

How to Integrate Labor Disruption into an Economic Impact Evaluation Model for Postdisaster Recovery Periods

28 June 2019 - 8:08am
Abstract

Evaluating the economic impacts caused by capital destruction is an established approach in disaster management and prevention, but the magnitude of the economic impact of labor disruption on an economic system remains unclear. This article emphasizes the importance of considering labor disruption when evaluating the economic impact of natural disasters. Drawing on disaster and resilience theory, our model integrates the nonlinear recovery of labor losses and the demand for labor from outside the disaster area into a dynamic evaluation of the economic impact during the postdisaster recovery period. We exemplify this through a case study: the flood disaster that occurred in Wuhan city, China, on July 6, 2016 (the "7.6 Wuhan flood disaster"). The results indicate that (i) the indirect economic impact of the "7.6 Wuhan flood disaster" is underestimated by 15.12% if labor disruption is not considered; (ii) the impact on the secondary industry caused by insufficient labor accounts for 42.27% of that industry's total impact, and the corresponding share for the tertiary industry is 36.29%, so losses can be enormous when both industries suffer shocks; and (iii) the agricultural sector of Wuhan experiences a 0.07% increase in output demand, created by the introduction of 50,000 short‐term laborers from outside the disaster area to meet postdisaster reconstruction needs. These results demonstrate the important role of labor disruption and show that it is a nonnegligible component of postdisaster economic recovery and loss reduction.
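To give a feel for the labor-recovery dynamics the model integrates, here is a toy sketch with an invented exponential recovery path and an invented allowance for imported short-term labor. The article's model is far richer (sectoral linkages, demand effects, dynamic output evaluation); this only shows the shape of the labor input over the recovery period.

```python
# Toy labor-recovery path after a disaster shock. Functional form and all
# parameters are invented; not the article's calibrated model.
import math

L0 = 1.0         # pre-disaster labor supply (normalized)
shock = 0.30     # fraction of labor disrupted by the flood
rate = 0.15      # recovery rate per week (hypothetical)
imported = 0.05  # short-term labor brought in from outside the area

for week in range(0, 25, 4):
    recovered = shock * (1 - math.exp(-rate * week))  # nonlinear recovery
    labor = min(L0, L0 - shock + recovered + imported)
    print(f"week {week:2d}: available labor = {labor:.3f} of baseline")
```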

Determinants of Probability Neglect and Risk Attitudes for Disaster Risk: An Online Experimental Study of Flood Insurance Demand among Homeowners

27 June 2019 - 7:00pm
Abstract

Little is known about why individuals place either a high or a very low value on mitigating the risks of disaster‐type events, such as floods. This study uses panel data methods to explore the psychological factors affecting probability neglect of flood risk, which relates to the zero end‐point of the probability weighting function in Prospect Theory, and willingness‐to‐pay for flood insurance. In particular, we focus on explanatory variables capturing anticipatory and anticipated emotions, as well as the threshold of concern. Moreover, results obtained under real and hypothetical incentives are compared in an experiment with high experimental payoffs. Based on our findings, we suggest several policy recommendations to counter individual decision processes that may hinder flood protection efforts.
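For readers unfamiliar with the zero end-point issue, the sketch below evaluates Tversky and Kahneman's one-parameter probability weighting function from Prospect Theory: small probabilities are overweighted, but a probability that is mentally rounded down to zero is neglected entirely, which is the behavior the study probes. The gamma value is the original 1992 estimate for gains.

```python
# Tversky & Kahneman (1992) probability weighting function:
# w(p) = p^gamma / (p^gamma + (1-p)^gamma)^(1/gamma)

def weight(p: float, gamma: float = 0.61) -> float:
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.0, 0.0001, 0.001, 0.01, 0.1):
    print(f"p = {p:<7} w(p) = {weight(p):.4f}")

# Small probabilities get large decision weights (w(0.0001) >> 0.0001), but
# probability neglect corresponds to treating a small p as exactly 0, so
# w collapses to 0 and willingness-to-pay for insurance vanishes.
```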

Optimization of the Aflatoxin Monitoring Costs along the Maize Supply Chain

27 June 2019 - 7:00pm
Abstract

An optimization model was used to gain insight into cost‐effective monitoring plans for aflatoxins along the maize supply chain. The model was based on a typical Dutch maize chain, with maize grown in the Black Sea region and transported by ship to the Netherlands for use as an ingredient in compound feed for dairy cattle. Six scenarios, with different aflatoxin concentrations at harvest and the possibility of aflatoxin production during transport, were used. By minimizing total costs, given parameters such as the aflatoxin concentration, the variance of the sampling plan, and the monitoring and replacement costs, the model optimized the control points (CPs; e.g., after harvest, before or after transport by sea ship), the number of batches sampled at each CP, and the number of samples per batch, while keeping the end‐of‐chain aflatoxin concentration below a predetermined limit. The model showed that, when postharvest aflatoxin production was not possible, it was most cost‐effective to sample all batches and replace contaminated batches directly after harvest, since replacement costs were lowest at the origin of the chain. When aflatoxin production during storage was possible, it was most cost‐effective to sample and replace contaminated batches after storage and transport, to avoid duplicate monitoring and replacement costs before and after storage. The further along the chain a contaminated batch is detected, the more stakeholders are involved, and the higher the replacement and possible recall costs become.
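The structure of the optimization can be conveyed with a tiny enumeration: choose a control point and a number of samples per batch that satisfy an end-of-chain limit at minimum cost. All costs, contamination rates, and detection probabilities below are invented placeholders, not the article's Dutch maize chain data.

```python
# Toy monitoring-plan optimization: enumerate (control point, samples per
# batch) and keep the cheapest plan meeting a residual-contamination limit.

control_points = {  # per-batch costs (invented); replacement is cheaper
    "after_harvest":   {"sample": 40.0, "replace": 200.0},  # at the origin
    "after_transport": {"sample": 40.0, "replace": 500.0},
}

n_batches = 100
p_contam = 0.05        # fraction of contaminated batches (hypothetical)
d = 0.4                # chance that one sample detects contamination
residual_limit = 0.01  # max tolerated contaminated fraction at chain end

best = None
for cp, costs in control_points.items():
    for k in range(1, 11):                  # samples per batch
        p_detect = 1 - (1 - d) ** k
        residual = p_contam * (1 - p_detect)
        if residual > residual_limit:
            continue                        # plan violates the limit
        total = (n_batches * k * costs["sample"]
                 + n_batches * p_contam * p_detect * costs["replace"])
        if best is None or total < best[0]:
            best = (total, cp, k)

cost, cp, k = best
print(f"cheapest compliant plan: {k} samples/batch at {cp}, cost = {cost:.0f}")
```

With these invented numbers the optimum lands at the harvest-side control point, mirroring the abstract's finding that replacement is cheapest at the origin of the chain when no aflatoxin forms later.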

Dread and Risk Elimination Premium for the Value of a Statistical Life

13 June 2019 - 6:34pm
Abstract

The value of a statistical life (VSL) is a widely used measure of the value of mortality risk reduction. Because VSL should reflect preferences and attitudes toward risk, there are reasons to believe that it varies with the type of risk involved. It has been argued that cancer should be considered a "dread disease," which supports the use of a "cancer premium." The objective of this study is to investigate the existence of a cancer premium (for pancreatic cancer and multiple myeloma) relative to road traffic accidents, sudden cardiac arrest, and amyotrophic lateral sclerosis (ALS). Data were collected from 500 individuals aged 50–74 in the Swedish general population using a web‐based questionnaire. Preferences were elicited using the contingent valuation method, and a split‐sample design was applied to test scale sensitivity. VSL differs significantly between contexts, being highest for ALS and lowest for road traffic accidents. A premium (92–113%) for cancer was found relative to road traffic accidents; the premium was higher for the cancer with the shorter time from diagnosis to death. Premiums were also found for sudden cardiac arrest (73%) and ALS (118%) relative to road traffic accidents, and eliminating a risk entirely was associated with a premium of around 20%. This study provides additional evidence that a dread premium and a risk elimination premium exist. These factors should be considered when searching for an appropriate value for economic evaluation and health technology assessment.
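The premiums reported here come from standard VSL arithmetic: VSL is mean willingness-to-pay (WTP) divided by the risk reduction being valued, and a context premium is the percentage difference in VSL relative to the road traffic baseline. A worked sketch with invented WTP values:

```python
# VSL = mean WTP / mortality risk reduction; premium = VSL ratio - 1.
# WTP values below are invented, chosen only to land near the reported
# premiums; they are not the study's data.

risk_reduction = 1e-4  # hypothetical 1-in-10,000 annual risk reduction

wtp = {  # hypothetical mean willingness-to-pay, in currency units
    "road_traffic": 400.0,
    "cancer": 800.0,          # ~100% premium, within the 92-113% range
    "cardiac_arrest": 692.0,  # ~73% premium
}

vsl = {context: w / risk_reduction for context, w in wtp.items()}
base = vsl["road_traffic"]
for context, v in vsl.items():
    premium = (v / base - 1) * 100
    print(f"{context}: VSL = {v:,.0f}, premium vs road = {premium:+.0f}%")
```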

Science for Policy: A Case Study of Scientific Polarization, Values, and the Framing of Risk and Uncertainty

13 June 2019 - 6:31pm
Abstract

It is well documented that more research can lead to hardened positions, particularly when dealing with complex, controversial, and value‐laden issues. This study attempts to unveil the underlying values in a contemporary debate in which both sides use scientific evidence to support their arguments. We analyze the problem framing, vocabulary, interpretation of evidence, and policy recommendations, with particular attention to the framing of nature and technology. We find clear differences between the two arguments. One side stresses that there is no evidence that the present approach is causing harm to humans or the environment, does not dwell on uncertainties to that end, references nature's ability to handle the problem, and indicates distrust in technological solutions. In contrast, the other side focuses on uncertainties, particularly the lack of knowledge about potential environmental effects, and signals trust in technological development and human intervention as the solution. Our study suggests that the two sides' diverging interpretations are tied to their perceptions of nature: vulnerable to human activities versus robust and able to handle human impacts. The two sides also seem to hold diverging views of technology, but there are indications that this might be rooted in their perceptions of governance and economy rather than in technology per se. We conclude that there is a need to further investigate how scientific arguments are related to worldviews, to see how (if at all) worldview typologies can help us understand how value‐based judgments are embedded in science advice, and to assess the impact these have on policy preferences.

Modeling the Cost Effectiveness of Fire Protection Resource Allocation in the United States: Models and a 1980–2014 Case Study

13 June 2019 - 6:31pm
Abstract

The estimated cost of fire in the United States is about $329 billion a year, yet the literature offers few tools for measuring the effectiveness of investment or for allocating fire protection resources optimally. This article fills these gaps by creating data‐driven empirical and theoretical models to study the effectiveness of nationwide fire protection investment in reducing economic and human losses. The regression between investment and loss vulnerability shows high R² values (≈0.93). This article also contributes to the literature by modeling strategic (national‐level or state‐level) resource allocation (RA) for fire protection with equity‐efficiency trade‐off considerations, whereas the existing literature focuses on operational‐level RA. This model and its numerical analyses provide techniques and insights to aid the strategic decision‐making process. The results from this model are used to calculate fire risk scores for various geographic regions, which can serve as an indicator of fire risk. A case study of federal fire grant allocation is used to validate and demonstrate the utility of the optimal RA model. The results also identify potential underinvestment and overinvestment in fire protection in certain regions. This article presents scenarios in which the proposed model outperforms the existing RA scheme, as measured by the correlation between allocated resources and the actual number of fire incidents. These findings offer novel insights to policymakers and analysts in fire protection and safety that would help in mitigating economic costs and saving lives.
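The empirical starting point is an ordinary regression of loss vulnerability on investment. The minimal least-squares sketch below shows the slope and R² computation on invented data points; the article reports R² ≈ 0.93 on its national 1980–2014 data.

```python
# Simple ordinary least squares with R^2, on invented points.

def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

investment = [1.0, 1.5, 2.0, 2.5, 3.0]     # hypothetical spending levels
vulnerability = [9.1, 7.8, 6.2, 5.1, 4.2]  # hypothetical loss vulnerability

slope, intercept, r2 = ols(investment, vulnerability)
print(f"slope = {slope:.2f}, intercept = {intercept:.2f}, R^2 = {r2:.3f}")
```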

On the Limits of the Precautionary Principle

13 June 2019 - 6:31pm
Abstract

The precautionary principle (PP) is an influential principle of risk management. It has been widely introduced into environmental legislation, and it plays an important role in most international environmental agreements. Yet, there is little consensus on precisely how to understand and formulate the principle. In this article I prove some impossibility results for two plausible formulations of the PP as a decision‐rule. These results illustrate the difficulty in making the PP consistent with the acceptance of any tradeoffs between catastrophic risks and more ordinary goods. How one interprets these results will, however, depend on one's views and commitments. For instance, those who are convinced that the conditions in the impossibility results are requirements of rationality may see these results as undermining the rationality of the PP. But others may simply take these results to identify a set of purported rationality conditions that defenders of the PP should not accept, or to illustrate types of situations in which the principle should not be applied.

A CGE Framework for Modeling the Economics of Flooding and Recovery in a Major Urban Area

13 June 2019 - 6:31pm
Abstract

Coastal cities around the world have experienced large costs from major flooding events in recent years, and climate change is predicted to increase the likelihood of flooding through sea level rise and more frequent severe storms. To plan future development and adaptation, cities must know the magnitude of the losses associated with these events and how they can be reduced. Losses are often calculated from insurance claims or surveys of flood victims; however, this largely neglects the losses due to the disruption of economic activity. We use a forward‐looking dynamic computable general equilibrium (CGE) model to study how a local economy responds to a flood, focusing on the subsequent recovery and reconstruction. Initial damage is modeled as a shock to the capital stock, and recovery requires rebuilding that stock. We apply the model to Vancouver, British Columbia, considering a flood scenario that causes total capital damage of $14.6 billion spread across five municipalities. The GDP loss relative to a no‐flood scenario is relatively long‐lasting: 2.0% ($2.2 billion) in the first year after the flood, 1.7% ($1.9 billion) in the second year, and 1.2% ($1.4 billion) in the fifth year.
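A one-sector toy model conveys the recovery mechanism: the flood shock destroys capital, output falls with the capital stock, and reconstruction gradually closes the gap. The output rule and all parameters below are invented (the article uses a full forward-looking multi-municipality CGE model); only the $14.6 billion damage figure comes from the abstract.

```python
# Stylized capital-shock-and-rebuild loop, not the article's CGE model.

K_baseline = 730.0  # capital stock, $ billions (hypothetical)
damage = 14.6       # capital destroyed, $ billions (from the abstract)
alpha = 0.35        # capital share in a Cobb-Douglas-style output rule
rebuild_rate = 0.35 # share of the remaining capital gap closed each year

K = K_baseline - damage
for year in range(1, 6):
    output_loss = 1 - (K / K_baseline) ** alpha  # relative GDP shortfall
    print(f"year {year}: GDP loss = {output_loss:.2%}")
    K += rebuild_rate * (K_baseline - K)         # reconstruction investment
```

This toy understates the losses the article reports because it omits the labor, trade, and demand disruptions a CGE model captures; it only reproduces the qualitative pattern of a persistent, slowly shrinking GDP gap.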

Toward an Epidemiology of Safety and Security Risks: An Organizational Vulnerability Assessment in International Airports

13 June 2019 - 6:31pm
Abstract

International airports are complex sociotechnical systems that have an intrinsic potential to develop safety and security disruptions. In the absence of appropriate defenses, and when the potential for disruption is neglected, organizational crises can occur and jeopardize aviation services. This investigation examines the ways in which modern international airports can be "authors of their own misfortune" by adopting practices, attitudes, and behaviors that increase their overall level of vulnerability. A sociotechnical perspective, the macroergonomic approach, is applied to detect potential organizational determinants of vulnerability in airport operations, and the case study of international airports presented here is grounded in qualitative data. Findings highlight that systemic weaknesses frequently reside in areas at the intersection of physical, organizational, and social spaces. Specific pathways of vulnerability can be traced across these areas, involving the following systemic layers: individual, task, tools and technology, environment, and organization. This investigation expands the existing literature on the dynamics that characterize crisis incubation in multiorganization, multistakeholder systems such as international airports, and it provides practical recommendations to help airport managers detect early symptoms of organizational vulnerability.
