Risk Analysis: An International Journal

Table of Contents for Risk Analysis. List of articles from both the latest and EarlyView issues.

Estimating Evacuation Shelter Deficits in the Houston–Galveston Metropolitan Area

23 January 2020 - 11:34am
Abstract

Evacuation is frequently used by emergency managers and other officials as part of an overall approach to reducing the morbidity and mortality associated with hurricane landfall. In this study, the evacuation shelter capacity of the Houston–Galveston Metropolitan Statistical Area (MSA) was spatially assessed and shelter deficits in the region were estimated. These data provide essential information needed to eliminate shelter deficits and ensure a successful evacuation from a future storm. Spatial statistical methods (Global Moran's I, Anselin Local Moran's I [Local Indicators of Spatial Association, LISA], and Hot Spot Analysis [Getis‐Ord Gi*]) were used to assess regional spatial autocorrelation and clustering of evacuation shelters in the Houston–Galveston MSA. Shelter deficits were estimated in four ways: the aggregate deficit for the Houston–Galveston MSA, by evacuation Zip‐Zone, by county, and by distance (radius) from the evacuation Zip‐Zones. Evacuation shelters were disproportionately distributed in the region, with lower‐capacity shelters clustered closer to the evacuation Zip‐Zones (50 miles from the Coastal Zip‐Zone) and higher‐capacity shelters clustered farther away (120 miles from the Coastal Zip‐Zone). The aggregate shelter deficit for the Houston–Galveston MSA was 353,713 persons. To reduce morbidity and mortality associated with future hurricanes in the Houston–Galveston MSA, authorities should consider developing and implementing policies that improve the evacuation shelter capacity of the region. Eliminating shelter deficits, which has been done successfully in the state of Florida, is an essential element of protecting the public from hurricane impacts.
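
As a rough illustration of the first of these statistics, the sketch below computes Global Moran's I for a handful of hypothetical shelter capacities under inverse-distance weights. The locations, capacities, and weighting scheme are invented stand-ins, not the study's data.

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x under spatial weight matrix w (zero diagonal)."""
    n = len(x)
    z = x - x.mean()
    num = n * (w * np.outer(z, z)).sum()   # n * sum_ij w_ij * z_i * z_j
    den = w.sum() * (z ** 2).sum()         # S0 * sum_i z_i^2
    return num / den

# Hypothetical shelter capacities and locations (miles from the Coastal Zip-Zone).
capacity = np.array([250.0, 300.0, 180.0, 900.0, 1100.0, 950.0])
dist_from_coast = np.array([45.0, 50.0, 55.0, 115.0, 120.0, 125.0])

# Inverse-distance spatial weights between shelters, zero on the diagonal.
d = np.abs(dist_from_coast[:, None] - dist_from_coast[None, :])
w = np.zeros_like(d)
np.divide(1.0, d, out=w, where=d > 0)

# Values above -1/(n-1) indicate positive spatial autocorrelation (clustering).
print(f"Moran's I: {morans_i(capacity, w):.3f}")
```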

Risk or Efficacy? How Psychological Distance Influences Climate Change Engagement

20 January 2020 - 8:00pm
Abstract

Construal‐level theory suggests that high‐level abstract features weigh more in people's decision‐making at farther distance, while low‐level concrete features weigh more at closer distance. Based on this, we propose that psychological distance will influence the effect of risk versus efficacy framing on climate change engagement. In particular, risk perception related to the end‐state expectancy of climate change mitigation should influence people's climate change engagement at farther distance. In contrast, efficacy perception related to the perceived feasibility of attaining end‐state goals should influence engagement at closer distance. Results from an experimental survey based on a national sample that is both demographically and geographically representative (N = 1,282) supported our proposition. At closer spatial distance, perceived efficacy boosted by efficacy framing increased participants’ intention to perform climate mitigation behaviors. In contrast, at farther distance, risk framing increased behavioral intention through heightened risk perception. Based on these findings, we suggest that when communicating distant and abstract risks, highlighting their disastrous impacts may better motivate action. In contrast, when communicating impending and concrete risks, stressing the feasibility of action may have stronger motivational potential.

Global Transmission of Live Polioviruses: Updated Dynamic Modeling of the Polio Endgame

20 January 2020 - 5:00pm
Abstract

Nearly 20 years after the year 2000 target for global wild poliovirus (WPV) eradication, live polioviruses continue to circulate with all three serotypes posing challenges for the polio endgame. We updated a global differential equation‐based poliovirus transmission and stochastic risk model to include programmatic and epidemiological experience through January 2020. We used the model to explore the likely dynamics of poliovirus transmission for 2019–2023, which coincides with a new Global Polio Eradication Initiative Strategic Plan. The model stratifies the global population into 72 blocks, each containing 10 subpopulations of approximately 10.7 million people. Exported viruses go into subpopulations within the same block and within groups of blocks that represent large preferentially mixing geographical areas (e.g., continents). We assign representative World Bank income levels to the blocks along with polio immunization and transmission assumptions, which capture some of the heterogeneity across countries while still focusing on global poliovirus transmission dynamics. We also updated estimates of reintroduction risks using available evidence. The updated model characterizes transmission dynamics and resulting polio cases consistent with the evidence through 2019. Based on recent epidemiological experience and prospective immunization assumptions for the 2019–2023 Strategic Plan, the updated model does not show successful eradication of serotype 1 WPV by 2023 or successful cessation of oral poliovirus vaccine serotype 2‐related viruses.
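
For readers unfamiliar with this class of model, the sketch below shows a deliberately tiny two-subpopulation SIR-style transmission loop with cross-subpopulation export of infections. The structure and every parameter value are illustrative stand-ins; the calibrated 72-block global model described above is far richer.

```python
import numpy as np

# Two subpopulations with different transmission rates; a small fraction of
# contacts is "exported" to the other subpopulation. All values are invented.
beta = np.array([0.5, 0.3])   # transmission rates per subpopulation (per day)
export = 0.01                 # fraction of contacts made in the other subpopulation
gamma = 1.0 / 28.0            # recovery rate (per day)

def step(S, I, R, dt=1.0):
    # Force of infection mixes local prevalence with exported prevalence.
    N = S + I + R
    prev = I / N
    mix = (1 - export) * prev + export * prev[::-1]
    new_inf = beta * mix * S * dt
    rec = gamma * I * dt
    return S - new_inf, I + new_inf - rec, R + rec

S = np.array([1e6, 1e6]); I = np.array([10.0, 0.0]); R = np.zeros(2)
for day in range(365):
    S, I, R = step(S, I, R)
print(f"infectious after one year: {I.round(1)}")
```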

Against the De Minimis Principle

20 January 2020 - 12:24pm
Abstract

According to the class of de minimis decision principles, a risk can be ignored (or at least treated very differently from other risks) if it is sufficiently small. In this article, we argue that a de minimis threshold has no place in a normative theory of decision making: applying the principle will either recommend ignoring risks that should not be ignored (e.g., the sure death of a person), or the principle will be unusable by ordinary bounded and information‐constrained agents.

Human Factors Analysis for Maritime Accidents Based on a Dynamic Fuzzy Bayesian Network

15 January 2020 - 8:27pm
Abstract

Human factors are widely regarded as major contributors to failures of maritime accident prevention systems. Conventional methods for human factor assessment, especially quantitative techniques such as fault trees and bow‐ties, are static and cannot handle uncertainty in the model, which limits their application to human factors risk analysis. To alleviate these drawbacks, the present study introduces a new human factor analysis framework called the multidimensional analysis model of accident causes (MAMAC). MAMAC combines the human factors analysis and classification system with business process management. In addition, intuitionistic fuzzy set theory and Bayesian networks are integrated into MAMAC to form a comprehensive dynamic human factors analysis model characterized by flexibility and uncertainty handling. The proposed model is tested on maritime accident scenarios from a sand carrier accident database in China to investigate the human factors involved, and the top 10 most highly contributing primary events associated with the human factors leading to sand carrier accidents are identified. According to the results of this study, direct human factors, classified as unsafe acts, should not be the sole focus for maritime investigators and scholars. Instead, unsafe preconditions and unsafe supervision rank as the top two considerations for human factors analysis, especially supervision failures of shipping companies and ship owners. Moreover, potential safety countermeasures for the most highly contributing human factors are proposed in this article. Finally, an application of the proposed model verifies its advantages in calculating the failure probability of accidents induced by human factors.
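
One ingredient of such a fuzzy-Bayesian workflow is turning expert judgments expressed as intuitionistic fuzzy pairs into crisp probabilities that a Bayesian network can consume. The sketch below shows one common, simple defuzzification rule; the events, experts, and numbers are hypothetical, and the full MAMAC pipeline is not reproduced.

```python
# Expert assessments as intuitionistic fuzzy pairs (membership mu,
# non-membership nu, with mu + nu <= 1). Events and values are hypothetical.
experts = {
    "inadequate supervision": [(0.6, 0.3), (0.7, 0.2), (0.5, 0.4)],
    "unsafe precondition":    [(0.4, 0.5), (0.5, 0.3), (0.6, 0.3)],
}

def defuzzify(pairs):
    # Average across experts, then split the hesitancy margin (1 - mu - nu)
    # evenly between membership and non-membership, a common simple rule.
    mu = sum(p[0] for p in pairs) / len(pairs)
    nu = sum(p[1] for p in pairs) / len(pairs)
    return mu + 0.5 * (1.0 - mu - nu)

for event, pairs in experts.items():
    print(f"P({event}) ~ {defuzzify(pairs):.3f}")
```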

Probabilistic Assessment of the Failure Risk of the Europa Clipper Spacecraft due to Radiation

15 January 2020 - 8:26pm
Abstract

The Europa mission approved in 2019 is still in the development phase. It is designed to conduct a detailed reconnaissance of Europa, a moon of Jupiter that could possibly support life as we know it. This article is based on a top‐down approach (mission → system → subsystems → components) to model the probability of mission failure. The focus here is on the case where the (uncertain) radiation load exceeds the (uncertain) capacity of critical subsystems of the spacecraft. The model is an illustrative quantification of the uncertainties about (1) the complex external radiation environment in repeated exposures, (2) the effectiveness of the shielding in different zones of the spacecraft, and (3) the components’ capacities, modeling all three as dynamic random variables. A simulation including a sensitivity analysis is used to obtain the failure probability of the whole mission over forty‐five revolutions around Jupiter. This article illustrates how probabilistic risk analysis based on engineering models, test results, and expert opinions can be used in the early stages of the design of space missions, when uncertainties are large. It also describes the optimization of the spacecraft design, taking into account the decisionmakers’ risk attitude and the mission resource constraints.
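
The load-versus-capacity logic lends itself to a compact Monte Carlo illustration: sample an uncertain per-orbit dose and an uncertain component capacity, accumulate dose over forty-five orbits, and count the runs where capacity is exceeded. The distributions and parameters below are invented and bear no relation to actual Europa Clipper engineering numbers.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims, n_orbits = 100_000, 45

# Illustrative lognormal per-orbit radiation dose behind shielding and a
# lognormal component capacity (arbitrary units); purely assumed values.
dose = rng.lognormal(mean=0.0, sigma=0.5, size=(n_sims, n_orbits))
capacity = rng.lognormal(mean=4.0, sigma=0.3, size=n_sims)

total_dose = dose.sum(axis=1)      # accumulated load over 45 revolutions
failed = total_dose > capacity     # load exceeds capacity -> mission failure
print(f"estimated mission failure probability: {failed.mean():.4f}")
```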

Operational Networks: Adaptation to Extreme Events in China

14 January 2020 - 1:48pm
Abstract

Natural hazards pose an increasing challenge to public administrators as the frequency, costs, and consequences of extreme events escalate in a complex, interdependent world. This study examines organizational networks as instruments for mobilizing collective response to extreme events; effective network design, however, has been elusive. Governments have focused on planned networks to anticipate risk before hazards occur; communities have formed emergent networks as voluntary efforts after the event. Using a framework of complex adaptive systems, we identify operational networks that adapt to their immediate context in real time, using technologies to support the search, exchange, and feedback of information to enable informed, collective action. Applying mixed research methods (documentary analysis of laws, policies, and procedures; content analysis of news articles; onsite observation; and semistructured interviews with experienced personnel), we document operational networks as a distinct form of multiorganizational response to urgent events that combines the structure of designated authority with the flexibility of information technologies. The integration of planned and emergent organizational forms into operational networks is measured through External/Internal (E/I) index analysis, based on empirical data collected on response systems that formed following the 2008 Wenchuan and 2013 Lushan earthquakes in the centralized administrative context of China. Findings show that planned networks provide the organizational structure and initial legitimacy essential for operational networks to form, but ready access to information technology (cell phones, short‐wave radio systems, internet access) enables the rapid communication and exchange of information essential for flexible adaptation in real time to meet urgent needs.
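
The E/I index itself (due to Krackhardt and Stern) is simple to compute: (E - I) / (E + I), where E counts ties crossing a group boundary and I counts ties within groups. A minimal sketch with hypothetical organizations and ties:

```python
# External-Internal (E/I) index over a planned/emergent partition.
# The organizations, group labels, and ties below are hypothetical.
group = {"army": "planned", "militia": "emergent", "ngo": "emergent",
         "county_gov": "planned", "volunteers": "emergent"}
ties = [("army", "county_gov"), ("army", "militia"), ("county_gov", "ngo"),
        ("militia", "volunteers"), ("ngo", "volunteers"), ("army", "volunteers")]

external = sum(group[a] != group[b] for a, b in ties)  # ties crossing the boundary
internal = len(ties) - external                        # ties within a group
print(f"E/I index: {(external - internal) / (external + internal):+.2f}")
# +1.0 means every tie crosses the planned/emergent boundary; -1.0 means none do.
```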

Multicriteria Decision Framework for Cybersecurity Risk Assessment and Management

6 January 2020 - 8:00pm
Abstract

Risk assessors and managers face many difficult challenges related to novel cyber systems. Among these challenges are the constantly changing nature of cyber systems caused by technical advances, their distribution across the physical, information, and sociocognitive domains, and the complex network structures often including thousands of nodes. Here, we review probabilistic and risk‐based decision‐making techniques applied to cyber systems and conclude that existing approaches typically do not address all components of the risk assessment triplet (threat, vulnerability, consequence) and lack the ability to integrate across multiple domains of cyber systems to provide guidance for enhancing cybersecurity. We present a decision‐analysis‐based approach that quantifies threat, vulnerability, and consequences through a set of criteria designed to assess the overall utility of cybersecurity management alternatives. The proposed framework bridges the gap between risk assessment and risk management, allowing an analyst to ensure a structured and transparent process of selecting risk management alternatives. The use of this technique is illustrated for a hypothetical, but realistic, case study exemplifying the process of evaluating and ranking five cybersecurity enhancement strategies. The approach presented does not necessarily eliminate the biases and subjectivity inherent in selecting countermeasures, but provides justifiable methods for selecting risk management actions consistent with stakeholder and decisionmaker values and technical data.
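
A weighted-sum scoring of management alternatives is one simple way to operationalize such a criteria-based ranking. The sketch below scores five hypothetical strategies; the criteria, scores, and weights are invented rather than drawn from the article.

```python
import numpy as np

# Hypothetical scores (0-1, higher is better) of five cybersecurity strategies
# on criteria spanning the risk triplet plus cost; weights reflect one assumed
# stakeholder profile.
criteria = ["threat reduction", "vulnerability reduction",
            "consequence mitigation", "cost"]
weights = np.array([0.30, 0.30, 0.25, 0.15])
scores = np.array([
    [0.7, 0.5, 0.6, 0.4],   # S1: network segmentation
    [0.5, 0.8, 0.4, 0.6],   # S2: patch automation
    [0.6, 0.4, 0.8, 0.3],   # S3: incident response team
    [0.4, 0.6, 0.5, 0.9],   # S4: user training
    [0.8, 0.7, 0.7, 0.2],   # S5: zero-trust overhaul
])

utility = scores @ weights                      # overall utility per strategy
for rank, i in enumerate(np.argsort(-utility), start=1):
    print(f"{rank}. S{i + 1}  utility = {utility[i]:.3f}")
```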

Probability Size Matters: The Effect of Foreground‐Only versus Foreground+Background Graphs on Risk Aversion Diminishes with Larger Probabilities

6 January 2020 - 5:50pm
Abstract

Graphs are increasingly recommended for improving decision‐making and promoting risk‐avoidant behaviors. Graphs that depict only the number of people affected by a risk (“foreground‐only” displays) tend to increase perceived risk and risk aversion (e.g., willingness to get vaccinated), as compared to graphs that also depict the number of people at risk for harm (“foreground+background” displays). However, previous research examining these “foreground‐only effects” has focused on relatively low‐probability risks (<10%), limiting generalizability to communications about larger risks. In two experiments, we systematically investigated the moderating role of probability size on foreground‐only effects, using a wide range of probability sizes (from 0.1% to 40%). Additionally, we examined the moderating role of the size of the risk reduction, that is, the extent to which a protective behavior reduces the risk. Across both experiments, foreground‐only effects on perceived risk and risk aversion were weaker for larger probabilities. Experiment 2 also revealed that foreground‐only effects were weaker for smaller risk reductions, while foreground‐only displays decreased understanding of absolute risk magnitudes independently of probability size. These findings suggest that the greater effectiveness of foreground‐only versus foreground+background displays for increasing perceived risk and risk aversion diminishes with larger probability sizes and smaller risk reductions. Moreover, if the goal is to promote understanding of absolute risk magnitudes, foreground+background displays should be used rather than foreground‐only displays regardless of probability size. Our findings also help to refine and extend existing theoretical accounts of foreground‐only effects to situations involving a wide range of probability sizes.

Issue Information ‐ TOC

6 January 2020 - 11:04am
Risk Analysis, Volume 40, Issue 1, January 2020.

Interdependent Network Recovery Games

6 January 2020 - 11:04am
Abstract

Recovery of interdependent infrastructure networks in the presence of catastrophic failure is crucial to the economy and welfare of society. Recently, centralized methods have been developed to address optimal resource allocation in postdisaster recovery scenarios of interdependent infrastructure systems that minimize total cost. In real‐world systems, however, multiple independent, possibly noncooperative, utility network controllers are responsible for making recovery decisions, resulting in suboptimal decentralized processes. With the goal of minimizing recovery cost, a best‐case decentralized model allows controllers to develop a full recovery plan and negotiate until all parties are satisfied (an equilibrium is reached). Such a model is computationally intensive for planning and negotiating, and time is a crucial resource in postdisaster recovery scenarios. Furthermore, in this work, we prove this best‐case decentralized negotiation process could continue indefinitely under certain conditions. Accounting for network controllers' urgency in repairing their system, we propose an ad hoc sequential game‐theoretic model of interdependent infrastructure network recovery represented as a discrete time noncooperative game between network controllers that is guaranteed to converge to an equilibrium. We further reduce the computation time needed to find a solution by applying a best‐response heuristic and prove bounds on ε‐Nash equilibrium, where ε depends on problem inputs. We compare best‐case and ad hoc models on an empirical interdependent infrastructure network in the presence of simulated earthquakes to demonstrate the extent of the tradeoff between optimality and computational efficiency. Our method provides a foundation for modeling sociotechnical systems in a way that mirrors restoration processes in practice.
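
The best-response idea can be illustrated with a toy two-controller game: each controller repeatedly switches to its cheapest repair schedule given the other's current choice, stopping once neither can improve by more than ε. The cost matrices below are invented; real interdependent-network payoffs would come from a recovery model.

```python
import numpy as np

# cost_a[i, j]: controller A's cost when A plays schedule i and B plays j;
# cost_b likewise. Costs depend on both choices because the networks are
# interdependent. All numbers are hypothetical.
cost_a = np.array([[4.0, 6.0, 5.0], [3.0, 7.0, 4.0], [5.0, 5.0, 6.0]])
cost_b = np.array([[5.0, 3.0, 6.0], [4.0, 4.0, 5.0], [6.0, 2.0, 7.0]])
eps, a, b = 1e-6, 0, 0

for _ in range(100):  # stop at an eps-equilibrium or after 100 rounds
    best_a = int(np.argmin(cost_a[:, b]))     # A's best response to B
    best_b = int(np.argmin(cost_b[a, :]))     # B's best response to A
    gain_a = cost_a[a, b] - cost_a[best_a, b]
    gain_b = cost_b[a, b] - cost_b[a, best_b]
    if gain_a <= eps and gain_b <= eps:
        break                                 # neither can improve by > eps
    if gain_a >= gain_b:                      # larger improvement moves first
        a = best_a
    else:
        b = best_b

print(f"eps-equilibrium schedules: A={a}, B={b}")
```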

Time‐Varying Risk Measurement for Ship Collision Prevention

6 January 2020 - 11:04am
Abstract

In this article, we propose an innovative time‐varying collision risk (TCR) measurement for ship collision prevention. The proposed measurement considers the level of danger of the approaching ships and the capability of a ship to prevent collisions. We define the TCR as the probability of the overlap of ships’ positions in the future, given the uncertainty of maneuvers. Two sets are identified: (1) the velocity obstacle set, the maneuvers of the own ship that lead to collisions with target ships, and (2) the reachable velocity set, the maneuvers that the own ship can reach given its maneuverability. We then measure the TCR as the time‐dependent percentage of overlap between these two sets. Several scenarios are presented to illustrate how the proposed measurement identifies the time‐varying risk levels, and how the approach can be used as an intuitively understandable tool for collision avoidance.
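
A crude way to approximate such an overlap is to sample candidate own-ship velocities and check which ones carry the relative position into a safety disc within a look-ahead horizon; the fraction of reachable velocities that do so estimates the TCR. The geometry, ship domain, and all numbers below are simplified assumptions, not the article's formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Own ship at the origin; one target ship. Velocities in knots, positions in nmi.
target_pos = np.array([3.0, 2.0])
target_vel = np.array([-6.0, 0.0])
safe_radius = 0.5    # combined ship-domain radius (nmi)
horizon = 0.5        # look-ahead time (hours)

def leads_to_collision(v_own, n_steps=40):
    # Straight-line check: does the relative position enter the safety disc?
    for t in np.linspace(0.0, horizon, n_steps):
        rel = target_pos + (target_vel - v_own) * t
        if np.hypot(rel[0], rel[1]) < safe_radius:
            return True
    return False

# Reachable velocity set crudely sampled over attainable speeds and courses.
speeds = rng.uniform(0.0, 15.0, 5_000)
angles = rng.uniform(0.0, 2 * np.pi, 5_000)
samples = np.column_stack([speeds * np.cos(angles), speeds * np.sin(angles)])

tcr = np.mean([leads_to_collision(v) for v in samples])
print(f"time-varying collision risk (overlap fraction): {tcr:.3f}")
```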

Engineering Systems and Risk Analytics

6 January 2020 - 11:04am
Risk Analysis, Volume 40, Issue 1, Page 1-7, January 2020.

Advances on a Decision Analytic Approach to Exposure‐Based Chemical Prioritization

6 January 2020 - 11:04am
Abstract

The volume and variety of manufactured chemicals is increasing, although little is known about the risks associated with the frequency and extent of human exposure to most chemicals. Both the EPA and the recent signing of the Lautenberg Act have signaled the need for high‐throughput methods to characterize and screen chemicals based on exposure potential, so that more comprehensive toxicity research can be informed. Prior work by Mitchell et al., which used multicriteria decision analysis tools to prioritize chemicals for further research, is enhanced here, resulting in a high‐level chemical prioritization tool for risk‐based screening. Reliable exposure information is a key gap in currently available engineering analytics to support predictive environmental and health risk assessments. An elicitation with 32 experts informed the relative prioritization of risks from chemical properties and human use factors, and the values for each chemical associated with each metric were approximated with data from EPA's CP_CAT database. Three different versions of the model were evaluated using distinct weight profiles, resulting in three different ranked chemical prioritizations with only a small degree of variation across weight profiles. Future work will aim to include greater input from human factors experts and better define the qualitative metrics.
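
The effect of distinct weight profiles on a ranking is easy to demonstrate: score each chemical as a weighted sum of its criteria values under each profile and compare the resulting orders. The chemicals, criteria values, and weights below are illustrative stand-ins, not CP_CAT data.

```python
import numpy as np

chemicals = ["chem A", "chem B", "chem C", "chem D"]
# Columns: persistence, exposure frequency, use volume (all scaled to 0-1);
# values are invented for illustration.
values = np.array([[0.8, 0.2, 0.5],
                   [0.4, 0.9, 0.3],
                   [0.6, 0.6, 0.7],
                   [0.3, 0.4, 0.9]])
profiles = {"property-heavy": np.array([0.6, 0.2, 0.2]),
            "exposure-heavy": np.array([0.2, 0.6, 0.2]),
            "balanced":       np.array([1/3, 1/3, 1/3])}

# Rank chemicals under each weight profile and compare the orderings.
for name, w in profiles.items():
    order = np.argsort(-(values @ w))
    print(f"{name:>15}: {' > '.join(chemicals[i] for i in order)}")
```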

An Integrated Scenario Ensemble‐Based Framework for Hurricane Evacuation Modeling: Part 2—Hazard Modeling

6 January 2020 - 11:04am
Abstract

Hurricane track and intensity can change rapidly in unexpected ways, thus making predictions of hurricanes and related hazards uncertain. This inherent uncertainty often translates into suboptimal decision‐making outcomes, such as unnecessary evacuation. Representing this uncertainty is thus critical in evacuation planning and related activities. We describe a physics‐based hazard modeling approach that (1) dynamically accounts for the physical interactions among hazard components and (2) captures hurricane evolution uncertainty using an ensemble method. This loosely coupled model system provides a framework for probabilistic water inundation and wind speed levels for a new, risk‐based approach to evacuation modeling, described in a companion article in this issue. It combines the Weather Research and Forecasting (WRF) meteorological model, the Coupled Routing and Excess STorage (CREST) hydrologic model, and the ADvanced CIRCulation (ADCIRC) storm surge, tide, and wind‐wave model to compute inundation levels and wind speeds for an ensemble of hurricane predictions. Perturbations to WRF's initial and boundary conditions and different model physics/parameterizations generate an ensemble of storm solutions, which are then used to drive the coupled hydrologic + hydrodynamic models. Hurricane Isabel (2003) is used as a case study to illustrate the ensemble‐based approach. The inundation, river runoff, and wind hazard results are strongly dependent on the accuracy of the mesoscale meteorological simulations, which improves with decreasing lead time to hurricane landfall. The ensemble envelope brackets the observed behavior while providing “best‐case” and “worst‐case” scenarios for the subsequent risk‐based evacuation model.
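
Once ensemble members are available, the envelope and exceedance probabilities reduce to per-cell statistics across members. The sketch below uses synthetic inundation depths in place of actual WRF/CREST/ADCIRC output.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for an ensemble: 20 members x 100 coastal grid cells of
# peak inundation depth (m). A real workflow would read coupled model output.
members = rng.gamma(shape=2.0, scale=0.8, size=(20, 100))

best_case = members.min(axis=0)     # lower envelope per grid cell
worst_case = members.max(axis=0)    # upper envelope per grid cell
exceed_1m = (members > 1.0).mean(axis=0)  # per-cell probability of > 1 m depth

print(f"cells with P(depth > 1 m) >= 0.5: {(exceed_1m >= 0.5).sum()}")
print(f"envelope at cell 0: {best_case[0]:.2f} to {worst_case[0]:.2f} m")
```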

An Integrated Scenario Ensemble‐Based Framework for Hurricane Evacuation Modeling: Part 1—Decision Support System

6 January 2020 - 11:04am
Abstract

This article introduces a new integrated scenario‐based evacuation (ISE) framework to support hurricane evacuation decision making. It explicitly captures the dynamics, uncertainty, and human–natural system interactions that are fundamental to the challenge of hurricane evacuation, but have not been fully captured in previous formal evacuation models. The hazard is represented with an ensemble of probabilistic scenarios, population behavior with a dynamic decision model, and traffic with a dynamic user equilibrium model. The components are integrated in a multistage stochastic programming model that minimizes risk and travel times to provide a tree of evacuation order recommendations and an evaluation of the risk and travel time performance for that solution. The ISE framework recommendations offer an advance in the state of the art because they: (1) are based on an integrated hazard assessment (designed to ultimately include inland flooding), (2) explicitly balance the sometimes competing objectives of minimizing risk and minimizing travel time, (3) offer a well‐hedged solution that is robust under the range of ways the hurricane might evolve, and (4) leverage the substantial value of increasing information (or decreasing degree of uncertainty) over the course of a hurricane event. A case study for Hurricane Isabel (2003) in eastern North Carolina is presented to demonstrate how the framework is applied, the type of results it can provide, and how it compares to available methods of a single scenario deterministic analysis and a two‐stage stochastic program.
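
A toy two-stage version of the scenario-tree idea (the article's model is a richer multistage program) conveys the core tradeoff: commit to an evacuation order before the track is known, or wait and face scenario-dependent risk. All probabilities, weights, and costs below are invented.

```python
# Two hurricane scenarios with assumed probabilities: landfall here or a miss.
scenarios = [  # (probability, landfall_here)
    (0.3, True),
    (0.7, False),
]
RISK_W = 10.0  # assumed weight on risk relative to travel-time hours

def cost(decision, landfall):
    if decision == "evacuate_now":
        return RISK_W * 0.0 + 8.0  # safe in all scenarios, but everyone travels
    # Waiting is cheap if the storm misses, costly (late and risky) if it hits.
    return (RISK_W * 5.0 + 12.0) if landfall else 1.0

# Pick the first-stage decision minimizing expected weighted risk + travel time.
for decision in ("evacuate_now", "wait"):
    expected = sum(p * cost(decision, hit) for p, hit in scenarios)
    print(f"{decision:>13}: expected cost {expected:.1f}")
```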

A Framework for Understanding Uncertainty in Seismic Risk Assessment

6 January 2020 - 11:04am
Abstract

A better understanding of the uncertainty that exists in models used for seismic risk assessment is critical to improving risk‐based decisions pertaining to earthquake safety. Current models estimating the probability of collapse of a building do not consider comprehensively the nature and impact of uncertainty. This article presents a model framework to enhance seismic risk assessment and thus gives decisionmakers a fuller understanding of the nature and limitations of the estimates. This can help ensure that risks are not over‐ or underestimated and the value of acquiring accurate data is appreciated fully. The methodology presented provides a novel treatment of uncertainties in input variables, their propagation through the model, and their effect on the results. The study presents ranges of possible annual collapse probabilities for different case studies on buildings in different parts of the world, exposed to different levels of seismicity, and with different vulnerabilities. A global sensitivity analysis was conducted to determine the significance of uncertain variables. Two key outcomes are (1) that the uncertainty in ground‐motion conversion equations has the largest effect on the uncertainty in the calculation of annual collapse probability; and (2) the vulnerability of a building appears to have an effect on the range of annual collapse probabilities produced, i.e., the level of uncertainty in the estimate of annual collapse probability, with less vulnerable buildings having a smaller uncertainty.
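
The propagation-plus-sensitivity workflow can be sketched with a stylized stand-in model: sample the uncertain inputs, push them through a simple collapse-probability chain, report the output range, and rank inputs by a crude correlation-based sensitivity measure. The distributions and the combining formula are illustrative assumptions, not the article's model.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000

# Stylized uncertain inputs: ground-motion conversion factor, annual hazard
# intensity, and fragility (collapse probability given intensity). Invented.
gmc = rng.lognormal(0.0, 0.4, n)
hazard = rng.lognormal(-1.0, 0.5, n)
fragility = rng.beta(2.0, 8.0, n)

# Toy combining rule, clipped to a valid probability.
p_collapse = np.clip(gmc * hazard * fragility, 0.0, 1.0)
print(f"annual collapse probability range (5th-95th pct): "
      f"{np.percentile(p_collapse, 5):.2e} to {np.percentile(p_collapse, 95):.2e}")

# Crude global sensitivity: rank inputs by |correlation| with the output.
for name, x in [("gmc", gmc), ("hazard", hazard), ("fragility", fragility)]:
    r = np.corrcoef(x, p_collapse)[0, 1]
    print(f"|corr| with output, {name}: {abs(r):.2f}")
```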

Estimating the Probability of Human Error by Incorporating Component Failure Data from User‐Induced Defects in the Development of Complex Electrical Systems

6 January 2020 - 11:04am
Abstract

This article proposes a methodology for incorporating electrical component failure data into the human error assessment and reduction technique (HEART) for estimating human error probabilities (HEPs). The existing HEART method contains factors known as error‐producing conditions (EPCs) that adjust a generic HEP to a more specific situation being assessed. The selection and proportioning of these EPCs are at the discretion of an assessor, and are therefore subject to the assessor's experience and potential bias. This dependence on expert opinion is prevalent in similar HEP assessment techniques used in numerous industrial areas. The proposed method incorporates factors based on observed trends in electrical component failures to produce a revised HEP that can trigger risk mitigation actions more effectively based on the presence of component categories or other hazardous conditions that have a history of failure due to human error. The data used for the additional factors are a result of an analysis of failures of electronic components experienced during system integration and testing at NASA Goddard Space Flight Center. The analysis includes the determination of root failure mechanisms and trend analysis. The major causes of these defects were attributed to electrostatic damage, electrical overstress, mechanical overstress, or thermal overstress. These factors representing user‐induced defects are quantified and incorporated into specific hardware factors based on the system's electrical parts list. This proposed methodology is demonstrated with an example comparing the original HEART method and the proposed modified technique.
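
The underlying HEART adjustment multiplies a generic task error probability by a factor for each selected error-producing condition, weighted by its assessed proportion of affect: HEP = GEP * prod((EPC_i - 1) * APOA_i + 1). The sketch below applies that standard formula and then a hypothetical hardware factor of the kind the article proposes; all numbers are invented.

```python
# Generic error probability for the task type, then EPC adjustments.
generic_hep = 0.003
epcs = [  # (EPC multiplier, assessed proportion of affect)
    (17.0, 0.4),   # e.g., unfamiliarity with the task
    (4.0, 0.2),    # e.g., time pressure
]

hep = generic_hep
for multiplier, apoa in epcs:
    hep *= (multiplier - 1.0) * apoa + 1.0   # standard HEART scaling per EPC

# Hypothetical uplift for ESD-sensitive components on the electrical parts list,
# mirroring the article's proposed hardware factors (value invented).
hardware_factor = 1.5
adjusted = min(hep * hardware_factor, 1.0)
print(f"HEART HEP: {hep:.4f}; with hardware factor: {adjusted:.4f}")
```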

Workforce/Population, Economy, Infrastructure, Geography, Hierarchy, and Time (WEIGHT): Reflections on the Plural Dimensions of Disaster Resilience

6 January 2020 - 11:04am
Abstract

The concept of resilience and its relevance to disaster risk management has increasingly gained attention in recent years. It is common for risk and resilience studies to model system recovery by analyzing a single or aggregated measure of performance, such as economic output or system functionality. However, the history of past disasters and recent risk literature suggest that a single‐dimension view of relevant systems is not only insufficient, but can compromise the ability to manage risk for these systems. In this article, we explore how multiple dimensions influence the ability of complex systems to function and effectively recover after a disaster. In particular, we compile evidence from the many competing resilience perspectives to identify the most critical resilience dimensions across several academic disciplines, applications, and disaster events. The findings demonstrate the need for a conceptual framework that decomposes resilience into six primary dimensions: workforce/population, economy, infrastructure, geography, hierarchy, and time (WEIGHT). These dimensions are not typically addressed holistically in the literature; often they are either modeled independently or in piecemeal combinations. The current research is the first to provide a comprehensive discussion of each resilience dimension and discuss how these dimensions can be integrated into a cohesive framework, suggesting that no single dimension is sufficient for a holistic analysis of disaster risk management. Through this article, we also aim to spark discussions among researchers and policymakers to develop a multicriteria decision framework for evaluating the efficacy of resilience strategies. Furthermore, the WEIGHT dimensions may also be used to motivate the generation of new approaches for data analytics of resilience‐related knowledge bases.

Quantitative Risk Assessment of Seafarers’ Nonfatal Injuries Due to Occupational Accidents Based on Bayesian Network Modeling

6 January 2020 - 11:04am
Abstract

Reducing the incidence of seafarers’ workplace injuries is of great importance to shipping and ship management companies. The objective of this study is to identify the important influencing factors and to build a quantitative model for injury risk analysis aboard ships, so as to provide a decision support framework for effective injury prevention and management. Most of the previous research on seafarers’ occupational accidents either adopts a qualitative approach or applies simple descriptive statistics for analyses. In this study, the advanced method of a Bayesian network (BN) is used for the predictive modeling of seafarer injuries for its interpretative power as well as predictive capacity. The modeling is data driven and based on an extensive empirical survey to collect data on seafarers’ working practice and their injury records during the latest tour of duty, which could overcome the limitation of historical injury databases that mostly contain only data about the injured group instead of the entire population. Using the survey data, a BN model was developed consisting of nine major variables, including “PPE availability,” “Age,” and “Experience” of the seafarers, which were identified as the most influential risk factors. The model was further validated through sensitivity analyses and a logical axiom test. Finally, implementation of the results toward decision support for safety management in the global shipping industry is discussed.
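
A miniature discrete Bayesian network over three of the variables named above (PPE availability, experience, injury) shows the kind of query such a model supports. The conditional probability tables are invented, and the article's nine-variable network is not reproduced.

```python
# Priors and a conditional probability table, all with invented values.
p_ppe = {True: 0.8, False: 0.2}       # P(PPE available)
p_exp = {True: 0.6, False: 0.4}       # P(experienced)
p_injury = {  # P(injury = True | ppe, experienced)
    (True, True): 0.02, (True, False): 0.05,
    (False, True): 0.08, (False, False): 0.20,
}

def p_injury_given_ppe(ppe):
    # Exact inference by enumeration: marginalize over experience,
    # P(injury | ppe) = sum_e P(e) * P(injury | ppe, e).
    return sum(p_exp[e] * p_injury[(ppe, e)] for e in (True, False))

for ppe in (True, False):
    print(f"P(injury | PPE available={ppe}) = {p_injury_given_ppe(ppe):.3f}")
```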
