Risk Analysis: An International Journal
In the field of risk analysis, the normative value systems underlying accepted methodology are rarely explicitly discussed. This perspective provides a critique of the various ethical frameworks that can be used in risk assessments and risk management decisions. The goal is to acknowledge philosophical weaknesses that should be considered and communicated in order to improve the public acceptance of the work of risk analysts.
Risk Assessment of Salmonellosis from Consumption of Alfalfa Sprouts and Evaluation of the Public Health Impact of Sprout Seed Treatment and Spent Irrigation Water Testing
We developed a risk assessment of human salmonellosis associated with consumption of alfalfa sprouts in the United States to evaluate the public health impact of applying treatments to seeds (0–5-log10 reduction in Salmonella) and testing spent irrigation water (SIW) during production. The risk model considered variability and uncertainty in Salmonella contamination in seeds, Salmonella growth and spread during sprout production, sprout consumption, and Salmonella dose response. Based on an estimated prevalence of 2.35% for 6.8 kg seed batches and without interventions, the model predicted 76,600 (95% confidence interval (CI) 15,400–248,000) cases/year. Risk reduction (by 5- to 7-fold) predicted from a 1-log10 seed treatment alone was comparable to SIW testing alone, and each additional 1-log10 seed treatment was predicted to provide a greater risk reduction than SIW testing. A 3-log10 or a 5-log10 seed treatment reduced the predicted cases/year to 139 (95% CI 33–448) or 1.4 (95% CI <1–4.5), respectively. Combined with SIW testing, a 3-log10 or 5-log10 seed treatment reduced the cases/year to 45 (95% CI 10–146) or <1 (95% CI <1–1.5), respectively. If the SIW coverage was less complete (i.e., less representative), a smaller risk reduction was predicted, e.g., a combined 3-log10 seed treatment and SIW testing with 20% coverage resulted in an estimated 92 (95% CI 22–298) cases/year. Analysis of alternative scenarios using different assumptions for key model inputs showed that the predicted relative risk reductions are robust. This risk assessment provides a comprehensive approach for evaluating the public health impact of various interventions in a sprout production system.
Induced Earthquakes from Long-Term Gas Extraction in Groningen, the Netherlands: Statistical Analysis and Prognosis for Acceptable-Risk Regulation
Recently, growing earthquake activity in the northeastern Netherlands has aroused considerable concern among the 600,000 provincial inhabitants. There, at 3 km deep, the rich Groningen gas field extends over 900 km2 and still contains about 600 of the original 2,800 billion cubic meters (bcm). Particularly after 2001, earthquakes have increased in number, magnitude (M, on the logarithmic Richter scale), and damage to numerous buildings. The man-made nature of extraction-induced earthquakes challenges static notions of risk, complicates formal risk assessment, and calls familiar conceptions of acceptable risk into question. Here, a 26-year set of 294 earthquakes with M ≥ 1.5 is statistically analyzed in relation to increasing cumulative gas extraction since 1963. Extrapolations from a fast-rising trend over 2001–2013 indicate that—under “business as usual”—around 2021 some 35 earthquakes with M ≥ 1.5 might occur annually, including four with M ≥ 2.5 (ten-fold stronger), and one with M ≥ 3.5 every 2.5 years. Given this uneasy prospect, annual gas extraction has been reduced from 54 bcm in 2013 to 24 bcm in 2017. So far, this has significantly reduced earthquake activity. However, if extraction is stabilized at 24 bcm per year for 2017–2021 (or 21.6 bcm, as judicially established in Nov. 2017), the annual number of earthquakes would gradually increase again, with an expected all-time maximum M ≈ 4.5. Further safety management may best follow distinct stages of seismic risk generation, with moderation of gas extraction and massive (but late and slow) building reinforcement as the outstanding strategies. Officially, “acceptable risk” is mainly approached by quantification of risk (e.g., of fatal building collapse) for testing against national safety standards, but actual (local) risk estimation remains problematic. Additionally important are societal cost–benefit analysis, equity considerations, and precautionary restraint.
Socially and psychologically, deliberate attempts are made to improve risk communication, reduce public anxiety, and restore people's confidence in responsible experts and policymakers.
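The magnitude ratios quoted above (roughly ten times fewer events for each unit increase in M) are what Gutenberg–Richter frequency–magnitude scaling predicts. A minimal sketch, assuming a b-value of 1 (an illustrative assumption, not a parameter fitted in the study):

```python
def annual_rate(rate_m15: float, m: float, b: float = 1.0) -> float:
    """Annual rate of events with magnitude >= m, scaled from the rate at
    M >= 1.5 via Gutenberg-Richter scaling: log10 N(>=m) = a - b*m."""
    return rate_m15 * 10 ** (-b * (m - 1.5))

# ~35 events/year with M >= 1.5 projected for 2021 under "business as usual"
print(annual_rate(35.0, 2.5))  # -> ~3.5 events/year with M >= 2.5
print(annual_rate(35.0, 3.5))  # -> ~0.35 events/year with M >= 3.5
```

With b = 1 this roughly reproduces the quoted figures; the stated four M ≥ 2.5 events per year and one M ≥ 3.5 event every 2.5 years correspond to a b-value slightly below 1.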
Current approaches to risk management place insufficient emphasis on the system knowledge available to the assessor, particularly in respect of the dynamic behavior of the system under threat, the role of human agents (HAs), and the knowledge available to those agents.
In this article, we address the second of these issues. We are concerned with a class of systems containing HAs playing a variety of roles as significant system elements—as decisionmakers, cognitive agents, or implementers—that is, human activity systems (HASs). Within this family of HASs, we focus on safety- and mission-critical systems, referring to this subclass as critical human activity systems (CHASs).
Identification of the role and contribution of these human elements to a system is a nontrivial problem whether in an engineering context, or, as is the case here, in a wider social and public context. Frequently, they are treated as standing apart from the system in design or policy terms. Regardless of the process of policy definition followed, analysis of the risk and threats to such a CHAS requires a holistic approach, since the effect of undesirable, uninformed, or erroneous actions on the part of the human elements is both potentially significant to the system output and inextricably bound together with the nonhuman elements of the system.
We present a procedure for identifying the potential threats and risks emerging from the roles and activity of those HAs, using the 2014 flooding in southwestern England and the Thames Valley as a contemporary example.
Security in networked systems is typically interdependent: the security risks of one part affect other parts, and threats spread through the vulnerable links of the network. The risks to such systems can therefore be mitigated through investments in the security of the interconnecting links. This article takes an innovative look at the problem of nodes' security investment in their vulnerable links in a given contagious network, formulated as a game-theoretic model that can be applied to a variety of domains, including information systems. In the proposed game model, each node computes its corresponding risk based on the value of its assets, its vulnerabilities, and the threats it faces to determine the optimal level of security investment in its external links within its limited budget. Furthermore, direct and indirect nonlinear influences of a node's security investment on the risks of other nodes are considered. The existence and uniqueness of the Nash equilibrium of the proposed game are also proved. Further analysis of the model in a practical case revealed that, by taking advantage of the investment effects of other players, perfectly rational players (i.e., those who use the utility function of the proposed game model) make more cost-effective decisions than selfish nonrational or semirational players.
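The kind of interdependence described can be sketched with a two-node toy version of such a game (the functional forms and all parameter values below are illustrative assumptions, not the article's utility function): each node pays its investment plus a residual risk that decays nonlinearly with both its own and its neighbor's investment, and iterated best responses converge to the Nash equilibrium.

```python
import math

def best_response(V: float, a: float, c: float, x_other: float) -> float:
    """Minimize cost(x) = x + V * exp(-a*x - c*x_other): own investment plus
    residual risk, which also decays with the neighbor's investment (c < a)."""
    return max((math.log(a * V) - c * x_other) / a, 0.0)

def nash_equilibrium(V1: float, V2: float, a: float = 1.0, c: float = 0.3,
                     iters: int = 100) -> tuple[float, float]:
    """Iterate simultaneous best responses; the map is a contraction for c < a."""
    x1 = x2 = 0.0
    for _ in range(iters):
        x1, x2 = best_response(V1, a, c, x2), best_response(V2, a, c, x1)
    return x1, x2

# Node 1 holds more valuable assets (V1 > V2) and invests more at equilibrium
x1, x2 = nash_equilibrium(V1=20.0, V2=5.0)
print(round(x1, 3), round(x2, 3))
```

Because each node partly free-rides on the other's investment (the c·x_other term), the equilibrium investments are lower than what each node would choose in isolation — the kind of cross-effect the article captures via direct and indirect influences.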
This article analyzes the role of dynamic economic resilience in relation to recovery from disasters in general and illustrates its potential to reduce disaster losses in a case study of the Wenchuan earthquake of 2008. We first offer operational definitions of the concept, linked to policies that promote increased levels and speed of investment in repair and reconstruction to implement this resilience. We then develop a dynamic computable general equilibrium (CGE) model that incorporates major features of investment and traces the time-path of the economy as it recovers with and without dynamic economic resilience. The results indicate that resilience strategies could have significantly reduced GDP losses from the Wenchuan earthquake by 47.4% during 2008–2011 by accelerating the pace of recovery and could have further reduced losses slightly by shortening the recovery by one year. The results can be generalized to conclude that shortening the recovery period is not nearly as effective as increasing reconstruction investment levels and steepening the time-path of recovery. This is an important distinction that should be made in the typically vague and singular reference to increasing the speed of recovery in many definitions of dynamic resilience.
A Mathematical Model for Pathogen Cross-Contamination Dynamics during the Postharvest Processing of Leafy Greens
We developed a probabilistic mathematical model for the postharvest processing of leafy greens, focusing on Escherichia coli O157:H7 contamination of fresh-cut romaine lettuce as the case study. Our model can (i) support the investigation of cross-contamination scenarios, and (ii) evaluate and compare different risk mitigation options. We used an agent-based modeling framework to predict the pathogen prevalence and levels in bags of fresh-cut lettuce and quantify the spread of E. coli O157:H7 from contaminated lettuce to surface areas of processing equipment. Using an unbalanced factorial design, we propagated combinations of random values assigned to model inputs through the different processing steps and ranked statistically significant inputs with respect to their impacts on selected model outputs. Results indicated that, whether contamination originated on incoming lettuce heads or on the surface areas of processing equipment, pathogen prevalence among bags and batches of fresh-cut lettuce was most significantly impacted by the level of free chlorine in the flume tank and the frequency of replacing the wash water inside the tank. Pathogen levels in bags of fresh-cut lettuce were most significantly influenced by the initial levels of contamination on incoming lettuce heads or surface areas of processing equipment. The influence of surface contamination on pathogen prevalence or levels in fresh-cut bags depended on the location of that surface relative to the flume tank. This study demonstrates that developing a flexible yet mathematically rigorous modeling tool, a “virtual laboratory,” can provide valuable insights into the effectiveness of individual and combined risk mitigation options.
A Systems-Based Risk Assessment Framework for Intentional Electromagnetic Interference (IEMI) on Critical Infrastructures
Modern infrastructures are becoming increasingly dependent on electronic systems, leaving them more vulnerable to electrical surges or electromagnetic interference. Electromagnetic disturbances appear in nature, e.g., lightning and solar wind; however, they may also be generated by man-made technology to maliciously damage or disturb electronic equipment. This article presents a systematic risk assessment framework for identifying possible, consequential, and plausible intentional electromagnetic interference (IEMI) attacks on an arbitrary distribution network infrastructure. In the absence of available data on IEMI occurrences, we find that a systems-based risk assessment is more useful than a probabilistic approach. We therefore modify the often applied definition of risk, i.e., a set of triplets containing scenario, probability, and consequence, to a set of quadruplets: scenario, resource requirements, plausibility, and consequence. Probability is “replaced” by resource requirements and plausibility, where the former is the minimum amount and type of equipment necessary to successfully carry out an attack scenario and the latter is a subjective assessment of the extent of the existence of attackers who possess the motivation, knowledge, and resources necessary to carry out the scenario. We apply the concept of intrusion areas and classify electromagnetic source technology according to key attributes. Worst-case scenarios are identified for different quantities of attacker resources. The most plausible and consequential of these are deemed the most important scenarios and should provide useful decision support in a countermeasures effort. Finally, an example of the proposed risk assessment framework, based on notional data, is provided on a hypothetical water distribution network.
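The quadruplet formulation lends itself to a simple ranking structure. A sketch of that bookkeeping (the scenario descriptions, score scales, and product ranking rule are all illustrative assumptions, not the article's data):

```python
from dataclasses import dataclass

@dataclass
class IEMIScenario:
    """Risk quadruplet: scenario, resource requirements, plausibility, consequence."""
    scenario: str
    resources: str      # minimum equipment/expertise needed to carry out the attack
    plausibility: int   # subjective score, 1 (implausible) to 5 (highly plausible)
    consequence: int    # score, 1 (minor disturbance) to 5 (lasting damage)

scenarios = [
    IEMIScenario("van-mounted source at the perimeter fence", "commercial", 4, 2),
    IEMIScenario("briefcase source smuggled into the control room", "specialist", 2, 5),
    IEMIScenario("drone-borne source above an unshielded substation", "hobbyist", 3, 4),
]

# Rank the most plausible *and* consequential scenarios first
ranked = sorted(scenarios, key=lambda s: s.plausibility * s.consequence, reverse=True)
for s in ranked:
    print(s.plausibility * s.consequence, s.scenario)
```

The top-ranked scenarios are the candidates for countermeasure effort; in practice the scoring would come from the intrusion-area analysis and source-technology classification described above.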
Modeling Poliovirus Transmission in Pakistan and Afghanistan to Inform Vaccination Strategies in Undervaccinated Subpopulations
Due to security, access, and programmatic challenges in areas of Pakistan and Afghanistan, both countries continue to sustain indigenous wild poliovirus (WPV) transmission and threaten the success of global polio eradication and oral poliovirus vaccine (OPV) cessation. We fitted an existing differential-equation-based poliovirus transmission and OPV evolution model to Pakistan and Afghanistan using four subpopulations to characterize the well-vaccinated and undervaccinated subpopulations in each country. We explored retrospective and prospective scenarios for using inactivated poliovirus vaccine (IPV) in routine immunization or supplemental immunization activities (SIAs). The undervaccinated subpopulations sustain the circulation of serotype 1 WPV and serotype 2 circulating vaccine-derived poliovirus. We find a moderate impact of past IPV use on polio incidence and population immunity to transmission mainly due to (1) the boosting effect of IPV for individuals with preexisting immunity from a live poliovirus infection and (2) the effect of IPV-only on oropharyngeal transmission for individuals without preexisting immunity from a live poliovirus infection. Future IPV use may similarly yield moderate benefits, particularly if access to undervaccinated subpopulations dramatically improves. However, OPV provides a much greater impact on transmission and the incremental benefit of IPV in addition to OPV remains limited. This study suggests that despite the moderate effect of using IPV in SIAs, using OPV in SIAs remains the most effective means to stop transmission, while limited IPV resources should prioritize IPV use in routine immunization.
The predominant definition of extinction risk in conservation biology involves evaluating the cumulative distribution function (CDF) of extinction time at a particular point (the “time horizon”). Using the principles of decision theory, this article develops an alternative definition of extinction risk as the expected loss (EL) to society resulting from eventual extinction of a species. Distinct roles are identified for time preference and risk aversion. Ranges of tentative values for the parameters of the two approaches are proposed, and the performances of the two approaches are compared and contrasted for a small set of real-world species with published extinction time distributions and a large set of hypothetical extinction time distributions. Potential issues with each approach are evaluated, and the EL approach is recommended as the better of the two. The CDF approach suffers from the fact that extinctions that occur at any time before the specified time horizon are weighted equally, while extinctions that occur beyond the specified time horizon receive no weight at all. It also suffers from the fact that the time horizon does not correspond to any natural phenomenon, and so is impossible to specify nonarbitrarily; yet the results can depend critically on the specified value. In contrast, the EL approach has the advantage of weighting extinction time continuously, with no artificial time horizon, and the parameters of the approach (the rates of time preference and risk aversion) do correspond to natural phenomena, and so can be specified nonarbitrarily.
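The contrast between the two definitions can be made concrete with a hypothetical extinction-time distribution (the exponential distribution, horizon, and discount rate below are illustrative assumptions): the CDF approach weights all pre-horizon extinctions equally and ignores later ones, while the EL approach discounts a (unit) loss continuously back from the random extinction time.

```python
import math
import random

def cdf_risk(samples: list, horizon: float) -> float:
    """CDF definition: probability that extinction occurs before the time horizon."""
    return sum(t <= horizon for t in samples) / len(samples)

def expected_loss(samples: list, delta: float, loss: float = 1.0) -> float:
    """EL definition (simplified to time preference only): the societal loss
    discounted back from the random extinction time at rate delta, averaged."""
    return loss * sum(math.exp(-delta * t) for t in samples) / len(samples)

random.seed(1)
# Hypothetical species: exponentially distributed extinction time, mean 200 years
samples = [random.expovariate(1 / 200) for _ in range(100_000)]
print(cdf_risk(samples, horizon=100))      # ~1 - exp(-0.5) = 0.39
print(expected_loss(samples, delta=0.02))  # ~lambda/(lambda + delta) = 0.2
```

Note that the CDF value depends entirely on the 100-year cutoff, whereas the EL depends on delta, which the abstract argues can be tied to observable rates of time preference rather than chosen arbitrarily.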
News media play a large role in determining the information the public receives during an infectious disease outbreak and may influence public knowledge and perceptions of risk. This study analyzed and described the content of U.S. news media coverage of Zika virus and the Zika response during 2016. A random selection of 800 Zika-related news stories from 25 print and television news sources was analyzed. The study examined 24 different messages that appeared in news media articles and characterized them, using theories of risk perception, as messages with characteristics that could increase perception of risk (risk-elevating messages; n = 14), messages that could decrease perception of risk (risk-minimizing messages; n = 8), or messages about travel or testing guidance (n = 2). Overall, 96% of news stories in the study sample contained at least one risk-elevating message and 61% contained at least one risk-minimizing message. The frequency of many messages changed after local transmission was confirmed in Florida and differed between sources in locations with or without local transmission in 2016. Forty percent of news stories included messages about negative potential outcomes of Zika virus infection without mentioning ways to reduce risk. Findings from this study may help inform current federal, state, and local Zika responses by offering a detailed analysis of how news media covered the outbreak and response activities, as well as by identifying specific messages appearing more or less frequently than intended. Findings identifying the types of messages that require greater emphasis may also assist public health communicators in responding more effectively to future outbreaks.
Antimicrobial spray products are used by millions of people around the world for cleaning and disinfection of commonly touched surfaces. Influenza A is a pathogen of major concern, leading to up to 49,000 deaths and 114,000 hospitalizations per year in the United States alone. One of the recognized routes of transmission for influenza A is the transfer of viruses from surfaces to hands and subsequently to mucous membranes. Therefore, routine cleaning and disinfection of surfaces is an important part of the environmental management of influenza A. While the emphasis is generally on spraying hard surfaces and laundering cloth and linens with high-temperature machine drying, not all surfaces can be treated in this manner. The quantitative microbial risk assessment (QMRA) approach was used to develop a stochastic risk model for estimating the risk of infection from indirect contact with a porous fomite with and without surface treatment with an antimicrobial spray. The data collected from laboratory analysis, combined with the risk model, show that influenza A infection risk can be lowered by four logs after using an antimicrobial spray on a porous surface. Median risk associated with a single touch to a contaminated fabric was estimated to be 1.25 × 10−4 for the untreated surface and 3.6 × 10−8 for the treated surface under base-case assumptions. This single-touch scenario was used to develop a generalizable model for estimating risks and comparing scenarios with and without treatment in more realistic multiple-touch scenarios, over longer time periods and with contact rates previously reported in the literature. The results of this study, and the understanding of product efficacy on risk reduction, inform and broaden the range of risk management strategies for influenza A by demonstrating effective risk reduction associated with treating porous fomites that cannot be laundered at high temperatures.
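A single-touch QMRA calculation of the kind described can be sketched as follows (the exponential dose-response form and every parameter value are illustrative assumptions, not the study's fitted inputs):

```python
import math

def single_touch_risk(surface_conc: float, transfer_eff: float,
                      contact_area: float, k: float) -> float:
    """Exponential dose-response: P(infection) = 1 - exp(-k * dose), where
    dose = viruses/cm^2 on the fomite x transfer efficiency x contact area (cm^2)."""
    dose = surface_conc * transfer_eff * contact_area
    return 1 - math.exp(-k * dose)

# Illustrative parameters; the treated surface carries a 4-log lower viral load
untreated = single_touch_risk(surface_conc=10.0, transfer_eff=0.1,
                              contact_area=2.0, k=5.7e-5)
treated = single_touch_risk(surface_conc=10.0 * 1e-4, transfer_eff=0.1,
                            contact_area=2.0, k=5.7e-5)
print(untreated, treated)  # in the low-dose (linear) regime, risk drops by ~4 logs too
```

Because the dose-response curve is nearly linear at these low doses, a 4-log reduction in surface load translates almost directly into a 4-log reduction in single-touch infection risk, as the study reports.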
Construction of a Dose–Illness Relationship via Modeling Morbidity and Application to Risk Assessment of Wastewater Reuse
A disease burden (DB) evaluation for environmental pathogens is generally performed using disability-adjusted life years, with the aim of providing a quantitative assessment of the health hazard posed by pathogens. A critical step in preparing this evaluation is estimating the morbidity between exposure and disease occurrence. In this study, the method of traditional dose–response analysis was first reviewed, and the theoretical bases of a “single-hit” model and an “infection-illness” model were then combined by incorporating two critical factors: the “infective coefficient” and the “infection duration.” This allowed a dose–morbidity model to be built for direct use in DB calculations. In addition, human experimental data for typical intestinal pathogens were obtained for model validation; the results indicated that the model fitted well and could be further used for morbidity estimation. On this basis, a real case of a water reuse project was selected for model application, and the morbidity as well as the DB caused by intestinal pathogens during water reuse was evaluated. The results show that the DB attributable to enteroviruses was significant, while that for enteric bacteria was negligible. Therefore, water treatment technology should be further improved to reduce the exposure risk of enteroviruses. Since road flushing was identified as the major exposure route, human contact with reclaimed water through this pathway should be limited. The methodology proposed for model construction not only compensates for missing morbidity data during risk evaluation, but also makes it possible to quantify the maximum possible DB.
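The chaining of infection and illness described above can be written down compactly (the functional form and numbers are an illustrative sketch; the study's model additionally treats infection duration in more detail):

```python
import math

def p_infection(dose: float, r: float) -> float:
    """Single-hit exponential model: each ingested organism independently
    initiates infection with probability r."""
    return 1 - math.exp(-r * dose)

def p_illness(dose: float, r: float, infective_coeff: float) -> float:
    """Dose-morbidity: infection probability scaled by the 'infective
    coefficient' (probability of illness given infection)."""
    return p_infection(dose, r) * infective_coeff

# Illustrative: 100 organisms, per-organism infectivity 0.01,
# half of infections progress to illness
print(p_illness(dose=100, r=0.01, infective_coeff=0.5))  # ~0.316
```

Replacing the dose-response endpoint (infection) with this morbidity endpoint is what lets the output feed directly into a disability-adjusted life-year calculation.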
Past research has suggested that urban anglers are a group at high risk of being exposed to contaminants from fish consumption. Fish consumption advisories have been used in many regions to encourage healthy fish-eating behaviors, but few studies have been designed to assess whether these advisories actually influence behavior as intended. We conducted a large-scale, randomized experiment to test the influence of an advisory brochure on urban anglers’ fish consumption. We collected detailed information on anglers’ fish consumption in three urban counties in the Great Lakes region in the summers of 2014 and 2015. We provided a treatment group with fish consumption guidelines in an advisory brochure before the summer of 2015 and compared their change in fish consumption to that of a control group. The brochure led to a reduction in fish consumption for anglers who ate the most fish; these anglers reduced their consumption of high-contaminant purchased fish (by ≥0.2 meals/summer for those in the 72nd percentile of fish consumption or above), high-contaminant sport-caught fish (by ≥0.4 meals/summer for those in the 87th percentile and above), and low-contaminant sport-caught fish (by ≥0.3 meals/summer for those in the 76th percentile and above). The brochure also reduced sport-caught fish consumption among those anglers who exceeded the advisories in 2014 (by 2.0 meals/summer). In addition, the brochure led to small increases in sport-caught fish consumption (0.4–0.6 meals/summer) among urban anglers who ate very little sport-caught fish (≤1 meal/summer).
Historical data analysis shows that escalation accidents, so-called domino effects, have an important role in disastrous accidents in the chemical and process industries. In this study, an agent-based modeling and simulation approach is proposed to study the propagation of domino effects in the chemical and process industries. Different from the analytical or Monte Carlo simulation approaches, which normally study the domino effect at probabilistic network levels, the agent-based modeling technique explains the domino effects from a bottom-up perspective. In this approach, the installations involved in a domino effect are modeled as agents whereas the interactions among the installations (e.g., by means of heat radiation) are modeled via the basic rules of the agents. Application of the developed model to several case studies demonstrates the ability of the model not only in modeling higher-level domino effects and synergistic effects but also in accounting for temporal dependencies. The model can readily be applied to large-scale complicated cases.
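A minimal sketch of such agent-based propagation (the geometry, the inverse-square radiation rule, and all thresholds are illustrative assumptions): installations are agents, failed units radiate heat, received loads add up (synergistic effects), and escalation proceeds in generations (higher-level domino effects).

```python
class Installation:
    """Agent: a process installation that fails once its received heat load
    exceeds its escalation threshold (arbitrary kW/m^2-like units)."""
    def __init__(self, name: str, x: float, y: float, threshold: float):
        self.name, self.x, self.y = name, x, y
        self.threshold = threshold
        self.failed = False

def radiation(source: "Installation", target: "Installation", q0: float = 300.0) -> float:
    """Toy inverse-square heat radiation from a burning source."""
    d2 = (source.x - target.x) ** 2 + (source.y - target.y) ** 2
    return q0 / max(d2, 1.0)

def propagate(units: list, initially_failed: list) -> list:
    for u in initially_failed:
        u.failed = True
    changed = True
    while changed:  # each sweep is one escalation generation
        changed = False
        for t in units:
            if not t.failed:
                # synergistic effect: heat loads from all failed units add up
                load = sum(radiation(s, t) for s in units if s.failed)
                if load >= t.threshold:
                    t.failed = True
                    changed = True
    return [u.name for u in units if u.failed]

tanks = [Installation("T1", 0, 0, 15), Installation("T2", 3, 0, 25),
         Installation("T3", 6, 0, 40)]
# T2 alone could not ignite T3 (300/9 = 33.3 < 40), but T1 + T2 together can
print(propagate(tanks, [tanks[0]]))
```

Here the interaction rules live in the agents themselves, so synergistic and temporal effects emerge bottom-up rather than being enumerated as network-level probabilities.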
Estimation of the Leukemia Risk in Human Populations Exposed to Benzene from Tobacco Smoke Using Epidemiological Data
Several epidemiological studies have demonstrated an association between occupational benzene exposure and increased leukemia risk, in particular acute myeloid leukemia (AML). However, there is still uncertainty as to the risk to the general population from exposure to lower, environmental levels of benzene. To estimate the excess risk of leukemia from low-dose benzene exposure, various methods for incorporating epidemiological data in quantitative risk assessment were utilized. Tobacco smoke was identified as one of the main potential sources of benzene exposure and was the focus of this exposure assessment, allowing further investigation of the role of benzene in smoking-induced leukemia. Potency estimates for benzene were generated from individual occupational studies and meta-analysis data, and an exposure assessment for two smoking subgroups (light and heavy smokers) was carried out. Subsequently, various techniques, including life-table analysis, were used to evaluate both the excess lifetime risk and the contribution of benzene to smoking-induced leukemia and AML. The excess lifetime risk for smokers was estimated at between two and six additional leukemia deaths in 10,000 and one to three additional AML deaths in 10,000. The contribution of benzene to smoking-induced leukemia was estimated at between 9% and 24% (upper confidence limit (CL) 14–31%). For AML this contribution was estimated at 11–30% (upper CL 22–60%). From the assessments carried out here, it appears that there is an increased risk of leukemia from low-level exposure to benzene and that benzene may contribute up to a third of smoking-induced leukemia. Methods with varying degrees of complexity generated comparable results.
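The life-table step can be illustrated in miniature (the age bands, rates, survival probabilities, and relative risk below are invented for illustration, not the study's inputs): excess lifetime risk accumulates the extra mortality (RR − 1) × baseline rate over the years lived in each age band.

```python
# Three coarse age bands (illustrative life table, not real data)
ages = [40, 60, 80]
baseline = {40: 2e-5, 60: 6e-5, 80: 1e-4}   # baseline leukemia mortality per person-year
survival = {40: 0.97, 60: 0.88, 80: 0.50}   # probability of surviving into each band
BAND_YEARS = 20                             # years lived per band

def excess_lifetime_risk(rr: float) -> float:
    """Excess deaths per person: (RR - 1) x baseline rate x survival x years, summed."""
    return sum((rr - 1) * baseline[a] * survival[a] * BAND_YEARS for a in ages)

# A modest relative risk sustained over a lifetime
print(excess_lifetime_risk(rr=1.2))  # ~5 extra leukemia deaths per 10,000 people
```

A full life-table analysis with real age-specific rates and fitted benzene potency estimates is of course far more detailed; this sketch only shows the accounting by which a small relative risk becomes an excess-deaths-per-10,000 figure.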