EISG in Risk Analysis

Risk Analysis is the official journal of the Society for Risk Analysis and publishes peer-reviewed, original research on both the theory and practice of risk. Its application areas are vast. Below are articles of particular relevance to the Engineering and Infrastructure Specialty Group.


January 2018

Sociotechnical Resilience: A Preliminary Concept
This article presents the concept of sociotechnical resilience by employing an interdisciplinary perspective derived from the fields of science and technology studies, human factors, safety science, organizational studies, and systems engineering. Highlighting the hybrid nature of sociotechnical systems, we identify three main constituents that characterize sociotechnical resilience: informational relations, sociomaterial structures, and anticipatory practices. Further, we frame sociotechnical resilience as undergirded by the notion of transformability with an emphasis on intentional activities, focusing on the ability of sociotechnical systems to shift from one form to another in the aftermath of shock and disturbance. We propose that this triad of relations, structures, and practices is fundamental to comprehending the resilience of sociotechnical systems during times of crisis.

Development of an Asset Value Map for Disaster Risk Assessment in China by Spatial Disaggregation Using Ancillary Remote Sensing Data
The extent of economic losses due to a natural hazard and disaster depends largely on the spatial distribution of asset values in relation to the hazard intensity distribution within the affected area. Given that statistical data on asset value are collected by administrative units in China, generating spatially explicit asset exposure maps remains a key challenge for rapid postdisaster economic loss assessment. The goal of this study is to introduce a top-down (or downscaling) approach to disaggregate administrative-unit level asset value to grid-cell level. The key to doing so is finding highly correlated “surrogate” indicators. A combination of three data sets (nighttime light grid, LandScan population grid, and road density grid) is used as ancillary asset density distribution information for spatializing the asset value. As a result, a high spatial resolution asset value map of China for 2015 is generated. The spatial data set contains aggregated economic value at risk at 30 arc-second spatial resolution. Accuracy of the spatial disaggregation reflects redistribution errors introduced by the disaggregation process as well as errors from the original ancillary data sets. The overall accuracy of the results proves promising. The example of using the developed disaggregated asset value map in exposure assessment of watersheds demonstrates that the data set offers immense analytical flexibility for overlay analysis according to the hazard extent. This product will help current efforts to analyze spatial characteristics of exposure and to uncover the contributions of both physical and social drivers of natural hazard and disaster across space and time.
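As a rough illustration of the disaggregation step, the sketch below redistributes one administrative unit's asset value across grid cells in proportion to a composite of the three ancillary layers. The arrays, the min-max normalization, and the equal weighting of the three proxies are assumptions made for illustration, not the calibration used in the article.

```python
import numpy as np

# Hypothetical ancillary layers for the grid cells of one administrative unit.
nightlight = np.array([0.0, 0.2, 0.5, 0.9])        # nighttime light intensity
population = np.array([10.0, 40.0, 120.0, 300.0])  # LandScan-style population
road_density = np.array([0.1, 0.3, 0.6, 0.8])      # road length per unit area

def normalize(layer):
    """Rescale a layer to [0, 1] so the three proxies are comparable."""
    return (layer - layer.min()) / (layer.max() - layer.min())

# Composite "surrogate" density; equal weighting is an assumption.
weight = normalize(nightlight) + normalize(population) + normalize(road_density)

# Top-down disaggregation: each cell receives a share of the unit's total
# asset value proportional to its composite weight, so mass is preserved.
admin_asset_value = 1000.0
cell_value = admin_asset_value * weight / weight.sum()
assert np.isclose(cell_value.sum(), admin_asset_value)
print(cell_value.round(1))
```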

Resilience Analysis of Countries under Disasters Based on Multisource Data
Disasters occur almost daily somewhere in the world. Because emergencies frequently have no precedent, are highly uncertain, and can be very destructive, improving a country's resilience is an efficient way to reduce risk. In this article, we collected more than 20,000 historical data points from disasters in 207 countries to calculate the severity of disasters and the danger they pose to countries. In addition, six primary indices (disaster, personal attribute, infrastructure, economics, education, and occupation) comprising 38 secondary influencing factors are considered in analyzing the resilience of countries. Using these data, we obtained the danger, expected number of deaths, and resilience of all 207 countries. We found that a country covering a large area is more likely to have a low resilience score. Through sensitivity analysis of all secondary indices, we found that population density, frequency of disasters, and GDP are the three most critical factors affecting resilience. Based on broad-spectrum resilience analysis of the different continents, Oceania and South America have the highest resilience, while Asia has the lowest. Over the past 50 years, the resilience of many countries has improved sharply, especially in developing countries. Based on our results, we analyze comprehensive resilience and provide suggestions for improving it efficiently.

Industrial Safety and Utopia: Insights from the Fukushima Daiichi Accident
Feedback from industrial accidents is provided by various state, and even international, institutions, and lessons learned can be controversial. However, there has been little research into organizational learning at the international level. This article helps to fill the gap through an in-depth review of official reports on the Fukushima Daiichi accident published shortly after the event. We present a new method to analyze the arguments contained in these voluminous documents. Taking an intertextual perspective, the method focuses on the accident narratives, their rationale, and links between “facts,” “causes,” and “recommendations.” The aim is to evaluate how the findings of the various reports are consistent with (or contradict) “institutionalized knowledge,” and to identify the social representations that underpin them. We find that although the scientific controversy surrounding the results of the various inquiries reflects different ethical perspectives, they are integrated into the same utopian ideal. The involvement of multiple actors in this controversy raises questions about the public construction of epistemic authority, and we highlight the special status given to the International Atomic Energy Agency in this regard.

Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice
The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I–I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of the I–I characterizing systems, manifested through shared states and other essential entities within and among the systems that constitute SoS. The term “essential entities” includes shared decisions, resources, functions, policies, decisionmakers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS with two case studies performed with the MITRE Corporation and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS.

Evaluating the Benefits of Adaptation of Critical Infrastructures to Hydrometeorological Risks
Infrastructure adaptation measures provide a practical way to reduce the risk from extreme hydrometeorological hazards, such as floods and windstorms. The benefit of adapting infrastructure assets is evaluated as the reduction in risk relative to the “do nothing” case. However, evaluating the full benefits of risk reduction is challenging because of the complexity of the systems, the scarcity of data, and the uncertainty of future climatic changes. We address this challenge by integrating methods from the study of climate adaptation, infrastructure systems, and complex networks. In doing so, we outline an infrastructure risk assessment that incorporates interdependence, user demands, and potential failure-related economic losses. Individual infrastructure assets are intersected with probabilistic hazard maps to calculate expected annual damages. Protection measure costs are integrated to calculate risk reduction and associated discounted benefits, which are used to explore the business case for investment in adaptation. A demonstration of the methodology is provided for flood protection of major electricity substations in England and Wales. We conclude that the ongoing adaptation program for major electricity assets is highly cost beneficial.
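The core calculation lends itself to a short sketch: expected annual damages (EAD) are obtained by integrating damage against annual exceedance probability, and the benefit of adaptation is the discounted reduction in EAD relative to “do nothing.” All figures below are hypothetical, not values from the study.

```python
import numpy as np

# Hypothetical flood damage to one substation at several return periods.
return_periods = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
damage = np.array([0.0, 2.0, 5.0, 12.0, 20.0])   # losses in £m
exceed_prob = 1.0 / return_periods               # annual exceedance probability

# Expected annual damage: integrate damage over exceedance probability
# (sort so the integration variable is ascending).
order = np.argsort(exceed_prob)
ead = np.trapz(damage[order], exceed_prob[order])

# Benefit of a protection measure is the drop in EAD versus "do nothing";
# discount the annual benefit stream to present value over an appraisal period.
ead_protected = 0.4 * ead            # assumed residual risk after adaptation
rate, years = 0.035, np.arange(1, 51)
pv_benefit = np.sum((ead - ead_protected) / (1 + rate) ** years)
print(f"EAD {ead:.2f} £m/yr, PV of 50-year benefits {pv_benefit:.2f} £m")
```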

How to Design Rating Schemes of Risk Matrices: A Sequential Updating Approach
Risk matrices have been widely used as a risk evaluation tool in many fields due to their simplicity and intuitive nature. Designing a rating scheme, i.e., determining the number of ratings used in a risk matrix and assigning different ratings to different cells, is an essential part of risk matrix construction. However, most of the related literature has focused on applying a risk matrix to various fields, instead of researching how to design risk matrices. Based on the analysis of several current rules, we propose a new approach, namely, the sequential updating approach (SUA), to design the rating scheme of a risk matrix in a reliable way. In this article, we propose three principles and a rating algorithm based on these principles. The three principles, namely, adjusted weak consistency, consistent internality, and continuous screening, characterize a good rating scheme. The resulting rating scheme has been proven to be unique. A global rating algorithm is then proposed to create the design that satisfies the three principles. We then explore the performance of the SUA. An illustrative application is first given to explain the feasibility of our approach. The sensitivity analysis shows that our method captures a resolution-reliability tradeoff for decisionmakers in choosing an appropriate rating scheme for a risk matrix. Finally, we compare the designs based on the SUA and Cox's axioms, highlighting the advantages of the SUA.
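To make the consistency idea concrete, the toy function below tests a weak-consistency-style condition on a small matrix using band midpoints: no cell with a strictly higher rating may carry a smaller quantitative risk than a cell with a strictly lower rating. It is a simplified stand-in for the article's adjusted weak consistency principle, and the ratings and midpoints are invented.

```python
# A toy 3x3 risk matrix: ratings[i][j] is the rating for probability band i
# and impact band j (1 = low, 3 = high); values are illustrative only.
ratings = [
    [1, 1, 2],   # lowest probability band
    [1, 2, 3],
    [2, 3, 3],   # highest probability band
]
prob_mid = [0.1, 0.4, 0.8]      # assumed band midpoints
impact_mid = [1.0, 5.0, 9.0]

def weakly_consistent(ratings, prob_mid, impact_mid):
    """Return True if no higher-rated cell has a smaller quantitative
    risk (probability x impact at band midpoints) than a lower-rated cell."""
    cells = [(ratings[i][j], prob_mid[i] * impact_mid[j])
             for i in range(len(ratings)) for j in range(len(ratings[0]))]
    return all(not (r_hi > r_lo and q_hi < q_lo)
               for r_hi, q_hi in cells for r_lo, q_lo in cells)

print(weakly_consistent(ratings, prob_mid, impact_mid))  # True for this design
```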

December 2017

Geographic Hotspots of Critical National Infrastructure
Failure of critical national infrastructures can result in major disruptions to society and the economy. Understanding the criticality of individual assets and the geographic areas in which they are located is essential for targeting investments to reduce risks and enhance system resilience. Within this study we provide new insights into the criticality of real-life critical infrastructure networks by integrating high-resolution data on infrastructure location, connectivity, interdependence, and usage. We propose a metric of infrastructure criticality in terms of the number of users who may be directly or indirectly disrupted by the failure of physically interdependent infrastructures. Kernel density estimation is used to integrate spatially discrete criticality values associated with individual infrastructure assets, producing a continuous surface from which statistically significant infrastructure criticality hotspots are identified. We develop a comprehensive and unique national-scale demonstration for England and Wales that utilizes previously unavailable data from the energy, transport, water, waste, and digital communications sectors. The testing of 200,000 failure scenarios identifies that hotspots are typically located around the periphery of urban areas where there are large facilities upon which many users depend or where several critical infrastructures are concentrated in one location.
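The kernel density step can be sketched with scipy's weighted Gaussian KDE over hypothetical asset locations. Note that the article identifies statistically significant hotspots; the sketch below simply thresholds the surface at a high quantile, which is a cruder stand-in.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical asset coordinates and criticality scores (users disrupted).
rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(2, 200))        # (x, y) per asset, km
criticality = rng.pareto(2.0, size=200) + 1.0  # heavy-tailed user counts

# Weighted KDE turns discrete criticality values into a continuous surface.
kde = gaussian_kde(xy, weights=criticality)
gx, gy = np.mgrid[0:100:200j, 0:100:200j]
surface = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

# Crude hotspot cut at the 99th percentile of the surface.
hotspots = surface > np.percentile(surface, 99)
print(hotspots.sum(), "grid cells flagged as hotspot candidates")
```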

Evaluation of the Consequences of a Cistern Truck Accident While Transporting Dangerous Substances through a Tunnel
The transportation of dangerous substances by truck carriers raises important safety issues in both road and mine tunnels. Even though traffic conditions in road and mine tunnels differ, the potential geometric and hydrodynamic similarities can lead to similar effects from the uncontrolled leakage of the dangerous material. This work was motivated by the design study of the LAGUNA-LBNO (Large Apparatus studying Grand Unification and Neutrino Astrophysics and Long Baseline Neutrino Oscillations) project. The considered neutrino detector requires a huge amount of liquid argon, which must be transported down the tunnel. The present work focuses on the estimation of the most credible incident and the resulting consequences in the case of a truck accident in the tunnel. The approach and tools used in the present work are generic and can be adapted to other similar situations.

Three-Stage Decision-Making Model under Restricted Conditions for Emergency Response to Ships Not under Control
A ship that is not under control (NUC) is a typical incident that poses serious problems when it occurs in confined waters close to shore. The emergency response to NUC ships requires selecting the best risk control options, which is a challenge under restricted conditions (e.g., time limitation, resource constraints, and information asymmetry), particularly in inland waterway transportation. To enable a quick and effective response, this article develops a three-stage decision-making framework for NUC ship handling. The core of this method is (1) to propose feasible options for each involved entity (e.g., maritime safety administration, NUC ship, and ships passing by) under resource constraints in the first stage, (2) to select the most feasible options by comparing the similarity of the new case and existing cases in the second stage, and (3) to make decisions considering the cooperation between the involved organizations by using a developed Bayesian network in the third stage. Consequently, this work provides a useful tool to achieve well-organized management of NUC ships.

November 2017

Spatial Optimization of Future Urban Development with Regards to Climate Risk and Sustainability Objectives
Authors: Daniel Caparros-Midwood, Stuart Barr and Richard Dawson
Abstract: Future development in cities needs to manage increasing populations, climate-related risks, and sustainable development objectives such as reducing greenhouse gas emissions. Planners therefore face a challenge of multidimensional, spatial optimization in order to balance potential tradeoffs and maximize synergies between risks and other objectives. To address this, a spatial optimization framework has been developed. This uses a spatially implemented genetic algorithm to generate a set of Pareto-optimal results that provide planners with the best set of trade-off spatial plans for six risk and sustainability objectives: (i) minimize heat risks, (ii) minimize flooding risks, (iii) minimize transport travel costs to minimize associated emissions, (iv) maximize brownfield development, (v) minimize urban sprawl, and (vi) prevent development of greenspace. The framework is applied to Greater London (U.K.) and shown to generate spatial development strategies that are optimal for specific objectives and differ significantly from the existing development strategies. In addition, the analysis reveals tradeoffs between different risks as well as between risk and sustainability objectives. While increases in heat or flood risk can be avoided, there are no strategies that do not increase at least one of these. Tradeoffs between risk and other sustainability objectives can be more severe, for example, minimizing heat risk is only possible if future development is allowed to sprawl significantly. The results highlight the importance of spatial structure in modulating risks and other sustainability objectives. However, not all planning objectives are suited to quantified optimization and so the results should form part of an evidence base to improve the delivery of risk and sustainability management in future urban development.
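The backbone of any such multiobjective search is the Pareto dominance test that separates trade-off plans from dominated ones. The sketch below extracts the non-dominated set from hypothetical plan scores; it illustrates only the filtering step, not the spatially implemented genetic algorithm itself.

```python
import numpy as np

def pareto_front(objectives):
    """Boolean mask of non-dominated rows, assuming every column is to be
    minimized (negate any maximization objective first)."""
    n = objectives.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # Plan i is dominated if some other plan is no worse on every
        # objective and strictly better on at least one.
        dominated = (np.all(objectives <= objectives[i], axis=1) &
                     np.any(objectives < objectives[i], axis=1))
        keep[i] = not dominated.any()
    return keep

# Hypothetical scores for four candidate plans on three of the six
# objectives (heat risk, flood risk, sprawl), all to be minimized.
plans = np.array([[0.2, 0.8, 0.5],
                  [0.3, 0.3, 0.6],
                  [0.4, 0.4, 0.7],   # dominated by the second plan
                  [0.9, 0.1, 0.2]])
print(pareto_front(plans))  # [ True  True False  True]
```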

A General Framework for the Assessment of Power System Vulnerability to Malicious Attacks
Authors: R. Piccinelli, G. Sansavini, R. Lucchetti and E. Zio
Abstract: The protection and safe operation of power systems heavily rely on the identification of the causes of damage and service disruption. This article presents a general framework for the assessment of power system vulnerability to malicious attacks. The concept of susceptibility to an attack is employed to quantitatively evaluate the degree of exposure of the system and its components to intentional offensive actions. A scenario with two agents having opposing objectives is proposed, i.e., a defender having multiple alternatives of protection strategies for system elements, and an attacker having multiple alternatives of attack strategies against different combinations of system elements. The defender aims to minimize the system susceptibility to the attack, subject to budget constraints; on the other hand, the attacker aims to maximize the susceptibility. The problem is defined as a zero-sum game between the defender and the attacker. Because the interests of the attacker and the defender are directly opposed, it is irrelevant whether or not the defender reveals the strategy he/she will use, so the “leader–follower game” and “simultaneous game” formulations yield the same results. The results show an example of such a situation, and the von Neumann theorem is applied to find the (mixed) equilibrium strategies of the attacker and of the defender.
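The mixed equilibrium of such a zero-sum game can be computed with the standard maximin linear program, as sketched below for a hypothetical susceptibility matrix; this is the textbook formulation, not the authors' specific model.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical susceptibility matrix: rows are defender protection
# strategies, columns are attacker strategies.
S = np.array([[0.8, 0.2, 0.5],
              [0.3, 0.7, 0.4],
              [0.5, 0.5, 0.1]])
m, n = S.shape

# Defender's maximin LP: choose mixed strategy x and value v to minimize v
# subject to (S^T x)_j <= v for every attacker pure strategy j.
c = np.zeros(m + 1)
c[-1] = 1.0
A_ub = np.hstack([S.T, -np.ones((n, 1))])   # S^T x - v <= 0
b_ub = np.zeros(n)
A_eq = np.zeros((1, m + 1))
A_eq[0, :m] = 1.0                            # probabilities sum to one
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * m + [(None, None)])

print("defender mixed strategy:", res.x[:m].round(3))
print("game value (susceptibility):", round(res.x[-1], 3))
```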

Evaluating the Cost, Safety, and Proliferation Risks of Small Floating Nuclear Reactors
Authors: Michael J. Ford, Ahmed Abdulla and M. Granger Morgan
Abstract: It is hard to see how our energy system can be decarbonized if the world abandons nuclear power, but equally hard to introduce the technology in nonnuclear energy states. This is especially true in countries with limited technical, institutional, and regulatory capabilities, where safety and proliferation concerns are acute. Given the need to achieve serious emissions mitigation by mid-century, and the multidecadal effort required to develop robust nuclear governance institutions, we must look to other models that might facilitate nuclear plant deployment while mitigating the technology's risks. One such deployment paradigm is the build-own-operate-return model.

Because returning small land-based reactors containing spent fuel is infeasible, we evaluate the cost, safety, and proliferation risks of a system in which small modular reactors are manufactured in a factory, and then deployed to a customer nation on a floating platform. This floating small modular reactor would be owned and operated by a single entity and returned unopened to the developed state for refueling. We developed a decision model that allows for a comparison of floating and land-based alternatives considering key International Atomic Energy Agency plant-siting criteria. Abandoning onsite refueling is beneficial, and floating reactors built in a central facility can potentially reduce the risk of cost overruns and the consequences of accidents. However, if the floating platform must be built to military-grade specifications, then the cost would be much higher than a land-based system. The analysis tool presented is flexible, and can assist planners in determining the scope of risks and uncertainty associated with different deployment options.

Deterrence and Risk Preferences in Sequential Attacker–Defender Games with Continuous Efforts
Authors: Vineet M. Payyappalli, Jun Zhuang and Victor Richmond R. Jose
Abstract: Most attacker–defender games consider players as risk neutral, whereas in reality attackers and defenders may be risk seeking or risk averse. This article studies the impact of players' risk preferences on their equilibrium behavior and its effect on the notion of deterrence. In particular, we study the effects of risk preferences in a single-period, sequential game where a defender has a continuous range of investment levels that could be strategically chosen to potentially deter an attack. This article presents analytic results related to the effect of attacker and defender risk preferences on the optimal defense effort level and their impact on the deterrence level. Numerical illustrations and some discussion of the effect of risk preferences on deterrence and the utility of using such a model are provided, as well as sensitivity analysis of continuous attack investment levels and uncertainty in the defender's beliefs about the attacker's risk preference. A key contribution of this article is the identification of specific scenarios in which the defender using a model that takes into account risk preferences would be better off than a defender using a traditional risk-neutral model. This study provides insights that could be used by policy analysts and decisionmakers involved in investment decisions in security and safety.
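The sequential structure can be sketched by grid-search backward induction: the defender commits an effort, the attacker observes it and best-responds, and the defender picks the effort with the lowest resulting loss. The contest success function and the power-transform treatment of risk preferences below are illustrative assumptions, not the article's utility specification.

```python
import numpy as np

efforts = np.linspace(0.0, 10.0, 201)   # discretized continuous efforts

def success_prob(a, d):
    """Assumed contest-style probability that attack effort a succeeds
    against defensive effort d."""
    return a / (a + d + 1e-9)

def attacker_utility(a, d, risk_coef=1.0):
    # risk_coef < 1 models risk aversion, > 1 risk seeking, through a
    # simple power transform of the net payoff (a modeling assumption).
    net = 100.0 * success_prob(a, d) - a
    return np.sign(net) * abs(net) ** risk_coef

def defender_loss(a, d):
    return 100.0 * success_prob(a, d) + d   # expected damage plus effort cost

best = None
for d in efforts:
    # Attacker moves second and best-responds to the observed effort d.
    a_star = max(efforts, key=lambda a: attacker_utility(a, d))
    loss = defender_loss(a_star, d)
    if best is None or loss < best[2]:
        best = (d, a_star, loss)

d_opt, a_star, loss = best
# Deterrence corresponds to a best response of a_star = 0 at the chosen d.
print(f"defender effort {d_opt:.2f}, attacker best response {a_star:.2f}, "
      f"expected defender loss {loss:.2f}")
```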

October 2017

The Use of Simulation to Reduce the Domain of “Black Swans” with Application to Hurricane Impacts to Power Systems
Authors: Christine L. Berner, Andrea Staid, Roger Flage and Seth D. Guikema
Abstract: Recently, the concept of black swans has gained increased attention in the fields of risk assessment and risk management. Different types of black swans have been suggested, distinguishing between unknown unknowns (nothing in the past can convincingly point to their occurrence), unknown knowns (known to some, but not to the relevant analysts), and known knowns where the probability of occurrence is judged to be negligible. Traditional risk assessments have been questioned, as their standard probabilistic methods may not be capable of predicting or even identifying these rare and extreme events, thus creating a source of possible black swans.

In this article, we show how a simulation model can be used to identify previously unknown potentially extreme events that if not identified and treated could occur as black swans. We show that by manipulating a verified and validated model used to predict the impacts of hazards on a system of interest, we can identify hazard conditions not previously experienced that could lead to impacts much larger than any previous level of impact. This makes these potential black swan events known and allows risk managers to more fully consider them. We demonstrate this method using a model developed to evaluate the effect of hurricanes on energy systems in the United States; we identify hurricanes with potentially extreme impacts, storms well beyond what the historic record suggests is possible in terms of impacts.
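The search strategy can be sketched as follows: take a validated impact model (here a stand-in surrogate), sample hazard conditions well outside the historical envelope, and flag those whose simulated impacts dwarf anything observed. The surrogate model, parameter ranges, and threshold below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def impact_model(wind_speed, landfall_angle):
    """Hypothetical surrogate for a validated hurricane-impact model
    (e.g., predicted customer outages); not the authors' model."""
    return 1e4 * np.exp(0.08 * wind_speed) * (1 + 0.3 * np.cos(landfall_angle))

# Assumed historical envelope and the worst impact ever experienced.
hist_wind = rng.normal(45, 10, size=500).clip(20, 75)
hist_angle = rng.uniform(0, np.pi, size=500)
worst_observed = impact_model(hist_wind, hist_angle).max()

# Sample hazard conditions beyond the experienced range (winds above 75)...
cand_wind = rng.uniform(20, 110, size=20_000)
cand_angle = rng.uniform(0, np.pi, size=20_000)
impacts = impact_model(cand_wind, cand_angle)

# ...and surface "potential black swans": unexperienced conditions whose
# impacts far exceed the historical maximum.
extreme = impacts > 5 * worst_observed
print(f"{extreme.sum()} sampled conditions exceed 5x the worst observed impact")
```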

Assessing Climate Change Impacts on Wildfire Exposure in Mediterranean Areas
Authors: Antonio T. Monteiro, Mark A. Finney, Liliana Del Giudice, Enrico Scoccimarro and Donatella Spano
Abstract: We used simulation modeling to assess potential climate change impacts on wildfire exposure in Italy and Corsica (France). Weather data were obtained from a regional climate model for the period 1981–2070 using the IPCC A1B emissions scenario. Wildfire simulations were performed with the minimum travel time fire spread algorithm using predicted fuel moisture, wind speed, and wind direction to simulate expected changes in weather for three climatic periods (1981–2010, 2011–2040, and 2041–2070). Overall, the wildfire simulations showed very slight changes in flame length, while other outputs such as burn probability and fire size increased significantly in the second future period (2041–2070), especially in the southern portion of the study area. The projected changes in fuel moisture could result in a lengthening of the fire season for the entire study area. This work represents the first application in Europe of a methodology based on high-resolution (250 m) landscape wildfire modeling to assess potential impacts of climate change on wildfire exposure at a national scale. The findings can provide information and support for wildfire management planning and fire risk mitigation activities.

Construction Safety Risk Modeling and Simulation
Authors: Antoine J.-P. Tixier, Matthew R. Hallowell and Balaji Rajagopalan
Abstract: By building on a genetic-inspired attribute-based conceptual framework for safety risk analysis, we propose a novel approach to define, model, and simulate univariate and bivariate construction safety risk at the situational level. Our fully data-driven techniques provide construction practitioners and academicians with an easy and automated way of getting valuable empirical insights from attribute-based data extracted from unstructured textual injury reports. By applying our methodology on a data set of 814 injury reports, we first show the frequency-magnitude distribution of construction safety risk to be very similar to that of many natural phenomena such as precipitation or earthquakes. Motivated by this observation, and drawing on state-of-the-art techniques in hydroclimatology and insurance, we then introduce univariate and bivariate nonparametric stochastic safety risk generators based on kernel density estimators and copulas. These generators enable the user to produce large numbers of synthetic safety risk values faithful to the original data, allowing safety-related decision making under uncertainty to be grounded on extensive empirical evidence. One of the implications of our study is that like natural phenomena, construction safety may benefit from being studied quantitatively by leveraging empirical data rather than strictly being approached through a managerial perspective using subjective data, which is the current industry standard. Finally, a side but interesting finding is that in our data set, attributes related to high energy levels (e.g., machinery, hazardous substance) and to human error (e.g., improper security of tools) emerge as strong risk shapers.
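A minimal sketch of the generators, assuming a lognormal stand-in for the injury-report risk scores: resampling a fitted kernel density estimator yields synthetic univariate risk values, and a Gaussian copula (one common choice; the article treats copulas more generally) correlates two empirical margins.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(2)
risk = rng.lognormal(mean=1.0, sigma=0.8, size=814)  # stand-in risk scores

# Univariate generator: resample the KDE fitted to the empirical values.
synthetic = gaussian_kde(risk).resample(10_000, seed=3).ravel()

# Bivariate generator via a Gaussian copula: correlated standard normals
# become correlated uniforms, which are pushed through empirical quantiles.
rho = 0.6
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=10_000)
u = norm.cdf(z)
pairs = np.quantile(risk, u.ravel()).reshape(u.shape)

print("data mean vs synthetic mean:",
      risk.mean().round(2), synthetic.mean().round(2))
print("correlation of generated pairs:", np.corrcoef(pairs.T)[0, 1].round(2))
```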

A Blueprint for Full Collective Flood Risk Estimation: Demonstration for European River Flooding (open access)
Authors: Francesco Serinaldi and Chris G. Kilsby
Abstract: loods are a natural hazard evolving in space and time according to meteorological and river basin dynamics, so that a single flood event can affect different regions over the event duration. This physical mechanism introduces spatio-temporal relationships between flood records and losses at different locations over a given time window that should be taken into account for an effective assessment of the collective flood risk. However, since extreme floods are rare events, the limited number of historical records usually prevents a reliable frequency analysis. To overcome this limit, we move from the analysis of extreme events to the modeling of continuous stream flow records preserving spatio-temporal correlation structures of the entire process, and making a more efficient use of the information provided by continuous flow records. The approach is based on the dynamic copula framework, which allows for splitting the modeling of spatio-temporal properties by coupling suitable time series models accounting for temporal dynamics, and multivariate distributions describing spatial dependence. The model is applied to 490 stream flow sequences recorded across 10 of the largest river basins in central and eastern Europe (Danube, Rhine, Elbe, Oder, Waser, Meuse, Rhone, Seine, Loire, and Garonne). Using available proxy data to quantify local flood exposure and vulnerability, we show that the temporal dependence exerts a key role in reproducing interannual persistence, and thus magnitude and frequency of annual proxy flood losses aggregated at a basin-wide scale, while copulas allow the preservation of the spatial dependence of losses at weekly and annual time scales.

Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach (open access)
Authors: Toon Haer, W. J. Wouter Botzen, Hans de Moel and Jeroen C. J. H. Aerts
Abstract: Recent studies have shown that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model that accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks.
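The contrast between the first two decision models can be sketched for a single household deciding whether to buy a loss-reducing measure. All parameter values are illustrative, not the article's calibration, and the probability weighting is a simplified Tversky-Kahneman form rather than the full model with Bayesian updating.

```python
from math import log

p_flood = 0.01           # perceived annual flood probability (assumed)
loss = 100_000.0         # flood damage without the measure
reduced_loss = 40_000.0  # flood damage with the measure installed
cost = 800.0             # annualized cost of the measure

def eu_invests(wealth=200_000.0):
    """Expected-utility household with log utility (risk averse)."""
    eu_no = p_flood * log(wealth - loss) + (1 - p_flood) * log(wealth)
    eu_yes = (p_flood * log(wealth - cost - reduced_loss)
              + (1 - p_flood) * log(wealth - cost))
    return eu_yes > eu_no

def pt_invests(alpha=0.88, lam=2.25, gamma=0.69):
    """Prospect-theory household: losses scaled by lam, probabilities
    distorted by a Tversky-Kahneman style weighting function."""
    w = p_flood ** gamma / (p_flood ** gamma + (1 - p_flood) ** gamma) ** (1 / gamma)
    v_no = -w * lam * loss ** alpha
    v_yes = (-w * lam * (cost + reduced_loss) ** alpha
             - (1 - w) * lam * cost ** alpha)
    return v_yes > v_no

print("expected-utility household invests:", eu_invests())
print("prospect-theory household invests:", pt_invests())
```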

An Empirical Agent-Based Model to Simulate the Adoption of Water Reuse Using the Social Amplification of Risk Framework
Authors: Venu Kandiah, Andrew R. Binder and Emily Z. Berglund
Abstract: Water reuse can serve as a sustainable alternative water source for urban areas. However, the successful implementation of large-scale water reuse projects depends on community acceptance. Because of the negative perceptions that are traditionally associated with reclaimed water, water reuse is often not considered in the development of urban water management plans. This study develops a simulation model for understanding community opinion dynamics surrounding the issue of water reuse, and how individual perceptions evolve within that context, which can help in the planning and decision-making process. Based on the social amplification of risk framework, our agent-based model simulates consumer perceptions, discussion patterns, and their adoption or rejection of water reuse. The model is based on the “risk publics” model, an empirical approach that uses the concept of belief clusters to explain the adoption of new technology. Each household is represented as an agent, and parameters that define their behavior and attributes are defined from survey data. Community-level parameters—including social groups, relationships, and communication variables, also from survey data—are encoded to simulate the social processes that influence community opinion. The model demonstrates its capabilities to simulate opinion dynamics and consumer adoption of water reuse. In addition, based on empirical data, the model is applied to investigate water reuse behavior in different regions of the United States. Importantly, our results reveal that public opinion dynamics emerge differently based on membership in opinion clusters, frequency of discussion, and the structure of social networks.

September 2017

Resilience of Cyber Systems with Over- and Underregulation
Authors: Viktoria Gisladottir, Alexander A. Ganin, Jeffrey M. Keisler, Jeremy Kepner and Igor Linkov
Abstract: Recent cyber attacks provide evidence of increased threats to our critical systems and infrastructure. A common reaction to a new threat is to harden the system by adding new rules and regulations. As federal and state governments request new procedures to follow, each of their organizations implements its own cyber defense strategies. This unintentionally increases the time and effort that employees spend on training and policy implementation and decreases the time and latitude available to perform critical job functions, thus raising overall levels of stress. People's performance under stress, coupled with an overabundance of information, results in even more vulnerabilities for adversaries to exploit. In this article, we embed a simple regulatory model that accounts for cybersecurity human factors and an organization's regulatory environment in a model of a corporate cyber network under attack. The resulting model demonstrates the effect of under- and overregulation on an organization's resilience with respect to insider threats. Currently, there is a tendency to use ad hoc approaches to account for human factors rather than to incorporate them into cyber resilience modeling. It is clear that a systematic approach grounded in behavioral science, which already exists in cyber resilience assessment, would provide a more holistic view for decisionmakers.

Application of Graph Theory to Cost-Effective Fire Protection of Chemical Plants During Domino Effects
Authors: Nima Khakzad, Gabriele Landucci and Genserik Reniers
Abstract: In the present study, we have introduced a methodology based on graph theory and multicriteria decision analysis for cost-effective fire protection of chemical plants subject to fire-induced domino effects. By modeling domino effects in chemical plants as a directed graph, the graph centrality measures such as out-closeness and betweenness scores can be used to identify the installations playing a key role in initiating and propagating potential domino effects. It is demonstrated that active fire protection of installations with the highest out-closeness score and passive fire protection of installations with the highest betweenness score are the most effective strategies for reducing the vulnerability of chemical plants to fire-induced domino effects. We have employed a dynamic graph analysis to investigate the impact of both the availability and the degradation of fire protection measures over time on the vulnerability of chemical plants. The results obtained from the graph analysis can further be prioritized using multicriteria decision analysis techniques such as the method of reference point to find the most cost-effective fire protection strategy.
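Given a directed escalation graph, the two centrality scores are available directly in networkx; the toy topology below is invented. One subtlety: networkx's closeness centrality uses incoming distances on directed graphs, so out-closeness is obtained by reversing the graph first.

```python
import networkx as nx

# Toy directed graph: an edge u -> v means a fire at unit u can escalate to v.
G = nx.DiGraph([("T1", "T2"), ("T1", "T3"), ("T2", "T4"),
                ("T3", "T4"), ("T4", "T5")])

# Out-closeness: how quickly a unit can reach (ignite) the rest of the plant;
# a candidate for active protection such as sprinkler systems.
out_closeness = nx.closeness_centrality(G.reverse())

# Betweenness: units bridging escalation paths; a candidate for passive
# protection such as fireproof coatings.
betweenness = nx.betweenness_centrality(G)

print("active protection candidate:", max(out_closeness, key=out_closeness.get))
print("passive protection candidate:", max(betweenness, key=betweenness.get))
```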

A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents
Authors: Hongyang Yu, Faisal Khan and Brian Veitch
Abstract: Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, the lack of event occurrence data, and the need for reliable prior information. In this study, a new technique based on hierarchical Bayesian modeling is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique is demonstrated through a case study in the marine and offshore industry.
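One way to picture the source-to-source variability idea is a hierarchical Beta-Binomial model, sketched below in PyMC (a tool choice of ours, not the article's) with hypothetical failure counts: a shared hyperprior pools several data sources without forcing their failure probabilities to be equal.

```python
import numpy as np
import pymc as pm

# Hypothetical failure counts for the same barrier reported by five sources.
failures = np.array([1, 0, 3, 2, 5])
trials = np.array([50, 40, 120, 60, 90])

with pm.Model():
    # Shared hyperpriors express source-to-source variability.
    alpha = pm.Exponential("alpha", 1.0)
    beta = pm.Exponential("beta", 0.1)
    # One failure probability per source, partially pooled via the hyperpriors.
    p = pm.Beta("p", alpha=alpha, beta=beta, shape=len(failures))
    pm.Binomial("obs", n=trials, p=p, observed=failures)
    idata = pm.sample(1000, tune=1000, progressbar=False)

# Posterior failure probabilities, shrunk toward the pooled estimate; the
# same posterior supports both forward (predictive) and backward queries.
print(idata.posterior["p"].mean(dim=("chain", "draw")).values.round(4))
```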
