Summary: Workshop on the field and science of risk analysis, Ann Arbor, USA, May 5-6, 2016

On May 5-6, 2016, a workshop on the field and science of risk analysis was held in Ann Arbor, USA. The aim of the workshop was to stimulate discussions and future activities related to the topic.

The workshop participants were: David Alderson (Naval Postgraduate School, USA), Terje Aven (University of Stavanger, Norway), Frederic Bouder (Maastricht University, The Netherlands), Seth Guikema (University of Michigan, USA), Katherine McComas (Cornell University, USA), Pia-Johanna Schweizer (University of Stuttgart, Germany), Kimberly M. Thompson (University of Central Florida, USA), Felicia Wu (Michigan State University, USA); and, as secretaries, Roger Flage (University of Stavanger, Norway), Allison Reilly (University of Michigan, USA), and Kristen Schell (University of Michigan, USA).

The workshop was co-sponsored by the University of Stavanger, Norway, and the University of Michigan, USA.

Risk analysis is here interpreted in a wide sense, as in Society for Risk Analysis (SRA) contexts, covering in particular risk assessment, risk communication and risk management. Below we summarize some of the reflections that emerged from the discussions.

A distinction highlighted during the workshop introduction and frequently referred to throughout the workshop was that between:

   A. Studies and management of the risk of specific activities

   B. Generic risk practices and research: How to conceptualise, understand, assess, communicate and manage risk

For example, it was pointed out that the goal is not to have as many people as possible working on B, but enough to sustain development there and generate momentum. There is a need for a proper balance between A-type and B-type activities.

Moreover, there was a suggestion for a new aspect C, covering the interrelation between A and B, i.e. how to arrive at a coherent framework for reflecting on A and B. Others considered this aspect covered by B. There was also broad agreement that publications contributing to the field of risk analysis should contribute to generalizable knowledge, so even when a publication addresses A, it should also have some impact on B.

The discussions at the workshop revolved around the following questions and issues:

a) Defining the risk analysis field/science. Is risk analysis actually a field/science? Is it really important?

The answer was a clear and unambiguous ‘yes’ to the question of whether risk analysis is a field. This is important from an individual (self-serving) point of view, in terms of career progression, recognition for work performed, and promotion and tenure. But it is also important because many people working across applications and domains face similar issues, and the field provides a venue for them to share ideas, learn about best practices, and contribute to the development of these practices.

The question of whether risk analysis is a science received more divergent answers. One line of argument was that it depends on how “science” is defined: whether the reference is the traditional scientific method, with requirements for reproducibility and validation, or a broader understanding. Under the latter, it was argued that although risk analysis is indeed a science, it is not a unified science. However, risk analysis is definitely scientific in the sense of being systematic and evidence-based.

b) What are the main challenges/obstacles for the development of this field/science? How can we best meet these challenges/obstacles?

Several main challenges/obstacles were identified and discussed, including:

  • Lack of clear research funding mechanisms specifically tied to risk: Research project calls tend to be issue-focused, i.e. focused on A rather than B.
  • Lack of standards: Standards are difficult to come up with due to the interdisciplinary nature of the field; and the lack of such standards makes it hard for the community to be authoritative.
  • Unclear career paths for those who work in the field: In academia, interdisciplinary researchers face tensions in obtaining tenure; and in industry, risk analysis is not an established profession (beyond actuaries, finance, consultants).
  • Lack of advocacy for the field/science: E.g., visibility in policy circles.
  • Lack of critical mass in the community
  • Lack of unified approach to training

Suggestions for how to meet these challenges/obstacles included:

  • Raise visibility and increase outreach to decision-makers and the media: E.g., publicize specific examples of great work having an impact, and collaborate with like-minded or complementary professional organizations to increase visibility of risk analysis as a field and science.
  • General outreach and PR: E.g., recruit educators or figure out a way for SRA to work on this as a society; invite and chaperone media at the Annual Meeting.
  • Establish standards for quality control
  • Fund postdoc positions within government (like AAAS fellows)
  • Coordinate better within academic programs to offer robust curricula in risk, and improve educational offerings at conferences

c) Define key, concrete subjects of the risk field/science

The following subjects were identified:

  • Risk assessment: Methodology (probability, statistics, domain-specific topics), modeling, and process; evidence evaluation; survey of domain-specific techniques/areas of application (financial, engineering, human health, ecological, environmental).
  • Risk management and governance: Decision analysis, policy analysis, operations research, economics (microeconomics, cost-benefit analysis, behavioral economics), regulatory systems, stakeholder and public engagement, ethics; key concepts (e.g. resilience).
  • Risk perception and communication: Cognition about risk (heuristics, biases, mental models, etc.); cultural, social, institutional factors; framing effects, experimental design, surveys; public engagement, messaging, focus groups; quantitative and qualitative methods (mental models, experimental design, survey research, statistics); key concepts (e.g. trust).

There was also a broad discussion, in particular, about what a curriculum in risk analysis would look like. What goes into an undergraduate/graduate understanding of risk? What goes into a professional certification for risk? One suggestion was to define a “ladder of competence” for risk analysis, analogous to the skill-acquisition model suggested by Hubert Dreyfus and later applied in fields such as software engineering.

d) Where do we go from here? What steps do we need to take?

In addition to the initiatives identified in response to issue b), the following suggestions for how to move forward were made:

  • Make use of SRA New Initiatives to increase visibility: Invite media to be chaperoned at the Annual Meeting (cf. b)). Encourage (via support) the creation of TED talks, or create TED-like talks on the SRA website.
  • Organize a short video competition (e.g., 3-5 min, topics to be defined, appealing to students, with finalists shortlisted, online voting on the finalists, and the winning entries shown at the SRA meeting luncheon, where an award would be announced/presented): Discuss with the SRA Council and awards committee; define a process, guidelines/requirements, a timeline for submission, etc.
  • Establish a certification process for risk analysts, along the lines of INFORMS.

The outputs/deliverables from the workshop were an increased understanding of the obstacles to further development of the risk analysis field, as well as an increased understanding of what it means to be a risk analysis professional. Furthermore, several ideas for how to improve the current situation were generated, and preliminary drafts of subjects defining the core of the risk field and science were drawn up. Finally, the workshop provided a basis for a roundtable at the SRA Annual Meeting 2016 in San Diego, in which most of the workshop participants will take part.

Slides from the workshop presentations and group work can be found here:

Intro slides Terje Aven

Working group 1 slides

Working group 2 slides