Political Risk: Three Reasons Why Political Risk Is Difficult to Understand

By

John F. Phillips

In their book Political Risk: How Businesses and Organizations Can Anticipate Global Insecurity, Condoleezza Rice and Amy Zegart spend a lot of time addressing what they call the “five hards of political risk.” Rice and Zegart identify the “hards” as: hard to reward, hard to understand, hard to measure, hard to update, and hard to communicate.

Let’s drill down a bit on the second “hard”: hard to understand.

Political risk is difficult to understand because, many times, our concept of political risk is either flawed or uninformed. Understanding political risk is more than reading, thinking about, and forming opinions on current events. It is about identifying patterns of behavior in the international arena and then using the knowledge gleaned from analyzing those patterns to estimate the probability of a risk and the impact it may have on an organization.

Rice and Zegart identify three factors that negatively impact the understanding of political risk: bias, analyzing individual incidents rather than behavior patterns, and “blind spots” that can influence the understanding of behavior patterns and the risk these patterns reveal.

Bias is a real problem because it shapes how we look at events. In the political science literature, this bias is defined as our “world view.” World view acts as a filter that can distort our perception of events and our understanding of the “why” of political behavior. All of us have a world view that we have formed over our lifetimes. Even those who claim to be “objective” are influenced by their own world view.

World view can produce a distorted understanding of events and can negatively impact responses. It leads to risk analysis grounded in “what I wish to be” rather than the more realistic foundation of “this is how it really is.” We all want to be optimistic and believe that negative events can’t happen to our organization, or that they can be adequately managed if they do, but bias can lead to poor risk analysis and incorrect responses that exacerbate risk rather than mitigate it. Bias that conforms to world view can be the death knell for an organization.

Looking at individual events rather than patterns of events can also undermine the ability to understand political risk and its impact. Rice and Zegart refer to this as the “availability heuristic,” noting that “people tend to judge the frequency of an event based on how many similar instances they can readily recall” (Rice and Zegart, 2018, p. 87).

In lay terms, we tend to remember some things, especially negative or high-probability events, as occurring more often than others, and we draw conclusions about current incidents based on our recollection of those past events. Unfortunately, political risk is often driven more by low-probability but high-impact events. The recent blockage of the Suez Canal by a wayward vessel is a case in point.

Another problem inherent in this approach is that one event does not necessarily mean anything. One event is a data point; that’s all. My mentor always taught that a data point is a data point; two data points are two data points; three data points are a “hey, look at this”; four or more data points may be a trend. By the time ten or so data points have been identified, correlation may be present, and a better understanding of behavior patterns can begin to take shape.
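To make that rule of thumb concrete, here is a minimal sketch in Python, assuming a simple log of incidents tagged by type and region. The incident labels and the four-occurrence threshold are illustrative assumptions of mine, not anything drawn from Rice and Zegart.

```python
from collections import Counter

# Hypothetical incident log: (year, incident_type, region). Purely illustrative data.
incidents = [
    (2019, "port disruption", "Red Sea"),
    (2020, "port disruption", "Red Sea"),
    (2020, "export restriction", "East Asia"),
    (2021, "port disruption", "Red Sea"),
    (2021, "port disruption", "Red Sea"),
    (2022, "civil unrest", "South America"),
]

# Count how often each (incident_type, region) pairing recurs.
counts = Counter((kind, region) for _, kind, region in incidents)

# Apply the rule of thumb: one or two occurrences are isolated data points;
# three deserve a closer look; four or more may signal a trend (threshold is arbitrary).
for (kind, region), n in counts.items():
    if n >= 4:
        label = "possible trend"
    elif n == 3:
        label = "worth a closer look"
    else:
        label = "isolated data point"
    print(f"{kind} in {region}: {n} occurrence(s) -> {label}")
```

The point is not the code but the discipline: tallying recurrences over time pushes the analysis toward patterns rather than toward the single incident that happens to be most memorable.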

Rice and Zegart argue that the presence of “blind spots” is closely related to the “availability heuristic” addressed earlier (Rice and Zegart, 2018, pp. 92-93). Blind spots can also be caused by bias influenced by personal world view and, in a group decision-making scenario, by what is known as “group think,” which can lead to conformity of thought and the rejection of contrarian points of view. Group think is very difficult to overcome because of the pressures inherent in group dynamics. Blind spots are difficult to eliminate, but they can be effectively managed by being aware of their presence and accounting for them during the political risk analysis process.

Bias, the presence of the “availability heuristic,” and blind spots can make the understanding of political risk very difficult. They work in concert and their presence can lead to a flawed interpretation of behavior patterns and, ultimately, a misunderstanding of the “why” of the political behavior that creates political risk.

It is important that political risk be understood as objectively as possible. This means managing bias and the “availability heuristic,” and identifying and accounting for the blind spots that hinder perceptions of risk. To do this, organizations must be proactive, committing the time and resources to create an environment where understanding and mitigating political risk is a high priority.

What are you and your organization doing to better analyze and understand political risk?

For a deeper discussion of political risk, I highly recommend that you read Political Risk: How Businesses and Organizations Can Anticipate Global Insecurity by Condoleezza Rice and Amy B. Zegart (Twelve Books, 2018). It should be part of every business leader’s library.
