Category Archives: scenario analysis

Griffin’s Risk Management Superpower

 

The third installment of the feature film series Men in Black features the alien Griffin. Griffin possesses the critical ArcNet shield that can protect the Earth against the impending Boglodite invasion. Griffin also possesses an amazing superpower: he can see the many possible futures in store for us. The movie’s writers, director and the actor playing Griffin, Michael Stuhlbarg, exploit this superpower to great comedic effect, first in a scene that takes place in 1969 at Andy Warhol’s Factory, and then later at Shea Stadium, where Griffin visualizes the Miracle Mets’ entirely improbable victory in the World Series later that year. Griffin’s superpower goes an important step further, and this is key to how the comedy is written. Not only does he see the wide array of possible futures, but he also understands which futures are consistent with events as they play out, and which futures are suddenly ruled out by current events. He sees what mathematicians call the filtration.
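For readers who want the formal object behind that last line, here is the standard textbook definition of a filtration, stated in general terms (the notation is mine, not the post’s):

```latex
% A filtration on a probability space (\Omega, \mathcal{F}, P) is an
% increasing family of sigma-algebras indexed by time:
\[
  \{\mathcal{F}_t\}_{t \ge 0}, \qquad
  \mathcal{F}_s \subseteq \mathcal{F}_t \subseteq \mathcal{F}
  \quad \text{for all } s \le t.
\]
% \mathcal{F}_t collects the events whose occurrence or non-occurrence is
% known by time t. As t advances, futures inconsistent with what has already
% happened are ruled out, which is exactly the updating Griffin performs
% scene by scene.
```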

Continue reading

Climate apples and oranges

Monday’s post discussed a proposal by Vikram Pandit, the CEO of Citigroup, calling for a comparison of the results produced by risk models across different banks when evaluating a standardized “hypothetical” portfolio of assets. Exercises like this are standard fare in many research fields where modeling encompasses a broad array of complicated issues, and there can be wide disparity in both structural and parameter choices.

Let’s look at the economics of controlling greenhouse gas emissions. A number of major research institutions have developed economic models to assess the costs of mitigation, including one of the MIT centers where I work. The results from these models play a useful role in public discussion about what should be done.

But these models are inherently very complicated. Greenhouse gases are produced in many different sectors of the economy, in particular in the energy sector, which is in turn an input to many other sectors, so an economy-wide model is required. The greenhouse gas problem operates on long time scales, so technological evolution over many decades is key. Finally, greenhouse gases are a global public goods problem, so a global economic model is required. Building a model that meets these demands is a heroic effort, requiring many judgment calls on major issues.

Understanding the different assumptions and structural choices, and how they impact the results, is useful and can shape how the results are read. During the Bush administration, the U.S. government, which underwrites much of the research in this area, ran a comparison exercise on these economic models similar to what Vikram Pandit is proposing for bank risk models. They took three of the leading models and had them generate a suite of diagnostic results when analyzing a common set of policies. They then published an analysis of the different results and of the underlying modeling choices that generated the differences. This was done as part of the Climate Change Science Program, and the full results can be found here.

To illustrate the variation across models, here’s just one of the diagnostics: the forecasted change in the price of natural gas. The three models produced strikingly different results. Under modest emissions constraints, one model forecasts a price increase of more than 800% by the end of the century, while another forecasts an increase of less than 200%.

Seeing such widely variant results is an eye opener. Novices in any research area often take modeling results for granted. Seasoned researchers are more attuned to the weaknesses, the uncertainties and the range of opinions across the scientific community. Comparison exercises, like the one done by the U.S. Climate Change Science Program, or like the one Citigroup’s Vikram Pandit is proposing, shine light on differences of which the public needs to be more aware.

Of course, knowing that there are differences isn’t the end of the process, and doesn’t solve all of the problems. But it’s a useful contribution.

Banking apples and oranges

Vikram Pandit, the CEO of Citigroup, used an opinion piece in last week’s Financial Times to make an interesting proposal on risk disclosures: banks and other financial institutions should be required to report how their internal modeling assesses the risk in a “benchmark” portfolio. Regulators would define the contents of this hypothetical portfolio, and banks would report “a hypothetical loan/loss reserve level, value at risk, stress-test results and risk-weighted assets.”

It’s a useful proposal that could give investors and other market participants valuable additional information. But it has its limitations: it does not resolve some inherent problems with risk-based capital requirements, nor does it eliminate the need to control bank size and risk by other means.
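The intuition behind the proposal is easy to illustrate with a toy calculation. In the sketch below (every number and model choice is hypothetical, not taken from Pandit’s piece), two banks run their own internal models on the same benchmark portfolio; because they make different volatility and correlation assumptions, they report quite different 99% value-at-risk figures, and the side-by-side comparison reveals how conservative or aggressive each model is.

```python
# Sketch (not from the original post): why reporting model output on a common
# benchmark portfolio is informative. Two hypothetical banks value the same
# portfolio but use different volatility and correlation assumptions in their
# internal models, so they report different 99% value-at-risk numbers.
import numpy as np

rng = np.random.default_rng(0)

# Benchmark portfolio: $100m in each of two assets (hypothetical).
positions = np.array([100e6, 100e6])

def var_99(vols, corr, n_scenarios=100_000):
    """One-day 99% VaR from a simple Monte Carlo of correlated normal returns."""
    cov = np.outer(vols, vols) * np.array([[1.0, corr], [corr, 1.0]])
    returns = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_scenarios)
    pnl = returns @ positions
    return -np.percentile(pnl, 1)          # loss at the 1st percentile of P&L

# Bank A's model: calm markets, low correlation.
var_a = var_99(vols=np.array([0.010, 0.015]), corr=0.2)
# Bank B's model: higher volatility, assets that move together.
var_b = var_99(vols=np.array([0.015, 0.025]), corr=0.7)

print(f"Bank A reports 99% VaR of ${var_a/1e6:.1f}m on the benchmark portfolio")
print(f"Bank B reports 99% VaR of ${var_b/1e6:.1f}m on the same portfolio")
```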

Continue reading

The Risk Management of Economic Angst

I’ve just returned from Europe, where I spent part of the summer talking to companies in different European nations. Everywhere I went, the signals were flashing yellow. In Europe, the recovery seems to be coming to a standstill.

A poisonous mix of sluggish output, sovereign debt crisis, fragile banks and a lack of political will has created a perverse cycle: lower growth, lower-than-expected tax revenues (despite VAT and income tax hikes), obstinate fiscal deficits, growing public debt, more turmoil in financial markets, stressed banks that refuse to ease lending, and further austerity and spending cuts, which undercut business and consumer confidence as well as disposable income, which leads to yet lower economic growth, and so on. It is as if much of the Eurozone and the UK are dangerously balanced on the edge of another recession, without even having been able to recover to the levels of output seen before it all started in 2008.

Recent data tells a sad story. Industrial production in the 17-nation euro area fell 0.7 percent in June compared to May. GDP growth slowed to a meager 0.2 percent in the second quarter from 0.8 percent in the previous three months. Even mighty Germany is decelerating fast: last quarter its GDP growth slowed to a shocking 0.1 percent, and the latest IFO Business Climate Index shows that German businesses have turned much less optimistic.

No question that the coming year will be challenging for European businesses. Many companies have been closely monitoring how the situation is evolving, and are diligently working on contingency plans that deal with possible sinister economic and financial (Eco-Fin) scenarios.

The following figure shows the risk matrix used by a European manufacturing company (call it MadeInEU) back at the beginning of 2010, a few months after the start of the Greek debt tragedy. The risk matrix plots the different drivers of risk that matter to MadeInEU. The horizontal axis measures the likelihood that a particular driver of risk materializes; the vertical axis measures how the driver affects operations, EBITDA and cash flows. A heuristic combination of probability and impact helps managers rank the drivers of risk by their relative importance.
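To make that heuristic concrete, here is a minimal sketch of probability-times-impact scoring. The driver names echo those discussed in the post, but the scores are invented for illustration and are not MadeInEU’s actual assessments.

```python
# Sketch of a probability-impact risk matrix ranking. Scores are on a 1-5
# scale and are purely illustrative, not the company's actual assessments.

# (probability of the driver materializing, impact on operations/EBITDA/cash flows)
risk_drivers = {
    "Macroeconomy":       (4, 5),
    "Access to capital":  (3, 5),
    "Financial risks":    (4, 4),   # e.g. outlook for the euro
    "Commodity prices":   (3, 3),
    "Supply disruptions": (2, 4),
}

def score(probability: int, impact: int) -> int:
    """Heuristic composite score: the product of probability and impact."""
    return probability * impact

# Rank the drivers by their composite score, highest first.
ranking = sorted(risk_drivers.items(),
                 key=lambda item: score(*item[1]),
                 reverse=True)

for name, (p, i) in ranking:
    print(f"{name:<20} probability={p} impact={i} score={score(p, i)}")
```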

The same risk matrix, redrawn at the end of June, showed a different picture. The most important changes that caught management’s attention were the significant upgrades in three drivers: (1) Macroeconomy; (2) Access to Capital; (3) Financial Risks (mainly related to the outlook for the Euro).

Continue reading

TEPCO’s black swan

The four reactors at Fukushima Daiichi damaged in the March 11 earthquake and tsunami belong to the Tokyo Electric Power Co. (TEPCO). Not surprisingly, TEPCO’s financial situation has become perilous. In announcing its annual results, the company recorded a direct loss of more than ¥1 trillion ($12 billion). This does not include several significant additional costs. Several of TEPCO’s other nuclear reactors remain down, in part due to the regulatory and political reaction to the accident. The company will need to purchase replacement power both for the four lost reactors and for these additional shut-down reactors. Most importantly, the ¥1 trillion does not include the liability for the various types of damage caused by the accident. Estimates for the size of this liability range from 2 to 11 times the ¥1 trillion loss already booked. Following the accident, TEPCO’s stock quickly lost three quarters of its market value, or more than ¥2.7 trillion ($32 billion). It is widely believed that the liabilities from the accident would make TEPCO insolvent, were it not for a government decision to support the company financially.

Continue reading

What’s Special About Stress Tests?

The Fed has just announced a second round of stress tests for US banks. This follows the first round of tests in Spring 2009, with results announced in May. There are many interesting issues one could touch on regarding the role of stress tests in regulating the banking system. But this blog is focused on risk management at non-financial corporations, so we will leave most of those issues for discussion in other venues. Still, risk management at non-financial corporations employs many of the same tools as risk management at banks and other financial institutions, so there are a few interesting questions common to the two realms. One of them is “What is special about stress tests?”

Are stress tests any different from the other quantitative tools employed to measure risk? If the Fed is already calculating a VaR for each bank, it would seem that it already has a more comprehensive piece of information than would be contained in a stress test. A stress test identifies a single bad scenario and calculates the bank’s losses in that scenario. When calculating its VaR, a bank effectively determines its losses under many, many possible scenarios and then generates a probability distribution of losses. The VaR just locates the tail of this distribution. What extra information could the Fed extract by forcing the banks to focus on a single scenario instead of the full probability distribution of scenarios?
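The relationship between the two measures can be seen in a toy calculation. The sketch below, with purely hypothetical numbers, simulates a loss distribution for a portfolio, reads the 99% VaR off its tail, and then computes the loss in one hand-picked stress scenario: the VaR summarizes the whole distribution at a single quantile, while the stress test prices out one named scenario.

```python
# Toy comparison of VaR and a single stress scenario. All figures are
# hypothetical and for illustration only.
import numpy as np

rng = np.random.default_rng(42)

# Portfolio: exposures (in $m) to two risk factors, e.g. equities and credit spreads.
exposures = np.array([500.0, 300.0])

# Monte Carlo of daily factor returns: many, many possible scenarios.
n_scenarios = 200_000
factor_returns = rng.multivariate_normal(
    mean=[0.0, 0.0],
    cov=[[0.0002, 0.0001],
         [0.0001, 0.0004]],
    size=n_scenarios,
)
losses = -(factor_returns @ exposures)        # positive number = loss in $m

# VaR: locate the tail of the full loss distribution.
var_99 = np.percentile(losses, 99)

# Stress test: one hand-picked bad scenario, e.g. equities -10%, second factor -5%.
stress_scenario = np.array([-0.10, -0.05])
stress_loss = -(stress_scenario @ exposures)

print(f"99% one-day VaR (from the full distribution): ${var_99:.1f}m")
print(f"Loss in the single stress scenario:           ${stress_loss:.1f}m")
```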
Continue reading
