Today, I will endeavour to address an interesting conundrum: why do fairly sophisticated industrial organizations with outstanding in-house financial talent choose not to take into account uncertainty, the value of optionality, and rational competitive behaviour in their big-bet industrial decision making?
Not all organizations, of course. On 04.02.2016, Dagens Næringsliv (DN; the leading Norwegian financial newspaper) ran an article about Norske Skog, a globally leading paper producer headquartered in Norway, and how a group of secured bondholders headed by Citibank had taken the company to court in the US over an agreement between Norske Skog and a group of unsecured bondholders, including GSO (a Blackstone company) and Cyrus Capital. Reading between the lines, both the Blackstone side and the Citibank side appeared to have taken the full package of MBA courses on decision trees and sequential game theory. (Norske Skog’s executive management appeared, in comparison, less prepared*.) Furthermore, in preparation for this blog post, I had a chat with a senior executive in a large European energy company, who stated that (my translation, slightly edited): “Mostly anybody, including the CEO, in [my company] would be able to credibly contribute to a discussion about the output from [their in-house stochastic dynamic programming software package]”.
But for most organizations, DCF, with uncertainty modelled by adding a risk premium to the risk-free interest rate, appears to be the preferred analytical tool for valuing industrial projects (with the exception of multiples-based methods), despite the fact that sound theoretical tools for, say, valuing optionality and predicting competitive behaviour have been available to most MBAs and finance PhDs for one or two decades.
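As a point of reference, the plain-vanilla DCF approach described above fits in a few lines. The cash flows and the 7% risk premium below are hypothetical, chosen only to show how all uncertainty gets squeezed into a single discount rate:

```python
def npv(cash_flows, risk_free=0.03, risk_premium=0.07):
    """Plain-vanilla DCF: discount each year's cash flow at the
    risk-free rate plus a flat risk premium (10% in total here)."""
    r = risk_free + risk_premium
    return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows))

# Hypothetical project: invest 100 now, receive 30 per year for 5 years.
project = [-100, 30, 30, 30, 30, 30]
value = npv(project)   # roughly 13.7 -- positive, so "go"
```

Note that nothing in this calculation distinguishes a project management can steer mid-course from one it cannot; that is precisely the optionality the rest of this post is about.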
The situation reminds me of fracture mechanics-based integrity assessment of offshore oil and gas pipelines 5-10 years ago. The tools for calculating correctly were there, but the practitioner community and various standards-setting bodies chose to stick to old-fashioned standards, like DNV-OS-F101, complemented with proprietary calculations and safety factors, rather than do a full finite element calculation. The result: often overly conservative and costly designs, but also cases of significant under-dimensioning, fracture, costly repairs and possibly environmental damage.
Methodically, I started my small conundrum-resolving project by mapping out standard practices within banking, venture funds, energy companies, oil services companies, and early-phase technology companies.
The mapping was done through informal interviews with a few people from the industries in question (with no claim of statistical significance). Some typical applications discussed were: repair yes / no, extend lifetime yes / no, start-up valuation, distressed bond valuation, produce yes / no, drill yes / no, enter market yes / no, and conduct R&D project yes / no.
Some patterns emerged:
- Most companies use variations of standard DCF, with uncertainty taken into account through the discount factor (plus, often, identification of a worst-case scenario), optionality not valued quantitatively, and, in the case of company valuation, corrections for various balance sheet items.
- Some organizations leverage more sophisticated models, including stochastic dynamic programming / simulation / decision trees. These are generally found in the energy sector and finance.
- There are also a number of Norwegian universities and research organizations with advanced capabilities / practices in this area, including Norwegian School of Economics (NHH; Bergen), Norwegian Business School (BI; Oslo), Norwegian Computing Center (NR; Oslo), and SINTEF (Trondheim).
However, the conundrum remained unresolved: why would the practitioner community not want to calculate correctly**? A number of explanatory factors were suggested: established in-house practices, the need for more tough-to-estimate parameters (including subjective success probabilities), lack of competence, and shortcomings of the available theoretical frameworks.
On a general note, one may predict that the observed reliance on plain-vanilla DCF (plus some qualitative considerations on the last page of the ppt) would result in incorrect / suboptimal decisions, too-low valuations (by ignoring the value of optionality), too-high valuations (by ignoring predictable competitive behaviours and indulging unjustified optimism), and the need for adjustment factors to compensate for un-modelled factors. However, the real issue is that one precludes the solid thinking and the qualitative insights that come with proper quantitative modelling, which leads to incomplete exploration and assessment of the solution space, unknown and sometimes unnecessary strategic exposure, and strategic sub-optimality.
One may at this point wonder what tools would be available to a practitioner who decides to venture beyond DCF. In practice, there are two approaches: real options theory, often calculated by stochastic dynamic programming (from the field of mathematical finance), and decision trees (from the field of decision theory). Each has its pros and cons.
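To make the decision-tree approach concrete, here is a minimal, purely illustrative sketch (all numbers hypothetical): a small tree mixing chance nodes and decision nodes is solved by rolling it back from the leaves, taking expectations at chance nodes and the best alternative at decision nodes.

```python
def rollback(node):
    """Backward induction on a small decision tree.
    A node is a terminal payoff (a number), a decision node
    ('decision', [alternatives]) resolved by taking the maximum,
    or a chance node ('chance', [(prob, outcome), ...]) resolved
    by taking the probability-weighted average."""
    if not isinstance(node, tuple):
        return node                                    # leaf payoff
    kind, children = node
    if kind == "decision":
        return max(rollback(child) for child in children)
    if kind == "chance":
        return sum(p * rollback(child) for p, child in children)

# Hypothetical R&D project: invest 50 now; with probability 0.4 the
# pilot succeeds and we may then scale up (payoff 200) or sell the
# IP (payoff 60); with probability 0.6 it fails and we lose the 50.
tree = ("chance", [
    (0.4, ("decision", [200 - 50, 60 - 50])),
    (0.6, -50),
])
value = rollback(tree)   # 0.4 * 150 + 0.6 * (-50) = 30
```

The downstream decision node is exactly the optionality that a single-number DCF cash-flow forecast flattens away.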
Going back to the DN article, what tool would I use if I were, say, the CFO of Norske Skog? Short answer #1: a multi-player decision tree (or in game theory parlance, an extensive form representation or a game tree) that represents the sequential choices of the stakeholders (administration, shareholders, holders of unsecured and secured bonds) plus key uncertainties (including the price path of newsprint and the outcome of certain foreseeable court cases). If I had done that, would I have understood that Norske Skog would be taken to court by Citibank over this agreement with GSO and Cyrus Capital? Short answer #2: Yes, absolutely.
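As a toy sketch of such a game tree (the players, moves and payoffs below are entirely made up, and much simpler than the real situation): each internal node names the player who moves, and backward induction lets each player pick the branch that maximizes its own payoff component, yielding both the predicted outcome and the predicted sequence of moves.

```python
PLAYERS = {"company": 0, "secured": 1}   # index into the payoff tuple

def solve(node):
    """Backward induction on a two-player game tree. Internal nodes
    are dicts {'player': ..., 'moves': {action: subtree}}; leaves are
    payoff tuples (company, secured holders). Returns the equilibrium
    payoff and the predicted path of play."""
    if isinstance(node, dict):
        i = PLAYERS[node["player"]]
        results = [(action, solve(sub)) for action, sub in node["moves"].items()]
        action, (payoff, plan) = max(results, key=lambda r: r[1][0][i])
        return payoff, [action] + plan
    return node, []

# Hypothetical payoffs: the company first decides whether to strike a
# deal with the unsecured holders; the secured holders then decide
# whether to sue over that deal.
game = {"player": "company", "moves": {
    "deal": {"player": "secured", "moves": {
        "sue": (4, 4),
        "accept": (5, 1)}},
    "no_deal": (3, 3),
}}
payoff, plan = solve(game)   # plan is ['deal', 'sue']
```

With these (made-up) numbers the company still prefers the deal, yet the lawsuit is perfectly predictable by rollback; that is the kind of qualitative insight the article suggests was missed.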
I tend to complete my blog posts with a call to action. In this case, it is simple: i) identify a pilot case for the application of quantitative methods for valuing a project opportunity with significant optionality and / or competitive behaviours; ii) evaluate the quantitative and qualitative insights extracted from that case, benchmarked against a plain-vanilla DCF analysis; iii.a) if the situation is repetitive (say, a trading strategy for an energy company or investing for a VC), build up internal capabilities in quantitative analysis; iii.b) if it is a singular event (say, a one-off R&D project or an M&A scenario), get assistance from an external consultant.
And if you think that this blog post is just a trivial, but somewhat interesting example of technology adoption theory in action (and now we are at the chasm between early adopters and early majority), I will not disagree. But, let me revert to my example from the offshore oil and gas pipeline industry, in which a small number of industry practitioners, consulting firms, and research organizations (supported by major oil and gas companies) over the years helped the pipeline industry move from standards-based integrity assessment to full 3D finite element analyses. I am hoping that this blog post could in a similar, though obviously much smaller way, contribute to the acceptance of full-scale decision trees and game trees as useful tools in industrial decision making.
*) Disclaimer: I have no inside information whatsoever about Norske Skog and this court case, so this is my personal interpretation of some newspaper articles. I believe the court case is still pending. Norske Skog’s administration and BoD may have private information that makes their strategy perfectly rational. Besides, Norske Skog’s set of feasible solutions when they did this deal with GSO and Cyrus Capital may, according to a number of newspaper articles, have been very restricted.
**) Once you start to venture beyond DCF, ‘calculating correctly’ is a non-trivial affair. As indicated by any standard textbook in finance, one generally has to introduce draconian simplifying assumptions. In practice, there are two schools: calculating exactly right using binomial lattices (or similar), accepting such simplifications and going rather black-box; or calculating approximately right, modelling the full complexity but using a somewhat arbitrary discount rate.
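To illustrate the first school, here is a sketch of a hypothetical option to defer an irreversible investment, valued on a Cox-Ross-Rubinstein binomial lattice, assuming the project value follows geometric Brownian motion (one of the draconian simplifications mentioned above); mathematically this is an American call on the project value:

```python
from math import exp, sqrt

def defer_option_value(V0, I, r, sigma, T, steps):
    """Value the option to invest amount I, any time within T years,
    in a project currently worth V0, on a CRR binomial lattice.
    r is the risk-free rate, sigma the volatility of project value."""
    dt = T / steps
    u = exp(sigma * sqrt(dt))              # up-move factor
    d = 1 / u                              # down-move factor
    p = (exp(r * dt) - d) / (u - d)        # risk-neutral up-probability
    disc = exp(-r * dt)
    # option payoffs at the final layer of the lattice
    option = [max(V0 * u ** j * d ** (steps - j) - I, 0.0)
              for j in range(steps + 1)]
    # roll back, checking early exercise (invest now) at every node
    for n in range(steps - 1, -1, -1):
        option = [
            max(disc * (p * option[j + 1] + (1 - p) * option[j]),
                V0 * u ** j * d ** (n - j) - I)
            for j in range(n + 1)
        ]
    return option[0]

# Hypothetical numbers: project worth 100, costing 100 to build,
# 20% volatility, 5% risk-free rate, a 2-year window to decide.
value = defer_option_value(100, 100, r=0.05, sigma=0.2, T=2, steps=200)
```

A plain DCF of the same project would call it worthless (value equals cost), while the lattice assigns the waiting option a clearly positive value; the price is the black-box feel and the GBM assumption, exactly the trade-off footnote ** describes.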