Ambiguous Investment under Choquet-Brownian Motion with Neo-Additive Capacities

by

Nicklas Snekloth Kongsgaard

Master’s Thesis

Presented to the Faculty of the Department of Economics in Partial Fulfillment of the Requirements for the Degree of

Master of Science in Advanced Economics and Finance (Candidatus Oeconomices)

Copenhagen Business School

Supervisors: Jimmy Martínez-Correa

September 2013

No. of pages (characters): 80 (161,972)

Abstract

A new branch of the real options literature examines the optimal stopping problem of an investment opportunity embedded in an ambiguous environment as opposed to traditional aleatory risk. Only a few articles have been published within this emerging field. This thesis critically reviews the existing modeling efforts specified by MEU, α-MEU, and CEU preference relations, and finds that the current specifications collectively fail to achieve separability of ambiguity and ambiguity preferences in a manner that is coherent with economic intuition. We show that convex combinations in the multiple priors approach aimed at integrating ambiguity attitudes yield peculiar non-monotonic preferences toward ambiguity and cannot be reconciled with dynamic consistency.

We propose a specification that mends the shortcomings of the current literature by building on the recently developed dynamically consistent Choquet-Brownian motions. We offer an original contribution by introducing neo-additive capacities in order to obtain a proper separation of tastes and beliefs, and apply the developed specification to the optimal stopping problem of a firm in possession of a real option to invest. This shows that the injection of ambiguity into the economy yields drastically different effects on both project value and option value than pure risk does. Whereas risk increases option value through the continuation region, for an ambiguity-averse firm ambiguity lowers option value in both the stopping region and the continuation region, which likewise delays investment, but for critically different reasons. For an ambiguity-loving firm, ambiguity increases option value in both regions, but does so more in the stopping region.

In consequence, investment tends to be delayed by pessimistic firms and pulled forward by optimistic firms under ambiguity.

Hence, the proposed model resonates in its findings with the main results of the current literature under α-MEU and CEU specifications, while the introduction of neo-additive capacities contributes the possibility of a refined analysis of the role of ambiguity preferences in guiding the direction of the impact of ambiguity on real option valuation and investment timing. As a result, we provide a well-specified explanation of non-identical investment behavior in identical environments due to heterogeneous tastes and beliefs.


Contents

1 Introduction
  1.1 Research Question
  1.2 Literature Review

2 Decision-Theoretic Approaches to Ambiguity
  2.1 Preliminaries
  2.2 The von Neumann-Morgenstern and Savage Representations
  2.3 Multiple Priors
    2.3.1 Maxmin Expected Utility
    2.3.2 α-Maxmin Expected Utility
  2.4 Smooth Ambiguity
  2.5 Capacities and the Choquet Integral
    2.5.1 Neo-Additive Capacities

3 Brownian Motions under Ambiguity
  3.1 Preliminaries
  3.2 From Random Walks on Z to Brownian Motions on R
  3.3 Changing Probability Measures
    3.3.1 Rectangularity and κ-Ignorance
    3.3.2 Dynamic Inconsistency of α-MEU
  3.4 Smooth Ambiguity in Continuous Time
  3.5 Choquet-Brownian Motion
    3.5.1 Universe
    3.5.2 Characterization
    3.5.3 Continuous-Time Limit

4 Investment under Ambiguity
  4.1 Existing Project Valuation Schedules
    4.1.1 The Canonical Model
    4.1.2 Multiple Priors
    4.1.3 Choquet-Brownian Motion
  4.2 Proposed Project Valuation Schedule
    4.2.1 Ambiguity
    4.2.2 Optimism and Pessimism
    4.2.3 Risk
  4.3 Investment Timing
    4.3.1 Risk in Traditional Non-Ambiguous Environments
    4.3.2 Real Option Value in Ambiguous Environments
    4.3.3 The Value of Waiting

5 Conclusion

References

1 Introduction

"Uncertainty must be taken in a sense radically distinct from the familiar notion of Risk, from which it has never been properly separated.... The essential fact is that 'risk' means in some cases a quantity susceptible of measurement, while at other times it is something distinctly not of this character; and there are far-reaching and crucial differences in the bearings of the phenomena depending on which of the two is really present and operating.... It will appear that a measurable uncertainty, or 'risk' proper, as we shall use the term, is so far different from an unmeasurable one that it is not in effect an uncertainty at all."

– Knight, F. H., Risk, Uncertainty, and Profit (1927, p. 16)

Earlier recessions have conformed with the Friedman (1988) "Plucking Model" of economic fluctuations, in which mild recessions are followed by slow recoveries and deep recessions by brisk recoveries. However, the recent crisis of 2008 is rather unusual in the sense that while the output gap has widened deeply, recovery has been nothing near brisk. Much academic research has discussed the roots of the crisis, and while the causes and suggested remedies are numerous and technically inaccessible to the general public, what resonates is the realization that the economy and the financial system are of such daunting complexity that the most basic question in macroeconomics, "Should we stimulate the economy or pursue austerity?", is indeed still both a relevant and heated topic in the EU today.

Bloom et al. (2013) find that uncertainty about the economy has been unprecedentedly high since the outbreak of the crisis. If academics, economists, financiers, and politicians are not yet complete in their understanding of our economy, it may be an extraordinary assumption that firms are perfectly able to navigate the economy by optimizing from an understanding of all possible scenarios of how the future may play out.

[Figure 1.1: The Bloom-Baker-Davis Economic Policy Uncertainty indices for the US and Europe, 1985–2010. Data from Bloom et al. (2013).]

Today, most economic literature is implicitly built on the assumption that the firm is in possession of some complete model, which informs it about every single possible future state of the world and its corresponding probability of being realized. Thus, firms are facing uncertainty from risk. Now, this thesis asks the question: what happens to the investment decision of the firm when, instead of being able to develop accurate inferences about the future, in the words of Keynes (1937, p. 214), "We simply don't know"?

Up until recently, the economic literature has primarily dealt with uncertainty in its aleatory sense, risk. However, Knight (1921) clarified that one should pursue a sharp and thoughtful distinction between quantifiable and unquantifiable uncertainty. Formally, risk describes a situation in which a decision maker faces a quantifiable probabilistic model that is entirely known and subject to uncertainty because of the stochasticity of one or more elements, but about which the decision maker is at all times perfectly confident that no model misspecification exists. Contrariwise, ambiguity refers to uncertainty in its Knightian sense, so that the decision maker lacks confidence in the specification of her model, as she is not complete in her understanding of the economy.

We examine the investment decision of the firm under ambiguity by following the classical McDonald and Siegel (1986) optimal stopping problem of a firm in possession of an option to invest, and subject it to the presence of ambiguity. Such an approach was recently pioneered by Nishimura and Ozaki (2007), who find that ambiguity—or Knightian uncertainty—carries a diametrically different impact than that of risk. However, their results assume complete aversion towards ambiguity. As pointed out by Eichberger and Kelsey (2009), a central, yet still not completely resolved, problem in modeling ambiguity concerns the separation of ambiguity and ambiguity attitudes. In order to deliver a model that offers explanatory power over the entire business cycle, optimism is as important to model as pessimism.

Indeed, the first article in behavioral economics was DeBondt and Thaler's (1985) examination of overreaction in the stock market. Separability of tastes and beliefs is important in economic modeling because while attitudes towards ambiguity can be stable personal characteristics or induced by herding behavior, experienced ambiguity may vary with the information about the environment. An overlooked case is ambiguity neutrality, which has not yet been properly specified in real option models under ambiguity. Ambiguity neutrality is an imperative model feature because it allows a decision maker to be neutral towards ambiguity even if she perceives, and is informed about, ambiguity in her environment.

The current specifications that attempt to deliver a separation of ambiguity and ambiguity attitudes (Schröder, 2011; Roubaud et al., 2010) fail to allow for informed ambiguity neutrality, and consequently attitudes such as optimism and pessimism are ill-defined. To our knowledge, this thesis provides the first specification that achieves dynamically consistent separability of tastes and beliefs nesting ambiguity neutrality, even if ambiguity is perceived.

The thesis will proceed as follows. In the rest of this chapter we pose our research questions and provide a literature review on real options under ambiguity. Research in this area is in its infancy and owes its progress to the advancements of the literature on decision theory under uncertainty. For this reason, Chapter 2 is devoted to a survey of the representations most employed in economic research that allow for preferences over ambiguity.

Chapter 3 integrates the preference relations examined in Chapter 2 with the analysis of continuous-time stochastic processes in order to examine their moments under ambiguity.[1] In Chapter 4, we synthesize and construct the valuation schedules predicted by each ambiguity preference relation under examination. The valuation schedules are applied to the all-or-nothing investment problem and subjected to critique. A proposed specification that mends the shortcomings of current efforts will be put forward and subjected to the all-or-nothing problem in order to obtain comparison with the current specifications. Thereafter, the proposed specification will be applied to the optimal stopping problem under flexible investment timing so as to inspect the insights it yields into the investment decision of the firm under ambiguity. Chapter 5 concludes the thesis.

[1] Since this thesis bridges decision theory and the analysis of continuous-time stochastic processes, notation accommodates conventions from both bodies of literature to preserve continuity. Accordingly, the reader should expect a few amendments to the notation employed in cited papers.

1.1 Research Question

The impact of uncertainty is multifaceted and by nature laborious to represent in a modeling framework.[2] Therefore, while uncertainty impacts the economy on several levels, attention is here restricted to the issue of firm investment under ambiguity. In order to preserve tractability, we do not intend to propose a new universal framework, but restrict the scope to an investigation and critique of the predominant classes of representations used in the real options literature, followed by a new specification proposed by this thesis. The analysis will narrow in on the two most central problems in real options, namely (i) how to value a payoff flow, and (ii) how to determine a strategy for the optimal stopping problems of when to invest and when to delay investment. The purpose of this thesis is to investigate how the investment decision is impacted by the introduction of ambiguity—as opposed to risk—and how such impact is influenced by the ambiguity attitude of the firm under consideration. As is customary in the traditional real options literature initiated by McDonald and Siegel (1986), firms in the recent real options literature on ambiguity are also assumed to be risk-neutral (Nishimura and Ozaki, 2007; Schröder, 2011; Roubaud et al., 2010). In consequence, firms are traditionally assumed to maximize the expected value of an investment. This thesis follows the new stream of pioneering literature which argues that what matters to firms in their evaluation of uncertain payoffs is not risk per se, but risk to which a layer of ambiguity is added.[3]

[2] As Fox (2009) points out, complexity was indeed the hidden driver behind the assumption of perfect foresight in early economics.

In essence, the main objective of this thesis is to provide answers to the following questions:

• How does Knightian uncertainty, as opposed to risk, impact the valuation of a payoff flow?

• How does Knightian uncertainty, as opposed to risk, impact the valuation of an investment opportunity, i.e. a real option?

• How do attitudes toward ambiguity interact with the degree of ambiguity in the timing of, and the decision to, invest?

Brownian motions are at the core of modern-day economics and finance, and will also provide the foundation of analysis in this thesis when embedding the investment decision in an ambiguous environment. Hence, the research question is addressed by investigating current specification efforts of a valuation scheme for a continuous-time stochastic payoff process, subjecting them to critique, and developing a proposal for a new specification.

Then, the proposed valuation schedule will provide the foundation for evaluating real options to invest and thus the dynamics of the investment decision of the firm. In doing so, this thesis pursues a Dixit and Pindyck (1994)-like valuation schedule for infinite payoff flows (π_t)_{t≥0},

    V(π, t | ·) = π_t / (ρ − µK(·))    (1.1)

with discount factor ρ and trend µ, but to which a "K-multiplier" is introduced to embody the premiums and discounts assigned to valuations in ambiguous environments. This approach will allow for easy comparability with existing models, and makes it readily applicable to the optimal stopping problem.
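Read numerically, a schedule of the form (1.1) prices the perpetuity with the trend scaled by the K-multiplier, so (for µ > 0) K < 1 acts as an ambiguity discount and K > 1 as a premium. The following is only an illustrative sketch; all parameter values are our own assumptions, not taken from the thesis:

```python
# Sketch of a K-multiplier valuation schedule V = pi_t / (rho - mu * K),
# in the spirit of Eq. (1.1); parameter values are purely illustrative.

def value(pi_t, rho, mu, K=1.0):
    """Perpetuity value of the payoff flow; K embodies the ambiguity
    premium (K > 1) or discount (K < 1) applied to the trend mu."""
    assert rho > mu * K, "convergence requires rho > mu * K"
    return pi_t / (rho - mu * K)

v_risk = value(1.0, 0.08, 0.04)           # pure-risk benchmark, K = 1
v_averse = value(1.0, 0.08, 0.04, K=0.5)  # ambiguity-averse discount
v_loving = value(1.0, 0.08, 0.04, K=1.5)  # ambiguity-loving premium

print(v_averse < v_risk < v_loving)  # True
```

The monotone effect of K on value is what makes the multiplier a convenient summary statistic for comparing specifications.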

Eichberger and Kelsey's (2009) observation that separability of tastes and beliefs towards ambiguity is not yet resolved also applies to real options under ambiguity. At present, the seminal work of Nishimura and Ozaki (2007), Schröder (2011), and Roubaud et al. (2010) allows for analysis of the impact of ambiguity on investment value, but so far, to the knowledge of the author, no paper has yet succeeded in separating beliefs and tastes for ambiguity—that is, ambiguity and ambiguity attitude—in a methodology that is well-specified for outcomes that are non-linear in the degree of ambiguity in the reference parameter. Thus, current specifications fail to properly analyze valuations that are by construction strictly convex in growth prospects.

[3] This is also the approach in the insurance literature.

In order to provide guidance in the derivation of a satisfactory specification, the thesis puts forward the following three parsimonious a priori desiderata to be satisfied by an adequate specification:

(C1) Separability: Beliefs (ambiguity) and tastes (ambiguity atti- tude) are separated.

(C2) Ambiguity Neutrality: Ambiguity neutrality is nested by the specification separating ambiguity attitudes, so that an ambiguity-neutral decision maker is indifferent to the spread of evaluations caused by the multiplicity of alternative valuation schedules under ambiguity, and hence cares only about the expected prior, even though she is informed of the existence of this multiplicity.

(C3) Attitude Monotonicity: An ambiguity-averse (-loving) decision maker's preferences are monotonic in the degree of ambiguity.
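For intuition on how neo-additive capacities can meet (C1) and (C2), the Choquet expected value under a neo-additive capacity admits a convenient closed form: a convex combination of the expected prior with the best and worst outcomes. The sketch below is our own stylization (parameter names and normalization are ours; see Chateauneuf et al. (2007) for the exact construction), with δ indexing perceived ambiguity (beliefs) and c indexing optimism (tastes):

```python
# Stylized Choquet expectation under a neo-additive capacity:
#   V = (1 - delta) * E_P[u] + delta * (c * max u + (1 - c) * min u),
# where delta in [0, 1] indexes perceived ambiguity (beliefs) and
# c in [0, 1] indexes optimism (tastes). Naming is ours, for illustration.

def neo_additive_value(outcomes, probs, delta, c):
    """Blend the expected prior with the best/worst outcomes."""
    expected = sum(p * x for p, x in zip(probs, outcomes))
    extreme = c * max(outcomes) + (1 - c) * min(outcomes)
    return (1 - delta) * expected + delta * extreme

outcomes, probs = [0.0, 50.0, 100.0], [0.2, 0.5, 0.3]
baseline = neo_additive_value(outcomes, probs, delta=0.0, c=0.9)  # = E_P
pessimist = neo_additive_value(outcomes, probs, delta=0.4, c=0.1)
optimist = neo_additive_value(outcomes, probs, delta=0.4, c=0.9)

print(pessimist < baseline < optimist)  # True
```

Note how δ = 0 collapses the valuation to the expected prior regardless of c, which is the separation of beliefs from tastes that (C1) asks for.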

It is emphasized that, in order to build on the existing literature, focus will be restricted to theoretical analysis combined with numerical simulations. While identification will be discussed, the thesis leaves empirical tests of the proposed model to future research.

1.2 Literature Review

Only a few articles have examined the impact of ambiguity on real options, and among the pioneers, multiple priors methods have dominated the modeling approach (Nishimura and Ozaki, 2007; Choi et al., 2009; Trojanowska and Kort, 2010; Miao and Wang, 2011; Schröder, 2011). Nishimura and Ozaki (2007) were the first to introduce ambiguity to the real options literature by following the Chen and Epstein (2002) intertemporal multiple priors approach to continuous-time stochastic processes in asset pricing, building upon the decision-theoretic work of the Gilboa and Schmeidler (1989) Maxmin Expected Utility (MEU) model. A decision maker who exhibits MEU preferences understands ambiguity as a lack of confidence in her model specification for some reference variable, and hence conceptualizes a non-singleton set of probability measures whose elements all give rise to different priors describing the world. As a result of the Chen and Epstein (2002) representation, the recent real options literature incorporating ambiguity specifies a range of possible stochastic processes differing only by the possible profit flow drifts of an investment through Girsanov's Theorem. Therefore, the larger the set of possible priors, the larger the degree of ambiguity, characterized by a larger set of possible drifts. The drifts are restricted by Chen and Epstein's (2002) κ-ignorance, which spans the set of density generators within the range [−κ, κ]. Faced with a range of possible priors, MEU stipulates that decision makers act as if they order according to the least favorable prior. In effect, MEU results in complete aversion to ambiguity and therefore always orders according to the least favorable drift of a given stochastic process.
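For a perpetuity value that is increasing in the drift, the MEU ordering under κ-ignorance therefore reduces to evaluation at the least favorable drift. A minimal sketch (the parameter values `pi_flow`, `rho`, `mu0`, and `kappa` are hypothetical choices of ours):

```python
# Illustrative MEU valuation under kappa-ignorance: the decision maker
# entertains every drift in [mu0 - kappa, mu0 + kappa] and orders by the
# least favorable prior. Parameter values are hypothetical.
pi_flow, rho, mu0, kappa = 1.0, 0.08, 0.02, 0.01

def perpetuity(mu):
    """Perpetuity value of the payoff flow under drift mu (requires rho > mu)."""
    return pi_flow / (rho - mu)

drifts = [mu0 - kappa + i * (2 * kappa) / 200 for i in range(201)]
meu_value = min(perpetuity(mu) for mu in drifts)   # attained at mu0 - kappa
confident_value = perpetuity(mu0)                  # single-prior benchmark

print(meu_value < confident_value)  # True: full ambiguity aversion discounts value
```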

The seminal work of Nishimura and Ozaki (2007) brought much new insight into the analysis of ambiguity in the real options literature. They find that the impact of ambiguity on the value of an irreversible investment opportunity is drastically different from that of risk. While risk enhances real option value through the continuation region and leaves project value in the stopping region unchanged, ambiguity decreases option value in both the stopping region and the continuation region, but does so more in the former than the latter, which also increases the value of waiting, but for critically different reasons than under risk. Motivated by the desire to examine the effect of ambiguity under a wider range of tastes, Schröder (2011) points out that Nishimura and Ozaki's (2007) use of the Gilboa and Schmeidler (1989) maxmin expected utility preference representation results in an extreme aversion towards ambiguity and accordingly fails to analyze real options under ambiguity subject to more optimistic attitudes. Consequently, Schröder (2011) examines real option valuation under Marinacci (2002) and Ghirardato et al. (2004) α-maxmin expected utility, which introduces a Hurwicz (1951) criterion as an index of optimism. However, while the results of optimism give rise to an option value increase for increased ambiguity in both the stopping and the continuation regions, causing a tendency to exercise the real option, Schröder's (2011) specification is dynamically inconsistent for non-extreme α ∈ (0,1). In addition, this thesis finds that the model yields peculiarities in its analysis of ambiguity, so that the attitude towards ambiguity is non-monotonic for a fixed α, meaning that if a decision maker has even an infinitesimally small fraction of optimism, she is eager to invest entirely because of the presence of ambiguity. Therefore, Ghirardato et al.'s (2004) α-MEU preferences as a class satisfy neither informed ambiguity neutrality nor attitude monotonicity for strictly convex outcomes such as the valuation schemes of infinite payoff streams commonly applied in economics and finance. Since the Hurwicz criterion measures the relative weight on one of the extreme attitudes, there is no guarantee that ambiguity neutrality is reached at the midpoint (or any other invariant value), as the least favorable and the most favorable priors do not necessarily lead to symmetric outcomes around the expected value from the confident probability measure. While ambiguity neutrality can be achieved by letting the decision maker have full confidence in the objective probability measure by reducing the set of priors to a singleton, such an ad hoc procedure would confound the separation between ambiguity and attitude, and accordingly defeat the very purpose of Schröder's (2011) introduction of Ghirardato et al.'s (2004) α-MEU representation in the first place.
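The non-monotonicity critique can be made concrete with a toy calculation: because the perpetuity V(µ) = π/(ρ − µ) is strictly convex in µ, the most favorable prior dominates the α-mixture as κ approaches ρ − µ̄, so any optimism weight (1 − α) > 0 eventually makes value increase with ambiguity. All parameter values below are hypothetical:

```python
# alpha-MEU value of a perpetuity under kappa-ignorance:
#   V_alpha(kappa) = alpha * min_mu V(mu) + (1 - alpha) * max_mu V(mu).
# V(mu) = pi / (rho - mu) is strictly convex in mu, so the best prior
# explodes as kappa -> rho - mu0 and dominates any positive optimism weight.
pi_flow, rho, mu0 = 1.0, 0.08, 0.02

def alpha_meu(alpha, kappa):
    worst = pi_flow / (rho - (mu0 - kappa))
    best = pi_flow / (rho - (mu0 + kappa))   # requires kappa < rho - mu0
    return alpha * worst + (1 - alpha) * best

kappas = [0.0, 0.02, 0.04, 0.0599]
pure_meu = [alpha_meu(1.0, k) for k in kappas]    # monotonically decreasing
almost_meu = [alpha_meu(0.99, k) for k in kappas] # falls, then explodes upward

print(all(a > b for a, b in zip(pure_meu, pure_meu[1:])))  # True
print(almost_meu[-1] > almost_meu[0])                      # True
```

The second series first falls and then rises far above its κ = 0 value, which is exactly the non-monotonic attitude toward ambiguity criticized in the text.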

Motivated by the inherent limitations of the multiple priors approach, Klibanoff et al. (2005) (KMM) develop smooth ambiguity, which introduces second-order preferences over expected utilities. While this representation has gained much popularity, as it captures both risk aversion and the separation of ambiguity and ambiguity preferences, Skiadas (2013) shows that KMM fails to capture ambiguity preferences in continuous time.

Another approach is pioneered by Roubaud et al. (2010), who adopt Schmeidler's (1989) Choquet Expected Utility preferences as applied in continuous time by Kast and Lapied (2010). The findings are similar to those of Schröder (2011) in the extreme cases, but instead of assuming a given range of priors, the developed Choquet-Brownian motion, summarized by a random walk under a non-additive measure known as a capacity, characterizes the decision maker's subjective representation of preferences. This leads to a deformation of the objective Brownian motion so that both the drift and the volatility are modified, in contrast to Chen and Epstein (2002), in which ambiguity entirely concerns the drift. Roubaud et al. (2010) arrive at a model which succeeds in analyzing the impact of differing ambiguity attitudes on project value. The characterization of tastes is achieved by following Schmeidler's (1989) definition of ambiguity aversion in relation to the convexity of capacities. In Kast and Lapied (2010), a constant capacity c ∈ [0,1]—referred to as c-ignorance—expresses the direction and intensity of the psychological bias nesting the probabilistic case c = 1/2. Therefore, the delineation of optimism and pessimism is characterized by the decision maker's subjective preference representations, which are monotonic in attitudes, a feature we found lacking in Schröder (2011). However, exactly because the capacity curvature governing ambiguity relations is determined by subjective preferences, the Roubaud et al. (2010) c-ignorance approach fails to achieve informed ambiguity neutrality and with that a well-specified separation of ambiguity and ambiguity attitude.
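The deformation of both drift and volatility can be illustrated with the per-step moments of a binary walk whose up-move receives weight c and whose down-move receives weight 1 − c: the mean is 2c − 1 and the variance is 4c(1 − c), so any c ≠ 1/2 shifts both moments away from the symmetric case. A simulation sketch (the value of c and the sample size are our own choices):

```python
# Per-step moments of a binary walk whose up-move carries weight c:
# mean m = 2c - 1 and variance s^2 = 4c(1 - c), so c != 1/2 deforms
# both the drift and the volatility of the limiting Brownian motion.
import random

random.seed(0)
c = 0.4   # c < 1/2: a pessimistic deformation
steps = [1 if random.random() < c else -1 for _ in range(200_000)]

mean = sum(steps) / len(steps)
var = sum((s - mean) ** 2 for s in steps) / len(steps)

print(round(mean, 2), round(var, 2))  # close to 2c-1 = -0.2 and 4c(1-c) = 0.96
```

With c = 1/2 the same simulation recovers the standard zero-drift, unit-variance step of an ordinary symmetric random walk.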

2 Decision-Theoretic Approaches to Ambiguity

"No human inquiry can be a science unless it pursues its path through mathematical exposition and demonstration."

– Leonardo da Vinci

Capital flows because it is set in motion by the people who possess it, and so, ultimately, economics is the study of how people act on information and how such acts lead to consumption, investment, and production. By investigating the valuation of an investment opportunity and how a decision maker will act in her strategy in an ambiguous environment, this thesis necessarily takes the position that there exists some degree of predictability of acts. Predictability must find its root in a decision rule prescribing the behavior of a decision maker when embedded in an economic setting subject to choices, and therefore the specification of the preference relation that guides the behavior to be predicted is core to the evaluation of acts in obtaining a rigorous analysis. However, such an axiomatic approach to behavior may seem at odds with free will, and as Gilboa (2009, p. 5) notes, on the presumption that we as human beings possess free will, it may be worthwhile to ask, "Can free choice be predicted?"

Economics as a field relies on the prerequisite that some sort of predictability is indeed possible. There exists an ongoing academic debate on the role of mathematics in economics. On one side is the modernist point of view—named so by McCloskey (1983)—which relates economics to an objective reality that mathematics aids in deciphering, and on the other is social constructivism, which by tradition does not have a mathematical representation of "reality", because reality as such only exists as a rhetorical device by construction. Closely related is the discussion of whether human beings can be rationalized by some axiomatic preference relation. This thesis does not take a stand in this discussion and refrains from proposing any normative theory. The proposed model is meant only as a model and employs decision theory entirely as a means of guiding the pursuit of a descriptive theory that more closely resembles the behavior one might derive intuitively than that predicted by current specification efforts. For this purpose, this chapter aims to provide the reader with the necessary decision-theoretic apparatus to critically review the developments in the theory of real options under ambiguity.

Let us clarify that while decision theory generally revolves around individuals, the real options literature in consideration applies to the decision to invest by firms. Therefore, risk aversion is traditionally ignored, seeing that firms are assumed to maximize the expected value of an investment opportunity. However, the notion of expected value presumes that the decision maker has some model specification in mind with which she constructs her expectation. While the firm under consideration may or may not have an actual model to forecast the consequences of its investments, the traditional real options literature developed on the seminal work of McDonald and Siegel (1986) implicitly assumes that the firm possesses—or at least has in mind—a well-specified model that, to the decision maker in the firm, accurately describes the full range of outcomes and corresponding probabilities that follow from an act. The expected value of an investment is "expected" precisely because the decision maker is able to assign probabilities to all possible outcomes. Therefore, for a risk-neutral firm, the spread of possible outcomes of an all-or-nothing investment does not matter as long as it is mean-preserving in expectation.

In contrast, ambiguity describes the setting in which the firm is not sure it possesses the right model. Consequently, the firm worries about model misspecification, as it is not sure about how future uncertainty will play out, and in consequence considers a multiplicity of probability measures as also possible. The firm cannot reduce the decision problem to one described by a single probability measure and is thus faced with ambiguity. In order to see how the presence of ambiguity gives rise to preference relations that cannot be described by risk in its aleatory sense, and are thus irreducible to a single probability measure, consider the often-cited Ellsberg (1961) paradox. Observed actions within the decision matrix below reveal how decision makers exhibit non-standard preferences in situations with unknown probabilities.

Consider an urn containing 150 balls. With certainty the urn contains 50 red balls, but the remaining 100 balls can be anything from 100 blue balls and 0 yellow balls to 0 blue balls and 100 yellow balls. The partition is unknown. A decision maker faces bets on a draw from the urn with state space Ω = {r, b, y}. The decision maker is certain about the probabilities P({r}) = 1/3 and P({b, y}) = 2/3, but is unsure about the probability density partition on {b, y}.

              50 balls    100 balls
              Red         Blue      Yellow
I    f_r      $100        $0        $0
     f_b      $0          $100      $0
II   f_ry     $100        $0        $100
     f_by     $0          $100      $100

Ellsberg (1961) theorized[4] that people will exhibit the strict modal preferences I: f_r ≻ f_b and II: f_ry ≺ f_by. Such preferences seem likely, but are inconsistent with the assumption that beliefs can be numerically represented by a probability measure, and consequently come into conflict with probabilistic sophistication (Machina and Schmeidler, 1992). Specifically, note how I: f_r ≻ f_b ⟹ P({r}) > P({b}), but II: f_ry ≺ f_by ⟹ P({r, y}) < P({b, y}) ⟹ P({r}) < P({b}), contradicting the former. Therefore, such preferences cannot be consistent with maximizing expected utility relative to any additive probability distribution (be it objective or subjective).
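The contradiction can also be verified by brute force over every admissible composition of the urn; the helper below is our own illustration:

```python
# Exhaustive check that no additive prior rationalizes the modal Ellsberg
# preferences f_r > f_b together with f_ry < f_by. The urn holds 50 red
# balls plus 100 balls split in an unknown way between blue and yellow.

def expected_payoffs(n_blue):
    """Expected value of the four bets given n_blue blue balls (of 100)."""
    p_r = 50 / 150
    p_b = n_blue / 150
    p_y = (100 - n_blue) / 150
    return {
        "f_r": 100 * p_r,
        "f_b": 100 * p_b,
        "f_ry": 100 * (p_r + p_y),
        "f_by": 100 * (p_b + p_y),
    }

# Collect every urn composition that would rationalize both modal choices.
rationalizing = [
    n for n in range(101)
    if (ev := expected_payoffs(n))["f_r"] > ev["f_b"]
    and ev["f_ry"] < ev["f_by"]
]
print(rationalizing)  # -> []: no additive prior supports both preferences
```

Choice I requires fewer than 50 blue balls while choice II requires more than 50, so the search necessarily comes back empty.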

[4] Ellsberg did not conduct any experiments. See Camerer and Weber (1992) for a survey of empirical evidence.

This chapter outlines the preference representations most employed in the economic literature on ambiguous choices. First, the chapter sketches the insights of von Neumann and Morgenstern's (1944) Expected Utility, which provided the workhorse of modern-day economics, and touches on Savage's (1954) Subjective Expected Utility representation, which inverted the representation to derive a subjective probability distribution from the preference over acts. Next, the chapter considers the three most popular representations: (i) multiple priors, (ii) smooth ambiguity, and (iii) Choquet expected utility. While the Gilboa and Schmeidler (1989) and Ghirardato et al. (2004) multiple priors preference relations are the most intuitive and hence the most applied representations, we will show that they suffer from inherent limitations that make them ill-suited for application to the theory of real options, though pursued by Nishimura and Ozaki (2007) and Schröder (2011). Klibanoff et al.'s (2005; 2009) smooth ambiguity[5] is an elegant representation attempting to address some of the limitations of the multiple priors representation. Unfortunately, the next chapter will reveal that smooth ambiguity cannot be applied to a continuous-time stochastic environment (Skiadas, 2013). Therefore, the thesis turns to Schmeidler's (1989) Choquet Expected Utility, which may at first not seem intuitive in construction, but will do so once applied to a continuous-time process as in Kast and Lapied (2010). Finally, an outline is given of Chateauneuf et al.'s (2007) neo-additive capacities.

2.1 Preliminaries

For any state space Ω with events A ⊂ Ω, where formally F = 2^Ω is the σ-algebra of its subsets, we define a probability measure P : F → [0,1] that is normalized, P(Ω) = 1, and additive, so that P(A ∪ B) = P(A) + P(B) ∀A, B ∈ F such that A ∩ B = ∅. In the following, we let the unadorned E[·] refer to the expectation operator with respect to the objective additive probability measure. Let P = ∆(Ω, F) denote the set of all normalized and additive probability measures on the state space (Ω, F).

Assume X describes a set of prizes (possible outcomes or consequences) for which ∆(X) is the collection of all lotteries (probability distributions) on X. Then, the decision maker faces acts A = {f : Ω → X}, mapping the states of nature into consequences X. A splicing function is a function which takes one act on some sub-domain and another on its complement, so that ∀f, g ∈ A and event A ⊂ Ω, we define the act

    f_A g(ω) := g(ω) if ω ∈ A;  f(ω) if ω ∈ A^c.

[5] Commonly referred to as KMM (Klibanoff et al., 2005).

(21)

2.2 The von Neumann-Morgenstern and Savage Representations

von Neumann-Morgenstern Expected Utility

von Neumann and Morgenstern (1944) were the first to complete an axiomatic development of expected utility, which we briefly consider here. Consider a set of prizes X around which we define a set of lotteries with finite support,

L := { P : X → [0,1] | card{x : P(x) > 0} < ∞ ∧ Σ_{x∈X} P(x) = 1 }   (2.1)

Now, we will perform mixing operations6 on the elements of L so as to establish the von Neumann-Morgenstern axioms which, if satisfied, yield the binary preference relation % ⊆ L × L so that ∀ P, Q, V ∈ L

(vNM1) Weak order: The preference relation % satisfies completeness, i.e. either P ≻ Q, or P ≺ Q, or P ∼ Q, and transitivity, i.e. (P % Q ∧ Q % V) ⟹ P % V.

(vNM2) Continuity: P % Q % V ⟹ ∃ α ∈ [0,1] : Q ∼ αP + (1−α)V.

(vNM3) Independence: ∀ α ∈ (0,1), P % Q ⟺ αP + (1−α)V % αQ + (1−α)V.

Theorem 2.1 (von Neumann and Morgenstern (1944) Expected Utility). ∃ u : X → R such that ∀ P, Q ∈ L

P % Q ⟺ ∫_X u(x) dP(x) ≥ ∫_X u(x) dQ(x)   (2.2)

iff the preference relation % ⊆ L × L satisfies axioms vNM1–vNM3.7 Moreover, in this case u is unique up to a positive linear transformation.

6 A mixing operation on L is defined as (αP + (1−α)Q)(x) = αP(x) + (1−α)Q(x) ∀ x ∈ X. In words, a compound lottery that gives you lottery P with probability α and lottery Q with probability (1−α) is in probability theory equivalent to α times the conditional probability of X = x under P plus (1−α) times the conditional probability of X = x given Q. Hence, mixing operations relate to the decision maker's perception of conditional probabilities.

7 Integral notation is used in order to emphasize symmetry with similar expressions in other representation theorems. In effect the integral is a sum, given the assumption that L ⊂ P is the set of probability measures with finite support.
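As a quick illustration of Theorem 2.1 and of the linearity in probabilities that the mixing operation of footnote 6 exploits, consider the following Python sketch; the prizes, probabilities and square-root utility are invented for illustration, not taken from the text:

```python
# A minimal sketch of expected utility over finite-support lotteries;
# all concrete prizes and utilities below are illustrative assumptions.

def expected_utility(lottery, u):
    """E[u] of a lottery given as {prize: probability}."""
    return sum(p * u(x) for x, p in lottery.items())

def mix(alpha, P, Q):
    """The mixing operation (alpha*P + (1 - alpha)*Q)(x)."""
    prizes = set(P) | set(Q)
    return {x: alpha * P.get(x, 0.0) + (1 - alpha) * Q.get(x, 0.0)
            for x in prizes}

u = lambda x: x ** 0.5        # a concave (risk-averse) Bernoulli utility
P = {100: 0.5, 0: 0.5}        # a fifty-fifty lottery
Q = {36: 1.0}                 # a sure thing

# expected utility is linear in the probabilities:
lhs = expected_utility(mix(0.3, P, Q), u)
rhs = 0.3 * expected_utility(P, u) + 0.7 * expected_utility(Q, u)
assert abs(lhs - rhs) < 1e-12
```

The linearity checked in the last lines is exactly what the independence axiom vNM3 buys: preferences over mixtures reduce to mixtures of expected utilities.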


The von Neumann and Morgenstern (1944) model has become the fundamental building block for all subsequent advancements in decision theory and the workhorse of modern economics. However, a stumbling block for the model is the assumption that the decision maker is presented with a range of different lotteries, i.e. objective probability distributions, to choose from.

Indeed, real-world decision makers in business may quite rarely enjoy the experience of being presented with the opportunity to choose between strategies with objectively known probabilities. Essentially, in the terminology of Knight (1921), the model only allows for the consideration of situations under well-defined aleatory risk.

Savage’s Subjective Expected Utility

If we instead turn attention to Knightian uncertainty, that is, situations in which it is impossible to delineate uncertainty probabilistically, then even though we as decision makers are faced with ambiguity, in the words of Keynes (1937, p. 214) we still "behave exactly as we should if we had behind us a good Benthamite calculation of a series of prospective advantages and disadvantages, each multiplied by its appropriate probability, waiting to be summed." This line of thought is at the core of the Savage (1954) representation theorem, which does not assume the existence of an a priori probability distribution, but instead derives a subjective probability measure from observed preferences over acts. Hence, Savage's theorem validates expected utility analysis even in situations in which no objective measure is given and where "uncertainty" remains only vaguely defined, because a decision maker acts as if she maximizes the expectation of utility relative to her subjective probability measure. In effect, the beauty of Savage (1954) is that if its axioms are satisfied, the representation yields the existence of both a utility function and a probability measure (though a subjective one) without assuming these as primitives of the model.

Theorem 2.2 (Subjective Expected Utility, Savage (1954)). A preference relation % satisfies P1–P7 iff there exists a finitely additive probability measure P and a non-constant bounded function u : X → R such that ∀ f, g ∈ A

f % g ⟺ ∫_Ω u ∘ f(ω) dP(ω) ≥ ∫_Ω u ∘ g(ω) dP(ω)   (2.3)

Moreover, in this case P is unique, and u is unique up to positive linear transformations.

(P1) Weak order: % satisfies completeness and transitivity.

(P2) Sure-Thing Principle: The preference between acts depends solely on the consequences in states in which the payoffs of the two acts being compared are distinct: ∀ A ⊆ Ω and ∀ f, g, h, h′ ∈ A, f_{A^c}h % g_{A^c}h ⟺ f_{A^c}h′ % g_{A^c}h′.

(P3) Eventwise Monotonicity: ∀ f ∈ A, every non-null event A ⊆ Ω and x, y ∈ X, x % y ⟺ f_A x % f_A y.

(P4) Weak Comparative Probability: ∀ A, B ⊆ Ω and x, y, z, w ∈ X with x ≻ y and z ≻ w, y_A x % y_B x ⟺ w_A z % w_B z.

(P5) Nondegeneracy: ∃ f, g ∈ A : f ≻ g.

(P6) Small Event Continuity: ∀ f, g, h ∈ A with f ≻ g, ∃ a partition {A_1, …, A_n} of Ω such that ∀ i ≤ n, f_{A_i}h ≻ g ∧ f ≻ g_{A_i}h.

(P7) Uniform Monotonicity: ∀ f, g ∈ A and A ⊆ Ω, if f %_A g(ω) for every ω ∈ A, then f %_A g, and if g(ω) %_A f for every ω ∈ A, then g %_A f.

Effectively, Savage's (1954) Subjective Expected Utility (SEU) model reduces all uncertainty about the state space Ω to a subjective probability measure P characterizing the decision maker's beliefs. Thus, while SEU does not assume objective probabilities, it is nonetheless still a model of risk in the sense that decision makers behave as if they are indeed assigning probabilities.

However, the Ellsberg (1961) paradox cannot be represented by maximizing expected utility with respect to any additive probability measure. The fundamental assumption in question is Bayesianism, namely that uncertainty is quantifiable in a probabilistic manner. Preferences for known as opposed to unknown probabilities, such as those exhibited in the Ellsberg paradox, are incompatible with this assumption, and in effect violate Savage axiom P2, commonly known as the Sure-Thing Principle. To see this formally, notice from the decision matrix that f_r and f_b are equal on the event Yellow for all ω ∈ Ω. Consider the acts in the decision matrix as splicing functions of acts on the partition of events {r, b} and {r, b}^c = {y}. Define acts so that f on {r, b} gives $100 for Red and $0 for Blue, and g on {r, b} the other way

(24)

18 Multiple Priors

around, while h on {r, b}^c = {y} always gives $0 and h′ always $100. Then, by P2,

f_{{r,b}^c}h % g_{{r,b}^c}h ⟺ f_{{r,b}^c}h′ % g_{{r,b}^c}h′,

which we saw is violated by the revealed preferences

f_{{r,b}^c}h ≻ g_{{r,b}^c}h and f_{{r,b}^c}h′ ≺ g_{{r,b}^c}h′.

As a result, the modal preferences in the Ellsberg (1961) paradox conflict with Machina and Schmeidler's (1992) definition of probabilistic sophistication, as the decision maker is not able to reduce her preference relation to a preference on acts over a single subjective prior.

Definition 2.1 (Probabilistic Sophistication, Machina and Schmeidler (1992)). The preference relation % is "probabilistically sophisticated" if ∃ a unique probability measure P such that the decision maker's choice between any acts f, g ∈ A depends solely on the distributions over the outcomes induced by these acts and the measure P.

2.3 Multiple Priors

2.3.1 Maxmin Expected Utility

Motivated by the desire to find a representation exhibiting preference for known probabilities, that is, to allow for ambiguity aversion, Gilboa and Schmeidler (1989) proposed a model of ambiguity sensitive preferences under which the decision maker has too little information to form a single prior over the state space, and instead considers a set of priors as possible when evaluating an action. Under Maxmin Expected Utility (MEU), it is assumed that the decision maker is ambiguity averse and hence orders according to the most pessimistic probability distribution in P.

Theorem 2.3 (Maxmin Expected Utility, Gilboa and Schmeidler (1989)). ∃ u : X → R and a non-singleton, convex set of probability measures P such that ∀ f, g ∈ A

f % g ⟺ min_{P∈P} ∫ u ∘ f(ω) dP(ω) ≥ min_{P∈P} ∫ u ∘ g(ω) dP(ω),

iff the preference relation % satisfies the MEU axioms8.

Note that if P is a singleton, MEU collapses to Savage's (1954) standard SEU, and hence the level of ambiguity is dictated by the vagueness of the decision maker's beliefs, that is, the cardinality of the set of distinct priors.

Now, if subjects behave as if they have MEU preferences, Gilboa and Schmeidler (1989) offer a solution to the Ellsberg (1961) paradox, as % accommodates choice patterns consistent with preferences for known probabilities. Again, consider the state space Ω = {r, b, y}, but introduce a range of possible probability measures P = {P ∈ Δ(Ω, F) | P({r}) = 1/3}, that is, all probability measures with P({b}) ∈ [0, 2/3] and P({y}) = 2/3 − P({b}). It becomes clear that f_r ≻ f_b ∼ f_y, since ambiguity about the {b, y}-partition dictates that maxmin preferences assign the worst probability distribution to the realization of the state variable, in this case setting the probability of the chosen non-red ball equal to zero. On the other hand, when betting on {b, y}, we do not care about the {b, y}-partition, as it has no impact on P({b, y}) = 2/3, and effectively the decision maker cancels out any impact from ambiguity.
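This maxmin resolution of the paradox is easy to verify numerically. The Python sketch below discretises the set of priors above and evaluates each $100 bet by its worst-case probability; the normalisation u($100) = 1, u($0) = 0 is an assumption for illustration:

```python
# A sketch of MEU on the three-colour Ellsberg urn: P(r) is fixed at 1/3
# while P(b) ranges over [0, 2/3]; exact rational arithmetic avoids
# floating-point ties.

from fractions import Fraction

def meu(event, priors):
    """Maxmin expected utility of a $100 bet on `event`: min_P P(event)."""
    return min(sum(p[c] for c in event) for p in priors)

# discretise the set of priors with P(b) = i/60 for i = 0, ..., 40
priors = [{"r": Fraction(1, 3), "b": Fraction(i, 60),
           "y": Fraction(2, 3) - Fraction(i, 60)} for i in range(41)]

assert meu("r", priors) == Fraction(1, 3)    # red is unaffected by ambiguity
assert meu("b", priors) == 0                 # worst prior sets P(b) = 0
assert meu("y", priors) == 0                 # ... and likewise for yellow
assert meu("by", priors) == Fraction(2, 3)   # the b-y partition cancels out
```

So f_r ≻ f_b ∼ f_y while f_by ≻ f_ry, exactly the modal Ellsberg pattern.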

2.3.2 α-Maxmin Expected Utility

While MEU achieves ambiguity averse preferences, it remains silent on the attitude towards the level of ambiguity, i.e. the model does not allow for a nuanced analysis of ambiguity and ambiguity aversion in separation. Indeed, it may seem extreme that the decision maker always orders according to the most pessimistic probability distribution. For this reason, Marinacci (2002) and Ghirardato et al. (2004) introduced a separation between the vagueness of beliefs (the level of ambiguity), i.e. the diffusion of the set of priors P, and the attitude towards this ambiguity given by the Hurwicz criterion9. Specifically, α-MEU preferences set the objective function to some convex combination of the minimum (most pessimistic) and maximum (most optimistic) expected utilities. Thus, α ∈ [0,1] describes the relative weight on the optimistic distribution, in effect acting as an index of ambiguity attitude, where α = 1 is pure optimism and α = 0 is pure pessimism.

8The axioms underlying MEU will not be given here. See Gilboa and Schmeidler (1989) or Gilboa (2009) for reference.

9 The Hurwicz criterion in decision making under complete uncertainty represents a convex compromise between the maximin and maximax criteria.


Theorem 2.4 (α-Maxmin Expected Utility, Ghirardato et al. (2004)). For any act f and any non-singleton set of probability measures P, the decision maker assigns weight α ∈ [0,1] to the most optimistic prior and (1−α) to the most pessimistic prior,

f % g ⟺ (1−α) min_{P∈P} ∫ u ∘ f(ω) dP(ω) + α max_{P∈P} ∫ u ∘ f(ω) dP(ω)
        ≥ (1−α) min_{P∈P} ∫ u ∘ g(ω) dP(ω) + α max_{P∈P} ∫ u ∘ g(ω) dP(ω),

so that the vaguer the set of priors the more ambiguity, and the lower α the higher the level of ambiguity aversion.
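A small Python sketch of the α-MEU rule on the same Ellsberg set of priors shows how α interpolates between maxmin and maxmax; the discretised prior set and the payoff normalisation u($100) = 1, u($0) = 0 are, again, illustrative assumptions:

```python
# A sketch of the alpha-MEU rule of Theorem 2.4 on the Ellsberg urn,
# weighting the best and worst prior by alpha and (1 - alpha).

from fractions import Fraction

def alpha_meu(event, priors, alpha):
    evs = [sum(p[c] for c in event) for p in priors]
    return (1 - alpha) * min(evs) + alpha * max(evs)

priors = [{"r": Fraction(1, 3), "b": Fraction(i, 60),
           "y": Fraction(2, 3) - Fraction(i, 60)} for i in range(41)]

# alpha = 0 recovers maxmin (pure pessimism), alpha = 1 maxmax (pure optimism)
assert alpha_meu("b", priors, 0) == 0
assert alpha_meu("b", priors, 1) == Fraction(2, 3)
# in between, the bet on blue is valued by linear interpolation of the extremes
assert alpha_meu("b", priors, Fraction(1, 2)) == Fraction(1, 3)
```

Note how only the two extreme priors enter the value: this is exactly the crudeness that Klibanoff et al. (2005) object to in the next section.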

2.4 Smooth Ambiguity

Despite Ghirardato et al.'s (2004) achievement of the α-MEU preference relation in separating ambiguity and ambiguity aversion, thus fulfilling separability, Klibanoff et al. (2005) found α-MEU too crude for refined analysis, a crudeness inherent to the construction of the preference relation by a Hurwicz (1951) criterion, seeing that the decision maker only considers the infimum and supremum expected utility over the priors. Although acts are evaluated as a convex combination of the extreme priors, the decision rule still fails to consider available intermediate prospect values, as these are taken into account only by interpolation. The result is the peculiarity that if two prospects carry similar values on the extreme priors, but one dominates the other on all remaining priors, then a decision maker exhibiting α-MEU preferences will consider the two prospects similarly attractive, though a preference for the dominating prospect indeed seems more reasonable.

Motivated by the desire to develop a more refined representation, Klibanoff et al. (2005) propose to replace the Hurwicz (1951) criterion underlying α-MEU preferences with an aggregation of the entire set of possible expected utilities over the set of priors. To this end, the representation introduces a second-order probability over the probabilities in Δ(Ω). Furthermore, they need to introduce non-linearity in order to capture non-neutral ambiguity attitudes, exactly as is customary in utility functions (if not, the prior over priors collapses to probabilistic sophistication, in effect making it an ordinary SEU representation). In essence, the representation introduces two utility functions: (i) an external utility function characterizing ambiguity attitudes over the set of priors, and (ii) an internal ordinary von Neumann-Morgenstern utility function characterizing risk attitudes over the evaluation of acts.

Specifically, let P be a probability measure on the state space Ω, and let Δ be the set of all such admissible probability measures. Introduce a second-order utility function from reals to reals, φ : R → R, strictly increasing on the range of u, that captures the ambiguity attitude, and let ψ : σ(Δ) → [0,1] be a prior over Δ, the set of probability measures, measuring the subjective evaluation of a given prior as the "right" measure. In particular, a concave φ characterizes ambiguity aversion, which Klibanoff et al. (2005) define to be an aversion towards mean-preserving spreads in the distribution of expected utilities induced by ψ and f; a convex φ captures ambiguity-loving preferences; and a linear φ results in ambiguity neutrality. Hence, the characterization of φ is parallel to that of risk attitudes in customary utility functions. The preference representation is then given by

∫_{Δ(Ω)} φ( ∫_Ω u ∘ f(ω) dP(ω) ) dψ(P)   (2.4)

An increase in ambiguity is represented by an increase in the support on the spread of ψ, so that the multiplicity and range of subjective probabilities over priors increases.
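The double aggregation in (2.4) can be sketched with a finite set of priors and a uniform second-order prior ψ. The choice φ(x) = −e^{−x} below is one standard concave (ambiguity-averse) specification, an assumption for illustration rather than anything dictated by the representation:

```python
# A sketch of the KMM smooth-ambiguity functional with three priors and a
# uniform second-order prior; the act is summarised by its expected utility
# under each prior.

import math

def smooth_value(evs, psi, phi):
    """Sum over priors P of psi(P) * phi(E_P[u o f])."""
    return sum(w * phi(ev) for ev, w in zip(evs, psi))

phi = lambda x: -math.exp(-x)     # concave: ambiguity aversion
psi = [1 / 3, 1 / 3, 1 / 3]       # uniform second-order prior

spread = [0.0, 0.5, 1.0]          # expected utilities dispersed across priors
no_spread = [0.5, 0.5, 0.5]       # same mean, no ambiguity

# a concave phi penalises the mean-preserving spread in expected utilities
assert smooth_value(spread, psi, phi) < smooth_value(no_spread, psi, phi)
```

Unlike α-MEU, every intermediate prior contributes to the value, which is precisely the refinement the representation was designed to deliver.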

2.5 Capacities and the Choquet Integral

The point of departure in Schmeidler (1989) is that for a probability to reflect the decision maker's willingness to bet in a game with ambiguity, such a probability measure cannot be additive. We saw that Savage's (1954) expectation operation is processed with respect to some subjective prior probability derived uniquely from the decision maker's preferences over acts. However, Ellsberg (1961) showed that such preferences do not conform with the modal preferences revealed under ambiguity, as they cannot be represented by any additive probability measure, and are consequently at odds with the Sure-Thing Principle, Savage axiom P2, and with Machina and Schmeidler's (1992) probabilistic sophistication.

In pursuance of a representation that accommodates ambiguity aversion, Schmeidler (1989) generalizes the probability measure to capacities or probability charges that assign densities on events but which need not be additive (and are only so in the special case of the existence of a probability measure), thus in general referred to as non-additive probabilities.

Definition 2.2 (Capacity). A capacity v on (Ω, F) is a normalized, monotone set function v : F → [0,1] such that A ⊆ B ⟹ v(A) ≤ v(B) ∀ A, B ∈ F (monotonicity) and v(∅) = 0 and v(Ω) = 1 (normalization).

The capacity v is convex if v(A ∪ B) ≥ v(A) + v(B) − v(A ∩ B), and concave if the reverse inequality holds. A probability measure is then a special case of a capacity which is "both" concave and convex, meaning that the inequality holds with equality. Now, if ∀ A, B ∈ F such that A ∪ B ∈ F and A ∩ B = ∅ it holds that v(A ∪ B) = v(A) + v(B), then we call the capacity additive, that is, it is a probability measure, whereas if v(A ∪ B) ≥ v(A) + v(B) the capacity is super-additive, and for v(A ∪ B) ≤ v(A) + v(B) it is sub-additive. In this sense, capacities are in general understood as being non-additive, and such a capacity is also called a fuzzy measure.

The expectation operation over a non-additive probability cannot employ the usual Riemann integral, seeing that this procedure integrates vertically by considering events independently and one by one. As a consequence, Schmeidler (1989) made use of the Choquet (1955) integral10, which evaluates acts subject to beliefs represented by a capacity11.

Definition 2.3 (Choquet Integral, Choquet (1955)). The Choquet integral of an act f : Ω → X ⊆ R with respect to a capacity v on (Ω, F) for Ω = {ω_i, i = 1, …, n} is defined

∫ f dv := Σ_{i=1}^{n} f(ω_i) [ v(∪_{j=1}^{i} {ω_j}) − v(∪_{j=1}^{i−1} {ω_j}) ]   (2.5)

by the ordered sequence of outcomes f(ω_1) ≥ f(ω_2) ≥ … ≥ f(ω_n), with the notational convention v(∪_{j=1}^{0} {ω_j}) = v(∅) := 0.

Note how the Choquet integral does not satisfy additivity: in general, ∫ f dv + ∫ g dv ≠ ∫ (f + g) dv.

10 Capacities and the Choquet integral were introduced by Choquet (1955) in an application to physics.

11 Perhaps more intuitively, we can follow Yaari (1987) and understand a capacity as a probability distortion function. Let P be an underlying objective probability measure on (Ω, F) and let w : [0,1] → [0,1] be an increasing function with w(0) = 0 and w(1) = 1; then we can consider the capacity v = w ∘ P a distorted probability and w a corresponding distortion function.
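A minimal Python sketch of Definition 2.3 may help fix ideas: order the outcomes decreasingly and weight each by the marginal capacity increment of the growing "at least this good" event. The state names and payoffs are invented for illustration:

```python
# A sketch of the discrete Choquet integral (2.5); reduces to the ordinary
# expectation when the capacity is additive.

def choquet(outcomes, capacity):
    """outcomes: {state: payoff}; capacity: frozenset of states -> [0, 1]."""
    states = sorted(outcomes, key=outcomes.get, reverse=True)
    total, prev = 0.0, 0.0
    for i, s in enumerate(states):
        cur = capacity(frozenset(states[: i + 1]))
        total += outcomes[s] * (cur - prev)   # marginal capacity weight
        prev = cur
    return total

# sanity check: with an additive capacity the Choquet integral is the
# ordinary Riemann expectation
prob = {"a": 0.2, "b": 0.5, "c": 0.3}
additive = lambda A: sum(prob[s] for s in A)
f = {"a": 10.0, "b": 4.0, "c": 1.0}
assert abs(choquet(f, additive) - (10 * 0.2 + 4 * 0.5 + 1 * 0.3)) < 1e-12
```

The "horizontal" integration is visible in the loop: each payoff level is weighted by how much capacity the cumulative upper-level event gains, which is where non-additivity enters.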


Theorem 2.5 (Choquet Expected Utility, Schmeidler (1989)). If % satisfies the CEU axioms12, then ∃ a unique non-additive probability v on F and a u : X → R such that ∀ f, g ∈ A

f % g ⟺ ∫ u ∘ f dv ≥ ∫ u ∘ g dv.

Moreover, in this case v is unique, and u is unique up to positive linear transformations.

The capacity used in the computation of the Choquet integral will overweight high outcomes if the capacity is concave (and can thus be considered optimistic) and will overweight low outcomes if the capacity is convex (and can accordingly be considered pessimistic). To see how the construction of capacities allows for ambiguity aversion, reconsider the decision matrix in Ellsberg (1961). Let u($100) > u($0) (no argument here) and observe that f_r ≻ f_b demands v({r}) > v({b}), whereas f_ry ≺ f_by demands v({r, y}) < v({b, y}). Then, because v is allowed to be non-additive, it is valid to let beliefs be characterized by v({r}) = v({r, b}) = v({r, y}) = 1/3, v({b}) = v({y}) = 0, and v({b, y}) = 2/3. Conforming with Schmeidler's (1989) definition of ambiguity aversion as ordering with respect to convex capacities, observe that the just described capacity is indeed convex.
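The claim is easily checked numerically. The following self-contained sketch evaluates the Ellsberg bets by the Choquet integral under the capacity just described, with the illustrative normalisation u($100) = 1, u($0) = 0:

```python
# A sketch verifying that the convex Ellsberg capacity reproduces the modal
# preferences through the Choquet integral.

def choquet(outcomes, v):
    states = sorted(outcomes, key=outcomes.get, reverse=True)
    total, prev = 0.0, 0.0
    for i, s in enumerate(states):
        cur = v[frozenset(states[: i + 1])]
        total += outcomes[s] * (cur - prev)
        prev = cur
    return total

v = {frozenset(s): w for s, w in [("", 0.0), ("r", 1/3), ("b", 0.0),
                                  ("y", 0.0), ("rb", 1/3), ("ry", 1/3),
                                  ("by", 2/3), ("rby", 1.0)]}
bet = lambda event: {c: (1.0 if c in event else 0.0) for c in "rby"}

assert choquet(bet("r"), v) > choquet(bet("b"), v)    # f_r preferred to f_b
assert choquet(bet("ry"), v) < choquet(bet("by"), v)  # f_ry dominated by f_by
```

Both modal Ellsberg preferences hold under a single capacity, which no single additive measure could deliver.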

2.5.1 Neo-Additive Capacities

CEU succeeds in modeling attitude towards ambiguity. However, ambiguity and ambiguity attitude are confounded, as there exists no separation between the two in the representation. Chateauneuf et al. (2007) construct a representation with a Choquet integral of neo-additive capacities that allows for such a separation, named so because it is additive on non-extreme outcomes.

A neo-additive capacity can be expressed as a convex combination of a probability and a special capacity (referred to as the Hurwicz capacity) that only distinguishes between whether an event is impossible, possible or certain. Specifically, the Hurwicz capacity simply distinguishes whether an event A belongs to the set of "null" events, "universal" events, or "essential" events. Null events are "impossible", so ∅ ∈ N, A ∈ N ⟹ B ∈ N ∀ B ⊆ A, and A, B ∈ N ⟹ A ∪ B ∈ N; universal events are "certain", U = {A ∈ F : Ω∖A ∈ N}; and essential events are "likely" in the sense that they are neither impossible nor certain, i.e. E = F ∖ (N ∪ U).

12 Again, these will not be sketched here, see Schmeidler (1989).

We can then intuitively understand the Hurwicz capacity as a convex combination of two capacities, one of which reflects complete ignorance, or complete ambiguity, in everything bar a universal event occurring, and a second which reflects complete confidence in everything bar null events; that is, for µ_U(A) = 1 if A ∈ U (and 0 otherwise) and µ_N(A) = 1 if A ∉ N (and 0 otherwise), µ_α^N(A) = α µ_N(A) + (1−α) µ_U(A).

Definition 2.4 (Hurwicz Capacity). For the null events N ⊂ F and optimism α ∈ [0,1] the Hurwicz capacity is defined

µ_α^N(A) := { 0 if A ∈ N;  α if A ∉ N ∧ Ω∖A ∉ N;  1 if Ω∖A ∈ N }   (2.6)

Now, we are ready to formally define the neo-additive capacity. We let it be defined by a convex combination of (i) an additive capacity, i.e. a confident probability measure, and (ii) the Hurwicz capacity, which captures how "optimistic" the decision maker is in the sense that she evaluates how "certain" she is that a given event will occur. The convex combination determines the degree to which the decision maker believes that the additive measure indeed represents the true probability measure.

Definition 2.5 (Neo-Additive Capacities, Chateauneuf et al. (2007)). For some event A and an additive confident probability measure P on (Ω, F), the neo-additive capacity v is given by a convex combination of the additive measure and a Hurwicz capacity,

v(A | N, P, δ, α) := (1−δ)P(A) + δ µ_α^N(A),

which for essential events A reduces to (1−δ)P(A) + δα, where δ ∈ [0,1] relates to the degree of ambiguity (the lack of confidence in P as the true measure) and α ∈ [0,1] to the level of optimism.

This neo-additive class of capacities delivers a delineation of ambiguity and ambiguity attitudes. Specifically, δ denotes the lack of confidence in the prior P, so that for δ = 0 the preference relation reduces to the standard subjective expected utility representation of preferences over Savage acts, that is, acts that do not involve objective lotteries. The higher δ, the more unsure the decision maker is that P is the true measure. In addition, the preference relation under neo-additive capacities also yields an α which measures the degree of optimism versus pessimism by which the decision maker resolves her ambiguity. Effectively, the pair (δ, α) allows for preferences for different degrees of ambiguity and for different attitudes towards ambiguity, and the sought-after separation is thus established.
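A toy two-state sketch of Definition 2.5 makes the roles of δ and α concrete; the states and the confident measure P below are invented for illustration:

```python
# A sketch of a neo-additive capacity: v(A) = (1 - delta) * P(A) +
# delta * Hurwicz_alpha(A), on a two-state space where the only null event
# is the empty set and the only universal event is Omega itself.

P = {"up": 0.6, "down": 0.4}
OMEGA = frozenset(P)

def hurwicz(A, alpha):
    if not A:                # null event
        return 0.0
    if A == OMEGA:           # universal event
        return 1.0
    return alpha             # essential event

def neo_additive(A, delta, alpha):
    return (1 - delta) * sum(P[s] for s in A) + delta * hurwicz(A, alpha)

up = frozenset({"up"})
assert neo_additive(up, 0.0, 0.7) == P["up"]        # delta = 0: plain SEU
assert neo_additive(up, 0.5, 1.0) > P["up"]         # optimist overweights
assert neo_additive(up, 0.5, 0.0) < P["up"]         # pessimist underweights
assert neo_additive(frozenset(), 0.5, 0.7) == 0.0   # v(empty) = 0
assert abs(neo_additive(OMEGA, 0.5, 0.7) - 1.0) < 1e-12  # v(Omega) = 1
```

Varying δ shifts weight from the confident measure to the pure Hurwicz attitude, while α alone decides whether that attitude lifts or depresses the weight on essential events, which is exactly the separation of beliefs and tastes described above.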


Brownian Motions under Ambiguity

"The probability dependent on future events is impossible to predict in a mathematical manner. He analyzes causes which could influence a rise or fall of market values or the amplitude of market fluctuations. His inductions are absolutely personal, since his counterpart in a transaction necessarily has the opposite opinion."

– Bachelier, Louis, Theory of Speculation (1901, p. 25-26)

This chapter applies the just developed decision theoretic representations to the analysis of ambiguity about payoffs following laws of motion described by continuous-time stochastic processes. The importance of Brownian motions in the theory of modern economics and finance can hardly be overstated. Such families of motions are indeed also at the heart of real options theory, in which firms make decisions over strategies on payoff flows.

First, this chapter develops the fundamentals of stochastic differential equations as building blocks with which we proceed to model ambiguity about processes following Brownian motions. Next, the chapter proceeds to embed the three classes of ambiguity preference relations outlined in the previous chapter in a continuous-time stochastic environment. The multiple priors approach is modeled in continuous time by Chen and Epstein (2002), who employed Girsanov's Theorem as a means of evaluating processes that can be described by a non-singleton set of probability measures; this is the route taken in Nishimura and Ozaki's (2007) adoption of Gilboa and Schmeidler (1989) MEU preferences, later generalized by Schröder (2011) in a Marinacci (2002); Ghirardato et al. (2004) α-MEU specification. Skiadas (2013) shows that Klibanoff et al.'s (2005) smooth ambiguity preferences cannot be applied in a continuous-time environment, for which reason a short sketch of the proof will be provided in this chapter. Then, finally, the chapter will build the Kast and Lapied (2010) Choquet-Brownian motion employed in the proposed model as specified by Roubaud et al. (2010) in a real option model, but to which Chateauneuf et al.'s (2007) neo-additive capacities are incorporated in order to obtain a more refined delineation between ambiguity and ambiguity attitude. We start by providing the mathematics that will prove useful in understanding the drivers behind the specifications of the valuation schedules to be considered in Chapter 4.

3.1 Preliminaries

Once and for all, (Ω, F, P) is a probability space with sample space Ω, a σ-algebra of its subsets F := 2^Ω, subject to some probability measure P : F → [0,1], normalized, P(Ω) = 1, and additive. If some property holds for almost every ω with respect to the measure P, namely ∀ ω ∈ Ω∖N with P(N) = 0, we say that it holds "almost surely", abbreviated P-a.s.

Then, a stochastic process is a family of random variables over an index set T. In continuous time we write (X_t)_{0≤t≤T} over T = [0, T] and in discrete time {X_t}_{0≤t≤T} over T = {0, 1, …, T}, where the state of the process at time t is a random variable X(t) ∀ t ∈ T. In accordance with the literature on stochastic processes, we will write X(t) = X_t interchangeably.

The state space S contains all of the possible states over which the stochastic process can realize a value. In this sense, the sample space Ω is the set of all possible sample paths through the state space S, so X(t, ω) : T × Ω → R. Indeed, it can be most intuitive to think of Ω as a family of trajectories, realizations or sample paths ω : t ↦ X_t(ω), telling us the position of the process X_t at each time t ∈ T for some given sample path ω.

In order to formalize how information is revealed through time, we introduce the notion of a filtration. A filtration F = (F_t)_{t≥0} is an increasing sequence of σ-algebras contained in F : F_{t−1} ⊆ F_t ∀ t ≥ 1, where we interpret F_t as the set of events which are observable at t, such that the filtration contains the events that can happen "up to time t". In this sense, F_0 = {∅, Ω} and F_T = 2^Ω. We call (Ω, F, (F_t)_{t≥0}, P) a filtered probability space. We call a stochastic process adapted to the filtration if ∀ t, X_t is F_t-measurable. Every stochastic process generates a natural filtration F_t^X := σ(X_s, 0 ≤ s ≤ t), which keeps track of the "history" of the process.

3.2 From Random Walks on Z to Brownian Motions on R

We start out from the discrete-time classical random walk, as it will later provide the foundation for the characterization of the Choquet-Brownian motion. Consider an infinite sequence {X_j}_{j≥1} of {−1, +1}-valued i.i.d. Bernoulli random variables which are +1 with probability p,

X_j = +1 with probability p,  −1 with probability 1−p,   (3.1)

where we will consider the symmetric random walk for p = 1/2, so E[X_j] = 0 and Var[X_j] = 1. The sequence of random variables feeds a sequence of sums {S_τ}_{τ≥1} in a τ-step random walk

S_τ = X_1 + ⋯ + X_τ  (τ = 1, 2, …)   (3.2)

with properties E[S_τ] = 0 and Var[S_τ] = τ (recall Cov[X_i, X_j] = 0 ∀ i ≠ j, so Var[S_τ] = Var[Σ_{j=1}^{τ} X_j] = Σ_{j=1}^{τ} Var[X_j] + Σ_{i≠j} Cov[X_i, X_j] = τ · 1 + 0 = τ), summarized by the sample space Ω = {−1, +1}^τ so that S_τ walks on Z in a lattice as illustrated in Figure 3.1. By convention, S_0 := 0.
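A short simulation sketch confirms the two moments just derived; the seed and path count are arbitrary choices:

```python
# Simulate the symmetric random walk (3.2) and check E[S_tau] = 0 and
# Var[S_tau] = tau empirically.

import random

random.seed(1)
tau, n_paths = 50, 20000
finals = [sum(random.choice((-1, 1)) for _ in range(tau))
          for _ in range(n_paths)]

mean = sum(finals) / n_paths
var = sum((s - mean) ** 2 for s in finals) / n_paths
assert abs(mean) < 0.5        # close to E[S_tau] = 0
assert abs(var - tau) < 5.0   # close to Var[S_tau] = tau
```

The linear growth of the variance in τ is the property that the re-scaling below is designed to preserve.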

So far, so good. Now, since the objective is to construct a continuous-time motion, we are interested in inserting more random variables in between the already established X_j. Thus, re-scale the random walk by inserting random variables in the regions between the already established {S_τ}_{τ≥1} realizing at τ = 1, 2, …, by establishing a new sequence {S_τ^(N)}_{τ≥1} under which, for some fixed N ∈ N, we consider N times as many random variables that realize at times j/N instead of j.

For the re-scaling to be valid, the new steps must take the re-scaled amplitude X_j/√N instead of X_j so that the variance over a unit interval remains unchanged, that is, Var[Σ_{j=1}^{N} X_j/√N] = Σ_{j=1}^{N} 1/N = 1 = Var[X_j].

Figure 3.1: The S_τ symmetrical random walk on Z.

By increasing N, the random walk {S_τ^(N)}_{τ≥1} will be defined on an increasing number of elements of R_+. In the pursuance of continuity, it is convenient to define a piecewise constant random function W_t^(N) on the half-line R_+ = {t : t ≥ 0}, locally constant in the regions {[τ/N, (τ+1)/N) : τ = 1, 2, …} between the realizations of {S_τ^(N)}_{τ≥1}, so as to connect the X_j/√N,

W_t^(N) := Σ_{1≤j≤⌊Nt⌋} X_j/√N   (3.3)

where ⌊Nt⌋ denotes the largest integer less than or equal to Nt (intuitively, W_t^(N) takes the value of the last realization in the random walk {S_τ^(N)}_{τ≥1} before a new step of amplitude X_j/√N realizes).
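The construction (3.3) is easy to simulate. The sketch below checks that the variance of W_t^(N) at a fixed t is close to t, anticipating the Brownian limit; the parameters and seed are arbitrary:

```python
# Simulate the re-scaled walk (3.3): W_t^(N) sums floor(N*t) steps of
# amplitude 1/sqrt(N), so Var[W_t^(N)] = floor(N*t)/N, which tends to t.

import math
import random

def W(t, N, steps):
    """W_t^(N) from a pre-drawn list of +/-1 steps of length >= floor(N*t)."""
    return sum(steps[: math.floor(N * t)]) / math.sqrt(N)

random.seed(7)
N, t = 400, 1.0
samples = [W(t, N, [random.choice((-1, 1)) for _ in range(int(N * t))])
           for _ in range(5000)]

var = sum(w * w for w in samples) / len(samples)
assert abs(var - t) < 0.1     # empirical variance close to t
```

The distributional statement behind this, normality of the limit, is exactly what the Central Limit Theorem below delivers.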

Thus, we see that the re-scaling of the random walk {S_τ^(N)}_{τ≥1} by introducing a larger number of steps, that is, by increasing N, results in a continuous process with random variables defined on finer and finer regions of R_+. Accordingly, we invoke the Central Limit Theorem (CLT) to examine W_t^(N) when N → ∞.

Theorem 3.1 (Central Limit Theorem). If {X_j}_{j≥1} are i.i.d. random variables with finite mean E[X_j] = µ and finite non-zero variance Var[X_j] = σ²,
