A Markov Chain Approach to Monetary Policy Decision Making.

This is a Bachelor's thesis from KTH/Matematik (Inst.)

Authors: Marcus Josefsson; Erik Rasmusson; [2012]



Through monetary policy, central banks aim to prevent the societal costs associated with high or unstable inflation. Forecasts and several other tools are used to provide guidance to this end, as the outcomes of interest rate decisions are not fully predictable.

This report presents a statistical approach, viewing the development of the economy as a Markov chain. The economy is thus represented by a finite number of states, composed of inflation and short-term variations in GDP. The Markov property is assumed to hold, that is, the economy moves between states over an appropriately chosen time period and the transition probabilities depend only on the initial state. Using the Markov Decision Process (MDP) framework, the transition probabilities between such states are estimated from historical data, distinguished by the interest rate decision preceding the transition. Completing the model, a cost of inflation is defined for each state as the deviation from a set target. An optimal policy is then determined as a fixed decision for each state, minimizing the expected average cost incurred while using the model.
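The modeling steps above (estimate decision-conditioned transition probabilities from counted historical transitions, assign each state a cost equal to its deviation from the inflation target, then pick the fixed decision per state that minimizes long-run average cost) can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: the state grid, action set, and transition counts are invented placeholders, and the optimal policy is found by brute-force enumeration, which is feasible only because the state and action spaces here are tiny.

```python
import itertools
import numpy as np

# Hypothetical setup: 4 economic states (inflation x GDP-variation combos)
# and 3 interest-rate decisions. Neither the state grid nor the transition
# counts below come from the thesis data; they are invented for illustration.
states = ["lo-infl/lo-gdp", "lo-infl/hi-gdp", "hi-infl/lo-gdp", "hi-infl/hi-gdp"]
actions = ["cut", "hold", "raise"]
target = 2.0                                 # inflation target (percent)
inflation = np.array([1.0, 1.5, 3.5, 4.0])   # representative inflation per state
cost = np.abs(inflation - target)            # per-state cost: deviation from target

rng = np.random.default_rng(0)
# Transition counts N[a, s, s'], as they might be tallied from historical
# state transitions grouped by the preceding interest-rate decision.
counts = rng.integers(1, 10, size=(len(actions), len(states), len(states)))
P = counts / counts.sum(axis=2, keepdims=True)  # row-normalize into probabilities

def average_cost(policy):
    """Long-run average cost of a fixed decision rule (one action per state)."""
    # The fixed policy induces an ordinary Markov chain; its expected average
    # cost is the stationary distribution weighted by the per-state costs.
    Ppi = np.array([P[policy[s], s] for s in range(len(states))])
    evals, evecs = np.linalg.eig(Ppi.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
    pi = pi / pi.sum()
    return float(pi @ cost)

# Tiny state/action space: enumerate all stationary deterministic policies.
best = min(itertools.product(range(len(actions)), repeat=len(states)),
           key=average_cost)
print({states[s]: actions[a] for s, a in enumerate(best)})
```

For realistically sized state spaces one would replace the enumeration with policy iteration or linear programming for average-cost MDPs, but the objective being minimized is the same.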

The model is evaluated on data from Sweden and the U.S., for the periods 1994-2007 and 1954-2007 respectively. The results are assessed by the estimated transition probabilities as well as by the optimal policy suggested. While the Swedish observations are concluded to be too few in number to render valuable results, outcomes using the U.S. data agree in several respects with what would be expected from macroeconomic theory. In conclusion, the results suggest that the model might be applied to the problem, granted sufficient data is available for reliable transition probabilities to be estimated and that this estimation can be performed in an unbiased way. Presently, this appears to be a difficult task.
