Uncertainty and gradualism in monetary policy
Speech by José Manuel González-Páramo, Member of the Executive Board and Governing Council of the European Central Bank, Universidad Pablo de Olavide, Sevilla, 17 March 2006
Introduction
Uncertainty is one of those key ideas that are present in almost all fields of economics. Precautionary savings, risk premia and confidence intervals are all familiar expressions related to the presence of uncertainty. The monetary policy literature is no exception; all conceivable sources of uncertainty are currently analysed in the academic literature. Economic shocks, parameter uncertainty, model uncertainty and even preference uncertainty have all been studied recently.
Uncertainty would not in itself be a problem if we were able to reduce it considerably by studying the properties of the economy and conducting controlled experiments. Unfortunately, the prospects are not propitious in this regard. In contrast to other disciplines, it is not feasible and would not be ethical to induce variability into the economy with the sole purpose of furthering our understanding. Even if we could, we would still face the additional complication that our policy choices affect the environment we are trying to study. That is, people will adjust their decisions and expectations in the face of regime changes.
The degree of uncertainty facing the European Central Bank is arguably much greater than that facing other, longer-established central banks. Monetary union is a new regime, and we currently have only about eight years of data covering the union itself at our disposal for model estimation and analysis. At the time when the ECB was set up, there was of course no such data. This means that uncertainty has been, and continues to be, a very real and relevant concern in the policy-making process. Consequently, uncertainty has played an important role in the formulation of our strategy.
Today I will discuss what I see as the main sources of uncertainty facing monetary policy makers and the various solutions discussed in the literature. I will then explain our own response to this uncertainty in the context of our monetary policy strategy. In addition, and trying to follow the teaching of Don Pablo de Olavide who stated that “There is nothing more persuasive than good example, and nothing more convincing than experience” (Nada persuade tanto como el ejemplo y nada convence tan eficazmente como la experiencia), I will try to illustrate the nature of our strategy by referring to our monetary policy decisions and comparing the ECB’s strategy with other monetary policy frameworks.
Monetary policy in an uncertain world
The two main types of uncertainty that a central bank faces relate to data and to the model of the economy.
The most easily assessed form of uncertainty relates to data. As all applied economists know, very timely macroeconomic statistics are typically based on limited responses from businesses and are therefore subject to revisions. Furthermore, we only receive the figures we are most interested in, i.e. those relating to the current economic situation, with a lag of typically at least a quarter.
With regard to model uncertainty, I want to focus on three aspects: general model uncertainty, identification of shocks and the transmission of monetary policy.
First, as any literature survey or discussion with central bankers would tell you, there is substantial uncertainty concerning the “correct” model of the economy to use. Indeed, any economic model is at best an approximation of an exceedingly complex reality. Of course, our aim is not necessarily to have the “right” model, but one that:
does reasonably well in forecasting future economic conditions;
captures the relevant frictions which create a role for monetary policy; and
is well suited for constructing alternative policy scenarios.
Forecasting is important because monetary policy operates only with a substantial lag. Hence, in order for policy to be effective, it must be sufficiently forward-looking and pre-emptive. The key frictions that determine the extent to which monetary policy actions have a real effect on the economy are those related to nominal rigidities, such as price and wage stickiness. Most models currently used in central banks incorporate these frictions in one form or another. The Lucas critique is highly relevant in the context of policy scenarios, in that it requires structural models to be used so that the changing behaviour of the private sector in the face of different policies is appropriately incorporated. The Lucas critique and the rational expectations revolution that followed emphasised the fact that reduced-form models are unstable across different regimes. In other words, they are not well suited for policy analysis, as they implicitly assume that the behavioural equations would not change when we simulate counterfactual policies. In contrast, structural rational expectations models try to find “deep” parameters such as preferences, production functions and so forth that should be independent of the policy regime, hence allowing a consistent assessment of alternative policies. The problem here is that several different structural models may well give equally plausible descriptions of the data.
In macroeconomics, we are in the unfortunate situation of having only very short samples with which to estimate our competing structural models and test our theories, and hence none will stand out as a clear “winner”. This is particularly true for the euro area, for which only about eight years of data are available. The question is therefore how to treat more or less equally plausible models, in particular when they imply differences concerning the appropriate stance of current monetary policy.
One implication of the fact that several structural models will fit the data is that there is a link between model and shock uncertainty, in that identification of the latter hinges on which model is employed. Such identification has crucial effects on the resulting policy recommendation. For example, the appropriate responses to demand and supply shocks are typically quite different. In the econometric literature, this issue surfaces because various identification schemes give rise to different interpretations of the “structural” shock.[1]
The usual assumption in structural economic models is that all shocks can be observed by the central banker at decision-making time. As is painfully clear to those of us who have to make such decisions, this is far from an accurate description of the environment we face.
One example of this problem is the controversy surrounding how much IT-related innovation has fuelled productivity growth. Another is how to interpret rapid increases in stock prices, which invariably trigger debates about bubbles versus strong growth prospects. By way of illustration, it has been argued that the Federal Reserve System (Fed) may have played an important role in the rise of the Great Inflation (see Orphanides (2003a, b)), as it may have overestimated potential output and therefore believed that the low level of observed GDP implied a large negative output gap. In response, it loosened monetary policy, inadvertently contributing to the fuelling of inflationary pressures.
There is also much uncertainty surrounding the transmission of monetary policy actions into the economy. Three issues stand out in particular. First, due to the structural break caused by the introduction of the euro, we can only rely on a very recent sample in order to identify the effects and, as I discussed above, that sample covers a very short period indeed. Second, monetary policy operates only with substantial lags on inflation and GDP. Typical estimates suggest that the full effect is realised only after about 1.5 to 2 years. This reduces the sample and complicates the identification of the transmission mechanism, as dynamic models are inherently more challenging to estimate, in particular when there is uncertainty (as there always is) about the appropriate lag structure to use. Third, the task of the central bank is to ensure price stability, and one indicator often used is the behaviour of long-term inflation expectations. In this respect, a key issue is how the private sector forms its expectations, in particular about future inflation. Again, this issue is subject to a high degree of uncertainty, given that the formation of expectations depends on the explicitness of goals, strategy and policy actions.
Addressing uncertainty – suggested solutions
How should this uncertainty be reflected in policy choices? The academic literature on this subject abounds, offering various ideas of theoretical interest, such as robust control and Bayesian averaging. The former approach proposes that uncertainty be considered around the model that is currently believed to be the best approximation, whereas the latter requires the central banker to use several models and attach different “priors” to their likelihood. Another difference is that robust control places all the weight on the worst-case scenario (within the bounds of the amount of uncertainty considered), such that policy is chosen to ensure an acceptable outcome even in the worst possible case, at some cost should the benchmark model turn out to be correct. In contrast, the Bayesian averaging approach weighs each model and outcome with its probability.
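To illustrate the distinction with a stylised example of my own, rather than a formulation drawn from the literature being summarised: suppose the central banker entertains a small set of models m = 1, …, M, each implying a loss L_m(i) for a given setting of the interest rate i. The Bayesian approach attaches a prior probability p_m to each model and chooses i to minimise the probability-weighted loss, Σ_m p_m·L_m(i), whereas the robust approach chooses i to minimise the worst outcome across the models considered, max_m L_m(i).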
A less formal approach to uncertainty, which is perhaps closer to what a central banker would actually consider, is the idea that there may be simple rules that perform well in many different models. The problem is that a policy which is fully optimal in one model might perform very badly in another. If there is uncertainty over which of two models is correct, it would therefore be useful to find a rule that works well in both. Since many models “agree” that inflation is undesirable and that monetary policy can contribute to keeping it under control, it is not surprising that a rule that raises interest rates more than one-for-one with inflation delivers a robust outcome. This is the keystone of the “Taylor rule”, as advocated by Taylor (1993). The second component of the rule consists of a response to the output gap, such that when GDP rises above its potential, interest rates should rise.
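As a concrete reference point, the original rule proposed in Taylor (1993) for the United States can be written, with an assumed equilibrium real interest rate of 2% and an implicit inflation objective of 2%, as i_t = π_t + 0.5·y_t + 0.5·(π_t − 2) + 2, where i_t is the federal funds rate in per cent, π_t is inflation over the previous four quarters and y_t is the percentage deviation of real GDP from its trend. The overall coefficient on inflation of 1.5 embodies the more-than-one-for-one response just mentioned.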
Another benefit of a rules-based approach is that policy becomes more predictable, which can facilitate the expectation formation process. However, one downside is that if it is applied in a rigid fashion, which may be necessary for the rule to be credible, it will sometimes ignore information that the central banker perceives as currently relevant for predicting future risks to price stability. A drawback of simple rules that only focus on inflation and the output gap is their possible inability to provide a real solution to the policy problem. In turn, this risks exacerbating macroeconomic fluctuations derived from self-fulfilling expectations.
In the real world, information does not arrive in the same stylised way as it does in models: often shocks only reveal themselves gradually rather than appearing neatly at a quarterly frequency. One possible response is to ensure that policy itself is gradual. Recent literature has indeed shown that a high degree of inertia in the policy instrument can be beneficial to the economy. An inertial monetary policy reduces the risk of having to reverse policy decisions quickly, something most central bankers are keen to avoid as policy reversals are quite challenging from a communication viewpoint. It also increases the leverage of the policy rate over medium and longer-term market rates, as markets come to expect that a given stance will generally be maintained for some time. In addition, a gradual monetary policy could reduce the likelihood of financial market disruptions and diminish the probability of large monetary policy mistakes in an uncertain environment. Finally, an established reputation for conducting policy in a gradual manner would be particularly beneficial in the face of major risks, such as an outbreak of deflationary fears. The private sector's expectation that the central bank would remove only gradually the monetary policy stimulus engineered to counter such fears would in itself exert a stabilising effect, thus decreasing the likelihood that the zero lower bound on nominal interest rates might hamper the central bank's attempt to stabilise inflation and inflation expectations.
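In the literature, such inertia is often represented by a partial-adjustment rule of the form i_t = ρ·i_{t−1} + (1 − ρ)·i_t*, where i_t* is the rate prescribed by, say, a simple Taylor-type rule and the smoothing parameter ρ, between zero and one, governs how gradually the policy rate moves towards it. This is of course a stylised representation rather than a description of any central bank's actual reaction function; estimated values of ρ are typically found to be high, pointing to considerable inertia.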
Gradualist policy has some support, but also meets some resistance, in the literature. A possible negative side effect of gradualism is that the central bank risks falling “behind the curve”. In other words, if the economy starts to pick up and inflation increases, a monetary policy response that is too slow will allow inflation to rise further, thereby creating the need for an even stronger correction in the future. If inflation is persistent, this can be costly.
Apart from these general solutions to the problem of uncertainty, the academic literature also provides concrete proposals for specific sources of uncertainty. With regard, for example, to shock uncertainty, a well-established finding is that if the model is linear and the objective quadratic, as is the case in most models of monetary policy, then additive uncertainty about shocks has no effect on optimal policy; this is the celebrated certainty equivalence principle. Policy responds to the best estimate of the variables in the same way that it would respond to perfectly measured variables, were they observable.
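In formal terms, and as a textbook illustration rather than a quotation from the literature cited here: in a linear model with a quadratic objective, the optimal interest rate can be written as i_t = F·E[x_t | I_t], where E[x_t | I_t] is the best estimate of the state of the economy given the information set I_t, and the coefficient vector F is exactly the one that would be optimal if x_t were observed without error. Only the estimate, and not the uncertainty surrounding it, matters for the policy setting.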
In the case of multiplicative parameter uncertainty, for instance if the slope of the Phillips curve is imperfectly known, it is interesting to note that the results regarding the strength of the optimal monetary policy response can go either way. The classic Brainard (1967) intuition is that the central banker should respond less to observable developments, supporting the gradualist approach. There are, however, counterexamples. For instance, Söderström (2002) shows that a more aggressive response to shocks may be more appropriate if there is uncertainty about the degree of inflation persistence. This also relates to the findings of Angeloni et al. (2003), who show that if there is uncertainty about inflation persistence, it is better to assume that inflation is persistent. This is because the costs of making a mistake when the inflation process is in reality less persistent are not as high as making the reverse mistake.
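The Brainard intuition can be conveyed with a simple sketch, a deliberately simplified example rather than the model used in the papers just cited. Suppose inflation responds to the policy instrument according to π = x + b·i, where x is an observed disturbance and the multiplier b is uncertain, with mean b̄ and variance σ_b². Minimising the expected squared deviation of inflation from target gives an optimal response i = −x·b̄/(b̄² + σ_b²): the larger the uncertainty σ_b², the more muted the response to the disturbance, which is precisely the attenuation, or gradualism, result.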
Furthermore, transmission uncertainty was the main motivation behind Friedman's caution against trying to “fine-tune” the economy through monetary policy intervention. The evident risk is that two years down the line, when a monetary policy action starts having an effect, the economic situation that triggered the action might have reversed, leading policy to exacerbate, rather than attenuate, the business cycle. The practical merit of this argument is of course an empirical issue; models can be statistically evaluated on the basis of the assumptions surrounding the estimation strategy, as well as on their forecasting performance. In the absence of long samples providing sufficient information, however, the answer rests rather on the central banker's gut feeling as to how much faith to place in the current models and forecasts.
I would like to conclude this discussion by noting that a significant proportion of applied monetary research emphasises that anchoring inflation expectations increases the ability of a central bank to maintain price and macroeconomic stability and has thus established a positive link between credibility and inflation stabilisation. Here credibility is defined, as per the standard dictionary definition, as “the ability to have one’s statements accepted as factual or one’s professed motives accepted as the true ones”. It might be the case that the higher the uncertainty, the greater the need for credibility. The idea is that credibility, by anchoring inflation expectations, helps to diminish uncertainty about future developments and thus contributes to stabilising the economy. Similarly, in an environment of high uncertainty, actions by a central bank which do not enjoy a high degree of credibility may be misinterpreted by agents as an indication that it is not committed to its price stability objective.
The response of the ECB
So now it has been made clear that there is plenty of uncertainty about the best way to address uncertainty! In this context, the ECB has committed itself to a procedural framework that aims at enhancing the robustness of our monetary policy.
How can the central bank attain the aforementioned credibility among private agents? By being clearly committed to a well-defined goal and by pursuing it consistently and independently of political influence. Accordingly, the first two features of our monetary policy strategy are the full independence of the European Central Bank and the national central banks (NCBs) of the Eurosystem, and a clearly specified quantitative definition of the ECB's primary objective, namely price stability.
In addition, the European Central Bank’s definition of price stability refers to a medium-term horizon, in recognition of the fact that monetary policy can only control price developments in the longer term and never in the short term. By contrast with the practice typically observed in inflation-targeting regimes, the ECB has not specified a fixed policy horizon. There are many reasons for this decision. Some of them are indeed related to the presence of uncertainty. The lags with which monetary policy affects price developments vary and are unpredictable. Moreover, the optimum monetary policy response always depends on the specific type and magnitude of the shocks affecting the economy. A medium-term horizon allows central bankers the necessary flexibility to respond appropriately to economic shocks of a specific type. A medium-term orientation helps to avoid introducing unnecessary volatility into the economy and to stabilise output and employment.
In order to best serve its primary objective of medium-term price stability, the ECB has chosen a full-information approach which rests on two pillars: an economic analysis and a monetary analysis. The two-pillar approach uses both economic and monetary analysis to give complementary pictures that can be cross-checked against each other and overall give an encompassing assessment of the current risks to price stability. Information is partitioned in this way because we believe that the two analyses contain signals that relate to different horizons. There is again also an element of model uncertainty reflected in this choice, as the analytical underpinnings of the two analyses rest to some extent on different paradigms. This allows the signals coming from the two analyses to be cross-checked and therefore avoids relying too much on any particular model.
The economic analysis focuses on the short to medium-term risks to price stability, examining a large number of indicators to extract the short-term disturbances and their effects on the aggregate economy. In this context, economic projections play an important role. These projections, including the ECB/Eurosystem staff projections – which are drawn up in collaboration with the national central banks of the Eurosystem – and the forecasts published by external institutions, provide a set of economic scenarios based on a number of assumptions. This analysis is complemented by an ongoing monitoring of the economy based on a wide set of statistics and indicators.
The monetary analysis serves to complement the economic analysis with a medium to long-term perspective on the risks to price stability. This analysis rests on the empirically and theoretically supported idea that monetary growth and inflation are closely related in the medium to long run. A sustained increase in money growth in excess of what is needed to finance transactions in the economy at constant prices will ultimately be associated with inflation. Therefore, monetary analysis can detect risks to price stability that escape the short-term economic analysis. Furthermore, the ECB's monetary statistics can complement or replace economic indicators that are measured with delays or surrounded by uncertainty, as these statistics can provide accurate and very timely insight into the state of the economy. As discussed above, the work of Orphanides (2003a, b) shows that, in the case of the US, reliance on a single paradigm in the event of data uncertainty regarding the output gap might lead to an inappropriate monetary policy stance.
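The underlying long-run relationship can be summarised by the familiar quantity-theory identity, stated here purely for illustration: Δm + Δv = Δp + Δy, i.e. money growth plus the change in velocity equals inflation plus real output growth. For a given trend in velocity and in potential output growth, sustained money growth in excess of that sum must, over the long run, show up as inflation.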
Careful monitoring of money and credit aggregates can help to identify risks to the economy which are not typically incorporated in standard models. For example, Detken and Smets (2004) argued that the analysis of monetary and credit developments can provide a forward-looking indicator of the build-up of asset price bubbles, the subsequent correction of which can lead to high-cost recessions. Monetary analysis is thus crucial for the ECB in maintaining price and macroeconomic stability over a horizon which also implicitly addresses concerns about financial stability (see Issing et al., 2005).
Monetary analysis at the ECB therefore plays a role in extracting the signals contained in monetary and credit statistics. This requires a deep and comprehensive analysis, which we have been developing and extending since the start of monetary union.
I would like to turn now to a final key aspect of our monetary policy, namely the communication of the European Central Bank. Well-oriented communication is crucial for maintaining credibility and thus anchoring inflation expectations. In the euro area, there are various sources that seek to measure inflation expectations, among them the ECB Survey of Professional Forecasters and indicators derived from index-linked financial instruments. By comparing these indicators with the announced inflation objective, it is possible to assess the credibility of the ECB. The conclusion of such an exercise is that, throughout its existence, the ECB has been able to anchor inflation expectations remarkably well, thus demonstrating a high level of credibility. Transparent communication and consistent decision-making also enhance predictability, both in the short run (i.e. interest rate decisions) and, most importantly, in the medium to long run (i.e. the ability to understand the central bank's reaction function). The high level of predictability of the ECB (ECB, 2006) has contributed to reducing uncertainty about interest rates, thus contributing to the efficiency of market allocation. In addition, it has helped to guide price and wage-setting behaviour in line with the central bank's objective.
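Returning for a moment to the market-based indicators of inflation expectations just mentioned, by way of illustration: one such indicator is the so-called break-even inflation rate, obtained approximately as the difference between the yield on a conventional nominal bond and that on an index-linked bond of comparable maturity. Since it also embodies various risk premia, it should be read as an indicator of, rather than a precise measure of, expected inflation.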
The importance and complementarities of all these features of our strategy, in a context of high uncertainty in the economic environment, have been evident in the monetary policy decisions of the ECB from its inception.
In early 2001, for example, the ECB started an easing cycle. At the time, on the heels of significant adverse supply shocks and rather strong wage dynamics, inflation rates were at high levels. However, we equally saw an upcoming worsening of the outlook for economic activity, as indicated by the rapid decline in several confidence measures. This lowered the risks to inflation coming from wages, and inflationary pressures more generally. While we assessed that the information coming from monetary trends was consistent with price stability over the medium term, we took the view that the upward inflationary pressures were of a temporary nature and warranted looking through the shocks that had caused them to emerge. We held the view that a protracted period of weak economic activity, and the increasing odds that the recovery would not materialise soon, would eventually facilitate a downward adjustment in price and wage-setting behaviour. The easing cycle, which we readily initiated, registered a cumulative decline in the policy rate of 275 basis points over two years, bringing the policy rate down to 2%. There is no doubt that the Governing Council used appropriate judgement in attaching a high weight to the decrease in inflationary pressures stemming from the downside risks to economic activity. And there is also no doubt that the resolute action of the ECB, based on its own analysis of inflation, not only helped to stabilise inflation and inflation expectations, but also helped to forestall a much deeper slowdown in the economy.
Between 2002 and 2004, when the forecasts of all major institutions, including the central scenarios of the ECB/Eurosystem staff projections, pointed to accelerating economic growth in the euro area over the following years, a mechanical policy response would have induced the start of a tightening cycle. However, this did not occur, because the downside risks to the recovery were heavily taken into account in the decision-making process. Ex post, this turned out to be the right decision. An early tightening of our monetary policy would probably have led to a policy reversal, which could have damaged our reputation. In parallel, our communication policy, explaining the decisions and emphasising our permanent commitment to the objective of price stability, helped to keep inflation expectations well anchored.
At the end of 2005, after two and a half years of maintaining rates at historically low levels, the Governing Council decided to increase the key ECB interest rates by 25 basis points. During the second half of 2005, as hard data and survey indicators pointed in the direction of a strengthening of economic activity, it was seen as likely that headline inflation rates, after rising to levels significantly in excess of 2%, could remain at or above 2% in 2006 and 2007, while monetary analysis continued to point towards upside risks to price stability over the medium term because of the ample liquidity accumulated in the economy. Under these circumstances, waiting for data to confirm the strengthening of economic activity and for the risks to price stability to materialise would have damaged our credibility and might have required a stronger tightening in the medium term. On this occasion, our communication strategy needed to be accompanied by policy action in order to keep medium to long-term inflation expectations in the euro area solidly anchored at levels consistent with price stability.
The December decision did not precommit us to a series of interest rate increases, but confirmed the ECB's readiness to act as needed. Close monitoring of the risks to price stability on the basis of new information would guide future decisions. Within this framework, on 2 March 2006 we decided again to increase the key ECB rates by 25 basis points. On the basis of available information, inflation is likely to remain above 2% in the short run. Looking further ahead, changes in taxes and oil prices are expected to affect inflation significantly in 2006 and 2007, while there is also likely to be an upward impact from the indirect effects of past oil price increases, in the context of the stronger growth expected over the coming quarters. These elements, together with the risks of further increases in oil prices and of stronger wage and price developments due to second-round effects, warranted our decision. In addition, strong monetary and credit growth in an environment of ample liquidity continues to point to risks to price stability.
Gradualism on both sides of the Atlantic
Is the ECB’s approach to monetary policy very different from that applied by other central banks? Has it produced very different results? As regards these two questions, I would like to tackle, in particular, the debate on the activism or gradualism of the ECB’s monetary policy as compared with that of the US Federal Reserve System. As noted in Sahuc and Smets (2006), the ECB is often criticised for conducting an acyclical monetary policy compared with the US. The gradualist strategy of the ECB is thus called into question when compared with that of the US Federal Reserve System. Put simply, the accusation is that, because of an excessive degree of gradualism, the ECB’s monetary policy could be seen as “too little, too late”. This conclusion is often derived from a comparison of the historical paths of interest rates in the US and the euro area, with the latter being much smoother than the former. Is this critique justified?
As regards the apparent excess gradualism of the ECB, Sahuc and Smets (2006), using a medium-scale macroeconomic model estimated for both the US and the euro area, reach the following conclusion:
The degree of interest rate smoothing, which can be taken as an indicator of monetary policy gradualism, is quite similar in the euro area and the US. Therefore, the differences in policy behaviour cannot by themselves explain the differences in interest rate paths.
Why, then, are the interest rate paths so different across the two areas? Relying on counterfactual experiments, Sahuc and Smets (2006) conclude that:
The role of shocks, rather than the structure of the economy (price stickiness, consumption or investment behaviour, etc.), is crucial in explaining this fact. They show that, while the US economy is structurally less rigid, with a lower degree of price stickiness, only the differences in the size and the nature of shocks can explain the differences in interest rate developments. Notably, most of the differences are due to demand shocks, which were larger in the United States.
One might then conclude from this study that the ECB is not too gradualist compared with the Federal Reserve System and that the ECB simply faces some different short-term disturbances.
Conclusion
Let me now conclude. Central bankers have to make decisions in a world of pervasive uncertainty. This uncertainty concerns our knowledge of such key aspects for monetary policy decision-making as the state and structure of the economy, the impact of monetary policy on the economy, and the interaction between private agents and policy-makers. Different responses to this uncertainty can be found in the economic literature, including the application of simple rules, the implementation of gradual policies, or even the recommendation to avoid policies that aim at fine-tuning economic activity.
I have tried to convey to you today the thinking behind the ECB’s response to this problem. The ECB’s monetary strategy provides one illustration of a commitment to a procedural framework, which may overcome some of the limitations and risks associated with an over-reliance on more narrowly defined rules. The strategy includes a clear commitment to the goal variable, i.e. the primary objective of price stability. Moreover, the two-pillar structure takes explicit account of the need for robustness in monetary policy-making and reduces the scope of discretion, as it makes it more difficult for policy-makers to disregard or gloss over contradictory evidence. In addition, we attach great importance to communication as a crucial element for maintaining credibility and thus anchoring inflation expectations.
All in all, the evolution of inflation expectations over these years attests that our medium-term strategy is understood and deemed credible by economic agents. This is an achievement that would hardly have been conceivable before EMU.
Coming back to Pablo de Olavide’s teaching, which I quoted at the beginning of this speech: judging from the experience to date, the euro has been an unquestionable success – euro area price stability being the best exponent of this. I hope I have been persuasive enough to convince you of this fact.
References
Angeloni, I., G. Coenen and F. Smets (2003), “Persistence, the Transmission Mechanism and Robust Monetary Policy”, Scottish Journal of Political Economy, November, Vol. 50(5), 527-549.
Brainard, W. (1967), “Uncertainty and the Effectiveness of Policy”, American Economic Review, Vol. 57, 411-425.
Detken, C. and F. Smets (2004), “Asset Price Booms and Monetary Policy”, ECB Working Paper, No 364.
European Central Bank (2006), “The Predictability of ECB Monetary Policy”, ECB Monthly Bulletin, 51-61.
Issing, O., V. Gaspar, O. Tristani and D. Vestin (2005), “Imperfect Knowledge and Monetary Policy”. The Stone Lectures in Economics. Cambridge University Press.
Orphanides, A. (2003a), “The Quest for Prosperity without Inflation”, Journal of Monetary Economics, Vol. 50, 633-663.
Orphanides, A. (2003b), “Monetary Policy Evaluation with Noisy Information”, Journal of Monetary Economics, Vol. 50, 605-631.
Sahuc, J.-G. and F. Smets (2006), “Differences in Interest Rate Policy at the ECB and the FED: An Investigation with a Medium-Scale DSGE Model”, mimeo, European Central Bank.
Söderström, U. (2002), “Monetary Policy with Uncertain Parameters”, Scandinavian Journal of Economics, Vol. 104, 125-45.
Taylor, J. (1993), “Discretion versus Policy Rules in Practice”, Carnegie-Rochester Conference Series on Public Policy 39, 195-214.
[1] The econometric literature attaches a different meaning to the term “structural model”: there, it denotes a model with identified shocks (in contrast to, say, an unrestricted vector autoregression model). In the economic sense used here, it is rather a model based on explicit micro-foundations with optimising agents.