Monday, August 21, 2017

New Argument for a Higher Inflation Target

On voxeu.org, Philippe Aghion, Antonin Bergeaud, Timo Boppart, Peter Klenow, and Huiyu Li discuss their recent work on the measurement of output and whether measurement bias can account for the measured slowdown in productivity growth. While the work speaks mostly to the productivity slowdown and secular stagnation, I was most interested in a corollary that ties it to the debate over the optimal level of the inflation target.

The authors note the high frequency of "creative destruction" in the US, which they define as when "products exit the market because they are eclipsed by a better product sold by a new producer." This presents a challenge for statistical offices trying to measure inflation:
The standard procedure in such cases is to assume that the quality-adjusted inflation rate is the same as for other items in the same category that the statistical office can follow over time, i.e. products that are not subject to creative destruction. However, creative destruction implies that the new products enter the market because they have a lower quality-adjusted price. Our study tries to quantify the bias that arises from relying on imputation to measure US productivity growth in cases of creative destruction.
They explain that this can lead to mismeasurement of TFP growth, which they quantify by examining changes in the share of incumbent products over time:
If the statistical office is correct to assume that the quality-adjusted inflation rate is the same for creatively destroyed products as for surviving incumbent products, then the market share of surviving incumbent products should stay constant over time. If instead the market share of these incumbent products shrinks systematically over time, then the surviving subset of products must have higher average inflation than creatively destroyed products. For a given elasticity of substitution between products, the more the market share shrinks for surviving products, the more the missing growth.
From 1983 to 2013, they estimate that "missing growth" averaged about 0.63% per year. This is substantial, but there is no clear time trend (i.e. there is not more missed growth in recent years), so it can't account for the measured productivity growth slowdown.
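To make the market-share logic concrete, here is a minimal back-of-the-envelope sketch. It assumes a CES demand system in which missing growth per year is roughly -1/(σ-1) times the log change in the surviving incumbents' market share; the function, the numbers, and the elasticity value are illustrative, not the authors' exact formula or estimates.

```python
import math

def missing_growth_per_year(share_start, share_end, years, sigma):
    """Illustrative missing-growth calculation based on the market-share
    logic quoted above: if surviving incumbents' market share shrinks,
    the inflation imputed to creatively destroyed products was too high,
    so measured productivity growth is too low.

    sigma is the elasticity of substitution between products.
    """
    # Log change in the surviving incumbents' market share over the window
    d_log_share = math.log(share_end) - math.log(share_start)
    # Under CES demand: missing growth per year ~ -(1/(sigma - 1)) * d_log_share / years
    return -d_log_share / ((sigma - 1) * years)

# Purely illustrative numbers: incumbents' market share falls from 100% to 80%
# over a decade, with an elasticity of substitution of 4.
print(missing_growth_per_year(1.00, 0.80, years=10, sigma=4))  # ~0.0074, i.e. ~0.74% per year
```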

The authors suggest that the Fed should consider adjusting its inflation target upwards to "get closer to achieving quality-adjusted price stability." A few months ago, 22 economists including Joseph Stiglitz and Narayana Kocherlakota wrote a letter urging the Fed to consider raising its inflation target, in which they stated:
Policymakers must be willing to rigorously assess the costs and benefits of previously-accepted policy parameters in response to economic changes. One of these key parameters that should be rigorously reassessed is the very low inflation targets that have guided monetary policy in recent decades. We believe that the Fed should appoint a diverse and representative blue ribbon commission with expertise, integrity, and transparency to evaluate and expeditiously recommend a path forward on these questions.
The letter did not mention this measurement bias rationale for a higher target, but the blue ribbon commission they propose should take it into consideration.

Friday, August 18, 2017

The Low Misery Dilemma

The other day, Tim Duy tweeted:

It took me a moment--and I'd guess I'm not alone--to even recognize how remarkable this is. The New York Times ran an article with the headline "Fed Officials Confront New Reality: Low Inflation and Low Unemployment." Confront, not embrace, not celebrate.

The misery index is the sum of unemployment and inflation. Arthur Okun proposed it in the 1960s as a crude gauge of the economy, based on the fact that high inflation and high unemployment are both miserable (so high values of the index are bad). The misery index was pretty low in the 60s, in the 6% to 8% range, similar to where it has been since around 2014. Now it is around 6%. Great, right?
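For concreteness, the index is just the sum of the two rates. Plugging in roughly mid-2017 values (an unemployment rate around 4.3% and year-over-year inflation around 1.7%, both approximate):

$$\text{misery index} = u + \pi \approx 4.3\% + 1.7\% = 6.0\%$$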

The NYT article notes that we are in an opposite situation to the stagflation of the 1970s and early 80s, when both high inflation and high unemployment were concerns. The misery index reached a high of 21% in 1980. (The unemployment data is only available since 1948).

Very high inflation and high unemployment are each troubling in their own right for the social welfare costs they impose (costs that are more obvious in the case of unemployment). But observed together, they also troubled economists because they seemed to run contrary to the Phillips curve-based models of the time. The tradeoff between inflation and unemployment wasn't what economists and policymakers had believed, and their misunderstanding probably contributed to the misery.

Though economic theory has evolved, the basic Phillips curve tradeoff idea is still an important part of central bankers' models. By models, I mean both the formal quantitative models used by their staffs and the way they think about how the world works. General idea: if the economy is above full employment, that should put upward pressure on wages, which should put upward pressure on prices.
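In its simplest textbook form (a stylized expectations-augmented Phillips curve, not the Fed staff's actual specification), that chain of logic looks something like

$$\pi_t = \mathbb{E}_t[\pi_{t+1}] + \kappa\,(u^* - u_t) + \varepsilon_t, \qquad \kappa > 0,$$

so unemployment below its natural rate $u^*$ should show up as inflation running above expectations.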

So low unemployment combined with low inflation seems like a nice problem to have. But if this is indeed a new reality (that is, something that will last), then there is something amiss in that chain of logic. Maybe we are not at full employment, because the natural rate of unemployment is a lot lower than we thought, or because we are looking at the wrong labor market indicators. Maybe full employment does not put upward pressure on wages, for some reason, or maybe we are looking at the wrong wage measures. For example, San Francisco Fed researchers argue that wage growth measures should be adjusted in light of retiring Baby Boomers. Or maybe the link between wage inflation and price inflation has weakened.

Until policymakers feel confident that they understand why we are experiencing both low inflation and low unemployment, they can't simply embrace the low misery. It is natural for them to worry that they are missing something, and that the consequences of whatever that something is could be disastrous. The question is what to do in the meantime.

There are two camps for Fed policy. One camp favors a wait-and-see approach: hold rates steady until we actually observe inflation rising above 2%, and maybe even let it stay above 2% for a while, to make up for the lengthy period of below-2% inflation. The other camp favors raising rates preemptively, just in case we are missing some sign that inflation is about to spiral out of control. The latter possibility strikes me as unlikely, but I'm admittedly oversimplifying the concerns, and I also haven't personally experienced high inflation.


Thursday, August 10, 2017

Macro in the Econ Major and at Liberal Arts Colleges

Last week, I attended the 13th annual Conference of Macroeconomists from Liberal Arts Colleges, hosted this year by Davidson College. I also attended the conference two years ago at Union College. I can't recommend this conference strongly enough!

The conference is a response to the increasing expectation of high-quality research at many liberal arts colleges. Many of us are the only macroeconomist at our college and can't regularly attend macro seminars, so the conference is a much-needed opportunity to receive feedback on work in progress. (The paper I presented last time just came out in the Journal of Monetary Economics!)
This time, I presented "Inflation Expectations and the Price at the Pump" and discussed Erin Wolcott's paper, "Impact of Foreign Official Purchases of U.S. Treasuries on the Yield Curve."

There was a wide range of interesting work. For example, Gina Pieters presented “Bitcoin Reveals Unofficial Exchange Rates and Detects Capital Controls.” M. Saif Mehkari's work on “Repatriation Taxes” is highly relevant to today's policy discussions. Most of the presenters and attendees were junior faculty members, but three more senior scholars held a panel discussion at dinner. Next year, the conference will be held at Wake Forest.

I also attended a session on "Macro in the Econ Major" led by PJ Glandon. A link to his slides is here. One slide presented the image below, prompting an interesting discussion about whether and how we should tailor what is taught in macro courses to our perception of the students' interests and career goals.





Monday, August 7, 2017

Labor Market Conditions Index Discontinued

A few years ago, I blogged about the Fed's new Labor Market Conditions Index (LMCI). The index attempted to summarize the state of the labor market using a statistical technique that extracts the primary common variation from 19 labor market indicators. I was skeptical about the usefulness of the LMCI for a few reasons, and as it turns out, the LMCI was discontinued as of August 3.
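To illustrate the general approach: the actual LMCI came from a dynamic factor model estimated by Fed staff, but the flavor of "capture the primary common variation" can be sketched by pulling the first principal component out of a standardized panel of indicators. The file and column names below are hypothetical placeholders, not the Fed's data.

```python
import numpy as np
import pandas as pd

# Hypothetical monthly panel: rows are dates, columns are labor market
# indicators (unemployment rate, quits rate, payroll growth, ...).
panel = pd.read_csv("labor_market_indicators.csv", index_col="date", parse_dates=True)

# Standardize each indicator so that no single series dominates the factor.
z = (panel - panel.mean()) / panel.std()

# The first principal component is the single common factor explaining the
# largest share of the joint variation across the indicators.
u, s, vt = np.linalg.svd(z.values, full_matrices=False)
common_factor = pd.Series(u[:, 0] * s[0], index=z.index, name="common_factor")

# Share of total variance captured by that first component.
explained = s[0] ** 2 / (s ** 2).sum()
print(f"First component explains {explained:.1%} of the variation")
```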

The discontinuation is newsworthy because the LMCI was cited in policy discussions at the Fed, even by Janet Yellen. The index became high-profile enough that I was even interviewed about it on NPR's Marketplace.

One issue that I noted with the index in my blog was the following:
A minor quibble with the index is its inclusion of wages in the list of indicators. This introduces endogeneity that makes it unsuitable for use in Phillips Curve-type estimations of the relationship between labor market conditions and wages or inflation. In other words, we can't attempt to estimate how wages depend on labor market tightness if our measure of labor market tightness already depends on wages by construction.
This corresponds to one of the reasons provided for discontinuing the index: "including average hourly earnings as an indicator did not provide a meaningful link between labor market conditions and wage growth."

The other reasons provided for discontinuation are that "model estimates turned out to be more sensitive to the detrending procedure than we had expected" and "the measurement of some indicators in recent years has changed in ways that significantly degraded their signal content."

I also noted in my blog post and on NPR that the index is almost perfectly correlated with the unemployment rate, meaning it provides very little additional information about labor market conditions. (Or, interpreted differently, the unemployment rate provides a lot of information about labor market conditions.) The development of the LMCI was part of a worthy effort to build alternative, informative measures of labor market conditions that can help policymakers gauge where we are relative to full employment and predict what is likely to happen to prices and wages. Since resources and attention are limited, I think it is wise to direct them toward developing and evaluating other measures.
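For what it's worth, the near-perfect correlation with the unemployment rate is easy to check with the published series. A minimal sketch, assuming local copies of the two monthly series (the file and column names are placeholders; UNRATE is the unemployment rate's standard FRED code):

```python
import pandas as pd

# Placeholder local copies of the two monthly series; in practice one could
# pull the unemployment rate (UNRATE) from FRED and the archived LMCI from
# the Fed's website.
lmci = pd.read_csv("lmci.csv", index_col="date", parse_dates=True)["lmci"]
unrate = pd.read_csv("unrate.csv", index_col="date", parse_dates=True)["unrate"]

df = pd.concat([lmci, unrate], axis=1).dropna()

# A correlation near +/-1 would mean the LMCI adds little information
# beyond what the unemployment rate already provides.
print(df.corr())
```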