Must-Read: Tim Duy: Is Pushing Unemployment Lower A Risky Strategy?

Must-Read: Tim Duy thinks, I believe correctly, that the Fed is confusing its own past policy errors with economic laws:

Tim Duy: Is Pushing Unemployment Lower A Risky Strategy?:

Fed hawks are pushing for a rate hike sooner than later in an effort to prevent the economy from “overheating”…

…argued to set the stage for the next recession…. John Williams:

History teaches us that an economy that runs too hot for too long can generate imbalances, potentially leading to excessive inflation, asset market bubbles, and ultimately economic correction and recession. A gradual process of raising rates reduces the risks of such an outcome…. If we wait too long to remove monetary accommodation, we hazard allowing imbalances to grow, requiring us to play catch-up, and not leaving much room to maneuver. Not to mention, a sudden reversal of policy could be disruptive and slow the economy in unintended ways….

William Dudley….

A particular risk of late and fast is that the unemployment rate could significantly undershoot the level consistent with price stability. If this occurred, then inflation would likely rise above our objective. At that point, history shows it is very difficult to push the unemployment rate back up just a little bit in order to contain inflation pressures. Looking at the post-war period, whenever the unemployment rate has increased by more than 0.3 to 0.4 percentage points, the economy has always ended up in a full-blown recession…. This is an outcome to avoid….

I don’t know that there is a law of economics where the unemployment rate can never be nudged up a few fractions of a percentage point. But I do think there is a policy mechanism…. The Fed tends to overemphasize the importance of lagging data such as inflation and wages and discount the lags in their own policy process. Essentially, the Fed ignores the warning signs of recession, ultimately overtightening…. For instance, an inverted yield curve traditionally indicates substantially tight monetary conditions. Yet even after the yield curve inverted at the end of January 2000, the Fed continued tightening through May of that year, adding an additional 100bp to the fed funds rate. The yield curve began to invert in January of 2006; the Fed added another 100bp of tightening in the first half of that year. This isn’t an economic mechanism…. This is a policy error….

Bottom Line: The Fed thinks the costs of undershooting their estimate of the natural rate of unemployment outweigh the benefits. I am skeptical they are doing the calculus right on this one. I would be more convinced they had it right if I sensed that they placed greater weight on the possibility that they are too pessimistic about the natural rate. I would be more convinced if they were already at their inflation target. And I would be more convinced if their analysis of why tightening cycles end in recessions was a bit more introspective. Was it destiny or repeated policy error? But none of these things seem to be true.
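(For readers who want to check Duy’s timing claims mechanically, here is a minimal, illustrative Python sketch, not from Duy’s post, that flags the month a term spread first turns negative in each episode; the spread values are made-up stand-ins rather than actual Treasury data.)

```python
# Illustrative sketch: flag the start of yield-curve inversion episodes.
# The spread values below are made-up stand-ins, not actual Treasury data.

from typing import List, Tuple


def inversion_starts(spread: List[Tuple[str, float]]) -> List[str]:
    """Return the dates on which the term spread first crosses below zero."""
    starts = []
    previously_inverted = False
    for date, value in spread:
        inverted = value < 0.0
        if inverted and not previously_inverted:
            starts.append(date)
        previously_inverted = inverted
    return starts


# Hypothetical monthly 10-year-minus-2-year spreads, in percentage points.
example = [
    ("1999-11", 0.30), ("1999-12", 0.15),
    ("2000-01", -0.05), ("2000-02", -0.20),  # inversion begins
    ("2000-03", -0.35),
]

print(inversion_starts(example))  # ['2000-01']
```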

Must-Reads: September 7, 2016


Must-Read: Lawrence Summers: The Fed’s Complacency About Its Current Toolbox Is Unwarranted

Must-Read: Larry Summers is right: forward guidance and large-scale QE are unlikely to be powerful enough tools for the Fed to deal with the next recession. This is especially true given the Fed’s current policy posture. Large-scale QE is, I believe, primarily useful as a signal of forward guidance. And the Federal Reserve’s current eagerness to tighten monetary policy without any visible signals of an overheating, high-pressure economy is greatly undermining its ability to credibly engage in forward guidance in the future:

Lawrence Summers: The Fed’s Complacency About Its Current Toolbox Is Unwarranted:

I was disappointed in what came out of Jackson Hole for three reasons…

The Fed should have signaled a desire to exceed its two percent inflation target during periods of protracted recovery and low unemployment…. Even apart from the desirability of allowing inflation to rise above two percent in a happy economic scenario, GDP, labor market, and inflation expectations data all make a compelling case against a rate increase….

My second reason for disappointment… was that Chair Yellen… was too complacent to conclude that:

even if average interest rates remain lower than in the past, I believe that monetary policy will, under most conditions, be able to respond effectively.

This statement may rank with Ben Bernanke’s unfortunate observation that subprime problems would be easily contained. Rather I believe that countering the next recession is the major monetary policy challenge before the Fed…. It is more than 50 percent likely that we will have a recession in the next 3 years. Countering recessions requires 400 or 500 basis points of monetary easing. We are very unlikely to have anything like that much room for easing when the next recession comes.

Chair Yellen, relying… on… David Reifschneider using the FRBUS model, comes to the relatively serene conclusion that by using forward guidance and QE… the Fed will likely be able to respond adequately to the next recession with its existing tool kit. I think this conclusion is unlikely to be right…. There is an important methodological point here: distrust conclusions reached primarily on the basis of model results. Models are estimated or parameterized on the basis of historical data. They can be expected to go wrong whenever the world changes in important ways. Alan Greenspan was importantly right when he ignored models and maintained easy policy in the mid-1990s because of other, more anecdotal evidence that convinced him that productivity growth had accelerated. I believe a similar skeptical attitude towards model results is appropriate today….

I wonder what credibility Fed forward guidance is likely to have given the utter disconnect over many years between Fed and market views regarding future rates and the track record so far of the Fed being wrong and the market being right…. Even if unconventional policy could be highly efficacious in moving long term rates and even if QE induced moves in long rates were potent, there is the question of how much room there is to bring down long rates. Reifschneider… shows that with a big recession rates would likely approach -6 percent, or even -9 percent, but for the zero lower bound. I find the idea that forward guidance and QE could do anything like the work of 600, let alone 900, basis points of rate cutting close to absurd…

Brookings Productivity Festival on Friday

[Figure: Real Gross Domestic Product, FRED, St. Louis Fed]

The current discussion of “slow growth in measured productivity” here in the U.S. seems to suffer from a great deal of confusion. From my perspective, there are six things going on:

  1. Since the 1920s, the rise of non-Smithian information goods…
  2. Since 1973, the productivity slowdown…
  3. Since 1995, the semiconductor-driven infotech speedup…
  4. Since 2004, Moore’s Law hitting the wall…
  5. Since 2008, what we will soon be calling “The Longer Depression”…
  6. And, remember, policy changes to speed productivity growth may well be nearly orthogonal to all of the above save (5)…

To talk about the cause of “slow growth in measured productivity” as if it were one thing rather than five causes confusion. To identify one or a small number of causes of a single thing that is “slow growth in measured productivity” causes great confusion. And then to insist that the best policy move is to undo that one or small number of things causes even greater confusion…

The productivity puzzle: How can we speed up the growth of the economy? Friday, September 9, 2016, 9:30 – 11:00 am, Falk Auditorium: The Brookings Institution:

After nearly a decade of strong productivity growth starting in the mid-1990s, productivity growth has slowed down over the most recent decade. Output per hour worked in the U.S. business sector has grown at only 1.3 percent per year from 2004 to 2015, and growth was even slower from 2010 to 2015 at just 0.5 percent a year. These rates are only half or less of the pace of growth achieved in the past.
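(As a reminder of the arithmetic behind figures like these, here is a minimal sketch of how an average annual growth rate is computed from index levels; the index values used are hypothetical placeholders, not BLS data.)

```python
# Compound average annual growth rate of output per hour between two years.
# The index levels here are hypothetical placeholders, not BLS figures.

def avg_annual_growth(start_level: float, end_level: float, years: int) -> float:
    """Return the average annual growth rate, in percent."""
    return ((end_level / start_level) ** (1.0 / years) - 1.0) * 100.0

# Example: an index rising from 100.0 to 115.3 over 11 years
# implies roughly 1.3 percent growth per year.
print(round(avg_annual_growth(100.0, 115.3, 11), 2))
```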

The United States is not alone in facing this problem, as all of the major advanced economies have also seen slow productivity growth. This slow growth has been a major cause of weak overall GDP growth, stagnation in real wages and household incomes, and it strongly impacts government revenues and the deficit.

On September 9, 2016 the Initiative on Business and Public Policy and the Hutchins Center on Fiscal and Monetary Policy at Brookings will host a forum on the policy implications of the growth slowdown. Senior Fellow Martin Baily will present an overview paper on the causes of the slowdown, followed by a panel discussion on the most effective policies to enhance productivity performance. After the panel discussion, panelists will take questions from the audience. The event will be webcast live.

Join the conversation on Twitter at #Productivity

Welcome: Louise Sheiner

Paper: Martin Baily

Panel: Moderator: David Wessel

  • Jonathan Baker
  • Robert Barro
  • J. Bradford DeLong
  • Bronwyn Hall

Inequality of income, wealth, or consumption? How about all three?

[Photo: A local resident drives a golf cart from his house to his golf club as a group of landscape workers take a break in Vista, Calif.]

When social scientists, policymakers, and pundits talk about inequality, it’s important to specify what kind of inequality they are actually talking about. Most of them will be talking about income inequality, though others might be talking about wealth and occasionally an economist might interject to mention inequality of consumption. All of these inequality measures are important, but economic analyses tend to focus on one measure at a time due to data constraints. Yet data on income, wealth, and consumption can be compared and combined to give policymakers a better picture of economic inequality writ large. Two papers, funded in part by Equitable Growth and released today as working papers, provide just that kind of multidimensional view of inequality.

The two papers are part of an effort to see how the distributions of income, wealth, and consumption have shifted over the years. The first one is by Jonathan Fisher of Stanford University, David Johnson of the University of Michigan, Jonathan Latner of the University of Bremen, Timothy Smeeding of the University of Wisconsin, and Jeffrey Thompson of the Federal Reserve Board of Governors. The second one is by Fisher, Johnson, Smeeding and Thompson.

The first paper uses data from the Panel Study of Income Dynamics, a longitudinal study that allows researchers to track changes in the income, wealth, and consumption of households over time. It looks not only at inequality but at mobility as well. When it comes to inequality, the researchers find that inequality in all three variables has increased since 1999. The correlation between income, consumption, and wealth is high, but it is not perfect: people with high incomes are very likely to have high levels of consumption and wealth, but not always. There is also movement within a lifetime along these measures, but as the graph below shows, there is less relative movement for those at the top and the bottom of these distributions.

[Figure: Smeeding et al., income and wealth mobility. Whether it's wealth or income, those at the top and the bottom are very likely to stay there: the chance that an individual starting in a given quintile ends up in each quintile later in life.]
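(A quintile-to-quintile transition matrix of the kind shown in the figure can be built directly from panel data. Below is a minimal sketch using randomly generated placeholder data rather than the PSID; the variable and function names are illustrative, not the authors’.)

```python
# Minimal sketch of a quintile-to-quintile transition matrix.
# The data are randomly generated placeholders, not the PSID sample.

import numpy as np

rng = np.random.default_rng(0)
n = 10_000
income_start = rng.lognormal(mean=10.0, sigma=0.8, size=n)
# Later-life income: correlated with starting income, plus noise.
income_later = income_start * rng.lognormal(mean=0.0, sigma=0.5, size=n)

def quintile(x: np.ndarray) -> np.ndarray:
    """Assign each observation to a quintile (0 = bottom, 4 = top)."""
    return np.searchsorted(np.quantile(x, [0.2, 0.4, 0.6, 0.8]), x)

start_q = quintile(income_start)
later_q = quintile(income_later)

# transition[i, j] = share of those starting in quintile i who end in quintile j.
transition = np.zeros((5, 5))
for i in range(5):
    mask = start_q == i
    for j in range(5):
        transition[i, j] = np.mean(later_q[mask] == j)

print(np.round(transition, 2))  # rows sum to 1; persistence shows up on the diagonal
```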

The second paper also uses Panel Study of Income Dynamics data but supplements it with data from the Federal Reserve’s Survey of Consumer Finances. Like the first paper, this effort also finds that inequality in income, wealth, and consumption has increased. But in this paper, the economists also look at inequality in what they call “two and three dimensions,” which analyzes the ways in which these different kinds of inequality interact with each other.

Think of their analysis this way: Inequality often gets measured by the share of income or consumption or wealth held by a fraction of the population such as the top 1 percent, the top 5 percent, the bottom 20 percent, and so on. The authors do that analysis and refer to it as inequality in one dimension. They then look at the “cross-shares” of inequality. For instance, they look at the share of consumption by households in the top 5 percent of the income distribution and compare that to the share of consumption by those in the top 5 percent of the consumption distribution. If the share of consumption by top earners has increased faster than the share for top consumers, then having a high income is now more strongly correlated with having high consumption.
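To make the “cross-share” idea concrete, here is a minimal sketch on made-up data: it compares the consumption share held by the top 5 percent of consumers with the consumption share held by the top 5 percent of earners. The data and function names are illustrative, not the authors’ code.

```python
# Illustrative "cross-share" calculation on made-up data (not the authors' code).

import numpy as np

rng = np.random.default_rng(1)
n = 50_000
income = rng.lognormal(mean=10.5, sigma=0.9, size=n)
# Consumption tracks income imperfectly.
consumption = income ** 0.8 * rng.lognormal(mean=0.0, sigma=0.4, size=n)

def share_held_by_top(values: np.ndarray, rank_by: np.ndarray, top: float = 0.05) -> float:
    """Share of `values` held by the top `top` fraction of the `rank_by` distribution."""
    cutoff = np.quantile(rank_by, 1.0 - top)
    return values[rank_by >= cutoff].sum() / values.sum()

# One-dimensional share: consumption held by the top 5 percent of consumers.
own_share = share_held_by_top(consumption, rank_by=consumption)
# Cross-share: consumption held by the top 5 percent of the income distribution.
cross_share = share_held_by_top(consumption, rank_by=income)

print(f"top-5% consumers' consumption share: {own_share:.3f}")
print(f"top-5% earners' consumption share:   {cross_share:.3f}")
# If the cross-share rises toward the own-share over time, high income and
# high consumption are becoming more tightly linked.
```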

After conducting an analysis like this for all three measures, and in three dimensions as well, the economists find that multidimensional inequality has increased faster than inequality in just one dimension. Those with high incomes are now more likely to have high levels of consumption and wealth, and those with low incomes are more likely to have low consumption and wealth.

The authors note the important role of wealth in helping to ride out income shocks and smooth consumption. This might lead policymakers and social scientists alike to think a bit more about the role of wealth inequality in the economy relative to inequality of income or consumption. But of course, researchers should resist the urge to make any one inequality measure rule them all, as the rest of the paper shows. A more detailed picture of multidimensional inequality would be quite useful.

Must-Read: Paul Krugman: Thinking About Brexit, Fast and Slow

Must-Read: Let me, for one, say that I am surprised that it now appears likely that there will be no Brexit recession in Britain: I would not have thought that a 15% decline in the value of the pound would have been enough to offset the negative shock to domestic investment delivered by the Brexit vote:

[Figure: U.S. / U.K. Foreign Exchange Rate, FRED, St. Louis Fed]

I thought that it would have required emergency expansionary monetary policy measures the Bank of England was not willing to undertake…

Once again: the rules: (1) Paul Krugman is right. (2) If you think Paul is wrong, see (1)…

Paul Krugman: Thinking About Brexit, Fast and Slow:

“The City’s smartest people are being forced to admit they were wrong about a ‘Brecession’”…

So says Business Insider, now that good UK PMI surveys have caused Credit Suisse and Morgan Stanley to back off their forecasts of a Brexit-induced recession.

But I wasn’t wrong. Yay me!

OK, seriously, at least for the moment it seems as if my skepticism about dire short-run forecasts, despite my agreement about the long-run costs, has been vindicated:

Economists have very good reasons to believe that Brexit will do bad things in the long run, but are strongly tempted to sex up their arguments by making very dubious claims about the short run. And the fact that so many respectable people are making these dubious claims makes them seem well-reasoned when they aren’t.

I could, of course, still turn out to be wrong. But let me say that what I’m really enjoying here — aside from the chance to claim that I was right — is, for once, having an argument with smart people who are trying to get it right. So much of my time these days is spent combatting sheer derp that it’s almost like a vacation to debate propositions that aren’t self-evidently stupid.

The Great Recession left struggling Detroiters even worse off

[Photo: The Detroit skyline is seen under the Ambassador Bridge.]

From skyrocketing unemployment rates to shrinking household expenditures, the fallout of the Great Recession of 2007-2009 rattled the U.S. economy. A new working paper released today by the Washington Center for Equitable Growth quantifies just how the Great Recession affected a group of already vulnerable people—low- and moderate-income households in the Detroit metropolitan area.

This new research by Equitable Growth grantee and University of Michigan professor of law Michael Barr and University of Michigan graduate student of economics Daniel Schaffa uses two different local surveys to measure the effects of the recession. For pre-recession data, they turn to the Detroit Area Household Financial Services Study, which exclusively focused on low- and moderate-income families between 2005 and 2006. For post-recession outcomes, the authors cite the 2009-2010 Michigan Recession and Recovery Study, a survey designed to assess comprehensively the financial situations of Detroit-area households, using some of the same questions asked in the earlier Detroit Area Household Financial Services Study.

When matched, these targeted surveys provide a more nuanced characterization of pre- and post-recession employment, household income, housing, and financial health than other frequently used datasets such as the Survey of Consumer Finances, the Panel Study of Income Dynamics, or the RAND American Life Panel. Ultimately, the two local surveys helped Barr and Schaffa calculate the differences in pre- and post-recession levels on a number of outcomes for households in Census block groups with less than 80 percent of the Detroit metro area median income.

Barr and Schaffa find, for example, that after the Great Recession the median duration of unemployment for low- and moderate-income households increased from 5.2 months to 9.5 months. Meanwhile, the employment rate—the share of the working-age population with a job—dropped by 9.0 percentage points. Median household income deteriorated, falling from $24,000 pre-recession to $19,000 post-recession. Income for African American households declined even more dramatically, with average household income falling by more than $7,000 over the period.

Housing statistics were equally dismal. Both homeownership rates and home values fell. The median home value saw a $50,000 reduction.

The paper provides data on a range of other household outcomes, including the change in rates of home foreclosures, mortgage payments, health care access, payday lending, overdraft usage, and even marriage—all broken down by gender, race, and educational attainment. These measures all tell the same story: For low- to moderate-income families in Detroit, those who arguably started out the worst off, the Great Recession had pernicious effects. Though these findings are consistent with what we might expect, Barr and Schaffa show just how devastating the Great Recession was for already disadvantaged households.

So what do these results mean for policy? Barr and Schaffa propose designing policies to help shield these vulnerable families against the effects of future recessions. Strengthening safety net programs and improving access to financial services can help ensure those who are already struggling to make ends meet don’t lose even more.

Must-Read: Claudia Sahm: Telling Macro Stories with Micro

Must-Read: Claudia Sahm: Telling Macro Stories with Micro:

Economists are avid storytellers….

A good story paper in economics, according to David Romer, has three characteristics: a viewpoint, a lever, and a result…. Blog or media coverage… focuses on the result…. Economists… spend more time on the lever, the how-did-they-get-the-result part…. The viewpoint matters… but it usually holds across many papers.

Best to focus on the new stuff. Except when the viewpoint comes under scrutiny, then the stories can really change…. One long-standing viewpoint in economics is that changes in the macro-economy can largely be understood by studying changes in macro aggregates. Ironically, this viewpoint even survived macro’s push to micro foundations with a “representative agent” stepping in as the missing link between aggregate data and micro theory…. An ever-growing body of research and commentary is helping to identify times when differences at the micro level are relevant for macro outcomes….

In the past nine years, we have seen models that condition on aggregate measures of income, wealth, interest rates, sentiment, and credit conditions do a pretty good job explaining the changes in aggregate consumer spending…. Adding micro heterogeneity to macro models is one in a long list of possible improvements. Adding a more realistic financial sector, exploring non-linearities, relaxing rational expectations, and extracting a better signal from noisy aggregate data are all in the queue too…. I suspect the Representative Agent is not getting voted off macro island any time soon….

Economics is not supposed to be about economists, but sometimes our stories can feel that way, especially to non-economists. And to be fair, the viewpoints that economists bring to their work do have an impact on the results, if nothing else by what we choose to study…

Taking a look at unpredictable schedules

The first Labor Day celebration took place on September 5, 1882, in New York City, when thousands of workers marched and took an unauthorized “workingman’s holiday” to protest the unlimited hours many workers of the era were compelled to put in. While overwork is still a problem for many Americans, a growing number are grappling with unpredictable, constantly shifting schedules, as detailed in a new issue brief released today from Equitable Growth.

Unpredictable scheduling is often aided by “just-in-time” scheduling software, which allows employers to generate schedules based on predicted consumer demand, accounting for factors such as time of day, weather, the season, or even a nearby sporting event. But these gains come at the expense of workers, many of whom see their schedules shift from day to day and are given little advance notice of when they are supposed to show up to work.
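To see the logic at work, here is a rough, hypothetical sketch of how a demand forecast gets translated into an hourly headcount; it illustrates the general approach, not any vendor’s actual algorithm.

```python
# Hypothetical sketch of "just-in-time" staffing logic: derive an hourly headcount
# from a demand forecast. Not any vendor's actual algorithm.

import math
from typing import List

def staff_needed(forecast_customers: List[float],
                 customers_per_worker: float = 12.0,
                 minimum_staff: int = 1) -> List[int]:
    """Translate an hourly customer forecast into an hourly headcount."""
    return [max(minimum_staff, math.ceil(c / customers_per_worker))
            for c in forecast_customers]

# Made-up forecast for a single day, open 10:00-18:00.
forecast = [18, 25, 60, 75, 40, 30, 55, 20]
print(staff_needed(forecast))  # [2, 3, 5, 7, 4, 3, 5, 2]
```

Because the headcount tracks the forecast hour by hour, workers’ schedules swing with it, which is exactly the instability the issue brief documents.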

These kinds of unstable schedules affect about 17 percent of the labor force, according to one study. They tend to be most common in the retail and service sectors, which are some of the fastest-growing industries and also ones in which workers already face a lack of benefits, poor working conditions, and insufficient pay. What’s more, employers frequently require full-time availability to ensure there are always enough workers during busy periods but then fail to provide stable or full-time hours.

Employees may spend time and money commuting to work for a scheduled shift only to be sent home without pay if business is slow. For those with children, that may mean they have to pay for childcare even if they themselves did not get paid. And within some industries, it is not uncommon for employers to require workers to remain on-call, keeping their schedules free on the chance that their employer may need them.

These practices perpetuate existing gender and racial inequalities, as workers of color—especially women of color—tend to be sorted into the low-paid positions most affected by these unstable scheduling practices (despite sharing similar levels of education and age with their white counterparts). The volatility in hours can mean that workers may not earn enough to make ends meet, which affects family well-being as well as economic demand. Yet workers may be unable to take a second job or pursue the kind of education necessary for upward mobility because of the constantly fluctuating hours. In fact, many employees—especially those with care responsibilities—are fired or forced to quit in order to search for a new job.

As we lay out in our issue brief, this has costs for firms. While reducing labor costs may slash expenditures in the short-term, treating workers’ time as just another variable in the cost equation undermines productivity and creates high turnover rates, both of which are costly. Unpredictable schedules also affect who can take jobs and how productive workers may be on the jobs, which also has an impact on business profits.

Despite the problems, federal law does not address unpredictable schedules. But in the face of mounting pressure from employees and policymakers, many private companies have begun to overhaul their work policies. Six major retailers, for example, ended on-call scheduling after New York Attorney General Eric Schneiderman launched an inquiry into their scheduling practices. Wal-Mart Stores Inc., one of the first adopters of just-in-time scheduling, has announced that it is testing a new system that gives employees the choice of fixed schedules for up to six months at a time. Technology companies also are helping workers take matters into their own hands, creating a variety of new smartphone apps that can help employees and employers provide more schedule flexibility.

Local governments also are beginning to pay attention to this issue. After San Francisco enacted the Retail Workers Bill of Rights in 2014, which restricts employers’ ability to impose unpredictable and last-minute schedules on their employees, 18 different states and municipalities introduced similar work-hour legislation in 2015. A bill was introduced last year on the federal level as well.

Of course, legislation alone cannot completely eliminate scheduling abuses by employers, but a federal standard would provide an important marker that states and localities could improve upon and motivate the private sector to invent new ways to balance labor costs and profits.