Miscalculating the wealth of the rich reveals unintended biases

In an ambitious effort, economists Philip Armour and Richard Burkhauser of Cornell University and Jeff Larrimore of the Joint Committee on Taxation attempt to produce an estimate of trends in inequality based on a definition of income more relevant to understanding economic wellbeing. Specifically, they try to estimate the so-called Haig-Simons metric, which defines annual income as consumption plus the change in net wealth. Using this definition of income, the authors claim inequality has not been rising over time, a finding that some strident conservative voices have latched on to despite running contrary to numerous other studies and measures.
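In symbols, the Haig-Simons definition amounts to the following (a standard textbook rendering; the notation here is mine, not the authors'):

```latex
Y^{HS}_{t} \;=\; C_{t} + \Delta W_{t} \;=\; C_{t} + \left( W_{t} - W_{t-1} \right)
```

where $C_t$ is a household's consumption during year $t$ and $W_t$ is its net worth (assets minus liabilities) at the end of year $t$. Much of what follows concerns the $\Delta W_t$ term.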

It is important to understand the Haig-Simons metric (named after early 20th century economists Robert Haig and Henry Simons), the methods Armour, Burkhauser, and Larrimore use to estimate income under this metric, and how they present the resulting data. By my assessment, their working paper does not constitute an informative addition to the inequality discussion. The Haig-Simons measure is interesting but conflates wealth and income. What’s more, their methodological choices bias the results to downplay relative income growth at the top, and the statistics they report are not sufficiently detailed to assess the implications of their findings.

Is the Haig-Simons income measure useful?

The Haig-Simons income measure has the advantage of avoiding volatility based on the timing of the realization of capital gains on the sale of assets by individuals. Other annual income measures may be too volatile because some people time asset sales based on taxes or other factors, which makes annual income a noisy proxy for wellbeing. Thus, this measure could provide a more comprehensive assessment of the variation in household balance sheets by cutting out timing decisions.

Yet the Haig-Simons measure introduces substantial volatility as well based on changes in the market valuation of assets. Someone with a large stock portfolio, for example, whose portfolio fell substantially could have a negative Haig-Simons income despite being high in the income distribution. Using this measure, the billionaire founder of Facebook, Mark Zuckerberg, would have been considered one of the poorest people in the world in 2012 because his net worth fell by $4.2 billion.

The key point is that assets held in company stock or in stock market indices vary substantially in valuation over any time scale. The Haig-Simons measure attempts to factor out volatility in realized capital income but at the same time introduces potentially higher volatility in the valuation of capital holdings.

To understand the relative merits of the Haig-Simons metric in assessing comprehensive income, it is important to discern the relative magnitudes of these volatilities across the entire distribution of income. One way to think about it is that people can choose to sell a stock or other asset whenever they want, which will introduce volatility in their income because of the asset sale. But the value of the assets someone holds will vary substantially, too, because stock prices change every day whether or not shares are sold. Thus, measures of income that include realized capital gains can be variable because of the timing of stock sales, while measures that look at net wealth will also be variable because stock prices themselves are volatile. Because it amplifies the volatility in the valuation of wealth, the Haig-Simons measure’s primary advantage—that it reduces volatility from the timing of asset sales—may well be swamped.

Additionally, by ignoring the liquidity of assets, the Haig-Simons measure obfuscates changes in wealth and economic wellbeing. Inflation in housing prices during the 2000s led the net wealth of many to increase, which would show up as a rising Haig-Simons income. With hindsight we know that much of this valuation was a bubble and that there was a minimal real improvement in economic wellbeing.

It is unclear why a single reductionist measure is needed for both wealth and income when one measure for wealth and a separate measure for income could be used. Looking at several measures can provide a textured understanding beyond a single number. If asset volatility from timing decisions were determined to be a primary source of noise in short-term inequality trends, then a multi-year average could be applied to other measures of income. This would have the advantage of reducing the volatility from timing decisions while avoiding obviously absurd assessments of income.
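To make that alternative concrete, here is a minimal sketch of the kind of multi-year smoothing described above, using made-up numbers rather than any of the surveys discussed in this piece:

```python
# Illustrative only: a trailing three-year average applied to a conventional
# income series in which one year contains a large, one-time asset sale.
# The figures are hypothetical.

incomes = [80_000, 82_000, 310_000, 85_000, 87_000]  # spike = one-time stock sale

def rolling_average(series, window=3):
    """Trailing moving average over up to `window` years."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

print(rolling_average(incomes))
# The capital gain still shows up in measured income, but it is spread over
# several years instead of dominating a single annual figure.
```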

Are their methods for determining the Haig-Simons measure sufficient?

The authors attempt to determine a comprehensive measure of both consumption and the change in net worth using the Haig-Simons metric. To compute this measure of income, they make several adjustments to the data to include near-cash benefits, such as estimating the value of health insurance provided to individuals either by the government or by employers. For changes in net worth, a single national housing index was used to account for changes in housing wealth across the country, and only the Dow Jones Industrial Average was used for all types of stock income. There are also limitations on the detail available for high-income households in the survey data that they use. Each of these methodological choices artificially biases their estimates toward a lower valuation of income growth at the top of the distribution.

So let’s examine each of the components they use in turn.

Their method for assessing the value of health insurance is not useful for comparing trends over time because it ignores changes in the structure of insurance products over time, variation in the quality of the insurance products provided, and the substantial growth of health care costs beyond inflation over most of the period of their study. Before the enactment and initial implementation of the Affordable Care Act of 2010, there was a trend in health insurance toward higher-deductible products (which have a lower actuarial value), which means that comparisons across time in the value of health insurance would overstate that value in later years relative to earlier years.

Thus, the relative value of the health insurance product used in their analysis is inflated and will lead to artificially inflated income growth for people whose health insurance constitutes a larger share of their “comprehensive” income than others. Because health insurance will probably constitute a smaller share of the consumption of high-income people, the authors’ methodological decisions on accounting for health insurance will artificially imply a reduction in income growth at the top of the income spectrum relative to other segments.

Similarly, the authors’ methodological choices for valuation of housing and stocks are likely to artificially reduce their estimates of the net worth of those at the top. For each of these factors, the authors used a single growth rate for all incomes. By using a single product and not testing the impact of heterogeneous rates—meaning variations in the rates of return—the authors artificially reduce the variability. This reduction in variation will particularly attenuate growth in asset values on the high side and therefore result in artificially low incomes among the top of the distribution.

To their credit, the authors acknowledge this shortcoming:

“…when imputing yearly accrued capital gains we assume that all investments receive the ordinary rate of return. Hence we will not capture extranormal returns received by some individuals on their investments.”

Yet after acknowledging this flaw, they did not attempt to vary rates of return at the high end of the income distribution or otherwise account for this bias. In fact, a recent paper by Fabian T. Pfeffer, Sheldon Danziger, and Robert F. Schoeni from the University of Michigan finds that the wealth of the richest households grew faster than that of the median household, and much faster than that of households below the median.

Another issue with this approach has to do with their choice of survey data sets. This work relies heavily on surveys such as the Current Population Survey, the Medical Expenditure Panel Survey, and the Survey of Consumer Finances. Economist Philip Vermeulen of the European Central Bank finds that survey data tend to have a hard time assessing wealth at the top because systematically biased underreporting among the wealthy skews the estimates lower. Thus, a fundamental limitation of this work is its ability to measure trends at the top, which are central to the claims made about the study’s findings.

Are the data reported sufficient to assess trends in the Haig-Simons measure?

The three authors report average growth estimates by quintile for several different income measures. This is good because it allows for a partial assessment of which components of the income construction are driving the results. But more information is needed. The authors’ choices about which rates of return to apply strongly influence the results; providing results from sensitivity tests (checks to see how much the results rely on various assumptions) with alternate measures would be useful. Without information from these sensitivity tests, there is no way to assess the effect of the biases discussed above.

Furthermore, they do not provide error estimates or statistics on their matching approaches across data sets. Statistics regarding the matching approach (and sensitivity tests of the matching) are particularly important for assessing the implications of these methodological choices for the quality of the results. These omissions are disconcerting because they make it impossible to assess the quality of the results and therefore their utility to the current discussion.

Conclusion

The authors attempted an ambitious analysis of incomes, which should be commended, but their execution is insufficient to support the broad proclamations made by many pundits about declines in inequality. Given the study’s clear methodological biases and weaknesses, the claims of the paper’s first author in a report from the Manhattan Institute dramatically overstate the implications. The Haig-Simons income measure may be useful, but there are better ways of getting the same, if not more, information using other measures of wealth and income without conflating the two.

The authors’ methods should work well enough for estimating Haig-Simons income among low- and middle-income people, for whom capital gains are a relatively unimportant source of income, but there are fundamental biases that will result in artificially low growth rates among their high-income counterparts. Finally, their reported results are insufficient to assess whether the trends are entirely driven by their methodological choices and whether the results are distinguishable from noise. I look forward to further research by the three authors and others that more effectively measures the assets of the wealthy—a key to understanding the links between economic inequality and growth.

Factoring inequality into economic growth

The National Bureau of Economic Research released a working paper by Harvard University economist Nathaniel Hendren on August 4 that provides a new way of looking at the relationship between inequality and growth. His paper develops a new statistic, the inequality deflator, which allows researchers to adjust the value of an economic variable, such as average household income, for different levels of inequality. Because averages don’t tell us anything about distribution, the deflator lets us compare those averages by adjusting for the different distributions.

[Graphic: Household income adjusted for rising inequality, 1979-2012]

The accompanying graphic shows how much household income increased in the United States after adjusting for rising income inequality between 1979 and 2012. With more attention being paid to the relationship between inequality and growth, Hendren’s inequality deflator can become a powerful tool for understanding the linkages between the two.

The prison boom and black-white economic inequality

Over the past 40 years, the observed earnings gap between African American men and their white counterparts closed slowly but steadily. The average black employed worker earned about a quarter less than the average white employed worker with similar experience in 2010 compared to about a third less in 1970. Such enduring earnings inequality is nothing to celebrate, but at least the trend line is encouraging.

Or is it?

Those reported earnings gains among black men fail to take account of different trends in incarceration and employment, which not only skew labor market statistics but also mask the debilitating economic consequences of the mass incarceration of African American men over the past several decades. When these trends are properly accounted for, there is little reason to believe that the labor market prospects of black men relative to white men have improved over the past 40 years.

Let’s start with the “prison boom,” or more precisely, the trend in incarceration rates, which have more than doubled over the past 30 years. Today, more than 2.3 million people are locked up in local jails, state prisons, or federal prisons. Although this prison boom affected all racial and ethnic groups, it has had a disproportionate effect on African American men. In the 2010 Census, almost one in ten African American men ages 20 to 34 were institutionalized, while the corresponding rate for white men was only about one in fifty.

Further, on any given day in 2010, about one-third of African American men ages 20 to 34 who were high school dropouts lived in jails, prisons, mental health institutions, or nursing homes, and there is good reason to believe that the fraction in prison or jail exceeded the employment rate for this group. Of course, this is just at any given point in time. The fraction incarcerated at some point in life is even higher—about two-thirds by age 34, according to a recent book by sociologist Becky Pettit from the University of Washington.

While these statistics are not new to criminologists, they imply that a growing share of the U.S. population is missing from the government’s main source of information about the labor market: the Current Population Survey. The CPS only covers the non-institutionalized population, but the federal government uses it to calculate important measures of labor market outcomes such as wages, labor force participation, and unemployment rates as well as official poverty statistics, including the Census Bureau’s new Supplemental Poverty Measure.

As the missing data problem has become more severe, these measures have become more distorted, in particular with respect to trends in racial inequality. In a recent NBER working paper, economist Derek Neal and I argue that since 1970, the economic progress of African American men relative to white men has been quite anemic. We reach this conclusion by properly accounting for the growth of the prison population over this period, and hence the misleading picture derived from average labor market earnings for employed workers.

In our paper, we treat the median weekly wages of men in their prime working years as a proxy for their overall labor market prospects. Among the employed, the ratio of median weekly wages for African Americans relative to whites increased steadily from around 65 percent in 1970 to well over 75 percent in 2010, the most recent census year. Yet this statistic substantially overstates the recent relative progress of African Americans for two reasons. First, employment rates for working age men have declined much more among blacks than among whites, and growing numbers among the non-employed are incarcerated. Second, earnings prospects are now and have always been worse for those who are not currently employed.

Thus, we estimate what we call median potential wages for blacks and whites, making adjustments for changes in the numbers of non-employed and institutionalized persons over time. We find that the labor market prospects of black men relative to white men have not improved over the past 40 years. There have been slight ups and downs (with some noteworthy progress in the 1990s), but in 2010, the ratios of median potential wages among African American men to the median potential wages of their white peers were roughly at 1970 levels, across groups with different levels of experience.
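To see the mechanics behind a “potential wage” median, consider a stylized sketch (hypothetical numbers, and a simplification of the actual procedure): if non-employed and institutionalized men are assumed to have potential wages below any observed wage, they can be added to the bottom of the wage distribution before the median is taken, which mechanically pulls the measured median down for the group with more men missing from the wage data.

```python
# Stylized illustration only; not the authors' data or exact method.
# Assumption: men with no observed wage (non-employed or incarcerated) have
# potential wages below every observed wage, so a below-minimum placeholder
# preserves their rank in the distribution.

import statistics

observed_wages = [500, 620, 700, 780, 900, 1_050]  # weekly wages, employed men only
n_missing = 4  # non-employed and institutionalized men with no observed wage

median_observed = statistics.median(observed_wages)

placeholder = min(observed_wages) - 1
potential_wages = observed_wages + [placeholder] * n_missing
median_potential = statistics.median(potential_wages)

print(median_observed)   # 740.0 -> median among the employed only
print(median_potential)  # 560.0 -> lower once the missing men are counted
```

The larger the share of men missing from the wage data, the bigger the gap between the employed-only median and the potential-wage median, which is why the adjustment matters more for black men than for white men.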

Black-white economic convergence, then, has come to a halt after substantial progress throughout most of the past century, as documented in a seminal 1989 study by James Smith of the Rand Corporation and Finis Welch, then an economics professor at the University of California-Los Angeles. While it is difficult to quantify the exact contribution of mass incarceration to the lack of black relative progress in recent decades, some studies do find suggestive evidence that incarceration harms employment and earnings opportunities long after prisoners serve their time.

Our results concerning stalled relative progress for African American men are particularly noteworthy because we are also able to demonstrate that the prison boom was primarily the result of policy choices. At first glance, one might suspect that rising incarceration rates reflect increased criminal activity as a consequence of deteriorating legal labor market opportunities for people with little formal education. But the boom in crime is long over. Criminal activity and arrests for all non-drug-related offenses peaked in the early to mid-1990s and have been on the decline ever since. Drug-related arrests increased well into the late 2000s, but due to short average sentences, drug offenses on their own contributed relatively little to the overall boom in incarceration.

Instead, the main driver of the prison boom has been a move toward more punitive corrections policies across all offense categories, not just drug crimes. Such policies include so-called Truth-in-Sentencing laws, “Three Strikes” policies, and mandatory minimum sentences. As a result, arrested alleged offenders in each violent crime category are now at least twice as likely to spend more than five years in prison than they were in the mid-1980s. The pattern is perhaps even more striking for non-violent offenses: conditional on arrest, the probability of any given sentence length has increased—often by a factor of two or more.

Overall, an alleged offender in the 2000s can expect to spend about twice as long in prison as in the 1980s, conditional on the severity of the crime. Of course, not all of this shift necessarily reflects a change in policy. In particular, technological advances such as the use of DNA evidence may have increased the probability that an alleged offender is found guilty. But these new investigative methods have been adopted by other developed countries—and none of them have experienced changes in distributions of time-served among offenders that are even remotely similar to those we have seen in the United States. Therefore, it is hard to avoid the conclusion that sentencing and parole release policies have played the leading role. We estimate that the overall shift toward more punitive corrections policies probably accounts for between 70 and 85 percent of the growth in incarceration rates since 1985.

There is now substantial evidence that the boom in incarceration had an adverse effect on the relative economic progress of African American men, and that this prison boom was primarily a policy choice and not a result of deteriorating labor market conditions. Supporters of tougher corrections policies may argue that these policies have contributed to the decline in criminal activity over the past two decades. But even with our study, the costs of that crime reduction have not been fully counted and may not have been fully realized yet.

Some recent studies provide evidence that more punitive treatment of first offenders increases recidivism rates and prolongs criminal careers, and recent trends in the demographic characteristics of prisoners are consistent with this claim. Crime in our country was once almost exclusively a young man’s game, but arrest rates and prison admission rates for men ages 40 to 49 have risen disproportionately in recent years. In addition, we have not yet seen how policies that promote mass incarceration within particular communities will impact future generations from those communities.

—Armin Rick is Assistant Professor of Economics at Cornell University’s Johnson School of Management. His collaborator on this project is Professor Derek Neal of the University of Chicago Economics Department. Their paper, “The Prison Boom and the Lack of Black Progress after Smith and Welch,” was recently released by the National Bureau of Economic Research.

Nothing new under the labor market sun

The Bureau of Labor Statistics released new labor market data today showing that the U.S. economy added 209,000 jobs and that the unemployment rate ticked up slightly to 6.2 percent. Overall, the data show an economy continuing on its path of the past several years—a moderate recovery that is inadequate in light of the severity of job losses during the Great Recession.

The slight increase in the unemployment rate was due to an increase in the labor force and not a decline in the number of employed workers. According to the BLS household survey, the number of employed workers increased by 131,000 while the overall labor force increased by 329,000. This resulted in an increase in the labor-force participation rate to 62.9 percent in July from 62.8 percent in June.

The share of the population with a job, the employment-to-population ratio, was unchanged at 59 percent, still 4 percentage points below the most recent peak in December 2006. The ratio for the prime working-age population (workers ages 25 to 54) slightly decreased to 76.6 percent from 76.7 percent.
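For readers keeping track of how these ratios fit together, a quick sketch with round numbers in the same ballpark as the figures above (not the actual BLS microdata) shows why the unemployment rate can tick up even as employment grows:

```python
# Toy illustration of the household-survey ratios discussed above.
# The levels are rounded, hypothetical stand-ins, not official estimates.

population = 248_000_000   # civilian non-institutional population, ages 16+
employed = 146_000_000
unemployed = 9_700_000

labor_force = employed + unemployed
participation_rate = labor_force / population   # share of the population in the labor force
unemployment_rate = unemployed / labor_force    # share of the labor force without a job
epop = employed / population                    # employment-to-population ratio

print(f"LFPR {participation_rate:.1%}, U-rate {unemployment_rate:.1%}, EPOP {epop:.1%}")

# If the labor force grows faster than employment (new entrants start searching
# but have not yet found work), the unemployment rate rises even though the
# number of employed workers has increased.
```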

The number of long-term unemployed workers (those without a job for 27 weeks or more) was essentially unchanged, according to BLS. This group continues to be a large share of the unemployed at 32.9 percent of all unemployed workers. The debate about the future of the long-term unemployed will continue. Some analysts, including economists at the Board of Governors of the Federal Reserve, claim that the long-term unemployed are getting jobs while others remain quite skeptical of the evidence.

Employers added 209,000 total jobs during July, with 198,000 coming from the private sector. The employment gains were less broadly based than in recent months. The diffusion index for private industries, a measure of how many industries added jobs, was only 61.9 percent in July compared to 65.3 percent in June and 64.4 percent in May.

Manufacturing added 28,000 jobs, with more than all of the net gain (30,000 jobs) coming from industries that manufacture durable goods. Specifically, 14,600 jobs came from the auto industry. Nondurable manufacturing industries shed 2,000 jobs in July, led by food manufacturing (a loss of 3,600 jobs).

The data on wage growth, relevant to the current debate about slack in the labor market and the future of Federal Reserve policy, also showed little change. The year-on-year change in the average wage for all workers was 2 percent. Wage growth has hovered around this rate for the last year and shows no sign of acceleration. And the rate is well below its pre-recession level in 2007, which was closer to 3.5 percent.

[Graphic: Year-on-year wage growth]

The data released today show a labor market that continues to heal from the Great Recession. But the recovery continues to come up short given the damage done in the past. With wage growth still subdued and no sign that the long-term unemployed are locked out of job gains, policymakers should be skeptical of calls to pull back on growth-boosting measures. Overly cautious policy would not only leave our economy weaker in the short run but undermine our long-term economic growth potential as well.

A post-war history of U.S. economic growth

Five years removed from the end of the Great Recession, economists, policymakers, investors, business leaders, and everyday Americans from all walks of life remain concerned about the future of economic growth in the United States. The severity of that two-year recession and the lackluster recovery ever since spark fear among economists and policymakers that the U.S. economy may be in for a new and long period of slow growth. Economist Tyler Cowen of George Mason University raised this concern in his book “The Great Stagnation.” And Harvard University economist and former Treasury Secretary Larry Summers recently warned about secular stagnation, in which the economy suffers from a prolonged period of inadequate demand.

Read a PDF of the full document with all citations

While these fears are surfacing today, the anemic economic conditions that prevail at present and from which these concerns spring may be the result of structural changes in the U.S. economy over the past 40 years. Since the mid-1970s, the U.S. economy has undergone a variety of changes that may help or hinder economic growth over the long-term, among them:

  • An employment shift from manufacturing to services
  • The advent of the Internet
  • The entrance of women into the paid labor force
  • The greater participation of people of color in all sectors of the economy
  • The greater openness of the economy to international trade
  • The ever-evolving role of government
  • A rapid increase in income inequality

The mission of the Washington Center for Equitable Growth is to understand whether and how these structural changes, particularly the rise in inequality, affect economic growth and stability. But before we can understand how these forces may affect economic growth, we need a baseline understanding of how the U.S. economy grew in the past.

This report helps in that endeavor by looking at the past 65 years of economic growth in the United States—measured by examining our country’s Gross Domestic Product, both its rate of growth and its sources of growth, from 1948 to 2014. The starting point, of course, is what this oft-cited statistic, GDP, actually measures. GDP is an aggregate statistic built from four major components: consumption, investment, government expenditures, and net exports.
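As a reference point, the accounting identity behind those four components, and the way each component’s contribution to growth is typically computed, looks roughly like this (a simplified sketch; the report’s actual calculations use chained, inflation-adjusted data):

```python
# Simplified sketch of GDP accounting: GDP = C + I + G + NX, with each
# component's contribution to growth approximated by its change relative to
# the prior period's GDP. Figures are hypothetical, in billions of dollars.

prior   = {"C": 11_000, "I": 2_800, "G": 3_200, "NX": -500}
current = {"C": 11_330, "I": 2_900, "G": 3_230, "NX": -520}

gdp_prior = sum(prior.values())
gdp_current = sum(current.values())
growth = (gdp_current - gdp_prior) / gdp_prior

contributions = {k: (current[k] - prior[k]) / gdp_prior for k in prior}

print(f"GDP growth: {growth:.2%}")
for component, share in contributions.items():
    print(f"  {component}: {share:+.2%}")
# The component contributions sum to the overall growth rate, which is what
# allows growth in each era to be decomposed into consumption, investment,
# government expenditures, and net exports.
```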

The report then looks at the overall growth of real (inflation adjusted) per capita GDP as well as the contributions of each component to growth over time, specifically over business cycles, or patterns of economic recessions and expansions. (See graph.)

[Graph: Real per capita GDP growth and component contributions]

Based on the overall trends, we divide the post-World War II period into three eras of growth—the booming post-war period to the early 1970s (the fourth quarter of 1948 to the fourth quarter of 1973), the transition period to the early 1980s characterized by a series of economic shocks and high inflation (the fourth quarter of 1973 to the third quarter of 1981), and the ensuing period of low economic volatility and heightened growth known as the Great Moderation, up until the start of the Great Recession in 2007 (the third quarter of 1981 to the fourth quarter of 2007). (See graph.)

[Graph: Three eras of post-war economic growth]

Specifically, economic growth in the third period, leading up to the Great Recession, was:

  • Not as brisk as it once was
  • More dependent upon consumption
  • Held back by net exports
  • Less driven by government expenditures and investment

The current business cycle, starting with the beginning of the Great Recession, appears to be the beginning of a new era—one tentatively defined by tepid consumer demand, stagnant real-wage gains, and growing economic inequality.

This report will have achieved its purpose if it spurs new thinking about how exactly we can and should promote economic growth in the United States.

Designing a research agenda to move the minimum wage forward

During the most recent push to raise the federal minimum wage in the United States, more than 600 economists, including seven Nobel laureates, signed a letter encouraging Congress to do so. This letter highlighted research finding that the minimum wage has little to no impact on the employment of minimum-wage workers and that a raise would provide a small stimulus to the economy. A few weeks later, a letter opposing a rise in the minimum wage was released with the signatures of more than 500 economists, including three Nobel laureates. The opposing letter focused on the increase in labor costs and pointed to a Congressional Budget Office analysis finding that an increase would reduce overall employment, although the 90 percent confidence interval included a zero effect. These economists fundamentally disagree about the response of employment to minimum wage increases, contributing to the paralysis at the national level on the minimum wage, yet both sides claim to point to “the research.”

Read a pdf of the full document.

We propose a series of research projects targeted at advancing the policy debate. In its February 2014 report, the Congressional Budget Office highlighted several areas where it argued there was not enough information or consensus to make strong assessments. We are reaching out to advocates and policymakers to better understand the questions about the minimum wage they want and need answered, with the intention of shaping a research agenda on the minimum wage that directly answers those questions.

Below we identify research questions, inspired by the existing academic research as well as the recent CBO paper, that may be of interest to policymakers and advocates. This discussion paper should be treated as the name implies—a jumping-off point for a conversation about a research agenda designed to move the policy process forward.

The 2014 Congressional Budget Office report, “The Effects of a Minimum-Wage Increase on Employment and Family Income,” addressed the questions posed to it by Congress on the impact of an increase in the minimum wage, and relied on the most up-to-date academic research in doing so.

Consequently, the CBO report had to adjudicate between a wide variety of studies on the minimum wage, not all of which pointed to the same conclusions. In many cases, the report splits the difference, such as when it cites “uncertainty about the responsiveness of employment to an increase in wages.” Given these inconsistencies, a minimum wage research agenda that addressed the following questions could help clarify and focus the empirical evidence:

  • How does the minimum wage affect production?
  • How do outputs, profits, and prices change?
  • Does a rise in the minimum change worker efficiency?
  • Do increases affect low- and high-productivity firms differently?
  • Are there changes to workforce composition or hours worked?
  • How does the minimum wage affect the overall wage distribution?
  • How large are “ripple effects” for workers who already earn more than the minimum wage?
  • How much does the minimum wage change income inequality?
  • Does the minimum wage affect the macroeconomy?
  • How much less is spent on government benefits for low-income people?
  • How do consumption patterns change from increased wages?
  • How does the structure of the minimum wage policy impact outcomes?
  • How do effects vary by the size of the minimum wage increase?
  • Do minimum wage changes have different short- and long-run effects?

Many of these questions have been addressed directly or indirectly in the economics literature, but work will be needed to synthesize and effectively communicate the results in a way that allows for a more direct, effective response to CBO’s analysis. Yet many of these topics are under-researched or rely on older data, suggesting a need for new research. This discussion paper explores several of these questions as a starting point for encouraging new research.

How do employment effects vary by the size of the minimum wage increase?

While recent research suggests that modest increases in the minimum wage have strong effects on earnings and small effects on employment, little work exists on whether this pattern holds for larger raises. Economic theory suggests that the effects will vary by the “bite” of the minimum wage into the underlying wage or productivity distribution. In a 2009 study of the 1996 and 1997 federal minimum wage changes, economist Jeffrey P. Thompson, now at the Federal Reserve Board and previously a professor at the University of Massachusetts, Amherst, found that counties with low average earnings (where the minimum’s “bite” was greater) had larger falls in employment after the wage change. Offering an international perspective on the debate, economists Yi Huang, Prakash Loungani, and Gewei Wang estimated that after China strengthened minimum wage enforcement, firms with low profit margins reduced employment while firms with high profit margins expanded.

Seattle has just passed legislation to increase the city minimum wage from $9.32 per hour today to $15 by 2017-2021, depending on the type of employer. San Francisco is now considering following suit. Opponents of the minimum wage frequently respond by highlighting the arbitrariness of the levels proposed by legislators. Additional research could ground the levels in analysis and help policymakers identify the best targets.

Do minimum wage changes have different short-run and long-run effects?

In his review of the research fifteen years ago, University of Michigan economist Charles Brown emphasized that understanding the long-run effects of the minimum wage remains “the largest and most important gap in the literature.” Perhaps the research overall found no short-term employment effects because firms are unable to modify production in response to a minimum wage increase in the short run, but in the medium to long run they are less constrained in their hiring patterns and in substituting capital for labor.

More recently, Texas A&M University economists Jonathan Meer and Jeremy West argued that the minimum wage primarily influences employment growth rather than the employment level. On this view, an increase in the minimum wage has a small effect on employment levels in the short run but a large effect in the long run. In contrast, economists Arindrajit Dube at the University of Massachusetts, Amherst, T. William Lester at the University of North Carolina, Chapel Hill, and Michael Reich at the University of California, Berkeley, failed to find effects on employment levels up to four years after minimum wage increases. Additional work must reconcile this conflicting evidence on the long-term effects of an increase in the minimum wage.

How does the minimum wage affect production?

To respond to a minimum wage increase, employers and workers may choose a variety of “channels of adjustment,” such as raising prices or improving efficiency. The most comprehensive evidence suggests that restaurants raise prices in response to a minimum wage increase, passing a portion of increased labor costs onto consumers. Unfortunately, the city-level data used in this analysis is almost two decades old, and has not been subjected to alternative specifications. With more recent but less comprehensive data, economists Emek Basker and Muhammad Khan at the University of Missouri, Columbia, find similar price increases for two out of three restaurant items. New research with better quality price data has a high probability of informing how much affected businesses raise prices after a minimum wage increase.

By improving worker and managerial efficiency, minimum wage increases may boost labor productivity. Productivity effects would be consistent with current research confirming that worker turnover falls sharply after a minimum wage increase, both in the United States and Canada. In addition, restaurant managers’ survey responses suggest that minimum wage increases provide an opportunity to portray the “cost shock as ‘a challenge to the store’” in order to “energize employees and to improve productivity,” according to a study by economists Barry Hirsch and Bruce Kaufman at Georgia State University. Similarly, using plant-level data in the United Kingdom, economists in a National Bureau of Economic Research working paper find that revenue per worker increases in response to a minimum wage rise, but the effect is statistically insignificant.

Firms may also adjust production practices in the face of a minimum wage increase by hiring more highly skilled workers, or by reducing hours of the lower-skilled workforce. Existing high-quality studies do not generally find large effects on workforce composition and hours, but the estimates remain too statistically imprecise to rule out substantive effects. One recent study, for example, estimates that teen hours either fall somewhat or not much at all, depending on the specification.

More recent but preliminary work suggests that relatively small employment-level impacts of the minimum wage may conceal large changes in the mix of firms. The study finds that restaurants in three states that raised minimum wages during the 2000s experienced increases in both hiring and departures of employees. New research must provide more comprehensive and precise evidence on how firm composition and output change in response to the minimum wage.

How does the minimum wage affect the overall wage distribution?

By raising the wage floor, the minimum wage reduces inequality, but current research has not settled on the size of these effects. One study in 1999 estimated that the falling real value of the minimum wage accounted for the entire increase in wage inequality between the median wage and the 10th percentile wage during 1979-1989. In contrast, a new study this year by economists David Autor and Christopher L. Smith at the Massachusetts Institute of Technology and Alan Manning at the London School of Economics finds that the falling real minimum wage accounted for about one-third of the inequality increase. Better data quality and more recent empirical techniques can improve estimates of the minimum wage’s impact on inequality.

When the minimum wage rises, workers earning just above the new minimum will often see a wage increase as well. While many studies observe these “ripple effects,” or wage spillovers, existing empirical work does not evaluate the underlying mechanisms. Do the spillovers occur within firms, as workers paid just above the minimum also demand raises? Do they occur in the market, as firms are forced to raise wages to attract new workers? Or do they occur as employers attempt to maintain established wage structures (internal pay scales) within firms?

What are the macroeconomic effects of the minimum wage?

By lifting workers out of poverty, the minimum wage may reduce fiscal spending on income support and welfare programs. Two economists at the Institute for Research on Labor and Employment, Rachel West and Michael Reich, find that the minimum reduces the use of food stamps as well as state-level expenditures on that program. Additional empirical work could examine other needs-based programs and quantify state-level budget impacts.

Minimum wage raises likely translate into increased consumption, but little work exists on directly measuring these effects. One recent study finds that a minimum wage change leads to large increases in consumption; these expenditures seem concentrated in automobile purchases partially financed by debt. New research with high-quality individual-level data will help improve estimates of the consumption response to minimum wages.

A final related issue is whether minimum wage increases affect the economy differently during times of economic slack or expansion. One recent study finds that the minimum has large negative effects on employment when unemployment is high, but another one finds no such evidence. More work is needed to identify credible estimates of how the minimum wage interacts with the broader economy.


Update on Research for Equitable Growth’s Issue Brief “A Regional Look at Single Moms and Upward Mobility”

About a month ago, I put out a short piece assessing the attention given to the relationship between the share of single mothers in an area and economic mobility. As part of the piece, I noted that many places have high economic mobility despite having a high share of single mothers and that these places tended to have had parental leave laws prior to the Family and Medical Leave Act of 1993. Richard Reeves and Joanna Venator of Brookings objected to my analysis, noting that some places, such as New Jersey, have low mobility despite having long-standing family leave laws. In response, I have added a statistical appendix to the original piece showing that these pre-FMLA parental leave laws are statistically significant factors. You can also read my blog post over on the Brookings Social Mobility Memos blog responding to Reeves and Venator.

The Importance of Private Equity Supermanagers among Top Income Earners

A new data interactive published today by the Washington Center for Equitable Growth should help elevate discussion about the growth of incomes at the very high end of the U.S. income ladder and how that growth affects economic inequality and growth. To help inform the debate about who the people in the top 0.1 percent are, we have produced a data interactive on the composition of the top 0.1 percent of the income distribution, based on a 2012 white paper by economists Jon Bakija of Williams College, Adam Cole of the Office of Tax Analysis at the U.S. Department of the Treasury, and Bradley Heim of Indiana University. Drawing on tax data, their report looked at the types of jobs held by the top earners in the United States between 1979 and 2005.

Their work provides a great window into how our economy has been changing at the top end of the income spectrum. The data in their study indicate that “supermanagers” (as economist Thomas Piketty of the Paris School of Economics refers to business executives and managers in his book “Capital in the 21st Century”) constituted about 60 percent of the top 0.1 percent of the income distribution over that period. While this level did not change substantially over time, the nature of the supermanager definitely did. People working in finance claimed 18 percent of the top 0.1 percent in 2005, up from 11 percent in 1979. And the types of executives in this mix went from being 20 percent private equity or closely held firms in 1979 to more than half in 2005.

This mirrors the trend in corporate structure in the United States toward more private ownership, most likely because of the elevated role of private equity investing over this period. Big Wall Street private equity firms and financial institutions are huge players in private equity investing.

James Manzi, a senior fellow at the Manhattan Institute for Policy Research, wrote a piece attacking Piketty’s discussion of supermanagers by pointing to the decline in the share of the top 0.1 percent who work in publicly held companies. But Manzi fails to discuss the rise of those in private equity and closely held firms—which is particularly odd because he cites the Bakija, Cole, and Heim white paper and even found the correct table in the document where these observations appear. By missing this point, both his number crunching and his critique of Piketty fall way off the mark.

That said, understanding the changing nature of the top 0.1 percent is important for understanding the changes that have been driving our economy. Piketty’s work on the supermanagers is interesting, but only serves to highlight how little we know about this extremely high-income group.

Who are today’s supermanagers and why are they so wealthy?

What explains the changes in top-earning occupations over the past four decades? Perhaps the most intriguing argument about the current state of income inequality in the English-speaking economies that Thomas Piketty makes in his bestseller “Capital in the 21st Century” is this—“the vast majority (60 to 70 percent, depending on what definitions one chooses) of the top 0.1 percent of the income hierarchy in 2000-2010 consists of top managers.” He goes on to argue on page 302 of his book that the rise in labor income “primarily reflects the advent of ‘supermanagers,’ that is, top executives of large firms who have managed to obtain extremely high, historically unprecedented compensation packages for their labor.”

[Infographic: Top-earning occupations, 1979 and 2005]

This raises the question of how and why these supermanagers came into existence. Nobel Laureate Robert M. Solow points out in The New Republic that this is primarily an American outcome. And Henry Engler at Thomson Reuters Accelus’ Compliance Complete recently published an excellent piece on Piketty’s supermanagers in the United States and the United Kingdom. Both writers agreed with Piketty that these supermanagers were being vastly overcompensated given their questionable contributions to productivity.

I hope to shed a little more light on this issue by examining the change in professions comprising the top 0.1 percent of tax filers between 1979 and 2005. The purpose: to examine whether the changing composition of this super elite reflects changes in our economy that may explain the link between rising economic inequality and anemic economic growth over this period.

To do so, I used data from the April 2012 white paper “Jobs and Income Growth of Top Earners and the Causes of Changing Income Inequality: Evidence from U.S. Tax Return Data,” by economists Jon Bakija of Williams College, Adam Cole of the Office of Tax Analysis at the U.S. Department of the Treasury, and Bradley Heim of Indiana University. They used tax data on the top 0.1 percent of filers to identify the top-earning professions. The infographic below tells the tale, charting the change in occupations at the tippy top of the income ladder between 1979 and 2005.

The biggest change in the distribution of top earners is in the types of executives, managers, and supervisors at non-financial firms. In 1979, most of these people worked for large, publicly traded firms, but by 2005 more were working in closely held firms. There is not enough information to provide a clearer picture of who exactly these people are, but chances are many are employed by firms owned by private equity firms, given the substantial growth of the private equity industry over this period and the large gains among financial professionals. The share of people in the top 0.1 percent working in finance also increased substantially, to 18 percent in 2005 from 11 percent in 1979.

These findings are consistent with Piketty’s analysis in his new book. But there are alternative explanations. One is presented in George Mason University economist Tyler Cowen’s latest book, “Average is Over.” He claims that skill-biased technological change is responsible for the shift in top occupations over roughly the same period. He argues that technology allows top performers to capture more of the market and thus earn substantially more than average performers. He and many other people hypothesize that this is a driver of increased economic inequality.

But if technology were a primary driver of inequality, then one would expect that skilled trades would have seen larger income gains and would have become a larger share of the top 0.1 percent. While there are slightly more technical types and entertainers among top earners (as can be seen in the data presented in our interactive), the biggest gains in both percentage terms and magnitude were among privately held business professionals.

Thus, the so-called “average is over” argument—that the top performers in each field will capture a bigger share of the pie—may be a driver of inequality, but it does not appear to explain the bulk of the changes in occupations at the top of the income ladder. Instead, the supermanagers appear to be capturing a greater share of the wealth, as Piketty and others argue. More detailed data would be required to assess who these people are and how workplace dynamics changed from 1979 to 2005 in ways that would explain the change in income. The Washington Center for Equitable Growth will be examining this data in more detail in forthcoming publications.

The value of search-and-matching models for the labor market

University of Queensland economics professor John Quiggin recently published an excellent blog post at Crooked Timber questioning the usefulness and empirical success of the dominant macroeconomic model of the labor market: the search-and-matching framework. Peter Diamond of the Massachusetts Institute of Technology, Dale Mortensen of Northwestern University, and Christopher Pissarides of the London School of Economics won the Nobel Prize in Economics in 2010 for their foundational work on this model. Most recent theoretical and empirical work on macro labor markets, including my own dissertation, is based on it because it was the best thing economists had on hand when the world came calling to ask about unemployment during the Great Recession.

Moreover, several high-quality datasets that track the model’s observables have also become available in recent years and have enabled important empirical work based on investigations of the model’s predictions and the implications of their success and failure. But Quiggin’s critiques are well taken, and have sparked an interesting conversation, with contributions from Stony Brook University economist Noah Smith, the Roosevelt Institute’s Mike Konczal, and others. Why is it worth having this discussion? Is this just an ivory tower academic debate about oversimplified mathematical formalizations with no empirical basis?

On the contrary, a correct understanding of the labor market is of central importance to assessing its ailments and ordering the right prescription. And the labor market is the way that the majority of people earn the majority of their living, although that living has gotten more meager for all but the top few over the past several decades. The middle fifth of the household income distribution earned labor income of $45,315 in 2011 dollars, or 77.1 percent of their total income, in 1979, and only $45,997 (amounting to 65.8 percent of their income) in 2007—a mere 1.5 percent increase despite a 129 percent increase in inflation-adjusted gross domestic product. The overall share of national income going to labor has declined from 79 percent in 1979 to 71 percent in 2010.

The search-and-matching model works like this: one class of agents, “workers,” is either searching for a job or employed while another class of agents, “firms,” is either vacant, meaning it has a vacancy posted, or filled, in which case it employs a worker and together they hum along productively. In the earliest formulation of the model, the only economic decision made by either agent was whether a firm would post a vacancy in an attempt to match with a worker or would choose not to, thus remaining inactive. Otherwise, both searching workers and vacant firms mindlessly wait until they match up, then commence a productive relationship that lasts until their match spontaneously dissolves. Then the worker goes back to searching and the firm makes its decision about whether to post a vacancy or not once again.
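A bare-bones simulation captures the flow structure just described (a pedagogical sketch with made-up parameters, not a calibrated model from the literature):

```python
# Minimal sketch of worker flows in a search-and-matching setup: each month an
# unemployed worker matches with a vacancy with probability f, and an existing
# match dissolves with probability s. Parameter values are illustrative.
import random

random.seed(0)
f, s = 0.30, 0.03                      # monthly job-finding and separation probabilities
workers = ["U"] * 100 + ["E"] * 900    # start with 10 percent unemployed

for month in range(120):
    workers = [
        ("E" if random.random() < f else "U") if state == "U"
        else ("U" if random.random() < s else "E")
        for state in workers
    ]

unemployment_rate = workers.count("U") / len(workers)
print(f"{unemployment_rate:.1%}")   # settles near s / (s + f), about 9 percent here
```

Even in this stripped-down form, unemployment exists in every period because matching takes time, and its level depends on how quickly the unemployed find jobs relative to how quickly matches dissolve.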

The reason search-and-matching labor market models have something to say about the recent history of the labor market is that they are a good deal richer than the simple search-based story, which is why Federal Reserve Bank of Richmond economist Karthik Athreya, whose book sparked this discussion, said that “search is not really about searching.” Specifically, they allow for alternative theories of wage-setting, a factual timeline for unemployment spells and what determines their duration, a rich set of labor market outcomes beyond employment and unemployment, and an implementable notion of power that is often a critical missing piece of economic modeling.

Like all economic models, the search-and-matching model is a simplification, even a ludicrous one. Quiggin argues that it fails even as a simplification of reality because the main reason unemployment exists is not that workers and firms are groping in the dark for one another, a process that by its nature takes time. And he’s right about that. So do we search theorists have a big problem?

Notice that my summary of the model left out one big thing: how the fruits of the productive employment relationship are split between the worker and the firm. This is by far the biggest controversy in the field of search theory. The assumption made by the earliest search-and-matching models is that the “surplus” the two agents generate is split between the parties in the optimal way, where the optimality concept is defined within the model but with some relationship to a more general intuition about what each party would want. (That optimal way is known as the “Nash Bargain” after the Nobel-Prize-winning mathematical theorist John Nash, the protagonist of the film A Beautiful Mind.)
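In the standard textbook formulation (my notation, not a quotation from any particular paper), the negotiated wage maximizes a weighted product of the two parties' gains from forming the match:

```latex
w^{*} \;=\; \arg\max_{w}\; \bigl(W(w) - U\bigr)^{\beta}\,\bigl(J(w) - V\bigr)^{1-\beta}
```

where $W$ and $U$ are the worker's values of employment and unemployment, $J$ and $V$ are the firm's values of a filled and a vacant job, and $\beta$ is the bargaining-power parameter referred to later in this piece.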

The problem is that this theory of wage setting is an empirical disaster. Not only is it inconsistent with investigations into how wages are actually set, but it generates false predictions about unemployment spells that crucially fail to line up with what happens to unemployment during recessions (it goes up, and it stays high for a long time, when the optimal theory of wages says that wages should do the adjusting). Refinements that make the theory with Nash Bargaining consistent with the data on unemployment in recessions yield their own big empirical problem: those refinements imply that being unemployed isn’t really that bad for workers, which everyone who is sentient knows to be untrue.

So the search-and-matching model has a crazy theory about how wages are set, and that makes it a crazy model of how labor markets work, right? No. What the search-and-matching theory has, and what its alternatives lack for the most part, is indeterminacy about how wages are set. The “Nash Bargain” theory is optimal, but it’s not necessary—other wage-setting assumptions can be used to resolve the indeterminacy. And if economists can get their minds around the idea that the “market solution” is not always optimal then they can make real headway with the search-and-matching approach precisely because it’s consistent with those alternatives.

That’s where the most promising research into the macro labor market is happening. If wage-setting in the search-and-matching model is made more factual by including what seems to be the well-established norm of not actually cutting pay in nominal terms, then it generates long unemployment spells and high unemployment rates—not because something about the matching process has become more inefficient, which Quiggin is correct to call absurd, or because workers don’t care whether they’re employed or unemployed— but because the labor market is a great deal stickier than the canonical competitive equilibrium model assumes.

The search-and-matching model boasts several other strengths. In the popular imagination (mostly of those who have never been unemployed), unemployment follows a large-scale layoff. Basically, it’s the destruction of employment that leads to high unemployment. But in reality, high unemployment occurs mostly when the hiring rate declines. The overall job separation rate is, in general, not related to the business cycle, and it has been in long-run decline, which is itself evidence of ill health in the labor market. There was a very transient uptick in mass layoffs during the 2008 recession, but the layoff rate reverted to its long-run level of around 1.2 percent per month as the official recession ended in late 2009.

Hiring, on the other hand, went down at the beginning of the recession and has remained lousy for a long time. The factual way to interpret this is that there is inherent churn, or movement, in the labor market as workers quit jobs or get laid off and are hired into new ones. The labor market is unhealthy when the workers who leave their jobs can’t find new ones. That is a story that can’t be told without a search-and-matching model.
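A back-of-the-envelope calculation shows why the job-finding side of that churn does most of the work (illustrative rates, not the published flow data):

```python
# Steady-state unemployment in a simple flow model: u* = s / (s + f), where s
# is the monthly separation rate and f the monthly job-finding rate. The rates
# below are illustrative.

def steady_state_unemployment(separation_rate, finding_rate):
    return separation_rate / (separation_rate + finding_rate)

s = 0.012   # illustrative; in the same range as the layoff rate cited above
print(f"{steady_state_unemployment(s, finding_rate=0.25):.1%}")  # brisk hiring -> about 4.6%
print(f"{steady_state_unemployment(s, finding_rate=0.10):.1%}")  # weak hiring  -> about 10.7%
# Holding separations fixed, a fall in the job-finding rate alone is enough to
# push steady-state unemployment sharply higher.
```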

Furthermore, modifications to the search-and-matching model allow it to explain numerous other phenomena. Instead of workers being only either employed or unemployed, non-employed workers can be allowed to exit the labor force entirely, go on disability, remain in traditional unemployment (that is, receiving unemployment insurance while searching for a job), or enter some other unattached state. Employed workers can be allowed to remain happily with their existing firm or look for a new job while staying employed at the old one. These elaborations obviously allow for a richer set of predictions, and they also let trends such as a prolonged slack labor market manifest themselves in more ways than high unemployment and/or low wages, including a reduced labor force and more job-lock as those lucky enough to have a job cling to it tenaciously.

Another strength of the search-and-matching approach, and one that is essentially an implication of the indeterminacy of wages, is that it has room for the concept of power, which alternative approaches, especially the competitive equilibrium, do not. In the Nash Bargain, there’s an explicit mathematical parameter that captures the relative power that workers and firms have in the wage-negotiation process. That alone is not terribly meaningful because if anything is an endogenous concept requiring an explanation rather than simply a mathematical assumption, it’s power.

But power also works its way into the model through what are known as the parties’ threat points, meaning the alternatives each agent has available when they are negotiating. When the labor market is weak and unemployment is high, the threat point for workers deteriorates, which means they can be squeezed by firms, a phenomenon that captures a great deal of truth about the functioning of the labor market and needs to be in any model of it. That phenomenon is right there in search-and-matching.

Quiggin makes much of the observation that the rise of Internet-based job searches has not led to a decline in the unemployment rate by making the search process more fluid, as the search-and-matching model supposedly predicts. But the issue is this: has the Internet actually changed how the labor market works? In some ways, it has. My organization, the Washington Center for Equitable Growth, currently has a job vacancy posted (one that would not be captured by the standard data on job vacancies) for which we’ve received a great many applications, whereas in the past a classified advertisement might have yielded a dozen phone calls and half that many mailed resumes.

But we will still take many weeks to interview candidates and fill the position. Thus, a decline in what might be considered search costs just leads to more searching, but not necessarily more matching. A similar dynamic is at play in the rise of the average number of colleges most applicants apply to: instead of streamlining the search process, more information just intensifies it.

There are other examples where one might have expected advances in information technology to have had a significant macroeconomic impact but which in fact have not. Financial integration is one: in the 1990s and early 2000s, then Federal Reserve Board chair Alan Greenspan assured us that in a world of instantaneous financial transactions and global credit markets, systemic risk could not rise because everyone is connected to everyone. We all saw how that turned out. Similarly, ATMs were supposed to have decreased the economy’s structural demand for money, which means that unless the money supply also shrank dramatically there would be high inflation. Nope. Exactly why supposedly world-changing technologies don’t actually change the world is a difficult question, but that critique is not one that pertains to search-and-matching labor market models specifically.

Quiggin also argues that the big question in labor market macroeconomics—essentially, why is labor demand low and why does it stay low—can only be answered by macroeconomic models. He asserts that search-and-matching models don’t add any insight. Let me offer an alternative schematization. There are two big questions in business cycle macroeconomics. Why do recessions happen? And why do they look the way they look, with high, persistent unemployment and a cascade of other symptoms of illness in the labor market?

Search-and-matching models don’t do anything on the first question; one has to assume the recession into the story. But the model goes a long way toward answering the second one if you allow for factual wage-bargaining and other modifications. In short, the model is a great tool to have in the economist’s toolbox, provided it’s used skillfully and with attention to the data, first and foremost.