Previewing tomorrow’s annual income and poverty report

Dollar bills in New York.

The U.S. Census Bureau will release its annual report, “Income and Poverty in the United States,” on Tuesday this week. The report summarizes results from the March edition of the Current Population Survey and presents a wealth of information on the income of individuals and families in the United States. The report is also notable because it contains the Census Bureau’s official estimate of inequality in the United States, one of the few such estimates that the federal government publishes. Unfortunately, the Census Bureau estimate suffers from several flaws and highlights the urgent need for a better official measurement of inequality.

Figure 1 shows the historical trend line of the Census Bureau’s estimate of the percent of total income held by the top 5 percent of the population. The trend jumped in the early 1990s but has been relatively steady since then and currently shows the top 5 percent earning about 22 percent of all income. But this is a severe underestimate. Figure 1 also shows the estimates made in the Distributional National Accounts dataset constructed by the economists Thomas Piketty, Emmanuel Saez, and Gabriel Zucman. Their work, which incorporates far more accurate tax data and makes several other improvements on the Census estimate, shows that the income share of the top 5 percent has risen steadily and is now about 36 percent of all income.

Figure 1

It’s not only that official estimates get the overall level of inequality wrong. The trend of the Census estimates is also misleading, showing little change over the past two decades. As policymakers start to consider ways to reduce inequality in the United States, accurate measurement of the phenomenon is more important than ever. We can’t evaluate the efficacy of anti-inequality policy if our official measurements of inequality don’t reflect the correct level or trend.

The Distributional National Accounts estimates were constructed entirely from datasets produced by the U.S. government, so better official measures are possible. Poor interagency data sharing is largely responsible for the current state of affairs: the U.S. Code bars most agencies from handling tax data, so agencies with an interest in tracking the distribution of income and wealth are unable to do so. Congress should consider revising these restrictions. The Commission on Evidence-Based Policymaking, a bipartisan effort by Congress, suggested exactly this in its final report. Now Congress must act.

Must-Read: Pedro Nicolaci da Costa: Fed may pause rate hikes if inflation weakness persists

Must-Read: Every year since 2007 the Fed has been too optimistic, forecasting that its interest rates would be higher than turned out to be the case. Every single year. 2^-11 = 1/2048:

Pedro Nicolaci da Costa: Fed may pause rate hikes if inflation weakness persists: “The Federal Reserve is embarking on an annual summer ritual: Downgrading its overly optimistic forecasts for economic growth… http://www.businessinsider.com/fed-may-pause-rate-hikes-if-inflation-weakness-persists-2017-7

…Janet Yellen’s testimony to Congress this week… acknowledging that a recent decline in inflation further below the central bank’s 2% target may not, in fact, be as fleeting as policymakers had hoped…. The latest figures are clearly heading in the wrong direction. Consumer prices held flat in June despite expectations for a 0.1% increase and the annual rate, which the Fed watches closely, registered just 1.6%. The Fed’s preferred measure of inflation, the personal consumption expenditures index, has also been slipping…. Not one but two regional Fed banks have just downgraded their growth estimates….

The Fed has frequently been overly optimistic about its predictions for rate hikes in the post-recession era. This rather stunning chart from Deutsche Bank’s Torsten Slok is rather instructive:

Fed may pause rate hikes if inflation weakness persists Business Insider
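The 2^-11 aside above is simple binomial arithmetic: if each year’s forecast were equally likely to err high or low, eleven straight over-forecasts would occur by chance about one time in two thousand. A minimal check:

```python
from fractions import Fraction

# Under a "no systematic bias" null, each year's forecast error is a fair
# coin flip; eleven consecutive over-optimistic forecasts (2007-2017) then
# have probability (1/2)^11.
p = Fraction(1, 2) ** 11
print(p)  # 1/2048
```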

Should-Read: David Glasner: Milton Friedman Says that the Rate of Interest Is NOT the Price of Money: Don’t Listen to Him!

Should-Read: The price level is the (inverse) price of symbols of purchasing power in terms of an index of useful commodities. The nominal interest rate is the price of liquidity services. The real interest rate is the slope of the intertemporal price system for useful commodities.

Is that clear?
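One standard piece of bookkeeping that keeps these three prices distinct is the Fisher relation (a textbook identity, not Glasner’s formulation): with nominal rate $i$, real rate $r$, and expected inflation $\pi^e$,

```latex
1 + r = \frac{1 + i}{1 + \pi^e}
\qquad\Longrightarrow\qquad
r \approx i - \pi^e \quad \text{for small } i,\ \pi^e .
```

The nominal rate prices liquidity in terms of money; subtracting expected inflation recovers the slope of the intertemporal price system for commodities.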

David Glasner: Milton Friedman Says that the Rate of Interest Is NOT the Price of Money: Don’t Listen to Him!: “Friedman’s repeated claims that the rate of interest is not the price of money… https://uneasymoney.com/2017/09/07/milton-friedman-says-that-the-rate-of-interest-is-not-the-price-of-money-dont-listen-to-him/

…have been echoed by his many acolytes so often that it is evidently now taken as clear evidence of economic illiteracy (or “a freshman error,” as Patrick Sullivan describes it) to suggest that the rate of interest is the price of money. It was good of Sullivan to provide an exact reference to this statement of Friedman, not that similar references are hard to find, Friedman never having been one who was loathe to repeat himself. He did so often, and not without eloquence. Even though I usually quote Friedman to criticize him, I would never dream of questioning his brilliance or his skill as an economic analyst, but he was a much better price theorist than a monetary theorist, and he was a tad too self-confident, which made him disinclined to be self-critical or to admit error, or even entertain such a remote possibility…

Must-Read: Amanda Bayer and Cecilia Elena Rouse: Diversity in the Economics Profession: A New Attack on an Old Problem

Must-Read: Amanda Bayer and Cecilia Elena Rouse: Diversity in the Economics Profession: A New Attack on an Old Problem: “The economics profession includes disproportionately few women and members of historically underrepresented racial and ethnic minority groups… https://www.aeaweb.org/articles?id=10.1257/jep.30.4.221

…This underrepresentation… is present at the undergraduate level, continues into the ranks of the academy, and is barely improving over time. It likely hampers the discipline, constraining the range of issues addressed and limiting our collective ability to understand familiar issues from new and innovative perspectives…. We… offer an overview of… research on the reasons for the underrepresentation…. We argue that implicit attitudes and institutional practices may be contributing to the underrepresentation of women and minorities at all stages of the pipeline, calling for new types of research and initiatives to attack the problem. We then review evidence on how diversity affects productivity and propose remedial interventions as well as findings on effectiveness…

Must-Read: Frank Pasquale (2011): Economic Policy for the Worried Wealthy

Must-Read: Frank Pasquale (2011): Economic Policy for the Worried Wealthy: “Why is the austerity movement so powerful in the US?… https://concurringopinions.com/archives/2011/04/economic-policy-for-the-worried-wealthy.html

…Why not expect a little more from the wealthy? Why are states from Arizona to New York going after poor Medicaid patients and schools instead? We know the economic case for austerity in a deep recession is bunk. Why its enduring appeal?… The wealthy in the US may have extraordinary influence over the political process, but they could use it in many different ways. Warren Buffett complained about being taxed less than his secretary, and Bill Gates’s father has fought for the estate tax…. At some point the marginal value of money diminishes; why not spread it around a bit?…

Consider the research of BU sociologists on the lives of “people with fortunes in excess of $25 million.” It is featured at The Atlantic, which, in between ads for Goldman Sachs and planning for its Aspen Ideas festival, earlier this year informed us that “we need a creative, dynamic super-elite more than ever.” But that super-elite is worried:

They are frequently dissatisfied even with their sizable fortunes. Most of them still do not consider themselves financially secure; for that, they say, they would require on average one-quarter more wealth than they currently possess. (Remember: this is a population with assets in the tens of millions of dollars and above.) One respondent, the heir to an enormous fortune, says that what matters most to him is his Christianity, and that his greatest aspiration is “to love the Lord, my family, and my friends.” He also reports that he wouldn’t feel financially secure until he had $1 billion in the bank….

The mental agony of being so close to the billionaire’s circle, but just not quite there, must be enormous. Those in the upper echelons of industry may well be looking at the 400 persons who own more wealth than half the country, and say, “why not me?”… The “fear of falling” Barbara Ehrenreich diagnosed among middle class households in the 1990s has infected the very wealthy today…. Many millionaires will need to pay a lot less taxes to join the ultra-high-net worth crowd, which in turn envies a finance elite. And why not—given the extremely low tax rates on “carried interest,” hedge funders are getting quite a deal…. The “worried wealthy” can choose to create a more equal society, where no one could fall too far, or to build a fortress of personal wealth designed to keep generations secure from the vicissitudes of market driven “creative destruction”…

Should-Read: Noah Smith: Realism in macroeconomic modeling

Should-Read: What Noah Smith does not say is: this is a horrible research program.

Taking your residual, putting it on the right-hand side, and calling it a “productivity” shock may allow you to fit some things, but it doesn’t allow you to explain or understand. And there is no theory, and no interest in developing any theory, of where these “productivity shocks” come from.

Compare this to medieval Ptolemaic astronomy—well, Judah al-Barceloni had a theory that it was Sammael, the Angel of Mars, who guided the planet around its epicycles.

Ljungqvist and Sargent haven’t even reached that level of Popperian potential falsification: their productivity shocks do not emerge from any economic process whereby businesses learn and forget production technologies, but simply crystallize out of the air:

Noah Smith: Realism in macroeconomic modeling: “Ljungqvist and Sargent are trying to solve the Shimer Puzzle… http://noahpinionblog.blogspot.com/2017/09/realism-in-macroeconomic-modeling.html

…the fact that in classic labor search models of the business cycle, productivity shocks aren’t big enough to generate the kind of employment fluctuations we see in actual business cycles. A number of theorists have proposed resolutions to this puzzle, i.e., ways to get realistic-sized productivity shocks to generate realistic-sized unemployment cycles. Ljungqvist and Sargent look at these and realize that they’re basically all doing the same thing: reducing the value of a job match to the employer, so that small productivity shocks are more easily able to stop the matches from happening…”

Must- and Should-Reads: September 8, 2017



Should-Read: Paul Krugman: The Political Failure of Trickle-Down Economics

Should-Read: I am not as confident as Paul that his numbers here are the right ones to look at: I do not know how much wealth accumulation escapes measurement in the income stream, but I suspect it is a lot more than escaped back in the 1970s:

Paul Krugman: The Political Failure of Trickle-Down Economics: “We tend to think of the period since Reagan’s election as a conservative era… https://krugman.blogs.nytimes.com/2017/08/20/the-political-failure-of-trickle-down-economics/

…lots of centrist Dems willing to cooperate with R agendas… at the end of the Obama years taxation of the rich was pretty much back where it was pre-Reagan…. A welfare state supported by progressive taxation has been much more robust than the year-by-year political narrative might lead you to think.

But in that case, why the incredible surge in inequality?…. There is, I think, a good case to be made that things like the collapse of unions and financial deregulation mattered a lot more than the taxing and spending issues we spend so much time talking about…

New report on evidence-based policymaking boasts recommendations that Congress must take seriously

The Commission on Evidence-Based Policymaking was formed in response to a bill sponsored by Speaker of the House Paul Ryan and Senator Patty Murray. (AP Photo/Scott Applewhite, File)

The bipartisan, congressionally mandated Commission on Evidence-Based Policymaking released its final report today, advocating for a number of sound changes to the way the federal government collects, manages, and makes use of federal data. The Washington Center for Equitable Growth, a grant-giving organization that works closely with academic economists to expand our understanding of inequality in the economy, knows firsthand the challenges posed by current federal data-stewardship practices and applauds the Commission for making a number of smart recommendations for modernizing this infrastructure.

The work of the Commission is complete, and it is now incumbent on Congress and the Trump administration to implement its recommendations. We address some of the Commission’s recommendations below, but we must emphasize that without congressional action, the Commission’s report will do nothing. Unfortunately, Congress has not been kind to statistical agencies in 2017, raising the question of whether there is the political will to provide the resources that the Commission’s plan will require.

Will Congress provide the necessary funding?

The Commission’s report does not address funding levels for existing statistical agencies, but funding for these agencies is not a luxury: it is critical to the functioning of a modern government. The Commission was told time and again during hearings that funding for agencies is too low and that data quality is at risk. As we have highlighted before, the House of Representatives is currently on track to cut budgets for important statistical agencies. If Speaker of the House Paul Ryan (R-WI) truly believes in the importance of the Commission’s work, then his first priority should be to reverse these cuts.

Despite presenting himself as a strong champion of using data in the governmental process, Speaker Ryan has given little indication that he is willing to pay for such efforts. In a 2014 policy document that first raised the possibility of the Commission, the Speaker proposed a clearinghouse for federal program and survey data. In that document he suggested that such a clearinghouse should be funded by user fees to keep it revenue neutral. Prioritizing revenue-neutral funding mechanisms in an early conceptual document is another discouraging sign that the Speaker may be unwilling to make the investment necessary to turn the Commission’s recommendations into reality.

Statistical agency budgets are measured in millions of dollars, a drop in the bucket in terms of annual government spending. To meet the Trump administration’s fiscal year 2017 budget target, the U.S. Bureau of Economic Analysis is proposing to halt programs to track the impact of small businesses, collect better data on trade, and measure health care more accurately for incorporation into quarterly gross domestic product calculations. The savings from cutting these three programs, which would help us understand regional variations in our economy and improve economic decisionmaking, are a mere $10 million. By way of comparison, cutting the top tax rate and reducing the number of tax brackets, part of the House Republican tax plan that Speaker Ryan endorses, would cost $94 billion next year and $1.4 trillion over the next decade.

Increasing access to administrative data is critical for modern governance

Administrative data—data collected in the regular course of a federal agency performing its designated function—has already revolutionized our understanding of several economic phenomena. Most notably, the use of tax data has allowed economists to study income inequality at the top 1 percent of the income distribution and show that this group of earners is taking a much larger share of total income in the economy than they did 30 years ago. One of the first researchers to use tax data to study incomes noted that for the economics profession, “the economic lives of the rich, especially the rich who are not famous, are something of a mystery.”[1: Feenberg, Daniel R. and James M. Poterba, “Income Inequality and the Incomes of Very High-Income Taxpayers: Evidence from Tax Returns.” In Tax Policy and the Economy, edited by James M. Poterba. MIT Press. 2003. ] That has changed: The New York Times recently published a chart created by academic researchers that showed massive income growth in the top 0.001 percent of all earners, all thanks to the availability of tax data to researchers. These data continue to be hard to obtain, even for researchers in other sections of the federal government. If policymakers want to identify and address modern economic challenges, data such as these need to be more widely available to researchers.

The Commission proposes a new agency, the National Secure Data Service, to provide data anonymization and linkage as a service to researchers. It would not be a data warehouse, as is sometimes proposed, but would instead be an intermediary between researchers and federal agencies to facilitate access to data. This is a reasonable approach to the problem of data access. It means that existing agencies can continue to store data as they have while this new agency would concentrate on developing the methodological capacity to evaluate projects, assess privacy concerns, and merge survey and administrative data.
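To make the intermediary model concrete, here is a hypothetical sketch of keyed-hash record linkage, one common building block for this kind of service. The agency records, the shared key, and the `link_token` helper are all invented for illustration; the Commission’s report does not specify an implementation:

```python
import hmac
import hashlib

# Illustrative placeholder: a secret shared by participating agencies,
# never given to outside researchers.
SHARED_KEY = b"rotated-secret-known-only-to-agencies"

def link_token(ssn: str) -> str:
    """Keyed hash of an identifier; the raw SSN never leaves the agency."""
    return hmac.new(SHARED_KEY, ssn.encode(), hashlib.sha256).hexdigest()

# Two agencies tokenize their own records independently...
census_records = {link_token("123-45-6789"): {"income_survey": 41_000}}
irs_records = {link_token("123-45-6789"): {"agi": 43_500}}

# ...and the intermediary joins on tokens it cannot reverse, without ever
# holding names, addresses, or Social Security numbers.
linked = {tok: {**census_records[tok], **irs_records[tok]}
          for tok in census_records.keys() & irs_records.keys()}
print(linked)
```

The design point the Commission emphasizes survives even in this toy: the linking service needs matching keys, not a central warehouse of everyone’s raw identifiers.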

The Commission’s report also calls attention to an under-appreciated challenge: often even federal researchers cannot obtain data generated in another department. This prevents the Bureau of Economic Analysis, for example, from accessing individual tax data, which could be used to improve some of its current statistical processes. The Commission suggests revisiting the parts of the U.S. Code that erect these barriers between agencies.

Old sources of data shouldn’t fall by the wayside

As the Girl Scouts say, “Make new friends, but keep the old.” Administrative data is new to the scene and much in demand among researchers, but for decades policymakers, academic economists, and pundits have relied on economic surveys such as the Current Population Survey. The Commission clearly understands the value of these surveys and notes that they now suffer from decreased participation and from respondents’ reluctance to answer particular questions. It may be tempting to see administrative data as a wholesale replacement for these older tools. That would be a mistake.

First, these surveys capture some dynamics that administrative data does not. The Current Population Survey, for example, tells us about the income of low-income Americans who are not required to file a tax return because they owe no taxes. Researchers frequently merge those survey data with tax data to obtain a complete universe of individuals.
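A toy sketch of that merge, using invented records in place of the real survey and tax microdata: an outer join keyed on a person identifier keeps the non-filers that tax records alone would drop.

```python
# Toy illustration: survey microdata covers low-income non-filers that tax
# records miss; merging the two yields a more complete universe of persons.
survey = {1: {"survey_income": 8_000},   # non-filer: appears only in the survey
          2: {"survey_income": 15_000},
          3: {"survey_income": 60_000}}
tax = {2: {"agi": 14_500},
       3: {"agi": 62_000}}

# Outer merge keyed on person id: keep everyone from either source.
universe = {pid: {**survey.get(pid, {}), **tax.get(pid, {})}
            for pid in survey.keys() | tax.keys()}
non_filers = [pid for pid in universe if pid not in tax]
print(len(universe), non_filers)  # 3 [1]
```

Dropping to an inner join here would silently delete person 1, which is exactly the bias the survey data exists to prevent.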

More importantly, however, survey data comes with far fewer privacy concerns than administrative data, making it possible for the government to freely distribute the raw data. This in turn means that analysis is not limited to federal employees or researchers at elite universities. Journalists, bloggers, policy analysts, and casual enthusiasts all have access to the full data set. This truly democratizes the data and enriches the discourse over economics by incorporating a diverse set of voices.

Balance privacy and access

Per its congressional mandate, the Commission also engaged at length with the issue of privacy in data. Administrative data raises new privacy concerns, and it is reasonable to approach this issue with caution. Researchers who work with administrative data generally receive data from which obvious identifiers, such as names, birth dates, and addresses, have been removed. It may still be possible, however, to identify individuals in the dataset by looking at the data. There are many ways that agencies can approach this problem, and recent advances promise new possibilities. Some agencies, for example, are researching the creation of synthetic data sets that use generated data but retain the statistical properties of the original data set.
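A deliberately simplified sketch of the synthetic-data idea: fit a model to the confidential records, then release draws from the model rather than the records themselves. The one-variable Gaussian fit below is a cartoon of the approach; real synthetic-data systems model the joint distribution of many variables far more carefully.

```python
import random
import statistics

# Stand-in for confidential microdata (invented for illustration).
random.seed(0)
confidential = [random.gauss(50_000, 12_000) for _ in range(10_000)]

# "Fit" the release model: here, just the sample mean and standard deviation.
mu = statistics.fmean(confidential)
sigma = statistics.stdev(confidential)

# The released file contains no actual record, but draws from a model that
# matches the fitted moments, so simple analyses come out about the same.
synthetic = [random.gauss(mu, sigma) for _ in range(10_000)]
print(round(statistics.fmean(synthetic)))  # close to the confidential mean
```

The trade-off is exactly the one the Commission flags: the richer the model, the more useful the synthetic file, and the more carefully its disclosure risk must be assessed.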

While Equitable Growth agrees that privacy is important, it should be balanced against the benefits of access for researchers. The Commission notes this tension as well: “It is equally important, however, to calibrate the need for privacy with the public good that research findings based on such data can provide.” At Commission meetings, presenters in charge of sensitive datasets were frequently asked whether they had experienced data breaches. On each occasion, the answer was no (although a few reported minor rules violations). It appears that existing safeguards and those used by state and foreign entities are sufficient to the task of maintaining privacy, so further restrictions on dataset use should be approached with care.

Should-Read: Sam Bowles and Wendy Carlin: A new paradigm for the introductory course in economics

Should-Read: Sam Bowles and Wendy Carlin: A new paradigm for the introductory course in economics: “The contributions of Keynes, Hayek, and Nash–aggregate demand, the central economic role of limited information, and strategic interactions modelled by game theory… http://voxeu.org/article/new-paradigm-introductory-course-economics

…have become foundations of economic thinking. Before the end of the 20th century, all three innovations had become standard postgraduate economic instruction…. Things are radically different at the undergraduate level. The Samuelsonian paradigm is basically Marshall plus Keynes…. Asymmetric and local information, and strategic social interactions modelled by game theory are mentioned, if at all, at the end of the introductory course…. Understandably, students think information problems and strategic interaction are simply refinements of the standard model, rather than challenges to two of its foundations–price-taking as the benchmark for competitive behaviour, and complete contracts (and hence market clearing in competitive equilibrium) made possible by complete information. CORE’s introductory text, The Economy, attempts to do for information economics and strategic social interaction what Samuelson did for aggregate demand…. Likewise, behavioural experiments and research on human cognitive capacities call for a more empirically grounded conception of human behaviour….

We replaced the passive price-taker of perfectly competitive equilibrium with the ‘perfect competitor’ (Makowski and Ostroy 2001). This active competitor exploits available (but incomplete) information to appropriate any possible rents that may exist when an economy is not in equilibrium, under some conditions driving the dynamic process to a Pareto-efficient equilibrium, even when there are impediments to competition. The new paradigm not only provides a more convincing story about how an economy might reach a competitive equilibrium, it also fundamentally alters the nature of that outcome. When lenders and borrowers, and employers and employees, are modelled as principals and agents with asymmetric information, who interact under an incomplete contract, credit and labour markets do not clear in competitive equilibrium (Stiglitz 1987). Introducing students to quantity constraints at the outset eliminates the need for ad hoc assumptions to explain the credit constraints underpinning the Keynesian multiplier, persistent unemployment, and other macroeconomic phenomena. Students also get an empirically based perspective on how markets work when reputation, personal loyalty and social norms play an essential role (Brown et al. 2004)….

CORE’s problem-based approach to teaching concepts and models narrows the gap between what the students get and how we do economics in another way as well. Economics has become an increasingly empirical subject…. The data appear… in the CORE…as the basis for defining real economic problems that models should be capable of illuminating…