Should-Read: Noah Smith: Realism in macroeconomic modeling

Should-Read: What Noah Smith does not say is that this is a horrible research program.

Taking your residual, putting it on the right-hand side, and calling it a “productivity” shock may allow you to fit some things, but it doesn’t allow you to explain or _understand_. And there is no theory and no interest in developing any theory of where these “productivity shocks” come from.
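To see concretely what “taking your residual and putting it on the right-hand side” means, here is the standard real-business-cycle bookkeeping (a sketch of the generic formulation, not of any one paper). With a Cobb-Douglas aggregate production function

$$Y_t = A_t K_t^{\alpha} L_t^{1-\alpha},$$

measured “technology” is simply whatever output is left unexplained by capital and labor:

$$\log A_t = \log Y_t - \alpha \log K_t - (1-\alpha)\log L_t.$$

The model then feeds this residual back in as an exogenous driving process, typically an AR(1),

$$\log A_{t+1} = \rho \log A_t + \varepsilon_{t+1},$$

with the innovations $\varepsilon_{t+1}$ arriving from nowhere in particular, which is exactly the complaint.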

Compare this to medieval Ptolemaic astronomy—well, Judah al-Barceloni had a theory that it was Sammael, the Angel of Mars, who guided the planet around its epicycles.

Ljungqvist and Sargent haven’t even reached that level of Popperian potential falsification: their productivity shocks do not emerge from any economic process whereby businesses learn and forget production technologies; they simply crystallize out of the air:

Noah Smith: Realism in macroeconomic modeling: “Ljungqvist and Sargent are trying to solve the Shimer Puzzle… http://noahpinionblog.blogspot.com/2017/09/realism-in-macroeconomic-modeling.html

…the fact that in classic labor search models of the business cycle, productivity shocks aren’t big enough to generate the kind of employment fluctuations we see in actual business cycles. A number of theorists have proposed resolutions to this puzzle, i.e., ways to get realistic-sized productivity shocks to generate realistic-sized unemployment cycles. Ljungqvist and Sargent look at these and realize that they’re basically all doing the same thing: reducing the value of a job match to the employer, so that small productivity shocks are more easily able to stop the matches from happening…
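A stripped-down version of that mechanism, under the simplifying assumption that the wage is pinned at the worker’s outside option $z$ (a sketch of the generic amplification logic, not Ljungqvist and Sargent’s actual model): free entry of vacancies equates the cost of posting a vacancy to the expected value of a filled job,

$$\frac{c}{q(\theta)} = \frac{p - z}{r + s},$$

where $p$ is productivity, $\theta$ is market tightness, $q(\theta) = \theta^{-\eta}$ is the vacancy-filling rate, $c$ the posting cost, $r$ the discount rate, and $s$ the separation rate. Solving for tightness,

$$\theta = \left[\frac{p - z}{c(r + s)}\right]^{1/\eta}, \qquad \frac{\partial \log \theta}{\partial \log p} = \frac{1}{\eta}\cdot\frac{p}{p - z}.$$

The smaller the employer’s surplus $p - z$, the larger the elasticity: shrinking the value of a job match is what lets small productivity shocks produce large swings in hiring and unemployment.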


Should-Read: Paul Krugman: The Political Failure of Trickle-Down Economics

Should-Read: I am not as confident as Paul that his numbers here are the right ones to look at: I do not know how much wealth accumulation escapes income-stream measurement, but I suspect it is a lot more than escaped back in the 1970s:

Paul Krugman: The Political Failure of Trickle-Down Economics: “We tend to think of the period since Reagan’s election as a conservative era… https://krugman.blogs.nytimes.com/2017/08/20/the-political-failure-of-trickle-down-economics/

…lots of centrist Dems willing to cooperate with R agendas… at the end of the Obama years taxation of the rich was pretty much back where it was pre-Reagan…. A welfare state supported by progressive taxation has been much more robust than the year-by-year political narrative might lead you to think.

But in that case, why the incredible surge in inequality?…. There is, I think, a good case to be made that things like the collapse of unions and financial deregulation mattered a lot more than the taxing and spending issues we spend so much time talking about…

New report on evidence-based policymaking boasts recommendations that Congress must take seriously

[Photo caption: The Commission on Evidence-Based Policymaking was formed in response to a bill sponsored by Speaker of the House Paul Ryan and Senator Patty Murray. (AP Photo/Scott Applewhite, File)]

The bipartisan, congressionally mandated Commission on Evidence-Based Policymaking released its final report today, advocating for a number of sound changes to the way the federal government collects, manages, and makes use of federal data. The Washington Center for Equitable Growth, a grant-giving organization that works closely with academic economists to expand our understanding of inequality in the economy, knows firsthand the challenges posed by current federal data-stewardship practices and applauds the Commission for making a number of smart recommendations for modernizing this infrastructure.

The work of the Commission is complete, and it is now incumbent on Congress and the Trump administration to implement its recommendations. We address some of the Commission’s recommendations below, but we must emphasize that without congressional action, the Commission’s report will accomplish nothing. Unfortunately, Congress has not been kind to statistical agencies in 2017, raising the question of whether there is the political will to provide the resources that the Commission’s plan will require.

Will Congress provide the necessary funding?

The Commission’s report does not address funding levels for existing statistical agencies, but funding for these agencies is not a luxury—it is critical to the functioning of a modern government. The Commission was told time and again during hearings that funding for agencies is too low and that data quality is at risk. As we have highlighted before, the House of Representatives is currently on track to cut budgets for important statistical agencies. If Speaker of the House Paul Ryan (R-WI) truly believes in the importance of this Commission’s work, then his first priority should be to reverse these cuts.

Despite presenting himself as a strong champion of using data in the policymaking process, Speaker Ryan has given little indication that he is willing to pay for such efforts. In the 2014 policy document that first raised the possibility of the Commission, the Speaker proposed a clearinghouse for federal program and survey data and suggested that it be funded by user fees to keep it revenue neutral. Prioritizing revenue-neutral funding mechanisms in an early conceptual document is another discouraging sign that the Speaker may be unwilling to make the investment necessary to turn the Commission’s recommendations into reality.

Statistical agency budgets are measured in millions of dollars, a drop in the bucket in terms of annual government spending. To meet the Trump administration’s fiscal year 2017 budget target, the U.S. Bureau of Economic Analysis is proposing to halt programs to track the impact of small businesses, collect better data on trade, and measure health care more accurately for incorporation into quarterly gross domestic product calculations. The savings from cutting these three programs, which would help us understand regional variations in our economy and improve economic decisionmaking, amount to a mere $10 million. By way of comparison, cutting the top tax rate and reducing the number of tax brackets (part of the House Republican tax plan that Speaker Ryan endorses) would cost $94 billion next year and $1.4 trillion over the next decade.
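To put those two numbers on the same scale (the figures are the ones above; the division is ours):

$$\frac{\$94\ \text{billion}}{\$10\ \text{million}} = 9{,}400.$$

The first year of the rate cuts alone would cost roughly 9,400 times what cutting the three statistical programs saves.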

Increasing access to administrative data is critical for modern governance

Administrative data—data collected in the regular course of a federal agency performing its designated function—has already revolutionized our understanding of several economic phenomena. Most notably, the use of tax data has allowed economists to study income inequality at the top 1 percent of the income distribution and show that this group of earners is taking a much larger share of total income in the economy than they did 30 years ago. One of the first researchers to use tax data to study incomes noted that for the economics profession, “the economic lives of the rich, especially the rich who are not famous, are something of a mystery.”[1: Feenberg, Daniel R. and James M. Poterba, “Income Inequality and The Incomes of Very High-Income Taxpayers: Evidence from Tax Returns.” In Tax Policy and The Economy, edited by James M. Poterba. MIT Press. 2003. ] That has changed: The New York Times recently published a chart created by academic researchers that showed massive income growth in the top 0.001 percent of all earners, all thanks to the availability of tax data to researchers. These data continue to be hard to obtain, even for researchers in other sections of the federal government. If policymakers want to identify and address modern economic challenges, data such as these need to be more widely available to researchers.

The Commission proposes a new agency, the National Secure Data Service, to provide data anonymization and linkage as a service to researchers. It would not be a data warehouse, as is sometimes proposed, but would instead be an intermediary between researchers and federal agencies, facilitating access to data. This is a reasonable approach to the problem of data access: existing agencies can continue to store data as they do now, while the new agency concentrates on developing the methodological capacity to evaluate projects, assess privacy concerns, and merge survey and administrative data.

The Commission’s report also calls attention to an under-appreciated challenge: often even federal researchers cannot obtain data generated in another department. This prevents the Bureau of Economic Analysis, for example, from accessing individual tax data that could be used to improve some of its current statistical processes. The Commission suggests revisiting the parts of the U.S. Code that erect these barriers between agencies.

Old sources of data shouldn’t fall by the wayside

As the Girl Scouts say, “make new friends, but keep the old.” Administrative data is new to the scene and much in demand among researchers, but for decades policymakers, academic economists, and pundits have relied on economic surveys such as the Current Population Survey. The Commission clearly understands the value of these surveys and notes that they now suffer from decreased participation and respondents’ reluctance to answer particular questions. It may be tempting to see administrative data as a wholesale replacement for these older tools. That would be a mistake.

First, these surveys capture some dynamics that administrative data does not. The Current Population Survey, for example, tells us about the income of low-income Americans who are not required to file a tax return because they owe no taxes. Researchers frequently merge survey data with tax data to obtain a complete universe of individuals, as in the sketch below.
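In practice the linkage looks something like the following sketch. The file names, column names, and the protected identifier (“pik”) are hypothetical stand-ins; real linkages happen inside secure federal environments under far stricter protocols.

```python
import pandas as pd

# Hypothetical extracts: a survey file that covers non-filers and a tax file
# that covers filers, both carrying a protected linkage key ("pik").
cps = pd.read_csv("cps_extract.csv")   # columns: pik, survey_income, ...
tax = pd.read_csv("tax_records.csv")   # columns: pik, agi, ...

# An outer merge keeps people who appear in only one source: for example,
# low-income households with no filing requirement show up in the survey
# but not in the tax records.
universe = cps.merge(tax, on="pik", how="outer", indicator=True)

# The indicator column shows which source(s) each record came from.
print(universe["_merge"].value_counts())
```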

More importantly, however, survey data comes with far fewer privacy concerns than administrative data, making it possible for the government to distribute the raw data freely. This in turn means that analysis is not limited to federal employees or researchers at elite universities: journalists, bloggers, policy analysts, and casual enthusiasts all have access to the full data set. This truly democratizes the data and enriches the economic discourse by incorporating a diverse set of voices.

Balance privacy and access

Per its congressional mandate, the Commission also engaged at length with the issue of privacy. Administrative data raises new privacy concerns, and it is reasonable to approach the issue with caution. Researchers who work with administrative data generally receive files from which obvious identifiers, such as names, birth dates, and addresses, have been removed. It may still be possible, however, to identify individuals by looking closely at the remaining data. There are many ways agencies can approach this problem, and recent advances promise new possibilities. Some agencies, for example, are researching the creation of synthetic data sets, which consist of generated records that retain the statistical properties of the original data.
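To make the synthetic-data idea concrete, here is a toy sketch: estimate the joint distribution of a confidential file, then sample brand-new records from the estimate. Everything here is invented for illustration; production systems use far more sophisticated models plus formal disclosure review.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a confidential microdata file: three correlated numeric
# columns, loosely "income, age, weekly hours."
n = 10_000
true_mu = np.array([50_000.0, 40.0, 35.0])
true_cov = np.array([[4e8,   2e4,  1e4],
                     [2e4, 100.0, 20.0],
                     [1e4,  20.0, 25.0]])
confidential = rng.multivariate_normal(true_mu, true_cov, size=n)

# "Synthesize": fit the joint distribution, then draw fresh records from it.
mu_hat = confidential.mean(axis=0)
cov_hat = np.cov(confidential, rowvar=False)
synthetic = rng.multivariate_normal(mu_hat, cov_hat, size=n)

# The synthetic file preserves the statistical properties (means,
# covariances) even though no row corresponds to an actual person
# in the original file.
print(np.allclose(synthetic.mean(axis=0), mu_hat, rtol=0.05))  # True
```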

While Equitable Growth agrees that privacy is important, it should be balanced against the benefits of access for researchers. The Commission notes this tension as well: “It is equally important, however, to calibrate the need for privacy with the public good that research findings based on such data can provide.” At Commission meetings, presenters in charge of sensitive datasets were frequently asked whether they had experienced data breaches. On each occasion, the answer was no (although a few reported minor rules violations). It appears that existing safeguards, and those used by state and foreign entities, are sufficient to the task of maintaining privacy, so further restrictions on dataset use should be approached with care.

Should-Read: Sam Bowles and Wendy Carlin: A new paradigm for the introductory course in economics

Should-Read: Sam Bowles and Wendy Carlin: A new paradigm for the introductory course in economics: “The contributions of Keynes, Hayek, and Nash–aggregate demand, the central economic role of limited information, and strategic interactions modelled by game theory… http://voxeu.org/article/new-paradigm-introductory-course-economics

…have become foundations of economic thinking. Before the end of the 20th century, all three innovations had become standard postgraduate economic instruction…. Things are radically different at the undergraduate level. The Samuelsonian paradigm is basically Marshall plus Keynes…. Asymmetric and local information, and strategic social interactions modelled by game theory are mentioned, if at all, at the end of the introductory course…. Understandably, students think information problems and strategic interaction are simply refinements of the standard model, rather than challenges to two of its foundations–price-taking as the benchmark for competitive behaviour, and complete contracts (and hence market clearing in competitive equilibrium) made possible by complete information. CORE’s introductory text, The Economy, attempts to do for information economics and strategic social interaction what Samuelson did for aggregate demand…. Likewise, behavioural experiments and research on human cognitive capacities call for a more empirically grounded conception of human behaviour….

We replaced the passive price-taker of perfectly competitive equilibrium with the ‘perfect competitor’ (Makowski and Ostroy 2001). This active competitor exploits available (but incomplete) information to appropriate any possible rents that may exist when an economy is not in equilibrium, under some conditions driving the dynamic process to a Pareto-efficient equilibrium, even when there are impediments to competition. The new paradigm not only provides a more convincing story about how an economy might reach a competitive equilibrium, it also fundamentally alters the nature of that outcome. When lenders and borrowers, and employers and employees, are modelled as principals and agents with asymmetric information, who interact under an incomplete contract, credit and labour markets do not clear in competitive equilibrium (Stiglitz 1987). Introducing students to quantity constraints at the outset eliminates the need for ad hoc assumptions to explain the credit constraints underpinning the Keynesian multiplier, persistent unemployment, and other macroeconomic phenomena. Students also get an empirically based perspective on how markets work when reputation, personal loyalty and social norms play an essential role (Brown et al. 2004)….

CORE’s problem-based approach to teaching concepts and models narrows the gap between what the students get and how we do economics in another way as well. Economics has become an increasingly empirical subject…. The data appear… in the CORE… as the basis for defining real economic problems that models should be capable of illuminating…

Must-Read: German Gutierrez and Thomas Philippon: Investment-less growth: An empirical investigation

Must-Read: German Gutierrez and Thomas Philippon: Investment-less growth: An empirical investigation: “Business investment remains low despite high profits, low funding costs, and high asset values… https://www.brookings.edu/bpea-articles/investment-less-growth-an-empirical-investigation/

…which economists refer to as “high Tobin’s Q.”… The authors test eight alternative theories…. About a third of the decline in measured investment can be explained by a shift towards intangible assets. Among the remaining theories… only decreased competition (if firms don’t need to invest in new equipment or strategies to stay competitive, why would they?), tightened governance and, potentially, increased short-termist pressures help explain the phenomenon…. Changes in competition and governance seem partly explained by changes in policy (regulations, anti-trust) and in asset management (activism, indexing, and pressure for shares buyback programs)…
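For reference, Tobin’s Q is the standard ratio

$$Q = \frac{\text{market value of a firm’s installed capital}}{\text{replacement cost of that capital}}.$$

When $Q > 1$, the market values capital inside the firm at more than it would cost to build anew, so, all else equal, new investment should be profitable. Persistently high $Q$ alongside persistently low investment is precisely the anomaly the authors set out to explain.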

Should-Read: Olivier Coibion, Yuriy Gorodnichenko, and Mauricio Ulate: The US economy is not yet back to its potential

Should-Read: Olivier Coibion, Yuriy Gorodnichenko, and Mauricio Ulate: The US economy is not yet back to its potential: “Estimates of potential output around the world have been systematically revised downward since the Great Recession… http://voxeu.org/article/us-economy-not-yet-back-its-potential

…The methods used to create these estimates do not distinguish between transitory and permanent shocks, or demand and supply shocks. Taking these differences into account suggests US output is almost 10 percentage points below potential output. This has important immediate implications for policymakers, and raises questions for those who estimate potential output…

Must-Read: Cardiff Garcia: Brainard’s framing challenge

Must-Read: If another significant contractionary macroeconomic shock were to hit in the next two years, we would be totally s—-ed. The Fed doesn’t have enough maneuvering room, and only an idiot would think that this President and this Congress would step up and do the job. It is vitally important that the Fed now be taking steps so that we would not be totally s—-ed if a significant contractionary macroeconomic shock were to hit in years three to five. But for that to be the case, her colleagues would have to listen to Lael. And there is no sign that they are willing to do so:

Cardiff Garcia: Brainard’s framing challenge: “Tim Duy has the right response to Lael Brainard’s speech today… https://ftalphaville.ft.com/2017/09/05/2193262/brainards-framing-challenge/

…Despite the Fed governor’s persuasive attempt, she’ll find it harder to push the Fed in a more dovish direction than she did at the end of 2015…. The primary issue… is that wage growth has remained subdued and price inflation has trended down…. For too long the Fed persisted in thinking that the inflation slowdown was caused by idiosyncratic factors. One of Brainard’s points is that this belief is increasingly hard to justify…. As to the implications for monetary policy, she writes….

Some might determine that preemptive tightening is appropriate…. However… the Phillips curve appears to be flatter today…. It could take a considerable undershooting of the natural rate of unemployment to achieve our inflation objective if we were to rely on resource utilization alone…. To the extent that the neutral rate remains low relative to its historical value, there is a high premium on guiding inflation back up to target so as to retain space to buffer adverse shocks with conventional policy…. It is important to be clear that we would be comfortable with inflation moving modestly above our target for a time….
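A simple way to formalize Brainard’s flat-Phillips-curve point (a stylized textbook Phillips curve, not her model): write

$$\pi_t = \pi^e_t + \kappa\,(u^* - u_t),$$

where $\kappa$ is the slope. Raising inflation by $\Delta\pi$ through labor-market slack alone then requires an unemployment undershoot of

$$u^* - u_t = \frac{\Delta\pi}{\kappa}.$$

With $\kappa = 0.5$, lifting inflation half a point takes a 1-percentage-point undershoot; with a flattened $\kappa = 0.1$, it takes 5 percentage points. Hence the “considerable undershooting” she warns about.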

Must-Read: Chang-Tai Hsieh and Enrico Moretti: How Local Housing Regulations Smother the U.S. Economy

Must-Read: I could wish for a little more attention to the on-the-ground dynamics of local political economy. In California, it was the right-wing anti-property-tax revolt (yes, I’m looking at you, Proposition 13, Jarvis, and Gann) that shifted local governments from being broadly pro-development to regarding developments as burdens that require more in services than they pay in property taxes. In New York (and, to a much lesser extent, Boston), it is an unwillingness to invest sufficiently in transportation infrastructure (yes, I’m looking at you, Chris Christie).

When I do “Nimbyism in America: A Back-of-the-Envelope Finger-Exercise Calculation” http://www.bradford-delong.com/2016/08/nimbyism-in-america-a-back-of-the-envelope-finger-exercise-calculation.html quick-and-dirty for introductory economics students, I start with this slide:

[Slide: scatterplot of major U.S. metropolitan areas with a fitted line]

Thus there seem to me to be two problems: a NIMBY problem that afflicts Washington, Boston, and—overwhelmingly—San Francisco, and an infrastructure problem that afflicts New York, Boston, Washington, Chicago, and Philadelphia.

I should rerun the fit line excluding New York, shouldn’t I…
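That rerun is a one-liner. Here is a sketch with placeholder numbers standing in for the slide’s actual data points, which I do not have in machine-readable form:

```python
import numpy as np

# Placeholder data: in the real exercise these would be the metro-level
# variables plotted on the slide.
cities = np.array(["New York", "San Francisco", "Boston",
                   "Washington", "Chicago", "Philadelphia"])
x = np.array([1.00, 0.95, 0.80, 0.78, 0.55, 0.50])
y = np.array([1.40, 1.60, 1.10, 1.05, 0.70, 0.65])

# Fit line on all metros, then refit with New York excluded.
slope_all, intercept_all = np.polyfit(x, y, deg=1)
mask = cities != "New York"
slope_ex, intercept_ex = np.polyfit(x[mask], y[mask], deg=1)

print(f"all metros:   y = {slope_all:.2f}x {intercept_all:+.2f}")
print(f"excluding NY: y = {slope_ex:.2f}x {intercept_ex:+.2f}")
```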

Chang-Tai Hsieh and Enrico Moretti: How Local Housing Regulations Smother the U.S. Economy: “For most of the 20th century, workers moved to areas where new industries and opportunities were emerging… https://www.nytimes.com/2017/09/06/opinion/housing-regulations-us-economy.html

…This was the locomotive behind American prosperity. Agricultural workers moved from the countryside to booming cities like Pittsburgh and Detroit. In the Great Migration, some six million African-Americans left the South for manufacturing jobs in cities like Chicago and Buffalo. What allowed this relocation to places with good-paying jobs that lifted the standard of living for families? Affordable housing.

Today, this locomotive of prosperity has broken down. Finance and high-tech companies in cities like New York, Boston, Seattle and San Francisco find it difficult to hire because of the high cost of housing. When an unemployed worker in Detroit today finds a well-paying job in San Francisco, she often cannot afford the cost of housing there.

New housing construction in America’s most dynamic cities faces growing regulatory costs, delays and enormous opposition from neighboring homeowners. Since the 1970s, a property-rights revolution — what critics call Nimbyism, from “not in my back yard” — has significantly reduced the development of new housing stock, especially in cities where the economy is strongest…
