Rental demand in the United States will continue to surge over the next decade

A rental sign is seen outside a property in Denver.

High and rising rents are squeezing residents of many metropolitan areas along the Atlantic and Pacific coastlines, including New York, San Francisco, Boston, and Washington, D.C. New research suggests that over the coming decade, these rental prices will likely continue to increase, and the upward pressure on rents will likely spread to other parts of the country.

According to a study by the Joint Center for Housing Studies at Harvard University, renting (as opposed to buying) has increased among all ages, household types, and income groups in the United States. And a combination of demographic and economic forces is likely to make matters worse over the next 10 years. Population growth among the young and the elderly, who are more likely than others to rent, is partially responsible. Meanwhile, economic forces have lowered homeownership rates not just for the young and the old but also for middle-aged individuals, who traditionally have been the most likely to own their own homes.

Here are three major trends that are likely to define rental demand over the next decade.

First, Millennials are still coming of age. With many Millennials still in their late teens and early twenties, this generation (ages 19 to 35 in 2016) is continuing to enter the housing market and will continue to boost the number of new renter households. Over the next decade, the number of Millennial renter households will double from its current 11.3 million to 22.6 million. For lifecycle reasons, young adults of all generations have tended to have less income and less wealth, making them more inclined to rent rather than own. But a slack labor market, high student debt, and reduced access to mortgages and other forms of credit in the wake of the Great Recession of 2007-2009 have exacerbated this tendency.

Second, the growing minority population in the United States is projected to account for three-quarters of household growth in the coming decade. Strong flows of immigration will continue to contribute to the growing share of minority households in America. And primarily for economic reasons, minorities and immigrants are more likely to rent than U.S.-born white households. Research by the Joint Center for Housing Studies indicates that about half of all immigrants to the United States are renters, including 74 percent of immigrants under age 35. Accordingly, a rising share of minority and immigrant households is expected to further bolster the demand for rental housing.

Third, the oldest Baby Boomers will be moving into the 70-and-older age bracket over the next ten years. Because renters rarely switch to buying a home late in life, older renters will largely age in place as renters. Additionally, aging may lead many senior homeowners to tap the equity tied up in their homes and improve their access to care by swapping homeownership for rental housing. Baby Boomers make up a significant share of the American population; as they grow older, retire, and require more care, rental demand among seniors will grow as well.

These trends are unfolding against a backdrop of economic distress that has plagued many Americans since the turn of the century and has made homeownership increasingly unattainable. Structural changes in the economy, such as growth in low-wage service jobs coupled with declines in higher-wage production jobs, affect all working-age households. Weak wage growth following a slow labor market recovery will put even more pressure on rental demand as would-be homebuyers feel less confident about their economic security.

Meanwhile, homeownership rates in the United States are already low by recent historical standards. For Millennials, delays in major life events such as completing an education, advancing in a career, marrying, and having children will impede homeownership. And as minority communities and the elderly seek greater flexibility and accessibility to fit their needs, more households will end up renting rather than buying.

So what can policymakers do to alleviate rising rents and make homeownership more affordable? One obvious step is to repair the damage done to labor and credit markets over the past decade or so. Other possibilities include making policy changes at local, state, and federal levels to increase the availability and accessibility of affordable housing. Here are a few options:

  • Streamline permitting to promote affordable housing production, and expedite review of low- and moderate-income housing developments. Builders in Portland, Oregon, say streamlining land-use and construction permitting would speed the creation of new homes, easing pressure on the housing market.
  • Relax zoning restrictions that discourage high-density development, and remove floor-size minimums in communities where they are unnecessary and outdated. Minneapolis’s overhaul of its outdated residential codes went into effect earlier this month.
  • Change local codes to allow for the development of Accessory Dwelling Units, or ADUs, also known as “granny flats” or “garage-overs,” in high-density areas. Many localities are now encouraging the building of ADUs, most recently San Francisco.
  • Provide housing protections for low-income families by passing inclusionary housing requirements, popular in cities such as Washington, D.C.
  • Support asset building through Individual Development Accounts, or IDAs, to help low-income families save for long-term investments like homeownership.
  • Strengthen federal housing programs such as the National Housing Trust Fund, which supports the development of rental units for extremely low-income families.
  • Subsidize the cost of new housing or the rehabilitation of existing housing with the support of the Low-Income Housing Tax Credit.

Failure to take some kind of action will bring continuing upward pressure on rents over the next decade.

—Nisha Chikhale is a research assistant at the Washington Center for Equitable Growth

Must-Read: Thomas Piketty: Change Europe, Now

Must-Read: Thomas Piketty (2015): Change Europe, Now: “The extreme right has risen from 15% to 30% of the votes in France…

…unemployment and xenophobia, extreme disappointment with the left in power, the feeling that everything has been tried and that something else must be experimented… disastrous management of the financial crisis…. Only a democratic and social re-founding of the Euro zone, based on growth and employment, round a small core of countries prepared to move forward and provide themselves with appropriate political institutions, would enable us to counter the temptation to revert to nationalism and hatred which today threatens the whole of Europe….

It is important that the European leaders—in particular the French and the German—acknowledge their mistakes. We can discuss endlessly all sorts of reforms, both big and small, to be carried out in the various Euro zone countries: shop-opening hours, bus lanes, labour markets, retirement pensions, etc. Some are useful, others less so. But… this is not the reason for the sudden fall in GDP in the Euro zone in 2011-2013…. Recovery was stifled by the over-rapid endeavour to reduce the deficits in 2011-2013—with in particular rises in taxation in France which were much too heavy…. The application of blind fiscal rules… explains why in 2015 the GDP of the Euro zone has still not recovered its 2007 level….

As a first step, all the debts of more than 60% of GDP could be placed in a common fund, with a moratorium on repayments until each country has recovered a strong growth trajectory since 2007. All historical experiences show that above a certain level, it makes no sense to repay debts for decades. It is better to ease the burden clearly so as to invest in growth, including from the creditors’ point of view…. New democratic governance… the setting up of a Euro-zone parliament comprising members from the national parliaments in proportion to the population of each country. This Euro-zone Parliamentary Chamber should also be entrusted with the voting of a common corporate tax… [to] enable the financing of an investment plan in infrastructures and universities…. Europe has all the assets required to offer the best social model in the world. Let’s stop squandering our opportunities…. Before coming to plan B, proposed by the extreme Right and which the extreme Left is increasingly tempted to invoke, let’s start by giving a fully-fledged plan A a genuine chance.

Must-Read: Paul Krugman: Friedman and the Austrians

Must-Read: Paul Krugman (2013): Friedman and the Austrians: “Still thinking about the Bloomberg Businessweek interview with Rand Paul…

…in which he nominated Milton Friedman’s corpse for Fed chairman. Before learning that Friedman was dead, Paul did concede that he wasn’t an Austrian. But I’ll bet he had no idea about the extent to which Friedman really, really wasn’t an Austrian. In his ‘Comments on the critics’ (of his Monetary Framework) Friedman described the ‘London School (really Austrian) view’

that the depression was an inevitable result of the prior boom, that it was deepened by the attempts to prevent prices and wages from falling and firms from going bankrupt, that the monetary authorities had brought on the depression by inflationary policies before the crash and had prolonged it by ‘easy money’ policies thereafter; that the only sound policy was to let the depression run its course, bring down money costs, and eliminate weak and unsound firms.

and dubbed this view an ‘atrophied and rigid caricature’ of the quantity theory. [His version of the] Chicago School, he claimed, never believed in such nonsense. I have, incidentally, seen attempts [by Larry White and company] to claim that nobody believed this, or at any rate that Hayek never believed this, and that characterizing Hayek as a liquidationist is some kind of liberal libel. This is really a case of who are you gonna believe, me or your lying eyes. Let’s go to the text (pdf), p. 275:

And, if we pass from the moment of actual crisis to the situation in the following depression, it is still more difficult to see what lasting good effects can come from credit expansion. The thing which is needed to secure healthy conditions is the most speedy and complete adaptation possible of the structure of production to the proportion between the demand for consumers’ goods and the demand for producers’ goods as determined by voluntary saving and spending.

If the proportion as determined by the voluntary decisions of individuals is distorted by the creation of artificial demand, it must mean that part of the available resources is again led into a wrong direction and a definite and lasting adjustment is again postponed. And, even if the absorption of the unemployed resources were to be quickened in this way, it would only mean that the seed would already be sown for new disturbances and new crises. The only way permanently to ‘mobilize’ all available resources is, therefore, not to use artificial stimulants—whether during a crisis or thereafter—but to leave it to time to effect a permanent cure by the slow process of adapting the structure of production to the means available for capital purposes.

And so, at the end of our analysis, we arrive at results which only confirm the old truth that we may perhaps prevent a crisis by checking expansion in time, but that we can do nothing to get out of it before its natural end, once it has come…

If that’s not liquidationism, I’ll eat my structure of production…

http://krugman.blogs.nytimes.com/2013/08/11/friedman-and-the-austrians/?_r=0

Why are negative employment effects of the minimum wage hard to find?

An activist cheers at a minimum wage rally.

Economists care a great deal about the minimum wage because it is a policy prescription that increasingly affects a large portion of the workforce and because it is a clear case of government intervention, imposing a floor on the market price of labor. Minimum wages therefore offer a policy tool to test theories about how the labor market operates. In a new working paper, Alan Manning of the London School of Economics argues that a clear signal of the negative employment effects of the minimum wage is “elusive,” which should not be surprising if we think about the mechanisms underlying competition in the labor market.

Manning first reviews the response of teen wages and employment to increases in the minimum wage in the United States from 1979 to 2014. Teens are a declining portion of the workforce, but they are more likely to be earning the minimum wage than adults, and Manning indeed presents clear evidence that the minimum wage raises teen wages. When it comes to employment, however, most of the evidence does not suggest significant, negative employment effects of the minimum wage. Only one of seven models with various geographic controls that he presents has a statistically significant, negative effect; five out of the seven models yield small, positive point estimates.

The lack of clear evidence for teen disemployment, Manning explains, seems inconsistent with the predictions of the simple, perfectly competitive model of the economy, where the labor market is approximated by a downward-sloping demand curve for labor, and where the market wage would otherwise be at its market-clearing level. Under this view of the labor market, the direction of the effects of the minimum wage is unambiguous: Minimum wages must reduce the employment of low-wage workers. What’s unclear theoretically is the magnitude of the decline in employment.
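
To state that benchmark compactly (a standard textbook formulation, not notation from Manning’s paper): if the demand curve for low-wage labor has a constant elasticity $\eta < 0$, a minimum wage that raises affected workers’ wages by $\Delta \ln w$ changes their employment by roughly

$$\Delta \ln E \approx \eta \, \Delta \ln w, \qquad \eta < 0.$$

The competitive model thus pins down the sign of the employment effect; how large $|\eta|$ is remains an empirical question.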

Manning, however, spells out three channels through which the employment effects could be small. The first runs through consumer demand for the products that minimum wage workers produce. After a rise in the minimum wage, employers pass some of the increased labor costs on to customers. But because the price of a hamburger, say, is only partly determined by low-wage labor costs, the price change may not be large. Consumers, moreover, may not be very sensitive to a price increase of a few cents rather than a few dollars. The smaller this elasticity of product demand (consumers buying just as much fast food at slightly higher prices), the smaller the disemployment effects of the minimum wage. And because a higher minimum wage redistributes income from owners to workers with higher propensities to consume, reductions in consumer demand are attenuated further.
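
The arithmetic of this channel is easy to sketch. The Python snippet below works through it; the cost share, pass-through rate, and demand elasticity are assumptions chosen for illustration, not estimates from Manning’s paper:

```python
# Back-of-the-envelope version of the product-demand channel.
min_wage_increase = 0.10   # a 10% rise in the minimum wage
labor_cost_share = 0.30    # low-wage labor's share of total cost (assumed)
pass_through = 1.0         # fraction of the cost increase passed to prices (assumed)
demand_elasticity = -0.7   # product demand elasticity (assumed)

price_increase = pass_through * labor_cost_share * min_wage_increase
output_change = demand_elasticity * price_increase

print(f"price rise: {price_increase:.1%}")    # 3.0%
print(f"output change: {output_change:.1%}")  # -2.1%
```

Under these assumptions a 10 percent minimum wage hike raises prices by only 3 percent and trims output, and hence roughly employment among affected workers, by about 2 percent; the less elastic product demand is, the smaller the effect.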

Second, when an increase in the minimum wage raises labor costs, employment falls as affected firms substitute away from low-wage employment toward capital and intermediate goods produced by other firms. If the elasticity of substitution between labor and these other inputs is low, as empirical results suggest, then even the perfectly competitive model of the labor market need not generate large, negative employment effects. Péter Harasztosi of the Magyar Nemzeti Bank and Attila Lindner of University College London argue that low substitution between labor and intermediate goods is one reason the perfectly competitive model is consistent with the small employment effects they found for what was a large minimum wage increase in Hungary.

Manning also raises a third possibility: Perhaps the labor market isn’t best described by the purely competitive model. In a world with no job-search frictions, no one would celebrate getting a job or mourn losing one, since another could readily be obtained without cost or effort. Frictions in the labor market, however, mean that employers may have some degree of power in setting wages, and once employers exercise some choice in determining the wages they pay, it is no longer clear that increases in the minimum wage will always reduce employment. Instead of reducing the employment of low-wage workers, a minimum wage hike can make it easier for employers to recruit, possibly raising employment.
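
A minimal numerical sketch of that wage-setting logic, with an assumed linear labor supply curve and a constant marginal revenue product (toy parameters, not estimates of any real market):

```python
# Toy monopsony: one employer faces upward-sloping labor supply w(L) = a + b*L,
# so the wage bill is a*L + b*L**2 and the marginal cost of a hire is a + 2*b*L.
a, b, p = 5.0, 0.5, 15.0   # supply intercept, supply slope, marginal revenue product

# Without a wage floor, the monopsonist hires where marginal cost equals p:
L_mon = (p - a) / (2 * b)  # 10 workers ...
w_mon = a + b * L_mon      # ... paid a wage of 10, below their product of 15

def employment(w_min):
    """Employment under a minimum wage w_min."""
    if w_min <= w_mon:
        return L_mon              # floor below the monopsony wage: not binding
    if w_min <= p:
        return (w_min - a) / b    # firm hires everyone willing to work at w_min
    return 0.0                    # floor above workers' product: no profitable hires

print(employment(9.0))    # 10.0 -> no change
print(employment(12.0))   # 14.0 -> a binding minimum wage RAISES employment
print(employment(16.0))   # 0.0  -> far above the workers' product, jobs vanish
```

In this toy market a floor anywhere between the monopsony wage of 10 and the marginal revenue product of 15 recruits more workers along the supply curve, which mirrors the threshold behavior described next.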

In an imperfectly competitive labor market, then, the direction of the change in employment due to a minimum wage increase is not uniformly negative, although at some point sufficiently high minimum wages will begin to reduce employment. Most of the evidence over the past 15 years does not suggest the United States has crossed that threshold. Manning adds that the current policy experiments in U.S. cities and states, as well as international increases, will shed more light both on the effects of the minimum wage and how labor markets work.

Are low interest rates contributing to low business investment?

Do low interest rates affect retiree decision-making?

Lower interest rates should boost business investment: when borrowing is cheaper today, companies have an incentive to invest more in order to earn greater profits in the future. More credit means more investment, which means more economic growth now and in the future. But business investment growth hasn’t been particularly strong during the current era of zero and near-zero interest rates in the United States. And that’s particularly puzzling given how strong U.S. corporate profits have been over the same period.
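
The textbook logic runs through discounting: a lower interest rate raises the present value of future profits, so more projects clear the investment hurdle. A minimal sketch in Python, with numbers assumed purely for illustration:

```python
# Net present value of a project costing 100 today that pays 115 in 3 years.
cost, payoff, years = 100.0, 115.0, 3

def npv(rate):
    """Discount the future payoff at `rate` and net out today's cost."""
    return payoff / (1 + rate) ** years - cost

print(round(npv(0.06), 2))  # -3.44: at a 6% rate the project is rejected
print(round(npv(0.02), 2))  #  8.37: at 2%, the same project is worth doing
```

Cheaper money should make marginal projects like this one viable, which is why the weakness of investment at near-zero rates is a puzzle.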

Maybe, just maybe, very low interest rates actually reduce businesses’ appetite for investment, or so a new argument claims. This case, made by Jason Thomas of the Carlyle Group and laid out by Greg Ip of the Wall Street Journal, starts from an important observation: Companies are increasingly sending money to shareholders in the form of dividends and stock buybacks, and spending relatively less on investments in their businesses. According to Thomas’s calculations, since 2009, when interest rates were at zero, stock buybacks increased by 194 percent, dividends by 66.5 percent, and investment by only 43 percent. If companies have all these funds to send to shareholders, why aren’t they using them to invest, or borrowing money to do the same?

Thomas argues that it has to do with short-termism, or rather a specific kind of short-termism. One might expect shareholders to welcome investment now for the sake of even larger payouts later, but Thomas argues that one group of shareholders really wants those dividends now: retirees. With interest rates low, retirees are particularly in need of current cash flow and will pay a premium for companies that buy back shares or increase dividends. Thomas’s argument, in effect, hinges on retirees having a higher risk tolerance than we usually assume: otherwise it would make more sense for them to hold less-risky bonds than to try to pick specific dividend-paying stocks.

But many retirees, like many stock market investors, are passive investors, buying and selling through a variety of mutual funds and other retirement-investment vehicles. Their investment patterns reflect a broad index of stocks instead of specific companies.

What’s more, there are other explanations for the dearth of corporate investment today. Ip notes that companies may be investing less because they see fewer good investment opportunities, which is itself a reason why interest rates are low. Another problem with Thomas’s hypothesis is that the decline in corporate investment has been underway far longer than interest rates have been near zero. As John Jay College economist J.W. Mason points out in a report for the Roosevelt Institute, the shift away from investment and toward payouts to shareholders started in the 1980s, well before interest rates ever approached zero.

Low interest rates might possibly, in some small way, affect companies’ payout-versus-investment decisions through the demands of their elderly shareholders. But given the timing of the decline in corporate investment, the impact of low interest rates on retiree decision-making would seem to be minimal at best.

Must-Read: Justin Fox: This Job Market Slump Started a While Ago


Must-Read: Justin Fox: This Job Market Slump Started a While Ago: “The Federal Reserve’s Labor Market Conditions Index… is a new measure… consolidates 19 different labor market indicators…

…The index has now declined for five straight months — its worst performance since the recession…. I first learned of its existence Monday when Erica Groshen, the Commissioner of the Bureau of Labor Statistics, mentioned it at a conference for BLS data users in New York. It was a good reminder, as were a lot of the other presentations at the conference, that the headline jobs numbers that get the lion’s share of attention… aren’t always the best places to look for information…. One of the indicators included in the LMCI, for example, is employment in temporary help services, which tends to start rising and falling before overall employment does. Well, watch out: It looks like it may have peaked in December…. Though the signals coming from the U.S. labor market have been mostly negative for several months now, according to the LMCI, they’ll have to get much worse before it indicates that the economy is falling into a recession. Still, this is clearly more than just one off month.

The problematic returns of for-profit colleges

Photo of the for-profit DeVry University in Miramar, Fla. New research shows that default rates and debt burdens are rising among students at for-profit colleges.

The rise of student debt in the United States—now amounting to more than $1.2 trillion—is generally thought to weigh on the future prospects of graduates of traditional four-year colleges and universities. But research shows that the vast majority of student borrowers carrying heavy debt burdens are non-traditional college students. Default rates and debt burdens are rising among students at for-profit colleges. Students at more traditional colleges and universities aren’t defaulting as frequently because their earnings generally can support debt payments, but that doesn’t seem to be the case for students at for-profit schools.

Why? Because these non-traditional students don’t get much of an earnings bump after attending for-profit schools, according to a recently released National Bureau of Economic Research working paper by economists Stephanie Riegg Cellini of George Washington University and Nicholas Turner of the U.S. Department of the Treasury. They examine the impact of for-profit colleges on earnings and employment, relying on a particularly appealing data set: administrative data from the U.S. Department of Education and the U.S. Internal Revenue Service. These data let the two economists look at the entire population of students who left a for-profit school between 2006 and 2008 and observe those students’ labor market status from 1999 to 2014. Importantly, the data allow them to compare students’ earnings before and after their stints at a for-profit school.
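
A stylized version of the before-and-after comparison this kind of administrative panel makes possible (an illustrative specification, not necessarily Cellini and Turner’s exact model):

$$\ln y_{it} = \alpha_i + \gamma_t + \beta \, \mathrm{Post}_{it} + \varepsilon_{it},$$

where $\alpha_i$ is a student fixed effect absorbing each student’s pre-attendance earnings level, $\gamma_t$ is a calendar-year effect, $\mathrm{Post}_{it}$ indicates years after the student left the for-profit school, and $\beta$ captures the average change in earnings relative to the pre-attendance baseline. A negative estimated $\beta$ is what a “negative return” means in this setting.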

The results are not great for for-profit college students. The return on attending one of these schools is actually negative: in the five to six years after attending, the average associate’s or bachelor’s degree student sees a decline in earnings compared to what they made before they attended. The majority of this decline comes from students who don’t complete the degree program they entered. Those who don’t finish earn less and are less likely to be employed, all while having to service the debt they took on to attend.

But what about the graduates of these programs? They experience slightly positive returns on completing these degrees. Yet compared to other post-secondary school options, the gains from for-profit colleges are weak. In the paper, Cellini and Turner compare outcomes of students who get certificates from for-profit schools and students who get certificates from similar programs at public community colleges. Despite the higher cost of for-profit programs, the economists find that the earnings bump is lower for graduates of for-profit schools. This backs up other research showing a lower earnings return for students who went to for-profit schools. Perhaps policymakers should be thinking of ways of pushing students away from for-profits and toward well-funded public community colleges.

Agriculture the Worst Mistake in the History of the Human Race?: Today’s Economic History



Consider Jared Diamond’s 1987 paean to hunter-gatherers. While I find his article provocative and insightful, I also find it annoying. It seems to me that it mostly misses the most important parts of the story.

For one thing, it misses the importance of the dominant Malthusian mechanisms. The invention of agriculture and the domestication of animals provide an enormous technological boost to humanity, both in terms of the number of calories that can be harvested by an hour of work and in terms of the ability of a society to make durable investments of all kinds that further boost its productivity. It is an absolute living-standard bonanza for the generations that discover it and for those that immediately follow.

So what goes wrong with quality of life among agriculturalists? Without rapid technological progress, and before the demographic transition, human populations and living standards tend to settle at the point where, given nutrients, the hazards of life, and societal institutions, every mother has on average one daughter who herself reproduces. The standard of living will be whatever standard of living makes that happen. And for agriculturalists, without the hazards to adults of travel and hunting, and without the hazards a mobile lifestyle imposes on the very young, that standard of living is a lot lower than among hunter-gatherers. Lifespans look about the same across hunter-gatherers and agriculturalists; biomedical and fitness indicators are much, much higher for hunter-gatherers.
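
One compact way to state the mechanism (standard Malthusian notation, not DeLong’s own): let population $L$ grow at a rate that rises with the standard of living $y$ and with survival conditions $s$,

$$\frac{\dot{L}}{L} = n(y, s), \qquad \frac{\partial n}{\partial y} > 0, \quad \frac{\partial n}{\partial s} > 0,$$

so the long-run standard of living $y^*$ is pinned down by $n(y^*, s) = 0$: population grows whenever $y > y^*$, eroding living standards back toward $y^*$. Settled agriculture raises $s$ by removing the hazards of travel, hunting, and a mobile infancy, and therefore lowers the $y^*$ at which each mother just replaces herself: a larger population in equilibrium, at a lower standard of living.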

Whether this is a “disaster” or not depends on the answer to the old utilitarian conundrum of whether it is better to have a few people who live very well or a lot of people who live very poorly. This is an open question in philosophy, but Diamond appears to think that it is a closed one. And Diamond ignores the important consideration that only the density of population that comes with agriculture can generate enough human brains thinking to allow us, quite possibly, to transcend our Malthusian limits and create a truly human world in the long run.

As to inequality … violence … domination … In my view it is difficult to say on net. Certainly the agricultural epoch has many, many more people reaping where they did not sow and gathering where they did not scatter. Nevertheless, taking all three together, I cannot judge whether there was either a positive or a negative change across the boundary of the Neolithic Revolution. More inequality and domination, certainly. But you also have many more interactions between humans that are not one-shot interactions: people have fixed addresses, after all. If we know anything about humans, it is that human males have a tendency to resort to violence, perhaps not as great a tendency as chimps or gorillas, but a tendency, and we make more deadly weapons. It is not at all clear to me that the hunter-gatherer epoch had less murder, rape, kidnapping and enslavement of women, and so forth than did the agricultural epoch.

The hunter-gatherer age was not a kumbaya-singing age. Where, after all, are the Neanderthals today?

Jared Diamond (1987): The Worst Mistake in the History of the Human Race: Discover (May), pp. 64-66: “Archaeology is demolishing another sacred belief: that human history over the past million years has been a long tale of progress…

…The adoption of agriculture, supposedly our most decisive step toward a better life, was in many ways a catastrophe from which we have never recovered. With agriculture came the gross social and sexual inequality, the disease and despotism, that curse our existence…. For most of our history we supported ourselves by hunting and gathering: we hunted wild animals and foraged for wild plants. It’s a life that philosophers have traditionally regarded as nasty, brutish, and short…. Our escape from this misery was facilitated only 10,000 years ago, when in different parts of the world people began to domesticate plants and animals. The agricultural revolution spread until today it’s nearly universal and few tribes of hunter-gatherers survive…. Planted crops yield far more tons per acre than roots and berries. Just imagine a band of savages, exhausted from searching for nuts or chasing wild animals, suddenly grazing for the first time at a fruit-laden orchard or a pasture full of sheep. How many milliseconds do you think it would take them to appreciate the advantages of agriculture?…. The progressivist party line sometimes even goes so far as to credit agriculture with the remarkable flowering of art…. Agriculture gave us free time that hunter-gatherers never had. Thus it was agriculture that enabled us to build the Parthenon and compose the B-minor Mass….

While farmers concentrate on high-carbohydrate crops like rice and potatoes, the mix of wild plants and animals in the diets of surviving hunter-gatherers provides more protein and a better balance of other nutrients…. The lives of at least the surviving hunter-gatherers aren’t nasty and brutish, even though farmers have pushed them into some of the world’s worst real estate…. The progressivist view is really making a claim about the distant past: that the lives of primitive people improved when they switched from gathering to farming. Archaeologists can date that switch by distinguishing remains of wild plants and animals from those of domesticated ones in prehistoric garbage dumps….

Usually the only human remains available for study are skeletons, but they permit a surprising number of deductions…. Skeletons from Greece and Turkey show that the average height of hunter-gatherers toward the end of the ice ages was a generous 5’9″ for men, 5’5″ for women. With the adoption of agriculture, height crashed, and by 3000 B. C. had reached a low of only 5’3″ for men, 5′ for women. By classical times heights were very slowly on the rise again, but modern Greeks and Turks have still not regained the average height of their distant ancestors…. Burial mounds in the Illinois and Ohio river valleys… a hunter-gatherer culture gave way to intensive maize farming around A. D. 1150…. These early farmers paid a price for their new-found livelihood. Compared to the hunter-gatherers who preceded them, the farmers had a nearly 50 per cent increase in enamel defects indicative of malnutrition, a fourfold increase in iron-deficiency anemia (evidenced by a bone condition called porotic hyperostosis), a threefold rise in bone lesions reflecting infectious disease in general, and an increase in degenerative conditions of the spine, probably reflecting a lot of hard physical labor. ‘Life expectancy at birth in the pre-agricultural community was about twenty-six years,’ says Armelagos, ‘but in the post-agricultural community it was nineteen years. So these episodes of nutritional stress and infectious disease were seriously affecting their ability to survive.’ The evidence suggests that the Indians at Dickson Mounds, like many other primitive peoples, took up farming not by choice but from necessity in order to feed their constantly growing numbers. ‘I don’t think most hunter-gatherers farmed until they had to, and when they switched to farming they traded quality for quantity,’ says Mark Cohen of the State University of New York at Plattsburgh, co-editor with Armelagos of one of the seminal books in the field, Paleopathology at the Origins of Agriculture. ‘When I first started making that argument ten years ago, not many people agreed with me. Now it’s become a respectable, albeit controversial, side of the debate.’

There are at least three sets of reasons to explain the findings that agriculture was bad for health… a varied diet… [vs] one or a few starchy crops. The farmers gained cheap calories at the cost of poor nutrition…. Because of dependence on a limited number of crops, farmers ran the risk of starvation if one crop failed. Finally, the mere fact that agriculture encouraged people to clump together… led to the spread of parasites and infectious disease….

Besides malnutrition, starvation, and epidemic diseases, farming helped bring another curse upon humanity: deep class divisions. Hunter-gatherers have little or no stored food, and no concentrated food sources, like an orchard or a herd of cows: they live off the wild plants and animals they obtain each day. Therefore, there can be no kings, no class of social parasites who grow fat on food seized from others. Only in a farming population could a healthy, non-producing elite set itself above the disease-ridden masses. Skeletons from Greek tombs at Mycenae c. 1500 B. C. suggest that royals enjoyed a better diet than commoners, since the royal skeletons were two or three inches taller and had better teeth (on the average, one instead of six cavities or missing teeth). Among Chilean mummies from c. A. D. 1000, the elite were distinguished not only by ornaments and gold hair clips but also by a fourfold lower rate of bone lesions caused by disease…. Farming may have encouraged inequality between the sexes, as well. Freed from the need to transport their babies during a nomadic existence, and under pressure to produce more hands to till the fields, farming women tended to have more frequent pregnancies than their hunter-gatherer counterparts — with consequent drains on their health….

Thus with the advent of agriculture an elite became better off, but most people became worse off. Instead of swallowing the progressivist party line that we chose agriculture because it was good for us, we must ask how we got trapped by it despite its pitfalls. One answer boils down to the adage ‘Might makes right.’ Farming could support many more people than hunting, albeit with a poorer quality of life. (Population densities of hunter-gatherers are rarely over one person per ten square miles, while farmers average 100 times that.)… As population densities of hunter-gatherers slowly rose at the end of the ice ages, bands had to choose between feeding more mouths by taking the first steps toward agriculture, or else finding ways to limit growth. Some bands chose the former solution… outbred and then drove off or killed the bands that chose to remain hunter-gatherers, because a hundred malnourished farmers can still outfight one healthy hunter….

Hunter-gatherers practiced the most successful and longest-lasting life style in human history. In contrast, we’re still struggling with the mess into which agriculture has tumbled us, and it’s unclear whether we can solve it. Suppose that an archaeologist who had visited from outer space were trying to explain human history to his fellow spacelings. He might illustrate the results of his digs by a 24-hour clock on which one hour represents 100,000 years of real past time. If the history of the human race began at midnight, then we would now be almost at the end of our first day. We lived as hunter-gatherers for nearly the whole of that day, from midnight through dawn, noon, and sunset. Finally, at 11:54 p. m. we adopted agriculture. As our second midnight approaches, will the plight of famine-stricken peasants gradually spread to engulf us all? Or will we somehow achieve those seductive blessings that we imagine behind agriculture’s glittering facade, and that have so far eluded us?


Diamond is broadly right in what he says:

Carles Boix and Frances Rosenbluth: Bones of Contention: The Political Economy of Height Inequality: “Human osteological data provide a rich, unmined source of information…

…about the distribution of nutrition, and by extension, the distribution of political power and economic wealth, in societies of long ago. On the basis of data we have collected and analyzed, we find that the shift from a hunter-gatherer to a labor-intensive agriculture opened up inequalities that had discernible effects on human health and stature. But we also find that political institutions intervene decisively in affecting the distribution of resources within societies. Political institutions appear to be shaped not only by economic factors but also by military technology and vulnerability to invasion, leaving important questions for additional exploration.

Must-Reads: June 6, 2016


Should Reads: