One of the lesser-known realities of the War on Poverty is that poverty rates were falling substantially before it began, and that progress came to an abrupt halt with its implementation. Without understanding this, people can remain blind to the lesson that “fighting” poverty can undermine progress against it. And that lesson may be particularly important to grasp now as, decades later, similar effects seem to be spreading to a far larger population.
Before we turn to its current implications, we would benefit from reviewing explanations for how the War on Poverty failed. To my mind, the most insightful explanation comes from James Gwartney and Thomas McCaleb, in Have Antipoverty Programs Increased Poverty?
Gwartney and McCaleb discussed four ways incentives were worsened by those programs: through increased real benefits, increased implicit tax rates, decreased incentives to acquire and retain skills, and decreased incentives to avoid adverse lifestyle choices. Of particular importance today is their analysis of why the effects of such programs will be more adverse both the longer they last and the younger those affected are.
The first mechanism is that “increases in the real value of benefit payments make dependency on the government even more attractive compared with the alternative of self-support.” That effect will be greater for younger workers, whose earnings potential is lower than that of older, more educated, and more experienced workers.
The second mechanism arises because means-tested poverty programs reduce benefits as households earn more, imposing the equivalent of an additional income tax on increased earnings. And when the reality of multiple programs is factored in, that implicit tax rate can be very high — far higher than the highest official tax rate on earned income, and in some circumstances, well over 100 percent. Consequently, “Such high implicit marginal tax rates pose a significant disincentive to work for those individuals whose potential earnings are relatively low.”
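To make that arithmetic concrete, here is a stylized sketch; every parameter below (the benefit amounts, phaseout rates, and tax rate) is hypothetical, chosen only to illustrate how separately reasonable phaseouts stack into an implicit rate above 100 percent:

```python
# Stylized illustration of a cumulative implicit marginal tax rate.
# All program parameters are hypothetical, not actual US program rules.

def take_home(earnings):
    """Earnings plus benefits, after taxes and benefit phaseouts."""
    statutory_tax = 0.15 * earnings                    # payroll + income tax
    cash_aid    = max(0.0, 6000 - 0.50 * earnings)     # loses 50 cents per $1 earned
    food_aid    = max(0.0, 3000 - 0.30 * earnings)     # loses 30 cents per $1 earned
    housing_aid = max(0.0, 4000 - 0.25 * earnings)     # loses 25 cents per $1 earned
    return earnings - statutory_tax + cash_aid + food_aid + housing_aid

# Implicit marginal tax rate on an extra $1,000 of earnings:
base, step = 8000, 1000
gain = take_home(base + step) - take_home(base)
implicit_mtr = 1 - gain / step
print(f"Extra earnings: ${step}; change in take-home: ${gain:.0f}; "
      f"implicit rate: {implicit_mtr:.0%}")
```

With these hypothetical schedules, earning an extra $1,000 actually leaves the household $200 worse off (an implicit rate of 120 percent), because the 15-cent tax and the three phaseouts together claw back more than the dollar earned.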
The third mechanism reduces skills, because individuals who have not used their skills for extended periods experience erosion in those skills. Not only does this worsen the longer such incentives persist; the effect is also greater for younger workers because, “As transfers make dependency more attractive relative to work experience, schooling, and other forms of human capital investment, youthful recipients fail to develop skills that have in the past enabled the young to escape from poverty.” That is, it is not just that existing skills erode with disuse; when disincentives mean skills are unlikely to pay off economically, the incentive to acquire those skills in the first place is also diminished.
The fourth mechanism is a moral-hazard effect: substantial increases in government assistance can enable some to choose “a lifestyle that increases the likelihood of poverty.” And that incentive is more damaging to one’s productive life the earlier it begins.
Gwartney and McCaleb noted that adverse incentive effects were negligible for low-income families whose members were retired, and smaller for those of working age the older they were. The effects were much more severe for younger people, particularly those not yet in the labor force, who were (or should have been) in the skill-acquisition stage.
To test whether the data corresponded to their analysis, Gwartney and McCaleb went one step further. Rather than just looking at overall poverty rates, they looked at poverty rates broken down by the age of the householder, to compare the magnitude of the disincentive effects on younger low-income households with that on older low-income households.
The effects they found were significant. After the substantial decreases in poverty for all age groups before the War on Poverty began, both official poverty rates and poverty rates adjusted for in-kind benefits (not officially counted as income) for the elderly, for whom the disincentive effects are minimal, continued to fall dramatically, from 15.9 percent in 1968 to 5.5 percent in 1982. For the 45-54 age bracket, adjusted poverty rates fell from 6.7 percent in 1968 to 5.8 percent in 1975, rising thereafter to 8 percent. For the 25-44 age bracket, adjusted poverty rates fell only slightly at first, from 8.6 percent to 8.5 percent, but rose substantially after that, to 12.3 percent in 1982. Finally, for the youngest group studied, householders under 25, adjusted poverty rates rose throughout, from 12.3 percent in 1968 to 24 percent in 1982.
So how is this “old news” important to current news? There were huge increases in such disincentives both during the Great Recession and during the course of COVID-19 recovery and its associated government policies.
The real (after adjusting for inflation) level of government benefits increased because the duration of unemployment benefits was substantially extended (to 99 weeks at their peak). For a period in 2020, the federal government added $600 per week to state unemployment benefits (in many cases making those benefits not only greater than recipients would otherwise have been eligible for, but greater than they could earn by working). Eligibility for Medicaid (MediCal in California) was significantly expanded, subsidies for Obamacare policies grew, and there were even rent-abeyance programs that allowed many to remain in their homes rent-free.
To the extent that assistance programs focus on lower-income families, those programs add to recipients’ cumulative marginal tax rates (as economists call them, although they are technically cumulative marginal benefit-reduction rates), and subtract more from what recipients get to keep in take-home pay from producing for others in markets. Phaseouts of Obamacare subsidies with income do the same thing. Even more striking are “eligibility cliffs,” where substantial benefits (e.g., free Medicaid for a parent with small children, which is worth thousands of dollars) disappear entirely when a certain income level is reached.
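The cliff version of this arithmetic is even starker. A stylized sketch (the benefit value and threshold below are hypothetical, standing in for something like Medicaid eligibility) shows how a tiny raise across the threshold can cost far more than it pays:

```python
# Stylized "eligibility cliff": an in-kind benefit worth $6,000 per year
# (hypothetical Medicaid-like coverage) vanishes entirely at a threshold.

BENEFIT_VALUE = 6000   # hypothetical annual value of the benefit
CLIFF = 30000          # hypothetical eligibility cutoff for earnings

def resources(earnings):
    """Total resources: earnings plus the benefit, if still eligible."""
    benefit = BENEFIT_VALUE if earnings < CLIFF else 0
    return earnings + benefit

# A $2 raise across the threshold leaves the family thousands worse off:
below, above = resources(29999), resources(30001)
print(f"Earn $29,999 -> resources ${below:,}; earn $30,001 -> resources ${above:,}")
```

In this sketch, moving from $29,999 to $30,001 in earnings reduces total resources by $5,998, which is why families near such thresholds face powerful incentives not to earn more.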
The incentive to let skills depreciate with disuse, and, more importantly, to forgo acquiring skills in the first place, arose primarily from restrictions and lockdown effects on employment opportunities and from a host of educational policies under COVID, from ineffective online instruction to grade inflation that undermined potential employers’ ability to differentiate between students with particular skills and those without them. Efforts to leave traditional public schools and the disincentives they produced were also hamstrung by attacks on charter schools and voucher proposals.
Many COVID-induced changes taught students the wrong lifestyle-choice lessons, as well. Rules often lost all meaning. Cheating exploded, with virtually no enforcement against it. Students learned that absenteeism carried no penalty, in contrast to the serious penalties the “real world” can impose on its practice. They learned to expect a level of coddling that meant almost every failure to do one’s work was excused, and virtually nothing they could do would earn them a failing grade on anything, much less get them kicked out of school.
In sum, it seems that our failure to recognize what Gwartney and McCaleb did almost four decades ago — just how seriously the adverse effects of our efforts to “help” people hurt them instead — has come back to haunt America with a vengeance. We have recently doubled down on more of the same types of policies, which means we will see even more of their adverse effects.
That has certainly left us in a bad place in many ways. But that does not mean we should give up, acquiescing to an unacceptable status quo. We still have time to recognize that “better late than never” reforms give us a chance to move upward as we go forward from where we are. As Gwartney and McCaleb concluded:
The current system of income transfers confronts the poor with perverse incentives that discourage self-help efforts in the short run and induces recipients to make decisions that retard their ability to escape poverty in the long run… The problem of poverty continues to fester not because we are failing to do enough, but rather because we are doing so much that is counterproductive.
This article was published by AIER