Another sign of the deepening social crisis: The decline in US life expectancy

US life expectancy is on the decline, falling from 2014 to 2017—the first years of decline in life expectancy in over twenty years.  And according to Steven H. Woolf and Heidi Schoomaker, authors of the recently published “Life Expectancy and Mortality Rates in the United States, 1959-2017” in the Journal of the American Medical Association, “A major contributor has been an increase in mortality from specific causes (e.g., drug overdoses, suicides, organ system diseases) among young and middle-aged adults of all racial groups, with an onset as early as the 1990s and with the largest relative increases occurring in the Ohio Valley and New England.”

Declining life expectancy

In 1960, the US had the highest life expectancy of any country in the world.  By 2017 US life expectancy significantly trailed that of other comparable countries, as illustrated below.

In 1980, the difference between average life expectancy in the US and that of comparable countries was not large–73.7 years versus 74.5 years.  However, as we can see in the next figure, the gap steadily grew over the following years.  The US gained 4.9 years in average life expectancy from 1980 to 2017; comparable countries gained 7.8 years on average.

As researchers for the Kaiser Family Foundation point out, “The U.S. and most comparable countries experienced a slight decline in life expectancy in 2015. By 2016, life expectancy for these comparable countries rebounded to pre-2015 numbers, but in the US, such a bounce back did not occur.”  After reaching 78.9 years in 2014, average life expectancy in the US fell to 78.7 years in both 2015 and 2016, and then dropped again to 78.6 years in 2017. These declines mark the first decreases in US life expectancy in more than 20 years.

Moreover, this growing gap and outright decline in average life expectancy holds for both US males and females, as we see from the following figure.

The growing social crisis

Woolf and Schoomaker drew upon 50 years’ worth of data from the US Mortality Database and the US Centers for Disease Control and Prevention’s WONDER database in an attempt to explain why US life expectancy has not kept pace with that of other wealthy countries and is now falling.  Their primary finding, as noted above, is that US life expectancy is being dragged down by “an increase in mortality from specific causes (eg, drug overdoses, suicides, organ system diseases) among young and middle-aged adults of all racial groups.”

More specifically, over the period 1999-2017, infant mortality, mortality rates among children and early adolescents (ages 1-14), and age-adjusted mortality rates among adults aged 65-84 all declined, while individuals aged 25-64 “experienced retrogression” beginning in 2010, as we can see in the following figures taken from their article.  Between 2010 and 2017, these midlife adults experienced a 6 percent total increase in their mortality rate. This increase overwhelmed gains experienced by the other age cohorts, dragging down overall US average life expectancy.

Woolf and Schoomaker concluded that there were multiple causes for this rise in mortality rates among individuals aged 25-64.  However, they highlighted drug overdoses, alcohol abuse, and suicide as among the most important.  This age cohort experienced a nearly four-fold increase in fatal drug overdoses between 1999 and 2017.  Its suicide rate rose nearly 40 percent over the same period. The rate of alcohol-related disease deaths soared by almost 160 percent among those aged 25 to 34.

In an interview with Business Insider, Woolf wrestled with why the country is experiencing such a dramatic rise in mortality rates among young and middle-aged adults. “It’s a quandary of why this is happening when we spend so much on healthcare,” Woolf said, adding: “But my betting money is on the economy.”

That seems like a pretty safe answer.  It also raises the question: how do we help working people understand the increasingly toxic workings of the US economy and build the ties of solidarity necessary to advance the struggle for system transformation?

The Harsh Reality of Job Growth in America

The current US economic expansion, which began a little over a decade ago, is now the longest in US history.  But while commentators celebrate the slow but steady growth in economic activity, and the wealthy toast continuing strong corporate profits, lowered taxes, and record highs in the stock market, things are not so bright for the majority of workers, despite record low levels of unemployment.

The fact is, despite the long expansion, the share of workers in low-wage jobs remains substantial. To make matters worse, the share of low-quality jobs in total employment seems likely to keep growing. And, although US workers are not unique in facing hard times, the downward pressure on worker well-being in the US has been more punishing than in many other advanced capitalist countries, leaving the average US worker absolutely poorer than the average worker in several of them.

The low wage reality

According to a recent Brookings report by Martha Ross and Nicole Bateman, titled Meet the Low-wage Workforce,

Low-wage workers comprise a substantial share of the workforce.  More than 53 million people, or 44 percent of all workers ages 18 to 64 in the United States, earn low hourly wages. More than half (56 percent) are in their prime working years of 25-50, and this age group is also the most likely to be raising children (43 percent).

Ross and Bateman draw upon the Census Bureau’s 2012-2016 American Community Survey 5-year Public Use Microdata Sample to identify low-wage workers.  Although their work does not incorporate the small increase in wages between 2017 and 2019, they are confident that doing so would not significantly change their findings.

Their workforce definition started with all civilian, non-institutionalized individuals, 18 to 64 years of age, who worked at some point in the previous year (during the survey period) and remained in the labor force (either employed or unemployed).  They then removed graduate and professional students and traditional high school and college students, as well as those who reported being self-employed or earning self-employment income and those who worked without pay in a family business or farm.  This left them with a total of 122 million workers.

Their definition of a low-wage worker started with an often-employed threshold, two-thirds of the median hourly wage of a full-time, full-year worker, with one major modification: they used the male median wage because they wanted a threshold unaffected by gender inequality.  They identified anyone earning an hourly wage below that threshold as a low-wage worker.

The average national threshold across their five years of data, in 2016 real dollars, was $16.03.  They then adjusted this value, using the Bureau of Economic Analysis’s Regional Price Parities, to take into account variations in the cost of living in individual metropolitan areas.  The adjusted thresholds ranged from $12.54 in Beckley, West Virginia, to $20.02 in San Jose, California.  Using these thresholds, the authors found that 44 percent of the workforce, some 53 million workers, were low-wage workers.
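To make the threshold logic concrete, here is a minimal sketch in Python. The national threshold and the two metro-area figures come from the numbers cited above; the Regional Price Parity values are back-calculated for illustration, and the actual Brookings analysis works directly from ACS microdata rather than a lookup table like this.

```python
# Hypothetical sketch of the Ross-Bateman threshold logic described above.
# The $16.03 national figure is from the text; the RPP ratios below are
# illustrative values chosen to reproduce the Beckley and San Jose thresholds.

NATIONAL_THRESHOLD = 16.03  # two-thirds of the median male full-time/full-year hourly wage, 2016 dollars

# BEA Regional Price Parities expressed as ratios to the national average (illustrative)
REGIONAL_PRICE_PARITY = {
    "Beckley, WV": 0.782,   # implies a local threshold near $12.54
    "San Jose, CA": 1.249,  # implies a local threshold near $20.02
}

def local_threshold(metro: str) -> float:
    """Adjust the national low-wage threshold for a metro area's cost of living."""
    return NATIONAL_THRESHOLD * REGIONAL_PRICE_PARITY[metro]

def is_low_wage(hourly_wage: float, metro: str) -> bool:
    """Classify a worker as low-wage if their hourly wage falls below the local threshold."""
    return hourly_wage < local_threshold(metro)

if __name__ == "__main__":
    for metro in REGIONAL_PRICE_PARITY:
        print(f"{metro}: threshold ${local_threshold(metro):.2f}")
    print(is_low_wage(15.00, "San Jose, CA"))  # True: below the roughly $20 local threshold
    print(is_low_wage(15.00, "Beckley, WV"))   # False: above the roughly $12.54 local threshold
```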

These low-wage workers were a racially diverse group.  Fifty-two percent were white, 25 percent Latinx, 15 percent Black, and 5 percent Asian American. Both Latinx and Black workers were over-represented relative to their share of the total workforce.

Strikingly, 57 percent of low-wage workers worked full time year-round.  And half of all low-wage workers “are primary earners or contribute substantially to family living expenses. Twenty-six percent of low-wage workers are the sole earners in their families, with median family earnings of $20,400.”

Finally, as the authors also note, the economic mobility of low wage workers appears quite limited. They cite one study that “found that, within a 12-month period, 70 percent of low-wage workers stayed in the same job, 6 percent switched to a different low-wage job, and just 5 percent found a better job.”

The growing share of low-wage jobs

The downward movement in a new monthly index, the job quality index (JQI), makes clear that economic growth alone will not solve the problem of too many workers employed in low-wage work.  The index measures the ratio of high-quality jobs (those that pay more than the average weekly income) to low-quality jobs (those that pay less than the average weekly income).  It has declined steadily over the past three decades, during periods of expansion as well as recession, from 94.9 in 1990 to 79.0 as of July 2019 (as illustrated below).

The process of creating the index and its usefulness is described in a recent paper authored by Daniel Alpert, Jeffrey Ferry, Robert C. Hockett, and Amir Khaleghi.  The index itself is maintained by a group of researchers from Cornell University Law School, the Coalition for a Prosperous America, the University of Missouri-Kansas City, and the Global Institute for Sustainable Prosperity.  As the authors note, the most prominent factor underlying the three-decade fall in the ratio is the “relative devaluation” of US labor.

The index tracks private sector jobs provided by third party employers, which excludes self-employed workers, and, for now, covers only production and nonsupervisory (P&NS) positions, which account for approximately 82 percent of total private sector jobs in the country.

The index draws on the BLS’s Current Employment Statistics, which provide average weekly hours, average hourly wages, and total employment for 180 distinct job categories organized into industry groups.  As the authors explain:

JQI itself is a fairly simple measure. The index divides all categories of jobs in the US into high and low quality by calculating the mean weekly income (hourly wages multiplied by hours worked) of all P&NS jobs and then calculates the number of P&NS jobs that are above or below that mean. An index reading of 100 would indicate an even distribution, as between high- and low-quality jobs. Readings below 100 indicate a greater concentration in lower quality (those below the mean) positions, and a reading above 100 would [indicate] greater concentration in high quality (above the mean) positions.

Recognizing that some groups are quite large and include a wide range of jobs hovering around the mean, the JQI is further adjusted by disaggregating those particular groups into subgroups. The average income of each of those subgroups is then compared with the mean weekly income derived from the entire sample to determine whether the positions should be classified as high or low quality jobs.
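To make the arithmetic concrete, the following is a minimal sketch of the basic calculation described above, before the subgroup adjustment. The job categories, wages, hours, and employment counts are invented for illustration; the published index is built from BLS Current Employment Statistics data for roughly 180 production and nonsupervisory categories.

```python
# A minimal sketch of the JQI arithmetic as described in the quoted passage.
# All category names and figures below are illustrative, not BLS data.

from dataclasses import dataclass

@dataclass
class JobCategory:
    name: str
    avg_hourly_wage: float
    avg_weekly_hours: float
    employment: int  # number of P&NS jobs in the category

    @property
    def weekly_income(self) -> float:
        return self.avg_hourly_wage * self.avg_weekly_hours

def job_quality_index(categories: list[JobCategory]) -> float:
    """Ratio of jobs above the employment-weighted mean weekly income to jobs below it, times 100."""
    total_jobs = sum(c.employment for c in categories)
    mean_income = sum(c.weekly_income * c.employment for c in categories) / total_jobs
    high = sum(c.employment for c in categories if c.weekly_income >= mean_income)
    low = sum(c.employment for c in categories if c.weekly_income < mean_income)
    return 100 * high / low

# Illustrative data only
sample = [
    JobCategory("Hospitals", 29.0, 36.0, 4_000_000),
    JobCategory("Leisure and hospitality", 14.0, 26.0, 12_000_000),
    JobCategory("Durable goods manufacturing", 27.0, 41.0, 7_000_000),
    JobCategory("Retail trade", 16.0, 30.0, 13_000_000),
]
print(round(job_quality_index(sample), 1))  # below 100: more jobs sit below the mean weekly income
```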

As noted above, the JQI fell from 94.9 in 1990 to 79.0 as of July 2019.  As for the significance of this decline:

The decline confirms sustained and steadily mounting dependence of the U.S. employment situation on private P&NS jobs that are below the mean level of weekly wages. . . .

Notably, movements in the JQI are not particularly correlated with recession; it is important to note that the first big decline occurred during the expansion of the late 1990s. The index was steady during the 2001 recession, and its second big decline occurred during and after the Great Recession. There is admittedly some cyclical patterning evidenced in the JQI output, but this is overwhelmed by a larger secular phenomenon.

Losing ground

Not only are US workers facing a labor market increasingly oriented towards low-wage employment, the resulting downward pressure on wages appears to be proceeding at a more rapid pace in the US than in other countries.  As a consequence, a majority of US workers are now poorer, in real terms, than many of their counterparts in other countries.

For example, in a study comparing income inequality in France and the US, the economists Thomas Piketty, Emmanuel Saez, and Gabriel Zucman found that the average pre-tax national income of adults in the bottom 50 percent of the income distribution is now greater in France than in the United States.  “While the bottom 50 percent of incomes were 11 percent lower in France than in the US in 1980, they are now 16 percent higher.”  Moreover,

The bottom 50 percent of income earners makes more in France than in the US even though average income per adult is still 35 percent lower in France than in the US (partly due to differences in standard working hours in the two countries). Since the welfare state is more generous in France, the gap between the bottom 50 percent of income earners in France and the US would be even greater after taxes and transfers.

A recent study by the Center for the Study of Living Standards finds that growing numbers of US workers are also falling behind their Canadian counterparts.  More specifically, “the study compares incomes in every percentile of the income distribution, and finds that up through the 56th percentile Canadians are better off than their U.S. counterparts.”

The study’s author, Simon Lapointe, in words that echo the comments of Piketty, Saez, and Zucman, adds:

Our income estimates may actually underestimate the economic well-being of Canadians relative to Americans. Indeed, Canadians usually receive more in-kind benefits from their governments, including notably in health care. Had these benefits been included in the estimates, the median augmented household income in Canada would likely surpass the American median by a greater margin. While these benefits also come with higher taxes, the progressivity of the income tax system is such that the median household is most likely a net beneficiary.

The takeaway

There are many reasons for those at the top of the US income distribution to celebrate the performance of the US economy and tout the superiority of current US economic and political institutions and policies.  Unfortunately, there is a strong connection between the continuing gains for those at the top and the steadily deteriorating employment conditions experienced by growing numbers of workers.  Hopefully, this economic reality will become far better understood, leading to a more widespread recognition of the need for collective action to transform the US economy in ways that are responsive to majority interests.

A Wealth Tax: Because That’s Where The Money Is

The bank robber Willie Sutton, when asked by a reporter why he robbed banks, is reputed to have answered, “Because that’s where the money is.”  Which brings us to a wealth tax.

Transforming our economy is going to be expensive.  And a tax on the wealth of the super wealthy is one way to capture a sizeable amount of money, which is why both Bernie Sanders and Elizabeth Warren include the tax in their respective programs.  The economists Gabriel Zucman and Emmanuel Saez estimate that Sanders’s proposed wealth tax would raise $4.35 trillion over the next decade, while Warren’s would raise $2.75 trillion.

Where the money is  

The concentration of wealth has steadily increased since the mid-1990s, as illustrated in the following Bloomberg News chart.

A recent Federal Reserve Bank study highlights the fact that the top 10 percent and even more so the top 1 percent of households have been especially successful in increasing their equity ownership in US public and private companies.  For example,

in 1989, the richest 10 percent of households held 80 percent of corporate equity and 78 percent of equity in noncorporate business. Since 1989, the top 10 percent’s share of corporate equity has increased, on net, from 80 percent to 87 percent, and their share of noncorporate business equity has increased, on net, from 78 percent to 86 percent. Furthermore, most of these increases in business equity holdings have been realized by the top 1 percent, whose corporate equity shares increased from 39 percent to 50 percent and noncorporate equity shares increased from 42 percent to 53 percent since 1989.

It is worth emphasizing that last point: the top 1 percent of households now control more than half of the equity in US businesses, public and private.

The figure below shows total wealth holdings for all US families as of the second quarter, 2019.  The top 1 percent now own almost as much wealth as all the families in the 50th to 90th percentiles combined.

A comparison with the size and distribution of wealth in 2006, shown below, illustrates the rapid gains made by those at the top.

In 2006, the total wealth held by families in the 50th to 90th percentiles was slightly greater than that held by families in the 90th to 99th percentiles and significantly larger than that held by the top 1 percent.  But not anymore.  And sadly, families in the bottom half of the distribution, whose wealth is predominantly in real estate, have fallen further behind everyone else.

Time for a wealth tax

Recognizing this reality, and the fact that this concentration of wealth was aided by a steady decline in top individual, corporate, and estate tax rates, both Sanders and Warren want to tax the super wealthy to generate funds to help pay for their key programs, especially Medicare for All.  And, as an added bonus, to begin weakening the enormous political power of those top families.

Sanders would create an annual tax that would apply to married couple households with a net worth above $32 million (about 180,000 households in total, or roughly the top 0.1 percent).  The tax would start at 1 percent on net worth above $32 million, with rising marginal rates: 2 percent on net worth from $50 million to $250 million, 3 percent from $250 million to $500 million, 4 percent from $500 million to $1 billion, 5 percent from $1 billion to $2.5 billion, 6 percent from $2.5 billion to $5 billion, 7 percent from $5 billion to $10 billion, and 8 percent on wealth over $10 billion. For single filers, the brackets would be halved, with the tax starting at $16 million.

Warren’s wealth tax would apply to households with a net worth above $50 million (an estimated 70,000 households). The tax would start at 2 percent on net worth between $50 million and $1 billion, rising to 3 percent on net worth above $1 billion.  Her proposed tax brackets would be the same for married and single filers.
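For readers who want to see how the brackets operate, here is a minimal sketch of the marginal-rate arithmetic, using the thresholds described above (the Sanders figures are for married filers). It is an illustration only, not either campaign's official calculator, and the single-year amounts it produces are not comparable to the cumulative estimates discussed next.

```python
# A minimal sketch of the marginal-bracket arithmetic described above.
# Bracket boundaries follow the text; this is illustrative, not official.

SANDERS_BRACKETS = [   # (lower bound of bracket, marginal rate), married filers
    (32e6, 0.01), (50e6, 0.02), (250e6, 0.03), (500e6, 0.04),
    (1e9, 0.05), (2.5e9, 0.06), (5e9, 0.07), (10e9, 0.08),
]
WARREN_BRACKETS = [(50e6, 0.02), (1e9, 0.03)]

def annual_wealth_tax(net_worth: float, brackets: list[tuple[float, float]]) -> float:
    """Apply each marginal rate to the slice of net worth that falls within its bracket."""
    tax = 0.0
    for i, (lower, rate) in enumerate(brackets):
        upper = brackets[i + 1][0] if i + 1 < len(brackets) else float("inf")
        if net_worth > lower:
            tax += rate * (min(net_worth, upper) - lower)
    return tax

# A household worth $160 billion (roughly the Bezos figure cited just below) would owe, in one year:
print(f"Sanders: ${annual_wealth_tax(160e9, SANDERS_BRACKETS) / 1e9:.2f} billion")
print(f"Warren:  ${annual_wealth_tax(160e9, WARREN_BRACKETS) / 1e9:.2f} billion")
```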

Zucman and Saez have calculated how some of the richest Americans would have fared if these wealth taxes had been in place starting in 1982.  For example, Jeff Bezos, the founder of Amazon, is currently worth some $160 billion.  Under the Sanders plan his wealth would have been reduced to $43 billion.  Under the Warren plan, it would be $87 billion.

As a New York Times article sums up:

Over all, the economists found, the cumulative wealth of the top 15 richest Americans in 2018 — amounting to $943 billion, using estimates from Forbes — would have been $434 billion under the Warren plan and $196 billion under the Sanders plan.

Despite the fact that the super wealthy will still have unbelievable fortunes even if forced to pay a wealth tax, almost all of them are strongly opposed to the tax and determined to discredit it.

Challenges ahead

Polling done early in the year found strong support for a wealth tax.  As Matthew Yglesias explains:

Americans are . . . positively enthusiastic about Sen. Elizabeth Warren’s proposal to institute a wealth tax on large fortunes, according to a new poll from Morning Consult.  Their survey finds that . . . the wealth tax scores a crushing 60-21 victory that includes majority support from Republicans.

Of course, this kind of support was registered before the start of any serious media effort to raise doubts about its effectiveness.  Recently, a number of wealthy business people and conservative economists have begun to make the case that a wealth tax is a radical measure that will harm the economy.  Some point to the fact that many countries that once used the tax have now abandoned it.  Twelve OECD countries had a wealth tax in 1990; now only three do (Norway, Switzerland, and Spain).  France, Germany, and Sweden are among the majority that no longer use it.

However, as Zucman and Saez explain, this fact does not mean that a wealth tax would not work in the US.  For example, in some countries it was the election of conservative governments philosophically opposed to such taxes that led to their elimination.  More substantively, they highlight four problem areas that tended to undermine the effectiveness of, and support for, national wealth taxes in Europe, and explain why these should not pose major problems for the US.

First, European countries have their own separate tax laws and member states do not tax their nationals living abroad.  Thus, a wealthy person living in a country with a wealth tax could easily move to a nearby country without a wealth tax and escape paying it.  And many have.  But, as the economists note,

The situation in the United States is different. You can’t shirk your tax responsibilities by moving, because US citizens are responsible to the Internal Revenue Service no matter where they live. The only way to escape the IRS is to renounce citizenship, an extreme move that in both Warren’s and Sanders’s plans would trigger a large exit tax of 40 percent on net worth.

Second, European governments tolerated a high level of tax evasion. Until last year, they did not require banks in Switzerland or other tax havens to share information about deposits with national tax authorities.  This made it easy for the wealthy to hide their assets. The US is in a better situation to avoid this outcome.  The Foreign Account Tax Compliance Act, signed in 2010, requires foreign financial institutions to send detailed information to the Internal Revenue Service about the accounts of U.S. citizens each year, or face sanctions. Almost all foreign banks have agreed to cooperate.

Third, European wealth taxes had many exemptions and deductions.  In contrast, there are none in the proposed plans by Warren and Sanders.  Zucman and Saez highlight the French program that was in place from 1988 to 2017 as a prime example:

Paintings? Exempt. Businesses owned by their managers? Exempt. Main homes? Wealthy French received a 30 percent deduction on those. Shares in small or medium-size enterprises got a 75 percent exemption. The list of tax breaks for the wealthy grew year after year.

Fourth, European wealth taxes fell on a considerably larger share of the population than would the proposed plans by Warren or Sanders. In Europe, “wealth taxes tended to start around $1 million, meaning they hit about 2 percent of the population, compared with about 0.1 percent for the proposed U.S. plans.”  This broader reach of the European wealth taxes helped to generate popular pressure to weaken them, leading to their eventual removal.  The more limited reach of the proposed US plans should help to blunt that development in the US.

We can certainly expect a fierce debate over the viability and effectiveness of a wealth tax as the campaign season continues, especially if Sanders or Warren becomes the Democratic Party nominee for president.  We should be prepared to advocate for the tax as one important way to ensure adequate funding of needed programs.  But we should also take advantage of the debate to shine the brightest light possible on the growing and already obscene concentration of wealth in the US and even more importantly on the underlying and destructive logic of the capitalist accumulation process that generates it.

What the New Deal can teach us about winning a Green New Deal: Part V—summing up the New Deal experience

Growing awareness of our ever-worsening climate crisis has boosted the popularity of movements calling for a Green New Deal.  At present, the Green New Deal is a big tent idea, grounded to some extent in its identification with the original New Deal and its emphasis on the need for strong state action to initiate social-system change on a massive scale.  Challenges abound for Green New Deal activists.  Among the many, how to:

  • create supportive working relationships between the different movements currently pushing for a Green New Deal
  • develop a sharper, shared vision of the aims of a Green New Deal
  • increase popular support for those aims as well as participation in those movements
  • build sufficient political power to force a change in state policy along lines favorable to the Green New Deal
  • ensure that the resulting trajectory of change strengthens the broader struggle to achieve a socially just and ecologically sustainable political-economy

While there are great differences between the crises and political movements and possibilities of the 1930s and now, there are also important lessons that can be learned from the efforts of activists to build mass movements for social transformation during the Great Depression.  My aim in this series, including in this fifth and last post, is to illuminate the challenges faced and choices made by these activists in order to draw out some of the relevant lessons.

In previous posts I argued that, despite the severity of the Great Depression, it took sustained, left-led mass organizing and action to force the federal government to accept responsibility for improving economic conditions.  Unfortunately, First New Deal relief and job creation policies were inadequate, falling far short of what the growing movement of the unemployed demanded and of what was needed to meet majority needs.  However, continued mass activity by the unemployed, those on relief, and the employed eventually forced the Roosevelt administration to undertake a Second New Deal, which included its widely praised programs for public works (the WPA), social security (the Social Security Act), and union rights (the National Labor Relations Act).

These Second New Deal programs were unprecedented and did improve conditions for working people.  But, as I argue in this final post, both the WPA and the Social Security Act again fell short of the transformative changes demanded by activists.  And while the NLRA did offer workers important legal protections that made it safer for them to unionize their workplaces, its effect was to encourage a top-down system of labor-management relations that suppressed rank-and-file activism and class consciousness. Thus, despite their pathbreaking nature, these programs were far from revolutionary.  Rather, they were designed to ameliorate the suffering caused by capitalism’s crisis without threatening capitalist control over economic activity.

Tragically, changes in the political and economic environment, as well as strategic choices made by the left in response to those changes, led to the weakening of popular movements, leaving them unable to push the Roosevelt administration into a Third New Deal.  As a result, the upsurge of the 1930s failed to advance the socialist-inspired transformation that motivated many of its participants. In the end, it proved only able to force the state to adopt policies that reformed the workings of the system, a not inconsiderable achievement, but one that still left working people vulnerable to the vicissitudes of capitalism.  Hopefully, a careful study of the New Deal experience will help Green New Deal activists build movements able to avoid the trap of limited reform while fighting for the massive, interconnected, and empowering social-system change we so desperately need.

The Second New Deal

It is easy to understand why supporters of a Green New Deal look to the New Deal as a touchstone.  Growing numbers of people have come to the conclusion that our problems are too big to be solved by individual or local efforts alone, and that once again innovative and transformative state-led actions will be needed to solve them.  Quite simply, the New Deal experience inspires people to believe in the possibility of a Green New Deal.

When people talk about the innovative and transformative policies of the New Deal they normally mean the core policies of the Second New Deal: the WPA, the Social Security Act, and the National Labor Relations Act.  As innovative as these policies were, they were, as discussed in Part IV, largely forced on the Roosevelt Administration by left-led mass movements.  And, as we see next, they were, by design, meant to blunt more radical demands for change.  In short, they were important reforms, but no more than reforms, and as such offered only partial solutions to the problems of the time.  Sadly, workers today continue to suffer from their limitations.

Works Progress Administration

One of the most important Second New Deal programs was the Works Progress Administration (WPA). Established in May 1935, it employed millions of unemployed workers to carry out public projects such as the construction of public buildings and roads.  Federal Project Number One, a much smaller program that also operated under the WPA umbrella, employed musicians, artists, writers, actors, and directors in large arts, drama, media, and literacy projects. These included the Federal Writers’ Project, the Federal Theatre Project, the Federal Music Project, and the Federal Art Project.

Roosevelt’s decision to replace the Federal Emergency Relief Administration (FERA) with the WPA was a clear sign that he recognized that his First New Deal employment and relief programs — FERA and the Civil Works Administration (CWA) — had done little to satisfy fast growing left-led unemployed movements that were demanding a federal jobs program under which unemployed workers would be directly put to work, at union wages, producing a wide range of needed goods and services.

FERA had provided loans and grants to states, which then offered relief work to those who qualified.  As discussed in Part III, the program required workers to submit to demeaning financial investigations, often paid those chosen for relief with coupons that could only be redeemed for select food items, made no attempt to match worker skills with jobs, and often employed those on relief in make-work tasks.  While FERA marked the first direct federal support for relief and enabled states to greatly expand their relief rolls, it also required states to provide matching funds to receive FERA money.  Limited state resources meant that relief covered only about one-third of those unemployed.

CWA was a far more popular program, most importantly because it involved direct federal employment, had no relief requirement, paid relatively well, and sought to match workers’ skills with jobs.  However, it was, by design, a short-term program that lasted only 6 months, with most employment creation ending after 4 months.

The WPA was a federal program that operated its own projects in cooperation with state and local governments, which were required to cover some 10 to 30 percent of their costs.  In some cases, the WPA took over ongoing FERA state and local relief programs.  But, despite its impressive accomplishments, it also fell short of movement demands.

Although the WPA combined elements of both FERA and the CWA, it was far more like the former than the latter. For example, in contrast to the CWA, participation in WPA projects required a state means test.  Thus, unemployment alone was not enough to qualify a person for the program.  Moreover, as under FERA, participants were subject to demeaning monitoring of their spending habits and living conditions.

Again, unlike the CWA, little effort was made to match workers’ skills with jobs.  Workers were divided into two broad categories, skilled and unskilled.  The unskilled were assigned construction jobs even if they had no construction experience.  The skilled were assigned a variety of writing or teaching jobs regardless of whether they had experience in those areas.  The program did pay market wages.  However, limits were put on maximum allowable hours of weekly employment in addition to an overall limit on total earnings.

WPA employment opportunities were also limited.  Its average monthly employment was approximately 3 million workers.  The CWA, at its peak, employed over 4 million a month.  The WPA, like FERA, employed only about one-third of the unemployed.  Moreover, because of unstable program financing, even those employed by the WPA would sometimes suffer layoffs.

The unemployed movement wanted a permanent federal employment program that would guarantee full employment.  And they wanted that program to employ people to produce needed goods and services as a direct counter to private production.  This was far from the vision of the Roosevelt administration.  As Harry Hopkins, chief administrator of the WPA, explained:

Policy from the first was not to compete with private business. Hence we could neither work on private property, set up a rival merchandising system, nor form a work outlet through manufacturing, even though manufacturing had contributed to relief rolls hundreds of thousands of workers accustomed to operating machines and to doing nothing else for a living.

Operating under these limits, the WPA had little choice but to focus its efforts on the construction of public buildings and roads.  Post offices accounted for close to half of the more than 3000 public buildings constructed.

Moreover, despite its limitations, the unemployed had to fight to sustain the program.  Congress decided to provide funds for the program one year at a time.  Sometimes allocations fell short of planned spending, resulting in layoffs.   Other times, militant demonstrations by an alliance of unemployed groups forced Congress into making supplemental appropriations.

The number of public works projects and WPA participants began a steady decline in 1939.  The next year the Roosevelt administration decided to reorient program activity to projects of direct use to the military, including construction of base housing and military airfields as well as expansion of naval yards. The WPA was quietly terminated in 1943, with unemployment problems seemingly solved thanks to the demands of wartime production.  Sadly, the unemployed never developed the political weight or broader social movement needed to push the government into embracing a more expansive and ongoing program of national planning and public production.

The Social Security Act

The Social Security Act is widely considered to be the New Deal’s crown jewel.  According to his Secretary of Labor, “[President Roosevelt] always regarded the Social Security Act as the cornerstone of his administration . . . and . . . took greater satisfaction from it than from anything else he achieved on the domestic front.”

Roosevelt appointed a Committee on Economic Security in July 1934 with the charge to develop a social security bill that he could present to Congress in January 1935 that would include provisions for both unemployment insurance and old-age security.  An administration approved bill was in fact introduced in January and Roosevelt called for quick Congressional action.  The bill was revised in April by a House committee and given a new name, “The Social Security Act.”  After additional revisions the Social Security Act was approved by overwhelming majorities in both Houses of Congress, and the legislation was signed by the President on August 14, 1935.

The Social Security Act was a complex piece of legislation.  It included what we now call Social Security, a federal old-age benefit program; a program of unemployment benefits administered by the states; and a program of federal grants to states to fund benefits for the needy elderly and aid to dependent children.  It was a cautious beginning, as explained by Edwin E. Witte, the Executive Director and Secretary of the President’s Committee on Economic Security:

Because we were in the midst of a deep depression, the Administration and Congress were very anxious to avoid placing too great burdens on business and also to avoid adding to Government deficits. It was these considerations that resulted in the low beginning social security tax rates and the step-plan of the introduction of both old-age and unemployment insurance and also in the establishment of completely self-financed social insurance programs, without Government contributions–to this day a distinctive feature of social insurance in this country.

Before examining the way Roosevelt’s concerns for the well-being of business placed limits on the timeliness, coverage, and support provided by these programs, it is important to recognize that, as with the WPA, Roosevelt’s commitment to social security was a response to the efforts of the Communist Party (CP), which authored a far more progressive bill, one that would have significantly shifted the balance of class power towards workers.

The CP began pushing its Workers Unemployment Insurance Bill in the summer of 1930, and it, as well as the Unemployment Councils, worked hard to promote it over the following years.  On March 4, 1933, the day of Roosevelt’s inauguration, they organized demonstrations stressing the need for action on unemployment insurance.

Undeterred by Roosevelt’s lack of action, the CP authored a bill–the Workers Unemployment and Social Insurance Bill–that was introduced in Congress in February 1934 by Representative Ernest Lundeen of the Farmer-Labor Party.  In broad brush, as Chris Wright summarizes, the bill:

provided for unemployment insurance for workers and farmers (regardless of age, sex, or race) that was to be equal to average local wages but no less than $10 per week plus $3 for each dependent; people compelled to work part-time (because of inability to find full-time jobs) were to receive the difference between their earnings and the average local full-time wages; commissions directly elected by members of workers’ and farmers’ organizations were to administer the system; social insurance would be given to the sick and elderly, and maternity benefits would be paid eight weeks before and eight weeks after birth; and the system would be financed by unappropriated funds in the Treasury and by taxes on inheritances, gifts, and individual and corporate incomes above $5,000 a year. Later iterations of the bill went into greater detail on how the system would be financed and managed.

Not surprisingly, the bill enjoyed strong support among workers, employed and unemployed.  Thanks to the efforts of unemployed and union activists it was soon endorsed by 5 international unions, 35 central labor bodies, and more than 3000 local unions.  Rank and file worker committees also formed across the country to pressure members of Congress to pass it.

When Congress refused to act on the bill, Lundeen reintroduced it in January 1935. Because of public pressure, the bill became the first unemployment insurance plan in US history to be recommended by a congressional committee, in this case the House Labor Committee.  It was voted down in the full House of Representatives, 204 to 52.

Roosevelt strongly opposed the Lundeen bill, and it was to provide a counter to it that he established his Committee on Economic Security in July 1934 and pressed Congress to approve the resulting Social Security Act as quickly as possible.  Roosevelt’s Social Security Act fell far short of what the Workers Unemployment and Social Insurance Bill offered, and it was strongly opposed by movement activists and organizations of the unemployed.

The part of the bill that established what we now call Social Security suffered from five main weaknesses.  First, it was to be self-financing because of administration fears of deficit spending, a decision which placed downward pressure on benefit levels.  Second, it was to be financed by contributions from both workers and employers.  Thus, workers had to shoulder half the costs of the program.

Third, the system was not universal.  The act covered only workers in commerce and industry, about half the jobs in the economy.  Among those left out were farm and domestic workers.

Fourth, the act provided for monthly retirement benefits payable only to the primary worker in a family when they retired at age 65 or older. Moreover, the amount received depended on the value of wages earned in covered employment starting in 1937.

Finally, the act mandated that monthly benefit payments would not begin until 1942.  A 1939 amendment did allow benefit payments to begin in 1940 and added child, spouse, and survivor benefits to the authorized retirement benefits.

In sum, this was a program that offered too little, too late, and to too few people.  And while improvements were made over the years, the current system pales in comparison to the kind of security and humane retirement workers would have enjoyed if the workers’ movement had been powerful enough to secure passage of its preferred bill.

The unemployment system established as part of the Social Security Act was also structured in ways unfavorable to workers compared with the proposed benefits of the Workers Unemployment and Social Insurance Bill.  Rather than set up a comprehensive national system of unemployment compensation, as workers desired, the act established a federal-state cooperative system that gave states wide latitude in determining standards.

More specifically, the act levied a uniform national payroll tax on covered employers of 1 percent in 1936, 2 percent in 1937, and 3 percent in 1938.  Covered employers were defined as those with eight or more employees for at least twenty weeks, excluding government employers and employers in agriculture.  Only workers employed by a covered employer could receive benefits.

Covered employers were given a credit against up to 90 percent of the federal tax for amounts paid into a certified state unemployment compensation fund.  The act left it to the states to decide whether to enact their own plans, and if so, to determine eligibility conditions, the waiting period to receive benefits, benefit amounts, minimum and maximum benefit levels, duration of benefits, disqualifications, and other administrative matters. It was not until 1937 that programs were established in every state as well as the then-territories of Alaska and Hawaii.  And it was not until 1938 that most began paying benefits.
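As a rough illustration of how the credit mechanism worked, consider the following sketch, which assumes the full 3 percent rate in effect from 1938 and a hypothetical covered payroll; the statute's actual details were more involved.

```python
# A small worked example of the federal credit mechanism described above.
# The rates follow the text; the payroll figure is hypothetical.

FEDERAL_RATE = 0.03      # uniform national payroll tax on covered employers (1938 onward)
MAX_CREDIT_SHARE = 0.90  # up to 90 percent of the federal tax could be offset

def federal_tax_due(payroll: float, paid_to_state_fund: float) -> float:
    """Federal payroll tax owed after crediting contributions to a certified state fund."""
    gross_tax = FEDERAL_RATE * payroll
    credit = min(paid_to_state_fund, MAX_CREDIT_SHARE * gross_tax)
    return gross_tax - credit

payroll = 100_000.0  # hypothetical annual covered payroll
print(federal_tax_due(payroll, paid_to_state_fund=0.027 * payroll))  # 300.0: only 0.3 percent remains federal
print(federal_tax_due(payroll, paid_to_state_fund=0.0))              # 3000.0: full 3 percent without a state plan
```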

In the early years, most states required eligible workers to wait 2 to 4 weeks before drawing benefits, which were commonly set at half recent earnings (subject to weekly maximums) for a period ranging from 12 to 16 weeks. Ten state laws called for employee contributions as well as employer contributions.

As with Social Security, the program was improved in a number of ways over the following years, including through expanded coverage and benefits.  However, the unemployment program established by the Social Security Act fell far short of the universal, progressively funded social safety net that workers were demanding.

The National Labor Relations Act

In the spring of 1934, Senator Robert Wagner introduced a bill to establish a new labor relations board that, unlike the one established by the First New Deal’s National Industrial Recovery Act (NIRA), would have enforcement authority.  Few in Congress supported the bill; President Roosevelt also opposed it.

Wagner reintroduced a revised version of his bill a year later, to a dramatically different outcome.  In May 1935 it received unanimous support in the Senate Labor Committee, followed by strong support in both the Senate and House.  As reported by the editors of Who Built America?, President Roosevelt remained opposed to the bill up until the very end:

“It ought to be on the record,” his labor secretary noted, that the bill was “not a part of the President’s program.  It did not particularly appeal to him when it was described to him.”  But when the US Supreme Court struck down the NIRA in May and Wagner’s National Labor Relations bill was passed by one house of Congress, FDR finally endorsed the bill.

In broad brush, the National Labor Relations Act established a set of laws and regulations designed to guarantee the right of private sector workers to peacefully organize into trade unions of their choosing and engage in collective bargaining and actions such as strikes.  The act also created the National Labor Relations Board to organize and oversee the process by which workers decide on whether to join a union as well as determine whether collective bargaining agreements are being fairly bargained and enforced.

The turnaround in support for the NLRA owes much to the growing militancy of workers, and the threat that this militancy posed to the established order.  Section 7a of the NIRA had promised workers that they would “have the right to organize and bargain collectively through representatives of their own choosing . . . free from the interference, restraint, or coercion of employers.”  Unfortunately, with no mechanism to ensure that workers would be able to exercise this right, after a short period of successful union organizing, companies began violently repressing genuine union activity. By 1935, growing numbers of workers were calling the National Recovery Administration (NRA), which had been established to oversee the NIRA, the National Run Around.

However, it was not the corporate campaign of violence directed against workers that was the catalyst for the change in government policy.  Rather, it was the explosion of powerful left-led worker victories in three major labor struggles in early 1934.  The first was in Toledo, Ohio, where American Workers’ Party-sponsored unemployed organizations joined striking auto workers seeking to unionize a major auto parts manufacturer.  The workers battled special deputies and National Guard troops for weeks, maintaining an effective strike.  Fearful of the possibility of an even larger strike, the Roosevelt administration finally sent federal mediators to Toledo, forcing the company to recognize the union and agree to significant wage increases.

At almost the same time, an even bigger struggle began in Minneapolis. A Trotskyist-led Teamster local, fighting to unionize a number of trucking and warehouse companies, effectively shut down commercial transport in the city.  Days of violence followed as police and special deputies tried to break the strike.  Faced with a growing threat of a general strike, federal mediators again were forced to intervene, and again forced the employers to recognize the union.

A general strike did take place in San Francisco.  Led by Communist and other radical rank and file activists, San Francisco longshoremen rejected a secretly negotiated deal between the national leadership of the International Longshoremen’s Association and the waterfront employers.  Their strike was quickly joined by dockworkers in every other West Coast port as well as many sailors and waterfront truckers.

Police attempts to break the San Francisco strike led to a full-scale battle and the killing of two strikers by police on what became known as Bloody Thursday.  In response, the labor movement declared a general strike.  Some 150,000 workers went out, essentially bringing San Francisco, Oakland, Berkeley, and other nearby municipalities to a halt.  Again, federal intervention was required to end the strike, with a victory for the workers.

These struggles, all with important left leadership, showed a dramatic growth in worker militancy, solidarity, and radicalism that sent shock waves throughout the corporate community as well as the government.  And it was to head off the further radicalization of the labor movement that the Congress and Roosevelt agreed to support the NLRA and its mechanisms to regularize the unionization process.  In the words of Steve Fraser:

The Wagner Act helped institutionalize a form of industrial democracy that steered clear of any frontal assault on the underlying political economy. It legitimated collective bargaining, imposed responsibilities on both management and trade union officialdom, and worked to establish peace on the shop floor.

Union leaders were to police their members, instilling a disciplined commitment to the terms of the contract. Control of life on the shop-floor remained with management. Militants who thought otherwise were soon enough reined in. The much-maligned (not without cause) trade union bureaucracy was, after all, the fruit of a mass movement, an institution, created where there had been nothing, the slowly solidified residue of fiery desires.

For a few years, it appeared that worker militancy, a willingness to directly challenge corporate rights with no concern for issues of legality, would continue despite the NLRB’s existence.  For example, in early 1936 rubber workers in Akron, Ohio, disregarded both union leadership and a court injunction to surround the eleven-mile perimeter of a Goodyear plant with pickets.  They shut down the plant in protest over recent wage cuts and layoffs of activists and rejected federal attempts at mediation.  When word came that the sheriff might come with armed deputies to open the plant, the strikers armed themselves.  Finally, after four weeks, Goodyear settled, agreeing to reinstate the fired workers, reduce the workweek, and recognize the authority of union shop committees.

Not long after, inspired by the rubber workers, auto workers began staging walk-outs and strikes at several different Chrysler and GM plants over firings and unionization.  The biggest action came at the end of 1936 with the Flint sit-down strike.  The workers held the plant for 44 days, during which time they fought off attempts by armed police to evict them and ignored injunctions issued by the courts demanding that they leave.  In the end GM agreed to recognize the UAW as the exclusive bargaining representative for all GM workers.

The number of strikes grew dramatically from 2,014 in 1935 to 4,740 in 1937, with workers increasingly winning unionization not through the machinery of the NLRA, but through direct action.  For example, the number of sit-down strikes lasting more than a day grew from 48 in 1936 to some 500 in 1937.

Unfortunately, this upward trajectory of militant, class conscious activity would not be sustained.  The reasons are complex.  One part of the explanation concerns the evolving political orientation of the CP.  Responding to the new strategic orientation of the Communist International, which stressed the importance of building coalitions with all progressive and liberal forces to check the rise of fascism, the CP began pursuing an anti-fascist popular front policy that included support for Roosevelt’s 1936 re-election and the New Deal more generally.

This new orientation also translated into an increasingly conservative line regarding labor activism.  Party activists were encouraged not only to support the new CIO union leadership but also to oppose militant organizing tactics.  As Frances Fox Piven and Richard A. Cloward describe:

The Communists, by now well into their Popular Front phase and some of them into the union bureaucracy as well, endorsed the call for union discipline. Wyndham Mortimer issued a statement early in 1937 saying: “Sit-down strikes should be resorted to only when absolutely necessary.” And the Flint Auto Worker, edited by Communist Henry Kraus, editorialized that “the problem is not to foster strikes and labor trouble. The union can only grow on the basis of established procedure and collective bargaining.”

At the same time, corporate leaders were taking direct aim at the new labor reforms.  One of their first big victories was a 1938 Supreme Court ruling that said companies had the right to hire permanent replacement workers when workers went on strike.  The following year it ruled sit-down strikes illegal, even if undertaken in response to an illegal corporate action.

States also joined in.  In 1939, as Piven and Cloward report:

state legislatures began to pass laws prohibiting some kinds of strikes and secondary boycotts, limiting picketing, outlawing the closed shop, requiring the registration of unions, limiting the amount of dues unions could charge, and providing stiff jail terms for violations of the new offenses. By 1947 almost all of the states had passed legislation imposing at least some of these limitations.

Finally, corporate leaders also launched an anti-Communist attack against union activists, especially those in leadership positions in the newly created unions of the CIO.  Their efforts were amplified by House Un-American Activities Committee hearings which began in 1938.  The 1947 Taft–Hartley Act codified all these developments, outlawing wildcat strikes, solidarity or political strikes, secondary boycotts, secondary and mass picketing, and closed shops, as well as requiring union officers to sign non-communist affidavits as a condition for their union to secure NLRA rights.

In sum, as left and union leadership began to rely ever more heavily on the NLRA to win gains for workers, corporate and political elites began aggressively narrowing the acceptable boundaries of legal action.  As a consequence, although there would still be periods of worker militancy, rank-and-file-led actions, open rebellion against the law, and moments of cross-union and class solidarity became increasingly rare.  Thus, the NLRB succeeded, as its supporters hoped, in creating a more stable system of labor relations that was consistent with and supportive of the needs of capitalist production.

The Movement’s Decline

The workers movement of the 1930s was a mass movement that, thanks to left leadership, encouraged class solidarity and support for a program of radical social change.  The movement was, as described in this and past posts, powerful enough to force the Roosevelt administration into adopting successively more progressive programs that, although flawed, did improve working and living conditions for many.

However, even as its different political tendencies began to unify, creating a national organization of the unemployed, the movement began to suffer a loss of militancy and vision that left it unable to further influence political developments.  As a consequence, the reforms of the Second New Deal came to define the limits of change.

In 1934, the Communist Party-organized Unemployed Councils tightened their organizational form, finally adopting a written constitution.  In early 1935, Socialist Party-organized unemployed organizations and a number of Musteite-organized Unemployed Leagues joined together to create a national organization of the unemployed, the Workers Alliance.  The following year, the Workers Alliance reached agreement with the Unemployed Councils and several other small unemployed organizations to form a new, larger national organization of the unemployed, the Workers Alliance of America (WAA). This unity was possible in large part because of the CP’s newly adopted popular front policy, which led it to seek alliances with other political tendencies and groups that were seen as anti-fascist.  This included the Socialist Party and Muste’s Conference for Progressive Labor Action and their associated movements of the unemployed.

The Workers Alliance of America, critical of the WPA, continued to fight for the unemployed and those on relief.  For example, when the Roosevelt administration announced planned cuts in WPA employment for 1937, it organized sit-ins and demonstrations at city relief offices throughout the country.  The President, under pressure from big city mayors, rescinded the cuts.

However, defending an existing program is not the same as winning a new, improved one.  And this the movement could not do for several reasons.  One is that the rank and file base of the unemployed movement was shrinking because of the growth in the economy and the expansion in relief opportunities.  Another is that many of the movement’s most experienced activists were now employed as organizers in the growing trade union movement.

A third reason is that changes in the relief system undermined the movement’s ability to mobilize the unemployed and win gains through collective action.  The system had become professionalized, with relief officials in city after city establishing rules about the size of delegations that would be allowed in offices and the number of times each week that delegations could seek meetings with officials. Moreover, relief office workers were instructed not to meet clients if they were accompanied by a delegation or grant relief if a delegation was present in the office.

This left local unemployed organizers in the position of either accepting the new ground rules to ensure that their members received relief or continuing their mass activity in the hope that their old strategy would still win gains.  Increasingly, members advocated for the former, leaving organizers with no choice.  In fact, as a sign of the growing sophistication of the New Deal relief effort, a number of relief offices actually offered jobs to local unemployed-movement activists, with the promise that they could help make the system work more efficiently and effectively for those seeking relief.  In many cases, those offers were accepted.

Perhaps the most important reason for the movement’s growing political weakness was the Communist Party’s decision to pursue an alliance with the Roosevelt administration as part of its anti-fascist popular front policy.  This led the party to organize support for Roosevelt’s 1936 election and his New Deal policies, and to deemphasize oppositional and militant mass actions in support of social transformation in favor of more established political activity such as petition drives and lobbying for improvements in existing programs.  In fact, hoping to win Roosevelt’s good will, the CP often organized rallies designed to show worker support for the WPA and other New Deal programs.  Roosevelt was actually invited to give the main speech at the WAA’s second annual convention.  When he turned down the invitation, the honor was given to the WPA’s Director of Labor Relations.  In 1938, WAA locals even campaigned for pro-New Deal candidates.

Increasingly the WAA became integrated into the New Deal.  As Piven and Cloward point out:

The [WAA became] recognized as the official bargaining agent for WPA workers, and alliance leaders now corresponded frequently with WPA administrators, communicating a host of complaints, and discussing innumerable procedural questions regarding WPA administrative regulations. Some of the complaints were major, having to do with pay cuts and arbitrary layoffs. Much of the correspondence, however, had to do with minute questions of procedure, and especially with the question of whether WPA workers were being allowed to make up the time lost while attending alliance meetings. Alliance leaders also wrote regularly to the president, reviewing the economic situation for him, deploring cuts in WPA, and calling for an expansion of the program.

The WAA continued to make demands on the administration, drafting their own bills calling for greater public spending and employment at union wages, advocating for their own far more sweeping social insurance program, and calling for the establishment of a national planning agency to oversee a permanent public works program.  But the movement no longer threatened Roosevelt, and its demands were largely ignored.  The WAA dissolved itself in 1941.

The labor movement, riding the growth in the economy, soon replaced the unemployed movement as the most powerful social force for change.  However, for reasons noted above, it also underwent its own moderation despite the efforts of rank-and-file activists.  For example, CIO leaders established Labor’s Non-Partisan League in 1936 to support President Roosevelt’s reelection and his New Deal program.  World War II, the vicious post-war anti-communist attacks on all critics of capitalism (especially in the labor movement), and the strength of the post-war economic expansion finally buried the promise of a radical transformation.  There would be no transformative Third New Deal.

Lessons

The New Deal experience holds a number of important lessons for those advocating a Green New Deal.  First, the existence or even recognition of a crisis cannot be counted on to motivate a change in government policy if that change threatens the status quo.  It took years of mass organizing to force the federal government to acknowledge its responsibility to respond to the devastating social consequences of the Great Depression.  The challenge will be even greater today since, unlike in the 1930s, the capitalist class continues to enjoy lucrative opportunities for profit-making.

Second, a broad-based mass movement that threatens the stability of the system can force a significant change in government policy.  The driving force for change in the 1930s was the movement of unemployed, and its early power came from the Communist Party’s ability to establish a network of local Unemployed Councils that provided unemployed workers with the opportunity to better understand the cause of their hard times, build class solidarity through collective actions in defense of local needs, and become part of broader campaigns for public policies on the national level that were directly responsive to their local concerns.

It is likely that activists for a Green New Deal will have to engage in a similar process of movement building if they hope to force a meaningful government response to our current crises.  Despite the fact that we face a number of interrelated social, economic, and ecological crises, activists must still find ways to weave together different local organizations engaged in collective actions in defense of their local needs into a nation-wide political force able to project a vision of responsive system change as well as define and fight for associated policies.

Third, government responses to political pressure can be expected to fall far short of movement demands for transformative change.  The Roosevelt administration’s First New Deal programs fell far short of what working people demanded and needed.  It took sustained organizing to win a Second New Deal, which, while better, was still inadequate.  If the movement for a Green New Deal succeeds in forcing government action, it is safe to assume that, much as in the 1930s, the policies implemented will be partial and inadequate.  Thus, movement activists have to prepare participants for a long and ongoing campaign of mobilization, organizational development, and pressure.

Fourth, because of the importance of government policy and the natural attraction of wanting to exert personal influence on it, movement activists must remain vigilant against becoming too tied to the government bureaucracy, thereby losing their political independence and weakening the movement’s capacity to continue pushing for further changes in state policy.  WAA leaders understandably wanted to influence New Deal policy, but their growing embrace of the Roosevelt administration, pursued for broader political objectives as well, ended up weakening the movement’s organizational strength and effectiveness and, perhaps even more importantly, its vision of a more egalitarian and democratic society.  Green New Deal activists can expect to face the same kind of pressures if a progressive government comes to power and begins to initiate its own reform program; movements must be alert to this danger.

Fifth, and finally, movements have to be careful not to become too policy oriented. The New Deal included a number of different programs each designed to address different problems.  This created a natural tendency for the different organizations that comprised the broader social movement to narrow their own focus and concentrate on finding ways to respond to the policy shortcomings that most affected their members.  Thus, while the unemployed, those on relief, and those fighting for unionization initially shared a sense of common struggle, over time, in large measure because of their success in winning reforms, they became separate movements, each with their own separate concerns. As a consequence, the overall power, unity, and commitment of the broader social movement for massive societal change was weakened.

This is a challenge that the movement for a Green New Deal can expect to face if it is successful enough to force meaningful government reforms, especially given the multiplicity of the challenges the country faces. The only way to minimize this challenge is to ensure that movement organizing, from the very beginning, encourages participants to see the need for the broader transformative change inspired by the notion of a Green New Deal, and to draw from their struggle an ever more concrete understanding of how that change can be advanced and how real improvement in their lives depends on its achievement.

What the New Deal can teach us about winning a Green New Deal: Part II—Movement Building

In Part I in this series on lessons to be learned from the New Deal, I described the enormous economic and social costs of the first years of the Great Depression and the reluctance of business and government leaders to pursue policies likely to threaten the status quo.  I did so to demonstrate that we should not assume that simply establishing the seriousness of our current multifaceted crisis, especially one that has yet to directly threaten capitalist profitability, will be enough to win elite consideration of a transformative Green New Deal.

I also argued that it was the growth of an increasingly militant political movement openly challenging the legitimacy of the police, courts, and other state institutions that finally transformed the national political environment and pushed Roosevelt to change course and introduce his early New Deal employment and relief programs.  In this post, I examine the driving force of this movement, the movement of unemployed.

The growth and effectiveness of the unemployed movement owes much to the organizing and strategic choices of the US Communist Party (CP).  While there is much to criticize about CP policies and activities, especially its sectarianism and aggressive antagonism towards other groups, there is also much we can learn about successful organizing from its work with the unemployed in the early years of the depression.

The party faced the challenge of building a mass movement powerful enough to force a change in government policy, and it largely succeeded.  Although the movement’s initial victory was limited, the policy breakthrough associated with the programs of the First New Deal led to new expectations and demands, culminating in Roosevelt’s adoption of far more extensive employment and relief policies as part of his Second New Deal, only two years later.

We face a similar challenge today; we need to build a mass movement capable of forcing the government to begin adopting policies that help advance a Green New Deal.  Therefore, it is well worth our time to study how party activists built a national organization of the unemployed that helped the unemployed see that their hard times were the result of structural rather than personal failure; encouraged local, collective, and direct action in defense of immediate shared basic needs; and connected local actions to a broader national campaign for government action.

The CP and the unemployed movement

The CP made its decision to organize the unemployed even before the start of the Great Depression.  In August 1929, two months before the stock market crash, the CP established the Trade Union Unity League (TUUL) as an alternative to the AFL and called on that body to assist in the creation of a nation-wide organization of Unemployed Councils (UCs).

The CP was following the lead of the Communist International which had, in 1928, declared the start of the so-called Third Period, which was said to mark the beginning of capitalism’s terminal stage, and called on all communist parties to end their joint work with other organizations and prepare for the coming revolutionary struggle.  This stance meant that as unemployment exploded, those without work had the benefit of an existing organization to give them a voice and instrument of action.  Unfortunately, it also led to destructive attacks on other political tendencies and efforts to build organizations of the unemployed, thereby weakening the overall effort.

The CP’s first big effort directed towards the unemployed was the March 6, 1930 demonstrations against unemployment and for relief, organized under the banner of “International Day for Struggle against Worldwide Unemployment,” which drew some 500,000 people in twenty-five cities.  The New York City demonstration, the largest, was met by police repression, with many demonstrators beaten and arrested.  But another New York City protest by the unemployed in October produced a victory, with the city agreeing to boost relief spending by $1 million.  These actions created visibility for the CP’s fledgling national network of UCs and helped to build its membership.

The Unemployed Councils of the USA held its founding convention in early July 1930.  The following month it issued a statement calling on Congress to adopt its “Workers Unemployment Insurance Bill.”  The bill called for “payment of $35 per week for each unemployed worker plus an additional $5 per week per dependent and the creation of a ‘National Unemployment Insurance Fund’ to be generated through a tax on all property valued in excess of $25,000 and incomes of more than $5,000.”  A new Workers’ Commission, to be elected by working people, was to control the distribution of funds.

To this point, the Unemployed Councils of the USA was dominated by the CP, and its general program and demands largely echoed those of the CP, often including foreign policy declarations expressing support for the Soviet Union.  However, in November, finally acknowledging that this dominance was limiting recruitment, the party agreed to give its organizers more independence and freedom to focus on the issues of most direct concern to the unemployed.  In the months that followed, “a wave of rent strikes, eviction fights, and hunger marches involving an estimated 250,000 workers in seventy-five cities and six states swept the country. The Unemployed Councils had become a force to be reckoned with.”

The party’s focus on building a confrontational movement operating both locally and nationally led it to reject a variety of other efforts embraced by some unemployed.  As Franklin Folsom describes:

Early in 1931, some leaders of Unemployed Councils had recommended setting up food kitchens, and Communists helped organize food collections. These were humane acts of assistance to people who needed something to eat immediately. In a few months, however, both the Communists and the Unemployed Councils abandoned the idea, saying it had nothing to do with solving the basic problems of the unemployed.  Similarly, Communist and council policy on the subject of looting varied depending on time and place.  In the early days of mass unemployment some Communists encouraged the direct appropriation of food.  Later the practice was frowned on because it solved no long-term problem and could provoke very costly counteraction.

Many unemployed also turned to self-help activities to survive.  The so-called “productive enterprise” movement, in which unemployed workers sought to create their own enterprises to produce either for the market or barter, spread rapidly.  According to one study, by the end of 1932 this movement was active in thirty-seven states, with the largest group in California.  The CP and UCs opposed this effort from the start, calling it a self-starvation movement.

The organization and activity of the UCs

Most UCs were neighborhood centered, since the unemployed generally spent most of their time in the neighborhoods where they lived. The basic unit of the UC was the block committee, which comprised all unemployed local residents and their family members.  Each block committee elected delegates to a neighborhood unemployed council, and these councils, in turn, elected delegates to county or city unemployed councils.

The block committee office served as a social center, where the unemployed could gather and build relationships.  Through conversation and even more importantly action they were also able to develop a new radical understanding of the cause of their unemployment as well as appreciation for collective power.  As Steve Nelson, a leader of the Chicago UC movement, explained, it was important for the unemployed to “see that unemployment was not the result of their own or someone else’s mistake, that it was a worldwide phenomenon and a natural product of the system.” Thus, “unemployed agitation was as much education as direct action.”

With time on their hands, the unemployed were generally eager to act in defense of their neighbors, especially around housing and relief.  Here is Christine Ellis, a UC organizer, talking about what happened at one UC meeting in a black neighborhood on the west side of Chicago:

We spoke simply, explained the platform, the demands and activities of the unemployed council. And then we said, “Are there any questions?”…. Finally an elderly Black man stood up and said, “What you folks figure on doing about that colored family that was thrown out of their house today?… They’re still out there with their furniture on the sidewalk.” So the man with me said, “Very simple. We’ll adjourn the meeting, go over there, and put the furniture back in the house. After that, anyone wishing to join the unemployed council and build an organization to fight evictions, return to this hall and we’ll talk about it some more.” That’s what we did…everybody else pitched in, began to haul in every last bit of furniture, fix up the beds…and when that was all done, went back to the hall. The hall was jammed!

Carl Winder, another UC activist, describes the response of the councils in New York to attempted evictions for nonpayment of rent:

Squads of neighbors were organized to bar the way to the dispossessing officers.  Whole neighborhoods were frequently mobilized to take part in this mutual assistance.  Where superior police force prevailed, it became common practice for the Unemployed Councils to lead volunteer squads in carrying the displaced furniture and belongings back into the home after the police had departed.  Council organizers became adept in fashioning meter-jumps to restore disconnected electric service and gas.

Hosea Hudson, a UC activist in Alabama, tells how landlords in Birmingham would sometimes allow tenants to stay even without paying rent “because if they put a family out, the unemployed workers would wreck the house and take it away for fuel by night…. This was kind of a free-for-all, a share-the-wealth situation.”

“No Work, No Rent!” was the common chant at UC anti-eviction actions.  And because UCs were part of a national organization, successful strategies in one area were quickly shared with UCs in another, spurring new actions.  According to one account, UCs had practically stopped evictions in Detroit by March 1931.  It was estimated that in 1932, 77,000 New York City families were moved back into their homes by UCs.  At the same time, these were costly actions.  The police would often arrest many of those involved as well as use force to end resistance, leading to serious injuries and in some cases deaths.

UCs also mobilized to help people who were turned down for relief assistance.  Normally, UC organizers would gather a large crowd outside the relief agency and send in an elected committee to demand a meeting to reverse the decision.  Here is Hosea Hudson again, explaining the approach of the Birmingham UC:

If someone get out of food and been down to the welfare two or three times and still ain’t got no grocery order…. We’d go to the house of the person that’s involved, the victim, let her tell her story. Then we’d ask all the people, “What do you all think could be done about it?” We wouldn’t just jump up and say what to do. We let the neighbors talk about it for a while, and then it would be some of us in the crowd, we going to say, “If the lady wants to go back down to the welfare, if she wants, I suggest we have a little committee to go with her and find out what the condition is.”

In New York, UC members would often organize sit-ins at the relief office and refuse to leave until the center reversed a negative decision.  Intimidated by the aggressive protests, local relief officials throughout the country increasingly gave ground and approved relief requests.

This kind of activism directly challenged business and elite claims that prosperity was just around the corner.  It also revealed a growing radical spark, as more and more people openly challenged the legitimacy of the police, the court system, and state institutions.

With demands for relief escalating, cash-strapped relief agencies began pressing city governments for additional funds.  But city budgets were also shrinking.  As Danny Lucia reports in his study of unemployed organizing, this was an explosive situation.  In 1932, with Chicago’s unemployment rate at 40 percent, “Mayor Anton Cermak told Congress to send $150 million today or federal troops in the future.”

Thus, the militancy of the unemployed movement was now pushing mayors and even some business leaders to also press for federal action.  This development served to amplify the UCs’ own state and national campaigns demanding direct job creation and a program of federal relief.  These campaigns, by design, also helped generate publicity and support for local UC actions.

For example, in January 1931, a gathering of the Unemployed Councils of America and the TUUL decided to launch a national petition drive aimed at forcing Congress to pass a Federal Unemployment Insurance bill.  The UCs then began door-to-door canvassing for signatures.  Approximately a month later a delegation of 140 people was sent to Washington DC to deliver the petition to Congress on National Unemployment Insurance Day.  Demonstrations in support of the petition, organized by UCs, were held in most major cities on the same day.

Not long after, the CP set up a new organization, the Unemployed Committee for the National Hunger March, to coordinate a national hunger march on Washington DC to demand federal unemployment insurance and “the granting of emergency winter relief for the unemployed in the form of a lump-sum payment of $150 per unemployed worker, with an additional $50 for each dependent” as well as “a 7-hour workday, establishment of a union wage pay scale for unemployed workers, payment of a soldiers’ bonus to veterans of World War I, and an end to discrimination against black American and foreign-born workers.”  Local conferences selected 1,670 delegates, who converged on Washington from four separate columns in December 1931.  Their trip across the country was supported by local UCs.

Not surprisingly, the delegates were denied entrance to the Capitol to present their demands.  They stayed two days and then started back, holding mass meetings across the country on their return trip to talk about their demands and the need for mass action to win them.

Another National Hunger March took place the following year.  This time 3,000 delegates came to Washington DC to again present their demands for winter relief and unemployment insurance.  These marches not only helped to strengthen the movement of the unemployed, they also greatly increased the pressure on elected officials to take some action to restore popular confidence in the government.

Underpinning the strategic orientation of the work of the UCs was the CP’s determination to build solidarity between the labor movement and the unemployed and to foster anti-racist unity.  The first is highlighted by struggles in Detroit, where most unemployment was the result of auto factory layoffs.  There, the UCs and the Young Communist League led several marches to auto plants to protest the inadequate benefits given to laid-off workers.  Organizers would also read statements aimed at the workers still employed in the plants, pledging that the unemployed would not scab if workers struck for improved conditions.

As for anti-racism work, the CP “made sure that all of its agitation in the unemployed councils included protests against racial discrimination by relief agencies, landlords, and local and federal government.  On a more individual level, the Communists’ emphasis on multiracial organizing created situations in which whites and Blacks worked together for a common purpose and created personal bonds.”

Other organizing efforts

The CP was not the only left organization working to build a movement of the unemployed.  Both the Socialist Party and the Conference for Progressive Labor Action (CPLA), led by A.J. Muste, also created unemployed organizations that mobilized hundreds of thousands of jobless workers in local and national protests.  The Socialist Party created affiliated committees in a number of cities, the largest in Chicago and New York.  These committees were, like the UCs, generally oriented towards direct action in response to local conditions, but they also engaged in electoral efforts.

The CPLA organized a number of Unemployed Citizen Leagues (UCLs) following the model of the Seattle Unemployed Citizens League. Established in the summer of 1931, the Seattle UCL quickly grew to a membership of 80,000 by 1933.  The UCLs initially focused on self-help through barter and labor exchange.  For example, members of the Seattle league:

persuaded farmers to let them harvest the fruit and potatoes for which there was no market, and they borrowed trucks to transport this produce.  Women exchanged sewing for food.  Barbers cut hair for canned berries.  This practice of barter spread and was highly organized. . . . Some men collected firewood from cutover forested areas; in all, they cut, split, and hauled 11,000 cords.  The products of these labors were shared by UCL members.  Some members repaired houses or worked in shoe repair shops, while others did gardening.  There were also child welfare and legal aid projects in which lawyers contributed their services.

The UCLs were also active in local elections, supporting candidates and legislation in favor of extended relief aid and unemployment insurance.  However, after a few years, most abandoned their focus on self-help, finding that “the needs of the jobless greatly exceeded the ability of a mutual aid program to meet them,” and turned instead to more direct-action protests similar to those of the UCs.  Although the CPLA failed to develop a national presence, their leagues were important in the Midwest, especially Ohio.

The CP was hostile to these organizations and their organizing efforts.  In line with its Third Period strategy, the CP considered them to be a danger to the movement it was trying to build and their leaders to be “social-fascists.”  Party opposition went beyond denouncing these groups.  UC activists were encouraged to undermine their work, sometimes by physical force, other times by infiltrating and disrupting their meetings.  This sectarianism clearly weakened the overall strength of the unemployed movement.  At the same time, local UC activists would sometimes ignore CP and UC leadership directives and find ways to build solidarity around joint actions on behalf of the unemployed.

The unemployed were not the only group whose organizing threatened the status quo.  As Steve Fraser pointed out: “Farmers took to the fields and roads in shocking displays of lawlessness. All across the corn belt, rebels banded together to forcibly prevent evictions of fellow farmers.” The Farm Holiday Association, an organization of midwestern farmers founded in 1932, not only mobilized its members to resist evictions, it also supported a progressive income tax, federal relief for the urban unemployed, and federal government control of the banks.  “In the South, tenants and sharecroppers unionized and conducted what a Department of Labor study called a ‘miniature civil war.’”

Veterans also organized.  World War I veterans from around the country, many with their families, traveled to Washington DC in the summer of 1932.  The call for a national Bonus March, although made by a largely anti-communist leadership, was inspired by the CP-organized First National Hunger March.  The veterans had been promised a bonus to compensate for their low war-time pay, but Congress had delayed payment until 1945.  The veterans wanted their money now and set up camps near the Capitol to pressure Congress to act.  Their camps were destroyed and the veterans violently dispersed by troops led by Douglas MacArthur and Dwight Eisenhower.

In short, the political trajectory was one that concerned a growing number of political and business leaders.  Working people, largely anchored by a left-promoted, mass-based movement of unemployed, were becoming increasingly militant and dismissive of establishment calls for patience.  Continued federal inaction was becoming ever more dangerous.  Recognizing the need for action to preserve existing structures of power, it took Roosevelt only three months to drop his commitment to balanced budget orthodoxy in favor of New Deal experimentation.

Lessons

The multifaceted crisis we face today is significantly different from the crisis activists faced in the first years of the Great Depression.  But there is no question that, much like then, we will need to build a powerful mass movement for change if we hope to harness state power to advance a Green New Deal.

The First New Deal was not the result of administration concerns over the economic and social costs of the Great Depression.  Rather, it was political pressure that forced Roosevelt to begin experimenting with programs responsive to the concerns of working people.  And, not surprisingly, these experiments were, as will be discussed in the next post in this series, quite limited. It took new organizing to push Roosevelt to implement more progressive programs as part of his Second New Deal.

There are also lessons to be learned from the period about movement building itself, specifically the CP’s organizing and strategic choices in targeting the unemployed and building a national movement of the unemployed anchored by a network of UCs.  The UCs helped transform how people understood the cause of their hard times.  They also created a local, collective, and direct outlet for action in defense of immediate shared basic needs.  The CP also emphasized the importance of organizing those actions in ways designed to overcome important divisions among working people.  Finally, the party and the UCs created broader campaigns for public policies on the national level that were directly responsive to local concerns and actions.  Thus, organizing helped create a momentum that built political awareness, leadership capacity, class unity, and national weight around demands for new public initiatives.

The call for a Green New Deal speaks to a variety of crises and the need for change in many different sectors, including food production, energy generation, transportation, manufacturing, social and physical infrastructure, housing, health care, and employment creation.  It also projects a vision of a new more sustainable, egalitarian, and democratic society.  While it would be a mistake to equate the organizing work in the early years of the depression, which focused on employment and relief, with what is required today given the multifaceted nature of our crisis, we would do well to keep the organizing experience highlighted above in mind as we seek to advance the movement building process needed to win a Green New Deal.  It offers important insights into some of the organizational and political challenges we can expect to face and helpful criteria for deciding how best to respond to them.

For example, it challenges us to think carefully about how to ensure that our organizing work both illuminates the roots of our current multifaceted crises, building anti-capitalist consciousness, and challenges existing racial, ethnic, and gender divisions, strengthening working class unity.  It also challenges us to think about how to ensure that our efforts in different geographic areas and around different issues will connect to build a national presence and organizational form that strengthens and unites our various efforts and also projects our overall vision of a restructured society.  And it challenges us to think about how we should engage the state itself, envisioning and preparing for the ways it can be expected to seek to undermine whatever reforms are won.

Growing Old in America: Baby Boomer Nightmare

Despite their reputation as the wealthiest generation, baby boomers (generally considered to be those born between 1946 and 1964) are facing a retirement nightmare.  A 2016 St. Louis Federal Reserve study of the retirement readiness of U.S. families came to the same conclusion but put it more gently: “It could be worrisome that, for many American households, the total balances of their retirement accounts may not be sufficient to ensure a solid life in retirement.”

The investment industry, always ready to deflect blame, argues that the problem is the result of the fact that Americans just don’t save enough.  But even Barron’s, a sister publication of the Wall Street Journal that specializes in financial news, understands what is really happening.  As a recent article in the magazine points out:

Too few Americans are saving for retirement. Those who do save are putting away too little. It is only a matter of time before this sparks an economic and political crisis. . . .

But America’s retirement crisis wasn’t created because of character flaws or personal irresponsibility. Nor can it realistically be fixed by technocratic fixes.

The ugly, unspoken truth is that many people are just not earning enough money. They barely have enough to cover their daily expenses; they don’t have enough left over to be able to save.

The promised golden years are out of reach for most boomers.

Baby boomers are moving rapidly towards retirement.  Those born in 1946 are now 73, those born in 1964 are now 55.  Despite being celebrated for their good economic fortune, especially in contrast to millennials, most boomers face a future that doesn’t include retirement with dignity or, in the words of the St. Louis Fed, a solid life.

Although labeled the wealthiest generation, a Stanford Center on Longevity examination of retirement preparedness found that “baby boomers are in a financially weaker position than earlier generations of retirees, in terms of home equity accumulation, financial wealth, and total wealth.”

The Stanford Center study divided the boomer generation into two groups, the early boomers (born 1948-1953) and mid-boomers (1954-1959), and compared them to early (before 1942) and later born (1942-47) members of the previous “silent generation.”  The following are some of its key findings:

  • Holding age fixed, mid-boomers around 55-60 years old had saved less than previous generations had at the same age.

  • Holding age fixed, a 50-year-old mid-boomer had saved less in retirement plans, including workplace plans and Individual Retirement Account plans, than a 50-year-old from prior generations.

  • Holding age fixed, boomers aged 55-60 carried a higher debt burden than prior generations at the same age, evidenced in a higher debt-to-net-worth ratio, a lower ratio of liquid assets to total assets, and a higher loan-to-value ratio.

But boomer problems are not just comparative.  For example, the Stanford study also found that approximately 30 percent of baby boomers had no money saved in retirement plans in 2014, when they were age 58, on average, “leaving them little time to start saving for retirement.”  And, the median balance for those who held a retirement account was only $200,000, far too small an amount to generate the income needed to carry a person through a 20- to 30-year retirement.
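To put that figure in perspective, here is a back-of-the-envelope illustration (mine, not the Stanford study’s), assuming the commonly cited 4 percent annual withdrawal rule of thumb:

```python
# Rough illustration (not from the Stanford study) of the retirement income a
# $200,000 balance can support, assuming the common 4 percent withdrawal rule.
median_balance = 200_000
withdrawal_rate = 0.04  # assumed sustainable annual withdrawal rate

annual_income = median_balance * withdrawal_rate
monthly_income = annual_income / 12
print(f"${annual_income:,.0f} per year, or about ${monthly_income:,.0f} per month")
# -> $8,000 per year, or about $667 per month
```

On that assumption, the median account supplements other income by only a few hundred dollars a month.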

Looking just at retirement age boomers, a 2018 PBS News Hour report noted that:

Nearly half of Americans nearing retirement age (65 years old) have less than $25,000 put away, according to the Employee Benefit Research Institute’s annual survey. One in four don’t even have $1,000 saved.

Adding to the retirement nightmare is the fact that many boomers also remain deep in debt.  A CNBC story reports that:

One-third of homeowners over the age of 65 were still paying off a mortgage in 2012, compared with less than a quarter of people in 1998 — and the median amount they owed nearly doubled to $82,000 from $44,000.

Meanwhile, the number of people aged 60 and older with student debt quadrupled between 2005 and 2015, to 2.8 million from 700,000.

One reason for low boomer financial balances is that this generation was hit hard by the Great Recession and the following years of low interest rates, and has yet to recover. Boomer median household net worth was $224,100 in 2007 and only $184,200 in 2016.

African American and Latinx baby boomers face even greater problems, earning less money and having far less retirement savings than white Americans.  According to Forbes, “The average white family had more than $130,000 in liquid retirement savings (cash in accounts such as 401(k)s, 403(b)s and IRAs) vs. $19,000 for the average African American in 2013, the most recent data available.”

Latinx retirement savings also trails that of whites.  For example, in 2014, among working individuals age 55 to 64, only 32.2 percent of Latinx had money in a retirement account compared with 58.5 percent of whites. The average Latinx account held $42,335 while the average white account held $103,526.

With private pensions and personal savings inadequate to fund a secure retirement, it is no wonder that so many boomers strongly defend Social Security, the so-called third leg (in addition to private pensions and personal savings) of the retirement “stool.”  But, as important as it is, the average Social Security check in 2018 was only $1,422 a month or $17,064 a year.

It should therefore come as no surprise that research by the Institute on Assets and Social Policy finds that one-third of seniors have no money left over at the end of the month, or are in debt, after meeting necessary expenses.  Or that growing numbers of seniors are deciding to forego retirement altogether, either by continuing to work or by returning to the labor force.

Saying goodbye to retirement

According to the 2019 report titled Boomer Expectations for Retirement, one-third of boomers plan to retire at age 70 or not at all.  And one-third of employed boomers ages 67-72 postponed retirement.

Thus, while labor force participation rates are declining for many age cohorts, they are growing for boomers and older workers. In fact, between April 2000 and January 2018, “there has been essentially no net growth of employment for workers under age 55. Over that same time, employment for workers over age 55 has doubled.”

The figure below shows labor force participation rates for six age 50-plus cohorts since the turn of the century. As Jill Mislinski states: “The pattern is clear: The older the cohort, the greater the growth.”

Sadly, many of these older workers have had little choice but to accept low-paid, physically demanding work at some of America’s richest companies (e.g., Walmart and Amazon), which are delighted to take advantage of their desperation.

Jessica Bruder’s 2017 book, Nomadland: Surviving America in the Twenty-First Century, describes mostly white baby boomers who, strapped for money, decide to buy used RVs and travel.  We learn about the friendships they make, and also about the minimum-wage seasonal jobs they are forced to take to survive.  Zhandarka Kurti draws on Bruder’s work to highlight their experience with Amazon:

With its motivational slogan of “work hard, have fun, make history,” Amazon recruits seasonal workers at various “nomad friendly events” including “RV shows and rallies—in more than a dozen states across America.” Older workers are drawn to the opportunity to make good money in a relatively short time. . . . Also ironically, Walmart and Amazon, the two competing retail giants and also the country’s largest employers, allow their workers to park overnight, an attractive perk for many older nomads who struggle with food security let alone rent. . . .

While most of the nomads are made aware of the physical aspects of the job [working in Amazon fulfillment centers] during the training seminars, they are nonetheless surprised by just how much pain they are in after a day’s work. . . . Older workers constantly complain of chronic pain from work and Amazon’s solution is to offer free over-the-counter pain killers. . . . Amazon leaves older workers so physically tired that they have little occasion to enjoy their leisure time. Instead they spent the remainder of their “free” time nursing themselves back to health to survive another workday. . . .

In many ways older workers are Amazon’s dream labor force. “They love retirees because we’re dependable. We’ll show up and work hard, and are basically slave labor” one 78 year-old workamper who previously worked as a teacher in California’s community colleges confides in Bruder. Older workers are what Bruder calls “plug-and-play labor” in that they are only around for a short time, are often too tired to complain about the non-existent benefits and are generally appreciative of the jobs regardless of the pain they endure. . . . Amazon also receives federal tax credits to hire older disadvantaged workers and the company predicts that by 2020 one in every four workampers in the US would have worked for Amazon.

It is clear that leading American businesses do not favor bringing back pensions, boosting wages, or paying higher taxes to strengthen and expand Social Security and other social services.  Thus, if existing trends are not challenged and reversed, the boomer generation (or at least a significant minority of it) may well be the last to experience some sort of satisfactory retirement.  This development is yet another sign of a failed system.  Boomers need to find ways to help younger generations keep the goal of a satisfying retirement alive, and to join them in a common fight for the structural changes required to realize it.

Millennials: Hit Hard And Fighting Back

A lot of what has been written and said about millennials is critical.  The business press has been tough on their spending habits.  As a recent Federal Reserve Board study of millennial economic well-being explained:

In the fields of business and economics, the unique tastes and preferences of millennials have been cited as reasons why new-car sales were lackluster during the early years of the recovery from the 2007–09 recession, why many brick-and-mortar retail chains have run into financial trouble (through lower brand loyalty and goods spending), why the recoveries in home sales and construction have remained slow, and why the indebtedness of the working-age population has increased.

Politicians, even some Democratic Party leaders, have tended to write them off as complainers. For example, while on a book tour, former Vice President Joe Biden told a Los Angeles Times interviewer that “The younger generation now tells me how tough things are. Give me a break. I have no empathy for it. Give me a break.” Biden went on to say that things were much tougher for young people in the 1960s and 1970s.

In fact, quite the opposite is true.  For better or worse, the authors of the Federal Reserve Board study found that there is “little evidence that millennial households have tastes and preference for consumption that are lower than those of earlier generations, once the effects of age, income, and a wide range of demographic characteristics are taken into account.”  More importantly, millennials are far poorer than past generations were at a similar age, and are becoming a significant force in revitalizing the labor movement.

Economic hard times for millennials

The Federal Reserve Board study leaves no doubt that millennials are less well off than members of earlier generations when they were equally young. They have lower earnings, fewer assets, and less wealth.  All despite being better educated.

The study compares the financial standing of three different cohorts: millennials (those born between 1981 and 1997), Generation Xers (those born between 1965 and 1980), and baby boomers (those born between 1946 and 1964).  Table 1, below, shows inflation adjusted income in three different time periods for all households with a full-time worker and for all households headed by a worker younger than 33 years.

The median figures, which best represent the earnings of the typical member of the group, are shown in brackets.  Comparing the median annual earnings of young male heads of households and of young female heads of household across the three time periods shows the millennial earnings disadvantage.  For example, while the median boomer male head of household earned $53,400, the median millennial male head of household earned only $40,600.  Millennial female heads of household suffered a similar decline, although not nearly as steep.

Table 4 compares the asset and wealth holdings of the three generations, and again highlights the deteriorating economic position of millennials.  As we can see, the median total assets held by millennials in 2016 were significantly lower than those held by baby boomers and only half as large as those held by Generation Xers.  Moreover, millennials suffered a decrease in asset holdings across most asset categories.

Finally, we also see that millennials have substantially lower real net worth than earlier cohorts. In 2016, the average real net worth of millennial households was $91,700, some 20 percent less than baby boomer households and almost 40 percent less than Generation X households.
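Backing those percentages out into dollar terms (my arithmetic, treating “almost 40 percent” as exactly 40 percent) gives a rough sense of the gap:

```python
# Implied 2016 household net worth by cohort, backed out from the percentages
# cited above; "almost 40 percent" is treated as exactly 40 percent, so these
# figures are approximations.
millennial_net_worth = 91_700

boomer_net_worth = millennial_net_worth / (1 - 0.20)  # millennials hold 20% less
genx_net_worth = millennial_net_worth / (1 - 0.40)    # millennials hold ~40% less

print(f"Implied baby boomer net worth:  ${boomer_net_worth:,.0f}")  # about $115,000
print(f"Implied Generation X net worth: ${genx_net_worth:,.0f}")    # about $153,000
```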

Fighting back

Millennials have good reason to be concerned about their economic situation.  What is encouraging is that there are signs that growing numbers see structural failings in the operation of capitalism as the cause of their problems and collective action as the best response.  A recent Gallup poll offers one sign.  It found a sharp fall in support for capitalism among those 18 to 29 years old, from 68 percent positive in 2010 down to 45 percent positive in 2018.  Support for socialism remained unchanged at 51 percent.

A recent Pew Research poll offers another, as shown below. Young people registered the strongest support for unions and the weakest support for corporations.

Of course, what millennials do rather than say is what counts.  And millennials are now boosting the ranks of unions.  Union membership grew in 2017 for the first time in years, by 262,000.  And three in four of those new members were under 35.  Figures for 2018 are not yet available, but given the strong and successful organizing work among education, health care, hotel, and restaurant workers, the positive trend is likely to continue.

Millennials are now the largest generation in the United States, having surpassed the baby boomers in 2015.  Hopefully, self-interest will encourage them to play a leading role in building the movement necessary to transform the US political-economy, improving working and living conditions for everyone.

The US Medical System: Healthy Profits At People’s Expense

Health care is a big and profitable business.  As a Wall Street Journal article points out:

[The US] will soon spend close to 20% of its GDP on health–significantly more than the percentage spent by major Organization for Economic Cooperation and Development nations. . . .

Health care has become a larger part of the economy, creating powerful constituencies resistant to changing the way the system operates.

The health-care industry overtook the retail sector as the nation’s largest employer in December, giving local economies and their workers a stake in the industry’s growth. Health jobs surpassed manufacturing jobs in 2008.

The revenues of health-care companies represented nearly 16% of the total revenues of firms in the S&P 500 last year, up from about 4% in 1984.

Health-care companies have more than doubled overall lobbying spending since 1998, and have become a bigger percentage of total lobbying by industries.

Unfortunately, as the Wall Street Journal article also goes on to say, “Despite the higher spending, the U.S. fares worse than the OECD on most major measures of health.”

The following chart provides one illustration of the distorted nature of our health care system.  As we can see, the US spends a far higher share of its GDP on health care than the OECD average, yet has a significantly lower life expectancy.

The article highlights drug prices as one of the key drivers of US health care costs, noting that they “have risen the most of the three largest components of health spending since 2000, followed by hospital care and physician services.”

The power of the drug industry is immense, and its lobbyists actively work to ensure that the industry’s profits will remain healthy regardless of the social consequences.  The text of the recently negotiated United States-Mexico-Canada Agreement (USMCA) makes this clear.

As a Washington Post story explains:

A handful of major industries scored big wins in President Trump’s North American trade agreement — at times at the expense of ordinary consumers in the United States, Canada and Mexico.

The winners include oil companies, technology firms and retailers, but chief among them are pharmaceutical companies, which gained guarantees against competition from cheaper generic drugs. . . .

The pharmaceutical industry won stronger protection for sales of so-called biologic drugs, which are typically derived from living organisms and are administered by injection or infusion. The medicines are among the most costly and innovative on the market and are a major driver of drug spending.

USMCA guards new biologic drugs from cheaper generic competition for “at least ten years,” compared with current protection of eight years in Canada and five years in Mexico.

“By increasing the term to 10 years, there will be less competition and higher prices,” said Valeria Moy, an economics professor and the director of Mexico Como Vamos, a think tank in Mexico City. “Having more protection in that area means higher prices for consumers.”

The agreement provides extra protection to drug companies in the much larger U.S. market, as well. Current U.S. law protects biologic drugs from generic competition for 12 years, but some Democrats, including in the Obama administration, have pushed to lower that to seven years as a way to speed cheaper generics to the market and lower drug spending.

Critics of the trade agreement argued that by setting a minimum of 10 years of protection, the trilateral pact shields the pharmaceutical industry from future legislative attempts in the United States to shorten biologic drug monopolies.

It “decreases U.S. sovereignty,” said Jeff Francer, general counsel for the Association for Accessible Medicines, a lobbying group for generic drugmakers. “It would be much harder for Congress to try to roll back 12 years to seven years if we’re enshrining 10 years in a free-trade agreement.”

In short, our health care system operates very efficiently for the health care industry, and the industry appears well organized to keep it that way.

Forgotten Workers And The US Expansion

There is a lot of celebrating going on in mainstream policy circles.  The economy is said to be running at full steam with the unemployment rate now below 4 percent.  As Clive Crook puts it in Bloomberg Businessweek, “The U.S. expansion has put millions of people back to work and economists agree that the economy is now at or close to full employment.”

Forgotten in all this celebration is the fact that wages remain stagnant.  Also forgotten are the millions of workers who are no longer counted as part of the labor force and thus not counted as unemployed.

Forgotten workers

One of the best indicators of the weakness of the current recovery is the labor market status of what is called the core workforce, those ages 25-54.  Their core status stems from the fact that, as Jill Mislinski explains, “This cohort leaves out the employment volatility of the high-school and college years, the lower employment of the retirement years and also the age 55-64 decade when many in the workforce begin transitioning to retirement … for example, two-income households that downsize into one-income households.”

The unemployment rate of those 25-54 reached a peak of 9 percent in 2009 before falling steadily to a low of 3.2 percent as of July 2018.  However, the unemployment rate alone can be a very misleading indicator of labor market conditions.  That is certainly true when it comes to the labor market status of today’s core workforce.

A more revealing measure is the Labor Force Participation Rate, which is defined as the Civilian Labor Force (i.e. the sum of those employed and unemployed) divided by the Civilian Noninstitutional Population (i.e. those of working age who are not in the military or institutionalized). Because there can be significant monthly swings in both the numerator and denominator of this measure, the Labor Force Participation Rate shown in the chart below is calculated using a 12-month moving average.
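As a minimal sketch of how such a smoothed series can be computed (the column names below are hypothetical; this is not the author’s or Mislinski’s code), the calculation simply divides the labor force by the noninstitutional population each month and then takes the trailing twelve-month average:

```python
# Minimal sketch of the smoothed participation-rate calculation described above,
# assuming a monthly pandas DataFrame for the 25-54 cohort with the hypothetical
# columns "civilian_labor_force" and "civilian_noninst_population".
import pandas as pd

def participation_rate_12mo(df: pd.DataFrame) -> pd.Series:
    monthly_rate = 100 * df["civilian_labor_force"] / df["civilian_noninst_population"]
    return monthly_rate.rolling(window=12).mean()  # 12-month moving average, in percent
```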

As we can see, the Labor Force Participation Rate for the 25-54 core cohort has sharply declined, from a mid-2000 high of 84.2 percent, down to a low of 81.9 percent in July 2018. Mislinski calculates that:

Based on the moving average, today’s age 25-54 cohort would require 1.6 million additional people in the labor force to match its interim peak participation rate in 2008 and 2.9 million to match the peak rate around the turn of the century.

A related measure of labor market conditions is the Employment-to-Population Ratio, which is defined as the Civilian Employed divided by the Civilian Noninstitutional Population.  As we can see in the next chart, the Employment-to-Population Ratio of our core cohort has also declined from its mid-2000 peak.

Again, according to Mislinski,

First the good news: This metric began to rebound from its post-recession trough in late 2012. However, the more disturbing news is that the current age 25-54 cohort would require an increase of 1.2 million employed prime-age participants to match its ratio peak in 2007. To match its mid-2000 peak would require a 3.1 million participant increase.
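The arithmetic behind gap estimates like these is straightforward: multiply the shortfall in the ratio by the cohort’s population.  A rough illustration, using approximate values of my own rather than the article’s underlying series:

```python
# Illustration of the employment-gap arithmetic, using rough approximations
# (not the article's underlying data) for the 25-54 cohort in mid-2018.
prime_age_population = 127_000_000  # approx. civilian noninstitutional population, ages 25-54
current_ratio = 0.795               # approx. employment-to-population ratio
peak_ratio_2007 = 0.803             # approx. pre-recession peak ratio

gap = (peak_ratio_2007 - current_ratio) * prime_age_population
print(f"About {gap / 1e6:.1f} million more employed needed to match the 2007 peak")
```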

The takeaway

Both the Labor Force Participation Rate and the Employment-to-Population Ratio are useful measures of the employment intensity of the economy.  And in a healthy economy we should expect to see high values for both measures for the 25-54 age cohort. That is especially true for a country like the United States, where the non-market public provision of education, health care, and housing is quite limited, and an adequate retirement depends upon private savings.  In other words, people need paid employment to live and these are prime work years.

The decline, over the business cycle, in both the Labor Force Participation Rate and the Employment-to-Population Ratio for our core cohort strongly suggests that our economy is undergoing a profound structural change, with business increasingly organizing its activities in ways that require fewer workers. More specifically, the lower values in these measures mean that millions of prime age workers are being sidelined, left outside the labor market.

It is hard to know what will become of these workers and, by extension, their families and communities.  Moreover, this is not a problem only of the moment.  This cohort is still relatively young, and the social costs of being sidelined from employment—and here we are not even considering the quality of that employment—will only grow with age.  We can only hope that workers of all ages will eventually recognize that our growing employment problems are the result not of individual failings but of an increasingly problematic economic system, and begin pushing for its structural transformation.

US Militarism Marches On

Republicans and Democrats like to claim that they are on opposite sides of important issues.  Of course, depending on which way the wind blows, they sometimes change sides, like over support for free trade and federal deficits.  Tragically, however, there is no division when it comes to militarism.

For example, the federal budget for fiscal year 2018 (which ends on September 30, 2018) included more money for the military than even President Trump requested.  Trump had asked for a military budget of $603 billion, a sizeable $25 billion increase over fiscal year 2017 levels; Congress approved $629 billion.  Trump had also asked for $65 billion to finance current war fighting, a bump of $5 billion; Congress approved $71 billion.  The National Defense Authorization Act of 2018, which set the target budget for the Department of Defense at this high level, was approved by the Senate in a September 2017 vote of 89-9.

In the words of the New York Times: “In a rare act of bipartisanship on Capitol Hill, the Senate passed a $700 billion defense policy bill . . . that sets forth a muscular vision of America as a global power, with a Pentagon budget that far exceeds what President Trump has asked for.”

That Act also called for a further increase in military spending of $16 billion for fiscal year 2019 (which begins October 1, 2018).  And, in June 2018, the Senate voted 85 to 10 to authorize that increase, boosting the Defense Department’s fiscal year 2019 total to $716 billion.
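Using the figures cited above, the totals reconcile as follows:

$$\underbrace{\$629\ \text{billion}}_{\text{base budget}} + \underbrace{\$71\ \text{billion}}_{\text{war funding}} = \$700\ \text{billion}, \qquad \$700\ \text{billion} + \$16\ \text{billion} = \$716\ \text{billion}.$$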

This bipartisan embrace of militarism comes at enormous cost for working people.  This cost includes cuts in funding for public housing, health care and education; the rebuilding of our infrastructure; basic research and development; and efforts to mitigate climate change.  It also includes the militarization of our police, since the military happily transfers its excess or outdated equipment to willing local police departments.

And it also includes a belligerent foreign policy.  A case in point: Congress has made clear its opposition to the Trump administration decision to meet with North Korean leader Kim Jong-un and halt war games directed against North Korea, apparently preferring the possibility of a new Korean War.  Congress is also trying to pass a law that will restrict the ability of the President to reduce the number of US troops stationed in South Korea.

In brief, the US military industrial complex, including the bipartisan consensus which helps to promote militarism’s popular legitimacy, is one of the most important and powerful foes we must overcome if we are to seriously tackle our ever-growing social, economic, and ecological problems.

The military is everywhere

The US has approximately 800 formal military bases in 80 countries, with 135,000 soldiers stationed around the globe.  Putting this in perspective, Alice Slater reports that:

only 11 other countries have bases in foreign countries, some 70 altogether. Russia has an estimated 26 to 40 in nine countries, mostly former Soviet Republics, as well as in Syria and Vietnam; the UK, France, and Turkey have four to 10 bases each; and an estimated one to three foreign bases are occupied by India, China, Japan, South Korea, Germany, Italy, and the Netherlands.

US special forces are deployed in even more countries.  According to Nick Turse, as of 2015, these forces were operating in 135 countries, an 80 percent increase over the previous five years.  “That’s roughly 70 percent of the countries on the planet. Every day, in fact, America’s most elite troops are carrying out missions in 80 to 90 nations practicing night raids or sometimes conducting them for real, engaging in sniper training or sometimes actually gunning down enemies from afar.”

This widespread geographic deployment not only represents an aggressive projection of US elite interests, it also provides a convenient rationale for those who want to keep the money flowing.  The military, and those who support its funding, always complain that it needs more funds to carry out its mission.  Of course, the additional funds enable the military to expand the reach of its operations, thereby justifying another demand for yet more money.

The US military is well funded 

It is no simple matter to estimate how much we spend on military-related activities.  The base military budget is the starting point.  It represents the amount of the discretionary federal budget that is allocated to the Department of Defense.  Then there is the overseas contingency operations fund, a separate pool of money that sits outside any budgetary restrictions and that the military receives yearly from Congress to cover the costs of its ongoing warfare.

It is the combination of the two that most analysts cite when talking about the size of the military budget. Using this combined measure, the Stockholm International Peace Research Institute finds that the United States spends more on its military than the next seven largest military spenders combined, which are China, Russia, Saudi Arabia, India, France, the UK, and Japan.
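
To make the accounting concrete, the sketch below simply adds the two components using the fiscal year 2018 figures cited earlier (a $629 billion base budget and a $71 billion overseas contingency operations fund); the variable names are my own.

```python
# Combined military budget = base Department of Defense budget plus the
# overseas contingency operations (OCO) fund.  Figures are the FY2018
# amounts approved by Congress, as cited earlier in this piece (billions).
base_budget_fy2018 = 629
oco_fund_fy2018 = 71

combined = base_budget_fy2018 + oco_fund_fy2018
print(f"FY2018 combined military budget: ${combined} billion")  # $700 billion
```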

As the following chart shows, US military spending (base budget plus overseas contingency operations fund), adjusted for inflation, has been on the rise for some time, and is now higher than at any time other than during the height of the Iraq war.  Jeff Stein, writing in the Washington Post, reports that the military’s base budget will likely be “the biggest in recent American history since at least the 1970s, adjusting for inflation.”

As big as it is, the above measure of military spending grossly understates the total.  As JP Sottile explains:

The Project on Government Oversight (POGO) tabulated all “defense-related spending” for both 2017 and 2018, and it hit nearly $1.1 trillion for each of the two years. The “defense-related” part is important because the annual National Defense Authorization Act, a.k.a. the defense budget, doesn’t fully account for all the various forms of national security spending that gets peppered around a half-dozen agencies.

William Hartung, an expert on military spending, went agency by agency to expose all the various military-related expenses that are hidden in different parts of the budget.  As he points out:

You might think that the most powerful weapons in the U.S. arsenal — nuclear warheads — would be paid for out of the Pentagon budget.   And you would, of course, be wrong.  The cost of researching, developing, maintaining, and “modernizing” the American arsenal of 6,800 nuclear warheads falls to an obscure agency located inside the Department of Energy, the National Nuclear Security Administration, or NNSA. It also works on naval nuclear reactors, pays for the environmental cleanup of nuclear weapons facilities, and funds the nation’s three nuclear weapons laboratories, at a total annual cost of more than $20 billion per year.

Hartung’s grand total, which includes, among other things, the costs of Homeland Security, foreign military aid, intelligence services, the Veterans Administration, and the interest on the debt generated by past spending on the military, is $1.09 trillion, roughly the same as the POGO total cited above.  In short, our political leaders are far from forthcoming about the true size of our military spending.

Adding insult to injury, the military cannot account for how it spends a significant share of the funds it is given.  A Reuters article by Scott Paltrow tells the story:

The United States Army’s finances are so jumbled it had to make trillions of dollars of improper accounting adjustments to create an illusion that its books are balanced.

The Defense Department’s Inspector General, in a June [2016] report, said the Army made $2.8 trillion in wrongful adjustments to accounting entries in one quarter alone in 2015, and $6.5 trillion for the year. Yet the Army lacked receipts and invoices to support those numbers or simply made them up.

As a result, the Army’s financial statements for 2015 were “materially misstated,” the report concluded. The “forced” adjustments rendered the statements useless because “DoD and Army managers could not rely on the data in their accounting systems when making management and resource decisions.” . . .

The report affirms a 2013 Reuters series revealing how the Defense Department falsified accounting on a large scale as it scrambled to close its books. As a result, there has been no way to know how the Defense Department – far and away the biggest chunk of Congress’ annual budget – spends the public’s money.

The new report focused on the Army’s General Fund, the bigger of its two main accounts, with assets of $282.6 billion in 2015. The Army lost or didn’t keep required data, and much of the data it had was inaccurate, the IG said.

“Where is the money going? Nobody knows,” said Franklin Spinney, a retired military analyst for the Pentagon and critic of Defense Department planning. . . .

For years, the Inspector General – the Defense Department’s official auditor – has inserted a disclaimer on all military annual reports. The accounting is so unreliable that “the basic financial statements may have undetected misstatements that are both material and pervasive.”

Military spending is big for business

Almost half of the US military budget goes to private military contractors.  These military contracts are the lifeblood of many of the largest corporations in America.  Lockheed Martin and Boeing rank first and second on the list of companies that get the most money from the government.  In 2017 Lockheed Martin reported $51 billion in sales, with $35.2 billion coming from the government.  Boeing got $26.5 billion.  The next three in line are Raytheon, General Dynamics, and Northrop Grumman.  These top five firms captured some $100 billion in Pentagon contracts in 2016.

And, as Hartung describes,

The Pentagon buys more than just weapons. Health care companies like Humana ($3.6 billion), United Health Group ($2.9 billion), and Health Net ($2.6 billion) cash in as well, and they’re joined by, among others, pharmaceutical companies like McKesson ($2.7 billion) and universities deeply involved in military-industrial complex research like MIT ($1 billion) and Johns Hopkins ($902 million).

Not surprisingly, given how lucrative these contracts are, private contractors work hard to ensure the generosity of Congress. In 2017, for example, 208 defense companies spent almost $100 million to deploy 728 reported lobbyists.  Lobbying is made far easier by the fact that more than 80 percent of top Pentagon officials have worked for the defense industry at some point in their careers, and many will go back to work in the defense industry.

Then there are arms sales to foreign governments. Lawrence Wittner cites a study by the Stockholm International Peace Research Institute that found that sales of weapons and military services by the world’s largest 100 corporate military suppliers totaled $375 billion in 2016. “U.S. corporations increased their share of that total to almost 58 percent, supplying weapons to at least 100 nations around the world.”

Eager to promote the arms industry, government officials work hard on its behalf.  As Hartung explains: “From the president on his trips abroad to visit allied world leaders to the secretaries of state and defense to the staffs of U.S. embassies, American officials regularly act as salespeople for the arms firms.”

More for the military and less for everything else

The federal budget is divided into three categories: mandatory spending (primarily Social Security and Medicare), discretionary spending, and interest on the debt.  Two trends in discretionary spending, the component of the budget set each year at the discretion of Congress, offer a window on how militarism is squeezing out funding for programs that serve majority needs.

The first noteworthy trend is the growing Congressional support for defense (the base military budget) over non-defense programs.  In 2001, the majority of discretionary funds went to non-defense programs.  However, that soon changed, as we see in the chart below, thanks to the “war on terror.”  In the decade following September 11, 2001, military spending increased by 50 percent, while spending on every other government program increased by only 13.5 percent.

In the 2018 federal budget, 54 percent of discretionary funds are allocated to the military (narrowly defined): $700 billion for the military versus $591 billion for non-military programs.  The chart below shows President Trump’s discretionary budget request for fiscal year 2019.  As we can see, the military’s share of funds would rise to 61 percent of the total.
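
As a quick check on the 54 percent figure, the share can be computed directly from the two discretionary totals just cited; this is back-of-the-envelope arithmetic, nothing more.

```python
# Military share of FY2018 discretionary spending, using the totals cited
# above (in billions of dollars).
military = 700
non_military = 591

share = 100 * military / (military + non_military)
print(f"Military share of discretionary spending: {share:.1f}%")  # roughly 54%
```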

According to the National Priorities Project, “President Trump’s proposals for future spending, if accepted by Congress, would ensure that, by 2023, the proportion of military spending [in the discretionary budget] would soar to 65 percent.”  Of course, militarism’s actual share is much greater, since the military is being defined quite narrowly.  For example, Veterans’ Benefits is included in the non-defense category.

The second revealing trend is the decline in non-defense discretionary spending relative to GDP.  Thus, not only is the base military budget growing more rapidly than the budget for non-defense programs, but spending on discretionary non-defense programs is not even keeping up with the growth of the economy.  This trend translates into a declining public capacity to support research and development and infrastructure modernization, as well as to meet growing needs for housing, education, health and safety, disaster response . . . the list is long.

The 2018 bipartisan budget deal increased discretionary spending for both defense and non-defense programs, but the deal did little to reverse this long run decline in non-defense discretionary spending relative to the size of the economy.  A Progressive Policy Institute blog post by Ben Ritz explains:

The Budget Control Act of 2011 (BCA) capped both categories of discretionary spending as part of a broader effort to reduce future deficits. When Congress failed to reach a bipartisan agreement on taxes and other categories of federal spending, the BCA automatically triggered an even deeper, across-the-board cut to discretionary spending known as sequestration. While the sequester has been lifted several times since it first took effect, discretionary spending consistently remained far below the original BCA caps.

That trend ended with the Bipartisan Budget Act of 2018 (BBA). This budget deal not only lifted discretionary spending above sequester levels – it also went above and beyond the original BCA caps for two years. Nevertheless, projected domestic discretionary spending for Fiscal Year 2019 is significantly below the historical average as a percentage of gross domestic product. Moreover, even if policymakers extended these policy changes beyond the two years covered by the BBA, we project that domestic discretionary spending could fall to just 3 percent of GDP within the next decade – the lowest level in modern history [see dashed black line in chart below].

The story is similar for defense spending. Thanks to the pressure put on by the sequester, defense discretionary spending fell to just under 3.1 percent of GDP in FY2017. Under the BBA, defense spending would increase to 3.4 percent of GDP in FY2019 before falling again [see dashed black line in following chart]. Unlike domestic discretionary spending, however, defense would remain above the all-time low it reached before the 2001 terrorist attacks throughout the next decade.

In sum, Congress appears determined to squeeze non-defense programs, increasingly privileging defense over non-defense spending in the discretionary budget and allowing non-defense spending as a share of GDP to fall to record lows.  The ratio of discretionary defense spending to GDP appears to be stabilizing, although at levels below its long-term average.  However, discretionary defense spending refers only to the base budget of the Department of Defense and as such seriously understates the costs of US militarism.  Including the growing costs of Homeland Security, foreign military aid, intelligence services, the Veterans Administration, the interest on the debt generated by past spending on the military, and the overseas contingency operations fund would result in a far different picture, one that would leave no doubt about the government’s bipartisan commitment to militarism.

The challenge ahead

Fighting militarism is not easy.  Powerful political and business forces have made great strides in converting the United States into a society that celebrates violence, guns, and the military.  The chart below highlights one measure of this success.  Sadly, 39 percent of Americans polled support increasing our national defense, while 46 percent think it is just about right.  Only 13 percent think it is stronger than it needs to be.

Polls, of course, capture only individual responses at a moment in time to questions that, asked in isolation, often give respondents no meaningful context or alternatives, and so they reveal little about people’s considered views.  At the same time, results like these show just how important it is for us to work to create space for community conversations that are informed by accurate information on the extent and aims of US militarism and its enormous political, social, economic, and ecological costs for the great majority of working people.