The failings of our unemployment insurance system are there by design

Our unemployment insurance system has failed the country at a moment of great need.  With tens of millions of workers struggling just to pay rent and buy food, Congress was forced to pass two emergency spending bills, providing one-time stimulus payments, special weekly unemployment insurance payments, and temporary unemployment benefits to those not covered by the system.  And, because of their limited short-term nature, President Biden must now advocate for a third.

The system’s shortcomings have been obvious for some time, but little effort has been made to improve it.  In fact, those shortcomings were baked into the system from the beginning, just as President Roosevelt wanted; they are no accident.  While we must continue to organize to ensure working people are able to survive the pandemic, we must also start the long process of building popular support for a radical transformation of our unemployment insurance system.  The history of struggle that produced our current system offers some useful lessons.

Performance

Our unemployment insurance system was designed during the Great Depression.  It was supposed to shield workers and their families from the punishing costs of unemployment, thereby also helping to promote both political and economic stability.  Unfortunately, as Eduardo Porter and Karl Russell reveal in a New York Times article, that system has largely failed working people.

The chart below shows the downward trend in the share of unemployed workers receiving benefits and the replacement value of those benefits.  Benefits now replace less than one-third of prior wages, some eight percentage points below the level in the 1940s.  Benefits aside, it is hard to celebrate a system that covers fewer than 30 percent of those struggling with unemployment.

A faulty system

Although every state has an unemployment insurance system, they all operate independently.  There is no national system.  Each state separately generates the funds it needs to provide unemployment benefits and is largely free, subject to some basic federal standards, to set the conditions under which an unemployed worker becomes eligible to receive benefits, the waiting period before benefits will be paid, the length of time benefits will be paid, the benefit amount, and requirements to continue receiving benefits.

Payroll taxes paid by firms generate the funds used to pay unemployment insurance benefits.  The size of the tax owed depends on the value of employee earnings made taxable (the base wage) and the tax rate.  States are free to set the base wage as they want, subject to a federally mandated floor of $7,000, a figure unchanged since 1983.  States are also free to set the tax rate as they want.  Not surprisingly, in the interest of supporting business profitability, states have generally sought to keep both the base wage and the tax rate low.  For example, Florida, Tennessee, and Arizona continue to set their base wage at the federal minimum.  And, as the figure below shows, insurance tax rates have been trending down for some time.
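The revenue arithmetic behind this race to the bottom is simple: the tax a firm owes on each employee is the tax rate applied only to wages up to the base wage, so a low base wage caps what even a high earner contributes to the trust fund.  A minimal sketch, using hypothetical wage and rate figures chosen purely for illustration:

```python
def ui_tax_per_employee(annual_wages, base_wage, tax_rate):
    """Unemployment insurance payroll tax owed on one employee:
    the rate applies only to earnings up to the taxable base wage."""
    return min(annual_wages, base_wage) * tax_rate

# Hypothetical employee earning $40,000 a year.
# At the $7,000 federal floor and a 1% rate, only $70 flows
# to the state trust fund for this worker each year.
print(ui_tax_per_employee(40_000, 7_000, 0.01))   # 70.0

# A higher base wage and rate would fund the trust far better.
print(ui_tax_per_employee(40_000, 40_000, 0.03))  # 1200.0
```

Under the low-parameter regime most states have chosen, trust-fund inflows per worker are a small fraction of what a broader base and higher rate would yield, which is why reserves run dry when claims surge.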

While such a policy might help business, lowering the tax rate means that states have less money in their trust funds to pay unemployment benefits.  Thus, when times are hard and unemployment claims rise, many states find themselves hard pressed to meet their obligations.  In fact, as Porter and Russell explain:

Washington has been repeatedly called on to provide additional relief, including emergency patches to unemployment insurance after the Great Recession hit in 2008. Indeed, it has intervened in response to every recession since the 1950s.

This is far from a desirable outcome for those states forced to borrow, since the money has to be paid back with interest through higher future payroll taxes on employers.  Thus, growing numbers of states have sought to minimize the likelihood of borrowing, or at least the amount borrowed, by raising eligibility standards, reducing benefits, and shortening the duration of coverage, all in the hope of reducing both the number of people drawing unemployment benefits and the amount and length of time they receive them.

Porter and Russell highlight some of the consequences of this strategy:

In Arizona, nearly 70 percent of unemployment insurance applications are denied. Only 15 percent of the unemployed get anything from the state. Many don’t even apply. Tennessee rejects nearly six in 10 applications.

In Florida, only one in 10 unemployed workers gets any benefits. The state is notably stingy: no more than $275 a week, roughly a third of the maximum benefit in Washington State. And benefits run out quickly, after as little as 12 weeks, depending on the state’s overall unemployment rate.

And the growing stagnation of the US economy, which has made employment ever more precarious, only makes this strategy appear ever more fiscally “intelligent.”  For example, as the following figure shows, a growing percentage of the unemployed are remaining jobless for a longer time.  Such a trend, absent state actions to restrict access to benefits, would mean financial trouble for state officials.

Adding to the system’s structural shortcomings is the fact that growing numbers of workers, for example the many workers who have been reclassified as independent contractors, are not covered by it.  In addition, since eligibility for benefits requires satisfying minimum earnings and hours-of-work requirements over a base year, the growth in irregular, low-wage work means that many of those most in need of the system’s financial support during periods of unemployment find themselves declared ineligible for benefits.

By design, not by mistake

Our current unemployment insurance system, with its patchwork of state standards and benefits, dates back to the Great Depression.  While President Roosevelt gets credit for establishing our unemployment insurance system as part of the New Deal, the fact is he deliberately sidelined a far stronger program that, had it been approved, would have left working people today in a far more secure position.

The Communist Party (CP) began pushing an unemployment and social insurance bill in the summer of 1930 and, along with the numerous Unemployed Councils that existed in cities throughout the country, worked hard to promote it over the following years.  On March 4, 1933, the day of Roosevelt’s inauguration, they organized demonstrations stressing the need for action on unemployment insurance.

Undeterred by Roosevelt’s lack of action, the CP-authored “Workers Unemployment and Social Insurance Bill” was introduced in Congress in February 1934 by Representative Ernest Lundeen of the Farmer-Labor Party.  In broad brush, the bill mandated the payment of unemployment insurance to all unemployed workers and farmers equal to average local full-time wages, with a guaranteed minimum of $10 per week plus $3 for each dependent. Those forced into part-time employment would receive the difference between their earnings and the average local full-time wage.  The bill also created a social insurance program that would provide payments to the sick and elderly, and maternity benefits to be paid eight weeks before and eight weeks after birth.  All these benefits were to be financed by unappropriated funds in the Treasury and taxes on inheritances, gifts, and individual and corporate incomes above $5,000 a year.
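Stated as a formula, the Lundeen benefit rule is straightforward: a fully unemployed worker receives the average local full-time wage, a part-time worker receives the gap between actual earnings and that wage, and the payment is floored at $10 per week plus $3 per dependent (whether the floor also applied to part-time top-ups is an interpretive assumption here).  A sketch, with hypothetical wage figures:

```python
def lundeen_weekly_benefit(avg_local_wage, dependents=0, part_time_earnings=0.0):
    """Weekly benefit under the 1934 Workers Unemployment and Social
    Insurance Bill, as summarized above: the average local full-time
    wage (or the shortfall from it, for part-time workers), with a
    guaranteed minimum of $10 plus $3 per dependent."""
    floor = 10.0 + 3.0 * dependents
    return max(avg_local_wage - part_time_earnings, floor)

# Fully unemployed, two dependents, local full-time wage $25/week:
print(lundeen_weekly_benefit(25.0, dependents=2))              # 25.0
# Low-wage locality: the $16 floor ($10 + 2 x $3) kicks in:
print(lundeen_weekly_benefit(12.0, dependents=2))              # 16.0
# Part-time worker earning $15 against a $25 local wage:
print(lundeen_weekly_benefit(25.0, part_time_earnings=15.0))   # 10.0
```

Note how different this is from the system we got: benefits pegged to full local wages rather than a fraction of past earnings, with no base-year earnings test excluding irregular workers.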

The bill enjoyed strong support among workers—employed and unemployed—and it was soon endorsed by five international unions, 35 central labor bodies, and more than 3,000 local unions.  Rank-and-file worker committees also formed across the country to pressure members of Congress to pass it.

When Congress refused to act on the bill, Lundeen reintroduced it in January 1935. Because of public pressure, the bill became the first social insurance plan to be recommended by a congressional committee, in this case the House Labor Committee.  However, it was soon voted down in the full House of Representatives, 204 to 52.

Roosevelt strongly opposed the Lundeen bill, and it was to counter it that he pushed an alternative, one that offered benefits far short of what the Workers Unemployment and Social Insurance Bill promised and that was, in turn, strongly opposed by many workers and all organizations of the unemployed.  In July 1934, Roosevelt appointed a Committee on Economic Security charged with developing a social security bill, including provisions for both unemployment insurance and old-age security, that he could present to Congress in January 1935.  An administration-approved bill was introduced right on schedule in January, and Roosevelt called for quick congressional action.

Roosevelt’s bill was revised in April by a House committee and given a new name, “The Social Security Act.”  After additional revisions, the act was signed into law on August 14, 1935.  It was a complex piece of legislation, including what we now call Social Security, a federal old-age benefit program; a program of unemployment insurance administered by the states; and a program of federal grants to states to fund benefits for the needy elderly and aid to dependent children.

The unemployment system established by the Social Security Act was structured in ways unfavorable to workers (as was the federal old-age benefit program).  Rather than a progressively funded, comprehensive national system of unemployment insurance that paid benefits commensurate with worker wages, the act established a federal-state cooperative system that gave states wide latitude in determining standards.

More specifically, the act levied a uniform national payroll tax of 1 percent in 1936, 2 percent in 1937, and 3 percent in 1938 on covered employers, defined as those with eight or more employees for at least twenty weeks, excluding government employers and employers in agriculture.  Only workers employed by a covered employer could receive benefits.

The act left it to the states to decide whether to enact their own plans, and if so, to determine eligibility conditions, the waiting period to receive benefits, benefit amounts, minimum and maximum benefit levels, duration of benefits, disqualifications, and other administrative matters. It was not until 1937 that programs were established in every state as well as the then-territories of Alaska and Hawaii.  And it was not until 1938 that most began paying benefits.

In the early years, most states required eligible workers to wait 2 to 4 weeks before drawing benefits, which were commonly set at half recent earnings (subject to weekly maximums) for a period ranging from 12 to 16 weeks. Ten state laws called for employee contributions as well as employer contributions; three still do today.
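The typical early state benefit just described reduces to one line of arithmetic: half of recent weekly earnings, capped at the state maximum, payable only after the waiting period and only for the state’s benefit duration.  A sketch, with hypothetical parameter values:

```python
def early_state_weekly_benefit(recent_weekly_earnings, state_weekly_max):
    """Common early-state formula, as described above: half of
    recent earnings, subject to the state's weekly maximum."""
    return min(0.5 * recent_weekly_earnings, state_weekly_max)

# Hypothetical state with a $15/week cap:
print(early_state_weekly_benefit(24.0, 15.0))  # 12.0 (half earnings)
print(early_state_weekly_benefit(40.0, 15.0))  # 15.0 (the cap binds)
```

The contrast with the Lundeen formula is stark: a fraction of past wages with a hard ceiling, rather than a benefit pegged to the full local wage with a guaranteed floor.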

Over the following decades the unemployment insurance system was improved in a number of ways, including by broadening coverage and boosting benefits.  However, its basic structure remains largely intact: an overly complex system, with a patchwork of state eligibility requirements and miserly benefits.  And we are paying the cost today.

This history makes clear that nothing will be given to us.  We need and deserve a better unemployment insurance system. And to get it, we are going to have to fight for it, and not be distracted by the temporary, although needed, band-aids Congress is willing to provide.  The principles shaping the Workers Unemployment and Social Insurance Bill can provide a useful starting point for current efforts.

The planning and politics of conversion: World War II lessons for a Green New Deal—Part 1

This is the first in a series of posts that aim to describe and evaluate the World War II mobilization experience in the United States in order to illuminate some of the economic and political challenges we can expect to face as we work for a Green New Deal.  

This post highlights the successful government directed wartime reorientation of the U.S. economy from civilian to military production, an achievement that both demonstrates the feasibility of a rapid Green New Deal transformation of the U.S. economy and points to the kinds of organizational capacities we will need to develop. The post also highlights some of the strategies employed by big business to successfully stamp the wartime transformation as a victory for “market freedom,” an outcome that strengthened capital’s ability to dominate the postwar U.S. political economy and suggests the kind of political struggles we can expect and will need to overcome as we work to achieve a just Green New Deal transformation.

The climate challenge and the Green New Deal

We are hurtling towards a climate catastrophe.  The Intergovernmental Panel on Climate Change, in its Special Report on Global Warming of 1.5°C, warns that we must limit the increase in the global mean temperature to 1.5 degrees Celsius above pre-industrial levels by 2100 if we hope to avoid a future with ever worsening climate disasters and “global scale degradation and loss of ecosystems and biodiversity.”  And, it concludes, to achieve that goal global net carbon dioxide emissions must fall by 45 percent by 2030 and reach net zero by 2050.

Tragically, none of the major carbon dioxide emitting nations has been willing to pursue the system-wide changes necessary to halt the rise in the global mean temperature.  Rather than falling, carbon dioxide emissions rose over the decade ending in 2019.  Only a major crisis, in the current case a pandemic, appears able to reverse the rise in emissions.   

Early estimates are that the COVID-19 pandemic will cause a fall in global emissions of somewhere between 4 and 7 percent in 2020.  But the decline will likely be temporary.  For example, the International Monetary Fund is forecasting an emission rise of 5.8 percent in 2021. This bounce back is in line with what happened after the 2008-09 Great Recession.  After falling by 1.4 percent in 2009, global emissions grew by 5.1 percent in 2010.

Motivated by signs of the emerging climate crisis—extreme weather conditions, droughts, floods, warming oceans, rising sea levels, fires, ocean acidification, and soil deterioration—activists in the United States have worked to build a movement that joins climate and social justice activists around a call for a Green New Deal to tackle both global warming and the country’s worsening economic and social problems. The Green Party has promoted its ecosocialist Green New Deal since 2006, but it was the 2018 mass actions by new climate action groups such as Extinction Rebellion and the Sunrise Movement and then the 2019 introduction of a Green New Deal congressional resolution by Representative Alexandria Ocasio-Cortez and Senator Edward Markey that helped popularize the idea.

The Ocasio-Cortez—Markey resolution, echoing the Intergovernmental Panel on Climate Change, calls for a ten-year national program of mobilization designed to cut CO2 emissions by 40-60 percent from 2010 levels by 2030 and achieve net-zero emissions by 2050.  Its program includes policies that aim at replacing fossil fuels with clean, renewable sources of energy, and existing forms of transportation, agriculture, and urban development with new affordable and sustainable ones; encouraging investment and the growth of clean manufacturing; and promoting good, high paying union jobs and universal access to clean air and water, health care, and healthy food.

While there are similarities, there are also important differences, between the Green Party’s Green New Deal and Ocasio-Cortez—Markey’s Green New Deal, including over the speed of change, the role of public ownership, and the use of fracking and nuclear power for energy generation.  More generally, there are also differences among supporters of a Green New Deal style transformation over whether the needed government investments and proposed social policies should be financed by raising taxes, slashing the military budget, borrowing, or money creation.  There are also environmentalists who oppose the notion of sustained but sustainable growth explicitly embraced by many Green New Deal supporters and argue instead for a policy of degrowth, or a “Green New Deal without growth.”

These arguments are important, representing different political sensibilities and visions, and need to be taken seriously.  But what has largely escaped discussion is any detailed consideration of the actual process of economic transformation required by any serious Green New Deal program.  Here are some examples of the kind of issues we will need to confront:

Fossil fuel production has to be ratcheted down, which will dramatically raise fossil fuel prices.  The higher cost of fossil fuels will significantly raise the cost of business for many industries, especially air travel, tourism, and the aerospace and automobile industries, triggering significant declines in demand for their respective goods and services and reductions in their output and employment.  We will need to develop a mechanism that will allow us to humanely and efficiently repurpose newly created surplus facilities and provide alternative employment for released workers.

New industries, especially those involved in the production of renewable energy will have to be rapidly developed.  We will need to develop agencies capable of deciding the speed of their expansion as well as who will own the new facilities, how they will be financed, and how best to ensure that the materials required by these industries will be produced in sufficient quantities and made available at the appropriate time. We will also have to develop mechanisms for deciding where the new industries will be located and how to develop the necessary social infrastructure to house and care for the new workforce.  

The list goes on—we will need to ensure the rapid and smooth expansion of facilities capable of producing electric cars, mass transit vehicles, and a revitalized national rail system.  We will need to organize the retrofitting of existing buildings, both office and residential, as well as the training of workers and the production of required equipment and materials.  The development of a new universal health care system will also require the planning and construction of new clinics and the development of new technologies and health practices. 

The challenges sound overwhelming, especially given the short time frame required for change.  But, reassuringly, the U.S. government faced remarkably similar challenges during the war years when, in approximately three years, it successfully converted the U.S. economy from civilian to military production. This experience points to the importance of studying the World War II planning process for lessons and should give us confidence that we can carry out our own Green New Deal conversion in a timely fashion.

World War II economic mobilization

The name Green New Deal calls to mind the New Deal of the 1930s, which is best understood as a collection of largely unrelated initiatives designed to promote employment and boost a depressed economy.  In contrast, the Green New Deal aims at an integrated transformation of a “functioning” economy, a task much closer to the World War II transformation of the U.S. economy. That transformation required the suppression of civilian production, much like the Green New Deal will require suppression of the fossil fuel industry and those industries dependent on it.  Simultaneously, it required the rapid expansion of military production, including the creation of entirely new products like synthetic rubber and new weapon systems, much like the Green New Deal will require the expansion of new forms of renewable energy, transportation, and social programs.  And it required the process of conversion to take place quickly, much like what is required under the Green New Deal.

J.W. Mason and Andrew Bossie highlight the contemporary relevance of the wartime experience by pointing out:

Just as in today’s public-health and climate crises, the goal of wartime economic management was not to raise GDP in the abstract, but to drastically raise production of specific kinds of goods, many of which had hardly figured in the prewar economy. Then as now, this rapid reorganization of the economy required a massive expansion of public spending, on a scale that had hardly been contemplated before the emergency. And then as, potentially, now, this massive expansion of public spending, while aimed at the immediate non-economic goal, had a decisive impact on long-standing economic problems of stagnation and inequality. Of course, there are many important differences between the two periods. But the similarities are sufficient to make it worth looking to the 1940s for economic lessons for today.

Before studying the organization, practice, and evolution of the World War II era planning system, it is useful to have an overall picture of the extent, speed, and success of the economy’s transformation. The following two charts highlight the dominant role played by the government.  The first shows the dramatic growth and reorientation of government spending beginning in 1941.  As we can see, federal government war expenditures soared, while non-war expenditures actually fell in value.  Military spending as a share of GNP rose from 2.2 percent in 1940, to 11 percent in 1941, and to 31.2 percent in 1942.

The second shows that the expansion in plant and equipment required to produce the goods and services needed to fight the war was largely financed by the government.  Private investment actually fell in value over the war years.

Source: U.S. Bureau of the Budget, The United States at War, Development and Administration of the War Program by the Federal Government, Washington DC: The U.S. Government Printing Office, 1947, p. 92.

Source: U.S. Bureau of the Budget, The United States at War, Development and Administration of the War Program by the Federal Government, Washington DC: The U.S. Government Printing Office, 1947, p. 115.

The next chart illustrates the speed and extent of the reorientation of industrial production over the period 1941-1944.  As we can see, while industrial production aimed at military needs soared, non-military industrial production significantly declined.

Source: U.S. Bureau of the Budget, The United States at War, Development and Administration of the War Program by the Federal Government, Washington DC: The U.S. Government Printing Office, 1947, p. 104.

The next two charts illustrate the success of the conversion process.  The first shows the rapid increase in the production of a variety of military weapons and equipment.  The second demonstrates why the United States was called the “Arsenal of Democracy”: it produced the majority of all the munitions produced during World War II.

Source: U.S. Bureau of the Budget, The United States at War, Development and Administration of the War Program by the Federal Government, Washington DC: The U.S. Government Printing Office, 1947, p. 319.

Source: U.S. Bureau of the Budget, The United States at War, Development and Administration of the War Program by the Federal Government, Washington DC: The U.S. Government Printing Office, 1947, p. 507.

Significantly, while the rapid growth in military-related production did boost the overall growth of the economy, because that growth was largely achieved at the expense of nonmilitary production, the economy’s overall growth over the years 1941-44/45 was far from extraordinary.  For example, the table below compares the growth in real gross nonfarm product over the early years of the 1920s with that of the early years of the 1940s.  As we can see, there is little difference between the two periods, and that holds true even if we exclude the last year of the war, when military spending plateaued and military production began to decline.  The same holds true when comparing just the growth in industrial production over the two periods.

Years        Growth in real gross nonfarm product

1921-25      28.4%
1941-45      24.6%

1921-24      26.2%
1941-44      25.8%
Source: Harold G. Vatter, The U.S. Economy in World War II, New York: Columbia University Press, 1985, p. 22.

This similarity between the two periods reinforces the point that the economic success of the war years, the rapid ramping up of military production, was primarily due to the ability of government mobilization agencies to direct an economic conversion that privileged the production of goods and services for the military at the expense of non-military goods and services.  This experience certainly lends credibility to those who seek a similar system-wide conversion to achieve a Green New Deal transformation of the U.S. economy.

Such a transformation is not without sacrifice.  For example, workers did pay a cost for the resulting suppression of civilian oriented production, but it was limited.  As Harold Vatter points out: “There were large and real absolute decreases in total consumer expenditures between 1941 and 1945 on some items considered important in ordinary times.  Prominent among these, in the durable goods category, were major home appliances, new cars, and net purchases of used cars, furniture, and radio and TV sets.”

At the same time there were real gains for workers.  Overall personal consumption, which rose in both 1940 and 1941, declined absolutely in 1942, but then began a slow and steady increase, with total personal consumption higher in 1945 than in 1941.  However, this record understates the real gains.  The U.S. civilian population declined from 131.6 million in 1941 to 126.7 million in 1944.  Thus, the gain in personal consumption on a per capita basis was significant.  As Vatter notes, “real employee compensation per private employee in nonfarm establishments rose steadily every year, and in 1945 was over one-fifth above the 1941 level. . . . More broadly, similar results show up for the index of real disposable personal income per capita, which increased well over one-fourth during the same war years.”  Of course, these gains were largely the result of more people working and for longer hours; they were definitely earned.  Also important is the fact that pretax family income rose faster for those at the bottom of the income distribution than for those at the top, helping to reduce overall income inequality.

In sum, there are good reasons for those seeking to implement a Green New Deal style transformation of the U.S. economy to use the World War II planning experience as a template.  A careful study of that experience can alert us to the kinds of organizational and institutional capacities we will need to develop.  And, it is important to add, it can also alert us to the kinds of political challenges we can expect to face.

Planning and politics

The success of the U.S. economy’s World War II transformation was due, in large part, to the work of a series of changing and overlapping mobilization agencies that President Roosevelt established by executive order and then replaced or modified as new political and economic challenges emerged. Roosevelt took his first meaningful action to help prepare the United States economy for war in May 1940, when he reactivated the World War I-era National Defense Advisory Commission (NDAC).  The NDAC was replaced by the Office of Production Management (OPM) in December 1940.  The Supply Priorities and Allocation Board (SPAB) was then created in August 1941 to develop a needed longer-term planning orientation to guide the work of the OPM.  And finally, both the OPM and the SPAB were replaced by the War Production Board (WPB) in January 1942.  With each change, decision-making became more centralized, planning responsibilities expanded, and authority to direct economic activity strengthened.

The work of these agencies was greatly enhanced by a number of other initiatives, one of the most important being the August 1940 establishment of the Defense Plant Corporation (DPC). The DPC was authorized to directly finance and own plant and equipment vital to the national defense. The DPC ended up financing and owning roughly one-third of the plant and equipment built during the war, most of which was leased to private companies to operate for a minimal amount, often $1 a year. The aircraft industry was the main beneficiary of DPC investment, but plants were also built to produce synthetic rubber, ships, machine tools, iron and steel, magnesium, and aluminum.

Despite its successful outcome, the process of economic conversion was far from smooth, and the main reason was resistance by capitalists.  Still distrustful of New Deal reformers, most business leaders were critical of any serious attempt at prewar planning that involved strengthening government regulation and oversight of their activities.  Rather, they preferred to continue their existing practice of individually negotiating contracts with Army and Navy procurement agencies.  Many also opposed prewar government entreaties to expand their scale of operations to meet the military’s growing demand for munitions and equipment.  Their reasons were many: they were reluctant to expand capacity after a decade of depression; civilian markets were growing rapidly and were highly profitable; and the course of the war, and U.S. participation in it, remained uncertain.

Their attitude and power greatly influenced the operation and policies of the NDAC, which was built on industry divisions run by industry leaders, most of whom were so-called “dollar-a-year men” who continued to draw their full salaries from the corporations that employed them, and advised by industry associations.  This business-friendly structure, with various modifications, was then transferred to the OPM and later the WPB.

With business interests well represented in the prewar mobilization agencies, the government struggled to transform the economy in preparation for war.  The lack of new business investment in critical industries meant that by mid-1941 material shortages began forcing delays in defense orders; aluminum, magnesium, zinc, steel, and machine tools were all growing scarce.  At the same time, a number of industries that were major consumers of these scarce materials and machinery, such as the automobile industry, also resisted government efforts to get them to abandon their consumer markets and convert to the production of needed military goods.

In some cases, this resistance lasted deep into the war years, with some firms objecting not only to undertaking their own expansion but to any government financed expansion as well, out of fear of post-war overproduction and/or loss of market share.  The resulting political tension is captured by the following exchange at a February 1943 Congressional hearing between Senator E. H. Moore of Oklahoma and Interior Secretary and Petroleum Administrator for War Harold L. Ickes over the construction of a petroleum pipeline from Texas to the East Coast:

Secretary Ickes. I would like to say one thing, however. I think there are certain gentlemen in the oil industry who are thinking of the competitive position after the war.

The Chairman. That is what we are afraid of, Mr. Secretary.

Secretary Ickes. That’s all right. I am not doing that kind of thinking.

The Chairman. I know you are not.

Secretary Ickes. I am thinking of how best to win this war with the least possible amount of casualties and in the quickest time.

Senator Moore. Regardless, Mr. Secretary, of what the effect would be after the war? Are you not concerned with that?

Secretary Ickes. Absolutely.

Senator Moore. Are you not concerned with the economic situation with regard to existing conditions after the war?

Secretary Ickes. Terribly. But there won’t be any economic situation to worry about if we don’t win the war.

Senator Moore. We are going to win the war.

Secretary Ickes. We haven’t won it yet.

Senator Moore. Can’t we also, while we are winning the war, look beyond the war to see what the situation will be with reference to –

Secretary Ickes (interposing). That is what the automobile industry tried to do, Senator. It wouldn’t convert because it was more interested in what would happen after the war. That is what the steel industry did, Senator, when it said we didn’t need any more steel capacity, and we are paying the price now. If decisions are left with me, it is only fair to say that I will not take into account any post-war factor—but it can be taken out of my hands if those considerations are paid attention to.

Once the war began, many businesses were also able to build a strategic alliance with the military that allowed them to roll back past worker gains and isolate and weaken unions.  For example, by invoking the military’s overriding concern with achieving maximum production of the weapons of war, business leaders were able to defeat union attempts to legislate against the awarding of military contracts to firms in violation of labor law. They also succeeded in ignoring overtime pay requirements when lengthening the workweek and in imposing new workplace rules that strengthened management prerogatives. 

If unions struck to demand higher wages or resist unilateral workplace changes, business and military leaders would declare their actions a threat to the war effort, costing the unions public support.  Often the striking unions were threatened with government sanctions by mobilization authorities.  In some cases, especially in the aircraft industry, the military actually seized control of plants, sending in troops with fixed bayonets, to break a strike.  Eventually, the CIO traded a no-strike pledge for a maintenance-of-membership agreement, but that often put national union officials in the position of suppressing rank-and-file job actions and disciplining local leaders and activists, an outcome that weakened worker support for the union.

Business didn’t always have its own way.  Its importance as essential producer was, during the war, matched by the military’s role as essential demander.  And, while the two usually saw eye-to-eye, there were times when military interests diverged from, and dominated, corporate interests.  Moreover, as the war continued, government planning agencies gained new powers that enabled them to effectively regulate the activities of both business and the military.  Finally, the work of congressional committees engaged in oversight of the planning process as well as pressure from unions and small business associations also helped, depending on the issue, to place limits on corporate prerogatives.

Still, when all was said and done, corporate leaders proved remarkably successful in dominating the mobilization process and strengthening their post-war authority over both the government and organized labor.  Perhaps the main reason for their success is that almost from the beginning of the mobilization process, a number of influential business leaders and associations aggressively organized themselves to fight their own two-front war—one that involved boosting production to help the United States defeat the Axis powers and one that involved winning popular identification of the fight for democracy with corporate freedom of action.

In terms of this second front, as J.W. Mason describes:

Already by 1941, government enterprise was, according to a Chamber of Commerce publication, “the ghost that stalks at every business conference.” J. Howard Pew of Sun Oil declared that if the United States abandoned private ownership and “supinely reli[es] on government control and operation, then Hitlerism wins even though Hitler himself be defeated.” Even the largest recipients of military contracts regarded the wartime state with hostility. GM chairman Alfred Sloan—referring to the danger of government enterprises operating after war—wondered if it is “not as essential to win the peace, in an economic sense, as it is to win the war, in a military sense,” while GE’s Philip Reed vowed to “oppose any project or program that will weaken” free enterprise.

Throughout the war, business leaders and associations “flooded the public sphere with descriptions of the mobilization effort in which for-profit companies figured as the heroic engineers of a production ‘miracle’.”  For example, Boeing spent nearly a million dollars a year on print advertising in 1943-45, almost as much as it set aside for research and development.

The National Association of Manufacturers (NAM) was one of the most active promoters of the idea that it was business, not government, that was winning the war against state totalitarianism.  It did so by funding a steady stream of films, books, tours, and speeches.  Mark R. Wilson describes one of its initiatives:

One of the NAM’s major public-relations projects for 1942, which built upon its efforts in radio and print media, was its “Production for Victory” tour, designed to show that “industry is making the utmost contributions toward victory.” Starting the first week in May, the NAM paid for twenty newspaper reporters to take a twenty-four-day, fifteen-state trip during which they visited sixty-four major defense plants run by fifty-eight private companies. For most of May, newspapers across the country ran daily articles related to the tour, written by the papers’ own reporters or by one of the wire services. The articles’ headlines included “Army Gets Rubber Thanks to Akron,” “General Motors Plants Turning Out Huge Volume of War Goods,” “Baldwin Ups Tank Output,” and “American Industry Overcomes a Start of 7 Years by Axis.”

It was rarely if ever mentioned by the companies or the reporters that almost all of these new plants were actually financed, built, and owned by the government, or that it was thanks to government planning efforts that these companies had well-trained workers and received needed materials on a timely basis.  Perhaps not surprisingly, government and union efforts to challenge the corporate story were never as well funded, sustained, or shaped by as clear a class perspective.  As a consequence, they were far less effective.

Paul A.C. Koistinen, in his major study of World War II planning, quotes Herbert Emmerich, past Secretary of the Office of Production Management (OPM), who, looking back at the mobilization experience in 1956, commented that “When big business realized it had lost the elections of 1932 and 1936, it tried to come in through the back door, first through the NRA and then through the NDAC and OPM and WPB.”  Its success allowed it to emerge from the war politically stronger than when it began.

Capital is clearly much more organized and powerful today than it was in the 1940s.  And we can safely assume that business leaders will draw upon all their many strengths in an effort to shape any future conversion process in ways likely to limit its transformative potential.  Capital’s wartime strategy points to some of the difficult challenges we must prepare to face, including how to minimize corporate dominance over the work of mobilization agencies and ensure that the process of transformation strengthens, rather than weakens, worker organization and power.  Most importantly, the wartime experience makes clear that the fight for a Green New Deal is best understood as a new front in an ongoing class war, and that we need to strengthen our own capacity to wage a serious and well-prepared ideological struggle for the society we want to create.

Profits over people: frontline workers during the pandemic

It wasn’t that long ago that the country celebrated frontline workers by banging pots in the evening to thank them for the risks they took doing their jobs during the pandemic. One national survey found that health care workers were the most admired (80%), closely followed by grocery store workers (77%), and delivery drivers (73%). 

Corporate leaders joined in the celebration. Supermarket News quoted Dacona Smith, executive vice president and chief operating officer at Walmart U.S., as saying in April:

We cannot thank and appreciate our associates enough. What they have accomplished in the last few weeks has been amazing to watch and fills everyone at our company with enormous pride. America is getting the chance to see what we’ve always known — that our people truly do make the difference. Let’s all take care of each other out there.

Driven by a desire to burnish their public image, deflect attention from their soaring profits, and attract more workers, many of the country’s leading retailers, including Walmart, proudly announced special pandemic wage increases and bonuses.  But as a report by Brookings points out, although their profits continued to roll in, those special payments didn’t last long.

There are three important takeaways from the report: First, don’t trust corporate PR statements; once people stop paying attention, corporations do what they want.  Second, workers need unions to defend their interests.  Third, there should be some form of federal regulation to ensure workers receive hazard pay during health emergencies like pandemics, similar to the laws requiring time and a half for overtime work.

The companies and their workers

In Windfall Profits and Deadly Risks, Molly Kinder, Laura Stateler, and Julia Du look at the compensation paid to frontline workers at, and profits earned by, 13 of the 20 biggest retail companies in the United States.  The 13, listed in the figure below, “employ more than 6 million workers and include the largest corporations in grocery, big-box retail, home improvement, pharmacies, electronics, and discount retail.” The seven left out “either did not have public financial information available or were in retail sectors that were hit hard by the pandemic (such as clothing) and did not provide COVID-19 compensation to workers.”

Pre-pandemic, the median wages for the main frontline retail jobs (e.g., cashiers, salespersons, and stock clerks) at these 13 companies generally ranged from $10 to $12 per hour (see the grey bar in the figure below).  The exceptions at the high end were Costco and Amazon, both of which had a minimum starting wage of $15 before the start of the pandemic. The exception at the low end was Dollar General, which the authors estimate had a starting wage of only $8 per hour.  

Clearly, these companies thrive on low-wage work.  And it should be added, disproportionately the work of women of color.  “Women make up a significantly larger share of the frontline workforce in general retail stores and at companies such as Target and Walmart than they do in the workforce overall. Amazon and Walmart employ well above-average shares of Black workers (27% and 21%, respectively) compared to the national figure of 12%.”

Then came the pandemic

Eager to take advantage of the new pandemic-driven business coming their way, all 13 companies highlighted in the report quickly offered some form of special COVID-19-related compensation in an effort to attract new workers (as highlighted in the figure below).  “Commonly referred to as ‘hazard pay,’ the additional compensation came in the form of small, temporary hourly wage increases, typically between $2 and $2.50 per hour, as well as one-off bonuses. In addition to temporary hazard pay, a few companies permanently raised wages for workers during the pandemic.”

Unfortunately, as the next figure reveals, these special corporate payment programs were short-lived.  Of the 10 companies that offered temporary hourly wage increases, seven ended them before the beginning of July and the start of a new wave of COVID-19 infections. Moreover, even with these programs, nine of the 13 companies continued to pay wages below $15 an hour.  Only three companies instituted permanent wage hikes.  While periodic bonuses are no doubt welcome, they are impossible to count on and of limited dollar value compared with an increase in hourly wages.  So much for corporate caring!

Don’t worry about the companies

As the next figure shows, while the leading retail companies highlighted in the study have been stingy when it comes to paying their frontline workers, the pandemic has treated them quite well.  As the authors point out:

Across the 13 companies in our analysis, revenue was up an average of 14% over last year, while profits rose 39%. Excluding Walgreens—whose business has struggled during the pandemic—profits rose a staggering 46%. Stock prices rose on average 30% since the end of February. In total, the 13 companies reported 2020 profits to date of $67 billion, which is an additional $16.9 billion compared to last year.

Looking just at the compensation generosity of the six companies that had public data on the total cost of their extra compensation to workers, the authors found that the numbers “paint a picture of most companies prioritizing profits and wealth for shareholders over investments in their employees. On average, the six companies’ contribution to compensating workers was less than half of the additional profit earned during the pandemic compared to the previous year.”

This kind of scam, in which companies publicly celebrate their generosity only to quietly withdraw it a short time later, is a common one.  And because it is hard to follow corporate policies over months, companies are often able to convince the public that they really do care about the well-being of their workers.  That is why this study is important—it makes clear that relying on corporations to do the “right thing” is a losing proposition for workers.

Defunding police and challenging militarism, a necessary response to their “battle space”

The excessive use of force and killings of unarmed Black Americans by police have fueled a popular movement for slashing police budgets, reimagining policing, and directing freed funds to community-based programs that provide medical and mental health care, housing, and employment support to those in need.  This is a long overdue development.

Police are not the answer

Police budgets rose steadily from the 1990s to the Great Recession and, despite the economic stagnation that followed, have remained largely unchanged.  This trend is highlighted in the figure below, which shows real median per capita spending on police in the 150 largest U.S. cities.  That spending grew, adjusted for inflation, from $359 in 2007 to $374 in 2017.  The contrast with state and local government spending on social programs is dramatic.  From 2007 to 2017, median per capita spending on housing and community development fell from $217 to $173, while spending on public welfare programs fell from $70 to $47.
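The contrast stands out even more when the changes are expressed in percentage terms.  A quick illustrative calculation from the figures just cited (Python used here purely for the arithmetic):

```python
# Real median per capita spending in the 150 largest U.S. cities,
# 2007 vs. 2017, in dollars (figures cited above).
spending = {
    "police": (359, 374),
    "housing and community development": (217, 173),
    "public welfare": (70, 47),
}

for category, (y2007, y2017) in spending.items():
    pct_change = (y2017 - y2007) / y2007 * 100
    print(f"{category}: {pct_change:+.1f}%")

# police: +4.2%
# housing and community development: -20.3%
# public welfare: -32.9%
```

In other words, over the decade police spending edged up while social spending fell by a fifth to a third.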

Thus, as economic developments over the last three decades left working people confronting weak job growth, growing inequality, stagnant wages, declining real wealth, and rising rates of mortality, funding priorities meant that the resulting social consequences would increasingly be treated as policing problems.  And, in line with other powerful trends that shaped this period–especially globalization, privatization, and militarization–police departments were encouraged to meet their new responsibilities by transforming themselves into small, heavily equipped armies whose purpose was to wage war against those they were supposed to protect and serve. 

The military-to-police pipeline

The massive, unchecked militarization of the country and its associated military-to-police pipeline was one of the more powerful factors promoting this transformation.  The Pentagon, overflowing with military hardware and eager to justify a further modernization of its weaponry, initiated a program in the early 1990s that allowed it to provide surplus military equipment free to law enforcement agencies, allegedly to support their “war on drugs.”  As a Forbes article explains:

Since the early 1990s, more than $7 billion worth of excess U.S. military equipment has been transferred from the Department of Defense to federal, state and local law enforcement agencies, free of charge, as part of its so-called 1033 program. As of June [2020], there are some 8,200 law enforcement agencies from 49 states and four U.S. territories participating. 

The program grew dramatically after September 11, 2001, justified by government claims that the police needed to strengthen their ability to combat domestic terrorism.  As an example of the resulting excesses, the Los Angeles Times reported in 2014 that the Los Angeles Unified School District and its police officers were in possession of three grenade launchers, 61 automatic military rifles and a Mine Resistant Ambush Protected armored vehicle. Finally, in 2015, President Obama took steps to place limits on the items that could be transferred; tracked armored vehicles, grenade launchers, and bayonets were among the items that were to be returned to the military.

President Trump removed those limits in 2017, and the supplies are again flowing freely, including armored vehicles, riot gear, explosives, battering rams, and yes, once again bayonets.  According to the New York Times, “Trump administration officials said that the police believed bayonets were handy, for instance, in cutting seatbelts in an emergency.”

Outfitting police departments for war also encouraged different criteria for recruiting and training. For example, as Forbes notes, “The average police department spends 168 hours training new recruits on firearms, self-defense, and use of force tactics. It spends just nine hours on conflict management and mediation.”  Arming and training police for military action leads naturally to the militarization of police relations with community members, especially Black, Indigenous and other people of color, who come to play the role of the enemy that needs to be controlled or, if conditions warrant, destroyed.

In fact, the military has become a major cheerleader for domestic military action.  President Trump, on a call with governors after the start of demonstrations protesting the May 25, 2020 killing of George Floyd while in police custody, exhorted them to “dominate” the street protests.

As the Washington Examiner reports:

“You’ve got a big National Guard out there that’s ready to come and fight like hell,” Trump told governors on the Monday call, which was leaked to the press.

[Secretary of Defense] Esper lamented that only two states called up more than 1,000 Guard members of the 23 states that have called up the Guard in response to street protests. The National Guard said Monday that 17,015 Guard members have been activated for civil unrest.

“I agree, we need to dominate the battle space,” Esper said after Trump’s initial remarks. “We have deep resources in the Guard. I stand ready, the chairman stands ready, the head of the National Guard stands ready to fully support you in terms of helping mobilize the Guard and doing what they need to do.”

The militarization of the federal budget

The same squeeze of social spending and support for militarization is being played out at the federal level.  As the National Priorities Project highlights in the following figure, the United States has a military budget greater than the next ten countries combined.

Yet, this dominance has done little to slow the military’s growing hold over federal discretionary spending.  At $730 billion, military spending accounts for more than 53 percent of the federal discretionary budget.  A slightly broader notion, what the National Priorities Project calls the militarized budget, actually accounts for almost two-thirds of the discretionary budget.  The militarized budget:

includes discretionary spending on the traditional military budget, as well as veterans’ affairs, homeland security, and law enforcement and incarceration. In 2019, the militarized budget totaled $887.8 billion – amounting to 64.5 percent of discretionary spending. . . . This count does not include forms of militarized spending allocated outside the discretionary budget, includ[ing] mandatory spending related to veterans’ benefits, intelligence agencies, and interest on militarized spending.

The militarized budget has been larger than the non-militarized budget every year since 1976.  But the gap between the two has grown dramatically over the last two decades. 
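The two shares quoted above are mutually consistent.  Backing out the implied total discretionary budget from each (an illustrative check, not figures from the report):

```python
# Figures cited above: $730 billion in military spending is roughly 53% of
# federal discretionary spending; the $887.8 billion militarized budget is
# 64.5% of discretionary spending in 2019.
implied_total_from_military = 730 / 0.53        # in $ billions
implied_total_from_militarized = 887.8 / 0.645  # in $ billions

print(round(implied_total_from_military))    # 1377
print(round(implied_total_from_militarized)) # 1376
```

Both imply a discretionary budget of roughly $1.38 trillion, so the two percentages describe the same underlying total.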

In sum, the critical ongoing struggle to slash police budgets and reimagine policing needs to be joined to a larger movement against militarism more generally if we are to make meaningful improvements in majority living and working conditions.

Racism, COVID-19, and the fight for economic justice

While the Black Lives Matter protests sweeping the United States were triggered by recent police murders of unarmed African Americans, they are also helping to encourage popular recognition that racism has a long history with punishing consequences for black people that extend beyond policing.  Among the consequences are enormous disparities between black and white well-being and security.  This post seeks to draw attention to some of these disparities by highlighting black-white trends in unemployment, wages, income, wealth, and security. 

A common refrain during this pandemic is that “We are all in it together.”  Although this is true in the sense that almost all of us find our lives transformed for the worse because of COVID-19, it is also not true in some very important ways.  For example, African Americans are disproportionately dying from the virus.  They account for 22.4 percent of all COVID-19 deaths despite making up only 12.5 percent of the population.

One reason is that African Americans also disproportionately suffer from serious preexisting health conditions, a lack of health insurance, and inadequate housing, all of which increase their vulnerability to the virus.  Another reason is that black workers are far more likely than white workers to work in “front-line” jobs, especially low-wage ones, forcing them to risk their health and that of their families.  While black workers comprise 11.9 percent of the labor force, they make up 17 percent of all front-line workers.  They represent an even higher percentage in some key front-line industries: 26 percent of public transit workers; 19.3 percent of child care and social service workers; and 18.2 percent of trucking, warehouse and postal service workers.

African Americans have also disproportionately lost jobs during this pandemic.  The black employment-to-population ratio for adults fell from 59.4 percent before the start of the pandemic to a record low of 48.8 percent in April.  Not surprisingly, recent surveys find, as the Washington Post reports, that:

More than 1 in 5 black families now report they often or sometimes do not have enough food — more than three times the rate for white families. Black families are also almost four times as likely as whites to report they missed a mortgage payment during the crisis — numbers that do not bode well for the already low black homeownership rate.

This pandemic has hit African Americans especially hard precisely because they were forced to confront it from a position of economic and social vulnerability as the following trends help to demonstrate.

Unemployment

The Bureau of Labor Statistics began collecting separate data on African American unemployment in January 1972.  Since then, as the figure below shows, the African American unemployment rate has largely stayed at or above twice the white unemployment rate. 

As Olugbenga Ajilore explains:

Between strides in civil rights legislation, desegregation of government, and increases in educational attainment, employment gaps should have narrowed by now, if not completely closed. Yet as [the figure above] shows, this has not been the case.

Wages

The figure below, from an Economic Policy Institute study, shows the black-white wage gap for workers in different earning percentiles, by education level, and regression-adjusted (to control for age, gender, education, and regional differences).  As we can see, the wage gap has grown over time regardless of measure.

Elise Gould summarizes some important take-aways from this study:

The black–white wage gap is smallest at the bottom of the wage distribution, where the minimum wage serves as a wage floor. The largest black–white wage gap as well as the one with the most growth since the Great Recession, is found at the top of the wage distribution, explained in part by the pulling away of top earners generally as well as continued occupational segregation, the disproportionate likelihood for white workers to occupy positions in the highest-wage professions.

It’s clear from the figure that education is not a panacea for closing these wage gaps. Again, this should not be shocking, as increased equality of educational access—as laudable a goal as it is—has been shown to have only small effects on class-based wage inequality, and racial wealth gaps have been almost entirely unmoved by a narrowing of the black–white college attainment gap . . . . And after controlling for age, gender, education, and region, black workers are paid 14.9% less than white workers.

Income

The next figure shows that while median household income has generally stagnated for all races/ethnicities over the period 2000 to 2017, only blacks have suffered an actual decline.  The median income for black households actually fell from $42,348 to $40,258 over this period.  As a consequence, the black-white income gap has grown.  The median black household in 2017 earned just 59 cents for every dollar of income earned by the white median household, down from 65 cents in 2000.

Moreover, as Valerie Wilson points out, “Based on [Economic Policy Institute] imputed historical income values, 10 years after the start of the Great Recession in 2007, only African American and Asian households have not recovered their pre-recession median income.”  Median household income for African American households fell 1.9 percent, or $781, over the period 2007 to 2017.  While the decline was greater for Asian households (3.8 percent), they continued to have the highest median income.

Wealth

The wealth gap between black and white households also remains large.  In 1968, median black household wealth was $6,674 compared with median white household wealth of $70,768.  In 2016, as the figure below shows, it was $13,024 compared with $149,703.

As the Washington Post summarizes:

“The historical data reveal that no progress has been made in reducing income and wealth inequalities between black and white households over the past 70 years,” wrote economists Moritz Kuhn, Moritz Schularick and Ulrike I. Steins in their analysis of U.S. incomes and wealth since World War II.

As of 2016, the most recent year for which data is available, you would have to combine the net worth of 11.5 black households to get the net worth of a typical white U.S. household.
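The 11.5-household figure follows directly from the 2016 medians cited earlier ($149,703 for white households versus $13,024 for black households); a quick check of the arithmetic:

```python
# Median household wealth in 2016, from the figures cited above (in dollars).
white_median_wealth = 149_703
black_median_wealth = 13_024

# Number of median black households whose combined net worth matches
# one median white household.
print(round(white_median_wealth / black_median_wealth, 1))  # 11.5
```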

The self-reinforcing nature of racial discrimination is well illustrated in the next figure, which shows median household wealth by the education level of the head of household.

As we can see, black median household wealth is below white median household wealth at every education level, with the gap growing with the level of education.  In fact, the median black household headed by someone with an advanced degree has less wealth than the median white household headed by someone with only a high school diploma.  The primary reason for this is that wealth is passed on from generation to generation, and the history of racism has made it difficult for black families to accumulate wealth, much less pass it on to future generations.

Security

The dollar value of household ownership of liquid assets is one measure of economic security.  The greater the value, the easier it is for a household to weather difficult times, not to mention unexpected crises such as today’s pandemic.  And as one might expect in light of the above income and wealth trends, black households have far less security than white households.

As we can see in the following figure, the median black household held only $8,762 in liquid assets (defined as the sum of all cash, checking and savings accounts, and directly held stocks, bonds, and mutual funds).  In comparison, the median white household held $49,529 in liquid assets.  And the black-white gap is dramatically larger for households headed by someone with a bachelor’s degree or higher.

Hopeful possibilities

The fight against police violence against African Americans, now being advanced in the streets, will eventually have to be expanded and the struggle for racial justice joined to a struggle for economic justice.  Ending the disparities highlighted above will require nothing less than a transformational change in the organization and workings of our economy.

One hopeful sign is the widespread popular support for and growing participation in the Black Lives Matter-led movement that is challenging not only racist policing but the idea of policing itself and is demanding that the country acknowledge and confront its racist past.  Perhaps the ways in which our current economic system has allowed corporations to so quickly shift the dangers and costs of the pandemic onto working people, following years of steady decline in majority working and living conditions, are helping whites better understand the destructive consequences of racism and encouraging this political awakening.

If so, perhaps we have arrived at a moment where it will be possible to build a multi-racial working class-led movement for structural change that is rooted in and guided by a commitment to achieving economic justice for all people of color. One can only hope that is true for all our sakes.

Victory: Ohio’s plan to deny workers their unemployment insurance is shelved

Some stories are just so satisfying that they deserve to be shared.  Here is one.

In early May, Ohio Republican Governor Mike DeWine began reopening the state economy.  To support business and cut state expenses, both at workers’ expense, he had a “COVID-19 Fraud” form put up on the Ohio Department of Job and Family Services website where employers could confidentially report employees “who quit or refuse work when it is available due to COVID-19.”  Inspectors would then investigate whether the reported workers should lose their unemployment benefits and possibly be charged with unemployment fraud.

Significantly, as Sarah Ingles, the board president of the Central Ohio Worker Center, noted in a statement quoted by the Intercept, the form “does not define what constitutes a ‘good cause’ exemption, and by doing so, may exclude many Ohio workers who have justifiable reasons for not returning to work and for receiving unemployment insurance benefits.”  In other words, “while the state did not take the time to define what a ‘good cause’ exemption includes or does not include, it did have time to develop an online form where employers could report employees.”

However, thanks to the work of an anonymous hacker, the site has now been taken down. In officialese, “The previous form is under revision pending policy references.”  Most importantly, as Janus Rose writing for Motherboard reports:

“No benefits are being denied right now as a result of a person’s decision not to return to work while we continue to evaluate the policy,” ODJFS Director Kimberly Hall told Cleveland.com.

According to Rose, the hacker developed a script that overwhelmed the system:

The script works by automatically generating fake information and entering it into the form. For example, the companies are taken from a list of the top 100 employers in the state of Ohio—including Wendy’s, Macy’s, and Kroger—and names and addresses are randomly created using freely-available generators found online. Once all the data is entered, the script has to defeat a CAPTCHA-like anti-spam measure at the end of the form. Unlike regular CAPTCHAs, which display a grid of pictures and words that the user must identify, the security tool used by the form is merely a question-and-answer field. By storing a list of common questions and their respective answers, the script can easily defeat the security measure by simply hitting the “switch questions” button until it finds a question it can answer.

To make the code more accessible, software engineer David Ankin repackaged the script into a simple command line tool which allows users to run the script in the background of their computer, continuously submitting fake data to the Ohio website.

“If you get several hundred people to do this, it’s pretty hard to keep your data clean unless you have data scientists on staff,” Ankin told Motherboard.

The hacker told Motherboard they viewed their effort as a form of direct action against the exploitation of working people during the COVID-19 crisis.  Score one for working people.
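The logic Rose describes, fake-data generation plus a stored map of common questions and answers that cycles through challenges until it finds one it recognizes, can be sketched roughly as follows. All names, questions, and data here are hypothetical placeholders for illustration; this is not the actual script, and it performs no network submission:

```python
import random

# Hypothetical data for illustration -- the real script reportedly drew on
# a list of Ohio's top 100 employers and freely available name generators.
EMPLOYERS = ["Wendy's", "Macy's", "Kroger"]
FIRST_NAMES = ["Alex", "Jordan", "Sam"]
LAST_NAMES = ["Smith", "Lee", "Garcia"]

# The form's anti-spam measure was a question-and-answer field rather than
# an image CAPTCHA, so a stored map of common questions defeats it.
KNOWN_ANSWERS = {
    "What color is the sky?": "blue",
    "How many days are in a week?": "7",
}

def fake_report():
    """Generate one plausible-looking (but entirely fake) form entry."""
    return {
        "employer": random.choice(EMPLOYERS),
        "employee": f"{random.choice(FIRST_NAMES)} {random.choice(LAST_NAMES)}",
    }

def answer_challenge(get_next_question, max_switches=100):
    """Keep 'switching questions' until one with a known answer appears.

    get_next_question stands in for the form's "switch questions" button.
    Returns a (question, answer) pair, or None if no question is recognized.
    """
    for _ in range(max_switches):
        question = get_next_question()
        if question in KNOWN_ANSWERS:
            return question, KNOWN_ANSWERS[question]
    return None
```

As Rose notes, each submission only has to look superficially plausible; the point of the design is that hundreds of people running such a loop make the collected data worthless without serious cleaning effort.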

The 1930s and Now: Looking Back to Move Forward

My article What the New Deal Can Teach Us About Winning a Green New Deal is in the latest issue of the journal Class, Race and Corporate Power.  As I say in the abstract,

While there are great differences between the crises and political movements and possibilities of the 1930s and now, there are also important lessons that can be learned from the efforts of activists to build mass movements for social transformation during the Great Depression. My aim in this paper is to illuminate the challenges faced and choices made by these activists and draw out some of the relevant lessons for contemporary activists seeking to advance a Green New Deal.

Advocates of a Green New Deal often point to the New Deal and its government programs to demonstrate the possibility of a progressive state-directed process of economic change.  I wrote my article to show that the New Deal was a response to growing mass activity that threatened the legitimacy and stability of the existing economic and political order rather than elite good-will, and to examine the movement building process that generated that activity.

Depression-era activists were forced to organize in a period of economic crisis, mass unemployment and desperation, and state intransigence. While they fell short of achieving their goal of social transformation, they did build a movement of the unemployed and spark a wave of militant labor activism that was powerful enough to force state policy-makers to embrace significant, although limited, social reforms, including the creation of programs of public employment and systems of social security and unemployment insurance.

Differences between that time period and this one are shrinking, and the lessons we can learn from studying the organizing strategies and tactics of those activists are becoming ever more relevant.  The US economy is now in a deep recession, one that will be more devastating than the Great Recession.  US GDP shrank at a 4.8 percent annualized rate in the first quarter of this year and will likely contract at a far greater 25 percent annualized rate in the second quarter.  While most analysts believe the economy will begin growing again in the third quarter, their predictions are for an overall yearly decline in the 6-8 percent range.  As for the years ahead—no one can really say.  The Economist, for example, is talking about a “90 percent economy” for years after the current lockdown ends.  In other words, life will remain hard for most working people for some time.

Not surprisingly, given the size of the economic contraction, unemployment has also exploded. According to the Economic Policy Institute, “In the past six weeks, nearly 28 million, or one in six, workers applied for unemployment insurance benefits across the country.”  More than a quarter of the workforce has filed for benefits in the following states: Hawaii, Kentucky, Georgia, Rhode Island, Michigan, and Nevada.  And tragically, millions of other workers have been prevented from applying because of outdated state computer systems and punitive regulations as well as overworked employment department staff.  Even at its best, the US unemployment system, established in 1935 as part of the New Deal reforms, was problematic, paying too little, for too short a time period, and with too many eligibility restrictions.  Now, it is collapsing under the weight of the crisis.

Yet, at the same time, worker organizing and militancy is growing. Payday Report has a strike tracker that has already identified over 150 strikes, walkouts, and sickouts since early March across a range of sectors and industries, including retail, fast food, food processing, warehousing, manufacturing, public sector, health care, and the gig economy.  As an Associated Press story points out:

Across the country, the unexpected front-line workers of the pandemic — grocery store workers, Instacart shoppers and Uber drivers, among them — are taking action to protect themselves. Rolling job actions have raced through what’s left of the economy, including Pittsburgh sanitation workers who walked off their jobs in the first weeks of lockdown and dozens of fast-food workers in California who left restaurants last week to perform socially distant protests in their cars.

Rather than defending workers, governments are now becoming directly involved in suppressing their struggles. For example, after meatpacker walkouts, triggered by an alarming rise in the number of workers testing positive for the virus, closed at least 22 meat plants and threatened the operation of many others, President Trump signed an executive order requiring companies to remain open and fully staffed. It remains to be seen how workers will respond.  In Pennsylvania, the Governor responded to nurse walkouts at nursing homes and long-term care facilities to protest a lack of protective equipment by sending National Guard members to replace them.

Activists throughout the country are now creatively exploring ways to support those struggling to survive the loss of employment and those engaged in workplace actions to defend their health and well-being.  Many are also seeking ways to weave the many struggles and current expressions of social solidarity together into a mass movement for radical transformation.  Despite important differences in political and economic conditions, activists today are increasingly confronting challenges that are similar to ones faced by activists in the 1930s and there is much we can learn from a critical examination of their efforts.  My article highlights what I believe are some of the most important lessons.

Coronavirus: a return to normal is not good enough

We shouldn’t be satisfied with a return to normalcy. We need a “new normal.”

We are now in a recession, one triggered by government ordered closures of businesses producing nonessential goods and services, an action taken to limit the spread of the coronavirus. In response, Congress has approved three stimulus measures which legislators hope will keep the economy afloat until the virus is contained and companies can resume business as usual.

Many people, rightly criticizing the size, speed, and aims of these measures, have called for a new, improved stimulus package.  But what is getting far less attention, and may be the most important thing to criticize, is the notion that we should view a return to normalcy as our desired goal.  The fact is we also need a new economy.

The old normal only benefited a few

The media, even those critical of the Trump administration, all too often showcase economic experts who, while acknowledging the severity of the current crisis, reassure us that economic activity will return to normal before too long.  But since our economy increasingly worked to benefit a small minority, that is no cause for celebration.

Rarely mentioned is the fact that our economy was heading into a recession before the coronavirus hit. Or that living and working conditions for the majority of Americans were declining even during the past years of expansion. Or that the share of workers in low-wage jobs was growing over the last fifteen years.  Or that Americans are facing a retirement crisis.  Or that life expectancy fell from 2014 to 2017 because of the rise in mortality among young and middle-aged adults of all racial groups due to drug overdoses, suicides, and alcoholism.  If existing patterns of ownership and production remain largely unchanged, we face a future of ever greater instability, inequality, and poverty.

The economic crisis

The failings of our current system are only accentuated by the crisis. Many analysts are predicting an unprecedented single-quarter decline in GDP of 8 percent to 10 percent in the second quarter of this year.  The overall yearly decline may well be in the 5-7 percent range, the steepest annual drop since 1946.

The unemployment rate is soaring and may reach 20 percent before the year is out.  A recent national survey found that 52 percent of workers under the age of 45 have already lost their job, been placed on leave, or had their hours cut because of the pandemic-caused downturn.

As a consequence, many people are finding it difficult to pay rent.  Survey results show that only 69 percent of renters paid their rent during the first week of April compared with over 80 percent during the first week of March.  And this includes renters who made partial payments.  Homeowners are not in much better shape.

Our unemployment insurance system has long been deficient: benefits are inadequate, last for only a short period of time, and eligibility restrictions leave many workers uncovered. As of year-end 2019, the average unemployment insurance check was only $378 a week, the average duration of benefits was less than 15 weeks, and fewer than one-third of those unemployed were drawing benefits.

Now, the system is overwhelmed by people seeking to file new claims, leaving millions unable to even start their application process.  Although recent federal legislation allows states to expand their unemployment insurance eligibility and benefits, a very large share of those losing their jobs will find this part of our safety net not up to its assigned job.

A better crafted stimulus is needed

In response to the crisis, policy-makers have struggled to approve three so-called stimulus measures, the March 2020 Coronavirus Aid, Relief, and Economic Security (CARES) Act being the largest and most recent.  Unfortunately, these efforts have been disappointing.  For example, most of the provisions in the CARES Act include set termination dates untied to economic or health conditions. Approved spending amounts for individuals are also insufficient, despite Treasury Secretary Mnuchin’s claim that the $1,200 provided to most Americans as part of the CARES Act will be enough to tide them over for 10 weeks.

Also problematic is that not all CARES funds are directed to where they are most needed.  For example, no money was allocated to help states maintain their existing Medicaid program eligibility and benefit standards or expand health care coverage to uninsured immigrants and those who lose their job-based insurance.  And no money was allocated to state and local governments to help them maintain existing services in the face of declining tax revenues. Perhaps not surprisingly, the largest share of CARES approved spending is earmarked for corporate rescues without any requirement that the funds be used for saving jobs or wages.  In sum, we need another, better stimulus measure if we hope to minimize the social costs of our current crisis.

Creating a new normal

Even a better stimulus measure leaves our economy largely unchanged.  Yet, ironically, our perilous situation has encouraged countless expressions of social trust and solidarity that reveal ways to move forward to a more humane, egalitarian, and sustainable economy.  This starts with the growing recognition by many Americans that social solidarity, not competitive individualism, should shape our policies. People have demonstrated strong support for free and universal access to health care during this crisis, and we can build on that to push for an expansive Medicare for All health care system.  People also have shown great solidarity with the increasingly organized struggles of mail carriers, health care workers, bus drivers, grocery shoppers, cashiers, and warehouse workers to keep themselves safe while they brave the virus for our benefit.  We can build on that solidarity to push for new labor laws that strengthen the ability of all workers to form strong, democratic unions.

There is also growing support for putting social well-being before the pursuit of profit.  Many people have welcomed government action mandating that private corporations convert their production to meet social needs, such as the production of ventilators and masks.  We can build on this development to encourage the establishment of publicly owned and operated industries to ensure the timely and affordable production of critical goods like pharmaceuticals and health care equipment. And many people are coming to appreciate the importance of planning for future crises.  This appreciation can be deepened to encourage support for the needed transformation of our economy to minimize the negative consequences of the growing climate crisis.

We should not discount our ability to shape the future we want.

The Green New Deal and the State: Lessons from World War II—Part II

There is growing interest in a Green New Deal, but far too little discussion among supporters about the challenging nature of the required economic transformation, the necessary role of public planning and ownership in shaping it, or the strategies necessary to institutionalize a strong worker-community voice in the process and final outcome. In this two-part series I draw on the experience of World War II, when the state was forced to direct a rapid transformation from civilian to military production, to help encourage and concretize that discussion.

In Part I, I first discussed the need for a rapid Green New Deal-inspired transformation and the value of studying the U.S. experience during World War II to help us achieve it. Then, I examined the evolution, challenges, and central role of state planning in the wartime conversion to alert us to the kind of state agencies and capacities we will need to develop. Finally, I highlighted two problematic aspects of the wartime conversion and postwar reconversion which we must guard against: the ability of corporations to strengthen their dominance and the marginalization of working people from any decision-making role in conversion planning.

Here in Part II, I discuss the efforts of labor activists to democratize the process of transformation during the war period in order to sharpen our thinking about how best to organize a labor-community movement for a Green New Deal.  During this period, many labor activists struggled against powerful political forces to open up space for new forms of economic planning with institutionalized worker-community involvement.  The organizing and movement building efforts of District 8 leaders of the United Electrical, Radio & Machine Workers of America (UE), as described by Rosemary Feurer in her book Radical Unionism in the Midwest, 1900-1950, stand out in this regard.  Although their success was limited, there is much that we can learn from their efforts.

Organizing for a worker-community planned conversion process

District 8 covered Missouri, Iowa, Kansas, Arkansas, southern Indiana, and southern and western Illinois, and UE contracts in that area were heavily weighted towards small and medium-sized firms producing mechanical and electrical products.  As the government began its wartime economic conversion in 1941, its policy of suppressing civilian goods and rewarding big corporations with defense contracts hit the firms that employed UE members hard.

The UE response was to build a labor and community-based effort to gain control over the conversion process. In Evansville, Indiana, the UE organized a community campaign titled “Prevent Evansville from Becoming a Ghost Town.”  As Feurer explains,

District 8’s tentative proposal called upon union and civic and business leaders to request the establishment of a federal program that would “be administered through joint and bona fide union-management-government cooperation” at the local level. It would ensure that before reductions in the production of consumer goods were instituted, government must give enough primary war contracts and subcontracts to “take up the slack” of unemployment caused in cities such as Evansville. It also proposed that laid-off workers would get “first claim on jobs with other companies in the community,” while excessive overtime would be eliminated until unemployment was reduced.

District 8 organizers pressed Evansville’s mayor to gather community, labor, and business representatives from all over the Midwest to discuss how to manage the conversion to save jobs.  They organized mass petition drives and won endorsements for their campaign from many community groups and small businesses.  Persuaded, Evansville’s mayor contacted some 500 mayors from cities with populations under 250,000 in eleven midwestern states, requesting that they send delegations of “city officials, labor leaders, managers of industry and other civic leaders” to a gathering in Chicago.  Some 1500 delegates attended the September meeting.

The conference endorsed the UE’s call for a significant role for labor in conversion planning, specifically “equal participation of management and labor in determining a proper and adequate retraining program and allocation of primary and sub-contracts. . . [And that] all possible steps be taken to avoid serious dislocations in non-defense industries.”  A committee of seven, with two labor representatives, was chosen to draw up a more concrete program of action.

One result was that Evansville and Newton, Iowa (another city with a strong UE presence) were named “Priority Unemployment Plan” areas, and allowed to conduct “an experiment for community-based solving of unemployment and dislocations caused by war priorities.”  The plan restricted new plant construction if existing production capacity was considered sufficient, encouraged industry-wide and geographical-based pooling of production facilities to boost efficiency and stabilize employment, required companies to provide training to help workers upgrade their skills, and supported industry-wide studies to determine how to best adapt existing facilities for military production.

William Sentner, the head of District 8, called for labor to take a leading role in organizing community gatherings in other regions and creating regional planning councils. Unfortunately, CIO leaders did little to support the idea. Moreover, once the war started, unemployment stopped being a serious problem and the federal government took direct control over the conversion process.

Organizing for a worker-community planned reconversion process

As the war began to wind down, District 8 leaders once again took up the issue of conversion, this time conversion back to a peacetime economy.  In 1943, they got the mayor of St. Louis to create a community planning committee, with strong labor participation, to discuss future economic possibilities for the city.  In 1944, they organized a series of union conferences with elected worker representatives from each factory department in plants under UE contract throughout the district, along with selected guests, to discuss reconversion and postwar employment issues.

At these conferences District 8 leaders emphasized the importance of continued government planning to guarantee full employment, but also stressed that the new jobs should be interesting and fulfilling and the workweek should be reduced to 30 hours to allow more time for study, recreation, and family life.  They also discussed the importance of other goals: an expansion of workers’ rights in production; labor-management collaboration to develop and produce new products responsive to new needs; support for women who wanted to continue working, in part by the provision of nurseries; and the need to end employment discrimination against African Americans.

While these conferences were taking place, the Missouri River flooded, covering many thousands of acres of farmland with dirt and sand, and leaving thousands of people homeless.  The US Army Corps of Engineers rushed to take advantage of the situation, proposing a major dredging operation to deepen the lower Missouri River channel, an effort strongly supported by big shipping interests.  It became known as the Pick Plan. Not long after, the Bureau of Reclamation proposed a competing plan that involved building a series of dams and reservoirs in the upper river valley, a plan strongly supported by big agricultural interests. It became known as the Sloan Plan.

While lower river and upper river business interests battled, a grassroots movement grew across the region opposing both plans, seeing them, each in their own way, as highly destructive.  For example, building the dams and reservoirs would destroy the environment and require the flooding of hundreds of thousands of acres, much of it owned by small farmers, and leave tens of thousands of families without homes.

Influenced by the growing public anger, newspapers in St. Louis began calling for the creation of a new public authority, a Missouri Valley Authority (MVA), to implement a unified plan for flood control and development that was responsive to popular needs.  Their interest in an MVA reflected the popularity of the Tennessee Valley Authority (TVA), an agency created in 1933 and tasked with providing cheap electricity to homes and businesses and addressing many of the region’s other development challenges, such as flooding, land erosion, and population out-migration.  In fact, during the 1930s, several bills were submitted to Congress to establish other river-based regional authorities.  Roosevelt endorsed seven of them, but they all died in committee as Congress grew more conservative and war planning took center stage in Washington, DC.

District 8, building on its desire to promote postwar regional public planning, eagerly took up the idea of an MVA.  It issued a pamphlet titled “One River, One Plan” that laid out its vision for the agency.  As a public agency, it was to be responsive to a broad community steering committee; have the authority to engage in economic and environmental planning for the region; and, like the TVA, directly employ unionized workers to carry out much of its work.  Its primary tasks would be the electrification of rural areas and flood control through soil and water conservation projects and reforestation.  The pamphlet estimated that five hundred thousand jobs could be created within five years as a result of these activities and the greater demand for goods and services flowing from electrification and the revitalization of small farms and their communities.

District 8 used its pamphlet to launch a community-based grassroots campaign for its MVA, which received strong support from many unions, environmentalists, and farm groups.  And, in August 1944, Senator James Murray from Montana submitted legislation to establish an MVA, written largely with the help of District 8 representatives.  A similar bill was submitted in the House.  Both versions called for a two-year planning period with the final plan to be voted on by Congress.

District 8 began planning for a bigger campaign to win Congressional approval.  However, their efforts were dealt a major blow when rival supporters of the Pick and Sloan plans settled their differences and coalesced around a compromise plan.  Congress quickly approved the Pick-Sloan Flood Control Act in late December 1944 but, giving MVA supporters some hope that they could still prevail, Senator Murray succeeded in removing the act’s anti-MVA provisions.

District 8 leaders persuaded their national union to assign staff to help them establish a St. Louis committee, a nine-state committee, and a national committee to support the MVA. The St. Louis committee was formed in January 1945 with a diverse community-based steering committee.  Its strong outreach effort was remarkably successful, even winning support from the St. Louis Chamber of Commerce.  Feurer provides a good picture of the breadth and success of the effort:

By early 1945, other city-based committees were organizing in the nine-state region. A new national CIO committee for an MVA laid plans for “reaching every CIO member in the nine-state region on the importance of regionally administered MVA.”  In addition, other state CIO federations pledged to organize for an MVA and to disseminate material on the MVA through local unions to individual members.  Further, the seeds planted in 1944 among AFL unions were beginning to develop into a real coalition.  In Kansas City, the AFL was “circulating all the building trades unions in the nine states for support” to establish a nine-state building trades MVA committee. Both the AFL and CIO held valley-wide conferences on the MVA to promote and organize for it.

Murray submitted a new bill in February 1945, which included new measures on soil conservation and the protection of wild game, water conservation, and forest renewal. It also gave the MVA responsibility for the “disposal of war and defense factories to encourage industrial and business expansion.”

But the political tide had turned.  The economy was in expansion, the Democratic Party was moving rightward, and powerful forces were promoting a growing fear of communism.  Murray’s new bill was shunted to a hostile committee and big business mounted an unrelenting and successful campaign to kill it, arguing that the MVA would establish an undemocratic “super-government,” was a step toward “state socialism,” and was now unnecessary given passage of the Pick-Sloan Flood Control Act.

Drawing lessons

A careful study of District 8’s efforts, especially its campaign for an MVA, can help us think more creatively and effectively about how to build a labor-community coalition in support of a Green New Deal.  In terms of policy, there are many reasons to consider following District 8 in advocating for regionally based public entities empowered to plan and direct economic activity as a way to begin the national process of transformation.  For example, many of the consequences of climate change are experienced differently depending on region, which makes it far more effective to plan regional responses.  And many of the energy and natural resources that need to be managed during a period of transformation are shared by neighboring states.  Moreover, state governments, unions, and community groups are more likely to have established relations with their regional counterparts, making conversation and coordination easier to achieve.  Also, regionally organized action would make it much harder for corporations to use inter-state competition to weaken initiatives.

Jonathan Kissam, UE’s Communication Director and editor of the UE News, advocates just such an approach:

UE District 8’s Missouri Valley Authority proposal could easily be revived and modernized, and combined with elements of the British proposal for a National Climate Service. A network of regional Just Transition Authorities, publicly owned and accountable to communities and workers, could be set up to address the specific carbon-reduction and employment needs of different regions of the country.

The political lessons are perhaps the most important.  District 8’s success in building significant labor-community alliances around innovative plans for war conversion and then peacetime reconversion highlights the pivotal role unions can, or perhaps must, play in a progressive transformation process.  Underpinning this success was District 8’s commitment to sustained internal organizing and engagement with community partners.  Union members embraced the campaigns because they could see how a planned transformation of regional economic activity was the only way to secure meaningful improvements in workplace conditions, and such a transformation could only be won in alliance with the broader community.  And community allies, and eventually even political leaders, were drawn to the campaigns because they recognized that joining with organized labor gave them the best chance to win structural changes that also benefited them.

We face enormous challenges in attempting to build a similar kind of working class-anchored movement for a Green New Deal-inspired economic transformation.  Among them: weakened unions; popular distrust of the effectiveness of public planning and production; and weak ties between labor, environmental, and other community groups.  Overcoming these challenges will require our own sustained conversations and organizing to strengthen the capacities of, and connections between, our organizations and to develop a shared and grounded vision of a Green New Deal, one that can unite and empower the broader movement for change we so desperately need.

The Green New Deal and the State: Lessons from World War II—Part I

There is growing interest in a Green New Deal, but far too little discussion among supporters about the challenging nature of the required economic transformation, the necessary role of public planning and ownership in shaping it, or the strategies necessary to institutionalize a strong worker-community voice in the process and final outcome.  In this two-part series I draw on the experience of World War II, when the state was forced to direct a rapid transformation from civilian to military production, to help encourage and concretize that discussion.

In this post, I first discuss the need for a rapid Green New Deal-inspired transformation and the value of studying the US experience during World War II to help us achieve it.  Next, I examine the evolution, challenges, and central role of state planning in the wartime conversion of the US economy to alert us to the kind of state agencies and capacities we will need to develop. Finally, I highlight two problematic aspects of the wartime conversion and postwar reconversion which must be avoided if we hope to ensure a conversion to a more democratic and solidaristic economy.

In the post to follow, I will highlight the efforts of labor activists to democratize the process of transformation during the war period in order to sharpen our thinking about how best to organize a labor-community mass movement for a Green New Deal.

The challenge of transformation

We are already experiencing a climate crisis, marked by extreme weather conditions, droughts, floods, warming oceans, rising sea levels, fires, ocean acidification, and soil deterioration.  The Special Report on Global Warming of 1.5°C by the Intergovernmental Panel on Climate Change underscores the importance of limiting the increase in the global mean temperature to 1.5 degrees Celsius above pre-industrial levels by 2100 if we are to avoid ever worsening climate disasters and “global scale degradation and loss of ecosystems and biodiversity.” The report makes clear that achieving this goal requires reducing global net carbon dioxide emissions by 45 per cent by 2030 and then reaching net zero emissions by 2050.

Tragically, despite the seriousness of the crisis, we are on track for a far higher global mean temperature.  Even big business is aware of what is at stake. Two researchers employed by JP Morgan, the world’s largest financer of fossil fuels, recently published an internal study that warns of the dangers of climate inaction. According to the Guardian, which obtained a copy of the report, “the authors say policymakers need to change direction because a business-as-usual climate policy ‘would likely push the earth to a place that we haven’t seen for many millions of years,’ with outcomes that might be impossible to reverse.”

It is easy to see why growing numbers of people are attracted to the idea of a Green New Deal.  The Green New Deal promises a rapid and dramatic curtailing of fossil fuel use as part of a broader transformation to a more sustainable, egalitarian, and socially responsive economy.  Such a transformation will, by necessity, involve massive new investments to promote the production and distribution of clean renewable energy, expand energy efficient public transit systems, support regenerative agriculture, and retrofit existing homes, offices, and factories.  The Green New Deal also promises new, publicly funded programs designed to ensure well-paid and secure employment for all; high-quality universal health care; affordable, safe public housing; clean air; and healthy and affordable food.

Unfortunately, the proposed Green New Deal investments and programs, as attractive and as needed as they may be, are unlikely on their own to achieve the required reduction in carbon emissions.  It is true that many Green New Deal investments and programs can be expected to lower overall energy demand, thereby making it easier for rapidly growing supplies of clean energy to support economic activity.  But even though renewable energy production is growing rapidly in the US, it still accounts for less than 15 percent of total US energy consumption and less than 20 percent of electricity generation.  And based on the experience of other countries, increasing the production of renewable energy does not, by itself, guarantee a significant decline in the production and use of fossil fuels, especially when they remain relatively cheap and plentiful.

Rapid decarbonization will also require direct government action to force down the production of fossil fuels and make their use prohibitively expensive.  And this action will have significant consequences.  For example, limiting fossil fuel production will leave fossil fuel companies with enormous unused and therefore worthless assets.  Raising the price of fossil fuels will sharply increase the cost of flying, with negative consequences for the large manufacturers of airplanes and their subcontractors.  It will also increase the cost of gasoline, with negative consequences for automobile companies that produce gas guzzling cars.  Other major industries will also be affected, for example, the home building industry that specializes in large suburban homes, and the financial sector that has extended loans to firms in all these industries.

Thus, any serious attempt to rapidly force down fossil fuel use can be expected to negatively affect important sectors of the economy. Proposed Green New Deal investments and social policy initiatives will lay the foundation for a new economy, helping to boost employment and absorb some of the newly created excess capacity, but given the need for a speedy transformation to head off climate catastrophe, the process, if left unplanned, could easily end up dragging the economy down.

As difficult as this process appears, we do have historical experience to draw upon that can help us prepare for some of the challenges we can expect to face: the experience of World War II, when the US government was forced to initiate a rapid transformation of the US economy from civilian to military production.  New planning bodies were created to direct resources away from civilian use, retrain workers, encourage retooling of parts of the civilian economy to produce military goods and services, and direct massive investments to build new facilities to expand production or produce new goods needed for the war effort.  While the war-time experience is far from a model to be recreated, advocates of a Green New Deal can learn much from studying it.

World War II planning

The shift to a war economy began gradually in 1939, some two years before the US actually entered the war. In June 1939, Congress passed the Strategic and Critical Materials Stockpiling Act, which called for establishing reserves of strategic materials necessary for defense.  In August 1939, President Roosevelt established the War Resources Board to help the Joint Army and Navy Munitions Board develop plans for mobilizing the economic resources of the country in the event of war.

In June 1940, a National Roster of Scientific and Specialized Personnel was created.  In August 1940, the Defense Plant Corporation was created and charged with planning how to expand the nation’s ability to produce military equipment.  And in September 1940, Congress approved the Selective Training and Service Act of 1940, which required all men between the ages of 21 and 45 to register for the draft.

In January 1941, President Roosevelt created the Office of Production Management to centralize all federal procurement programs concerned with the country’s preparation for war.  Shortly after the US entered the war, this office was replaced by the War Production Board (WPB), which was tasked with directing the conversion of industries from civilian to military work; the allocation of scarce materials; and the establishment of priorities for the distribution of goods and services, including those to be rationed.

The conversion to a war economy, and the end of the depression, roughly dates to the second half of 1941, when defense spending sharply accelerated.  Federal spending on goods and services for national defense rose from 2.2 percent of GNP in 1940 to 11 percent of GNP in 1941. This was the last year that military-generated activity was compatible with growing civilian production. In 1942, military spending soared to 31 percent of GNP.  From then to the end of the war, civilian production was suppressed in order to secure the desired growth in military production.

For example, real consumer durable expenditures reached $24.7 billion (in 1972 dollars) or 6.2 percent of GNP in 1941.  The following year they fell to $16.3 billion or 3.6 percent of GNP.  Real personal consumption, which grew by 6.2 percent in 1941, fell absolutely the following year.  Between 1940 and 1944, the total production of non-war goods and services fell from $180 billion to $164 billion (in 1950 dollars).  In contrast, real federal purchases of military commodities grew from $18 billion in 1941 to $88 billion in 1944 (in 1947 dollars), accounting for approximately one-half of all commodities produced that year.

No doubt, the high level of unemployment that existed at the start of the conversion made it easier to ramp up military production—but the military itself soon absorbed a large share of the male working age population.  Moreover, the challenge facing planners was not just that of ramping up production in a depressed economy, but of converting the economy to produce different goods, often in new locations.  This required the recruitment, training, and placement of millions of workers in accordance with ever changing industrial, occupational, and geographic requirements.

In the period of preparation for war, perhaps the biggest challenge was training.  It was largely met thanks to vocational training programs organized by the Employment Division.  These training programs made use of ongoing New Deal programs such as the Civilian Conservation Corps, Works Progress Administration, and National Youth Administration; the existing network of schools and colleges; and a Training-Within-Industry program.  Once the war began, the War Manpower Commission continued the effort.  Altogether, some 7 million people went through training programs, almost half through Training-Within-Industry programs.

The hard shift from a civilian driven economy into a military dominated one was, to a large degree, forced on the government by corporate concerns over future profitability.  In brief, most large corporations were reluctant to expand their productive capacity for fear that doing so would leave them vulnerable to a post-war collapse in demand and depression.  Among the most resistant were leading firms in the automobile, steel, oil, electric power, and railroad industries.  At the same time, these firms also opposed the establishment of government owned enterprises, fearing they might become post-war competitors or, even worse, encourage popular interest in socialism.

Unwilling to challenge business leaders, the government took the path of least resistance—it agreed to support business efforts to convert their plant and equipment from civilian to military production; offer businesses engaged in defense work cost-plus contracting; and suppress worker wages and their right to strike.  And, if the government did find it necessary to invest and establish new firms to produce critical goods, it agreed to allow private businesses to run them, with the option to purchase the new plant and its equipment at a discounted price at the war’s conclusion.  As a consequence, big business did quite well during the war and was well positioned to be highly profitable in the years following its end.

Business reluctance to invest in expanding capacity, including in industries vital to the military, meant that the government had to develop a number of powerful new planning bodies to ensure that the limited output was allocated correctly and efficiently across the military-industrial supply chain.  For example, raw steel production grew only 8 percent from 1941 to the 1944 wartime peak.  Crude petroleum refining capacity grew only 12 percent between 1941 and 1945.  Leading firms in the auto industry were also reluctant to give up sales or engage in conversion to military production, initially claiming that no more than 15 percent of their machine tools were convertible.  But, once the war started and US planners regulated steel use, giving priority to military production, the auto industry did retool and produce a range of important military goods, including tanks, jeeps, trucks, and parts and subassemblies for the aircraft industry, such as engines and propellers.

In many cases, corporate foot-dragging forced the government to establish its own production. Thus, while steel ingot capacity expanded by a modest 17 percent from 1940 to 1945, almost half of that increase came from government owned firms.  The role of government production was probably greatest in the case of synthetic rubber.  The US had relied on imports for some 90 percent of its supply of natural rubber, mostly from countries that fell under Japanese control.  Desperate for synthetic rubber to maintain critical civilian and military production, the government pursued a massive facility construction program. Almost all of the new capacity was financed and owned by the government and then leased to private operators for $1 per year. Thanks to this effort, synthetic rubber output rose from 22,434 long tons in 1942 to 753,111 long tons in 1944.  The Defense Plant Corporation ended up financing and owning approximately one-third of all the plant and equipment built during the war.

The War Production Board, created by presidential executive order in January 1942, was the country’s first major wartime planning agency.  Roosevelt chose Donald M. Nelson, a Sears Roebuck executive, to be its chairperson. Other members of the board were the Secretaries of War, Navy, and Agriculture, the lieutenant general in charge of War Department procurement, the director of the Office of Price Administration, the Federal Loan Administrator, the chair of the Board of Economic Warfare, and the special assistant to the President for the defense aid program.

The WPB managed twelve regional offices, and operated some one hundred twenty field offices throughout the country.  Their work was supported by state-level war production boards, which were responsible for keeping records on the firms engaged in war production in their respective states, including whether they operated under government contract.

However, despite its vast information gathering network, the WPB was never able to take command of the conversion of the economy. To some extent that was because Nelson proved to be a weak leader. But a more important reason was that the WPB had to contend with a number of other powerful agencies that were each authorized to direct the output of a specific critical industry.  The result was a kind of free-for-all when it came to developing and implementing a unified plan.

Perhaps the most powerful independent agency was the Army-Navy Munitions Board.  And early on the WPB ceded its authority over the awarding of military contracts to it. The Army and Navy awarded more contracts than could be fulfilled, creating problems in the supply chain as firms competed to obtain needed materials.  Turf fights among government agencies led to other problems. For example, the Office of Defense Transportation and the Petroleum Administration for War battled over who could decide petroleum requirements for transportation services.  And the Office of Price Administration fought the Solid Fuels Administration over who would control the rationing of coal.

A Bureau of the Budget history of the period captures some of the early chaos:

Locomotive plants went into tank production when locomotives were more necessary than tanks . . . Truck plants began to produce airplanes, a change that caused shortages of trucks later on . . . Merchant ships took steel from the Navy, and the landing craft cut into both. The Navy took aluminum from aircraft.  Rubber took valves from escort vessels, from petroleum, from the Navy.  The pipe-lines took steel from ships, new tools, and the railroads. And at every turn there were foreign demands to be met as well as requirements for new plants.

In response to the chaos, Roosevelt established another super agency in May 1943, the Office of War Mobilization (OWM).  This agency, headed by James F. Byrnes, a former politician and Supreme Court justice, was given authority over the WPB and the other agencies.  In fact, Byrnes’ authority was so great, he was often called the “assistant President.”

The OWM succeeded in installing a rigorous system of materials control and bringing order to the planning process.  As a result, civilian production was efficiently suppressed and military production steadily increased.  Over the period 1941 to 1945, the US was responsible for roughly 40 percent of the world’s production of weapons and supplies, with little increase in the nation’s capital stock.

The experience highlighted above shows the effectiveness of planning, and that a contemporary economic conversion based on Green New Deal priorities, in which fossil fuel dependent industries are suppressed in favor of more sustainable economic activity, can be achieved.  It also shows that a successful transformation will require the creation of an integrated, multi-level system of planning, and that the process of transformation can be expected to generate challenges that will need to be handled with flexibility and patience.

World War II planning: cautionary lessons

The war-time conversion experience also holds two important cautionary lessons for a Green New Deal-inspired economic transformation.  The first is the need to remain vigilant against the expected attempt by big business to use the planning process to strengthen its hold on the economy.  If we are to achieve our goal of creating a sustainable, egalitarian, and solidaristic economy, we must ensure a dominant and ongoing role for public planning of economic activity and an expansive policy of public ownership, both taking over firms that prove resistant to the transformation and retaining ownership of newly created firms.

Unfortunately, the federal government was all too willing to allow big corporations to dominate the war-time conversion process as well as the peacetime reconversion, thereby helping them boost their profits and solidify their post-war economic domination.  For example, the Army and Navy routinely awarded their defense contracts to a very few large companies.  And these companies often chose other big companies as their prime subcontractors.  Small and medium sized firms also struggled to maintain their production of civilian goods because planning agencies often denied them access to needed materials.

Harold G. Vatter highlights the contract preference given to big firms during the war, noting that:

of $175 billion of primary contracts awarded between June 1940 and September 1944, over one-half went to the top 33 corporations (with size measured by value of primary supply contracts received). The smallest 94 percent of prime supply contract corporations (contracts of $9 million or less) got 10 percent of the value of all prime contracts in that period.

The same big firms disproportionally benefited from the reconversion process.  In October 1944, the OWM was converted into the Office of War Mobilization and Reconversion (OWMR), with Byrnes remaining as head.  The OWMR embraced its new role and moved quickly to achieve the reconversion of the economy.  It overcame opposition from the large military contractors, who were reluctant to give up their lucrative business, by granting them early authorization to begin production of civilian goods, thereby helping them dominate the emerging consumer markets.

The OWMR was also generous in its post-war distribution of government assets. The government, at war’s end, owned approximately $17 billion of plant and equipment.  These holdings, concentrated in the chemical, steel, aluminum, copper, shipbuilding, and aircraft industries, were estimated to be about 15 percent of the country’s total postwar plant capacity.  The government also owned “surplus” war property estimated to be worth between $50 billion and $70 billion.

Because of the way government wartime investment had been structured, there was little question about who would get the lion’s share of these public assets.  Most government owned plants were financed under terms specifying that the private firms operating them would be given the right to purchase them at war’s end if desired.  Thus, according to one specialist, roughly two-thirds of the $17 billion of government plant and equipment was sold to 87 large firms.  The “bulk of copolymer synthetic rubber plants went to the Big Four in rubber; large chemical plants were sold to the leading oil companies, and U.S. Steel received 71 percent of government-built integrated steel plants.”

The second cautionary lesson is the need to resist efforts by the government, justified in the name of efficiency, to minimize the role of unions, and working people more generally, in the planning and organization of the economic conversion. The only way to guarantee that a Green New Deal-inspired transformation will create an economy responsive to the needs of working people and their communities is to create institutional arrangements that secure popular participation in decision-making at all levels of economic activity.

While organized labor had at least an advisory role in prewar planning agencies, once the war began, it was quickly marginalized, and its repeated calls for more participation rejected.  For example, Sidney Hillman (head of the Amalgamated Clothing Workers) was appointed to be one of two chairs of the Office of Production Management (OPM), which was established in January 1941 to oversee federal efforts at national war preparation. The other was William S. Knudsen (president of General Motors).  The OPM also included a Labor Bureau, also led by Hillman, which was to advise it on labor recruitment, training, and mobilization issues, as well as Labor Advisory Committees attached to the various commodity and industry branches that reported to the OPM.

The labor presence was dropped from the War Production Board, which replaced the OPM in January 1942; Roosevelt appointed a businessman, Donald M. Nelson, to be its sole chair.  Hillman was appointed director of the board’s Labor Division, but that division was soon eliminated and its responsibilities transferred to the newly created War Manpower Commission in April 1942.

More generally, as organized labor found itself increasingly removed from key planning bodies, workers found themselves increasingly asked to accept growing sacrifices.  Prices began rising in 1940 and 1941 as the economy slowly recovered from the depression and began its transformation to war production.  In response, workers pushed for significant wage increases, which the government, concerned about inflation, generally opposed.  In 1940, there were 2,500 strikes, producing 6.7 million labor-days idle.  The following year there were 4,300 strikes, with 23.1 million labor-days idle.

Hillman called for a national policy of real wage maintenance based on inflation indexing that would also allow the greatest wage gains to go to those who earned the least, but the government took no action.  As war mobilization continued, the government sought a number of concessions from the unions.  For example, it wanted workers to sacrifice their job rights, such as seniority, when they were transferred from nondefense to defense work.  Union leaders refused.  Union leaders also demanded, unsuccessfully, that military contracts not be given to firms found to violate labor laws.

Worried about disruptions to war production, Roosevelt established the War Labor Board by executive order in January 1942.  The board was given responsibility for stabilizing wages and resolving disputes between workers and managers at companies considered vital to the war effort. The board’s hard stand on wage increases was set in July, when it developed its so-called “Little Steel Formula.” Ruling in a case involving the United Steelworkers and the four so-called “Little Steel” companies, the board decided that although steelworkers deserved a raise, it had to be limited to the amount that would restore their real earnings to their prewar level, which it set at January 1, 1941.  Adding insult to injury, the board relied on a faulty price index that underestimated the true rate of inflation since the beginning of 1941.

Thus, while corporations were able to pursue higher profits, workers would have to postpone their “quest for an increasing share of the national income.”  Several months later, Roosevelt instructed the War Labor Board to use a similar formula, although with a different baseline, in all its future rulings.  Not surprisingly, the number of strikes continued to rise throughout the war years despite a December 1941 pledge by AFL and CIO leaders not to call strikes for the duration of the war.

In June 1943, with strikes continuing, especially in the coal fields, Congress passed the War Labor Disputes Act.  The act gave the president the power to seize and operate privately owned plants when an actual or threatened strike interfered with war production. Subsequent strikes in plants seized by the government were prohibited. The act was invoked more than 60 times during the war. The act also included a clause that made it illegal for unions to contribute to candidates for office in national elections, clearly an attempt to weaken labor’s political influence.

Although wage struggles drew most attention, union demands were far more expansive.  As Vatter describes:

Organized labor wanted wartime representation and participation in production decision-making at all levels, not merely the meaningless advisory role allotted to it during the preparedness period. But from the outset, management maintained a chronic hostile stance on the ground that management-labor industry councils such as proposed by Walter Reuther and CIO President Philip Murray in 1940 would, under cover of patriotism, undermine management’s prerogatives and inaugurate a postwar “sovietization” of American industry.

Unions often pointed to the chaos of early planning, as captured by the Budget Bureau history, arguing that their active participation in production decisions would greatly improve overall efficiency.  The government’s lack of seriousness about union involvement is best illustrated by the WPB’s March 1942 decision to establish a special War Production Drive Division that was supposed to encourage the voluntary creation of labor-management plant committees.  However, the committees were only allowed to address specific physical production problems, not broader labor-management issues or production coordination across firms. Most large firms didn’t even bother creating committees.

Significantly, there was only one time that the government encouraged and supported popular participation in wartime decision-making, and that effort proved a great success.  Inflation was a constant concern of the government throughout the war years, largely because it was a trigger for strikes that threatened wartime production.  The Office of Price Administration tried a variety of voluntary and bureaucratic controls to limit price increases on consumer goods and services, especially food, with little success.  Finally, beginning in mid-1943, and over the strong opposition of business, it welcomed popular participation in the operation of its price control system.

Tens of thousands of volunteers were formally authorized to visit retail locations throughout the country to monitor business compliance with the controls, and tens of thousands of additional volunteers were chosen to serve on price boards that were empowered to fine retailers found in violation of the controls.  As a result, prices remained relatively stable from mid-1943 until early 1946, when the government abruptly ended the system of controls.  This was an incredible achievement considering that the production of civilian goods and services declined over those years, while consumer purchasing power and the money supply rose.


Part II will take up organizing and movement-building lessons.