Pandemic economic woes continue, but so do deep structural problems, especially the long-term growth in the share of low wage jobs

Many are understandably alarmed about what the September 4th termination of several special federal pandemic unemployment insurance programs will mean for millions of workers.  Twenty-five states ended their participation in these programs months earlier, with government and business leaders claiming that termination would spur employment and economic activity.  However, several studies have refuted those claims.

One study, based on the experience of 19 of these states, found that for every 8 workers who lost benefits, only one found a new job.  Consumer spending in those states fell by $2 billion, with every lost $1 of benefits leading to a fall in spending of 52 cents.  It is hard to see how anything good can come from the federal government’s willingness to allow these programs to expire nationwide.

The Biden administration appears to believe that adoption of its physical infrastructure bill and $3.5 trillion spending plan will ensure that those left without benefits will find new jobs.  But chances for Congressional approval are growing dim.  Even more important, and largely overlooked in the debate over whether the time is right to replace the pandemic unemployment insurance programs with new spending measures, is that an increasing share of the jobs created by economic growth is low-wage, and thus inadequate to ensure workers and their families an acceptable standard of living.

For example, according to another study, the share of low wage jobs has been steadily growing since 1979.  More specifically, the share of workers (18-64 years of age) with a low wage job rose from 39.1 percent in 1979 to 45.2 percent in 2017.  For workers 18 to 34 without a college degree, the share soared from 46.9 percent to 61.6 percent over the same years.  Thus, a meaningful improvement in worker well-being will require far more than a return to “normal” labor market conditions.  It will require building a movement able to directly challenge and transform the way the US economy operates.

The importance of government programs

The figure below provides some sense of how important government programs have been to working people.  Government support was truly a lifeline, delivering a significant boost to total monthly personal income (relative to the February 2020 start of the pandemic-triggered recession), especially during the first months.  Even now, despite the fact that the recession has officially been declared over, government support still accounts for approximately half the increase in total monthly income.

The government’s support of personal income was anchored by three special unemployment insurance programs–the Federal Pandemic Unemployment Compensation (FPUC), Pandemic Emergency Unemployment Compensation (PEUC), and Pandemic Unemployment Assistance (PUA). 

The FPUC was authorized by the March 2020 CARES Act and renewed by subsequent legislation and a presidential order. It originally provided $600 per week in extra unemployment benefits to unemployed workers in states that opted in to the program. In August 2020, the extra payment was lowered to $300.

The PEUC was also established by the CARES Act. It provided up to 13 weeks of extended unemployment compensation to individuals who had exhausted their regular unemployment insurance compensation.  This was later extended to 24 additional weeks and then by a further 29 weeks, allowing for a total of 53 weeks.  The PUA allowed states to provide unemployment assistance to the self-employed and those seeking part-time employment, or who otherwise did not qualify for regular unemployment compensation.

Tragically, the federal government allowed all three programs to expire on September 4th. Months earlier, in June 2021, 25 states actually ended these programs for their unemployed workers, eliminating benefits for over 2 million people.  Several studies, as we see next, have documented the devastating cost of that decision.

The cost of state program termination

Beginning in April 2021, a number of business analysts and politicians began to aggressively argue that federally provided unemployment benefit programs were no longer needed.  In fact, according to them, the programs were actually keeping workers from pursuing available jobs, thereby holding back the country’s economic recovery. Using these arguments as cover, in June, 25 states ended their participation in one or more of these programs. 

For example, Henry McMaster, the governor of South Carolina, announced his decision to end his state’s participation in the federal programs, saying: “This labor shortage is being created in large part by the supplemental unemployment payments that the federal government provides claimants on top of their state unemployment benefits.”

Similarly, Tate Reeves, the governor of Mississippi, stated in a May 2021 tweet:

It has become clear to me that we cannot have a full economic recovery until we get the thousands of available jobs in our state filled. . . . Therefore, I have informed the Department of Employment Security to direct the Biden Administration that Mississippi will be opting out of the additional federal unemployment benefits as early as federal law allows—June 12, 2021.

The argument that these special federal unemployment benefit programs hurt employment and economic activity was tested and found wanting.  Business Insider highlights the results of several studies:

Economist Peter Ganong, who co-authored a paper that found the disincentive effect of benefits was small, told the [Wall Street] Journal: “If the question is, ‘Is UI [unemployment insurance] the key thing that’s holding back the labor market recovery?’ The answer is no, definitely not, based on the available data.” 

That aligns with other early research on the impact of benefits ending. CNBC reports that analyses from payroll firms UKG and Homebase both found that employment didn’t go up in the states cutting off the benefits; in fact, that Homebase analysis found that employment declined in the states opting out of federal benefits, while it went up in states that chose to retain benefits. In June, Indeed’s Hiring Lab found that job searches in states ending benefits were below April’s baseline.

In July, Arindrajit Dube, an economics professor at University of Massachusetts Amherst, found that ending benefits didn’t make workers rush back. “Even as there was a clear reduction in the number of people who were receiving unemployment benefits — and a clear increase in the number of people who said that they were having difficulty paying their bills — that didn’t seem to translate, at least in the short run, into an uptick in overall employment rates,” Dube told Insider at the time.

Dube, along with five other researchers, examined “the effect of withdrawing pandemic UI on the financial and employment trajectories of unemployed workers in [19] states that withdrew benefits, compared to workers with the same unemployment duration in states that retained these benefits.” 

They found, as noted above, that for every 8 workers who lost their benefits, only 1 found a new job.  And for every $1 of reduced benefits, spending fell by 52 cents—only 7 cents of new income was generated for each dollar of lost benefits. “Extrapolating to all UI recipients in the early withdrawal states, we estimate these states eliminated $4 billion in unemployment benefits paid by federal transfers as of August 6 [2021].  Spending fell by $2 billion and earnings rose by $270 million.  These states therefore saw a much larger drop in federal transfers than gains from job creation.”
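
The arithmetic behind these figures is simple to check. The short Python sketch below just multiplies the reported totals and rates to show how they hang together; the variable names are mine, and the small differences from the study’s rounded totals reflect rounding in the reported rates.

```python
# Back-of-the-envelope check of the reported estimates for the 19
# early-termination states (figures as cited above; names are illustrative).

lost_benefits = 4.0e9            # UI benefits eliminated as of August 6, 2021, in dollars
spending_drop_per_dollar = 0.52  # fall in consumer spending per $1 of lost benefits
earnings_gain_per_dollar = 0.07  # new earnings generated per $1 of lost benefits

spending_drop = lost_benefits * spending_drop_per_dollar
earnings_gain = lost_benefits * earnings_gain_per_dollar

print(f"Spending drop: ${spending_drop / 1e9:.2f} billion")  # ~2.08, vs. the reported ~$2 billion
print(f"Earnings gain: ${earnings_gain / 1e9:.2f} billion")  # ~0.28, vs. the reported $270 million
```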

An additional 8 million workers have now lost benefits because of the federal termination of these special unemployment insurance programs.  It is hard to be optimistic about what awaits them, given the experience of the early termination states.  And equally important, even if the “optimists” are proven right, and those workers are able to find employment, there is still reason for concern about the likely quality of those jobs given long-term employment trends.

The lack of decent jobs

There is no agreed upon definition of a low wage job.  David R. Howell and Arne L. Kalleberg note two of the most popular in their study of declining job quality in the United States.  One is to define low wage jobs as those that pay less than two-thirds of the median hourly wage.  The other, used by the OECD, is to define low wage jobs as those that pay less than two-thirds of the median hourly wage for full-time workers.

Howell and Kalleberg find both inadequate.  Instead, they define low wage jobs as those that pay less than two-thirds of the mean hourly wage for full-time prime-age workers (35-59).  Their definition sets the dividing line between low wage and what they call “decent” wage jobs at $17.50 in 2017.  As they explain:

This wage is well above the wage that would make a full-time (or near full-time) worker eligible for food stamps and several dollars above the basic needs budget for a single adult in most American cities, but is conservative in that the basic needs budget for a single adult with one child ranges from $22 to $30.

The figure below, based on their definition, shows the growth in low wage jobs for workers 18-34 years of age without a college degree (in blue), all workers 18-64 years of age (in gold), and prime age workers 35-59 years of age (in green).  Their dividing line between low wage and decent wage jobs, equivalent to $17.50 in 2017, is far from a generous wage.  Yet, all three groupings show an upward trend in the share of low wage jobs.  

The authors then divide their low wage and decent wage categories into upper and lower tiers.   The lower tier of the low wage category includes jobs that pay less than two-thirds of the median wage for full-time workers, which equaled $13.33 in 2017.  As the authors report:

Based on evidence from basic needs budgets, this is a wage that, even on a full-time basis, would make it extremely difficult to support a minimally adequate standard of living for even a single adult anywhere in the country. This wage threshold ($13.33) is just above the wage cutoff for food stamps ($12.40) and Medicaid ($12.80) for a full- time worker (thirty-five hours per week, fifty weeks per year) with a child; full-year work at thirty hours per week would make a family of two eligible for the food stamps with a wage as high as $14.46 and as high as $14.94 for Medicaid.  For this reason, we refer to this as the poverty-wage threshold.

The lower tier of the decent wage category includes jobs that pay less than 50 percent more than the decent-job threshold, which equaled $26.50 in 2017.  The figure below shows the overall job distribution in 2017.
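
For readers who find it easier to see the scheme laid out in code, here is a minimal illustrative sketch (in Python) of the resulting four-tier classification, using the 2017 hourly thresholds quoted above; the function name and tier labels are mine, not the authors’.

```python
# Illustrative classifier for Howell and Kalleberg's four wage tiers,
# using the 2017 hourly-wage thresholds quoted in the text.

POVERTY_WAGE = 13.33   # lower tier of low wage: below 2/3 of the full-time median wage
LOW_WAGE     = 17.50   # low-wage/decent-wage line: 2/3 of the full-time prime-age mean wage
DECENT_UPPER = 26.50   # lower/upper tier line within decent-wage jobs

def wage_tier(hourly_wage: float) -> str:
    """Assign an hourly wage (in 2017 dollars) to one of the four tiers."""
    if hourly_wage < POVERTY_WAGE:
        return "low wage, lower tier (poverty wage)"
    if hourly_wage < LOW_WAGE:
        return "low wage, upper tier"
    if hourly_wage < DECENT_UPPER:
        return "decent wage, lower tier"
    return "decent wage, upper tier"

for wage in (11.00, 15.00, 20.00, 30.00):
    print(f"${wage:5.2f}/hour -> {wage_tier(wage)}")
```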

The following table shows the changing distribution of jobs over the years 1979 to 2017 for all workers 18 to 64, for workers 18-34 without a college degree, and for workers 18-34 with a college degree.

While the share of upper-tier decent jobs held by workers 18 to 64 has remained relatively stable, there has been a notable decline in the share of workers with lower-tier decent jobs.  Also worth noting is the rise in the share of poverty-level low wage jobs. 

Perhaps most striking is the large decline in the share of decent jobs held by workers 18 to 34, those with and those without a college degree.  The share of poverty level jobs held by those without a college degree soared from 35.7 percent to 53.5 percent.  The share of low wage jobs also spiked for those with a college degree, rising from 22 percent to 39.1 percent, with an increase in the share of both low-wage tiers.

This long-term decline in job quality will not reverse on its own.  And, not surprisingly, corporate leaders remain largely opposed to policies that might threaten the status quo.

So, do we need a better unemployment insurance system? For sure.  Do we need a better funded and more climate resilient social and physical infrastructure?  Definitely.  But we also need a dramatically different economy, one that, in sharp contrast to our current system, is grounded in greater worker control over both the organization and aims of production.  Lots of work ahead.

Learning from history: community-run child-care centers during World War II

We face many big challenges.  And we will need strong, bold policies to meaningfully address them.  Solving our child-care crisis is one of those challenges, and a study of World War II government efforts to ensure accessible and affordable high-quality child care points the way to the kind of bold action we need. 

The child care crisis

A number of studies have established that high-quality early childhood programs provide significant community and individual benefits.  One found that “per dollar invested, early childhood programs increase present value of state per capita earnings by $5 to $9.”  Universal preschool programs have also been shown to offer significant benefits to all children, even producing better outcomes for the most disadvantaged children than means-tested programs.  Yet, even before the pandemic, most families struggled with a lack of desirable child-care options.    

The pandemic has now created a child-care crisis. As Lisa Dodson and Mary King point out: “By some estimates, as many as 4.5 million child-care ‘slots’ may be permanently lost and as many as 40 percent of child-care providers say they will never reopen.”  The lack of child care is greatly hindering our recovery from the pandemic.  Women suffered far greater job losses than men during 2020, including as child-care workers, and the child-care crisis has made it difficult for many working mothers to return to the labor force.  The cost goes beyond the immediate family hardship from lost income; there is strong evidence that a sustained period without work, the so-called employment gap, will result in significantly lower lifetime earnings and reduced retirement benefits.  

To his credit, President Biden has recognized the importance of strengthening our care economy.  His proposed American Families Plan includes some $225 billion in tax credits to help make child care more affordable for working families.  According to a White House fact sheet, families would “receive a tax credit for as much as half of their spending on qualified child care for children under age 13, up to a total of $4,000 for one child or $8,000 for two or more children. . . . The credit can be used for expenses ranging from full-time care to after school care to summer care.”

But tax credits don’t ensure the existence of convenient, affordable, high-quality child-care facilities staffed by well-paid and trained child-care providers.  And if that is what we really want, we will need to directly provide it.  That is what the government did during World War II.  While its program was far from perfect, in part because it was designed to be short-term, it provides an example of the type of strong, bold action we will need to overcome our current child-care crisis. 

Federal support for child care

During World War II the United States government financed a heavily-subsidized child-care program.  From August 1943 through February 1946, the Federal Works Agency (FWA), using Lanham Act funds, provided some $52 million in grants for child-care services (equal to more than $1 billion today) to any approved community group that could demonstrate a war-related need for the service.  At its July 1944 peak, 3,102 federally subsidized child-care centers, with some 130,000 children enrolled, operated throughout the country.  There was at least one center in every state but New Mexico, which decided against participation in the program.  By the end of the war, between 550,000 and 600,000 children received some care from Lanham Act funded child-care programs.  

Communities were allowed to use the federal grant money to cover most of the costs involved in establishing and running their centers, including facilities construction and upkeep, staff wages and most other daily operating costs.  They were required to provide some matching funds, most of which came from fees paid by the parents of children enrolled in the program.  However, these fees were capped. In the fall of 1943, the FWA established a ceiling on fees of 50 cents per child per day (about $7 now), which was raised to 75 cents in July 1945. And those fees included snacks, lunch, and in some cases dinner as well. Overall, the federal subsidy covered two-thirds of the total maintenance and operation of the centers.

The only eligibility requirement for enrollment was a mother’s employment status: she had to be working at a job considered important to the war effort, and this was not limited to military production. Center hours varied, but many accommodated the round-the-clock manufacturing schedule, staying open 24 hours a day, 6 days a week. 

The centers served preschoolers (infants, toddlers, and children up to 5 years of age) and school-age children (6 to 14 years of age). In July 1944, approximately 53,000 preschoolers and 77,000 school-age children were enrolled.  School-age enrollment always climbed during summer vacation.  However, in most months, preschoolers made up the majority of the children served by Lanham Act-funded centers. Enrollment of preschoolers peaked at some 74,000 in May 1945. 

Some 90 percent of the centers were housed in public schools, with newly constructed housing projects providing the next most used location. Although local school boards were free to decide program standards–including staff-child ratios, worker qualifications, and facility design–state boards of education were responsible for program supervision. The recommended teacher-child ratio was 10-to-1, and most centers complied.  According to Chris M. Herbst,

Anecdotal evidence suggests that preschool-aged children engaged in indoor and outdoor play; used educational materials such as paints, clay, and musical instruments; and took regular naps. . . . Programs for school-aged children included . . . outdoor activities, participation in music and drama clubs, library reading, and assistance with schoolwork.

Children at a child-care center sit for “story time.” (Gordon Parks / Library of Congress / The Crowley Company)

While quality did vary–largely the result of differences in community support for public child care, the willingness of cities to provide additional financial support, and the ability of centers to hire trained professionals to develop and oversee program activities–the centers did their best to deliver a high-quality childhood education.  As Ruth Pearson Koshuk, the author of a 1947 study of the developmental records of 500 children, 2 to 5 years of age, at two Los Angeles County centers, describes:

In these two . . . schools, as elsewhere, the program has developed since 1943, toward recognized standards of early childhood education. The aim has been to apply the best of existing standards, and to maintain as close contact with the home as possible. In-service training courses carrying college credit have been given, for the teaching staff, and a mutually helpful parent education program carried on in spite of difficulties inherent in a child care situation.

There has been a corresponding development in the basic records. A pre-entrance medical examination has been required by state law since the first center opened. In December 1943 a developmental record was added, which is filled out by the director during an unhurried interview with the mother just before a child enters. One page is devoted to infancy experience; the four following cover briefly the child’s development history, with emphasis on emotional experience, behavior problems he has presented to the parents, if any, and the control methods used, as well as the personal-social behavior traits which they value and desire for the child. After entrance, observational notes and semester reports are compiled by the teachers. Intelligence testing has been limited to cases where it seemed especially indicated. A closing record is filled out, in most cases, by the parent when a child is withdrawn. These records are considered a minimum. They have proved indispensable as aids to the teachers in guiding the individual children and as a basis for conferences on behavior in the home.

A 2013 study of the long-term effects on mothers and children from use of Lanham centers found a substantial increase in maternal employment, even five years after the end of the program, and “strong and persistent positive effects on well-being” for their children.

In short, despite many shortcomings, these Lanham centers, as Thalia Ertman sums up,

broke ground as the first and, to date, only time in American history when parents could send their children to federally-subsidized child care, regardless of income, and do so affordably. . . .

Additionally, these centers are seen as historically important because they sought to address the needs of both children and mothers. Rather than simply functioning as holding pens for children while their mothers were at work, the Lanham child care centers were found to have a strong and persistent positive effect on the well-being of children.

The federal government also supported some private employer-sponsored child care during the war. The best-known example is the two massive centers built by the Kaiser Company in Portland, Oregon to provide child care for the children of workers at its Portland Yards and Oregon Shipbuilding Corporation. The centers were located right at the front of the shipyards, making it easy for mothers to drop their children off and pick them up, and were operated on a 24-hour schedule.  They were also large, each caring for up to 1,125 children between 18 months and 6 years of age. The centers had their own medical clinic, cafeteria, and large play areas, and employed highly trained staff.  Parents paid $5 for a six-day week for one child and $3.75 for each additional child.  For a modest additional fee, the centers also prepared a dinner for parents to pick up at the end of their working day.

While the Kaiser Company received much national praise as well as appreciation from its employees with young children, these centers were largely paid for by the government.  Government funds directly paid for their construction, and a majority of the costs of running the center, including staff salaries, were included in the company’s cost-plus contracting with the military.

Political dynamics

There was considerable opposition to federal financing of group child care, especially for children younger than 6 years of age.  The sentiment is captured in this quote from a 1943 New York Times article: “The worst mother is better than the best institution when it is a matter of child care, Mayor La Guardia declared.”  Even the War Manpower Commission initially opposed mothers with young children working outside the home, even in service of the war effort, stating that “The first responsibility of women with young children, in war as in peace, is to give suitable care in their own homes to their children.”

But on-the-ground realities made this an untenable position for both the government and business. Women sought jobs, whether out of economic necessity or patriotism.  The government, highlighted by its Rosie the Riveter campaign, was eager to encourage their employment in industries producing for the war effort.  And, despite public sentiment, a significant number of those women were mothers with young children. 

Luedell Mitchell and Lavada Cherry working at a Douglas Aircraft plant in El Segundo, Calif. Circa 1944. Credit: National Archives photo no. 535811

The growing importance of women in the workplace, and especially mothers with young children, is captured in employment trends in Portland, Oregon.  Women began moving into the defense workforce in great numbers starting in 1942, with the number employed in local war industries climbing from 7,000 in November 1942 to 40,000 in June 1943.  An official with the state child-care committee reported that “a check of six shipyards reveals that the number of women employed in the shipyards has increased 25 percent in one month and that the number is going to increase more rapidly in the future.” 

The number of employed mothers was also rapidly growing.  According to the Council of Social Agencies, “Despite the recommendations of the War Manpower Commission . . . thousands of young mothers in their twenties and thirties have accepted jobs in war industries and other businesses in Multnomah County. Of the 8,000 women employed at the Oregon Shipyards in January, 1943, 32 percent of them had children, 16 percent having pre-school children.”

Portland was far from unique.  During the war, for the first time, married women workers outnumbered single women workers.  Increasingly, employers began to recognize the need for child care to address absenteeism problems.  As a “women’s counselor” at the Bendix Aviation Corporation in New Jersey explained to reporters in 1943, child care was one of the biggest concerns for new hires. “We feel a mother should be with her small baby if possible. But many of them have to come back. Their husbands are in the service and they can’t get along on his allotment.”  Media stories, many unsubstantiated, of children left in parked cars outside workplaces or fending for themselves at home, also contributed to a greater public acceptance of group child care.

An image of Rosie the Riveter that appeared in a 1943 issue of the magazine Hygeia

Finally, the government took action.  The Federal Works Agency was one of two new super agencies established in 1939 to oversee the large number of agencies created during the New Deal period.  In 1940 President Roosevelt signed into law the Lanham Act, which authorized the FWA to fund and supervise the construction of needed public infrastructure, such as housing, hospitals, water and sewer systems, police and firefighting facilities, and recreation centers, in communities experiencing rapid growth because of the defense buildup. In August 1942, the FWA decided, without any public debate, that public infrastructure also meant child care, and it began its program of support for the construction and operation of group child-care facilities.

The Federal Security Agency, the other super agency, whose oversight responsibilities included the Children’s Bureau and the U.S. Office of Education, opposed the FWA’s new child-care initiative.  It did so not only because it believed that child care fell under its mandate, but also because the leadership of the Children’s Bureau and Office of Education opposed group child care.  The FWA won the political battle, and in July 1943, Congress authorized additional funding for the FWA’s child-care efforts.

And, as William M. Tuttle, Jr. describes, public pressure played an important part in the victory:

the proponents of group child care organized a potent lobbying effort. The women’s auxiliaries of certain industrial unions, such as the United Electrical Workers and the United Auto Workers, joined with community leaders and FWA officials in the effort. Also influential were the six women members of the House of Representatives. In February 1944, Representative Mary T. Norton presented to the House “a joint appeal” for immediate funds to expand the wartime child day care program under the FWA.

Termination and a step back

Congressional support for group child care was always tied to wartime needs, a position shared by most FWA officials.  The May 1945 Allied victory in Europe brought a drop in war production, and a reduction in FWA community child care approvals and renewals.  In August, after the Japanese surrender brought the war to a close, the FWA announced that it would end its funding of child-care centers as soon as possible, but no later than the end of October 1945.

Almost immediately thousands of individuals wrote letters, sent wires, and signed petitions calling for the continuation of the program.  Officials in California, the location of many war-related manufacturing sites and nearly 25 percent of all children enrolled in Lanham Act centers in August 1945, also weighed in, strongly supporting the call.  Congress yielded, largely influenced by the argument that since it would be months before all the “men” in the military returned to the country, mothers had no choice but to continue working and needed the support of the centers to do so.  It approved new funds, but only enough to keep the centers operating until the end of February 1946.

The great majority of centers closed soon after the termination of federal support, with demonstrations following many of the closings.  The common assumption was that women would not mind the closures, since most would be happy to return to homemaking.  Many women were, in fact, forced out of the labor force, disproportionately suffering from post-war industrial layoffs.  But by 1947, women’s labor force participation was again on the rise, and a new push began for a renewal of federal support for community child-care centers. Unfortunately, the government refused to change its position. During the Korean War, Congress did approve a public child-care bill, but then refused to authorize any funding.

After WWII, parents organized demonstrations, like this one in New York on Sept. 21, 1947, calling for the continuing funding of the centers. The city’s welfare commissioner dismissed the protests as “hysterical.” Credit: The New York Times

Finally, in 1954, as Sonya Michel explains, “Congress found an approach to child care it could live with: the child-care tax deduction.”  While the child-care tax deduction did offer some financial relief to some families, it did nothing to ensure the availability of affordable, high-quality child care.  The history of child care during World War II makes clear that this turn to market-based tax policy to solve child-care problems represented a big step back for working women and their children.  And this was well understood by most working people at the time. 

Sadly, this history has been forgotten, and Biden’s commitment to expand the child-care tax credit is now seen as an important step forward.  History shows we can and need to do better.

Realizing a Green New Deal: Lessons from World War II

Many activists in the United States support a Green New Deal transformation of the economy in order to tackle the escalating global climate crisis and the country’s worsening economic and social problems.  At present, the Green New Deal remains a big tent idea, with advocates continuing to debate what it should include and even its ultimate aims.[1]  Although perhaps understandable given this lack of agreement, far too little attention has been paid to the process of transformation.  That is concerning, because it will be far from easy.

One productive way for us to sharpen our thinking about the transformation is to study the World War II-era mobilization process. Then, the U.S. government, facing remarkably similar challenges to the ones we are likely to confront, successfully converted the U.S. economy from civilian to military production in a period of only three years.

It is easy to provide examples of some of the challenges that await us.  All Green New Deal proposals call for a sharp decrease in fossil fuel production, which will dramatically raise fossil fuel prices.  The higher cost of fossil fuels will significantly raise the cost of business for many industries, especially air travel, tourism, and the aerospace and automobile industries, triggering significant declines in demand and reductions in their output and employment.   We will need to develop a mechanism that allows us to humanely and efficiently repurpose the newly created surplus facilities and provide alternative employment for released workers.

New industries, especially those involved in the production of renewable energy, will have to be rapidly developed.  We will need to create agencies capable of deciding the speed of their expansion as well as who will own the new facilities, how they will be financed, and how best to ensure that the materials they require will be produced in sufficient quantities and made available at the appropriate time. We will also have to develop mechanisms for deciding where the new industries will be located and how to develop the necessary social infrastructure to house and care for the required workforce.

We will also need to ensure the rapid and smooth expansion of facilities capable of producing mass transit vehicles and a revitalized national rail system.  We will need to organize the retrofitting of existing buildings, both office and residential, as well as the training of workers and the production of required equipment and materials.  The development of a new universal health care system will also require the planning and construction of new clinics and the development of new technologies and health practices.  In sum, a system-wide transformation involves a lot of moving parts that have to be managed and coordinated.

While it would be a mistake to imagine that the U.S. wartime experience can provide a readymade blueprint for the economic conversion we seek, there is much we can learn, both positive and negative, from it.  In what follows, I first highlight some of the key lessons and then conclude with a brief discussion of the relevance of the World War II experience to our current efforts to transform the U.S. economy.

1. A rapid, system-wide conversion of the U.S. economy is possible 

The primary driver of the wartime conversion was the enormous increase in military spending over the years 1940-1943.  Military spending grew by an incredible 269.3 percent in 1941, 259.7 percent in 1942, and 99.5 percent in 1943.  As a consequence, military spending as a share of GDP rose from 1.6 percent in 1940 to 32.2 percent in 1943.  That last year, federal spending hit a record high of 46.6 percent of GDP and remained at over 41 percent of GDP in each of the following two years.[2] 

The results were equally impressive: the combined output of the war-related manufacturing, mining, and construction industries doubled between 1939 and 1944.[3] In 1943 and 1944 alone, the United States was responsible for approximately 40 percent of all the munitions produced during World War II. 

This record has led many to call what was accomplished a “production miracle.”  However, a more complete assessment of the period tells a different story.  For example, there is little difference between the years 1921-1924 and 1941-1944 in either the growth of industrial production or the growth in real gross nonfarm product.[4]

Paul A. C. Koistinen casts further doubt on production miracle claims, pointing out that:

When placed in the proper context, the American production record does not appear exceptional, unless the characterization applies to all other belligerents. Gauged by the percentage distribution of the world’s manufacturing production for the period 1926-1929, the United States in the peak year 1944 was producing munitions at almost exactly the level it should have been.  Great Britain is modestly high, Canada low, Germany high, Japan very high, and the Soviet Union spectacularly high.[5]

The explanation for these two significantly different views of the period is that the transformation involved far more than the increase in military spending.  There was also the curtailment or outright suppression of the production of many industries, the rationing of limited supplies of many goods, and the development and production of entirely new goods and services.  For example, civilian automobile production was stopped, tires and food were rationed, and synthetic rubber was created and produced in significant amounts.  Between 1940 and 1944, the total production of non-war goods and services actually fell by more than 10 percent, from $221.7 billion to $198.9 billion (in 1958 dollars).[6]

In other words, the tremendous gains in U.S. military production were achieved, and in a relatively short period of time, not because of some impossible-to-repeat production miracle, but because a government-directed mobilization succeeded in fully employing the country’s resources while shifting their use from civilian to military purposes.

2. State capacities and action matter

The economy’s successful transformation demonstrates the critical importance of state planning, public financing and ownership, and state direction of economic activity.  Mobilization officials faced two major tasks. The first was to quickly expand the economy’s capacity to produce the weapons and supplies required by the military.  The second was to manage the scarcities of critical materials and components caused by the rapid pace of the mobilization. 

The first task was made significantly more difficult by a lack of corporate support.  Most corporations were reluctant to undertake the massive expansion in plant and equipment required to achieve the desired boost in military production. In fact, private investment actually fell in value over the years 1941-43.  It was the federal government, using a variety of new policy initiatives, that provided the solution.

One of the most important initiatives was the creation of the Defense Plant Corporation (DPC). In May 1940, Congress passed a series of amendments which allowed the still operating depression-era Reconstruction Finance Corporation (RFC) to create new subsidiaries “with such powers as it may deem necessary to aid the Government of the United States in its national defense program.”  The DPC was one of those new subsidiaries.

Since the RFC had independent borrowing authority, the DPC was able to directly finance the expansion of facilities deemed critical to the military buildup without needing Congressional approval.  The DPC kept ownership of the new facilities it financed, but it planned their construction with, and then leased them for a minimal fee to, predetermined contractors who would operate them. The DPC eventually financed and owned some one-third of all the plant and equipment built during the war.

By its termination at the end of June 1945, the DPC:

owned approximately 96 per cent of the capacity of the synthetic-rubber industry, 90 per cent of magnesium metal, 71 per cent of aircraft and aircraft engines, and 58 per cent of the aluminum metal industry. It also had sizeable investments in iron and steel, aviation gasoline, ordnance, machinery and machine tool, transportation, radio, and other more miscellaneous facilities.[7]

The DPC supported facilities expansion in other ways too.  Responding to concerns of shortages in machine tools and the industry’s reluctance to boost capacity to produce them, the DPC began a machine tools pool program.  The DPC gave machine tool producers a 30 percent advance to begin production.  If the producers found a private buyer, they returned the advance.  If they found no buyer, the DPC would pay them full price and put the machine tool in storage for later sale.  This program proved remarkably successful in boosting machine tool production and, with machine tools readily available, speeding up weapons production.[8]

The second task, the timely delivery of scarce materials to military and essential civilian producers, was accomplished thanks to the efforts of the War Production Board (WPB), the country’s primary wartime mobilization agency.  In late 1942, after considerable experimentation, it launched its Controlled Materials Plan (CMP).  The plan required key claimants, such as the Army, the Navy, and the Maritime Commission, to provide detailed descriptions of their projected programs and the quantities of essential controlled metals required to realize them, with a monthly production schedule for the upcoming year.  The WPB industry divisions responsible for these metals would then estimate their projected supply and decide the amount of each metal to be allocated to each claimant following WPB policy directives.  The claimants would then adjust their programs accordingly and assign their metal shares to their prime contractors who were then responsible for assigning supplies to their subcontractors. 

When, over time, a shortage of components replaced the shortage of metals as the most serious bottleneck to military production, the WPB introduced another program.  The newly established Production Executive Committee created a list of 34 critical components.  One of its subcommittees, working in concert with the CMP process, would then arrange for essential manufacturers to receive all their required scarce materials and components. 

3. Flexibility is important

Flexibility in both planning structures and mobilization policies was critical to the success of the conversion.  President Roosevelt began the mobilization process in May 1940, with an executive order reactivating the World War I-era National Defense Advisory Commission (NDAC).  In December 1940, he replaced the NDAC with the Office of Production Management (OPM). Then, in August 1941, he created the Supply Priorities and Allocation Board (SPAB) and placed it over the OPM with the charge of developing a long-term mobilization strategy and overseeing OPM’s work.  And finally, again using an executive order, he established the War Production Board (WPB) in January 1942, replacing both the OPM and the SPAB.

All three agencies, the NDAC, OPM, and WPB, relied heavily on divisions overseeing industrial sections to carry out their responsibilities.  The NDAC had 7 divisions: Industrial Production, Industrial Materials, Labor, Price Stabilization, Farm Products, Transportation, and Consumer Protection.  The first two were the most important.

The Industrial Production Division had 8 sections, the most important being aircraft; ammunition and light ordnance; and tanks, trucks, and tractors. The Industrial Materials Division had three subdivisions, each with its own sections: the mining and minerals products subdivision had sections for iron and steel, copper, aluminum, and tin; the agricultural and forest products subdivision had sections for textiles, leather, paper, rubber, and the like; and the chemical and allied products subdivision had sections for petroleum, nitrogen, etc.

Each division, subdivision, and section had an appointed head, and each section head had an industry advisory committee to assist them. The divisions, subdivisions, and sections were responsible, as appropriate, for assessing the industrial capacities of their respective industries to meet present and projected military needs, facilitating military procurement activity, and assisting with plant expansion plans and the priority distribution and allocation of scarce goods.

When Roosevelt felt that an existing mobilization agency was not up to the task of furthering the war effort, he replaced it.  Accordingly, each new mobilization agency had a more centralized decision-making structure, broader responsibilities, and greater authority over private business decisions than its predecessor. 

Thus, the OPM, reflecting a different stage in the mobilization, was more narrowly focused on production and had only four divisions: Production Division, Purchases Division, Priorities Division, and Labor Division.  Later, in recognition of the spillover effects of military production on civilian production, the Civilian Supply Division was added and given responsibility for all industries producing 50 percent or less for the defense program. 

The WPB had six divisions: Production Division, Materials Division, Division of Industry Operations, Purchases Division, Civilian Supply Division, and Labor Division.  The newly created Division of Industry Operations included all nonmunitions-producing industries and had responsibility for promoting the conversion of industries to military production and for maximizing the flow of materials, equipment, and workers to essential producers.   

4.  Conversion means conflict

Powerful corporations and the military opposed policies that threatened their interests even when those policies benefitted the war effort.  Corporations producing goods of direct importance to the military often refused to undertake needed investments.  Corporations producing for the civilian market routinely ignored agency requests that they curtail or convert their production to economize on the nonmilitary use of scarce materials.

By late 1940, this corporate resistance had begun to cause shortages, especially of strategic materials.  Aluminum was one of those materials and Alcoa, the only major producer of the metal, aggressively resisted expanding its production capacity even though a lack of aluminum was causing delays in military aircraft production.  A similar situation existed with steel, with steel executives arguing that there was no need for capacity expansion while critical activities such as ship building and railroad car manufacturing ground to a halt because of a lack of supply.[9]

This growing shortage problem, and its threat to the military buildup, could have been minimized if large producers of consumer durables had been willing to either reduce their production or convert to military production. But almost all of them rebuffed NDAC entreaties. They were enjoying substantial profits for the first time in years and were unwilling to abandon their civilian markets. 

The industry that drew the most criticism because of its heavy resource use was the automobile industry. In 1939, the automobile industry “absorbed 18 percent of total national steel output, 80 percent of rubber, 34 percent of lead, nearly 10-14 percent of copper, tin, and aluminum, and 90 percent of gasoline. Throughout 1940 and 1941, automobile production went up, taking proportionately even more materials and products indispensable for defense preparation.”[10]

In some cases, this corporate opposition to policies that threatened their profits lasted deep into the war years, with some firms objecting not only to undertaking their own expansion but to any government financed expansion as well, out of fear of post-war overproduction and/or loss of market share.  This stance is captured in the following exchange between Senator E. H. Moore of Oklahoma and Interior Secretary and Petroleum Administrator for War Harold L. Ickes at a February 1943 Congressional hearing over the construction of a federally financed petroleum pipeline from Texas to the East Coast:

Secretary Ickes. I would like to say one thing, however. I think there are certain gentlemen in the oil industry who are thinking of the competitive position after the war.

The Chairman. That is what we are afraid of, Mr. Secretary.

Secretary Ickes. That’s all right. I am not doing that kind of thinking.

The Chairman. I know you are not.

Secretary Ickes. I am thinking of how best to win this war with the least possible amount of casualties and in the quickest time.

Senator Moore. Regardless, Mr. Secretary, of what the effect would be after the war? Are you not concerned with that?

Secretary Ickes. Absolutely.

Senator Moore. Are you not concerned with the economic situation with regard to existing conditions after the war?

Secretary Ickes. Terribly. But there won’t be any economic situation to worry about if we don’t win the war.

Senator Moore. We are going to win the war.

Secretary Ickes. We haven’t won it yet.

Senator Moore. Can’t we also, while we are winning the war, look beyond the war to see what the situation will be with reference to –

Secretary Ickes (interposing). That is what the automobile industry tried to do, Senator. It wouldn’t convert because it was more interested in what would happen after the war. That is what the steel industry did, Senator, when it said we didn’t need any more steel capacity, and we are paying the price now. If decisions are left with me, it is only fair to say that I will not take into account any post-war factor—but it can be taken out of my hands if those considerations are paid attention to.[11]

Military procurement agencies, determined to maintain their independence, also greatly hindered government efforts to ensure a timely flow of resources to essential producers by actively opposing any meaningful oversight or regulation of their activities. Most importantly, the procurement agencies refused to adjust their demand for goods and services to the productive capacity of the economy. Demanding more than the economy could produce meant that shortages, dislocations, and stockpiling were unavoidable. The Joint Chiefs of Staff actually ignored several WPB requests to form a joint planning committee. 

David Kennedy provides a good sense of what was at stake:

As money began to pour into the treasury, contracts began to flood out of the military purchasing bureaus—over $100 billion worth in the first six months of 1942, a stupefying sum that exceeded the value of the entire nation’s output in 1941 . . . Military orders became hunting licenses, unleashing a jostling frenzy of competition for materials and labor in the jungle of the marketplace.  Contractors ran riot in a cutthroat scramble for scarce resources.[12]

It took until late 1942 for the WPB to win what became known as the “feasibility dispute,” after which the military’s procurement agencies grudgingly took the economy’s ability to produce into account when making their procurement demands.

5. Class matters

Leading corporations and their executives took advantage of every opportunity to shape the wartime mobilization process and strengthen their post-war political and economic power.  Many of the appointed section heads responsible for implementing mobilization policies were so-called “dollar-a-year men” who remained employed by the very firms they were supposed to oversee.  And most of these section heads relied on trade association officials as well as industry advisory committees to help them with their work.  In some cases, trade association officials themselves served as section heads of the industries they were hired to represent.  These appointments gave leading corporations an important voice in decisions involving the speed and location of new investments, the timing and process of industry conversions, procurement contract terms and procedures, the use of small businesses as subcontractors, the designation of goods as scarce and thus subject to regulation, the role of unions in shopfloor production decisions, and labor allocation policies.

NDAC officials initially welcomed the participation of dollar-a-year men on the grounds that business executives knew best how to organize and maximize production. However, they soon found these executives speaking out against agency policies in defense of corporate interests.  In response, the OPM created a Legal Division and empowered it to write and implement regulations designed to limit their number and power, but to little avail.  As the agency’s responsibilities grew, so did the number of dollar-a-year men working for it.

Little changed under the WPB.  In fact, between January and December 1942, their number grew from 310 to a wartime high of 805, driven in large part by the explosion in the number of industry advisory committees.[13] The WPB’s continued dependence on these nominally paid business executives was a constant source of concern in Congress.

Corporate leaders also never lost sight of what was to them the bigger picture, the post-war balance of class power.  Thus, from the very beginning of the wartime mobilization, they actively worked to win popular identification of democracy with corporate freedom of action and totalitarianism with government planning and direction of economic activity.

As J.W. Mason illustrates:

Already by 1941, government enterprise was, according to a Chamber of Commerce publication, “the ghost that stalks at every business conference.” J. Howard Pew of Sun Oil declared that if the United States abandoned private ownership and “supinely reli[es] on government control and operation, then Hitlerism wins even though Hitler himself be defeated.” Even the largest recipients of military contracts regarded the wartime state with hostility. GM chairman Alfred Sloan—referring to the danger of government enterprises operating after war—wondered if it is “not as essential to win the peace, in an economic sense, as it is to win the war, in a military sense,” while GE’s Philip Reed vowed to “oppose any project or program that will weaken” free enterprise.[14]

Throughout the war, business leaders and associations “flooded the public sphere with descriptions of the mobilization effort in which for-profit companies figured as the heroic engineers of a production ‘miracle’.”  For example, Boeing spent nearly a million dollars a year on print advertising in 1943-45, almost as much as it set aside for research and development.

The National Association of Manufacturers (NAM) was one of the most active promoters of the idea that it was business, not government, that was winning the war against state totalitarianism.  It did so by funding a steady stream of films, books, tours, and speeches.  Mark R. Wilson describes one of its initiatives:

One of the NAM’s major public-relations projects for 1942, which built upon its efforts in radio and print media, was its “Production for Victory” tour, designed to show that “industry is making the utmost contributions toward victory.” Starting the first week in May, the NAM paid for twenty newspaper reporters to take a twenty-four-day, fifteen-state trip during which they visited sixty-four major defense plants run by fifty-eight private companies. For most of May, newspapers across the country ran daily articles related to the tour, written by the papers’ own reporters or by one of the wire services. The articles’ headlines included “Army Gets Rubber Thanks to Akron,” “General Motors Plants Turning Out Huge Volume of War Goods,” “Baldwin Ups Tank Output,” and “American Industry Overcomes a Start of 7 Years by Axis.”[15]

The companies and reporters rarely mentioned that almost all of these new plants were actually financed, built, and owned by the government, or that it was thanks to government planning efforts that these companies received needed materials on a timely basis and had well-trained and highly motivated workers.  Perhaps not surprisingly, government and union efforts to challenge the corporate story were never as well funded, sustained, or shaped by as clear a class perspective.[16]  As a consequence, they were far less effective.

6. Final thoughts

Although the World War II-era economic transformation cannot and should not serve as a model for a Green New Deal transformation of the U.S. economy, it does provide lessons that deserve to be taken seriously.  Among the most important is that a rapid, system-wide transformation, such as the one a Green New Deal requires, can be achieved in a timely manner.  It will take the development of new state capacities and flexible policies.  And we should be prepared, from the beginning, for our efforts to create a more socially just and environmentally sustainable economy to be met by sophisticated opposition from powerful corporations and their allies.

The conversion history also points to some of our biggest challenges. Germany’s military victories in Europe as well as Japan’s direct attack on the United States encouraged popular support for state action to convert the economy from civilian to military production. In sharp contrast, widespread support for state action to combat climate change or restrict corporate freedom of action does not yet exist. Even now, there are many who deny the reality of climate change.  There is also widespread doubt about the ability of government to solve problems. This means we have big work ahead to create the political conditions supportive of decisive action to transform our economy.

Perhaps equally daunting, we have no simple equivalent to the military during World War II to drive a Green New Deal transformation.  The war-time mobilization was designed to meet the needs of the military.  Thus, the mobilization agencies generally treated military procurement demands as marching orders.  In contrast, a Green New Deal transformation will involve changes to many parts of our economy, and our interest in a grassroots democratic restructuring process means there needs to be popular involvement in shaping the transformation of each part, as well as the connections between them. Thus, we face the difficult task of creating the organizational relationships and networks required to bring together leading community representatives, and produce, through conversation and negotiation, a broad roadmap of the process of transformation we collectively seek.

And finally, we must confront a corporate sector that is far more powerful and popular now than it was during the war years.  And thanks to the current freedom corporations enjoy to shift production and finance globally, they have a variety of ways to blunt or undermine state efforts to direct their activities.

In sum, achieving a Green New Deal transformation will be far from easy. It will require developing a broad-based effort to educate people about how capitalism is driving our interrelated ecological and economic crises, building a political movement for system-wide change anchored by a new ecological understanding and vision, and creating the state and community-based representative institutions needed to initiate and direct the desired Green New Deal transformation. 

It is that last task that makes a careful consideration of the World War II-era conversion so valuable.  By studying how that rapid economy-wide transformation was organized and managed, we are able to gain important insights into, and the ability to prepare for, some of the challenges and choices that await us on the road to the new economy we so badly need.

Notes

[1] These include debates over the speed of change, the role of public ownership, and the use of nuclear power for energy generation.  There are also environmentalists who oppose the notion of sustained but sustainable growth explicitly embraced by many Green New Deal supporters and argue instead for a policy of degrowth, or a “Green New Deal without growth.”

[2] Christopher J. Tassava, “The American Economy during World War II,” EH.Net Encyclopedia, edited by Robert Whaples, February 10, 2008.

[3] Harold G. Vatter, The U.S. Economy in World War II (New York: Columbia University Press, 1985), 23.

[4] Vatter, The U.S. Economy in World War II, 22.

[5] Paul A. C. Koistinen, Arsenal of World War II, The Political Economy of American Warfare 1940-1945 (Lawrence, Kansas: University Press of Kansas, 2004), 498.

[6] Hugh Rockoff, “The United States: From Ploughshares to Swords,” in Mark Harrison, editor, The Economics of World War II (New York: Cambridge University Press, 1998), 83.

[7] Gerald T. White, “Financing Industrial Expansion for War: The Origin of the Defense Plant Corporation Leases,” The Journal of Economic History, Vol. 9, No. 2 (November, 1949), 158.

[8] Andrew Bossie and J.W. Mason, “The Public Role in Economic Transformation: Lessons from World War II,” The Roosevelt Institute, March 2020, 9-10.

[9] Maury Klein, A Call to Arms, Mobilizing America for World War II (New York: Bloomsbury Press, 2013), 165.

[10] Koistinen, Arsenal of World War II, 130.

[11] As quoted in Vatter, The U.S. Economy in World War II, 24-25.

[12] As quoted in Klein, A Call to Arms, 376.

[13] Koistinen, Arsenal of World War II, 199.

[14] J.W. Mason, “The Economy During Wartime,” Dissent Magazine, Fall 2017.

[15] Mark R. Wilson, Destructive Creation, American Business and the Winning of World War II (Philadelphia: University of Pennsylvania Press, 2016), 102.

[16] Union suggestions for improving the overall efficiency of the mobilization effort as well as their offers to join with management in company production circles were routinely rejected. See Martin Hart-Landsberg, “The Green New Deal and the State, Lessons from World War II,” Against the Current, No. 207 (July/August 2020); Paul A. C. Koistinen, “Mobilizing the World War II Economy: Labor and the Industrial-Military Alliance,” Pacific Historical Review, Vol. 42, No. 4 (November 1973); and Nelson Lichtenstein, Labor’s War at Home, The CIO in World War II (New York: Cambridge University Press, 1982).

The latest argument against federal relief: business claims that workers won’t work

A growing number of business and political leaders have found yet another argument to use against federal pandemic relief programs, especially those that provide income support for workers: they hurt the economic recovery by encouraging workers not to work.

In the words of Senate Minority Leader Mitch McConnell, as reported by Business Insider:

“We have flooded the zone with checks that I’m sure everybody loves to get, and also enhanced unemployment,” McConnell said from Kentucky. “And what I hear from businesspeople, hospitals, educators, everybody across the state all week is, regretfully, it’s actually more lucrative for many Kentuckians and Americans to not work than work.”

He went on: “So we have a workforce shortage and we have rising inflation, both directly related to this recent bill that just passed.”

In line with business claims that they can’t find willing workers despite their best efforts at recruitment, the governors of Montana, South Carolina, Alabama, Arkansas, and Mississippi have all announced that they will no longer allow the unemployed in their respective states to collect the $300-a-week federal supplemental unemployment benefit and will once again require that those receiving unemployment benefits demonstrate they are actively looking for work.

In reality there is little support for the argument that expanded unemployment benefits have created an overly worker-friendly labor market, leaving companies unable to hire and, by extension, meet growing demand.  But of course, if enough people accept the argument, corporate leaders and their political allies will have achieved their shared goal, which is to weaken worker bargaining power as corporations seek to position themselves for a profitable post-pandemic economic recovery.

Wage trends

If companies were aggressively seeking workers, we would expect to see the resulting competition push up wages.  The following figure shows year-over-year real weekly earnings of production and nonsupervisory workers—approximately 85 percent of the workforce.  As we can see, those earnings were actually lower in April 2021 than they were in April 2020. 

In short, companies may want more workers, but it is hard to take their cries of anguish seriously if they remain unwilling to offer higher real wages to attract them.  Real average weekly earnings of production and nonsupervisory workers in April 2021 stood at roughly $875.  Multiplying weekly earnings by 50 gives an estimated annual salary of $43,774.  That total is actually 5.7 percent below the similarly calculated peak in October 1972.
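As a rough check on that arithmetic, here is a minimal sketch; the unrounded weekly figure is my assumption, chosen only so the numbers line up with the rounded $875 and the $43,774 annual estimate cited above.

```python
# Back-of-envelope annualization of weekly earnings (illustrative only).
# The unrounded weekly figure is an assumption consistent with the rounded
# $875 and the $43,774 annual estimate cited in the text.
weekly_earnings_apr_2021 = 875.48   # assumed real average weekly earnings
weeks_per_year = 50                 # the text multiplies by 50, not 52

annual_2021 = weekly_earnings_apr_2021 * weeks_per_year
print(f"Estimated annual earnings, April 2021: ${annual_2021:,.0f}")   # ~$43,774

# The text puts this 5.7 percent below the October 1972 peak, implying:
implied_1972_peak = annual_2021 / (1 - 0.057)
print(f"Implied October 1972 peak (same method): ${implied_1972_peak:,.0f}")   # ~$46,420
```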

Over the last three months, the only sector experiencing significant wage growth due to labor shortages is the leisure and hospitality sector (which includes arts, entertainment, and recreation as well as accommodation and food services).  Wages in that sector grew at an annualized rate of nearly 18 percent relative to the previous three months (the annualization arithmetic is sketched below, after the quote).  But, as Josh Bivens and Heidi Shierholz explain,

There is very little reason to worry that labor shortages in leisure and hospitality will soon spill over into other sectors and drive economywide “overheating.”  For example, jobs in leisure and hospitality have notably low wages and fewer hours compared to other sectors. Weekly wages of production and nonsupervisory workers in leisure and hospitality now equate to annual earnings of just $20,628, and total wages in leisure and hospitality account for just 4% of total private wages in the U.S. economy. . . . [Moreover] this sector seems notably segmented off from much of the rest of the economy.
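A minimal sketch of the annualization referenced above, assuming the 18 percent figure comes from compounding a three-month change over four quarters; the three-month rate below is backed out from 18 percent and is not itself a reported statistic.

```python
# Annualizing a three-month wage change by compounding it over four quarters.
# The implied three-month rate is derived from the ~18 percent annualized
# figure cited in the text; it is not a reported statistic.

annualized_rate = 0.18
implied_three_month = (1 + annualized_rate) ** 0.25 - 1
print(f"Implied three-month growth: {implied_three_month:.1%}")   # ~4.2%

# Going the other way: compounding the quarterly change over a full year.
reconstructed_annual = (1 + implied_three_month) ** 4 - 1
print(f"Annualized rate:            {reconstructed_annual:.0%}")   # 18%
```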

Job openings and labor turnover

The figure below, drawn from the Bureau of Labor Statistics’ Job Openings and Labor Turnover Survey (JOLTS), shows the monthly movement in job openings, hires, quits, and layoffs and discharges, with solid lines showing their six-month moving averages.   

As we can see, despite business complaints, monthly hiring (green line) still remains greater than during the last years of the pre-pandemic expansion.  And although job openings (blue line) are growing sharply while the number of hires is falling, the gap between openings and hires is still smaller than it was during the last years of the previous expansion.  In addition, the number of quits (light blue line), an indicator of labor market tightness, remains below the levels of the last years of the previous expansion and is fairly stable.  In short, there is nothing in the data to suggest that business is facing a dysfunctional labor market marked by an unreasonable worker unwillingness to work.

Even with the additional financial support in Biden’s American Rescue Plan, many workers and their families continue to struggle to afford food, housing, and health care.  Many workers remain reluctant to re-enter the labor market because of Covid-related health concerns and care responsibilities.  Moreover, as Heidi Shierholz points out

there are far more unemployed people than available jobs in the current labor market. In the latest data on job openings, there were nearly 40% more unemployed workers than job openings overall, and more than 80% more unemployed workers than job openings in the leisure and hospitality sector.

While there are certainly fewer people looking for jobs now than there would be if Covid weren’t a factor . . . without enough job openings to even come close to providing work for all job seekers, it again stretches the imagination to suggest that labor shortages are a core dynamic in the labor market.

We need to discredit this attempt by the business community and its political allies to generate opposition to policies that help workers survive this period of crisis and redouble our own efforts to strengthen worker rights and build popular support for truly transformative economic policies, ones that go beyond the stopgap fixes currently promoted.

Time to put the spotlight on corporate taxes

A battle is slowly brewing in Washington DC over whether to raise corporate taxes to help finance new infrastructure investments.  While higher corporate taxes cannot generate all the funds needed, the coming debate over whether to raise them gives us an opportunity to challenge the still strong popular identification of corporate profitability with the health of the economy and, by extension, worker wellbeing.

According to the media, President Biden’s advisers are hard at work on two major proposals with a combined $3 trillion price tag.  The first aims to modernize the country’s physical infrastructure and is said to include funds for the construction of roads, bridges, rail lines, ports, electric vehicle charging stations, and affordable and energy efficient housing as well as rural broadband, improvements to the electric grid, and worker training programs.  The second targets social infrastructure and would provide funds for free community college education, universal prekindergarten, and a national paid leave program. 

To pay for these proposals, Biden has been talking up the need to raise corporate taxes, at least to offset some of the costs of modernizing the country’s physical infrastructure.  Not surprisingly, Republican leaders in Congress have voiced their opposition to corporate tax increases.  And corporate leaders have drawn their own line in the sand.  As the New York Times reports:

Business groups have warned that corporate tax increases would scuttle their support for an infrastructure plan. “That’s the kind of thing that can just wreck the competitiveness in a country,” Aric Newhouse, the senior vice president for policy and government relations at the National Association of Manufacturers, said last month [February 2021].

Regardless of whether Biden decides to pursue his broad policy agenda, this appears to be a favorable moment for activists to take advantage of media coverage surrounding the proposals and their funding to contest these kinds of corporate claims and demonstrate the anti-working-class consequences of corporate profit-maximizing behavior.  

What do corporations have to complain about?

To hear corporate leaders talk, one would think that they have been subjected to decades of tax increases.  In fact, quite the opposite is true.  The figure below shows the movement in the top corporate tax rate.  As we can see, it peaked in the early 1950s and has been falling ever since, with a big drop in 1986, and another in 2017, thanks to Congressionally approved tax changes.

One consequence of this corporate friendly tax policy is, as the following figure shows, a steady decline in federal corporate tax payments as a share of GDP.  These payments fell from 5.6 percent of GDP in 1953 to 1.5 percent in 1982, and a still lower 1.0 percent in 2020.  By contrast there has been very little change in individual income tax payments as a share of GDP; they were 7.7 percent of GDP in 2020.

Congressional tax policy has certainly been good for the corporate bottom line.  As the next figure illustrates, both pre-tax and after-tax corporate profits have risen as a share of GDP since the early 1980s.  But the rise in after-tax profits has been the most dramatic, soaring from 5.2 percent of GDP in 1980 to 9.1 percent in 2019, before dipping slightly to 8.8 percent in 2020.   To put recent after-tax profit gains in perspective, the 2020 after-tax profit share is greater than the profit share in every year from 1930 to 2005.

What do corporations do with their profits?

Corporations claim that higher taxes would hurt U.S. competitiveness, implying that they need their profits to invest and keep the economy strong.  Yet, despite ever higher after-tax rates of profit, private investment in plant and equipment has been on the decline.

As the figure below shows, gross private domestic nonresidential fixed investment as a share of GDP has been trending down since the early 1980s.  It fell from 14.8 percent in 1981 to 13.4 percent in 2020.

Rather than investing in new plant and equipment, corporations have been using their profits to fund an aggressive program of stock repurchases and dividend payouts.  The figure below highlights the rise in corporate stock buybacks, which have helped drive up stock prices, enriching CEOs and other top wealth holders. In fact, between 2008 and 2017, companies spent some 53 percent of their profits on stock buybacks and another 30 percent on dividend payments.

It should therefore come as no surprise that CEO compensation is also exploding, with CEO-to-worker compensation growing from 21-to-1 in 1965, to 61-to-1 in 1989, 293-to-1 in 2018, and 320-to-1 in 2019.  As we see in the next figure, the growth in CEO compensation has actually been outpacing the rise in the S&P 500.

In sum, the system is not broken.  It continues to work as it is supposed to work, generating large profits for leading corporations that then find ways to generously reward their top managers and stockholders.  Unfortunately, investing in plant and equipment, creating decent jobs, or supporting public investment are all low on the corporate profit-maximizing agenda.  

Thus, if we are going to rebuild and revitalize our economy in ways that meaningfully serve the public interest, working people will have to actively promote policies that will enable them to gain control over the wealth their labor produces.  One example: new labor laws that strengthen the ability of workers to unionize and engage in collective and solidaristic actions.  Another is the expansion of publicly funded and provided social programs, including for health care, housing, education, energy, and transportation. 

And then there are corporate taxes.  Raising them is one of the easiest ways we have to claw back funds from the private sector to help finance some of the investment we need.  Perhaps more importantly, the fight over corporate tax increases provides us with an important opportunity to make the case that the public interest is not well served by reliance on corporate profitability.

The failings of our unemployment insurance system are there by design

Our unemployment insurance system has failed the country at a moment of great need.  With tens of millions of workers struggling just to pay rent and buy food, Congress was forced to pass two emergency spending bills, providing one-time stimulus payments, special weekly unemployment insurance payments, and temporary unemployment benefits to those not covered by the system.  And, because of their limited short-term nature, President Biden must now advocate for a third.

The system’s shortcomings have been obvious for some time, but little effort has been made to improve it.  In fact, those shortcomings were baked into the system at the beginning, as President Roosevelt wanted, not by accident.  While we must continue to organize to ensure working people are able to survive the pandemic, we must also start the long process of building popular support for a radical transformation of our unemployment insurance system.  The history of struggle that produced our current system offers some useful lessons.

Performance

Our unemployment insurance system was designed during the Great Depression.  It was supposed to shield workers and their families from the punishing costs of unemployment, thereby also helping to promote both political and economic stability.  Unfortunately, as Eduardo Porter and Karl Russell reveal in a New York Times article, that system has largely failed working people.

The chart below shows the downward trend in the share of unemployed workers receiving benefits and the replacement value of those benefits.  Benefits now replace less than one-third of prior wages, some eight percentage points below the level in the 1940s.  Benefits aside, it is hard to celebrate a system that covers fewer than 30 percent of those struggling with unemployment.
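To make the chart’s two measures concrete, here is a minimal sketch of how they are typically defined; all of the numbers below are hypothetical, chosen only to echo the rough magnitudes cited above (benefits below one-third of prior wages, fewer than 30 percent of the unemployed covered).

```python
# Two standard gauges of an unemployment insurance system (hypothetical inputs).
# Recipiency rate: share of the unemployed actually receiving benefits.
# Replacement rate: share of prior wages replaced by the weekly benefit.

unemployed = 10_000_000        # hypothetical number of unemployed workers
recipients = 2_800_000         # hypothetical number receiving benefits

prior_weekly_wage = 900.0      # hypothetical average prior weekly wage
weekly_benefit = 280.0         # hypothetical average weekly benefit

recipiency_rate = recipients / unemployed
replacement_rate = weekly_benefit / prior_weekly_wage

print(f"Recipiency rate:  {recipiency_rate:.0%}")    # ~28% of the unemployed covered
print(f"Replacement rate: {replacement_rate:.0%}")   # ~31% of prior wages replaced
```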

A faulty system

Although every state has an unemployment insurance system, they all operate independently.  There is no national system.  Each state separately generates the funds it needs to provide unemployment benefits and is largely free, subject to some basic federal standards, to set the conditions under which an unemployed worker becomes eligible to receive benefits, the waiting period before benefits will be paid, the length of time benefits will be paid, the benefit amount, and requirements to continue receiving benefits.

Payroll taxes paid by firms generate the funds used to pay unemployment insurance benefits.  The size of the tax owed depends on the amount of employee earnings that is made taxable (the base wage) and the tax rate.  States are free to set the base wage as they want, subject to a federally mandated floor of $7,000 that has been unchanged since 1983.  States are also free to set the tax rate as they want.  Not surprisingly, in the interest of supporting business profitability, states have generally sought to keep both the base wage and the tax rate low.  For example, Florida, Tennessee, and Arizona continue to set their base wage at the federal minimum.  And, as the figure below shows, insurance tax rates have been trending down for some time.
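A minimal sketch of the employer tax mechanics just described; the tax rates and earnings below are hypothetical, and only the $7,000 federal floor comes from the text.

```python
# Illustrative state unemployment insurance (UI) payroll tax for one employee.
# The tax applies only to earnings up to the state's taxable wage base.
# The $7,000 figure is the federal floor noted in the text; the tax rates and
# worker earnings used here are hypothetical.

FEDERAL_BASE_WAGE_FLOOR = 7_000.0

def ui_tax(annual_earnings: float, base_wage: float, tax_rate: float) -> float:
    """Employer UI tax owed for a single employee."""
    assert base_wage >= FEDERAL_BASE_WAGE_FLOOR, "state base wage cannot fall below the federal floor"
    taxable_earnings = min(annual_earnings, base_wage)
    return taxable_earnings * tax_rate

# A worker earning $40,000 in a state that uses the federal floor and a 2.7% rate:
print(ui_tax(40_000, base_wage=7_000, tax_rate=0.027))    # 189.0 per year
# The same worker and rate in a state with a $40,000 taxable wage base:
print(ui_tax(40_000, base_wage=40_000, tax_rate=0.027))   # 1080.0 per year
```

Keeping the base wage at the floor, as the states named above do, caps the revenue flowing into the trust fund no matter how high wages rise.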

While such a policy might help business, lowering the tax rate means that states have less money in their trust funds to pay unemployment benefits.  Thus, when times are hard, and unemployment claims rise, many states find themselves hard pressed to meet their required obligations.  In fact, as Porter and Russell explain:

Washington has been repeatedly called on to provide additional relief, including emergency patches to unemployment insurance after the Great Recession hit in 2008. Indeed, it has intervened in response to every recession since the 1950s.

This is far from a desirable outcome for those states forced to borrow, since the money has to be paid back with interest by imposing higher future payroll taxes on employers.  Thus, growing numbers of states have sought to minimize the likelihood of this happening, or at least the amount to be borrowed, by raising eligibility standards, reducing benefits, and shortening the duration of coverage, all in hopes of reducing the number of people drawing unemployment benefits as well as the amount they receive and the length of time they receive it.

Porter and Russell highlight some of the consequences of this strategy:

In Arizona, nearly 70 percent of unemployment insurance applications are denied. Only 15 percent of the unemployed get anything from the state. Many don’t even apply. Tennessee rejects nearly six in 10 applications.

In Florida, only one in 10 unemployed workers gets any benefits. The state is notably stingy: no more than $275 a week, roughly a third of the maximum benefit in Washington State. And benefits run out quickly, after as little as 12 weeks, depending on the state’s overall unemployment rate.

And the growing stagnation of the US economy, which has made employment more precarious, only makes this strategy ever more fiscally “intelligent.”  For example, as the following figure shows, a growing percentage of the unemployed are remaining jobless for a longer time.  Such a trend, absent state actions to restrict access to benefits, would mean financial trouble for state officials.

Adding to the system’s structural shortcomings is the fact that growing numbers of workers, for example the many who have been reclassified as independent contractors, are not covered by it.  In addition, since eligibility for benefits requires satisfying a minimum earnings and hours-of-work requirement over a base year, the growth in irregular low-wage work means that many of those most in need of the system’s financial support during periods of unemployment find themselves declared ineligible for benefits.

By design, not by mistake

Our current unemployment insurance system and its patchwork set of state standards and benefits dates back to the Great Depression. While President Roosevelt gets credit for establishing our unemployment insurance system as part of the New Deal, the fact is that he deliberately sidelined a far stronger program that, had it been approved, would have left working people today in a far more secure position. 

The Communist Party (CP) began pushing an unemployment and social insurance bill in the summer of 1930 and, along with the numerous Unemployed Councils that existed in cities throughout the country, worked hard to promote it over the following years.  On March 4, 1933, the day of Roosevelt’s inauguration, they organized demonstrations stressing the need for action on unemployment insurance.

Undeterred by Roosevelt’s lack of action, the CP-authored “Workers Unemployment and Social Insurance Bill” was introduced in Congress in February 1934 by Representative Ernest Lundeen of the Farmer-Labor Party.  In broad brush, the bill mandated the payment of unemployment insurance to all unemployed workers and farmers equal to average local full-time wages, with a guaranteed minimum of $10 per week plus $3 for each dependent. Those forced into part-time employment would receive the difference between their earnings and the average local full-time wage.  The bill also created a social insurance program that would provide payments to the sick and elderly, and maternity benefits to be paid eight weeks before and eight weeks after birth.  All these benefits were to be financed by unappropriated funds in the Treasury and taxes on inheritances, gifts, and individual and corporate incomes above $5,000 a year.
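To make the benefit formula concrete, here is a minimal sketch of the rules as described above; the local wage and worker details are hypothetical, and because the text does not say whether the weekly floor also applied to part-time top-ups, this sketch leaves it out of that case.

```python
# Illustrative weekly benefit under the Workers Unemployment and Social Insurance
# (Lundeen) bill as described in the text: the average local full-time wage, with
# a floor of $10 per week plus $3 per dependent; part-time workers receive the gap
# between their earnings and the local full-time wage. Example inputs are hypothetical.

def lundeen_weekly_benefit(local_full_time_wage: float,
                           dependents: int,
                           part_time_earnings: float = 0.0) -> float:
    if part_time_earnings > 0:
        # Part-time workers get the gap between earnings and the local full-time wage.
        # The text does not say whether the floor also applies here, so it is omitted.
        return max(local_full_time_wage - part_time_earnings, 0.0)
    floor = 10.0 + 3.0 * dependents
    return max(local_full_time_wage, floor)

print(lundeen_weekly_benefit(25.0, dependents=2))                          # 25.0
print(lundeen_weekly_benefit(12.0, dependents=2))                          # 16.0 (floor binds)
print(lundeen_weekly_benefit(25.0, dependents=2, part_time_earnings=15.0)) # 10.0
```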

The bill enjoyed strong support among workers—employed and unemployed—and it was soon endorsed by 5 international unions, 35 central labor bodies, and more than 3000 local unions.  Rank and file worker committees also formed across the country to pressure members of Congress to pass it.

When Congress refused to act on the bill, Lundeen reintroduced it in January 1935. Because of public pressure, the bill became the first social insurance plan to be recommended by a congressional committee, in this case the House Labor Committee.  However, it was soon voted down in the full House of Representatives, 204 to 52.

Roosevelt strongly opposed the Lundeen bill, and it was to provide a counter to it that he pushed for an alternative, one that offered far weaker benefits than the Workers Unemployment and Social Insurance Bill and that was strongly opposed by many workers and by all the organizations of the unemployed.  Roosevelt appointed a Committee on Economic Security in July 1934 with the charge to develop a social security bill, including provisions for both unemployment insurance and old-age security, that he could present to Congress in January 1935.  An administration-approved bill was introduced right on schedule in January, and Roosevelt called for quick congressional action. 

Roosevelt’s bill was revised in April by a House committee and given a new name, “The Social Security Act.”  After additional revisions the Social Security Act was signed into law on August 14, 1935. The Social Security Act was a complex piece of legislation.  It included what we now call Social Security, a federal old-age benefit program; a program of unemployment insurance administered by the states; and a program of federal grants to states to fund benefits for the needy elderly and aid to dependent children. 

The unemployment system established by the Social Security Act was structured in ways unfavorable to workers (as was the federal old-age benefit program).  Rather than a progressively funded, comprehensive national system of unemployment insurance that paid benefits commensurate with worker wages, the act established a federal-state cooperative system that gave states wide latitude in determining standards.

More specifically, the act levied a uniform national payroll tax of 1 percent in 1936, 2 percent in 1937, and 3 percent in 1938 on covered employers, defined as those with eight or more employees for at least twenty weeks, excluding government employers and employers in agriculture.  Only workers employed by a covered employer could receive benefits.

The act left it to the states to decide whether to enact their own plans, and if so, to determine eligibility conditions, the waiting period to receive benefits, benefit amounts, minimum and maximum benefit levels, duration of benefits, disqualifications, and other administrative matters. It was not until 1937 that programs were established in every state as well as the then-territories of Alaska and Hawaii.  And it was not until 1938 that most began paying benefits.

In the early years, most states required eligible workers to wait 2 to 4 weeks before drawing benefits, which were commonly set at half recent earnings (subject to weekly maximums) for a period ranging from 12 to 16 weeks. Ten state laws called for employee contributions as well as employer contributions; three still do today.
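Pulling the coverage and benefit rules just described into one minimal sketch; the particular employer, worker, and benefit parameters chosen (from within the ranges the text gives) are hypothetical.

```python
# Illustrative logic of the early federal-state unemployment system described above.
# The coverage test and federal tax schedule follow the text; the example employer,
# worker, and the specific benefit parameters (chosen from within the ranges the
# text describes) are hypothetical.

FEDERAL_PAYROLL_TAX = {1936: 0.01, 1937: 0.02, 1938: 0.03}   # share of covered payroll

def employer_is_covered(num_employees: int, weeks_employing: int) -> bool:
    # Eight or more employees for at least twenty weeks; government and
    # agricultural employers were excluded (not modeled here).
    return num_employees >= 8 and weeks_employing >= 20

def typical_early_state_benefit(recent_weekly_earnings: float,
                                weekly_maximum: float = 15.0) -> float:
    # Commonly half of recent earnings, subject to a weekly maximum,
    # paid for roughly 12 to 16 weeks after a 2-4 week wait.
    return min(0.5 * recent_weekly_earnings, weekly_maximum)

print(employer_is_covered(num_employees=12, weeks_employing=30))   # True
print(employer_is_covered(num_employees=5, weeks_employing=52))    # False: too few employees
print(FEDERAL_PAYROLL_TAX[1938])                                   # 0.03
print(typical_early_state_benefit(recent_weekly_earnings=24.0))    # 12.0 per week
```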

Over the following years the unemployment insurance system has been improved in a number of ways, including broader coverage and higher benefits.  However, its basic structure remains largely intact: an overly complex system with a patchwork of state eligibility requirements and miserly benefits. And we are paying the cost today.

This history makes clear that nothing will be given to us.  We need and deserve a better unemployment insurance system. And to get it, we are going to have to fight for it, and not be distracted by the temporary, although needed, band-aids Congress is willing to provide.  The principles shaping the Workers Unemployment and Social Insurance Bill can provide a useful starting point for current efforts.

The U.S. recovery on pause, December brings new job losses

A meaningful working-class recovery from the recession seems far away.

After seven months of job gains, albeit steadily diminishing ones, we are again losing jobs.  As the chart below shows, the number of jobs fell by 140,000 in December.

We are currently about 9.8 million jobs down from the February 2020 employment peak, having recovered only 55 percent of the jobs lost.  And, as the following chart illustrates, the percentage of jobs lost remains greater, even now after months of job growth, than it was at any point during the Great Recession. 
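A quick back-of-envelope check on those two figures, taking the 9.8 million shortfall and the 55 percent recovery share from the text and treating everything derived from them as approximate.

```python
# Rough consistency check of the jobs arithmetic cited in the text.
# Inputs come from the text; derived figures are approximate.

still_missing = 9.8e6        # jobs still below the February 2020 peak
share_recovered = 0.55       # share of the pandemic job losses regained so far

implied_total_lost = still_missing / (1 - share_recovered)
implied_recovered = implied_total_lost * share_recovered

print(f"Implied total jobs lost: {implied_total_lost / 1e6:.1f} million")   # ~21.8 million
print(f"Implied jobs recovered:  {implied_recovered / 1e6:.1f} million")    # ~12.0 million
```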

If the job recovery continues at its current pace, some analysts predict that it will likely take more than three years just to get back to pre-pandemic employment levels.  However, this might well be too rosy a projection.  One reason is that the early assumption that many of the job losses were temporary, and that those unemployed would soon be recalled to their jobs, is turning out to be wrong.  A rapidly growing share of the unemployed are remaining unemployed for an extended period. 

As we see below, in October, almost one-third of the unemployed had been unemployed for 27 weeks or longer.  According to the December jobs report, that percentage is now up to 37 percent, four times what it was before the pandemic.  And that figure seriously understates the problem, since many workers have given up looking for work; having dropped out of the workforce, they are no longer counted as unemployed.  The labor force participation rate is now 61.5 percent, down from 63.3 percent in February.
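The point about dropouts is definitional: someone who stops searching leaves both the unemployment count and the labor force. A minimal sketch with hypothetical numbers shows the effect.

```python
# Why discouraged workers vanish from the unemployment rate (hypothetical numbers).
# Unemployment rate = unemployed / labor force; participation rate = labor force / population.

def rates(employed, unemployed, population):
    labor_force = employed + unemployed
    return unemployed / labor_force, labor_force / population

population, employed, unemployed = 1000, 580, 60

u_rate, lfpr = rates(employed, unemployed, population)
print(f"Before: unemployment {u_rate:.1%}, participation {lfpr:.1%}")   # 9.4%, 64.0%

# Ten job seekers give up looking and drop out of the labor force:
u_rate, lfpr = rates(employed, unemployed - 10, population)
print(f"After:  unemployment {u_rate:.1%}, participation {lfpr:.1%}")   # 7.9%, 63.0%
# The unemployment rate falls even though no one found a job.
```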

Dean Baker, quoted in a recent Marketplace story, underscores the importance of this development:

“This is obviously a story of people losing their job at the beginning of the crisis in March and April and not getting it back,” said Dean Baker, co-founder and senior economist with the Center for Economic and Policy Research.

Those out of work for 27 weeks or more make up a growing share of the unemployed, and that could have enduring consequences, Baker said.

“After people have been unemployed for more than six months, they find it much harder to get a job,” he said. “And if they do get a job, their labor market prospects could be permanently worsened.”

And tragically, the workers that have suffered the greatest job losses during this crisis are those that earned the lowest wages. 

It is no wonder that growing numbers of working people are finding it difficult to meet their basic needs.

There is no way to sugar coat this situation.  We need a significant stimulus package, a meaningful increase in the minimum wage, real labor law reform, a robust national single-payer health care system, and an aggressive Green New Deal-inspired program of public sector investment and job creation.  And there is no getting around the fact that it is going to take hard organizing and mutually supportive community and workplace actions to move the country in the direction it needs to go.

The planning and politics of conversion: World War II lessons for a Green New Deal—Part 1

This is the first in a series of posts that aim to describe and evaluate the World War II mobilization experience in the United States in order to illuminate some of the economic and political challenges we can expect to face as we work for a Green New Deal.  

This post highlights the successful government directed wartime reorientation of the U.S. economy from civilian to military production, an achievement that both demonstrates the feasibility of a rapid Green New Deal transformation of the U.S. economy and points to the kinds of organizational capacities we will need to develop. The post also highlights some of the strategies employed by big business to successfully stamp the wartime transformation as a victory for “market freedom,” an outcome that strengthened capital’s ability to dominate the postwar U.S. political economy and suggests the kind of political struggles we can expect and will need to overcome as we work to achieve a just Green New Deal transformation.

The climate challenge and the Green New Deal

We are hurtling towards a climate catastrophe.  The Intergovernmental Panel on Climate Change, in its Special Report on Global Warming of 1.5°C, warns that we must limit the increase in the global mean temperature to 1.5 degrees Celsius above pre-industrial levels by 2100 if we hope to avoid a future with ever worsening climate disasters and “global scale degradation and loss of ecosystems and biodiversity.”  And, it concludes, to achieve that goal global net carbon dioxide emissions must fall by 45 percent from 2010 levels by 2030 and reach net zero by 2050.
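As a rough illustration of what that 2030 target implies, here is a minimal sketch; it assumes the 45 percent cut is measured from 2010 levels and that the decline is spread evenly over the decade from 2020 to 2030 (both simplifications are mine, for arithmetic only).

```python
# Implied constant annual rate of decline needed for a 45 percent cut by 2030.
# Assumes, for simplicity, that emissions at the start of the decade are roughly
# at 2010 levels and that the reduction is spread evenly over ten years.

target_share = 1 - 0.45      # 2030 emissions as a share of the baseline
years = 10                   # 2020 through 2030

annual_decline = 1 - target_share ** (1 / years)
print(f"Required constant annual decline: {annual_decline:.1%}")   # ~5.8% per year
# That is roughly the size of the one-off, pandemic-driven drop discussed below,
# repeated every year for a decade.
```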

Tragically, none of the major carbon dioxide emitting nations has been willing to pursue the system-wide changes necessary to halt the rise in the global mean temperature.  Rather than falling, carbon dioxide emissions rose over the decade ending in 2019.  Only a major crisis, in the current case a pandemic, appears able to reverse the rise in emissions.   

Early estimates are that the COVID-19 pandemic will cause a fall in global emissions of somewhere between 4 and 7 percent in 2020.  But the decline will likely be temporary.  For example, the International Monetary Fund is forecasting an emission rise of 5.8 percent in 2021. This bounce back is in line with what happened after the 2008-09 Great Recession.  After falling by 1.4 percent in 2009, global emissions grew by 5.1 percent in 2010.

Motivated by signs of the emerging climate crisis—extreme weather conditions, droughts, floods, warming oceans, rising sea levels, fires, ocean acidification, and soil deterioration—activists in the United States have worked to build a movement that joins climate and social justice activists around a call for a Green New Deal to tackle both global warming and the country’s worsening economic and social problems. The Green Party has promoted its ecosocialist Green New Deal since 2006, but it was the 2018 mass actions by new climate action groups such as Extinction Rebellion and the Sunrise Movement and then the 2019 introduction of a Green New Deal congressional resolution by Representative Alexandria Ocasio-Cortez and Senator Edward Markey that helped popularize the idea.

The Ocasio-Cortez—Markey resolution, echoing the Intergovernmental Panel on Climate Change, calls for a ten-year national program of mobilization designed to cut CO2 emissions by 40-60 percent from 2010 levels by 2030 and achieve net-zero emissions by 2050.  Its program includes policies that aim at replacing fossil fuels with clean, renewable sources of energy, and existing forms of transportation, agriculture, and urban development with new affordable and sustainable ones; encouraging investment and the growth of clean manufacturing; and promoting good, high paying union jobs and universal access to clean air and water, health care, and healthy food.

While there are similarities, there are also important differences, between the Green Party’s Green New Deal and Ocasio-Cortez—Markey’s Green New Deal, including over the speed of change, the role of public ownership, and the use of fracking and nuclear power for energy generation.  More generally, there are also differences among supporters of a Green New Deal style transformation over whether the needed government investments and proposed social policies should be financed by raising taxes, slashing the military budget, borrowing, or money creation.  There are also environmentalists who oppose the notion of sustained but sustainable growth explicitly embraced by many Green New Deal supporters and argue instead for a policy of degrowth, or a “Green New Deal without growth.”

These arguments are important, representing different political sensibilities and visions, and need to be taken seriously.  But what has largely escaped discussion is any detailed consideration of the actual process of economic transformation required by any serious Green New Deal program.  Here are some examples of the kind of issues we will need to confront:

Fossil fuel production has to be ratcheted down, which will dramatically raise fossil fuel prices.  The higher cost of fossil fuels will significantly raise the cost of business for many industries, especially air travel, tourism, and the aerospace and automobile industries, triggering significant declines in demand for their respective goods and services and reductions in their output and employment.  We will need to develop a mechanism that will allow us to humanely and efficiently repurpose newly created surplus facilities and provide alternative employment for released workers.

New industries, especially those involved in the production of renewable energy will have to be rapidly developed.  We will need to develop agencies capable of deciding the speed of their expansion as well as who will own the new facilities, how they will be financed, and how best to ensure that the materials required by these industries will be produced in sufficient quantities and made available at the appropriate time. We will also have to develop mechanisms for deciding where the new industries will be located and how to develop the necessary social infrastructure to house and care for the new workforce.  

The list goes on—we will need to ensure the rapid and smooth expansion of facilities capable of producing electric cars, mass transit vehicles, and a revitalized national rail system.  We will need to organize the retrofitting of existing buildings, both office and residential, as well as the training of workers and the production of required equipment and materials.  The development of a new universal health care system will also require the planning and construction of new clinics and the development of new technologies and health practices. 

The challenges sound overwhelming, especially given the required short time frame for change.  But, reassuringly, the U.S. government faced remarkably similar challenges during the war years when, in approximately three years, it successfully converted the U.S. economy from civilian to military production. This experience points to the importance of studying the World War II planning process for lessons and should give us confidence that we can successfully carry out our own Green New Deal conversion in a timely fashion.

World War II economic mobilization

The name Green New Deal calls to mind the New Deal of the 1930s, which is best understood as a collection of largely unrelated initiatives designed to promote employment and boost a depressed economy.  In contrast, the Green New Deal aims at an integrated transformation of a “functioning” economy, which is a task much closer to the World War II transformation of the U.S. economy. That transformation required the repression of civilian production, much like the Green New Deal will require repression of the fossil fuel industry and those industries dependent on it.  Simultaneously, it also required the rapid expansion of military production, including the creation of entirely new products like synthetic rubber and weapon systems, much like the Green New Deal will require expansion of new forms of renewable energy, transportation, and social programs.  And it also required the process of conversion to take place quickly, much like what is required under the Green New Deal. 

J.W. Mason and Andrew Bossie highlight the contemporary relevance of the wartime experience by pointing out:

Just as in today’s public-health and climate crises, the goal of wartime economic management was not to raise GDP in the abstract, but to drastically raise production of specific kinds of goods, many of which had hardly figured in the prewar economy. Then as now, this rapid reorganization of the economy required a massive expansion of public spending, on a scale that had hardly been contemplated before the emergency. And then as, potentially, now, this massive expansion of public spending, while aimed at the immediate non-economic goal, had a decisive impact on long-standing economic problems of stagnation and inequality. Of course, there are many important differences between the two periods. But the similarities are sufficient to make it worth looking to the 1940s for economic lessons for today.

Before studying the organization, practice, and evolution of the World War II era planning system, it is useful to have an overall picture of the extent, speed, and success of the economy’s transformation. The following two charts highlight the dominant role played by the government.  The first shows the dramatic growth and reorientation in government spending beginning in 1941.  As we can see federal government war expenditures soared, while non-war expenditures actually fell in value.  Military spending as a share of GNP rose from 2.2 percent in 1940, to 11 percent in 1941, and to 31.2 percent in 1942.

The second shows that the expansion in plant and equipment required to produce the goods and services needed to fight the war was largely financed by the government.  Private investment actually fell in value over the war years.

Source: U.S. Bureau of the Budget, The United States at War, Development and Administration of the War Program by the Federal Government, Washington DC: The U.S. Government Printing Office, 1947, p. 92.

Source: U.S. Bureau of the Budget, The United States at War, Development and Administration of the War Program by the Federal Government, Washington DC: The U.S. Government Printing Office, 1947, p. 115.

The next chart illustrates the speed and extent of the reorientation of industrial production over the period 1941-1944.  As we can see, while industrial production aimed at military needs soared, non-military industrial production significantly declined.

Source: U.S. Bureau of the Budget, The United States at War, Development and Administration of the War Program by the Federal Government, Washington DC: The U.S. Government Printing Office, 1947, p. 104.

The next two charts illustrate the success of the conversion process.  The first shows the rapid increase in the production of a variety of military weapons and equipment.  The second demonstrates why the United States was called the “Arsenal of democracy”; it produced the majority of all the munitions produced during World War II.

Source: U.S. Bureau of the Budget, The United States at War, Development and Administration of the War Program by the Federal Government, Washington DC: The U.S. Government Printing Office, 1947, p. 319

Source: U.S. Bureau of the Budget, The United States at War, Development and Administration of the War Program by the Federal Government, Washington DC: The U.S. Government Printing Office, 1947, p. 507.

Significantly, while the rapid growth in military-related production did boost the overall growth of the economy, because it was largely achieved at the expense of nonmilitary production, the economy’s overall growth over the years 1941-44/45 was far from extraordinary.  For example, the table below compares the growth in real gross nonfarm product over the early years of the 1920s with that of the early years of the 1940s (the table’s cumulative figures are converted to annual rates in the sketch after the table).  As we can see, there is little difference between the two periods, and that holds true even if we exclude the last year of the war, when military spending plateaued and military production began to decline.  The same holds true when comparing just the growth in industrial production over the two periods.

Years      Growth in real gross nonfarm product
1921-25    28.4%
1941-45    24.6%

1921-24    26.2%
1941-44    25.8%
Source: Harold G. Vatter, The U.S. Economy in World War II, New York: Columbia University Press, 1985, p. 22.
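One way to see the similarity is to convert the table’s cumulative figures into average annual (compound) growth rates. Treating 1921-25 and 1941-45 as four-year spans and 1921-24 and 1941-44 as three-year spans is my assumption about how the intervals are measured.

```python
# Cumulative growth from the table above, converted to average annual (compound) rates.
# Period lengths of four and three years are assumptions about how the intervals
# in the table are measured.

periods = {
    "1921-25": (0.284, 4),
    "1941-45": (0.246, 4),
    "1921-24": (0.262, 3),
    "1941-44": (0.258, 3),
}

for label, (cumulative_growth, years) in periods.items():
    annual = (1 + cumulative_growth) ** (1 / years) - 1
    print(f"{label}: {annual:.1%} per year")
# The 1920s and 1940s come out within a percentage point or two of each other,
# which is the comparison the text is drawing.
```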

This similarity between the two periods reinforces the point that the economic success of the war years, the rapid ramping up of military production, was primarily due to the ability of government mobilization agencies to direct an economic conversion that privileged the production of goods and services for the military at the expense of non-military goods and services.  This experience certainly lends credibility to those who seek a similar system-wide conversion to achieve a Green New Deal transformation of the U.S. economy.

Such a transformation is not without sacrifice.  For example, workers did pay a cost for the resulting suppression of civilian-oriented production, but it was limited.  As Harold Vatter points out: “There were large and real absolute decreases in total consumer expenditures between 1941 and 1945 on some items considered important in ordinary times.  Prominent among these, in the durable goods category, were major home appliances, new cars, and net purchases of used cars, furniture, and radio and TV sets.”

At the same time there were real gains for workers.  Overall personal consumption, which rose in both 1940 and 1941, declined absolutely in 1942, but then began a slow and steady increase, with total personal consumption higher in 1945 than in 1941.  However, this record understates the real gains.  The U.S. civilian population declined from 131.6 million in 1941 to 126.7 million in 1944.  Thus, the gain in personal consumption on a per capita basis was significant.  As Vatter notes, “real employee compensation per private employee in nonfarm establishments rose steadily every year, and in 1945 was over one-fifth above the 1941 level. . . . More broadly, similar results show up for the index of real disposable personal income per capita, which increased well over one-fourth during the same war years.”  Of course, these gains were largely the result of more people working and for longer hours; they were definitely earned.  Also important is the fact that pretax family income rose faster for those at the bottom of the income distribution than for those at the top, helping to reduce overall income inequality. 

In sum, there are good reasons for those seeking to implement a Green New Deal style transformation of the U.S. economy to use the World War II planning experience as a template.  A careful study of that experience can alert us to the kinds of organizational and institutional capacities we will need to develop.  And, it is important to add, it can also alert us to the kinds of political challenges we can expect to face.

Planning and politics

The success of the U.S. economy’s World War II transformation was due, in large part, to the work of a series of changing and overlapping mobilization agencies that President Roosevelt established by executive order and then replaced or modified as new political and economic challenges emerged. Roosevelt took his first meaningful action to help prepare the United States economy for war in May 1940, when he reactivated the World War I-era National Defense Advisory Commission (NDAC).  The NDAC was replaced by the Office of Production Management (OPM) in December 1940.  The Supply Priorities and Allocation Board (SPAB) was then created in August 1941 to develop a needed longer-term planning orientation to guide the work of the OPM.  And finally, both the OPM and the SPAB were replaced by the War Production Board (WPB) in January 1942.  With each change, decision-making became more centralized, planning responsibilities expanded, and authority to direct economic activity strengthened.

The work of these agencies was greatly enhanced by a number of other initiatives, one of the most important being the August 1940 establishment of the Defense Plant Corporation (DPC). The DPC was authorized to directly finance and own plant and equipment vital to the national defense. The DPC ended up financing and owning roughly one-third of the plant and equipment built during the war, most of which was leased to private companies to operate for a minimal amount, often $1 a year. The aircraft industry was the main beneficiary of DPC investment, but plants were also built to produce synthetic rubber, ships, machine tools, iron and steel, magnesium, and aluminum.

Despite its successful outcome, the process of economic conversion was far from smooth, and the main reason was resistance from capitalists.  Still distrustful of New Deal reformers, most business leaders were critical of any serious attempt at prewar planning that involved strengthening government regulation and oversight of their activities.  Rather, they preferred to continue their existing practice of individually negotiating contracts with Army and Navy procurement agencies.  Many also opposed prewar government entreaties to expand their scale of operations to meet the military’s growing demand for munitions and equipment.  Their reasons were many: they were reluctant to expand capacity after a decade of depression; civilian markets were growing rapidly and were highly profitable; and the course of the war, and U.S. participation in it, remained uncertain.

Their attitude and power greatly influenced the operation and policies of the NDAC, which was built on industry divisions run by industry leaders, most of whom were so-called “dollar-a-year men” who continued to draw their full salaries from the corporations that employed them, and advised by industry associations.  This business-friendly structure, with various modifications, was then transferred to the OPM and later the WPB.

With business interests well represented in the prewar mobilization agencies, the government struggled to transform the economy in preparation for war.  The lack of new business investment in critical industries meant that by mid-1941 material shortages began forcing delays in defense orders; aluminum, magnesium, zinc, steel, and machine tools were all growing scarce.  At the same time, a number of industries that were major consumers of these scarce materials and machinery, such as the automobile industry, also resisted government efforts to get them to abandon their consumer markets and convert to the production of needed military goods.

In some cases, this resistance lasted deep into the war years, with some firms objecting not only to undertaking their own expansion but to any government financed expansion as well, out of fear of post-war overproduction and/or loss of market share.  The resulting political tension is captured by the following exchange at a February 1943 Congressional hearing between Senator E. H. Moore of Oklahoma and Interior Secretary and Petroleum Administrator for War Harold L. Ickes over the construction of a petroleum pipeline from Texas to the East Coast:

Secretary Ickes. I would like to say one thing, however. I think there are certain gentlemen in the oil industry who are thinking of the competitive position after the war.

The Chairman. That is what we are afraid of, Mr. Secretary.

Secretary Ickes. That’s all right. I am not doing that kind of thinking.

The Chairman. I know you are not.

Secretary Ickes. I am thinking of how best to win this war with the least possible amount of casualties and in the quickest time.

Senator Moore. Regardless, Mr. Secretary, of what the effect would be after the war? Are you not concerned with that?

Secretary Ickes. Absolutely.

Senator Moore. Are you not concerned with the economic situation with regard to existing conditions after the war?

Secretary Ickes. Terribly. But there won’t be any economic situation to worry about if we don’t win the war.

Senator Moore. We are going to win the war.

Secretary Ickes. We haven’t won it yet.

Senator Moore. Can’t we also, while we are winning the war, look beyond the war to see what the situation will be with reference to –

Secretary Ickes (interposing). That is what the automobile industry tried to do, Senator. It wouldn’t convert because it was more interested in what would happen after the war. That is what the steel industry did, Senator, when it said we didn’t need any more steel capacity, and we are paying the price now. If decisions are left with me, it is only fair to say that I will not take into account any post-war factor—but it can be taken out of my hands if those considerations are paid attention to.

Once the war began, many businesses were also able to build a strategic alliance with the military that allowed them to roll back past worker gains and isolate and weaken unions.  For example, by invoking the military’s overriding concern with achieving maximum production of the weapons of war, business leaders were able to defeat union attempts to legislate against the awarding of military contracts to firms in violation of labor law. They also succeeded in ignoring overtime pay requirements when lengthening the workweek and in imposing new workplace rules that strengthened management prerogatives. 

If unions struck to demand higher wages or resist unilateral workplace changes, business and military leaders would declare their actions a threat to the wartime effort, which cost them public support. Often the striking unions were threatened with government sanctions by mobilization authorities.  In some cases, especially when it came to the aircraft industry, the military actually seized control of plants, sending in troops with fixed bayonets, to break a strike.  Eventually, the CIO traded a no-strike pledge for a maintenance of membership agreement, but that often put national union officials in the position of suppressing rank-and-file job actions and disciplining local leaders and activists, an outcome which weakened worker support for the union.

Business didn’t always have its own way.  Its importance as essential producer was, during the war, matched by the military’s role as essential demander.  And, while the two usually saw eye-to-eye, there were times when military interests diverged from, and dominated, corporate interests.  Moreover, as the war continued, government planning agencies gained new powers that enabled them to effectively regulate the activities of both business and the military.  Finally, the work of congressional committees engaged in oversight of the planning process as well as pressure from unions and small business associations also helped, depending on the issue, to place limits on corporate prerogatives.

Still, when all was said and done, corporate leaders proved remarkably successful in dominating the mobilization process and strengthening their post-war authority over both the government and organized labor.  Perhaps the main reason for their success is that almost from the beginning of the mobilization process, a number of influential business leaders and associations aggressively organized themselves to fight their own two-front war—one that involved boosting production to help the United States defeat the Axis powers and one that involved winning popular identification of the fight for democracy with corporate freedom of action.

In terms of this second front, as J.W. Mason describes:

Already by 1941, government enterprise was, according to a Chamber of Com­merce publication, “the ghost that stalks at every business conference.” J. Howard Pew of Sun Oil declared that if the United States abandoned private ownership and “supinely reli[es] on government control and operation, then Hitlerism wins even though Hitler himself be defeated.” Even the largest recipients of military contracts regarded the wartime state with hostility. GM chairman Alfred Sloan—referring to the danger of government enterprises operating after war—wondered if it is “not as essential to win the peace, in an eco­nomic sense, as it is to win the war, in a military sense,” while GE’s Philip Reed vowed to “oppose any project or program that will weaken” free enterprise.

Throughout the war, business leaders and associations “flooded the public sphere with descriptions of the mobilization effort in which for-profit companies figured as the heroic engineers of a production ‘miracle’.”  For example, Boeing spent nearly a million dollars a year on print advertising in 1943-45, almost as much as it set aside for research and development.

The National Association of Manufacturers (NAM) was one of the most active promoters of the idea that it was business, not government, that was winning the war against state totalitarianism.  It did so by funding a steady stream of films, books, tours, and speeches.  Mark R. Wilson describes one of its initiatives:

One of the NAM’s major public-relations projects for 1942, which built upon its efforts in radio and print media, was its “Production for Victory” tour, designed to show that “industry is making the utmost contributions toward victory.” Starting the first week in May, the NAM paid for twenty newspaper reporters to take a twenty-four-day, fifteen-state trip during which they visited sixty-four major defense plants run by fifty-eight private companies. For most of May, newspapers across the country ran daily articles related to the tour, written by the papers’ own reporters or by one of the wire services. The articles’ headlines included “Army Gets Rubber Thanks to Akron,” “General Motors Plants Turning Out Huge Volume of War Goods,” “Baldwin Ups Tank Output,” and “American Industry Overcomes a Start of 7 Years by Axis.”

It was rarely if ever mentioned by the companies or the reporters that almost all of these new plants were actually financed, built, and owned by the government, or that it was thanks to government planning efforts that these companies had well-trained workers and received needed materials on a timely basis.  Perhaps not surprisingly, government and union efforts to challenge the corporate story were never as well funded, sustained, or shaped by as clear a class perspective.  As a consequence, they were far less effective.

Paul A.C. Koistinen, in his major study of World War II planning, quotes Herbert Emmerich, former Secretary of the Office of Production Management (OPM), who, looking back at the mobilization experience in 1956, commented that “When big business realized it had lost the elections of 1932 and 1936, it tried to come in through the back door, first through the NRA and then through the NDAC and OPM and WPB.”  Its success allowed big business to emerge from the war politically stronger than it was when the war began.

Capital is clearly much more organized and powerful today than it was in the 1940s.  And we can safely assume that business leaders will draw upon all their many strengths in an effort to shape any future conversion process in ways likely to limit its transformative potential.  Capital’s wartime strategy points to some of the difficult challenges we must prepare to face, including how to minimize corporate dominance over the work of mobilization agencies and ensure that the process of transformation strengthens, rather than weakens, worker organization and power.  Most importantly, the wartime experience makes clear that the fight for a Green New Deal is best understood as a new front in an ongoing class war, and that we need to strengthen our own capacity to wage a serious and well-prepared ideological struggle for the society we want to create.

COVID-19 Economic Crisis Snapshot

 Workers in the United States are in the midst of a punishing COVID-19 economic crisis.  Unfortunately, while a new fiscal spending package and an effective vaccine can bring needed relief, a meaningful sustained economic recovery will require significant structural changes in the operation and orientation of the economy.

The unemployment problem

Many people blame government mandated closure orders for the decline in economic activity and spike in unemployment.  But the evidence points to widespread concerns about the virus as the driving force.  As Emily Badger and Alicia Parlapiano describe in a New York Times article, and as illustrated in the following graphic taken from the article:

In the weeks before states around the country issued lockdown orders this spring, Americans were already hunkering down. They were spending less, traveling less, dining out less. Small businesses were already cutting employment. Some were even closing shop.

People were behaving this way — effectively winding down the economy — before the government told them to. And that pattern, apparent in a range of data looking back over the past two months, suggests in the weeks ahead that official pronouncements will have limited power to open the economy back up.

As the graphic shows, economic activity nosedived around the same time regardless of whether state governments were quick to mandate closings, slow to mandate closings, or unwilling to issue stay-at-home orders.

The resulting sharp decline in economic activity caused unemployment to soar.  Almost 21 million jobs were lost in April, the peak of the crisis, and the unemployment rate hit a high of 14.7 percent.  By comparison, the highest unemployment rate during the Great Recession was 10.6 percent, in January 2010.

Employment recovered the next month, with an increase of 2.8 million jobs in May.  In June, payrolls grew by an even greater number, 4.8 million.  But things have dramatically slowed since.  In July, only 1.8 million jobs came back.  In August it was 1.5 million.  And in September it was only 661,000.  To this point, only half of the jobs lost have returned, and current trends are far from encouraging.

The unemployment rate fell to 7.9 percent in September, a significant decline from April.  But a large reason for that decline is that millions of workers have given up working or looking for work and are no longer counted as being part of the labor force.  And, as Alisha Haridasani Gupta writes in the New York Times:

A majority of those dropping out were women. Of the 1.1 million people ages 20 and over who left the work force (neither working nor looking for work) between August and September, over 800,000 were women, according to an analysis by the National Women’s Law Center. That figure includes 324,000 Latinas and 58,000 Black women. For comparison, 216,000 men left the job market in the same time period.

The relationship between the fall in the unemployment rate and the worker exodus from the labor market is illustrated in the next figure, which shows both the unemployment rate and the labor force participation rate (LFPR).  The LFPR is calculated by dividing the number of people 16 and over who are employed or seeking employment by the civilian noninstitutional population 16 and over.

The figure shows that even the relatively “low” September unemployment rate of 7.9 percent is still high by historical standards.  It also shows that the recent decline was aided by a fall in the LFPR to a level not seen since the mid-1970s.  If those who left the labor market were to resume seeking employment, pushing the LFPR back up, the unemployment rate would also be pushed up to a much higher level unless the economic environment changed dramatically.
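To make the arithmetic behind this relationship concrete, here is a minimal sketch in Python.  The numbers are hypothetical and rounded, chosen only to approximate the fall 2020 situation; they are not actual BLS figures, and the function names are my own.

```python
# Minimal sketch of the unemployment rate / LFPR arithmetic.
# All numbers are hypothetical and rounded; they are not actual BLS figures.

def labor_force_participation_rate(employed, unemployed, civilian_population):
    """Share of the civilian noninstitutional population (16+) that is
    employed or actively seeking work."""
    return (employed + unemployed) / civilian_population

def unemployment_rate(employed, unemployed):
    """Share of the labor force (employed + job seekers) that is unemployed."""
    return unemployed / (employed + unemployed)

# Hypothetical fall-2020-style situation (in millions).
employed = 147.0
unemployed = 12.6      # those still actively looking for work
population = 260.0     # civilian noninstitutional population, 16 and over

print(f"Unemployment rate: {unemployment_rate(employed, unemployed):.1%}")
print(f"LFPR:              {labor_force_participation_rate(employed, unemployed, population):.1%}")

# Now suppose 5 million people who had dropped out of the labor force
# resume looking for work, with no change in the number of jobs.
unemployed_new = unemployed + 5.0

print(f"Unemployment rate after re-entry: {unemployment_rate(employed, unemployed_new):.1%}")
print(f"LFPR after re-entry:              {labor_force_participation_rate(employed, unemployed_new, population):.1%}")
```

In this illustration, the return of discouraged workers to the job search raises both the LFPR and the measured unemployment rate, even though no one has lost a job.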

Beyond the aggregate figures is the fact, as Heather Long, Andrew Van Dam, Alyssa Fowers and Leslie Shapiro explain in a Washington Post article, that “No other recession in modern history has so pummeled society’s most vulnerable.”

As we can see in the above graphic, the 1990 recession was a relatively egalitarian affair with all income groups suffering roughly a similar decline in employment.  That changed during the recessions of 2001 and 2008, with the lowest earning cohort suffering the most.  But, as the authors of the Washington Post article state, “even that inequality is a blip compared with what the coronavirus inflicted on low-wage workers this year.”  By the end of the summer, the employment crisis was largely over for the highest earners, while employment was still down more than 20 percent for low-wage workers and around 10 percent for middle-wage workers.

Poverty is on the rise

In line with this disproportionate hit suffered by low wage workers, the poverty rate has been climbing.  Five Columbia University researchers, using a monthly version of the Supplemental Poverty Measure (SPM), provide estimates of the monthly poverty rate from October 2019 through September 2020.  They found, as illustrated below, “that the monthly poverty rate increased from 15% to 16.7% from February to September 2020, even after taking the CARES Act’s income transfers into account. Increases in monthly poverty rates have been particularly acute for Black and Hispanic individuals, as well as for children.”

The standard poverty measure used by the federal government is an annual one, based on whether a family’s total annual income falls below a specified income level.  It doesn’t allow for monthly calculations and is widely criticized for using an extremely low emergency food budget to set its poverty level.  The SPM, by contrast, uses a more complete and accurate measure of family resources, a more expansive definition of the family, and the cost of a broader basket of necessities, and it adjusts for differences in the cost of living across metro areas.

As we can see in the above figure, the $2.2 trillion Coronavirus Aid, Relief, and Economic Security (CARES) Act, which was passed by Congress and signed into law on March 27th, 2020, has had a positive effect on poverty levels.  For example, without it, the poverty rate would have jumped to 19.4 percent in April. “Put differently, the CARE Act’s income transfers directly lifted around 18 million individuals out of poverty in April.”

However, as we can also see, the positive effects of the CARES Act have gradually dissipated.  The Economic Impact Payments (“Recovery Rebates”) were one-time payments.  The $600 per week unemployment supplement expired at the end of July.  Thus, the gap between the monthly SPM with and without the CARES Act has gradually narrowed.  With job creation slowing dramatically, it is unlikely that we will see much improvement in the poverty rate in the coming months without a new federal stimulus measure.  In fact, if working people continue to leave the labor market out of discouragement and the pressure of home responsibilities, there is a good chance the poverty rate will climb again.

It is also important to note that the rise in monthly rates of poverty, even with the CARES Act, differs greatly by race/ethnicity as illustrated in the following figure.

The need to do more

Republican opposition to a new stimulus ensures that there will be no follow-up to the CARES Act before the upcoming election.  Opponents claim that the federal government has already done enough and that the economy is well on its way to recovery.

As for the size of the stimulus, the United States has been a laggard when it comes to its fiscal response to the pandemic.  The OECD recently published an interim report titled “Coronavirus: Living with uncertainty.”  One section of the report looks at fiscal support as a percent of 2019 GDP for nine countries.  As the following figure shows, the United States trails every country but Korea when it comes to direct support for workers, firms, and health care.

A big change is needed

While it is natural to view COVID-19 as responsible for our current crisis, the truth is that our economic problems are more long-term.  The U.S. economy has been steadily weakening for years.  In the figure below, the “trend” line is based on the 2.1% average rate of growth in real per capita GDP from 1970 to 2007, the year before the Great Recession.  Not surprisingly, real per capita GDP took a big hit during the Great Recession.  But as we can also see, real per capita GDP has yet to return to its historical trend.  In fact, the gap has grown larger despite the record-long recovery that followed.

As Doug Henwood explains:

Since 2009, the growth rate has averaged 1.6%. Last year [2019], which Trump touted as the greatest economy ever, it managed to get back to the pre-2008 average of 2.1%, an average that includes two deep recessions (1973–1975 and 1981–1982).

At the end of 2019, actual [real GDP per capita] was 13% below trend. At the end of the 2008–2009 recession it was 9% below trend. Remarkably, despite a decade-long expansion, it fell further below trend in well over half the quarters since the Great Recession ended. The gap is now equal to $10,200 per person—a permanent loss of income, as economists say. 
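As a rough illustration of the arithmetic behind a trend-gap calculation like this, the sketch below compounds a starting level of real per capita GDP forward at the 2.1 percent trend rate and compares it with a path growing at the 1.6 percent post-2009 average that Henwood cites.  The starting level is a made-up round number, not his data, so the output illustrates the mechanism rather than reproducing his figures.

```python
# Illustrative sketch: how far a series growing at 1.6% a year falls below
# a 2.1% trend over a decade.  The starting level is hypothetical; the two
# growth rates come from the text.

trend_rate = 0.021     # pre-2008 average growth of real per capita GDP
actual_rate = 0.016    # average growth since 2009, per Henwood
start_level = 50_000   # hypothetical real per capita GDP in 2009, in dollars

years = 10
trend_path = start_level * (1 + trend_rate) ** years
actual_path = start_level * (1 + actual_rate) ** years

gap_dollars = trend_path - actual_path
gap_percent = gap_dollars / trend_path

print(f"Trend level after {years} years:  ${trend_path:,.0f}")
print(f"Actual level after {years} years: ${actual_path:,.0f}")
print(f"Gap: ${gap_dollars:,.0f} per person ({gap_percent:.1%} below trend)")
```

Over a decade, the slower growth rate alone opens a gap of roughly 5 percent; added to the roughly 9 percent shortfall already in place at the end of the 2008–2009 recession, this is broadly consistent with the 13 percent gap cited above.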

The pre-coronavirus period of expansion (June 2009 to February 2020), although the longest on record, was also one of the weakest.  It was marked by slow growth, weak job creation, deteriorating job quality, declining investment, rising debt, declining life expectancy, and narrowing corporate profit margins.  In other words, the economy was heading toward recession even before the start of state-mandated lockdowns.  The manufacturing sector actually spent much of 2019 in recession.

Thus, there is strong reason to believe that a meaningful sustained recovery from the current COVID-19 economic crisis is going to require more than the development of an effective vaccine and a responsive health care system to ensure its wide distribution.  Also needed is significant structural change in the operation and orientation of the economy.

Defunding police and challenging militarism, a necessary response to their “battle space”

The excessive use of force and killings of unarmed Black Americans by police has fueled a popular movement for slashing police budgets, reimagining policing, and directing freed funds to community-based programs that provide medical and mental health care, housing, and employment support to those in need.  This is a long overdue development.

Police are not the answer

Police budgets rose steadily from the 1990s to the Great Recession and, despite the economic stagnation that followed, have remained largely unchanged.  This trend is highlighted in the figure below, which shows real median per capita spending on police in the 150 largest U.S. cities.  That spending grew, adjusted for inflation, from $359 in 2007 to $374 in 2017.  The contrast with state and local government spending on social programs is dramatic.  From 2007 to 2017, median per capita spending on housing and community development fell from $217 to $173, while spending on public welfare programs fell from $70 to $47.

Thus, as economic developments over the last three decades left working people confronting weak job growth, growing inequality, stagnant wages, declining real wealth, and rising rates of mortality, funding priorities meant that the resulting social consequences would increasingly be treated as policing problems.  And, in line with other powerful trends that shaped this period–especially globalization, privatization, and militarization–police departments were encouraged to meet their new responsibilities by transforming themselves into small, heavily equipped armies whose purpose was to wage war against those they were supposed to protect and serve. 

The military-to-police pipeline

The massive, unchecked militarization of the country and its associated military-to-police pipeline were among the more powerful factors promoting this transformation.  The Pentagon, overflowing with military hardware and eager to justify a further modernization of its weaponry, initiated a program in the early 1990s that allowed it to provide surplus military equipment free to law enforcement agencies, allegedly to support their “war on drugs.”  As a Forbes article explains:

Since the early 1990s, more than $7 billion worth of excess U.S. military equipment has been transferred from the Department of Defense to federal, state and local law enforcement agencies, free of charge, as part of its so-called 1033 program. As of June [2020], there are some 8,200 law enforcement agencies from 49 states and four U.S. territories participating. 

The program grew dramatically after September 11, 2001, justified by government claims that the police needed to strengthen their ability to combat domestic terrorism.  As an example of the resulting excesses, the Los Angeles Times reported in 2014 that the Los Angeles Unified School District and its police officers were in possession of three grenade launchers, 61 automatic military rifles and a Mine Resistant Ambush Protected armored vehicle. Finally, in 2015, President Obama took steps to place limits on the items that could be transferred; tracked armored vehicles, grenade launchers, and bayonets were among the items that were to be returned to the military.

President Trump removed those limits in 2017, and the supplies are again flowing freely, including armored vehicles, riot gear, explosives, battering rams, and yes, once again bayonets.  According to the New York Times, “Trump administration officials said that the police believed bayonets were handy, for instance, in cutting seatbelts in an emergency.”

Outfitting police departments for war also encouraged different criteria for recruiting and training.  For example, as Forbes notes, “The average police department spends 168 hours training new recruits on firearms, self-defense, and use of force tactics. It spends just nine hours on conflict management and mediation.”  Arming and training police for military action leads naturally to the militarization of police relations with community members, especially Black, Indigenous, and other people of color, who come to play the role of the enemy that needs to be controlled or, if conditions warrant, destroyed.

In fact, the military has become a major cheerleader for domestic military action.  President Trump, on a call with governors after the start of demonstrations protesting the May 25, 2020 killing of George Floyd while in police custody, exhorted them to “dominate” the street protests.

As the Washington Examiner reports:

“You’ve got a big National Guard out there that’s ready to come and fight like hell,” Trump told governors on the Monday call, which was leaked to the press.

[Secretary of Defense] Esper lamented that only two states called up more than 1,000 Guard members of the 23 states that have called up the Guard in response to street protests. The National Guard said Monday that 17,015 Guard members have been activated for civil unrest.

“I agree, we need to dominate the battle space,” Esper said after Trump’s initial remarks. “We have deep resources in the Guard. I stand ready, the chairman stands ready, the head of the National Guard stands ready to fully support you in terms of helping mobilize the Guard and doing what they need to do.”

The militarization of the federal budget

The same squeeze of social spending and support for militarization is being played out at the federal level.  As the National Priorities Project highlights in the following figure, the United States has a military budget greater than the next ten countries combined.

Yet, this dominance has done little to slow the military’s growing hold over federal discretionary spending.  At $730 billion, military spending accounts for more than 53 percent of the federal discretionary budget.  A slightly broader notion, what the National Priorities Project calls the militarized budget, actually accounts for almost two-thirds of the discretionary budget.  The militarized budget:

includes discretionary spending on the traditional military budget, as well as veterans’ affairs, homeland security, and law enforcement and incarceration. In 2019, the militarized budget totaled $887.8 billion – amounting to 64.5 percent of discretionary spending. . . . This count does not include forms of militarized spending allocated outside the discretionary budget, include mandatory spending related to veterans’ benefits, intelligence agencies, and interest on militarized spending.
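The shares quoted above follow directly from the dollar figures.  Here is a minimal sketch of the arithmetic; note that the total discretionary budget is implied by the quoted figures rather than stated directly in the source.

```python
# Shares of 2019 federal discretionary spending, using the figures quoted above.
# The total discretionary budget is implied (militarized total / its share).

militarized = 887.8e9          # militarized budget: traditional military,
                               # veterans, homeland security, law enforcement
militarized_share = 0.645      # 64.5 percent of discretionary spending
traditional_military = 730e9   # traditional military budget

discretionary_total = militarized / militarized_share

print(f"Implied discretionary total: ${discretionary_total / 1e9:,.0f} billion")
print(f"Traditional military share:  {traditional_military / discretionary_total:.1%}")
print(f"Non-militarized remainder:   ${(discretionary_total - militarized) / 1e9:,.0f} billion")
```

On these figures, the non-militarized portion of the 2019 discretionary budget comes to roughly $490 billion, a bit more than half the militarized total.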

The militarized budget has been larger than the non-militarized budget every year since 1976.  But the gap between the two has grown dramatically over the last two decades. 

In sum, the critical ongoing struggle to slash police budgets and reimagine policing needs to be joined to a larger movement against militarism more generally if we are to make meaningful improvements in majority living and working conditions.