Pandemic economic woes continue, but so do deep structural problems, especially the long-term growth in the share of low wage jobs

Many are understandably alarmed about what the September 4th termination of several special federal pandemic unemployment insurance programs will mean for millions of workers.  Twenty-five states ended their programs months earlier, with government and business leaders claiming that their termination would spur employment and economic activity.  However, several studies have disproved their claims.

One study, based on the experience of 19 of these states, found that for every 8 workers who lost benefits, only one found a new job.  Consumer spending in those states fell by $2 billion, with every $1 of lost benefits leading to a fall in spending of 52 cents.  It is hard to see how anything good can come from the federal government’s willingness to allow these programs to expire nationwide.

The Biden administration appears to believe that adoption of its physical infrastructure bill and $3.5 trillion spending plan will ensure that those left without benefits will find new jobs.  But chances for Congressional approval are growing dim.  Even more importantly, and largely overlooked in the debate over whether the time is right to replace the pandemic unemployment insurance programs with new spending measures, is that an increasing share of the jobs created by economic growth are low-wage, and thus inadequate to ensure workers and their families an acceptable standard of living. 

For example, according to another study, the share of low wage jobs has been growing steadily since 1979.  More specifically, the share of workers (18-64 years of age) with a low wage job rose from 39.1 percent in 1979 to 45.2 percent in 2017.  For workers 18 to 34 without a college degree, the share soared from 46.9 percent to 61.6 percent over the same years.  Thus, a meaningful improvement in worker well-being will require far more than a return to “normal” labor market conditions.  It will require building a movement able to directly challenge and transform the way the US economy operates.

The importance of government programs

The figure below provides some sense of how important government programs have been to working people.  Government support was truly a lifeline, delivering a significant boost to total monthly personal income (relative to the February 2020 start of the pandemic-triggered recession), especially during the first months.  Even now, with the recession officially declared over, government support still accounts for approximately half of the increase in total monthly income.

The government’s support of personal income was anchored by three special unemployment insurance programs–the Federal Pandemic Unemployment Compensation (FPUC), Pandemic Emergency Unemployment Compensation (PEUC), and Pandemic Unemployment Assistance (PUA). 

The FPUC was authorized by the March 2020 CARES Act and renewed by subsequent legislation and a presidential order. It originally provided $600 per week in extra unemployment benefits to unemployed workers in states that opted in to the program. In August 2020, the extra payment was lowered to $300.

The PEUC was also established by the CARES Act. It provided up to 13 weeks of extended unemployment compensation to individuals who had exhausted their regular unemployment insurance compensation.  This was later extended to 24 additional weeks and then by a further 29 weeks, allowing for a total of 53 weeks.  The PUA allowed states to provide unemployment assistance to the self-employed and those seeking part-time employment, or who otherwise did not qualify for regular unemployment compensation.

Tragically, the federal government allowed all three programs to expire on September 4th. Months earlier, in June 2021, 25 states actually ended these programs for their unemployed workers, eliminating benefits for over 2 million.  Several studies, as we see next, have documented the devastating cost of that decision. 

The cost of state program termination

Beginning in April 2021, a number of business analysts and politicians began to aggressively argue that federally provided unemployment benefit programs were no longer needed.  In fact, according to them, the programs were actually keeping workers from pursuing available jobs, thereby holding back the country’s economic recovery. Using these arguments as cover, in June, 25 states ended their participation in one or more of these programs. 

For example, Henry McMaster, the governor of South Carolina, announced his decision to end his state’s participation in the federal programs, saying: “This labor shortage is being created in large part by the supplemental unemployment payments that the federal government provides claimants on top of their state unemployment benefits.”

Similarly, Tate Reeves, the governor of Mississippi, stated in a May 2021 tweet:

It has become clear to me that we cannot have a full economic recovery until we get the thousands of available jobs in our state filled. . . . Therefore, I have informed the Department of Employment Security to direct the Biden Administration that Mississippi will be opting out of the additional federal unemployment benefits as early as federal law allows—June 12, 2021.

The argument that these special federal unemployment benefit programs hurt employment and economic activity was tested and found wanting.  Business Insider highlights the results of several studies:

Economist Peter Ganong, who co-authored a paper that found the disincentive effect of benefits was small, told the [Wall Street] Journal: “If the question is, ‘Is UI [unemployment insurance] the key thing that’s holding back the labor market recovery?’ The answer is no, definitely not, based on the available data.” 

That aligns with other early research on the impact of benefits ending. CNBC reports that analyses from payroll firms UKG and Homebase both found that employment didn’t go up in the states cutting off the benefits; in fact, that Homebase analysis found that employment declined in the states opting out of federal benefits, while it went up in states that chose to retain benefits. In June, Indeed’s Hiring Lab found that job searches in states ending benefits were below April’s baseline.

In July, Arindrajit Dube, an economics professor at University of Massachusetts Amherst, found that ending benefits didn’t make workers rush back. “Even as there was a clear reduction in the number of people who were receiving unemployment benefits — and a clear increase in the number of people who said that they were having difficulty paying their bills — that didn’t seem to translate, at least in the short run, into an uptick in overall employment rates,” Dube told Insider at the time.

Dube, along with five other researchers, examined “the effect of withdrawing pandemic UI on the financial and employment trajectories of unemployed workers in [19] states that withdrew benefits, compared to workers with the same unemployment duration in states that retained these benefits.” 

They found, as noted above, that for every 8 workers who lost their benefits, only 1 found a new job.  And for every $1 of reduced benefits, spending fell by 52 cents—only 7 cents of new income was generated for each dollar of lost benefits. “Extrapolating to all UI recipients in the early withdrawal states, we estimate these states eliminated $4 billion in unemployment benefits paid by federal transfers as of August 6 [2021].  Spending fell by $2 billion and earnings rose by $270 million.  These states therefore saw a much larger drop in federal transfers than gains from job creation.”
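The study’s headline numbers are internally consistent, as a quick back-of-the-envelope check shows. The sketch below only reproduces the arithmetic, using the figures quoted above ($4 billion in eliminated benefits, a 52-cent fall in spending and a 7-cent rise in earnings per dollar of lost benefits); nothing here goes beyond the study as quoted.

```python
# Reproduce the extrapolation in the Dube et al. study, using only
# the figures quoted above.

lost_benefits = 4_000_000_000        # federal UI transfers eliminated as of Aug 6, 2021

spending_drop_per_dollar = 0.52      # fall in spending per dollar of lost benefits
earnings_gain_per_dollar = 0.07      # new earnings per dollar of lost benefits

spending_drop = lost_benefits * spending_drop_per_dollar
earnings_gain = lost_benefits * earnings_gain_per_dollar

print(f"Spending fell by roughly ${spending_drop / 1e9:.2f} billion")
print(f"Earnings rose by roughly ${earnings_gain / 1e9:.2f} billion")
```

The rounded outputs (about $2.08 billion less spending, about $280 million more earnings) line up with the study’s reported $2 billion and $270 million figures.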

An additional 8 million workers have now lost benefits because of the federal termination of these special unemployment insurance programs.  It is hard to be optimistic about what awaits them, given the experience of the early termination states.  And equally important, even if the “optimists” are proven right, and those workers are able to find employment, there is still reason for concern about the likely quality of those jobs given long-term employment trends.

The lack of decent jobs

There is no agreed-upon definition of a low wage job.  David R. Howell and Arne L. Kalleberg note two of the most popular in their study of declining job quality in the United States.  One is to define low wage jobs as those that pay less than two-thirds of the median hourly wage.  The other, used by the OECD, is to define low wage jobs as those that pay less than two-thirds of the median hourly wage for full-time workers.

Howell and Kalleberg find both inadequate.  Instead, they define low wage jobs as those that pay less than two-thirds of the mean hourly wage for full-time prime-age workers (35-59).  Their definition sets the dividing line between low wage and what they call “decent” wage jobs at $17.50 in 2017.  As they explain:

This wage is well above the wage that would make a full-time (or near full-time) worker eligible for food stamps and several dollars above the basic needs budget for a single adult in most American cities, but is conservative in that the basic needs budget for a single adult with one child ranges from $22 to $30.

The figure below, based on their definition, shows the growth in low wage jobs for workers 18-34 years of age without a college degree (in blue), all workers 18-64 years of age (in gold), and prime age workers 35-59 years of age (in green).  Their dividing line between low wage and decent wage jobs, equivalent to $17.50 in 2017, is far from a generous wage.  Yet, all three groupings show an upward trend in the share of low wage jobs.  

The authors then divide their low wage and decent wage categories into upper and lower tiers.   The lower tier of the low wage category includes jobs that pay less than two-thirds of the median wage for full-time workers, which equaled $13.33 in 2017.  As the authors report:

Based on evidence from basic needs budgets, this is a wage that, even on a full-time basis, would make it extremely difficult to support a minimally adequate standard of living for even a single adult anywhere in the country. This wage threshold ($13.33) is just above the wage cutoff for food stamps ($12.40) and Medicaid ($12.80) for a full- time worker (thirty-five hours per week, fifty weeks per year) with a child; full-year work at thirty hours per week would make a family of two eligible for the food stamps with a wage as high as $14.46 and as high as $14.94 for Medicaid.  For this reason, we refer to this as the poverty-wage threshold.

The lower tier of the decent wage category includes jobs that pay up to 50 percent more than the decent-job threshold, putting its upper boundary at $26.50 in 2017.  The figure below shows the overall job distribution in 2017.
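Taken together, Howell and Kalleberg’s scheme sorts jobs into four tiers using three 2017 dollar thresholds given above: $13.33 (poverty wage), $17.50 (decent-job line), and $26.50 (upper-tier decent line). A minimal sketch of the classification follows; the thresholds come from the study, but the `classify` function and sample wages are illustrative.

```python
# Classify an hourly wage into Howell and Kalleberg's four tiers,
# using the 2017 thresholds cited in the text.

POVERTY_WAGE = 13.33   # two-thirds of the full-time median wage (2017)
DECENT_WAGE = 17.50    # two-thirds of the full-time prime-age mean wage (2017)
UPPER_DECENT = 26.50   # 50 percent above the decent-job threshold (2017)

def classify(wage: float) -> str:
    """Return the job-quality tier for an hourly wage."""
    if wage < POVERTY_WAGE:
        return "lower-tier low wage (poverty wage)"
    if wage < DECENT_WAGE:
        return "upper-tier low wage"
    if wage < UPPER_DECENT:
        return "lower-tier decent wage"
    return "upper-tier decent wage"

for wage in (11.00, 15.00, 20.00, 30.00):
    print(f"${wage:.2f}/hr -> {classify(wage)}")
```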

The following table shows the changing distribution of jobs over the years 1979 to 2017 for all workers 18 to 64, for workers 18-34 without a college degree, and for workers 18-34 with a college degree.

While the share of upper-tier decent jobs held by workers 18 to 64 has remained relatively stable, there has been a notable decline in the share of workers with lower-tier decent jobs.  Also worth noting is the rise in the share of poverty-level low wage jobs. 

Perhaps most striking is the large decline in the share of decent jobs held by workers 18 to 34, those with and those without a college degree.  The share of poverty level jobs held by those without a college degree soared from 35.7 percent to 53.5 percent.  The share of low wage jobs also spiked for those with a college degree, rising from 22 percent to 39.1 percent, with an increase in the share of both low-wage tiers.

This long-term decline in job quality will not reverse on its own.  And, not surprisingly, corporate leaders remain largely opposed to policies that might threaten the status quo.

So, do we need a better unemployment insurance system? For sure.  Do we need a better funded and more climate resilient social and physical infrastructure?  Definitely.  But we also need a dramatically different economy, one that, in sharp contrast to our current system, is grounded in greater worker control over both the organization and aims of production.  Lots of work ahead.

Playing the capitalist game: heads they win, tails you lose

According to an Economic Policy Institute report, between 28 and 47 percent of U.S. private sector workers are subject to noncompete agreements.  In brief, noncompete agreements (or noncompetes) are provisions in an employment contract that ban workers from leaving their job to work for a “competitor” that operates in the same geographic area, for a given period of time.  In a way, it’s an attempt to recreate the power dynamics of the employer-dominated company towns of old, with workers unable to change employers if they want to continue working in the same industry.

It is not just top executives who are forced to accept a noncompete agreement.  Companies also use them to restrict the employment freedom of many low wage workers, including janitors, security guards, fast food workers, warehouse workers, personal care aides, and room cleaners.  In fact, the Economic Policy Institute estimates that almost a third of all businesses require that all of their workers sign noncompetes, regardless of their job duties or pay.

As for the impact of these agreements, a number of studies have found that noncompetes lower wages for all workers in the industry, even those not subject to noncompetes.  And then there is this from CBS News:

“In the context of the pandemic, which caused millions of people to be laid off, it’s safe to say at least a share of those workers are constrained [by noncompetes] in pursuing other opportunities during this crisis,” said John Lettieri, head of the Economic Innovation Group, a think tank that advocates against noncompetes. 

Indeed, at least four employers — including an accounting firm and a real estate brokerage — have tried to enforce noncompetes against workers they’ve laid off, with the lawsuits making their way through the courts.

On July 9, 2021 President Biden signed an executive order on “Promoting Competition in the American Economy” that, among other things, calls upon the Chair of the Federal Trade Commission (FTC) to work “with the rest of the Commission to exercise the FTC’s statutory rulemaking authority under the Federal Trade Commission Act to curtail the unfair use of non-compete clauses and other clauses or agreements that may unfairly limit worker mobility.”  While it seems likely that the FTC will take some action, the scope of that action remains uncertain.

Noncompetes and their use

There are no federal rules governing the use of noncompetes.  It is up to the states to decide how to regulate their use.  California, North Dakota, and Oklahoma are the only states with outright bans on their use; Washington DC also outlaws them.  Several states have placed limits on the use of non-competition agreements.  Illinois, Maryland, Nevada, Oregon, and Virginia all prohibit the use of noncompetes with low wage workers.  Washington state banned noncompetes for those earning under $100,000. Hawaii has prohibited noncompetes for tech workers only.  On the other hand, there are some states, like Idaho, which have actually passed laws making it easier for companies to enforce noncompete agreements.

Most workers live in states where there are few if any restrictions on the use of noncompete agreements.  And as the results of a national survey that included firms with at least 50 employees show, the use of noncompetes is common in workplaces with low pay (see the table below).  As the Economic Policy Institute report points out, although “the use of noncompetes tends to be higher for higher-wage workplaces than lower-wage workplaces . . . it is striking that more than a quarter—29.0%—of responding establishments where the average wage is less than $13.00 use noncompetes for all their workers.”

Popular outrage has sometimes forced companies to change their policies or state authorities to intervene on behalf of workers.  An example of the former: in 2015 Amazon began requiring its warehouse workers to sign noncompetes.  As The Verge reported:

The work is repetitive and physically demanding and can pay several dollars above minimum wage, yet Amazon is requiring these workers — even seasonal ones — to sign strict and far-reaching noncompete agreements. The Amazon contract, obtained by The Verge, requires employees to promise that they will not work at any company where they “directly or indirectly” support any good or service that competes with those they helped support at Amazon, for a year and a half after their brief stints at Amazon end. Of course, the company’s warehouses are the beating heart of Amazon’s online shopping empire, the extraordinary breadth of which has earned it the title of “the Everything Store,” so Amazon appears to be requiring temp workers to foreswear a sizable portion of the global economy in exchange for a several-months-long hourly warehouse gig.

The company has even required its permanent warehouse workers who get laid off to reaffirm their non-compete contracts as a condition of receiving severance pay. 

The company eventually ended the practice after its actions were widely reported in the media, generating bad publicity for the company.

Jimmy John’s offers an example of state action. In 2016, the attorneys general of New York and Illinois, reacting to public anger, forced Jimmy John’s to stop its franchises from using noncompetes that forbade employees from working at any other sandwich shop within a 3-mile radius of the franchise for two years.

The cost of noncompetes to workers

When noncompetes are banned, worker pay rises.  One of the most detailed and complete studies of the wage consequences of such a change is based on Oregon’s 2008 decision to ban noncompetes (NCAs) for hourly wage workers.  As the authors of the study explain:

We find that banning NCAs for hourly workers increased hourly wages by 2-3% on average. Since only a subset of workers sign NCAs, scaling this estimate by the prevalence of NCA use in the hourly-paid population suggests that the effect on employees actually bound by NCAs may be as great as 14-21%, though the true effect is likely lower due to labor market spillovers onto those not bound by NCAs. While the positive wage effects are found across the age, education and wage distributions, they are stronger for female workers and in occupations where NCAs are more common. The Oregon low-wage NCA ban also improved average occupational status in Oregon, raised job-to-job mobility, and increased the proportion of salaried workers without affecting hours worked.
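The jump from a 2-3% average effect to 14-21% for workers actually bound by NCAs is just a rescaling by NCA prevalence. A back-of-the-envelope sketch follows, assuming a prevalence of roughly 14 percent among hourly workers; that figure is inferred from the ratio of the two reported estimates, since the paper’s exact prevalence number is not given in the excerpt.

```python
# Rescale the average wage effect of Oregon's NCA ban to the
# subpopulation actually bound by NCAs. The 2-3% average effect is
# from the study quoted above; the ~14% prevalence is an assumption
# inferred from the ratio of the reported estimates.

avg_effect_low, avg_effect_high = 0.02, 0.03   # average hourly-wage effect
nca_prevalence = 0.14                          # assumed share of hourly workers bound by NCAs

bound_effect_low = avg_effect_low / nca_prevalence
bound_effect_high = avg_effect_high / nca_prevalence

print(f"Implied effect on bound workers: "
      f"{bound_effect_low:.0%} to {bound_effect_high:.0%}")
```

Dividing by a smaller prevalence would imply an even larger effect on bound workers, which is why the authors caution that spillovers onto unbound workers likely make the true bound-worker effect smaller than this simple rescaling suggests.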

Earlier studies of the consequence of changes in the use of noncompetes in other states produced similar results. For example, a study of Hawaii’s 2015 decision to ban noncompetes for tech workers showed a 4.2% pay bump for new hires and a 12% increase in worker mobility.

But even a change in law doesn’t necessarily bring an end to the practice, as highlighted by the California experience.  California courts will not enforce a noncompete contract, but that hasn’t stopped many California businesses from including them in their employment contracts.  One reason according to worker advocates, as reported by CBS News, is that most workers don’t know that noncompetes are banned in California: 

As a result, employers in California use these restrictive contracts just as much as employers elsewhere in the U.S., and they have their desired effect: scaring workers away from leaving for better jobs. 

“There’s no disincentive for the employer to include it in the employment contract. The worst thing that would happen is a court would declare [the noncompete] void,” said Harvard’s Gerstein. “There needs to be a disincentive to employer overreach.”

Possible federal action

President Biden pledged during his campaign to “eliminate all non-compete agreements, except the very few that are absolutely necessary to protect a narrowly defined category of trade secrets.”  On the other hand, his executive order speaks to “curtailing” their use.  The best outcome would be an FTC ban on the use of non-competes in all situations and for all workers; noncompetes are just another tool that businesses can use to exploit their workers.

But it may be that the FTC will instead seek to place limits on the use of such agreements, perhaps outlawing their use with low wage workers or establishing federal regulations that restrict their scope and duration.  Although such a step would be an improvement over the current situation, where most states do little to restrict the use of noncompetes, it may well result in an unsatisfying patchwork regulatory framework, much like that of our current unemployment system.

No matter how the FTC rules on the use of noncompete agreements, there are two other actions it should take that would significantly strengthen worker rights. Currently, many workers only learn they are subject to a noncompete agreement after they have already accepted a job.  The FTC should mandate that employers include any noncompete requirements in all job postings.

And as the California experience shows, companies will continue to use noncompetes even if they are not enforceable, relying on ignorance, intimidation, as well as the financial costs of court proceedings, to get workers to accept their terms.  Therefore, the FTC should also allow workers to sue for damages if a business is illegally attempting to enforce a noncompete agreement.

In the meantime, while we await FTC action, the greater the public knowledge about, and voiced opposition to the use of noncompetes, the better. 

Creating a democratically run economy: lessons from World War II price control struggles*

Many activists in the United States are working to build a movement for a Green New Deal transformation of the economy.  Not surprisingly, a growing number look to the World War II conversion of the US economy from civilian to military production for inspiration and policy ideas.  The conversion experience shows that a rapid, system-wide transformation of the U.S. economy is indeed possible. It also demonstrates that state capacities and action are critical; the successful conversion required state planning, public financing and ownership, and state direction of economic activity.  At the same time, with very few exceptions, the conversion process was structured in ways that minimized any serious challenge to existing class relations.1 

One of the most important exceptions was the government’s approach to price stability.  Faced with business determination to boost its prices and profits, and worker willingness to strike to maintain their real income, the Office of Price Administration was eventually forced to “deputize” volunteers to administer and ensure compliance with its system of price control.  Tens of thousands of volunteers were authorized to visit retail locations to monitor business compliance with the controls and tens of thousands of additional volunteers served on price boards that were empowered to investigate and fine retailers who were found to be in violation of the controls. 

This experience demonstrates that working people are willing and able, with minimal national supervision, to oversee and regulate business practices in order to advance a favorable national economic program of change.  In fact, their effort was critical to the program’s success.  Most importantly, I believe that the struggle over price stability has much to teach us about the process of building the institutional and class capacities needed for creating a democratically run economy.

In Part I of this post, I highlight the inflationary consequences of the growth of the US war economy.  In Part II, I examine the economic and social pressures that led the government to pursue ever more stringent price controls.  In Part III, I describe the struggle for popular participation in the implementation and enforcement of the controls.  In Part IV, I describe the role played by volunteers in successfully containing inflation during the last two years of the war.  In Part V, I discuss the business offensive against this volunteer participation. In Part VI, I conclude by highlighting some of the important lessons from this experience and their contemporary relevance. 

Part I: The war economy and inflation

The growth of the war economy brought the depression era to a close.  Unemployment in 1940 was still 14.6 percent.  Real gross private investment remained 18 percent below what it had been in 1929, and industrial production only 16 percent above its 1929 level.2 It was sustained military spending, beginning mid-1941, which finally produced the conditions necessary for a rapid decline in unemployment and rise in production. 

The transformation of the US economy into a war economy is perhaps best highlighted by the fact that although civilian production rose along with military production in the transition year of 1941, it declined thereafter, with many civilian production facilities shut down or converted to military production.  While overall US industrial production rose rapidly over the years 1941 to 1943, civilian industrial production actually declined every year, in dollar terms, from 1941 to 1944.  Civilian industrial production as a percent of total industrial production fell from approximately 80% in 1941, to 40% in 1942, and to 35% in both 1943 and 1944.3  Business investment fell to a “low in 1943 that was only 37 percent of the 1940 level (and much below replacement requirements)” at the same time that “total civilian consumption, even of goods, in 1943 was higher than it had been in 1940.”4

The combination of the fall in civilian-oriented industrial production and the rise in civilian employment and purchasing power meant upward pressure on prices.  And the inflationary pressure posed two serious threats to the war effort—it could trigger strikes by workers seeking to maintain their real income and it would raise the cost of financing the war. 

Part II: Inflation, wage struggles and price controls

Days after the US entry into the war, the government moved to minimize possible disruptions to the war mobilization; labor was its prime target.  President Roosevelt convened a labor-management conference at which the participants agreed on two basic policies: strikes and lockouts were to be prohibited for the duration of the war, and labor disputes, especially those over wage increases, were to be submitted to the newly created National War Labor Board (NWLB) for resolution.

The Office of Price Administration (OPA), established in August 1941, was charged with securing price stability.  Its early efforts at price control were largely limited to issuing price schedules with maximum prices and negotiating voluntary agreements with producers of major industrial commodities and goods.  However, this industry-by-industry approach proved unworkable as the military build-up proceeded.  Prices rose rapidly and polls conducted in late 1941 and early 1942 showed a strong public desire for government action to stop their rise.

In late April 1942, in response to public concerns, Roosevelt announced a seven-point anti-inflationary program that called for increasing taxes, wage controls, rationing, war bond sales, and credit controls.  The next day, the OPA issued its General Maximum Price Regulation (GMPR), which ordered the prices of almost all consumer goods to be frozen at the highest level reached in March 1942, effective as of May 15.

Taking its cue from Roosevelt’s anti-inflationary program and the GMPR, the NWLB moved to place a ceiling on wage increases.  It ruled, in its July 1942 “Little Steel” decision, that since government figures showed that living costs had risen by only 15 percent between January 1, 1941 and May 1, 1942, steel worker wages could only increase by that amount. The Steel Workers Organizing Committee had sought a dollar-a-day wage increase for workers at the smaller steel companies; the NWLB awarded them an increase of only 44 cents.

The GMPR was an across-the-board effort at price control, covering “all commodities and services not specifically excluded or not covered by another regulation.”  As part of that effort, retailers were required to keep a record of all their March prices at their place of business. They were also required to file with the OPA, and post on-site, a list showing their March prices for a specified group of “cost-of-living” items.

While the GMPR appeared comprehensive, it proved difficult to administer as well as enforce.  For example, businesses were constantly modifying existing products or introducing new ones.  When that happened, the regulation called for businesses to price their products at a price that was comparable to similar products that were sold in March.  If a particular business had not previously produced or sold similar products, it could use the price charged by a competitive firm. 

Not surprisingly, consumers found the GMPR wanting.  There was no way for them to determine whether businesses were complying with the regulation.  Prices changed regularly and, if challenged, sellers had little trouble in justifying their actions.

With weak price controls doing little to prevent price increases, especially for food, the Congress, with the president’s encouragement, passed the Economic Stabilization Act on October 2, 1942.  It called upon the president to ensure that prices, wages, and salaries remained at their September 15, 1942 levels.  Roosevelt directed the OPA to immediately place new price ceilings on a number of food items, including eggs, chicken, butter, cheese, potatoes, and flour.  He also ordered the NWLB to apply the terms of its Little Steel ruling to all workers, regardless of industry. 

The AFL and CIO attacked the Little Steel formula on multiple grounds.  Union officials pointed out that while hourly wage gains were limited to 15 percent, prices and profits had risen far more.  Moreover, even the government’s own cost of living index showed a 23.4 percent rise between January 1941 and December 1943.  They also highlighted flaws in the index which led to a serious underestimation of the rise in cost of living.  For example, many low-cost items such as shoes and clothing had disappeared from stores, forcing workers to buy more expensive ones.  Prices in restaurants and cafeterias had doubled, yet more people were forced to eat out because of their work schedules.  The unions calculated that living costs over this period had actually increased by 43.5 percent.5

The price rise slowed in the months following passage of the Economic Stabilization Act, but then prices began accelerating again in early 1943.  Not surprisingly, workers responded by striking for higher wages. In 1943, over 2 million workers went out on strike.  Approximately 13 million [person] days of labor were lost, more than three times the number in 1942.6 

The President sought to mollify workers with yet another initiative.  His April 1943 “Hold the Line” order included, in addition to an end to the remaining few NWLB approved exceptions to the Little Steel formula, a call for OPA to take still tougher actions on prices.  The OPA, in turn, significantly tightened its regulatory structure, taking an especially aggressive stance towards food prices.  For the first time it set actual dollars and cents ceiling prices for the most important food items. 

In contrast to its past approach, which relied on adjustable price ceilings tied to base period prices, dollars and cents ceiling prices could easily be understood and monitored by consumers.  Some of the new ceiling prices, such as for meat, were set by the national office.  The great majority—for a community price list of 300 designated grocery products—were to be set by district offices. 

More specifically, all grocery stores were divided into one of four categories based on their size and service; each category was then assigned its own nationally determined percentage mark-up.  To calculate community price ceilings, OPA district offices first calculated the local costs of each product on the list using information from local suppliers.  Then they applied the national mark-up to the local costs.  The result was a dollars and cents ceiling price for each commodity that varied by type of store and community.  District offices regularly adjusted these ceiling prices—weekly for perishables and monthly for dry groceries.
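The two-step calculation described above can be sketched as follows.  The store categories, mark-up percentages, and product costs here are invented for illustration; they are not OPA's actual figures.

```python
# Hypothetical sketch of OPA's community ceiling-price method: each store
# category carries a nationally set percentage mark-up, which a district
# office applies to locally determined product costs.  All numbers below
# are assumptions for illustration, not historical OPA values.

MARKUPS = {  # store category -> assumed national mark-up
    "group_1_large_service": 0.12,
    "group_2": 0.15,
    "group_3": 0.18,
    "group_4_small_independent": 0.21,
}

def ceiling_price(local_cost: float, category: str) -> float:
    """Dollars-and-cents ceiling: local cost plus the category's mark-up,
    rounded to the nearest cent so consumers can check it at the shelf."""
    return round(local_cost * (1 + MARKUPS[category]), 2)

# A product costing a local store $0.10 from its suppliers gets a lower
# ceiling in a large service store than in a small independent one:
print(ceiling_price(0.10, "group_1_large_service"))
print(ceiling_price(0.10, "group_4_small_independent"))
```

The point of the design is visible in the sketch: because the output is a fixed dollars-and-cents figure per category, a shopper needed only the posted list, not the store's base-period records, to verify compliance.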

Equally important, for the first time, as discussed more fully below, the OPA began to use volunteers to administer and ensure retail compliance with its new ceiling prices.  The combination of new controls and popular participation in their application proved to be a success.  “Between the spring of 1943 and April 1945 . . . the BLS Index of Consumer Prices rose less than 2 percent, or at one-sixth the rate which had characterized the preceding 2 years.  By means of subsidies, roll-backs, and drastic simplification of controls, food prices actually declined at retail by more than 4 percent.”7 This was an especially noteworthy achievement in that it occurred during the last two years of the war economy, a time when employment was at a maximum and consumer goods production highly restricted. 

Part III: Struggles over popular participation in price control

The OPA had resisted using volunteers in its price control work even though they had proven essential to its rationing work.  The main reason: it didn’t want to anger the business community. 

The war in the Pacific had cut off America’s main source of rubber.  To conserve the existing stock of rubber, the government halted the production and sale of new tires and, in December 1941, the OPA was put in charge of rationing the existing supply.  Tire ration boards were quickly established in every political subdivision in the country, and governors in each state were asked to mobilize their state defense councils to generate the required volunteer board members. 

Over 7,000 tire-rationing boards (varying in size from 3 to 7 members) were quickly created and tire rationing began in early January 1942.  Soon after, the OPA was designated the official rationing agency, and the work of the rationing boards was expanded to cover typewriters, automobiles, sugar, gasoline, bicycles, rubber footwear, fuel oil, coffee, stoves, shoes, processed foods, and meats and fats. 

In addition to the volunteer board members, the rationing program also used what the OPA called “regular” rationing volunteers.  These included receptionists, file clerks, telephone clerks, and counter clerks.  At its peak, the program relied on approximately 125,000 regular rationing volunteers.  Although far from perfect, the ration system generally enjoyed community support.  Most Americans were willing to accept its inefficiencies because those administering it were unpaid volunteers, many of whom gave more than 40 hours a week in service.8

Most business leaders were willing to accept rationing, including volunteer participation in the rationing program, because they did not see it as a threat to their independence or profitability.  Price controls were a different matter.  Business leaders not only opposed price controls, they were adamantly opposed to popular participation in their enforcement out of fear that it might lead to public involvement in, and regulation of, all aspects of their business operation. Therefore, to maintain good relations with the business community, the OPA had refrained from enlisting volunteers for its price control efforts.

Unfortunately for the OPA, its pro-business stance was not rewarded. In June 1942, one month after the GMPR went into effect, prices continued to rise.  Under pressure to produce results, the OPA decided to carry out a survey of retailers to check on their compliance.

This decision raised the question of who would actually visit the country’s two million retail outlets.  Echoing the position of the OPA policy committee, John Kenneth Galbraith, Deputy Administrator for Price, argued that the survey should be done by eight to ten thousand paid inspectors.  He publicly pledged to the business community that “no Gestapo of volunteer housewives” would be used.9 Members of Congress and the business community began openly referring to shoppers or volunteers who wanted to help check compliance as “trouble makers” and “snoopers.”  This was strong language in the context of the war.

Despite the popularity of this anti-volunteer sentiment, Congress refused to appropriate the money needed to hire paid inspectors.  Therefore, the OPA was forced to seek volunteers for its price survey.  Not surprisingly, this proved difficult.  Many civic organizations were reluctant to mobilize their members for volunteer price work given the negative public comments that had been made about the activity.  Even though some OPA leaders, including Galbraith, eventually changed their position, these early attacks on volunteer participation in the price program had long lasting negative effects on recruitment. 

In May 1942, OPA director Leon Henderson announced that existing rationing boards would be renamed War Price and Rationing Boards.  But still he refrained from recruiting new volunteers with responsibilities for price control.  Price material was sent to the existing boards but, busy with rationing, it was largely ignored.     

AFL and CIO leaders were not happy with OPA’s reluctance to establish active price panels.  They believed that the agency’s lack of commitment to a popular mobilization in support of price controls was a major reason for the lack of progress in the anti-inflation fight.  The OPA was caught in a crossfire.  Labor was critical of the OPA because it was not doing enough to halt the inflation-driven decline in real wages.  Business was critical of the OPA because it was slowly moving (or being pushed) towards adoption of an effective, volunteer-supported price control system. 

Business opposition to OPA’s policy direction eventually led to Leon Henderson’s forced resignation in January 1943; he was replaced by Prentiss Brown.  Galbraith was also forced to resign not long after his change in heart about the use of volunteers.  The ongoing political struggle within the agency did little to improve its effectiveness.  Thus, until Roosevelt’s April 1943 Hold the Line order, prices continued to rise, labor activism continued to grow, and popular participation in price control remained limited. 

Part IV: Popular participation and the success of price control

In March 1943, Prentiss Brown, the newly appointed head of the OPA, called for price panels to be established on every War Price and Rationing Board and for volunteer price assistants to be recruited to assist the panels in their work.  At the time, Brown did not intend for these price assistants to play a major role in price control; he saw them primarily helping to distribute and explain OPA materials to retailers.  But, after Roosevelt’s Hold the Line order, he had little choice but to greatly increase their responsibilities. 

On May 7, the OPA instructed its boards to carry out a national grocery survey over the period May 12-13 to check for compliance with the newly created community price lists.  This survey effort immediately forced a rethinking and elevating of the role of the price assistants. They were assigned the job of checking that every grocery store displayed a poster showing its assigned category, a separate poster showing the dollars and cents price ceilings for stores in its category, and clear signs posted near each product on the community price list showing its selling price.  The assistants were also to check that the selling prices were no higher than the mandated ceiling prices.  If a store was found to be out of compliance the assistants were to inform the store manager and request that the problems be corrected. 

This first survey was far from complete.  The boards were not given enough time to prepare for it.  Moreover, less than one-third of the 5,500 boards had price panels, and fewer than 10 percent of those boards had price panel assistants. 

Brown called for another, more complete, survey of grocery stores no later than June 28.  By then, many more boards had established price panels and recruited price assistants.  The assistants were instructed to examine 12 designated food items, comparing their selling price with OPA’s ceiling price.  Not surprisingly, they found a large number of violations.  As required, panel assistants returned for follow-up visits the next week to determine whether corrections had been made.  The program proved to be a success; in June, for the first time in 30 months, the national cost of living index fell.

Despite the program’s overall success, there were still many areas without well-functioning price panels.  Therefore, in July 1943, Chester Bowles, the Connecticut OPA director, was brought to the national office to assist local boards in establishing effective price panels and recruiting and training price panel assistants. 

Bowles pressured the OPA’s nine regional administrators to work with their 93 district offices to quickly increase the number and diversity of volunteers working on price control.  No longer required to work exclusively with defense councils, the district offices began selecting new volunteers from lists drawn up by unions, women’s clubs, and consumer groups.  In reporting on the Connecticut experience, Bowles described the composition of that state’s board members in late 1943 as follows: “There are 156 farmers, 209 housewives, 43 insurance men, 20 doctors and 8 dentists, 47 nurses, 97 school teachers, 237 industrial workers, 32 attorneys, 53 engineers, 127 storekeepers and store clerks, 21 clergymen, 15 carpenters, 11 electricians, and 10 plumbers.”10

The price panels had tremendous responsibilities.  For example, they were supposed to educate all retail businesses, not just those selling food, about the government’s price regulations.  This meant that price panel members had to be informed about price regulations for laundries, department stores, grocery stores, restaurants, and many other establishments.  To carry out their educational responsibilities, panel members generally held meetings with the various retailers in their area to explain the rules and answer questions. 

Price panels were also charged with investigating cases of alleged business non-compliance.  The panels were instructed to investigate all consumer complaints.  If the panel felt that a violation had likely occurred, panel members would hold a conference, normally in the evening, where both the consumer and retailer could argue their respective positions.  If the panel concluded that a violation had taken place, the consumer had the right to collect any overcharge due, or sue in court for three times the overcharge or $50, whichever was larger.
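The damages rule just described reduces to a single comparison: the larger of $50 or triple the overcharge.  A minimal sketch, using invented overcharge amounts:

```python
def consumer_damages(overcharge: float) -> float:
    """Treble-damages rule described in the text: the consumer could sue
    for three times the overcharge or $50, whichever was larger."""
    return max(3 * overcharge, 50.0)

# A 5-cent grocery overcharge still supports the $50 statutory floor,
# while a $40 overcharge on a consumer durable supports a $120 claim.
print(consumer_damages(0.05))   # 50.0
print(consumer_damages(40.00))  # 120.0
```

The $50 floor explains why even the small food-price violations that made up most cases carried a real deterrent for retailers.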

Most of the time, the retailer, if found guilty by the panel, would sign a pledge acknowledging that a mistake had been made, promise to correct it, and pay the overcharge to the consumer.  In the case of overcharges on food items, which were the majority of cases, the amounts involved were small.  But since the panels also regulated prices for large consumer durable goods such as refrigerators, washing machines, and automobiles, sometimes the refunds could be substantial.   

If retailers refused to acknowledge their guilt or change their pricing habits, the board’s only recourse was to send a letter to the offending business threatening it with loss of license (after gaining approval from the district office) and then, if there was no response, ask its local Legal Division to take action.  Unfortunately, in many cases, the Legal Divisions proved unwilling to act. 

Conferences were also held after surveys to follow up on violations discovered by price assistants that were not corrected.  If the board determined that a retailer had violated the law, its only recourse was to collect the overcharge for the benefit of the US Treasury.  Boards were barred from suing for damages; only consumers were given this right. 

The legal sanctions available to the local boards were greatly strengthened in the 1944 legislation renewing the authority of the OPA.  Since few consumers used their legal right to sue, most retailers faced relatively small penalties for violating ceiling prices.  However, according to the terms of the new legislation, if a consumer did not sue within 30 days, then the local board, acting for the OPA Administrator, could assert the “Administrator’s claim” for damages.  This new legislation also allowed local boards to sue for damages when violations were discovered by price assistants. 

Although some boards were reluctant to use their new power, many were not.  The national record was impressive: from January 1945 to June 1946, a total of 71,050 sellers were required to pay over $5.1 million to the US Treasury.11

Most importantly, the volunteer price panel assistants were proving highly effective in stabilizing prices. OPA studies established that business compliance rose substantially the more frequently enterprises were surveyed.  However, there were still, even at the beginning of 1944, too few price assistants to carry out the desired monthly surveys of the country’s 600,000 food stores.  Moreover, the OPA was determined to undertake similar but less frequent surveys of restaurants and service outlets.  Therefore, Chester Bowles, who became head of the OPA in November 1943, called for expanding the number of price assistants to 125,000, each of whom was expected to work 3 to 4 hours a day, 2 to 3 days a week.12 

In an attempt to strengthen the survey effort, the OPA ordered an emergency price check of every food store in the country during a one-week period in March 1944.  Over 5,000 boards participated and some 450,000 food stores were visited, three times the number covered in any previous month.  A board-by-board examination of the survey results yielded “convincing evidence of the effect of continued store checking.  In areas where there were enough volunteers to make weekly visits the number of stores in violation dropped to 4 percent, and where there were no regular visits as many as 75 percent of the stores were found to be overcharging.”13

In April 1944, Bowles sent a letter to the President summarizing the success of OPA’s community pricing efforts.  He noted that while the prices of some items had continued to rise over the last year, in particular clothing, those of other items, in particular foods, had fallen.  In fact, “the cost of living as a whole is slightly lower than it was a year ago today.  This record—1 year of stable living costs—is unprecedented either in this war or the last war.”14

Part V: Business fights back

This success of the OPA’s volunteer-driven price control program set off alarm bells in the business community.  It was the Grocers’ Association, whose member stores were the main focus of OPA survey work, that organized and led the fight against the use of volunteer price checkers.  In fact, it was its “Grocer-Consumer Anti-inflation Campaign” that proved to be the critical turning point in the struggle over popular participation in price control. 

OPA surveys of grocery stores revealed that its price posters were often not displayed properly or at all.  Unable to achieve his target of 125,000 price assistants to force compliance, Bowles decided to outmaneuver the grocers by printing pocket-sized posters with the ceiling prices, and distributing them directly to consumers so that they could do their own checking.  The OPA printed up 2.5 million posters in March 1944 and sent them to a select group of regional directors for a test distribution. 

Most directors quickly distributed them by enlisting the assistance of various civic organizations.  However, one director resisted, claiming that distribution of the posters was an affront to the grocers.  Grocers and some OPA officials took up the charge and began pressing Bowles to drop his project.  Bowles refused.  He ordered the OPA to print 10 million price lists and send them to all regional directors as part of a national educational campaign.

The lists were sent, but came with no instructions for distribution or accompanying national campaign literature.  Although it is unclear who orchestrated it or how it was accomplished, a deal was struck between the grocers and the OPA without the knowledge of Bowles.  In a July 3, 1944 staff memo, Bowles was told that the OPA had agreed not to promote its June consumer price lists or print any new price lists for public distribution.  In return, the Grocers’ Association agreed to launch its own Grocer-Consumer Anti-inflation Campaign, which included a commitment by the association to ensure that grocers would display highly visible ceiling price posters of their own making.15 

The grocers did as they had promised.  Grocer committees were established in most cities and stores hung huge banners touting the campaign and colorful posters with ceiling prices.  However, while posting compliance rose noticeably, price compliance did not.  More importantly, the agreement weakened the OPA’s resolve to recruit more price panel assistants and marginalized the role of consumers.  As a result, popular interest in and support for the price control program declined. 

The OPA had long warned that inflation could easily explode with the end of the war.  Yet, in the months before its August 1945 conclusion, the national office did nothing to reinvigorate its volunteer effort in preparation for the next stage in price control.  In fact, it did quite the opposite: it abolished the price panel division and suspended all price surveys, even though price controls remained in place.  Moreover, the OPA national office made no attempt to communicate with its volunteers to explain its plans. 

Then, almost immediately after the war’s conclusion, President Truman ended restrictions on wage bargaining.  Eager to restore earnings lost under wartime wage restrictions, workers sought significant wage increases.  And, in line with the principles of wartime price control, a number of unions demanded that businesses absorb the costs of the higher wages out of their record profits and not pass them on to other working people through higher prices.  Business refused, and unions, no longer bound by their wartime no-strike pledge, responded with a massive strike wave. 

Had popular support for, and involvement in, price control been sustained, it might have been possible to create a worker-community alliance powerful enough to force business to yield.  Such a victory would have represented a significant restructuring of class relations.  Unfortunately, labor launched its struggle after the demobilization of community participation in price control. 

Truman finally acted to end the strike wave by allowing businesses to raise prices to compensate for paying higher wages.  The result was that union efforts to increase wages became pitted against the broader anti-inflation interests of the majority of working people.  While unionized workers did win substantial wage gains during the strike wave of 1945-46, their success proved short-lived.  The resulting inflation eventually erased their gains, leaving most working people worse off. 

Part VI. Lessons

Hundreds of thousands of working people served as volunteers in OPA’s price control program.  They worked on tasks that were essential to the government’s ability to manage the economy, and their efforts were largely successful.  Yet, this history of government reliance on volunteers for the implementation, monitoring, and enforcement of price controls is generally unknown. 

It would be a mistake to dismiss this history of volunteer participation as a unique wartime phenomenon of no contemporary relevance.  Of course, the war years were a period of great national emergency, which encouraged working people to generously offer their time in service to the nation.  However, it is also clear from the history of the period that working people were not mindless captives of wartime propaganda.  A case in point: many workers struck for higher wages despite wartime pleas by the president not to do so and laws that threatened punishment.  In other words, to the extent that working people volunteered their time to participate in OPA’s price control program, it was because they believed that the program was in their interest. 

Thus, this price control experience should encourage us to think more creatively about the relationship between greater democratic participation in the economy and effective national planning to meet popular needs.  Perhaps the obvious lesson we should draw from this history is that under the “right” political conditions, working people are indeed willing and able to work collectively, with minimal national supervision, to monitor and regulate private economic activity.    

A second lesson is that our criteria for evaluating public policies should include an assessment of their compatibility with, and potential to encourage, popular participation.  Those who supported OPA’s price control efforts, even those in the progressive community, tended to focus their praise or criticism on OPA’s system of price regulation.  Decades later, the Hold the Line period is still defined by the introduction of dollars and cents price ceilings.  However, what ensured the success of this control program was the administrative and monitoring work of volunteers.  In other words, the OPA’s price controls worked as the result of two complementary developments: the creation of a control system that allowed for popular participation and the mobilization of working people to enforce it. 

A third lesson is that activists seeking to promote a Green New Deal style transformation of the economy need to incorporate a critical understanding of both class interest and the contested nature of the state into their organizing work.  As the history of the war period makes clear, the business community was unwilling to surrender its privileges even during a time of “national” emergency.  One example: its strong opposition to meaningful price control.  Therefore, we must be careful not to encourage working people to think about policies or strategies of transformation in terms of a national interest that would promote uncritical alliances with the business community. 

Similarly, while many working people looked to the OPA for leadership in the fight against inflation, the OPA was itself internally divided and weakened because of the presence of business-friendly administrators.  Popular awareness of the structural advantages enjoyed by business in a capitalist economy, including their projection into the state policy-making process, needs to be encouraged.  However, acknowledging this reality should not lead us to conclude that state policy is unimportant or that state agencies are structurally unable to play an important role in social transformation.  The evolution of OPA’s price control program reveals that state policy can be influenced by the political strength of the working class. 

The last, and perhaps most important lesson from the history of this period, is that our organizing efforts for social transformation should be guided by a broad, class-based, worker-community perspective.  An examination of writings in the New Republic, The Nation, The Daily Worker, and AFL and CIO publications leaves no doubt that the progressive and labor communities of the time strongly supported the OPA and its efforts at price control.  In fact, these publications printed articles that openly discussed the ways in which the business community worked to undermine the OPA and called for the agency to purge its internal opponents and actively recruit working people to its cause as price panel members and price panel assistants. 

At the same time, I found no articles that considered or debated the broader political potential of the volunteer experience.  For example, there was no call for activists to work with those volunteering to help them develop a more radical perspective on the nature of the anti-inflation struggle or workings of capitalism.  This is unfortunate, because such efforts might have produced important results, including strengthened links between various working-class constituencies; greater political activism and new leadership skills among people not normally reached through the traditional activist channels of the time, especially women; and new thinking about economic alternatives. 

It is difficult to know why there was no strategic discussion about the volunteer experience.  Perhaps, it took place out of public view.  Or perhaps those writing did not see the price control struggle as offering significant potential for radicalization.  Or given the ongoing union and workplace struggles, they did not believe that activists had the organizational capacities necessary to extend their reach.  Or perhaps the lack of discussion was a result of sexism, with those writing dismissing the volunteer experience because the great majority of volunteers were women.

Regardless of the reason, hopefully the history presented here can encourage a greater appreciation for the institutional and class challenges involved in progressive policy-making, as well as more creative thinking about how to overcome them.


* This post is a revised and condensed version of Martin Hart-Landsberg, “Popular Mobilization and Progressive Policy Making: Lessons from World War II Price Control Struggles in the United States,” Science and Society, Vol 67, No 4 (2003).

1. Martin Hart-Landsberg, “Realizing a Green New Deal: Lessons from World War II,” Reports from the Economic Front, May 27, 2021.

2. Harold G. Vatter, The U.S. Economy in World War II (New York: Columbia University Press, 1985), p. 3.

3. Ibid., p. 16. 

4. Ibid., p. 20.

5. Irving Bernstein, “Americans in Depression and War,” in Richard M. Morris, editor, A History of the American Worker (Princeton, New Jersey: Princeton University Press, 1983), p. 182.

6. Joseph G. Rayback, A History of American Labor (New York: The Free Press, 1966), p. 380.

7. Harvey C. Mansfield, A Short History of OPA (Washington D.C.: Office of Price Administration, 1948), p. 56.

8. Imogene H. Putnam, Volunteers in OPA (Washington D.C.: Office of Price Administration, 1947), p. 21.

9. Ibid., p. 32.

10. As quoted in Ibid., pp. 83-4.

11. Ibid., p. 119.

12. Ibid., p. 157.

13. Ibid., p. 99.

14. As quoted in Ibid., p. 100.

15. Ibid., p. 112.

Learning from history: community-run child-care centers during World War II

We face many big challenges.  And we will need strong, bold policies to meaningfully address them.  Solving our child-care crisis is one of those challenges, and a study of World War II government efforts to ensure accessible and affordable high-quality child care points the way to the kind of bold action we need. 

The child care crisis

A number of studies have established that high-quality early childhood programs provide significant community and individual benefits.  One found that “per dollar invested, early childhood programs increase present value of state per capita earnings by $5 to $9.”  Universal preschool programs have also been shown to offer significant benefits to all children, even producing better outcomes for the most disadvantaged children than means-tested programs.  Yet, even before the pandemic, most families struggled with a lack of desirable child-care options.    

The pandemic has now created a child-care crisis. As Lisa Dodson and Mary King point out: “By some estimates, as many as 4.5 million child-care ‘slots’ may be permanently lost and as many as 40 percent of child-care providers say they will never reopen.”  The lack of child care is greatly hindering our recovery from the pandemic.  Women suffered far greater job losses than men during 2020, including as child-care workers, and the child-care crisis has made it difficult for many working mothers to return to the labor force.  The cost goes beyond the immediate family hardship from lost income; there is strong evidence that a sustained period without work, the so-called employment gap, will result in significantly lower lifetime earnings and reduced retirement benefits.  

To his credit, President Biden has recognized the importance of strengthening our care economy.  His proposed American Families Plan includes some $225 billion in tax credits to help make child care more affordable for working families.  According to a White House fact sheet, families would “receive a tax credit for as much as half of their spending on qualified child care for children under age 13, up to a total of $4,000 for one child or $8,000 for two or more children. . . . The credit can be used for expenses ranging from full-time care to after school care to summer care.”

But tax credits don’t ensure the existence of convenient, affordable, high-quality child-care facilities staffed by well-paid and trained child-care providers.  And if that is what we really want, we will need to directly provide it.  That is what the government did during World War II.  While its program was far from perfect, in part because it was designed to be short-term, it provides an example of the type of strong, bold action we will need to overcome our current child-care crisis. 

Federal support for child care

During World War II the United States government financed a heavily-subsidized child-care program.  From August 1943 through February 1946, the Federal Works Agency (FWA), using Lanham Act funds, provided some $52 million in grants for child-care services (equal to more than $1 billion today) to any approved community group that could demonstrate a war-related need for the service.  At its July 1944 peak, 3,102 federally subsidized child-care centers, with some 130,000 children enrolled, operated throughout the country.  There was at least one center in every state but New Mexico, which decided against participation in the program.  By the end of the war, between 550,000 and 600,000 children received some care from Lanham Act funded child-care programs.  

Communities were allowed to use the federal grant money to cover most of the costs involved in establishing and running their centers, including facilities construction and upkeep, staff wages and most other daily operating costs.  They were required to provide some matching funds, most of which came from fees paid by the parents of children enrolled in the program.  However, these fees were capped. In the fall of 1943, the FWA established a ceiling on fees of 50 cents per child per day (about $7 now), which was raised to 75 cents in July 1945. And those fees included snacks, lunch, and in some cases dinner as well. Overall, the federal subsidy covered two-thirds of the total maintenance and operation of the centers.
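The funding split described above can be illustrated with a rough budget sketch.  The 50-cent daily fee cap and the two-thirds federal share come from the text; the enrollment, open days, and operating-cost figures are invented for illustration.

```python
# Illustrative monthly budget for a hypothetical Lanham Act center.
# The fee cap (50 cents per child per day, fall 1943) and the two-thirds
# federal share are from the historical record; all other numbers here
# are assumptions for the sake of the example.

FEE_CAP = 0.50          # dollars per child per day (fall 1943 ceiling)
FEDERAL_SHARE = 2 / 3   # federal share of maintenance and operation

def monthly_budget(children: int, open_days: int, operating_cost: float):
    """Return (federal_subsidy, max_fee_revenue) for one month."""
    federal = round(FEDERAL_SHARE * operating_cost, 2)
    fees = round(children * open_days * FEE_CAP, 2)
    return federal, fees

# A hypothetical center with 40 children, open 26 days, costing $1,500/month:
federal, fees = monthly_budget(40, 26, 1500.0)
print(federal, fees)  # 1000.0 520.0
```

Even at the capped maximum, parent fees in this sketch fall well short of total costs, which is exactly why the program's viability depended on the federal grant covering the balance.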

The only eligibility requirement for enrollment was a mother’s employment status: she had to be working at a job considered important to the war effort, and this was not limited to military production. Center hours varied, but many accommodated the round-the-clock manufacturing schedule, staying open 24 hours a day, 6 days a week. 

The centers served preschoolers (infants, toddlers, and children up to 5 years of age) and school-age children (6 to 14 years of age). In July 1944, approximately 53,000 preschoolers and 77,000 school-age children were enrolled.  School-age enrollment always climbed during summer vacation.  However, in most months, preschoolers made up the majority of the children served by Lanham Act-funded centers. Enrollment of preschoolers peaked at some 74,000 in May 1945. 

Some 90 percent of the centers were housed in public schools, with newly constructed housing projects providing the next most used location. Although local school boards were free to decide program standards–including staff-child ratios, worker qualifications, and facility design–state boards of education were responsible for program supervision. The recommended teacher-child ratio was 10-to-1, and most centers complied.  According to Chris M. Herbst,

Anecdotal evidence suggests that preschool-aged children engaged in indoor and outdoor play; used educational materials such as paints, clay, and musical instruments; and took regular naps. . . . Programs for school-aged children included . . . outdoor activities, participation in music and drama clubs, library reading, and assistance with schoolwork. 

Children at a child-care center sit for “story time.” (Gordon Parks / Library of Congress / The Crowley Company)

While quality did vary–largely the result of differences in community support for public child care, the willingness of cities to provide additional financial support, and the ability of centers to hire trained professionals to develop and oversee program activities–the centers did their best to deliver a high-quality childhood education.  As Ruth Pearson Koshuk, the author of a 1947 study of the developmental records of 500 children, 2 to 5 years of age, at two Los Angeles County centers, describes:

In these two . . . schools, as elsewhere, the program has developed since 1943, toward recognized standards of early childhood education. The aim has been to apply the best of existing standards, and to maintain as close contact with the home as possible. In-service training courses carrying college credit have been given, for the teaching staff, and a mutually helpful parent education program carried on in spite of difficulties inherent in a child care situation.

There has been a corresponding development in the basic records. A pre-entrance medical examination has been required by state law since the first center opened. In December 1943 a developmental record was added, which is filled out by the director during an unhurried interview with the mother just before a child enters. One page is devoted to infancy experience; the four following cover briefly the child’s development history, with emphasis on emotional experience, behavior problems he has presented to the parents, if any, and the control methods used, as well as the personal-social behavior traits which they value and desire for the child. After entrance, observational notes and semester reports are compiled by the teachers. Intelligence testing has been limited to cases where it seemed especially indicated. A closing record is filled out, in most cases, by the parent when a child is withdrawn. These records are considered a minimum. They have proved indispensable as aids to the teachers in guiding the individual children and as a basis for conferences on behavior in the home.

A 2013 study of the long-term effects on mothers and children from use of Lanham centers found a substantial increase in maternal employment, even five years after the end of the program, and “strong and persistent positive effects on well-being” for their children.

In short, despite many shortcomings, these Lanham centers, as Thalia Ertman sums up,

broke ground as the first and, to date, only time in American history when parents could send their children to federally-subsidized child care, regardless of income, and do so affordably. . . .

Additionally, these centers are seen as historically important because they sought to address the needs of both children and mothers. Rather than simply functioning as holding pens for children while their mothers were at work, the Lanham child care centers were found to have a strong and persistent positive effect on the well-being of children.

The federal government also supported some private employer-sponsored child care during the war. The most well-known example is the two massive centers built by the Kaiser Company in Portland, Oregon to provide child care for the children of workers at their Portland Yards and Oregon Shipbuilding Corporation. The centers were located right at the front of the shipyards, making it easy for mothers to drop their children off and pick them up, and were operated on a 24-hour schedule.  They were also large, each caring for up to 1,125 children between 18 months and 6 years of age. The centers had their own medical clinic, cafeteria, and large play areas, and employed highly trained staff.  Parents paid $5 for a six-day week for one child and $3.75 for each additional child.  For a small additional fee, the centers also prepared a small dinner for parents to pick up at the end of their working day.

While the Kaiser Company received much national praise as well as appreciation from its employees with young children, these centers were largely paid for by the government.  Government funds directly paid for their construction, and a majority of the costs of running the centers, including staff salaries, were included in the company’s cost-plus contracting with the military.

Political dynamics

There was considerable opposition to federal financing of group child care, especially for children younger than 6 years of age.  The sentiment is captured in this quote from a 1943 New York Times article: “The worst mother is better than the best institution when it is a matter of child care, Mayor La Guardia declared.”  Even the War Manpower Commission initially opposed mothers with young children working outside the home, even in service of the war effort, stating that “The first responsibility of women with young children, in war as in peace, is to give suitable care in their own homes to their children.”

But on-the-ground realities made this an untenable position for both the government and business. Women sought jobs, whether out of economic necessity or patriotism.  The government, highlighted by its Rosie the Riveter campaign, was eager to encourage their employment in industries producing for the war effort.  And, despite public sentiment, a significant number of those women were mothers with young children. 

Luedell Mitchell and Lavada Cherry working at a Douglas Aircraft plant in El Segundo, Calif. Circa 1944. Credit: National Archives photo no. 535811

The growing importance of women in the workplace, and especially mothers with young children, is captured in employment trends in Portland, Oregon.  Women began moving into the defense workforce in great numbers starting in 1942, with the number employed in local war industries climbing from 7,000 in November 1942 to 40,000 in June 1943.  An official with the state child-care committee reported that “a check of six shipyards reveals that the number of women employed in the shipyards has increased 25 percent in one month and that the number is going to increase more rapidly in the future.” 

The number of employed mothers was also rapidly growing.  According to the Council of Social Agencies, “Despite the recommendations of the War Manpower Commission . . . thousands of young mothers in their twenties and thirties have accepted jobs in war industries and other businesses in Multnomah County. Of the 8,000 women employed at the Oregon Shipyards in January, 1943, 32 percent of them had children, 16 percent having pre-school children.”

Portland was far from unique.  During the war, for the first time, married women workers outnumbered single women workers.  Increasingly, employers began to recognize the need for child care to address absenteeism problems.  As a “women’s counselor” at the Bendix Aviation Corporation in New Jersey explained to reporters in 1943, child care was one of the biggest concerns for new hires. “We feel a mother should be with her small baby if possible. But many of them have to come back. Their husbands are in the service and they can’t get along on his allotment.”  Media stories, many unsubstantiated, of children left in parked cars outside workplaces or fending for themselves at home, also contributed to a greater public acceptance of group child care. 

An image of Rosie the Riveter that appeared in a 1943 issue of the magazine Hygeia

Finally, the government took action.  The Federal Works Agency was one of two new super agencies established in 1939 to oversee the large number of agencies created during the New Deal period.  In 1940 President Roosevelt signed into law the Lanham Act, which authorized the FWA to fund and supervise the construction of needed public infrastructure, such as housing, hospitals, water and sewer systems, police and firefighting facilities, and recreation centers, in communities experiencing rapid growth because of the defense buildup. In August 1942, the FWA decided, without any public debate, that public infrastructure also meant child care, and it began its program of support for the construction and operation of group child-care facilities.

The Federal Security Agency, the other super agency, whose oversight responsibilities included the Children’s Bureau and the U.S. Office of Education, opposed the FWA’s new child-care initiative.  It did so not only because it believed that child care fell under its mandate, but also because the leadership of the Children’s Bureau and Office of Education opposed group child care.  The FWA won the political battle, and in July 1943, Congress authorized additional funding for the FWA’s child-care efforts. 

And, as William M. Tuttle, Jr. describes, public pressure played an important part in the victory:

the proponents of group child care organized a potent lobbying effort. The women’s auxiliaries of certain industrial unions, such as the United Electrical Workers and the United Auto Workers, joined with community leaders and FWA officials in the effort. Also influential were the six women members of the House of Representatives. In February 1944, Representative Mary T. Norton presented to the House “a joint appeal” for immediate funds to expand the wartime child day care program under the FWA.

Termination and a step back

Congressional support for group child care was always tied to wartime needs, a position shared by most FWA officials.  The May 1945 Allied victory in Europe brought a drop in war production, and a reduction in FWA community child care approvals and renewals.  In August, after the Japanese surrender brought the war to a close, the FWA announced that it would end its funding of child-care centers as soon as possible, but no later than the end of October 1945.

Almost immediately thousands of individuals wrote letters, sent wires, and signed petitions calling for the continuation of the program.  Officials in California, the location of many war-related manufacturing sites and nearly 25 percent of all children enrolled in Lanham Act centers in August 1945, also weighed in, strongly supporting the call.  Congress yielded, largely influenced by the argument that since it would be months before all the “men” in the military returned to the country, mothers had no choice but to continue working and needed the support of the centers to do so.  It approved new funds, but only enough to keep the centers operating until the end of February 1946.

The great majority of centers closed soon after the termination of federal support, with demonstrations following many of the closings.  The common assumption was that women would not mind the closures, since most would be happy to return to homemaking.  Many women were, in fact, forced out of the labor force, disproportionately suffering from post-war industrial layoffs.  But by 1947, women’s labor force participation was again on the rise and a new push began for a renewal of federal support for community child-care centers. Unfortunately, the government refused to change its position. During the Korean War, Congress did approve a public child-care bill, but then it refused to authorize any funding.

After WWII, parents organized demonstrations, like this one in New York on Sept. 21, 1947, calling for the continuing funding of the centers. The city’s welfare commissioner dismissed the protests as “hysterical.” Credit: The New York Times

Finally, in 1954, as Sonya Michel explains, “Congress found an approach to child care it could live with: the child-care tax deduction.”  While the child-care tax deduction did offer some financial relief to some families, it did nothing to ensure the availability of affordable, high-quality child care.  The history of child care during World War II makes clear that this turn to market-based tax policy to solve child-care problems represented a big step back for working women and their children.  And this was well understood by most working people at the time. 

Sadly, this history has been forgotten, and Biden’s commitment to expand the child-care tax credit is now seen as an important step forward.  History shows we can and need to do better.

Realizing a Green New Deal: Lessons from World War II

Many activists in the United States support a Green New Deal transformation of the economy in order to tackle the escalating global climate crisis and the country’s worsening economic and social problems.  At present, the Green New Deal remains a big tent idea, with advocates continuing to debate what it should include and even its ultimate aims.[1]  Although perhaps understandable given this lack of agreement, far too little attention has been paid to the process of transformation.  That is concerning, because it will be far from easy.

One productive way for us to sharpen our thinking about the transformation is to study the World War II-era mobilization process. Then, the U.S. government, facing remarkably similar challenges to the ones we are likely to confront, successfully converted the U.S. economy from civilian to military production in a period of only three years.

It is easy to provide examples of some of the challenges that await us.  All Green New Deal proposals call for a sharp decrease in fossil fuel production, which will dramatically raise fossil fuel prices.  The higher cost of fossil fuels will significantly raise the cost of business for many industries, especially air travel, tourism, and the aerospace and automobile industries, triggering significant declines in demand and reductions in their output and employment.   We will need to develop a mechanism that allows us to humanely and efficiently repurpose the newly created surplus facilities and provide alternative employment for released workers.

New industries, especially those involved in the production of renewable energy will have to be rapidly developed.  We will need to create agencies capable of deciding the speed of their expansion as well as who will own the new facilities, how they will be financed, and how best to ensure that the materials they require will be produced in sufficient quantities and made available at the appropriate time. We will also have to develop mechanisms for deciding where the new industries will be located and how to develop the necessary social infrastructure to house and care for the required workforce.  

We will also need to ensure the rapid and smooth expansion of facilities capable of producing mass transit vehicles and a revitalized national rail system.  We will need to organize the retrofitting of existing buildings, both office and residential, as well as the training of workers and the production of required equipment and materials.  The development of a new universal health care system will also require the planning and construction of new clinics and the development of new technologies and health practices.  In sum, a system-wide transformation involves a lot of moving parts that have to be managed and coordinated.

While it would be a mistake to imagine that the U.S. wartime experience can provide a readymade blueprint for the economic conversion we seek, there is much we can learn, both positive and negative, from it.  In what follows, I first highlight some of the key lessons and then conclude with a brief discussion of the relevance of the World War II experience to our current efforts to transform the U.S. economy.

1. A rapid, system-wide conversion of the U.S. economy is possible 

The primary driver of the wartime conversion was the enormous increase in military spending over the years 1940-1943.  Military spending grew by an incredible 269.3 percent in 1941, 259.7 percent in 1942, and 99.5 percent in 1943.  As a consequence, military spending as a share of GDP rose from 1.6 percent in 1940 to 32.2 percent in 1943.  That last year, federal spending hit a record high of 46.6 percent of GDP and remained at over 41 percent of GDP in each of the following two years.[2] 

The results were equally impressive: the combined output of the war-related manufacturing, mining, and construction industries doubled between 1939 and 1944.[3] In 1943 and 1944 alone, the United States was responsible for approximately 40 percent of all the munitions produced during World War II. 

This record has led many to call what was accomplished a “production miracle.”  However, a more complete assessment of the period tells a different story.  For example, there is little difference between the years 1921-1924 and 1941-1944 in either the growth of industrial production or the growth in real gross nonfarm product.[4] 

Paul A. C. Koistinen casts further doubt on production miracle claims, pointing out that:

When placed in the proper context, the American production record does not appear exceptional, unless the characterization applies to all other belligerents. Gauged by the percentage distribution of the world’s manufacturing production for the period 1926-1929, the United States in the peak year 1944 was producing munitions at almost exactly the level it should have been.  Great Britain is modestly high, Canada low, Germany high, Japan very high, and the Soviet Union spectacularly high.[5]

The explanation for these two significantly different views of the period is that the transformation involved far more than the increase in military spending.  There was also the curtailment or outright suppression of the production of many industries, the rationing of limited supplies of many goods, and the development and production of entirely new goods and services.  For example, civilian automobile production was stopped, tires and food were rationed, and synthetic rubber was created and produced in significant amounts.  Between 1940 and 1944, the total production of non-war goods and services actually fell by more than 10 percent, from $221.7 billion to $198.9 billion (in 1958 dollars).[6]

In other words, the tremendous gains in U.S. military production were achieved, and in a relatively short period of time, not because of some impossible-to-repeat production miracle, but because a government-directed mobilization succeeded in fully employing the country’s resources while shifting their use from civilian to military purposes. 

2. State capacities and action matter

The economy’s successful transformation demonstrates the critical importance of state planning, public financing and ownership, and state direction of economic activity.  Mobilization officials faced two major tasks. The first was to quickly expand the economy’s capacity to produce the weapons and supplies required by the military.  The second was to manage the scarcities of critical materials and components caused by the rapid pace of the mobilization. 

The first task was made significantly more difficult by a lack of corporate support.  Most corporations were reluctant to undertake the massive expansion in plant and equipment required to achieve the desired boost in military production. In fact, private investment actually fell in value over the years 1941-43.  It was the federal government, using a variety of new policy initiatives, that provided the solution.

One of the most important initiatives was the creation of the Defense Plant Corporation (DPC). In May 1940, Congress passed a series of amendments which allowed the still operating depression-era Reconstruction Finance Corporation (RFC) to create new subsidiaries “with such powers as it may deem necessary to aid the Government of the United States in its national defense program.”  The DPC was one of those new subsidiaries. 

Since the RFC had independent borrowing authority, the DPC was able to directly finance the expansion of facilities deemed critical to the military buildup without needing Congressional approval.  The DPC kept ownership of the new facilities it financed, but planned the construction with and then leased the new facilities for a minimal fee to predetermined contractors who would operate them. The DPC eventually financed and owned some one-third of all the plant and equipment built during the war.

By its termination at the end of June 1945, the DPC:

owned approximately 96 per cent of the capacity of the synthetic-rubber industry, 90 per cent of magnesium metal, 71 per cent of aircraft and aircraft engines, and 58 per cent of the aluminum metal industry. It also had sizeable investments in iron and steel, aviation gasoline, ordnance, machinery and machine tool, transportation, radio, and other more miscellaneous facilities.[7]

The DPC supported facilities expansion in other ways too.  Responding to concerns of shortages in machine tools and the industry’s reluctance to boost capacity to produce them, the DPC began a machine tool pool program.  The DPC gave machine tool producers a 30 percent advance to begin production.  If the producers found a private buyer, they returned the advance.  If they found no buyer, the DPC would pay them full price and put the machine tool in storage for later sale.  This program proved remarkably successful in boosting machine tool production and, with machine tools readily available, speeding up weapons production.[8]

The second task, the timely delivery of scarce materials to military and essential civilian producers, was accomplished thanks to the efforts of the War Production Board (WPB), the country’s primary wartime mobilization agency.  In late 1942, after considerable experimentation, it launched its Controlled Materials Plan (CMP).  The plan required key claimants, such as the Army, the Navy, and the Maritime Commission, to provide detailed descriptions of their projected programs and the quantities of essential controlled metals required to realize them, with a monthly production schedule for the upcoming year.  The WPB industry divisions responsible for these metals would then estimate their projected supply and decide the amount of each metal to be allocated to each claimant following WPB policy directives.  The claimants would then adjust their programs accordingly and assign their metal shares to their prime contractors who were then responsible for assigning supplies to their subcontractors. 

When, over time, a shortage of components replaced the shortage of metals as the most serious bottleneck to military production, the WPB introduced another program.  The newly established Production Executive Committee created a list of 34 critical components.  One of its subcommittees, working in concert with the CMP process, would then arrange for essential manufacturers to receive all their required scarce materials and components. 

3. Flexibility is important

Flexibility in both planning structures and mobilization policies was critical to the success of the conversion.  President Roosevelt began the mobilization process in May 1940, with an executive order reactivating the World War I-era National Defense Advisory Commission (NDAC).  In December 1940, he replaced the NDAC with the Office of Production Management (OPM). Then, in August 1941, he created the Supply Priorities and Allocation Board (SPAB) and placed it over the OPM with the charge of developing a long-term mobilization strategy and overseeing OPM’s work.  And finally, again using an executive order, he established the War Production Board (WPB) in January 1942, replacing both the OPM and the SPAB.  

All three agencies, the NDAC, OPM, and WPB, relied heavily on divisions overseeing industrial sections to carry out their responsibilities.  The NDAC had 7 divisions: Industrial Production, Industrial Materials, Labor, Price Stabilization, Farm Products, Transportation, and Consumer Protection.  The first two were the most important.

The Industrial Production Division had 8 sections, the most important being aircraft; ammunition and light ordnance; and tanks, trucks, and tractors. The Industrial Materials Division had three subdivisions, each with its own sections: the mining and minerals products subdivision had sections for iron and steel, copper, aluminum, and tin; the agricultural and forest products subdivision had sections for textiles, leather, paper, rubber, and the like; and the chemical and allied products subdivision had sections for petroleum, nitrogen, etc. 

Each division, subdivision, and section had an appointed head, and each section head had an industry advisory committee to assist them. The divisions, subdivisions, and sections were responsible, as appropriate, for assessing the industrial capacities of their respective industries to meet present and projected military needs, facilitating military procurement activity, and assisting with plant expansion plans and the priority distribution and allocation of scarce goods.

When Roosevelt felt that an existing mobilization agency was not up to the task of furthering the war effort, he replaced it.  Accordingly, each new mobilization agency had a more centralized decision-making structure, broader responsibilities, and greater authority over private business decisions than its predecessor. 

Thus, the OPM, reflecting a different stage in the mobilization, was more narrowly focused on production and had only four divisions: Production Division, Purchases Division, Priorities Division, and Labor Division.  Later, in recognition of the spillover effects of military production on civilian production, the Civilian Supply Division was added and given responsibility for all industries producing 50 percent or less for the defense program. 

The WPB had six divisions: Production Division, Materials Division, Division of Industry Operations, Purchases Division, Civilian Supply Division, and Labor Division.  The newly created Division of Industry Operations included all nonmunitions-producing industries and had responsibility for promoting the conversion of industries to military production and for maximizing the flow of materials, equipment, and workers to essential producers.   

4.  Conversion means conflict

Powerful corporations and the military opposed policies that threatened their interests even when those policies benefitted the war effort.  Corporations producing goods of direct importance to the military often refused to undertake needed investments.  Corporations producing for the civilian market routinely ignored agency requests that they curtail or convert their production to economize on the nonmilitary use of scarce materials.

By late 1940, this corporate resistance had begun to cause shortages, especially of strategic materials.  Aluminum was one of those materials, and Alcoa, the only major producer of the metal, aggressively resisted expanding its production capacity even though a lack of aluminum was causing delays in military aircraft production.  A similar situation existed with steel, with steel executives arguing that there was no need for capacity expansion while critical activities such as shipbuilding and railroad car manufacturing ground to a halt because of a lack of supply.[9]

This growing shortage problem, and its threat to the military buildup, could have been minimized if large producers of consumer durables had been willing to either reduce their production or convert to military production. But almost all of them rebuffed NDAC entreaties. They were enjoying substantial profits for the first time in years and were unwilling to abandon their civilian markets. 

The industry that drew the most criticism because of its heavy resource use was the automobile industry. In 1939, the automobile industry “absorbed 18 percent of total national steel output, 80 percent of rubber, 34 percent of lead, nearly 10-14 percent of copper, tin, and aluminum, and 90 percent of gasoline. Throughout 1940 and 1941, automobile production went up, taking proportionately even more materials and products indispensable for defense preparation.”[10]

In some cases, this corporate opposition to policies that threatened their profits lasted deep into the war years, with some firms objecting not only to undertaking their own expansion but to any government financed expansion as well, out of fear of post-war overproduction and/or loss of market share.  This stance is captured in the following exchange between Senator E. H. Moore of Oklahoma and Interior Secretary and Petroleum Administrator for War Harold L. Ickes at a February 1943 Congressional hearing over the construction of a federally financed petroleum pipeline from Texas to the East Coast:

Secretary Ickes. I would like to say one thing, however. I think there are certain gentlemen in the oil industry who are thinking of the competitive position after the war.

The Chairman. That is what we are afraid of, Mr. Secretary.

Secretary Ickes. That’s all right. I am not doing that kind of thinking.

The Chairman. I know you are not.

Secretary Ickes. I am thinking of how best to win this war with the least possible amount of casualties and in the quickest time.

Senator Moore. Regardless, Mr. Secretary, of what the effect would be after the war? Are you not concerned with that?

Secretary Ickes. Absolutely.

Senator Moore. Are you not concerned with the economic situation with regard to existing conditions after the war?

Secretary Ickes. Terribly. But there won’t be any economic situation to worry about if we don’t win the war.

Senator Moore. We are going to win the war.

Secretary Ickes. We haven’t won it yet.

Senator Moore. Can’t we also, while we are winning the war, look beyond the war to see what the situation will be with reference to –

Secretary Ickes (interposing). That is what the automobile industry tried to do, Senator. It wouldn’t convert because it was more interested in what would happen after the war. That is what the steel industry did, Senator, when it said we didn’t need any more steel capacity, and we are paying the price now. If decisions are left with me, it is only fair to say that I will not take into account any post-war factor—but it can be taken out of my hands if those considerations are paid attention to.[11]

Military procurement agencies, determined to maintain their independence, also greatly hindered government efforts to ensure a timely flow of resources to essential producers by actively opposing any meaningful oversight or regulation of their activities. Most importantly, the procurement agencies refused to adjust their demand for goods and services to the productive capacity of the economy. Demanding more than the economy could produce meant that shortages, dislocations, and stockpiling were unavoidable. The Joint Chiefs of Staff actually ignored several WPB requests to form a joint planning committee. 

David Kennedy provides a good sense of what was at stake:

As money began to pour into the treasury, contracts began to flood out of the military purchasing bureaus—over $100 billion worth in the first six months of 1942, a stupefying sum that exceeded the value of the entire nation’s output in 1941 . . . Military orders became hunting licenses, unleashing a jostling frenzy of competition for materials and labor in the jungle of the marketplace.  Contractors ran riot in a cutthroat scramble for scarce resources.[12]

It took until late 1942 for the WPB to win what became known as the “feasibility dispute,” after which the military’s procurement agencies grudgingly took the economy’s ability to produce into account when making their procurement demands.

5. Class matters

Leading corporations and their executives took advantage of every opportunity to shape the wartime mobilization process and strengthen their post-war political and economic power.  Many of the appointed section heads responsible for implementing mobilization policies were so-called “dollar-a-year men” who remained employed by the very firms they were supposed to oversee.  And most of these section heads relied on trade association officials as well as industry advisory committees to help them with their work.  In some cases, trade association officials themselves served as section heads of the industries they were hired to represent.  These appointments gave leading corporations an important voice in decisions involving the speed and location of new investments, the timing and process of industry conversions, procurement contract terms and procedures, the use of small businesses as subcontractors, the designation of goods as scarce and thus subject to regulation, the role of unions in shopfloor production decisions, and labor allocation policies.

NDAC officials initially welcomed the participation of dollar-a-year men on the grounds that business executives knew best how to organize and maximize production. However, they soon found many of these executives speaking out against agency policies in defense of corporate interests.  In response, the OPM created a Legal Division and empowered it to write and implement regulations designed to limit the number and power of dollar-a-year men, but to little avail.  As the agency’s responsibilities grew, so did the number of dollar-a-year men working for it.

Little changed under the WPB.  In fact, between January and December 1942, their number grew from 310 to a wartime high of 805, driven in large part by the explosion in the number of industry advisory committees.[13] The WPB’s continued dependence on these nominally paid business executives was a constant source of concern in Congress.

Corporate leaders also never lost sight of what was to them the bigger picture, the post-war balance of class power.  Thus, from the very beginning of the wartime mobilization, they actively worked to win popular identification of democracy with corporate freedom of action and totalitarianism with government planning and direction of economic activity.

As J.W. Mason illustrates:

Already by 1941, government enterprise was, according to a Chamber of Com­merce publication, “the ghost that stalks at every business conference.” J. Howard Pew of Sun Oil declared that if the United States abandoned private ownership and “supinely reli[es] on government control and operation, then Hitlerism wins even though Hitler himself be defeated.” Even the largest recipients of military contracts regarded the wartime state with hostility. GM chairman Alfred Sloan—referring to the danger of government enterprises operating after war—wondered if it is “not as essential to win the peace, in an eco­nomic sense, as it is to win the war, in a military sense,” while GE’s Philip Reed vowed to “oppose any project or program that will weaken” free enterprise.[14]

Throughout the war, business leaders and associations “flooded the public sphere with descriptions of the mobilization effort in which for-profit companies figured as the heroic engineers of a production ‘miracle’.”  For example, Boeing spent nearly a million dollars a year on print advertising in 1943-45, almost as much as it set aside for research and development.

The National Association of Manufacturers (NAM) was one of the most active promoters of the idea that it was business, not government, that was winning the war against state totalitarianism.  It did so by funding a steady stream of films, books, tours, and speeches.  Mark R. Wilson describes one of its initiatives:

One of the NAM’s major public-relations projects for 1942, which built upon its efforts in radio and print media, was its “Production for Victory” tour, designed to show that “industry is making the utmost contributions toward victory.” Starting the first week in May, the NAM paid for twenty newspaper reporters to take a twenty-four-day, fifteen-state trip during which they visited sixty-four major defense plants run by fifty-eight private companies. For most of May, newspapers across the country ran daily articles related to the tour, written by the papers’ own reporters or by one of the wire services. The articles’ headlines included “Army Gets Rubber Thanks to Akron,” “General Motors Plants Turning Out Huge Volume of War Goods,” “Baldwin Ups Tank Output,” and “American Industry Overcomes a Start of 7 Years by Axis.”[15]

The companies and reporters rarely mentioned that almost all of these new plants were actually financed, built, and owned by the government, or that it was thanks to government planning efforts that these companies received needed materials on a timely basis and had well-trained and highly motivated workers.  Perhaps not surprisingly, government and union efforts to challenge the corporate story were never as well funded, sustained, or shaped by as clear a class perspective.[16]  As a consequence, they were far less effective.

6. Final thoughts

Although the World War II-era economic transformation cannot and should not serve as a model for a Green New Deal transformation of the U.S. economy, it does provide lessons that deserve to be taken seriously.  Among the most important is that a rapid, system-wide transformation, such as the one a Green New Deal requires, can be achieved in a timely manner.  It will take the development of new state capacities and flexible policies.  And we should expect, from the beginning, that our efforts to create a more socially just and environmentally sustainable economy will be met by sophisticated opposition from powerful corporations and their allies.

The conversion history also points to some of our biggest challenges. Germany’s military victories in Europe as well as Japan’s direct attack on the United States encouraged popular support for state action to convert the economy from civilian to military production. In sharp contrast, widespread support for state action to combat climate change or restrict corporate freedom of action does not yet exist. Even now, there are many who deny the reality of climate change.  There is also widespread doubt about the ability of government to solve problems. This means we have big work ahead to create the political conditions supportive of decisive action to transform our economy.

Perhaps equally daunting, we have no simple equivalent to the military during World War II to drive a Green New Deal transformation.  The wartime mobilization was designed to meet the needs of the military.  Thus, the mobilization agencies generally treated military procurement demands as marching orders.  In contrast, a Green New Deal transformation will involve changes to many parts of our economy, and our interest in a grassroots democratic restructuring process means there needs to be popular involvement in shaping the transformation of each part, as well as the connections between them. We therefore face the difficult task of creating the organizational relationships and networks required to bring together leading community representatives, and produce, through conversation and negotiation, a broad roadmap of the process of transformation we collectively seek.

And finally, we must confront a corporate sector that is far more powerful and popular now than it was during the period of the war.  And thanks to the current freedom corporations enjoy to shift production and finance globally, they have a variety of ways to blunt or undermine state efforts to direct their activities.

In sum, achieving a Green New Deal transformation will be far from easy. It will require developing a broad-based effort to educate people about how capitalism is driving our interrelated ecological and economic crises, building a political movement for system-wide change anchored by a new ecological understanding and vision, and creating the state and community-based representative institutions needed to initiate and direct the desired Green New Deal transformation. 

It is that last task that makes a careful consideration of the World War II-era conversion so valuable.  By studying how that rapid economy-wide transformation was organized and managed, we are able to gain important insights into, and the ability to prepare for, some of the challenges and choices that await us on the road to the new economy we so badly need.


[1] These include debates over the speed of change, the role of public ownership, and the use of nuclear power for energy generation.  There are also environmentalists who oppose the notion of sustained but sustainable growth explicitly embraced by many Green New Deal supporters and argue instead for a policy of degrowth, or a “Green New Deal without growth.”

[2] Christopher J. Tassava, “The American Economy during World War II,” EH.Net Encyclopedia, edited by Robert Whaples, February 10, 2008.

[3] Harold G. Vatter, The U.S. Economy in World War II (New York: Columbia University Press, 1985), 23.

[4] Vatter, The U.S. Economy in World War II, 22.

[5] Paul A. C. Koistinen, Arsenal of World War II: The Political Economy of American Warfare, 1940-1945 (Lawrence, Kansas: University Press of Kansas, 2004), 498.

[6] Hugh Rockoff, “The United States: From Ploughshares to Swords,” in Mark Harrison, editor, The Economics of World War II (New York: Cambridge University Press, 1998), 83.

[7] Gerald T. White, “Financing Industrial Expansion for War: The Origin of the Defense Plant Corporation Leases,” The Journal of Economic History, Vol. 9, No. 2 (November, 1949), 158.

[8] Andrew Bossie and J.W. Mason, “The Public Role in Economic Transformation: Lessons from World War II,” The Roosevelt Institute, March 2020, 9-10.

[9] Maury Klein, A Call to Arms: Mobilizing America for World War II (New York: Bloomsbury Press, 2013), 165.

[10] Koistinen, Arsenal of World War II, 130.

[11] As quoted in Vatter, The U.S. Economy in World War II, 24-25.

[12] As quoted in Klein, A Call to Arms, 376.

[13] Koistinen, Arsenal of World War II, 199.

[14] J.W. Mason, “The Economy During Wartime,” Dissent Magazine, Fall 2017.

[15] Mark R. Wilson, Destructive Creation: American Business and the Winning of World War II (Philadelphia: University of Pennsylvania Press, 2016), 102.

[16] Union suggestions for improving the overall efficiency of the mobilization effort as well as their offers to join with management in company production circles were routinely rejected. See Martin Hart-Landsberg, “The Green New Deal and the State: Lessons from World War II,” Against the Current, No. 207 (July/August 2020); Paul A. C. Koistinen, “Mobilizing the World War II Economy: Labor and the Industrial-Military Alliance,” Pacific Historical Review, Vol. 42, No. 4 (November 1973); and Nelson Lichtenstein, Labor’s War at Home: The CIO in World War II (New York: Cambridge University Press, 1982).

The latest argument against federal relief: business claims that workers won’t work

A growing number of business and political leaders have found yet another argument to use against federal pandemic relief programs, especially those that provide income support for workers: they hurt the economic recovery by encouraging workers not to work.

In the words of Senate Minority Leader Mitch McConnell, as reported by Business Insider:

“We have flooded the zone with checks that I’m sure everybody loves to get, and also enhanced unemployment,” McConnell said from Kentucky. “And what I hear from businesspeople, hospitals, educators, everybody across the state all week is, regretfully, it’s actually more lucrative for many Kentuckians and Americans to not work than work.”

He went on: “So we have a workforce shortage and we have raising inflation, both directly related to this recent bill that just passed.”

In line with business claims that they can’t find willing workers despite their best efforts at recruitment, the governors of Montana, South Carolina, Alabama, Arkansas, and Mississippi have all announced that they will no longer allow the unemployed in their respective states to collect the $300-a-week federal supplemental unemployment benefit and will once again require that those receiving unemployment benefits demonstrate they are actively looking for work.

In reality there is little support for the argument that expanded unemployment benefits have created an overly worker-friendly labor market, leaving companies unable to hire and, by extension, meet growing demand.  But of course, if enough people accept the argument, corporate leaders and their political allies will have achieved their shared goal, which is to weaken worker bargaining power as corporations seek to position themselves for a profitable post-pandemic economic recovery.

Wage trends

If companies were aggressively seeking workers, we would expect to see the resulting competition push up wages.  The following figure shows year-over-year real weekly earnings of production and nonsupervisory workers—approximately 85 percent of the workforce.  As we can see, those earnings were actually lower in April 2021 than they were in April 2020. 

In short, companies may want more workers, but it is hard to take their cries of anguish seriously if they remain unwilling to offer higher real wages to attract them.  Real average weekly earnings of production and nonsupervisory workers in April 2021 stood at $875.  Multiplying weekly earnings by 50 gives an estimated annual salary of $43,774.  That total is actually 5.7 percent below the similarly calculated peak in October 1972.
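To make the back-of-the-envelope arithmetic explicit, here is a quick sketch. Note that $875 × 50 is $43,750, so the $43,774 total presumably reflects an unrounded weekly figure; the $875.48 used below is an assumption chosen to reconcile the two numbers, not a figure from the underlying data.

```python
# Reproducing the article's annual salary estimate from weekly earnings.
# The unrounded weekly figure is an assumption (the article reports $875).
weekly_earnings = 875.48   # assumed unrounded weekly earnings
weeks_worked = 50          # the article's multiplier
annual_salary = weekly_earnings * weeks_worked
print(round(annual_salary))  # 43774
```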

Over the last three months, the only sector experiencing significant wage growth due to labor shortages is the leisure and hospitality sector (which includes arts, entertainment, and recreation as well as accommodations and food services).  Wages in that sector grew at an annualized rate of nearly 18 percent relative to the previous three months.  But, as Josh Bivens and Heidi Shierholz explain,

There is very little reason to worry that labor shortages in leisure and hospitality will soon spill over into other sectors and drive economywide “overheating.”  For example, jobs in leisure and hospitality have notably low wages and fewer hours compared to other sectors. Weekly wages of production and nonsupervisory workers in leisure and hospitality now equate to annual earnings of just $20,628, and total wages in leisure and hospitality account for just 4% of total private wages in the U.S. economy. . . . [Moreover] this sector seems notably segmented off from much of the rest of the economy.
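The annualized rate cited above can be reproduced by compounding a three-month change over four quarters. The 4.2 percent quarterly figure below is an assumption chosen to roughly match the article's "nearly 18 percent," not a number from the underlying wage data.

```python
# How a three-month (quarterly) wage change annualizes through compounding.
# The quarterly growth rate here is an illustrative assumption.
quarterly_growth = 0.042
annualized = (1 + quarterly_growth) ** 4 - 1   # compound over four quarters
print(f"{annualized:.1%}")  # 17.9%
```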

Job openings and labor turnover

The figure below, drawn from the Bureau of Labor Statistics’ Job Openings and Labor Turnover Summary (JOLTS), shows the monthly movement in job openings, hires, quits, and layoffs and discharges, with solid lines showing their six-month moving averages.

As we can see, despite business complaints, monthly hiring (green line) still remains greater than during the last years of the pre-pandemic expansion.  And although job openings (blue line) are growing sharply while the number of hires is falling, the gap between openings and hires is also still smaller than it was during the last years of the previous expansion.  In addition, the number of quits (light blue line), an indicator of labor market tightness, remains below its level in the last years of the previous expansion and is fairly stable.  In short, there is nothing in the data to suggest that business is facing a dysfunctional labor market marked by an unreasonable worker unwillingness to work.

Even with the additional financial support in Biden’s American Rescue Plan, many workers and their families continue to struggle to afford food, housing, and health care.  Many workers remain reluctant to re-enter the labor market because of Covid-related health concerns and care responsibilities.  Moreover, as Heidi Shierholz points out:

there are far more unemployed people than available jobs in the current labor market. In the latest data on job openings, there were nearly 40% more unemployed workers than job openings overall, and more than 80% more unemployed workers than job openings in the leisure and hospitality sector.

While there are certainly fewer people looking for jobs now than there would be if Covid weren’t a factor . . . without enough job openings to even come close to providing work for all job seekers, it again stretches the imagination to suggest that labor shortages are a core dynamic in the labor market.

We need to discredit this attempt by the business community and its political allies to generate opposition to policies that help workers survive this period of crisis and redouble our own efforts to strengthen worker rights and build popular support for truly transformative economic policies, ones that go beyond the stopgap fixes currently promoted.

Time to put the spotlight on corporate taxes

A battle is slowly brewing in Washington DC over whether to raise corporate taxes to help finance new infrastructure investments.  While higher corporate taxes cannot generate all the funds needed, the coming debate over whether to raise them gives us an opportunity to challenge the still strong popular identification of corporate profitability with the health of the economy and, by extension, worker wellbeing.

According to the media, President Biden’s advisers are hard at work on two major proposals with a combined $3 trillion price tag.  The first aims to modernize the country’s physical infrastructure and is said to include funds for the construction of roads, bridges, rail lines, ports, electric vehicle charging stations, and affordable and energy efficient housing as well as rural broadband, improvements to the electric grid, and worker training programs.  The second targets social infrastructure and would provide funds for free community college education, universal prekindergarten, and a national paid leave program. 

To pay for these proposals, Biden has been talking up the need to raise corporate taxes, at least to offset some of the costs of modernizing the country’s physical infrastructure.  Not surprisingly, Republican leaders in Congress have voiced their opposition to corporate tax increases.  And corporate leaders have drawn their own line in the sand.  As the New York Times reports:

Business groups have warned that corporate tax increases would scuttle their support for an infrastructure plan. “That’s the kind of thing that can just wreck the competitiveness in a country,” Aric Newhouse, the senior vice president for policy and government relations at the National Association of Manufacturers, said last month [February 2021].

Regardless of whether Biden decides to pursue his broad policy agenda, this appears to be a favorable moment for activists to take advantage of media coverage surrounding the proposals and their funding to contest these kinds of corporate claims and demonstrate the anti-working-class consequences of corporate profit-maximizing behavior.  

What do corporations have to complain about?

To hear corporate leaders talk, one would think that they have been subjected to decades of tax increases.  In fact, quite the opposite is true.  The figure below shows the movement in the top corporate tax rate.  As we can see, it peaked in the early 1950s and has been falling ever since, with a big drop in 1986, and another in 2017, thanks to Congressionally approved tax changes.

One consequence of this corporate-friendly tax policy is, as the following figure shows, a steady decline in federal corporate tax payments as a share of GDP.  These payments fell from 5.6 percent of GDP in 1953 to 1.5 percent in 1982, and a still lower 1.0 percent in 2020.  By contrast, there has been very little change in individual income tax payments as a share of GDP; they were 7.7 percent of GDP in 2020.

Congressional tax policy has certainly been good for the corporate bottom line.  As the next figure illustrates, both pre-tax and after-tax corporate profits have risen as a share of GDP since the early 1980s.  But the rise in after-tax profits has been the most dramatic, soaring from 5.2 percent of GDP in 1980 to 9.1 percent in 2019, before dipping slightly to 8.8 percent in 2020.   To put recent after-tax profit gains in perspective, the 2020 after-tax profit share is greater than the profit share in every year from 1930 to 2005.

What do corporations do with their profits?

Corporations claim that higher taxes would hurt U.S. competitiveness, implying that they need their profits to invest and keep the economy strong.  Yet, despite ever higher after-tax rates of profit, private investment in plant and equipment has been on the decline.

As the figure below shows, gross private domestic nonresidential fixed investment as a share of GDP has been trending down since the early 1980s.  It fell from 14.8 percent in 1981 to 13.4 percent in 2020.

Rather than investing in new plant and equipment, corporations have been using their profits to fund an aggressive program of stock repurchases and dividend payouts.  The figure below highlights the rise in corporate stock buybacks, which have helped drive up stock prices, enriching CEOs and other top wealth holders. In fact, between 2008 and 2017, companies spent some 53 percent of their profits on stock buybacks and another 30 percent on dividend payments.

It should therefore come as no surprise that CEO compensation is also exploding, with CEO-to-worker compensation growing from 21-to-1 in 1965, to 61-to-1 in 1989, 293-to-1 in 2018, and 320-to-1 in 2019.  As we see in the next figure, the growth in CEO compensation has actually been outpacing the rise in the S&P 500.

In sum, the system is not broken.  It continues to work as it is supposed to work, generating large profits for leading corporations that then find ways to generously reward their top managers and stockholders.  Unfortunately, investing in plant and equipment, creating decent jobs, or supporting public investment are all low on the corporate profit-maximizing agenda.  

Thus, if we are going to rebuild and revitalize our economy in ways that meaningfully serve the public interest, working people will have to actively promote policies that will enable them to gain control over the wealth their labor produces.  One example: new labor laws that strengthen the ability of workers to unionize and engage in collective and solidaristic actions.  Another is the expansion of publicly funded and provided social programs, including for health care, housing, education, energy, and transportation. 

And then there are corporate taxes.  Raising them is one of the easiest ways we have to claw back funds from the private sector to help finance some of the investment we need.  Perhaps more importantly, the fight over corporate tax increases provides us with an important opportunity to make the case that the public interest is not well served by reliance on corporate profitability.

Black Lives Matter protests are saving lives

The research is pretty clear that oppressive economic and social conditions are bad for one’s mental and physical health.  And there is also research showing that protesting is good for one’s mental and physical health.  As Dr. Bandy X. Lee, a psychiatrist at Yale University, explains:

Can protesting and other forms of activism help people break out of those negative thought cycles? Yes, because protesting alone is therapeutic. It is acting on hope and it is also, in the case of oppression, therapeutic.

Now we have a study that finds that protesting actually saves lives; more specifically, that Black Lives Matter protests reduce police killings.  As Travis Campbell, the author of the study, concludes, “census places with Black Lives Matter protests experience a 15 percent to 20 percent decrease in police homicides over [the period 2014-2019], around 300 fewer deaths. The gap in lethal use-of-force between places with and without protests widens over these subsequent years and is most prominent when protests are large or frequent.”

Black Lives Matter movement and protests

Campbell dates the rise of the Black Lives Matter movement to the outrage triggered by the 2013 acquittal of George Zimmerman for the killing of Trayvon Martin.  Alicia Garza, an Oakland, California-based activist, posted a Facebook message saying “black people. I love you. I love us. Our lives matter.” Patrisse Cullors, another Oakland activist, began sharing the message along with the Twitter hashtag #blacklivesmatter.  And, “with the help of activist Opal Tometi, Black Lives Matter (BLM) was born.”

Campbell dates the start of the Black Lives Matter protest movement to the explosion of protests in response to the 2014 police killings of Eric Garner in New York City and Michael Brown in Ferguson, Missouri. The protests continued, as did the police killings, among the most well-known victims being Tamir Rice, Walter Scott, Sandra Bland, Alton Sterling, Freddie Gray, Laquan McDonald, Philando Castile, Stephon Clark, Breonna Taylor, and George Floyd.

The study—killings and protests

Campbell sought to determine whether these BLM protests, motivated by police killings, actually helped to reduce them.  This is an important question, but not an easy one to answer.  His first challenge was to determine the actual number of police killings.

Unfortunately, there is no reliable federal database of police killings.  There are, however, a number of nonprofit and media organizations that do maintain public records.  The most important are The Homicide Report by the Los Angeles Times, Mapping Police Violence (MPV), the Washington Post’s database, The Counted by the Guardian, and Fatal Encounters Dot Org (FE).

Campbell uses the Fatal Encounters Dot Org database, which relies on the work of paid researchers, public records requests, and crowdsourcing for its information, for his study.  Its dataset is updated regularly and begins in 2000.  According to Campbell, the MPV database has the most complete information on victims, including their race and the circumstances surrounding their deaths, because it also makes use of social media sites, criminal records databases, and police reports.  However, Campbell doesn’t use it because its records only date back to 2013, which makes it impossible to determine pre-protest trends in police homicides in locations with BLM protests.

As for what constitutes a police homicide, FE uses a broad definition: all lethal interactions with police, whether on- or off-duty, including suicides.  The MPV database only includes cases where “a person dies as a result of being shot, beaten, restrained, intentionally hit by a police vehicle, pepper-sprayed, tasered, or otherwise harmed by police officer whether on-duty or off-duty.”  Campbell uses a more restrictive measure for his study, one that only includes police homicides due to asphyxiation, bludgeoning, a gunshot, pepper spray, or a taser that are not suicides. 

As we can see from Figures 1 (c) and (d) below, which correspond to Campbell’s more restrictive measure, both FE and MPV include roughly the same number of total deaths with a correspondingly high share caused by gunshot. This similarity encourages confidence in the reliability of the FE database.

The second challenge is to determine the number of BLM protests.  Campbell draws his data from a study published in 2018 that covers protests over the years 2014-2015 and a public database maintained by Alisa Robinson for the following years.  To maintain a focus on street demonstrations, Campbell does not include in his data “online demonstrations, protests by professional athletes, protests against presidential candidates, or protests against conservative talks at universities.”

The following figure shows the location of the killings and protests used in the study.

The methodology

Campbell’s study includes every census place with a population of at least 20,000 people.  Using a stacked difference-in-differences design, he tested whether Black Lives Matter protests had an effect on police killings in the locations where protests occurred.  In broad brush, the design uses the locations where no Black Lives Matter protests occurred to develop a baseline trend, adjusted for relevant economic and social determinants (highlighted below), of police killings.  Then, the adjusted baseline is applied to locations where Black Lives Matter protests occurred to determine whether the protests had an independent effect on the number of police killings.

In recognition of the great differences between census areas, the adjusted baseline trend is calculated taking a number of variables into account.  These include poverty rates, labor force participation rates, unemployment rates, full-time employment rates, and black poverty rates; educational attainment measures; rates and types of crime; and the number, remuneration, and training of police officers as well as officer demographics, unionization, use-of-force reporting, use of cameras, and community policing initiatives.
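The core difference-in-differences logic can be illustrated with a stripped-down, two-group, two-period sketch. Every number below is hypothetical; Campbell's actual stacked design involves many places, many periods, and adjustment for the covariates listed above.

```python
# Toy two-group, two-period difference-in-differences.
# All figures are hypothetical, not drawn from Campbell's study.
treated_pre, treated_post = 10.0, 8.0    # mean killings, places with protests
control_pre, control_post = 10.0, 9.5    # mean killings, places without protests

# The no-protest places supply the counterfactual trend; the DiD estimate
# is the change in protest places minus the change in non-protest places.
did_estimate = (treated_post - treated_pre) - (control_post - control_pre)
print(did_estimate)  # -1.5, i.e., protests associated with 1.5 fewer killings
```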

The results

The results are striking.  As Campbell explains:

Following BLM protests, lethal use-of-force fell by 16.8 percent on average.  If the model is correct, then BLM protests are responsible for approximately 300 fewer people being killed by the police from 2014 through 2019.  The payoff for protesting is substantial; every 5 of the 1,654 protests in the sample correspond with approximately one less person killed by the police over the following years. The police killed one less person for every four thousand participants.

When Campbell normalized the homicides by population and gave added weight to larger census areas under the assumption that news reports of police killings and protests were likely more accurate there, the decline in police homicides grew to 19.8 percent.

Campbell tested his conclusion by looking separately at the cities with the greatest number of protests.  The figure below shows census places with the most protests in descending order, many of which were home to high profile police killings.  As we can see, almost all experienced a statistically significant decline in police killings following protests.  The exceptions were St. Louis, Minneapolis, San Francisco, and Portland.

As to possible explanations for the decline in police killings, Campbell found that the demonstrations appeared to force changes in local policing.  For example, they increased police use of body-worn cameras, the number of police officers assigned to regular geographic patrols, and the adoption of a variety of community policing initiatives.

A Scientific American review of the study quotes Aldon Morris, the Leon Forrest Professor of Sociology and African American Studies at Northwestern University and president of the American Sociological Association:  

The question becomes, ‘Are Black Lives Matter protests having any real effect in terms of generating change?’ The data show very clearly that where you had Black Lives Matter protests, killing of people by the police decreased. It’s inescapable from this study that protest matters—that it can generate change.

Hopefully that recognition—that BLM protests are saving lives—will encourage ever greater support for, and participation in, the movement, thereby helping to achieve the transformational changes in policing needed to protect the rights of communities of color to live safely and well.

The failings of our unemployment insurance system are there by design

Our unemployment insurance system has failed the country at a moment of great need.  With tens of millions of workers struggling just to pay rent and buy food, Congress was forced to pass two emergency spending bills, providing one-time stimulus payments, special weekly unemployment insurance payments, and temporary unemployment benefits to those not covered by the system.  And, because of their limited short-term nature, President Biden must now advocate for a third.

The system’s shortcomings have been obvious for some time, but little effort has been made to improve it.  In fact, those shortcomings were baked into the system from the beginning, by design and at President Roosevelt’s insistence, not by accident.  While we must continue to organize to ensure working people are able to survive the pandemic, we must also begin the long process of building popular support for a radical transformation of our unemployment insurance system.  The history of struggle that produced our current system offers some useful lessons.


Our unemployment insurance system was designed during the Great Depression.  It was supposed to shield workers and their families from the punishing costs of unemployment, thereby also helping to promote both political and economic stability.  Unfortunately, as Eduardo Porter and Karl Russell reveal in a New York Times article, that system has largely failed working people.

The chart below shows the downward trend in the share of unemployed workers receiving benefits and the replacement value of those benefits.  Benefits now replace less than one-third of prior wages, some eight percentage points below the level in the 1940s.  Benefits aside, it is hard to celebrate a system that covers fewer than 30 percent of those struggling with unemployment.

A faulty system

Although every state has an unemployment insurance system, they all operate independently.  There is no national system.  Each state separately generates the funds it needs to provide unemployment benefits and is largely free, subject to some basic federal standards, to set the conditions under which an unemployed worker becomes eligible to receive benefits, the waiting period before benefits will be paid, the length of time benefits will be paid, the benefit amount, and requirements to continue receiving benefits.

Payroll taxes paid by firms generate the funds used to pay unemployment insurance benefits.  The size of the tax a firm pays depends on the amount of employee earnings made taxable (the base wage) and the tax rate.  States are free to set the base wage as they want, subject to a federally mandated floor of $7,000, unchanged since the early 1980s.  States are also free to set the tax rate as they want.  Not surprisingly, in the interest of supporting business profitability, states have generally sought to keep both the base wage and the tax rate low.  For example, Florida, Tennessee, and Arizona continue to set their base wage at the federal minimum.  And, as the figure below shows, insurance tax rates have been trending down for some time.
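The mechanics can be sketched in a few lines. The $7,000 base is the federal floor mentioned above; the 2.7 percent rate is purely hypothetical, chosen only to illustrate why a low base wage makes the tax nearly flat across firms:

```python
# Sketch of a state unemployment insurance tax on one employee's wages.
# Only earnings up to the state's taxable wage base are taxed, so once an
# employee earns more than the base, the tax owed stops growing.

def ui_tax(annual_wages: float, base_wage: float, tax_rate: float) -> float:
    """Tax owed on one employee: the rate applied to wages up to the base."""
    taxable = min(annual_wages, base_wage)
    return taxable * tax_rate

# A state at the federal floor ($7,000 base) with a hypothetical 2.7% rate:
print(round(ui_tax(50_000, 7_000, 0.027), 2))  # same tax for any wage >= $7,000
print(round(ui_tax(5_000, 7_000, 0.027), 2))   # below the base, all wages taxed
```

Because the tax is capped at the base wage, a $50,000 salary and a $7,000 salary generate nearly the same revenue, which is one reason trust funds run dry when claims spike.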

While such a policy might help business, lowering the tax rate means that states have less money in their trust funds to pay unemployment benefits.  Thus, when times are hard and unemployment claims rise, many states find themselves hard-pressed to meet their obligations.  In fact, as Porter and Russell explain:

Washington has been repeatedly called on to provide additional relief, including emergency patches to unemployment insurance after the Great Recession hit in 2008. Indeed, it has intervened in response to every recession since the 1950s.

This is far from a desirable outcome for those states forced to borrow, since the money has to be paid back with interest by imposing higher future payroll taxes on employers.  Thus, growing numbers of states have sought to minimize the likelihood of this happening, or at least the amount to be borrowed, by raising eligibility standards, reducing benefits, and shortening time of coverage, all of which they hope will reduce the number of people drawing unemployment benefits as well as the amount and length of time they will receive them.

Porter and Russell highlight some of the consequences of this strategy:

In Arizona, nearly 70 percent of unemployment insurance applications are denied. Only 15 percent of the unemployed get anything from the state. Many don’t even apply. Tennessee rejects nearly six in 10 applications.

In Florida, only one in 10 unemployed workers gets any benefits. The state is notably stingy: no more than $275 a week, roughly a third of the maximum benefit in Washington State. And benefits run out quickly, after as little as 12 weeks, depending on the state’s overall unemployment rate.

And, the growing stagnation of the US economy, which has led to more precarity of employment, only makes this strategy ever more fiscally “intelligent.”  For example, as the following figure shows, a growing percentage of the unemployed are remaining jobless for a longer time.  Such a trend, absent state actions to restrict access to benefits, would mean financial trouble for state officials.

Adding to the system’s structural shortcomings is the fact that growing numbers of workers, for example the many workers who have been reclassified as independent contractors, are not covered by it.  In addition, since eligibility for benefits requires satisfying a minimum earnings and hours-of-work requirement over a base year, the growth in irregular low-wage work means that many of those most in need of the system’s financial support during periods of unemployment find themselves declared ineligible for benefits.

By design, not by mistake

Our current unemployment insurance system, with its patchwork set of state standards and benefits, dates back to the Great Depression. While President Roosevelt gets credit for establishing our unemployment insurance system as part of the New Deal, the fact is he deliberately sidelined a far stronger program that, had it been approved, would have put working people today in a far more secure position. 

The Communist Party (CP) began pushing an unemployment and social insurance bill in the summer of 1930 and, along with the numerous Unemployed Councils that existed in cities throughout the country, worked hard to promote it over the following years.  On March 4, 1933, the day of Roosevelt’s inauguration, they organized demonstrations stressing the need for action on unemployment insurance.

Undeterred by Roosevelt’s lack of action, the CP-authored “Workers Unemployment and Social Insurance Bill” was introduced in Congress in February 1934 by Representative Ernest Lundeen of the Farmer-Labor Party.  In broad brush, the bill mandated the payment of unemployment insurance to all unemployed workers and farmers equal to average local full-time wages, with a guaranteed minimum of $10 per week plus $3 for each dependent. Those forced into part-time employment would receive the difference between their earnings and the average local full-time wage.  The bill also created a social insurance program that would provide payments to the sick and elderly, and maternity benefits to be paid eight weeks before and eight weeks after birth.  All these benefits were to be financed by unappropriated funds in the Treasury and taxes on inheritances, gifts, and individual and corporate incomes above $5,000 a year.
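The bill’s benefit formula, as described above, can be sketched directly. This is a simplified reading of the provisions (a guaranteed floor of $10 per week plus $3 per dependent, and a top-up for part-time workers); the wage figures in the examples are hypothetical:

```python
# Sketch of the weekly benefit under the 1934 Lundeen bill as described above.
# Fully unemployed workers receive the average local full-time wage, with a
# guaranteed floor of $10/week plus $3 per dependent.  Workers forced into
# part-time work receive the gap between their earnings and that average wage.

def lundeen_benefit(local_avg_wage: float, dependents: int,
                    part_time_earnings: float = 0.0) -> float:
    if part_time_earnings > 0:
        # Part-time top-up: difference between earnings and the average wage.
        return max(local_avg_wage - part_time_earnings, 0.0)
    # Fully unemployed: average wage, but never below the guaranteed minimum.
    return max(local_avg_wage, 10.0 + 3.0 * dependents)

print(lundeen_benefit(18.0, 2))        # 18.0 -- average wage exceeds the floor
print(lundeen_benefit(8.0, 2))         # 16.0 -- floor: $10 + 2 x $3
print(lundeen_benefit(18.0, 0, 7.0))   # 11.0 -- part-time top-up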

The bill enjoyed strong support among workers—employed and unemployed—and it was soon endorsed by five international unions, 35 central labor bodies, and more than 3,000 local unions.  Rank-and-file worker committees also formed across the country to pressure members of Congress to pass it.

When Congress refused to act on the bill, Lundeen reintroduced it in January 1935. Because of public pressure, the bill became the first social insurance plan to be recommended by a congressional committee, in this case the House Labor Committee.  However, it was soon voted down in the full House of Representatives, 204 to 52.

Roosevelt strongly opposed the Lundeen bill, and it was to provide a counter that he pushed to create an alternative, one that offered benefits far short of what the Workers Unemployment and Social Insurance Bill promised and that was itself strongly opposed by many workers and all organizations of the unemployed.  In July 1934, Roosevelt appointed a Committee on Economic Security, charged with developing a social security bill, covering both unemployment insurance and old-age security, that he could present to Congress in January 1935.  An administration-approved bill was introduced right on schedule in January, and Roosevelt called for quick congressional action. 

Roosevelt’s bill was revised in April by a House committee and given a new name, “The Social Security Act.”  After additional revisions, it was signed into law on August 14, 1935.  The Social Security Act was a complex piece of legislation.  It included what we now call Social Security, a federal old-age benefit program; a program of unemployment insurance administered by the states; and a program of federal grants to states to fund benefits for the needy elderly and aid to dependent children. 

The unemployment system established by the Social Security Act was structured in ways unfavorable to workers (as was the federal old-age benefit program).  Rather than a progressively funded, comprehensive national system of unemployment insurance that paid benefits commensurate with worker wages, the act established a federal-state cooperative system that gave states wide latitude in determining standards.

More specifically, the act levied a uniform national payroll tax of 1 percent in 1936, 2 percent in 1937, and 3 percent in 1938 on covered employers, defined as those with eight or more employees for at least twenty weeks, excluding government employers and employers in agriculture.  Only workers employed by a covered employer could receive benefits.

The act left it to the states to decide whether to enact their own plans, and if so, to determine eligibility conditions, the waiting period to receive benefits, benefit amounts, minimum and maximum benefit levels, duration of benefits, disqualifications, and other administrative matters. It was not until 1937 that programs were established in every state as well as the then-territories of Alaska and Hawaii.  And it was not until 1938 that most began paying benefits.

In the early years, most states required eligible workers to wait 2 to 4 weeks before drawing benefits, which were commonly set at half recent earnings (subject to weekly maximums) for a period ranging from 12 to 16 weeks. Ten state laws called for employee contributions as well as employer contributions; three still do today.
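A typical early state formula, as described above, paid half of recent earnings subject to a weekly maximum. A minimal sketch, with a hypothetical $15 state cap standing in for the varying state maximums:

```python
# Sketch of a typical early state benefit formula: half of recent weekly
# earnings, capped at the state's weekly maximum (the $15 cap is hypothetical).

def weekly_benefit(recent_weekly_earnings: float, state_max: float) -> float:
    return min(recent_weekly_earnings * 0.5, state_max)

print(weekly_benefit(30.0, 15.0))  # 15.0 -- capped at the state maximum
print(weekly_benefit(24.0, 15.0))  # 12.0 -- half of recent earnings
```

The cap is why replacement rates fall for better-paid workers: above twice the maximum, every additional dollar of lost wages goes entirely unreplaced.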

Over the following years, the unemployment insurance system was improved in a number of ways, including by broadening coverage and boosting benefits.  However, its basic structure remains largely intact: overly complex, with a patchwork of state eligibility requirements and miserly benefits.  And we are paying the cost today.

This history makes clear that nothing will be given to us.  We need and deserve a better unemployment insurance system. And to get it, we are going to have to fight for it, and not be distracted by the temporary, although needed, band-aids Congress is willing to provide.  The principles shaping the Workers Unemployment and Social Insurance Bill can provide a useful starting point for current efforts.

The U.S. recovery on pause, December brings new job losses

A meaningful working-class recovery from the recession seems far away.

After seven months of job gains, though diminishing gains to be sure, we are again losing jobs.  As the chart below shows, the number of jobs fell by 140,000 in December.

We are currently about 9.8 million jobs down from the February 2020 employment peak, having recovered only 55 percent of the jobs lost.  And, as the following chart illustrates, the percentage of jobs lost remains greater, even now after months of job growth, than it was at any point during the Great Recession. 

If the job recovery continues at its current pace, some analysts predict that it will likely take more than three years just to get back to pre-pandemic employment levels.  However, this might well be too rosy a projection.  One reason is that the early assumption that many of the job losses were temporary, and that those unemployed would soon be recalled to employment, is turning out to be wrong.  A rapidly growing share of the unemployed are remaining unemployed for an extended period. 

As we see below, in October, almost one-third of the unemployed had been unemployed for 27 weeks or longer.  According to the December jobs report, that percentage is now up to 37 percent, four times what it was before the pandemic.  And that figure seriously understates the problem, since many workers have given up looking for work; having dropped out of the workforce, they are no longer counted as unemployed.  The labor force participation rate is now 61.5 percent, down from 63.3 percent in February.

Dean Baker, quoted in a recent Marketplace story, underscores the importance of this development:

“This is obviously a story of people losing their job at the beginning of the crisis in March and April and not getting it back,” said Dean Baker, co-founder and senior economist with the Center for Economic and Policy Research.

Those out of work for 27 weeks or more make up a growing share of the unemployed, and that could have enduring consequences, Baker said.

“After people have been unemployed for more than six months, they find it much harder to get a job,” he said. “And if they do get a job, their labor market prospects could be permanently worsened.”

And tragically, the workers who have suffered the greatest job losses during this crisis are those who earned the lowest wages. 

It is no wonder that growing numbers of working people are finding it difficult to meet their basic needs.

There is no way to sugarcoat this situation.  We need a significant stimulus package, a meaningful increase in the minimum wage, real labor law reform, a robust national single-payer health care system, and an aggressive Green New Deal-inspired public sector investment and jobs program.  And there is no getting around the fact that it is going to take hard organizing and mutually supportive community and workplace actions to move the country in the direction it needs to go.