Learning from history: community-run child-care centers during World War II

We face many big challenges.  And we will need strong, bold policies to meaningfully address them.  Solving our child-care crisis is one of those challenges, and a study of World War II government efforts to ensure accessible and affordable high-quality child care points the way to the kind of bold action we need. 

The child care crisis

A number of studies have established that high-quality early childhood programs provide significant community and individual benefits.  One found that “per dollar invested, early childhood programs increase present value of state per capita earnings by $5 to $9.”  Universal preschool programs have also been shown to offer significant benefits to all children, even producing better outcomes for the most disadvantaged children than means-tested programs.  Yet, even before the pandemic, most families struggled with a lack of desirable child-care options.    

The pandemic has now created a child-care crisis. As Lisa Dodson and Mary King point out: “By some estimates, as many as 4.5 million child-care ‘slots’ may be permanently lost and as many as 40 percent of child-care providers say they will never reopen.”  The lack of child care is greatly hindering our recovery from the pandemic.  Women suffered far greater job losses than men during 2020, including as child-care workers, and the child-care crisis has made it difficult for many working mothers to return to the labor force.  The cost goes beyond the immediate family hardship from lost income; there is strong evidence that a sustained period without work, the so-called employment gap, will result in significantly lower lifetime earnings and reduced retirement benefits.  

To his credit, President Biden has recognized the importance of strengthening our care economy.  His proposed American Families Plan includes some $225 billion in tax credits to help make child care more affordable for working families.  According to a White House fact sheet, families would “receive a tax credit for as much as half of their spending on qualified child care for children under age 13, up to a total of $4,000 for one child or $8,000 for two or more children. . . . The credit can be used for expenses ranging from full-time care to after school care to summer care.”

But tax credits don’t ensure the existence of convenient, affordable, high-quality child-care facilities staffed by well-paid and trained child-care providers.  And if that is what we really want, we will need to directly provide it.  That is what the government did during World War II.  While its program was far from perfect, in part because it was designed to be short-term, it provides an example of the type of strong, bold action we will need to overcome our current child-care crisis. 

Federal support for child care

During World War II the United States government financed a heavily-subsidized child-care program.  From August 1943 through February 1946, the Federal Works Agency (FWA), using Lanham Act funds, provided some $52 million in grants for child-care services (equal to more than $1 billion today) to any approved community group that could demonstrate a war-related need for the service.  At its July 1944 peak, 3,102 federally subsidized child-care centers, with some 130,000 children enrolled, operated throughout the country.  There was at least one center in every state but New Mexico, which decided against participation in the program.  By the end of the war, between 550,000 and 600,000 children received some care from Lanham Act funded child-care programs.  

Communities were allowed to use the federal grant money to cover most of the costs involved in establishing and running their centers, including facilities construction and upkeep, staff wages, and most other daily operating costs. They were required to provide some matching funds, most of which came from fees paid by the parents of children enrolled in the program. However, these fees were capped: in the fall of 1943, the FWA established a ceiling of 50 cents per child per day (about $7 now), which was raised to 75 cents in July 1945. And those fees included snacks, lunch, and in some cases dinner as well. Overall, the federal subsidy covered two-thirds of the total cost of maintaining and operating the centers.

The only eligibility requirement for enrollment was a mother’s employment status: she had to be working at a job considered important to the war effort, and this was not limited to military production. Center hours varied, but many accommodated the round-the-clock manufacturing schedule, staying open 24 hours a day, 6 days a week. 

The centers served preschoolers (infants, toddlers, and children up to 5 years of age) and school-age children (6 to 14 years of age). In July 1944, approximately 53,000 preschoolers and 77,000 school-age children were enrolled.  School-age enrollment always climbed during summer vacation.  However, in most months, preschoolers made up the majority of the children served by Lanham Act-funded centers. Enrollment of preschoolers peaked at some 74,000 in May 1945. 

Some 90 percent of the centers were housed in public schools, with newly constructed housing projects providing the next most used location. Although local school boards were free to decide program standards–including staff-child ratios, worker qualifications, and facility design–state boards of education were responsible for program supervision. The recommended teacher-child ratio was 10-to-1, and most centers complied.  According to Chris M. Herbst,

Anecdotal evidence suggests that preschool-aged children engaged in indoor and outdoor play; used educational materials such as paints, clay, and musical instruments; and took regular naps. . . . Programs for school-aged children included . . . outdoor activities, participation in music and drama clubs, library reading, and assistance with schoolwork. 

Children at a child-care center sit for “story time.” (Gordon Parks / Library of Congress / The Crowley Company)

While quality did vary–largely the result of differences in community support for public child care, the willingness of cities to provide additional financial support, and the ability of centers to hire trained professionals to develop and oversee program activities–the centers did their best to deliver a high-quality early childhood education.  As Ruth Pearson Koshuk, the author of a 1947 study of the developmental records of 500 children, 2 to 5 years of age, at two Los Angeles County centers, describes:

In these two . . . schools, as elsewhere, the program has developed since 1943, toward recognized standards of early childhood education. The aim has been to apply the best of existing standards, and to maintain as close contact with the home as possible. In-service training courses carrying college credit have been given, for the teaching staff, and a mutually helpful parent education program carried on in spite of difficulties inherent in a child care situation.

There has been a corresponding development in the basic records. A pre-entrance medical examination has been required by state law since the first center opened. In December 1943 a developmental record was added, which is filled out by the director during an unhurried interview with the mother just before a child enters. One page is devoted to infancy experience; the four following cover briefly the child’s development history, with emphasis on emotional experience, behavior problems he has presented to the parents, if any, and the control methods used, as well as the personal-social behavior traits which they value and desire for the child. After entrance, observational notes and semester reports are compiled by the teachers. Intelligence testing has been limited to cases where it seemed especially indicated. A closing record is filled out, in most cases, by the parent when a child is withdrawn. These records are considered a minimum. They have proved indispensable as aids to the teachers in guiding the individual children and as a basis for conferences on behavior in the home.

A 2013 study of the long-term effects on mothers and children from use of Lanham centers found a substantial increase in maternal employment, even five years after the end of the program, and “strong and persistent positive effects on well-being” for their children.

In short, despite many shortcomings, these Lanham centers, as Thalia Ertman sums up,

broke ground as the first and, to date, only time in American history when parents could send their children to federally-subsidized child care, regardless of income, and do so affordably. . . .

Additionally, these centers are seen as historically important because they sought to address the needs of both children and mothers. Rather than simply functioning as holding pens for children while their mothers were at work, the Lanham child care centers were found to have a strong and persistent positive effect on the well-being of children.

The federal government also supported some private employer-sponsored child care during the war. The most well-known example is the two massive centers built by the Kaiser Company in Portland, Oregon, to provide child care for the children of workers at its Portland Yards and Oregon Shipbuilding Corporation. The centers were located right at the front of the shipyards, making it easy for mothers to drop their children off and pick them up, and were operated on a 24-hour schedule.  They were also large, each caring for up to 1,125 children between 18 months and 6 years of age. The centers had their own medical clinic, cafeteria, and large play areas, and employed highly trained staff.  Parents paid $5 for a six-day week for one child and $3.75 for each additional child.  For a small additional fee, the centers also prepared a dinner for parents to pick up at the end of their working day.

While the Kaiser Company received much national praise as well as appreciation from its employees with young children, these centers were largely paid for by the government.  Government funds directly paid for their construction, and a majority of the costs of running the centers, including staff salaries, were covered through the company's cost-plus contracting with the military.

Political dynamics

There was considerable opposition to federal financing of group child care, especially for children younger than 6 years of age.  The sentiment is captured in this quote from a 1943 New York Times article: “The worst mother is better than the best institution when it is a matter of child care, Mayor La Guardia declared.”  Even the War Manpower Commission initially opposed mothers with young children working outside the home, even in service of the war effort, stating that “The first responsibility of women with young children, in war as in peace, is to give suitable care in their own homes to their children.”

But on-the-ground realities made this an untenable position for both the government and business. Women sought jobs, whether out of economic necessity or patriotism.  The government, as its Rosie the Riveter campaign made clear, was eager to encourage their employment in industries producing for the war effort.  And, despite public sentiment, a significant number of those women were mothers with young children. 

Luedell Mitchell and Lavada Cherry working at a Douglas Aircraft plant in El Segundo, Calif. Circa 1944. Credit: National Archives photo no. 535811

The growing importance of women in the workplace, and especially mothers with young children, is captured in employment trends in Portland, Oregon.  Women began moving into the defense workforce in great numbers starting in 1942, with the number employed in local war industries climbing from 7,000 in November 1942 to 40,000 in June 1943.  An official with the state child-care committee reported that “a check of six shipyards reveals that the number of women employed in the shipyards has increased 25 percent in one month and that the number is going to increase more rapidly in the future.” 

The number of employed mothers was also rapidly growing.  According to the Council of Social Agencies, “Despite the recommendations of the War Manpower Commission . . . thousands of young mothers in their twenties and thirties have accepted jobs in war industries and other businesses in Multnomah County. Of the 8,000 women employed at the Oregon Shipyards in January, 1943, 32 percent of them had children, 16 percent having pre-school children.”

Portland was far from unique.  During the war, for the first time, married women workers outnumbered single women workers.  Increasingly, employers began to recognize the need for child care to address absenteeism problems.  As a “women’s counselor” at the Bendix Aviation Corporation in New Jersey explained to reporters in 1943, child care was one of the biggest concerns for new hires. “We feel a mother should be with her small baby if possible. But many of them have to come back. Their husbands are in the service and they can’t get along on his allotment.”  Media stories, many unsubstantiated, of children left in parked cars outside workplaces or fending for themselves at home also contributed to a greater public acceptance of group child care. 

An image of Rosie the Riveter that appeared in a 1943 issue of the magazine Hygeia

Finally, the government took action.  The Federal Works Agency was one of two new super agencies established in 1939 to oversee the large number of agencies created during the New Deal period.  In 1940 President Roosevelt signed into law the Lanham Act, which authorized the FWA to fund and supervise the construction of needed public infrastructure, such as housing, hospitals, water and sewer systems, police and firefighting facilities, and recreation centers, in communities experiencing rapid growth because of the defense buildup. In August 1942, the FWA decided, without any public debate, that public infrastructure also meant child care, and it began its program of support for the construction and operation of group child-care facilities.

The Federal Security Agency, the other super agency, whose oversight responsibilities included the Children’s Bureau and the U.S. Office of Education, opposed the FWA’s new child-care initiative.  It did so not only because it believed that child care fell under its mandate, but also because the leadership of the Children’s Bureau and Office of Education opposed group child care.  The FWA won the political battle, and in July 1943, Congress authorized additional funding for the FWA’s child-care efforts. 

And, as William M. Tuttle, Jr. describes, public pressure played an important part in the victory:

the proponents of group child care organized a potent lobbying effort. The women’s auxiliaries of certain industrial unions, such as the United Electrical Workers and the United Auto Workers, joined with community leaders and FWA officials in the effort. Also influential were the six women members of the House of Representatives. In February 1944, Representative Mary T. Norton presented to the House “a joint appeal” for immediate funds to expand the wartime child day care program under the FWA.

Termination and a step back

Congressional support for group child care was always tied to wartime needs, a position shared by most FWA officials.  The May 1945 Allied victory in Europe brought a drop in war production, and a reduction in FWA community child care approvals and renewals.  In August, after the Japanese surrender brought the war to a close, the FWA announced that it would end its funding of child-care centers as soon as possible, but no later than the end of October 1945.

Almost immediately thousands of individuals wrote letters, sent wires, and signed petitions calling for the continuation of the program.  Officials in California, the location of many war-related manufacturing sites and nearly 25 percent of all children enrolled in Lanham Act centers in August 1945, also weighed in, strongly supporting the call.  Congress yielded, largely influenced by the argument that since it would be months before all the “men” in the military returned to the country, mothers had no choice but to continue working and needed the support of the centers to do so.  It approved new funds, but only enough to keep the centers operating until the end of February 1946.

The great majority of centers closed soon after the termination of federal support, with demonstrations following many of the closings.  The common assumption was that women would not mind the closures, since most would be happy to return to homemaking.  Many women were, in fact, forced out of the labor force, disproportionately suffering from post-war industrial layoffs.  But by 1947, women’s labor force participation was again on the rise, and a new push began for a renewal of federal support for community child-care centers. Unfortunately, the government refused to change its position. During the Korean War, Congress did approve a public child-care bill, but then refused to authorize any funding.

After WWII, parents organized demonstrations, like this one in New York on Sept. 21, 1947, calling for continued funding of the centers. The city’s welfare commissioner dismissed the protests as “hysterical.” Credit: The New York Times

Finally, in 1954, as Sonya Michel explains, “Congress found an approach to child care it could live with: the child-care tax deduction.”  While the child-care tax deduction did offer some financial relief to some families, it did nothing to ensure the availability of affordable, high-quality child care.  The history of child care during World War II makes clear that this turn to market-based tax policy to solve child-care problems represented a big step back for working women and their children.  And this was well understood by most working people at the time. 

Sadly, this history has been forgotten, and Biden’s commitment to expand the child-care tax credit is now seen as an important step forward.  History shows we can and need to do better.

Time to put the spotlight on corporate taxes

A battle is slowly brewing in Washington DC over whether to raise corporate taxes to help finance new infrastructure investments.  While higher corporate taxes cannot generate all the funds needed, the coming debate over whether to raise them gives us an opportunity to challenge the still strong popular identification of corporate profitability with the health of the economy and, by extension, worker wellbeing.

According to the media, President Biden’s advisers are hard at work on two major proposals with a combined $3 trillion price tag.  The first aims to modernize the country’s physical infrastructure and is said to include funds for the construction of roads, bridges, rail lines, ports, electric vehicle charging stations, and affordable and energy efficient housing as well as rural broadband, improvements to the electric grid, and worker training programs.  The second targets social infrastructure and would provide funds for free community college education, universal prekindergarten, and a national paid leave program. 

To pay for these proposals, Biden has been talking up the need to raise corporate taxes, at least to offset some of the costs of modernizing the country’s physical infrastructure.  Not surprisingly, Republican leaders in Congress have voiced their opposition to corporate tax increases.  And corporate leaders have drawn their own line in the sand.  As the New York Times reports:

Business groups have warned that corporate tax increases would scuttle their support for an infrastructure plan. “That’s the kind of thing that can just wreck the competitiveness in a country,” Aric Newhouse, the senior vice president for policy and government relations at the National Association of Manufacturers, said last month [February 2021].

Regardless of whether Biden decides to pursue his broad policy agenda, this appears to be a favorable moment for activists to take advantage of media coverage surrounding the proposals and their funding to contest these kinds of corporate claims and demonstrate the anti-working-class consequences of corporate profit-maximizing behavior.  

What do corporations have to complain about?

To hear corporate leaders talk, one would think that they have been subjected to decades of tax increases.  In fact, quite the opposite is true.  The figure below shows the movement in the top corporate tax rate.  As we can see, it peaked in the early 1950s and has been falling ever since, with a big drop in 1986, and another in 2017, thanks to Congressionally approved tax changes.

One consequence of this corporate friendly tax policy is, as the following figure shows, a steady decline in federal corporate tax payments as a share of GDP.  These payments fell from 5.6 percent of GDP in 1953 to 1.5 percent in 1982, and a still lower 1.0 percent in 2020.  By contrast there has been very little change in individual income tax payments as a share of GDP; they were 7.7 percent of GDP in 2020.

Congressional tax policy has certainly been good for the corporate bottom line.  As the next figure illustrates, both pre-tax and after-tax corporate profits have risen as a share of GDP since the early 1980s.  But the rise in after-tax profits has been the most dramatic, soaring from 5.2 percent of GDP in 1980 to 9.1 percent in 2019, before dipping slightly to 8.8 percent in 2020.   To put recent after-tax profit gains in perspective, the 2020 after-tax profit share is greater than the profit share in every year from 1930 to 2005.

What do corporations do with their profits?

Corporations claim that higher taxes would hurt U.S. competitiveness, implying that they need their profits to invest and keep the economy strong.  Yet, despite ever higher after-tax rates of profit, private investment in plant and equipment has been on the decline.

As the figure below shows, gross private domestic nonresidential fixed investment as a share of GDP has been trending down since the early 1980s.  It fell from 14.8 percent in 1981 to 13.4 percent in 2020.

Rather than investing in new plant and equipment, corporations have been using their profits to fund an aggressive program of stock repurchases and dividend payouts.  The figure below highlights the rise in corporate stock buybacks, which have helped drive up stock prices, enriching CEOs and other top wealth holders. In fact, between 2008 and 2017, companies spent some 53 percent of their profits on stock buybacks and another 30 percent on dividend payments.

It should therefore come as no surprise that CEO compensation is also exploding, with the CEO-to-worker compensation ratio growing from 21-to-1 in 1965, to 61-to-1 in 1989, 293-to-1 in 2018, and 320-to-1 in 2019.  As we see in the next figure, the growth in CEO compensation has actually been outpacing the rise in the S&P 500.

In sum, the system is not broken.  It continues to work as it is supposed to work, generating large profits for leading corporations that then find ways to generously reward their top managers and stockholders.  Unfortunately, investing in plant and equipment, creating decent jobs, or supporting public investment are all low on the corporate profit-maximizing agenda.  

Thus, if we are going to rebuild and revitalize our economy in ways that meaningfully serve the public interest, working people will have to actively promote policies that will enable them to gain control over the wealth their labor produces.  One example: new labor laws that strengthen the ability of workers to unionize and engage in collective and solidaristic actions.  Another is the expansion of publicly funded and provided social programs, including for health care, housing, education, energy, and transportation. 

And then there are corporate taxes.  Raising them is one of the easiest ways we have to claw back funds from the private sector to help finance some of the investment we need.  Perhaps more importantly, the fight over corporate tax increases provides us with an important opportunity to make the case that the public interest is not well served by reliance on corporate profitability.

COVID-19 Economic Crisis Snapshot

 Workers in the United States are in the midst of a punishing COVID-19 economic crisis.  Unfortunately, while a new fiscal spending package and an effective vaccine can bring needed relief, a meaningful sustained economic recovery will require significant structural changes in the operation and orientation of the economy.

The unemployment problem

Many people blame government mandated closure orders for the decline in economic activity and spike in unemployment.  But the evidence points to widespread concerns about the virus as the driving force.  As Emily Badger and Alicia Parlapiano describe in a New York Times article, and as illustrated in the following graphic taken from the article:

In the weeks before states around the country issued lockdown orders this spring, Americans were already hunkering down. They were spending less, traveling less, dining out less. Small businesses were already cutting employment. Some were even closing shop.

People were behaving this way — effectively winding down the economy — before the government told them to. And that pattern, apparent in a range of data looking back over the past two months, suggests in the weeks ahead that official pronouncements will have limited power to open the economy back up.

As the graphic shows, economic activity nosedived around the same time regardless of whether state governments were quick to mandate closings, slow to mandate closings, or unwilling to issue stay-at-home orders.

The resulting sharp decline in economic activity caused unemployment to soar. Almost 21 million jobs were lost in April at the peak of the crisis.  The unemployment rate hit a high of 14.7 percent.  By comparison the highest unemployment rate during the Great Recession was 10.6 percent in January 2010.

Employment recovered the next month, with an increase of 2.8 million jobs in May.  In June, payrolls grew by an even greater number, 4.8 million.  But things have dramatically slowed since.  In July, only 1.8 million jobs came back.  In August it was 1.5 million.  And in September it was only 661,000.  To this point, only half of the jobs lost have returned, and current trends are far from encouraging.

The unemployment rate fell to 7.9 percent in September, a significant decline from April.  But a large reason for that decline is that millions of workers have given up working or looking for work and are no longer counted as being part of the labor force.  And, as Alisha Haridasani Gupta writes in the New York Times:

A majority of those dropping out were women. Of the 1.1 million people ages 20 and over who left the work force (neither working nor looking for work) between August and September, over 800,000 were women, according to an analysis by the National Women’s Law Center. That figure includes 324,000 Latinas and 58,000 Black women. For comparison, 216,000 men left the job market in the same time period.

The relationship between the fall in the unemployment rate and the worker exodus from the labor market is illustrated in the next figure, which shows both the unemployment rate and the labor force participation rate (LFPR). The LFPR is measured by dividing the number of people 16 and over who are employed or seeking employment by the size of the civilian noninstitutional population that is 16 and over.

The figure allows us to see that even the relatively “low” September unemployment rate of 7.9 percent is still high by historical standards.  It also allows us to see that its recent decline was aided by a decline in the LFPR to a level not seen since the mid-1970s.  If those who left the labor market were to decide to once again seek employment, pushing the LFPR back up, unless the economic environment changed dramatically, the unemployment rate would also be pushed up to a much higher level.
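To make the mechanics concrete, here is a minimal back-of-the-envelope illustration. The numbers below are hypothetical round figures chosen only to roughly mirror the September 2020 situation, not actual BLS data. Because returning job-seekers are added to both the labor force and the count of the unemployed while employment stays fixed, the measured unemployment rate rises along with the LFPR:

\[
\text{LFPR} = \frac{\text{labor force}}{\text{civilian noninstitutional population, 16+}}, \qquad u = \frac{\text{unemployed}}{\text{labor force}}
\]
\[
\text{Before: } E = 57,\ U = 4.9,\ \text{POP} = 100 \;\Rightarrow\; \text{LFPR} = 61.9\%, \quad u = \tfrac{4.9}{61.9} \approx 7.9\%
\]
\[
\text{After two re-entrants who have not yet found work: } E = 57,\ U = 6.9 \;\Rightarrow\; \text{LFPR} = 63.9\%, \quad u = \tfrac{6.9}{63.9} \approx 10.8\%
\]

In this stylized example, a roughly two-point rise in the LFPR with no new hiring pushes the unemployment rate from 7.9 percent to nearly 11 percent, which is why a recovering LFPR can push the measured unemployment rate back up even without any change in underlying economic conditions.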

Beyond the aggregate figures is the fact, as Heather Long, Andrew Van Dam, Alyssa Fowers and Leslie Shapiro explain in a Washington Post article, that “No other recession in modern history has so pummeled society’s most vulnerable.”

As we can see in the above graphic, the 1990 recession was a relatively egalitarian affair with all income groups suffering roughly a similar decline in employment.  That changed during the recessions of 2001 and 2008, with the lowest earning cohort suffering the most.  But, as the authors of the Washington Post article state, “even that inequality is a blip compared with what the coronavirus inflicted on low-wage workers this year.”  By the end of the summer, the employment crisis was largely over for the highest earners, while employment was still down more than 20 percent for low-wage workers and around 10 percent for middle-wage workers.

Poverty is on the rise

In line with this disproportionate hit suffered by low wage workers, the poverty rate has been climbing.  Five Columbia University researchers, using a monthly version of the Supplemental Poverty Measure (SPM), provide estimates of the monthly poverty rate from October 2019 through September 2020.  They found, as illustrated below, “that the monthly poverty rate increased from 15% to 16.7% from February to September 2020, even after taking the CARES Act’s income transfers into account. Increases in monthly poverty rates have been particularly acute for Black and Hispanic individuals, as well as for children.”

The standard poverty measure used by the federal government is an annual one, based on whether a family’s total annual income falls below a specified income level.  It doesn’t allow for monthly calculations and is widely criticized for using an extremely low emergency food budget to set its poverty level.  The SPM, by contrast, uses a more complete and accurate measure of family resources, a more expansive definition of the family, and a broader basket of necessities, and it adjusts for differences in the cost of living across metro areas.

As we can see in the above figure, the $2.2 trillion Coronavirus Aid, Relief, and Economic Security (CARES) Act, which was passed by Congress and signed into law on March 27th, 2020, has had a positive effect on poverty levels.  For example, without it, the poverty rate would have jumped to 19.4 percent in April. “Put differently, the CARE Act’s income transfers directly lifted around 18 million individuals out of poverty in April.”

However, as we can also see, the positive effects of the CARES Act have gradually dissipated.  The Economic Impact Payments (“Recovery Rebates”) were one-time payments.  The $600 per week unemployment supplement expired at the end of July.  Thus, the gap between the monthly SPM with and without the CARES Act has gradually narrowed.  And, with job creation dramatically slowing, without a new federal stimulus measure it is likely we will not see much improvement in the poverty rate in the coming months.  In fact, if working people continue to leave the labor market out of discouragement and the pressure of home responsibilities, there is a good chance the poverty rate will climb again.

It is also important to note that the rise in monthly rates of poverty, even with the CARES Act, differs greatly by race/ethnicity as illustrated in the following figure.

The need to do more

Republican opposition to a new stimulus ensures that there will be no follow-up to the CARES Act before the upcoming election.  Opponents claim that the federal government has already done enough and that the economy is well on its way to recovery. 

As for the size of the stimulus, the United States has been a laggard when it comes to its fiscal response to the pandemic.  The OECD recently published an interim report titled “Coronavirus: Living with uncertainty.”  One section of the report looks at fiscal support as a percent of 2019 GDP for nine countries. As the following figure shows, the United States trails every country but Korea when it comes to direct support for workers, firms, and health care.  

A big change is needed

While it is natural to view COVID-19 as responsible for our current crisis, the truth is that our economic problems are more long-term.  The U.S. economy has been steadily weakening for years.  In the figure below, the “trend” line is based on the 2.1% average rate of growth in real per capita GDP from 1970 to 2007, the year before the Great Recession.  Not surprisingly, real per capita GDP took a big hit during the Great Recession.  But as we can also see, real per capita GDP has yet to return to its historical trend. In fact, the gap has grown larger despite the record-long recovery that followed. 

As Doug Henwood explains:

Since 2009, the growth rate has averaged 1.6%. Last year [2019], which Trump touted as the greatest economy ever, it managed to get back to the pre-2008 average of 2.1%, an average that includes two deep recessions (1973–1975 and 1981–1982).

At the end of 2019, actual [real GDP per capita] was 13% below trend. At the end of the 2008–2009 recession it was 9% below trend. Remarkably, despite a decade-long expansion, it fell further below trend in well over half the quarters since the Great Recession ended. The gap is now equal to $10,200 per person—a permanent loss of income, as economists say. 

The pre-coronavirus period of expansion (June 2009 to February 2020), although the longest on record, was actually also one of the weakest. It was marked by slow growth, weak job creation, deteriorating job quality, declining investment, rising debt, declining life expectancy, and narrowing corporate profit margins. In other words, the economy was heading toward recession even before the start of state mandated lockdowns.  The manufacturing sector actually spent much of 2019 in recession.   

Thus, there is strong reason to believe that a meaningful sustained recovery from the current COVID-19 economic crisis is going to require more than the development of an effective vaccine and a responsive health care system to ensure its wide distribution.  Also needed is significant structural change in the operation and orientation of the economy.

Defunding police and challenging militarism, a necessary response to their “battle space”

The excessive use of force and killings of unarmed Black Americans by police has fueled a popular movement for slashing police budgets, reimagining policing, and directing freed funds to community-based programs that provide medical and mental health care, housing, and employment support to those in need.  This is a long overdue development.

Police are not the answer

Police budgets rose steadily from the 1990s to the Great Recession and, despite the economic stagnation that followed, have remained largely unchanged.  This trend is highlighted in the figure below, which shows real median per capita spending on police in the 150 largest U.S. cities.  That spending grew, adjusted for inflation, from $359 in 2007 to $374 in 2017.  The contrast with state and local government spending on social programs is dramatic.  From 2007 to 2017, median per capita spending on housing and community development fell from $217 to $173, while spending on public welfare programs fell from $70 to $47.

Thus, as economic developments over the last three decades left working people confronting weak job growth, growing inequality, stagnant wages, declining real wealth, and rising rates of mortality, funding priorities meant that the resulting social consequences would increasingly be treated as policing problems.  And, in line with other powerful trends that shaped this period–especially globalization, privatization, and militarization–police departments were encouraged to meet their new responsibilities by transforming themselves into small, heavily equipped armies whose purpose was to wage war against those they were supposed to protect and serve. 

The military-to-police pipeline

The massive, unchecked militarization of the country and its associated military-to-police pipeline was one of the more powerful factors promoting this transformation.  The Pentagon, overflowing with military hardware and eager to justify a further modernization of its weaponry, initiated a program in the early 1990s that allowed it to provide surplus military equipment free to law enforcement agencies, allegedly to support their “war on drugs.”  As a Forbes article explains:

Since the early 1990s, more than $7 billion worth of excess U.S. military equipment has been transferred from the Department of Defense to federal, state and local law enforcement agencies, free of charge, as part of its so-called 1033 program. As of June [2020], there are some 8,200 law enforcement agencies from 49 states and four U.S. territories participating. 

The program grew dramatically after September 11, 2001, justified by government claims that the police needed to strengthen their ability to combat domestic terrorism.  As an example of the resulting excesses, the Los Angeles Times reported in 2014 that the Los Angeles Unified School District and its police officers were in possession of three grenade launchers, 61 automatic military rifles and a Mine Resistant Ambush Protected armored vehicle. Finally, in 2015, President Obama took steps to place limits on the items that could be transferred; tracked armored vehicles, grenade launchers, and bayonets were among the items that were to be returned to the military.

President Trump removed those limits in 2017, and the supplies are again flowing freely, including armored vehicles, riot gear, explosives, battering rams, and yes, once again bayonets.  According to the New York Times, “Trump administration officials said that the police believed bayonets were handy, for instance, in cutting seatbelts in an emergency.”

Outfitting police departments for war also encouraged different criteria for recruiting and training. For example, as Forbes notes, “The average police department spends 168 hours training new recruits on firearms, self-defense, and use of force tactics. It spends just nine hours on conflict management and mediation.”  Arming and training police for military action leads naturally to the militarization of police relations with community members, especially Black, Indigenous, and other people of color, who come to play the role of the enemy that needs to be controlled or, if conditions warrant, destroyed.

In fact, the military has become a major cheerleader for domestic military action.  President Trump, on a call with governors after the start of demonstrations protesting the May 25, 2020 killing of George Floyd while in police custody, exhorted them to “dominate” the street protests.

As the Washington Examiner reports:

“You’ve got a big National Guard out there that’s ready to come and fight like hell,” Trump told governors on the Monday call, which was leaked to the press.

[Secretary of Defense] Esper lamented that only two states called up more than 1,000 Guard members of the 23 states that have called up the Guard in response to street protests. The National Guard said Monday that 17,015 Guard members have been activated for civil unrest.

“I agree, we need to dominate the battle space,” Esper said after Trump’s initial remarks. “We have deep resources in the Guard. I stand ready, the chairman stands ready, the head of the National Guard stands ready to fully support you in terms of helping mobilize the Guard and doing what they need to do.”

The militarization of the federal budget

The same squeeze of social spending and support for militarization is being played out at the federal level.  As the National Priorities Project highlights in the following figure, the United States has a military budget greater than the next ten countries combined.

Yet, this dominance has done little to slow the military’s growing hold over federal discretionary spending.  At $730 billion, military spending accounts for more than 53 percent of the federal discretionary budget.  A slightly broader notion, what the National Priorities Project calls the militarized budget, actually accounts for almost two-thirds of the discretionary budget.  The militarized budget:

includes discretionary spending on the traditional military budget, as well as veterans’ affairs, homeland security, and law enforcement and incarceration. In 2019, the militarized budget totaled $887.8 billion – amounting to 64.5 percent of discretionary spending. . . . This count does not include forms of militarized spending allocated outside the discretionary budget, including mandatory spending related to veterans’ benefits, intelligence agencies, and interest on militarized spending.

The militarized budget has been larger than the non-militarized budget every year since 1976.  But the gap between the two has grown dramatically over the last two decades. 

In sum, the critical ongoing struggle to slash police budgets and reimagine policing needs to be joined to a larger movement against militarism more generally if we are to make meaningful improvements in majority living and working conditions.

The economy: we are still in big trouble

The announcement by the Bureau of Labor Statistics that the official unemployment rate declined to 13.3 percent in May, from 14.7 percent in April, took most analysts by surprise.  The economy added 2.5 million jobs in May, the first increase in employment since February.  Most economists had predicted further job losses and a rise in the unemployment rate to as high as 20 percent.

This employment gain has encouraged some analysts, especially those close to the Trump administration, to proclaim that their predicted V-shaped economic recovery had begun.  But there are strong reasons to believe that the US economy is far from recovery.

Long term trends and the coronavirus

Predictions for a V-shaped recovery rest to a considerable degree on the belief that our current economic collapse was caused by state-mandated business closures to battle the coronavirus, which unsurprisingly choked off our long expansion.  Now that a growing number of states are ending their forced lockdowns, it is only natural, on this view, that the economy would resume growing.  Certainly, the stock market’s recent rise suggests that many investors agree. 

However, there are many reasons to challenge this upbeat story of impending recovery.  One of the most important is that the pre-coronavirus period of expansion (June 2009 to February 2020), although the longest on record, was also one of the weakest. It was marked by slow growth, weak job creation, deteriorating job quality, declining investment, rising debt, declining life expectancy, and narrowing corporate profit margins. In other words, the economy was heading toward recession even before the start of state mandated lockdowns.  For example, the manufacturing sector spent much of 2019 in recession.   

Another reason is that the downturn in economic activity that marks the start of our current recession predates lockdown orders.  It was driven by health concerns.  As Emily Badger and Alicia Parlapiano explain in their New York Times article, and as illustrated in the following graphic taken from the article:

In the weeks before states around the country issued lockdown orders this spring, Americans were already hunkering down. They were spending less, traveling less, dining out less. Small businesses were already cutting employment. Some were even closing shop.

People were behaving this way — effectively winding down the economy — before the government told them to. And that pattern, apparent in a range of data looking back over the past two months, suggests in the weeks ahead that official pronouncements will have limited power to open the economy back up.

In some states that have already begun that process, like Georgia, South Carolina, Oklahoma and Alaska, the same daily economic data shows only meager signs so far that businesses, workers and consumers have returned to their old routines.

Thus, while some rebound in economic growth is to be expected given the severity of the downturn to this point, it is unlikely that the May employment jump signals the start of a powerful economic recovery.  Weak underlying economic conditions and health worries remain significant obstacles.

In fact, even the optimistic US Congressional Budget Office predicts at best a long, slow recovery.  As Michael Roberts describes:

It now expects US nominal GDP to fall 14.2% in the first half of 2020, from the trend it forecast in January before the COVID-19 pandemic broke. Then it expects the various fiscal and monetary injections by the authorities and the end of the lockdowns to reduce this loss from the January figure to 9.4% by end 2020. The CBO still expects a sort of V-shaped recovery in US GDP in 2021 but does not expect the pre-pandemic crisis trend in US economic growth (already reduced in the Long Depression since 2009) to be reached until 2029 and may not even return to the previous trend growth forecast until after 2030! So there will be a permanent loss of 5.3% in nominal GDP compared to pre-COVID forecasts – $16trn in value lost forever. In real GDP terms, the loss will be about 3% cumulatively, or $8trn in 2019 money.  And this assumes no second wave in the pandemic and no financial collapse as companies go bust.

Depression level unemployment

Although President Trump has celebrated the May employment gains, the fact is we continue to suffer depression level unemployment.  The following figure from the Washington Post provides some historical perspective.  The current official unemployment rate of 13.3 percent is more than a third higher than the highest level of unemployment reached during the Great Recession. 

But even the Bureau of Labor Statistics acknowledges that, because of the unique nature of the current crisis, the officially announced unemployment rate for each of the last three months is flawed.  The unemployment rate is based on household surveys.  For the past three months, in an attempt to better understand the impact of the coronavirus, interviewers were supposed to classify people not working because of the virus as “unemployed on temporary layoff.” However, as the Bureau of Labor Statistics acknowledges, many of those people were incorrectly classified as “employed but absent from work,” which is the classification used when a person isn’t coming to work because of vacation, illness, bad weather, a labor dispute, or other reasons.  People in this latter category are not counted as unemployed.

The BLS has determined that correcting the classification error would boost the official April unemployment rate to 19.7 percent and the May rate to 16.3 percent.  And, it is important to note that this unemployment rate does not include those workers who have stopped looking for work and those who are involuntarily working part-time.  Including them would push the May rate close to 25 percent.  
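A rough sketch of the arithmetic, using illustrative round numbers rather than the BLS’s exact counts, shows why the correction is so large: the misclassified workers are already in the labor force (counted as employed), so reclassifying them adds to the numerator of the unemployment rate without changing the denominator.

\[
u = \frac{U}{\text{labor force}}, \qquad \text{official May estimate (approx.): } \frac{21\ \text{million}}{158\ \text{million}} \approx 13.3\%
\]
\[
\text{reclassifying roughly 5 million ``employed but absent'' workers: } \frac{21 + 5}{158} \approx 16.5\%
\]

That back-of-the-envelope result lands close to the BLS’s corrected May figure of 16.3 percent, and the same adjustment, applied to April’s larger pool of misclassified workers, is what lifts the official 14.7 percent April rate to the corrected 19.7 percent.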

Stephen Moore, an economic adviser to President Trump, has stated that the May job numbers take “a lot of the wind out of the sails of any phase 4 [stimulus bill] — we don’t need it now. There’s no reason to have a major spending bill. The sense of urgent crisis is very greatly dissipated by the report.”  This is crazy.

Danger signs ahead

There are three reasons to fear that without substantial new federal action May employment gains will be short-lived. 

First, it has been relatively low-wage production and nonsupervisory workers who have suffered the greatest number of job losses.  That has left many businesses relatively top-heavy with managers and high-income professionals. A number of business analysts are now predicting a new wave of layoffs or firings of higher-income and management personnel to bring staffing levels back into pre-coronavirus balance.

The following figure shows that almost 90 percent of the jobs lost from mid-February to mid-April were in the six lowest paid supersectors as defined by the Bureau of Labor Statistics. The May employment gains were also in these six sectors.

Economists with Bloomberg Economics are now warning of a second wave of job losses that will include “higher-paid supervisors in sectors where frontline workers were hit first, such as restaurants and hotels. It also includes the knock-on effects to connected industries such as professional services, finance and real estate.”

As Bloomberg explains:

The pandemic isn’t finished with the U.S. labor market, threatening a second wave of job cuts—this time among white-collar workers. . . .

For the analysis, [Bloomberg Economics economists] looked at job losses by sector in March and April—with affected industries dominated by blue-collar, hospitality and production workers—and determined how those layoffs would move to supervisory positions, since management cuts tend to lag the frontline workers.

The economists then took government data on relations between industries to compute the ones most reliant on demand from the most-affected sectors. Combining that information with the hit to employment in the most affected sectors they extrapolated to other jobs at risk, most of which were higher-skilled, white-collar roles.

The second reason to downplay the significance of the May employment gains is that critically important stimulus measures–in particular the one-time grant of $1,200 for individuals and the $600-a-week additional unemployment benefit (which expires at the end of July)–appear unlikely to be renewed.  If that boost to earnings is withdrawn, economic demand and employment will likely fall again.

As Ben Casselman, writing in the New York Times, points out:

Research routinely finds that unemployment insurance is one of the most effective parts of the safety net, both in cushioning the effects of job loss on families and in lifting the economy. In economists’ parlance, the program is “well targeted” — it goes to people who need the money and who will spend it. Various studies have found that in the last recession, the system helped prevent 1.4 million foreclosures, saved two million jobs and kept five million people out of poverty.

The impact could be greater in this crisis because the program is reaching more people and giving them more money. The government paid $48 billion in benefits in April and has reached $86 billion in May, according to the Treasury Department.

As the following figure shows, almost all workers have suffered significant declines in employment income with low income workers taking the biggest hit.

Yet, the increase in food insecurity has been relatively small, especially for low income workers.

It is, as highlighted in the next figure, the massive individual benefit boosts included in the March stimulus package that have so far kept the decline in employment income from translating into dramatic spikes in food insecurity. If Congress refuses to pass a new stimulus that includes direct aid to the unemployed, the odds are great that the economic recovery will stall and unemployment will grow again.

The last reason for pessimism is the likely further contraction in state and local government spending and, by extension, employment and services, as a result of declining revenue.  State and local government employment fell by 1 million from February to April, and by an additional 600,000 in May.  Looking just at state budgets, the Center on Budget and Policy Priorities estimates a shortfall in state budgets of $765 billion over fiscal years 2020-22, “much deeper than in the Great Recession of about a decade ago” (see the figure below).

And unfortunately, as the Center on Budget and Policy Priorities also notes, the federal government has, up to now, been unwilling to do much to help state governments manage their ballooning deficits:

Federal aid that policymakers provided in earlier COVID-19 packages isn’t nearly enough. Only about $65 billion is readily available to narrow state budget shortfalls. Treasury Department guidance now says that states may use some of the aid in the CARES Act of March to cover payroll costs for public safety and public health workers, but it’s unclear how much of state shortfalls that might cover; existing aid likely won’t cover much more than $100 billion of state shortfalls, leaving nearly $665 billion unaddressed. States hold $75 billion in their rainy-day funds, a historically high amount but far too little to meet the unprecedented challenge they face. And, even if states use all of it to cover their shortfalls, that still leaves them about $600 billion short.

States must balance their budgets every year, even in recessions. Without substantial federal help during this crisis, they very likely will deeply cut areas such as education and health care, lay off teachers and other workers in large numbers, and cancel contracts with many businesses. . . . That would worsen the recession, delay the recovery, and further harm families and communities.

Without a new stimulus measure that also includes support for state and local governments, their forced reduction in spending and cuts in employment can only add to the existing pressures working against recovery.

In sum, the crisis is real.  A new stimulus that included a renewal of special unemployment payments as well as direct support for state and local governments and other critical services like the postal service could help stabilize the economy.  But real progress will require a major effort on the part of the federal government to ensure adequate production of COVID-19 test kits and PPE as well as nationwide testing and contact tracing programs and then, most importantly, a fundamental reorganization of our economy.

What the New Deal can teach us about winning a Green New Deal: Part IV—Keeping the pressure on the state

Advocates for a Green New Deal, pointing to ever-worsening and interrelated environmental, economic, and social problems, seek adoption of a complex and multifaceted state-directed program of economic transformation.  Many point to the original New Deal–highlighting the federal government’s acceptance of responsibility for fighting the depression and introduction of new initiatives to stabilize markets, expand relief, create jobs producing public goods and services, and establish a system of social security–to make it easier for people to envision and support another transformative state effort to solve a major societal crisis.

While the New Deal experience might well inspire people to believe in the possibility of a Green New Deal, the way that experience is commonly presented may encourage Green New Deal supporters to miss its most important lessons and thus weaken our chances of advancing a meaningful and responsive Green New Deal.  Often the New Deal is described as a set of interconnected government policies implemented over a short period of time by a progressive government determined to end an economic crisis.  This emphasis on government policy encourages current activists to focus on developing policies appropriate to our contemporary crisis and on electing progressive leaders to implement them.

In reality, as discussed in Part I, Part II, and Part III of this series, despite the enormous negative social consequences of the Great Depression, it took sustained organizing, led by a movement of the unemployed, to transform the national political environment and force the federal government to accept responsibility for improving economic conditions.  Even then, the policies of the Roosevelt administration’s First New Deal were most concerned with stabilizing business conditions under terms more favorable to business than workers.  Its relief and job creation policies were minimal and far from what the movement demanded or was needed to meet majority needs.

As I argue in this post, it took continued mass organizing to force the Roosevelt administration to implement, two years later, its Second New Deal, which included its now widely praised programs for public works, social security, and union rights.  However, as important and unprecedented as these programs were, they again, as we will see in the next post, fell short of what working people were demanding at the time.

Thus, the most important take-away from this history is that winning a meaningful Green New Deal will require more than well-constructed policy demands and the election of a progressive president.  It will require building a left-led mass movement that prepares people for a long struggle to overcome expected state and corporate resistance to the needed transformative changes.  And a careful study of the New Deal experience can alert us to the many challenges and strategic choices we are likely to confront in our movement-building efforts, as well as the many policy twists and turns we are likely to face as the federal government and corporate sector respond to our demands for a Green New Deal.

The failings of the First New Deal

The First New Deal, as important and innovative as it was, offered no meaningful solution to the crisis faced by working people.  The economy had hit bottom in early 1933 and was beginning to recover.  But although national income grew by one-quarter between 1933 and 1934, it was still only a little more than half of what it had been in 1929.  Some ten million workers remained without jobs and almost twenty million people remained at least partially dependent on relief.

As discussed in Part III, Roosevelt’s First New Deal relief and job programs were, by design, inadequate to address the ongoing social crisis.  For example, the Federal Emergency Relief Administration (FERA) did provide the first direct federal financing of state relief.  But because the program required matching state funds, many states either refused to apply for FERA grants or kept their requests small. Moreover, because state governments were determined to minimize their own financial obligations and not undermine private business activity, those who did receive relief were subject to demeaning investigations into their personal finances, and relief payments were kept small and often limited to coupons exchangeable only for food items on an approved list.

The Civil Works Administration (CWA), created under FERA’s umbrella, was a far more attractive program.  Most importantly, participation was not limited to those on relief. And the program offered meaningful, federally organized, work for pay.  However, with millions of workers seeking to participate in the program, Roosevelt, determined to keep the federal budget deficit small, refused to fund it beyond six months.

Many workers were also critical of one of the First New Deal’s most important efforts to promote economic recovery: the National Industrial Recovery Act (NIRA).  The NIRA suspended anti-trust laws and encouraged companies to engage in self-regulation through industry organized wage and price controls, and the establishment of production quotas and restrictions on market entry.  Workers saw this act as rewarding the same business leaders that were responsible for the Great Depression.

To appease trade union leaders, Section 7a of the NIRA included the statement that “employees shall have the right to organize and bargain collectively through representatives of their own choosing . . . free from the interference, restraint, or coercion of employers.”  Unfortunately, no mechanism was included to ensure that workers would be able to exercise this right, and after a short period of successful union organizing, companies began violently repressing genuine union activity.

Another key First New Deal initiative, the Agricultural Adjustment Act, was also unpopular.  It sought to strengthen the agricultural sector by paying farmers to take land out of production, thus lowering the supply of agricultural goods and boosting their price. However, most of the land that was taken out of production had been used by poor African American sharecroppers and tenant farmers.  Thus, the policy ended up rewarding the bigger farmers and punishing the poorest.

In short, the programs of the First New Deal, coming almost four years after the start of the Great Depression, fell far short of what workers needed and wanted.  And, they did little to slow the on-going mass organizing.

The unemployed movement continues

As described in Part II, the Communist Party (CP) was fast off the mark in organizing the unemployed.  As early as August 1929, two months before the stock market crash, it had begun work on the creation of a nationwide organization of Unemployed Councils (UCs).  The UCs grew fast, uniting and mobilizing the unemployed, who engaged in locally organized fights for relief and against evictions in many parts of the country.  The CP and the UCs also organized several national mobilizations in support of federal unemployment insurance and emergency relief assistance as well as a 7-hour workday and an end to discrimination against African American and foreign-born workers.

The Socialist Party (SP) and the Muste-led Conference of Progressive Labor Action (CPLA) each had their own organizations concerned with the unemployed.  But they were few in number and initially not engaged in the kind of direct organizing of the unemployed and direct action practiced by the UCs.  SP organizations concentrated on educating the population about the causes of unemployment and the need for national action to combat it.  While CPLA organizations did include the unemployed, they were mostly focused on promoting self-help activities for survival.  However, beginning in 1933, both the SP and CPLA began to change their approach, and their respective organizations began to operate much like the CP’s Unemployed Councils.

The SP-sponsored Chicago Workers Committee on Unemployment began the turn towards direct organizing of the unemployed and a commitment to direct action.  By 1933 it had 67 locals in Chicago as well as some in other nearby cities.  Committees in other states, primarily in the Midwest, soon followed Chicago’s example.  And in November 1933, these more activist committees came together to found a new, Midwest-centered organization, the Unemployed Workers League of America.

The SP’s New York organizations of unemployed were also growing in number.  Several came together in 1933 to form the Workers Unemployed League, which later merged with other organizations in the state to become the Workers Unemployed Union.  This group eventually merged with groups in other East Coast states to form the Eastern Federation of the Unemployed and Emergency Workers.  Socialist Party-led unemployed organizations held multi-state demonstrations in their areas of strength in March 1933 and November 1934 to demand new and more expansive programs of federal relief and job creation.

The Musteites began their own turn to more militant unemployed organizing in early 1933.  By July 1933 their Unemployed Leagues (ULs) claimed 100,000 members in Ohio, 40,000 in Pennsylvania, and 10,000 more in West Virginia, New Jersey, and North Carolina.  That same month, their ULs formed a national organization to coordinate their work, the National Unemployed League.

The CPLA dissolved itself in December 1933, as activists established a new, more radical organization, the American Workers Party (AWP).  Reflecting this change, delegates to the National Unemployed League’s second national convention in 1934 formally rejected the organization’s past reliance on self-help activities and private relief and declared their opposition to capitalism.

The ULs, like the UCs, engaged in mass sit-ins at relief offices to overturn negative decisions by relief officials.  One sit-in in Pittsburgh lasted 59 days.  They also organized mass resistance to court-ordered evictions, blocking sheriffs when possible or returning furniture to an evictee’s home if it had been removed.  ULs in several cities also engaged in direct appropriation of food from government warehouses, in line with their slogan, “Give Us Relief, Or We’ll Take It.”  The AWP, like its predecessor, had a strong presence in the Midwest, but was never able to extend its influence or build networks of ULs outside that region.

Not only did the programs of the First New Deal fail to slow unemployed organizing, but the unemployed movement also increasingly took on a unified national character as unemployed activists from the three different political tendencies gradually began working together, often against the mandates of their leaders. The extent and militancy of unemployed activism made it difficult for governments–local, state, and national–to rest easy.  For one thing, it highlighted a growing radicalization of the population, as more and more people demonstrated their willingness to openly challenge the legitimacy of the police, the court system, and state institutions.

Relief worker organizing

The First New Deal greatly expanded the number of people on relief, and the CP quickly began organizing relief workers in 1934, followed shortly by the SP and CPLA.  The CP-sponsored Relief Workers Leagues (RWLs) targeted those receiving FERA relief funds or employed by the CWA. In addition to organizing grievance committees to fight discrimination, especially against African American, single, and foreign-born workers, the RWLs fought for timely payment of relief wages, higher pay for relief work with cost-of-living adjustments, free transportation to work sites and free medical care, and a moratorium on electric and gas charges for those on relief.

They also sent delegations to Washington D.C. to protest wage discrimination or low wages and organized in support of the CP’s call for a national Unemployment Insurance Bill.  Local RWL members also joined with the unemployed in marches on state capitals and on picket lines outside welfare offices to demand more employment opportunities and more money for relief.  League members were especially aggressive in protesting against the termination of the CWA.

Nels Anderson, director of Labor Relations for the Works Progress Administration (WPA), provides a good sense of the work of RWL members:

They parade; they protest; they make demands; they write millions of letters to officials. . . . They are irreconcilable . . . they never stop asking.  They state their demands in every conceivable way.  They crowd through the doors of every relief station and every WPA office.  They surround social workers on the street.

Although not as large or as developed as the unemployed movement, the relief worker organizations and their activities were not easy to ignore and made it difficult for the Roosevelt administration to tout the success of its First New Deal initiatives.

Organizing for a national Unemployment Insurance Bill

The CP also continued to organize for a national Unemployment Insurance Bill.  The CP and the UCs had declared National Unemployment Insurance Day on February 4, 1932, with activities in many cities.  It was also the major demand of the second national Hunger March in late 1932.  On March 4, 1933, the day of Roosevelt’s inauguration, they organized demonstrations stressing the need for action on unemployment insurance.

Undeterred by Roosevelt’s lack of action, the CP authored a bill that was introduced in Congress in February 1934 by Representative Ernest Lundeen of the Farmer-Labor Party.  Not surprisingly, the Workers Unemployment and Social Insurance Bill was strongly supported by the UCs as well as by SP and Musteite organizations of the unemployed.  And as a result of the efforts of activists from these and other organizations, it was soon formally endorsed by 5 international unions, 35 central labor bodies, and more than 3,000 local unions.  Rank-and-file worker committees also formed across the country to pressure members of Congress to pass it.

In broad brush, the bill proposed social insurance for all the jobless, the sick, and the elderly without discrimination, at the expense of the wealthy.  More specifically, as Chris Wright summarizes, the bill:

provided for unemployment insurance for workers and farmers (regardless of age, sex, or race) that was to be equal to average local wages but no less than $10 per week plus $3 for each dependent; people compelled to work part-time (because of inability to find full-time jobs) were to receive the difference between their earnings and the average local full-time wages; commissions directly elected by members of workers’ and farmers’ organizations were to administer the system; social insurance would be given to the sick and elderly, and maternity benefits would be paid eight weeks before and eight weeks after birth; and the system would be financed by unappropriated funds in the Treasury and by taxes on inheritances, gifts, and individual and corporate incomes above $5,000 a year. Later iterations of the bill went into greater detail on how the system would be financed and managed.
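To make the proposed benefit floor concrete (an illustrative calculation, not a figure taken from the bill itself): under the bill’s minimum, an unemployed worker with two dependents would have been guaranteed at least $10 + (2 × $3) = $16 per week, with the benefit rising to the average local wage wherever that was higher.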

When Congress refused to act on the bill, Lundeen reintroduced it in January 1935. Because of public pressure, the bill became the first unemployment insurance plan in US history to be recommended by a congressional committee, in this case the House Labor Committee.  It was voted down in the full House of Representatives, 204 to 52.

Roosevelt strongly opposed the Workers Unemployment and Social Insurance Bill, and so moved quickly to pressure Congress to write a social security bill he could support.  He created the President’s Committee on Economic Security in July 1934, which established the principles that formed the basis of the Social Security Act that was eventually signed into law as part of the Second New Deal.

Trade union organizing

As the economy continued to recover, and the unemployed were increasingly able to find jobs or gain relief, left groups began shifting their attention towards organizing the employed.  As one UL organizer who later became an organizer for the CIO explained, the goal was not a permanent organization of the unemployed. “We wanted the day to come when unemployed organizations would be done away with and there would only be organizations of employed workers.”

A number of unions, hoping to build on worker anger over employment conditions and the NIRA’s Section 7a, which many workers were encouraged to believe meant that the President supported unionization, launched lightning-fast organizing drives.  And with good success.  The United Mine Workers was one.  For example, it took the union only one day after the NIRA became law to sign up some 80 percent of Ohio miners.  And it was able to press its advantage, aided by a series of wildcat strikes, to win gains for its members. The Amalgamated Clothing Workers and the International Ladies’ Garment Workers’ Union also grew quickly, with each winning significant employer concessions following a series of short strikes.

The following year saw an explosion of trade union organizing, including three major successful union struggles.  The first was in Toledo, Ohio, which at the time was a major center for automobile parts manufacturing.  Organizing began in the summer of 1933 at several parts plants.  In February 1934 some 4,000 workers went out on strike.  It appeared that the strike would be settled quickly, but one of the largest companies, Electric Auto-Lite, decided to oppose any deal.  The other companies quickly followed Electric Auto-Lite’s lead and the strike resumed.  With Electric Auto-Lite hiring scabs and maintaining production, it appeared the strike was lost.  Then, in May, the local UL, the unemployed organization of the American Workers’ Party, intervened.

It organized a mass picket line around Electric Auto-Lite, even though the courts had issued an injunction against third party picketing.  The local sheriff and special deputies arrested several picketers, beating one badly. In response, the UL and the union organized a bigger blockade of some 10,000 workers, trapping the strikebreakers inside the factory.

The “Battle of Toledo” was on.  In an effort to break the blockade, the sheriff and deputies used tear gas, water hoses, and guns.  The workers responded by stoning the plant and burning cars that were in the company parking lot.  The National Guard was called out and in the fighting that followed two picketers were killed.  Unable to break the strike, the plant was forced to close.  After two weeks of Federal mediation, the company and the union reached an agreement: the company recognized the union, boosted its minimum wage, and hiked average wages by 5 percent.

At almost the same time as the struggle began in Toledo, another major union battle started in Minneapolis.  In February 1934, the Trotskyist-led Teamster Local 574 organized a short successful strike, winning contracts with most of the city’s coal delivery companies.  The victory brought in many new members, both truckers and those who worked in warehouses.  In May, when employers refused to bargain with the union, some 5,000 workers walked off their jobs. The union, well prepared for the strike, effectively shut down commercial transport in the city, allowing only approved farmers to deliver food directly to grocers.

The Citizen’s Alliance, composed of the city’s leading business people, tried to break the strike.  Police and special deputies trapped and beat several of the strikers.  The union responded with its own ambush.  The fighting continued over two days. A number of deputies and strikers were badly hurt, some from beatings and some from gunshots; two strikers died.  But the strike held. The National Guard was called in an attempt to restore order, and while they brought a halt to the fighting, their presence didn’t end the strike.

Other unions, especially in the building trades, began striking in solidarity with the Teamsters, and the threat of a general strike was growing.  After several weeks, with federal authorities applying pressure, the employers finally settled, signing a contract with the union.

A general strike did take place in San Francisco.  Passage of the NIRA had, much as in the coal industry, spurred a massive increase in union membership in West Coast locals of the International Longshoremen’s Association (ILA).  Led by left-wing activists, these locals began, in March 1934, organizing for a coastwide strike to win a shorter workweek, higher pay, union recognition, and a union-run hiring hall.  The threatened strike was soon called off by the top East Coast-based leadership of the ILA, following a request from Roosevelt.  They then secretly negotiated a new agreement with the employers that met none of the workers’ demands.

The San Francisco longshoremen rejected the deal and struck on May 9.  They were quickly joined by dockworkers in every other West Coast port as well as many sailors and waterfront truckers.  All told, some 40,000 maritime workers stopped working.

Battles between the police and strikers who resisted the employers’ use of strikebreakers led to injuries in several ports and the death of one striker. Roosevelt tried again to end the strike, but without success.  On July 3, employers decided to use the police to break the picket line in San Francisco, and succeeded in getting a few trucks through.  They tried again on July 5, leading to a full-scale battle between the police and the strikers.  Two strikers were shot and killed on what became known as Bloody Thursday.

On the following day San Francisco longshoremen called for a general strike.  Teamster locals in both San Francisco and Oakland quickly voted to strike, despite the opposition of their leaders. On July 14, after a number of other unions had voted for a general strike, the San Francisco Labor Council endorsed the action. Some 150,000 workers went out, essentially bringing the city, as well as Oakland, Berkeley and other nearby municipalities, to a halt.  Police tried to break the strike by arresting strike leaders, but the workers held firm.  General Hugh S. Johnson, head of the National Recovery Administration, denounced the strike as a “bloody insurrection” and “a menace to the government.”

After three days, city union leadership, fearful of the growing radicalization of the strikers and worried about escalating threats from employers, called off the strike.  Local ILA unions were forced to accept federal arbitration, but in October, the arbitrator gave the workers most of what they had demanded.

These struggles showed a growth in worker militancy and radicalism that sent shock waves throughout the corporate community as well as the government. As Steve Fraser explains:

General strikes are rare and inherently political. While they last, the mechanisms and authority of the strike supplant or co-exist with those of the “legitimate” municipal government. . . . Barring actual revolution, power ultimately devolves back to where it came from. But the act of calling and conducting a general strike is a grave one. It may have no revolutionary aspirations, yet it opens the door to the unknown. That these two strikes [in Minneapolis and San Francisco] happened in the same year — 1934 — is a barometer of just how far down the road of anti-capitalism the working-class movement had traveled.

Corporate leaders, as the editors of the American Social History Project describe, did not just roll over in the face of this growing activism:

After the employers’ initial shock over Section 7a had worn off, executives in steel, auto, rubber, and a host of other industries followed a two-pronged strategy to forestall unionization: they established or revived company unions to channel workers’ discontent in nonthreatening directions, and they vigorously resisted organizing drives.

Textile employers were among the more ruthless in their response.  The largest strike in 1934 began in September when 376,000 textile workers from Maine to Alabama walked off their jobs. The employers hired spies, fired union activists, and had workers evicted from their company housing.  With the support of a number of governors, they also made use of the National Guard to break strikes.  Many strikers were injured in the violence that followed, some fatally.  The employers rejected a Roosevelt attempt at mediation and after three weeks, the union leadership ended the strike, having suffered a major defeat.

While a few unions were able to take advantage of the NIRA, most were not.  In fact, by early 1935, five hundred AFL local unions had been disbanded. Section 7a’s statement promising workers the right to organize freely turned out to be largely meaningless.  It was supposed to be enforced by a tripartite National Labor Board, but the board was given no real enforcement power, and it often refused to intervene in unionization struggles.  A number of industries, such as auto, were not even covered by it.  By 1935, growing numbers of workers were calling the National Recovery Administration (NRA), which had been established by the President to oversee the NIRA, the National Run Around.

Mounting pressure for a Second New Deal

With his First New Deal, Roosevelt demonstrated a willingness to experiment, but within established limits.  For example, he remained determined to limit federal budget deficits and minimize federal responsibility for relief and job creation.  Thus, his early initiatives failed to calm the political waters.

Economic improvements, while real, were not sufficient to satisfy working people.  Unemployment remained too high, relief programs remained too limited and punitive, and possibilities for improving wages and work conditions remained daunting for most of those with paid employment.  Consequently, left-led movements continued to successfully mobilize, educate, and radicalize growing numbers of workers around demands increasingly threatening to the status quo.

Also noteworthy as an indicator of the tenor of the times was Upton Sinclair’s 1934 run for governor of California.  His popular End Poverty in California movement advocated production for use and not for profit.  Among other things, it called for the state to purchase unused land and factories for use by the unemployed, allowing them to barter what they produced, as well as pensions for the poor and those over sixty years old, all to be financed by higher taxes on the wealthy and corporations.

More right-wing political movements were also gaining in popularity, feeding off popular disenchantment with government policy.  For example, Senator Huey Long of Louisiana criticized Roosevelt for creating huge bureaucracies and supporting monopolization.  In 1934 he launched his Share Our Wealth Plan, which called for a system of taxes on the wealthy to finance guaranteed payments of between three and five thousand dollars per household and pensions for everyone over sixty.  He also advocated a thirty-hour work week and an eleven-month work-year.  His Share Our Wealth Clubs enjoyed a membership of some seven or eight million people, mostly in the South but also in the Midwest and mid-Atlantic states.

Francis Townsend, a retired doctor from California, had his own proposal.  His Townsend Plan called for giving every person over 60 who was not working $200 a month on the promise that they would spend it all during the month.  It also called for abolishing all other forms of federal relief and was to be financed by a regressive national sales tax.  Within two years of the publication of his plan, over 3,000 Townsend Plan Clubs, with some 2.2 million members, had been organized all over the country and began pressuring Congress to pass it.

Despite political differences, all these movements–at least initially in the case of the movements promoted by Long and Townsend–tended to encourage a critical view of private ownership and wealth inequality and most business leaders blamed Roosevelt, and his First New Deal policies, for this development.  They were especially worried about the possibility of greater government regulation of their activities.  In 1934, a number of top business leaders resigned from Roosevelt’s Business Advisory Council and began exploring ways to defeat him in the presidential election of 1936.

In sum, by 1935 Roosevelt was well aware that he needed to act, and act decisively, to reestablish his authority and popularity.  In some ways his decision to launch a more worker-friendly Second New Deal spoke to his limited choices.  Most business leaders had now made clear their opposition not only to his administration but to any new major federal initiatives as well.  In fact, in May 1935, the Supreme Court ruled the National Industrial Recovery Act unconstitutional.  It did the same with the Agricultural Adjustment Act in January 1936.

A do-nothing policy was unlikely to win back business support or strengthen Roosevelt’s political standing given the economy’s weak ongoing expansion.  Thus, as Steve Fraser comments:

The Roosevelt administration needed new allies. To get them it would have to pay closer attention to the social upheavals erupting around the country. The center of gravity was shifting, and the New Deal would have to shift with it or risk isolation.

Roosevelt’s response was the Second New Deal.  His political acumen is well illustrated by the fact that the three signature achievements of the Second New Deal—the Works Progress Administration, the National Labor Relations Act, and the Social Security Act–not only responded to the demands of the mass movements organized by left political forces, but did so in a way that allowed him to take back the initiative from the left.

The Works Progress Administration, created in May 1935, provided meaningful work for millions of jobless workers, satisfying the demands of many of those in the unemployed and relief workers movements.  Moreover, unlike the earlier short-lived CWA that focused on public construction work, the Works Progress Administration also included a Federal Arts Project, a Federal Theater Project, a Federal Writers’ Project, and a Federal Music Project.

The National Labor Relations Act, passed in July 1935, created a framework for protecting the rights of private sector workers to organize into unions of their choosing, engage in collective bargaining, and take collective actions such as strikes.  Many trade unionists celebrated this act, believing that it would secure their rights to organize.  The Social Security Act passed in August 1935 established a system of old-age benefits for workers, benefits for victims of industrial accidents, unemployment insurance, and aid for dependent mothers and children, the blind, and the physically handicapped.  This was a direct response to the demands of the broad workers’ movement for a federally organized system of social protection.

However, as we will see in the next post, while the Second New Deal represented a major step forward for working people, each of these signature initiatives, as designed, fell short of what progressive movements demanded.  Unfortunately, changing political and economic conditions greatly weakened the left over the following years, leaving it unable to sustain its organizing and its pressure on the state.  As a consequence, not only was there no meaningful Third New Deal, but the reforms of the Second New Deal have either been ended (direct public employment), been greatly weakened (labor protections), or come under attack (social security).

Lessons

The New Deal was not a program conceived and implemented at a moment in time by a government committed to transformative policies in defense of popular needs.  Rather, it encompassed two very different New Deals, with the Second far more progressive than the First.  Moreover, the Second New Deal did not emerge as a natural evolution of the First New Deal.  Rather, as shown above, it was largely a response to continued popular pressure from movements with strong left leadership.

This history holds an important lesson for those advocating for a Green New Deal.  It is unlikely that popular movements can win and secure full implementation of their demands for a Green New Deal at one historical moment.  Rather, if we succeed, it will take time.  However, as the history of the New Deal shows, the process of change is not likely to be advanced by advocating for modest demands in the belief that these can be easily won and that governments will be predisposed to extend and deepen the required interventions over time.  Rather, we need to build awareness that our political leaders will most likely respond to our efforts with reforms designed to blunt or contain our demands for change. Thus, it is necessary to put forward the most progressive demands that can win popular support at the time while preparing movement participants for the fact that the struggle to win meaningful transformative policies will be long and complex.

Another lesson from the New Deal experience is that movement building must itself be a dynamic process, adapting and transforming in response to political and economic developments.  It was the movement of the unemployed that spearheaded the political pressures leading to the First New Deal.  First New Deal policies and the economic recovery then changed the organizing terrain, leading to new organizing of relief workers and trade unions.

At the same time, there were strong threads tying these movements together. One of the most important was that experience in one, say the unemployed movement, provided an education that helped create organizers able to spur the work of the newer movements, for example that of relief workers.  And because all of these movements owed much to the work of left political groups, there was a common vision that tied them together and encouraged each to support the struggles of the others and to join in support of even bigger demands, such as a new system of social insurance.

Advocates of a Green New Deal need to pay careful attention to this organizing experience. Given the Green New Deal’s multidimensional concerns, achieving it will likely require organizing in many different arenas, which may well mean, at least at an early stage, building a number of different movements, each with its own separate concerns.  The challenge will be finding ways to ensure coordination, productive interactions and interconnections, and an emerging unified vision around big transformative demands.

What the New Deal can teach us about winning a Green New Deal: Part III—the First New Deal

In Part I and Part II of this series on lessons to be learned from the New Deal I argued that despite the severity of the Great Depression, sustained organizing was required to transform the national political environment and force the federal government to accept direct responsibility for financing relief and job creation programs. In this post, I begin an examination of the evolution and aims of New Deal programs in order to highlight the complex and conflictual nature of a state-directed reform process.

The New Deal is often talked about as if it were a set of interconnected programs that were introduced at one moment in time to reinvigorate national economic activity and ameliorate the hardships faced by working people.  Advocates for a Green New Deal, which calls for a new state-led “national, social, industrial, and economic mobilization” to confront our multiple interlocking problems, tend to reinforce this view of the New Deal.  It is easy to understand why: state action is desperately needed, and pointing to a time in history when it appears that the state rose to the occasion, developing and implementing the programs necessary to solve a crisis, makes it easier for people to envision and support another major effort.

Unfortunately, this view misrepresents the experience of the New Deal.  And, to the extent it influences our approach to shaping and winning a Green New Deal, it weakens our ability to successfully organize and promote the kind of state action we want.

The New Deal actually encompasses two different periods; the First New Deal was begun in 1933, the Second New Deal in 1935.  In both periods, the programs designed to respond to working class concerns fell far short of popular demands.  In fact, it was continued mass organizing, spearheaded by an increasingly unified unemployed movement and an invigorated trade union movement, that pushed the Roosevelt administration to initiate its Second New Deal, which included new and significantly more progressive initiatives.

Unfortunately, as those social movements lost energy and vision in the years that followed, pressure on the state for further change largely abated, leaving the final reforms won compromised and vulnerable to future attack.   The lesson from this history for those advocating for a Green New Deal is clear: winning a Green New Deal requires, in addition to carefully constructed policy demands, an approach to movement building that prepares people for a long struggle to overcome expected state efforts to resist the needed transformative changes.

The First New Deal

Roosevelt’s initial policies were largely consistent with those of the previous Hoover administration.  Like Hoover, he sought to stabilize the banking system and balance the budget.  On his first day in office Roosevelt declared a national bank “holiday,” dismissing Congressional sentiment for bank nationalization.  He then rushed through a new law, the Emergency Banking Act, which gave the Comptroller of the Currency, the Secretary of the Treasury, and the Federal Reserve new powers to ensure that reopened banks would remain financially secure.

On his sixth day in office, he requested that Congress cut $500 million from the $3.6 billion federal budget, eliminate government agencies, reduce the salaries of civilian and military federal workers, and slash veterans’ benefits by 50 percent.  Congressional resistance led to spending cuts of “only” $243 million.

Roosevelt remained committed, against the advice of many of his most trusted advisers, to balanced budget policies for most of the decade.  While his administration did boost government spending to nearly double the levels of the Hoover administration, it also collected sufficient taxes to keep deficits low.  It wasn’t until 1938 that Roosevelt proposed a Keynesian-style deficit spending plan.

At the same time, facing escalating demands for action from the unemployed as well as many elected city leaders, Roosevelt also knew that the status quo was politically untenable.  And, in an effort to halt the deepening depression and growing militancy of working people, he pursued a dizzying array of initiatives, most within his first 100 days in office.  The great majority were aimed at stabilizing or reforming markets, which Roosevelt believed was the best way to restore business confidence, investment, and growth.  This emphasis is clear from the following list of some of his most important initiatives.

  • The Agricultural Adjustment Act (May 1933). The act sought to boost the prices of agricultural goods. The government bought livestock and paid subsidies to farmers in exchange for reduced planting. It also created the Agricultural Adjustment Administration to manage the payment of subsidies.
  • The Securities Act of 1933 (May 1933). The act sought to restore confidence in the stock market by requiring that securities issuers disclose all information necessary for investors to be able to make informed investment decisions.
  • The Home Owners’ Loan Act of 1933 (June 1933). The act sought to stabilize the finance industry and housing industry by providing mortgage assistance to homeowners. It created the Home Owners Loan Corporation which was authorized to issue bonds and loans to help homeowners in financial difficulties pay their mortgages, back taxes, and insurance.
  • The Banking Act of 1933 (June 1933). The act separated commercial and investment banking and created the Federal Deposit Insurance Corporation to insure bank deposits, curb bank runs, and reduce bank failures.
  • Farm Credit Act (June 1933). The act established the Farm Credit System as a group of cooperative lending institutions to provide low cost loans to farmers.
  • National Industrial Recovery Act (June 1933). Title I of the act suspended anti-trust laws and required companies to write industrywide codes of fair competition that included wage and price fixing, the establishment of production quotas, and restrictions on market entry.  It also gave workers the right to organize unions, although without legal protection.  Title I also created the National Recovery Administration to encourage business compliance.  The Supreme Court ruled the suspension of anti-trust laws unconstitutional in 1935.  Title II, which established the Federal Emergency Administration of Public Works or Public Works Administration, is discussed below.

Roosevelt also pursued several initiatives in response to working class demands for jobs and a humane system of relief.  These include:

  • The Emergency Conservation Work Act (March 1933). The act created the Civilian Conservation Corps which employed jobless young men to work in the nation’s forests and parks, planting trees, reducing erosion, and fighting fires.
  • The Federal Emergency Relief Act of 1933 (May 1933). The act created the Federal Emergency Relief Administration to provide work and cash relief for the unemployed.
  • The Federal Emergency Administration of Public Works or Public Works Administration (June 1933). Established under Title II of the National Industrial Recovery Act, the Public Works Administration was a federally funded public works program that financed private construction of major public projects such as dams, bridges, hospitals, and schools.
  • The Civil Works Administration (November 1933).  Established by executive order, the Civil Works Administration was a short-lived jobs program that employed jobless workers at mostly manual-labor construction jobs.

This is without doubt an impressive record of accomplishments, and it doesn’t include other noteworthy actions, such as the establishment of the Tennessee Valley Authority, the ending of prohibition, and the removal of the US from the gold standard.  Yet, when looked at from the point of view of working people, this First New Deal was sadly lacking.

Roosevelt’s pursuit of market reform rather than deficit spending meant a slow recovery from the depths of the recession.  In fact, John Maynard Keynes wrote Roosevelt a public letter in December 1933, pointing out that the Roosevelt administration appeared more concerned with reform than recovery or, to be charitable, was confusing the former with the latter.  Primary attention, he argued, should be on recovery, and that required greater government spending financed by loans to increase national purchasing power.

Roosevelt also refused to address one of the unemployed movement’s major policy demands: the establishment of a federal unemployment insurance fund financed by taxes on the wealthy.  Finally, as we see next, even the New Deal’s early job creation and relief initiatives were deliberately designed in ways that limited their ability to meaningfully address their targeted social concerns.

First New Deal employment and relief programs

The Roosevelt administration’s first direct response to the country’s massive unemployment was the Civilian Conservation Corps (CCC).  Its enrollees, as Roosevelt explained, were to be “used in simple work, not interfering with normal employment and confining itself to forestry, the prevention of soil erosion, flood control, and similar projects.”  The program was important for establishing a new level of federal responsibility, as employer of last resort, for boosting employment.  Over its nine-year lifespan, its participants built thousands of miles of hiking trails, planted millions of trees, and fought hundreds of forest fires.

However, the program was far from meeting the needs of the tens of millions of jobless workers and their dependents.  Participation in the program was limited to unmarried male citizens, 18 to 25 years of age, whose families were on local relief, and who were able to pass a physical exam.  By law, maximum enrollment in the program was limited to 300,000.

Moreover, although the CCC provided its participants with shelter, clothing, and food, the wages it paid, $30 a month ($25 of which had to be sent home to their families), were low.  And, while white and black enrollees were supposed to be housed together in the CCC camps, where participants lived under Army supervision, many of the camps were segregated, with whites given preference for the best jobs.

Two months later, the Roosevelt administration launched the Federal Emergency Relief Administration (FERA), the first program of direct federal financing of relief.  Under the Hoover administration, the federal government had restricted its support of state relief efforts to the offer of loans.  Because of the precariousness of their own financial situation, many states were unable to take on new debt, and were thus left with no choice but to curtail their relief efforts.

FERA, in contrast, offered grants as well as loans, providing approximately $3 billion in grants over its 2 ½ year lifespan. The grants allowed state and local governments to employ people who were on relief rolls to work on a variety of public projects in agriculture, the arts, construction and education.  FERA grants supported the employment of over 20 million people, or about 16 percent of the total population of the United States.

However, the program suffered from a number of shortcomings.  FERA provided funds to the states on a matching basis, with states required to contribute three dollars for every federal dollar.  This restriction meant that a number of states, struggling with budget shortfalls, either refused to apply for FERA grants or kept their requests small.

Also problematic was the program’s requirement that participants be on state relief rolls.  This meant that only one person in a family was eligible for FERA work.  And the amount of pay or relief was determined by a social worker’s evaluation of the extent of the family’s financial need.  Many states had extremely low standards of necessity, resulting in either low wages or inadequate relief payments which could sometimes be limited to coupons exchangeable only for food items on an approved list.

Finally, FERA was not directly involved in the administration and oversight of the projects it funded. This meant that compensation for work and working conditions differed across states.  It also meant that in many states, white males were given preferential treatment.

A month later, the Public Works Administration (PWA) was created as part of the National Industrial Recovery Act.  The PWA was a federal public works program that financed private construction of major long-term public projects such as dams, bridges, hospitals, and schools.  Administrators at PWA headquarters planned the projects and then gave funds to appropriate federal agencies to enable them to help state and local governments finance the work. The PWA played no role in hiring or production; private construction companies carried out the work, hiring workers on the open market.

The program lasted for six years, spent $6 billion, and helped finance a number of important infrastructure projects.  It also gave federal administrators valuable public policy planning experience, which was put to good use during World War II.  However, as was the case with FERA, PWA projects required matching contributions from state and local governments, and given their financial constraints, the program never spent as much money as was budgeted.

These programs paint a picture of a serious but limited effort on the part of the Roosevelt administration to help workers weather the crisis.  In particular, the requirement that states match federal contributions to receive FERA and PWA funds greatly limited their reach.  And, the participant restrictions attached to both the CCC and FERA meant that program benefits were far from adequate.  Moreover, because all of these were new programs, it often took time for administrators to get funds flowing, projects developed, participants chosen, and benefits distributed.  Thus, despite a flurry of activity, millions of workers and their families remained in desperate conditions with winter approaching.

Pressed to do more, the Roosevelt administration launched its final First New Deal jobs program in November 1933, the Civil Works Administration (CWA), under the umbrella of FERA.  It was designed to be a short-term program, and it lasted only 6 months, with most employment creation ending after 4 months.  The jobs created were primarily low-skilled construction jobs, improving or constructing roads, schools, parks, airports, and bridges. The CWA gave jobs to some 4 million people.

This was a dramatically different program from those discussed above.  Most importantly, employment was not limited to those on relief, greatly enlarging the number of unemployed who could participate.  At the end of Hoover’s term in office, only one unemployed person out of four was on a relief roll.  It also meant that participants would not be subject to the relief system’s humiliating means tests or have their wages tied to their family’s “estimated budgetary deficit.”  Also significant was the fact that although many of the jobs were inherited from current relief projects, CWA administrators made a real effort to employ their workers in new projects designed to be of value to the community.

For all of these reasons, jobless workers flocked to the program, seeking an opportunity to do, in the words of the time, “real work for a real wage.”   As Harry Hopkins, the program’s chief administrator, summed up in a talk shortly after the program’s termination:

When we started Civil Works we said we were going to put four million men to work.  How many do you suppose applied for those four million jobs? About ten million. Now I don’t say there were ten million people out of work, but ten million people walked up to a window and stood in line, many of them all night, asking for a job that paid them somewhere between seven and eighteen dollars a week.

In point of fact, there were some fifteen million people unemployed.  And as the demand for CWA jobs became clear, Roosevelt moved to end the program.   As Jeff Singleton describes:

In early January Hopkins told Roosevelt that CWA would run out of funds sooner than expected.  According to one account, Roosevelt “blew up” and demanded that Hopkins begin phasing out the program immediately.  On January 18 Hopkins ordered weekly wages cut (through a reduction in hours worked) and hinted that the program would be terminated at the beginning of March.  The cutback, coming at a time when the program had just reached its promised quota, generated a storm of protest and a movement in Congress to continue CWA through the spring of 1934.  These pressures helped the New Deal secure a new emergency relief appropriation of $950 million, but the CWA was phased out in March and April.

Lessons

The First New Deal did represent an important change in the economic role of the federal government.  In particular, the Roosevelt administration broke new ground in acknowledging federal responsibility for job creation and relief.  Yet, the record of the First New Deal also makes clear that the Roosevelt administration was reluctant to embrace the transformative role that many now attribute to it.

As Keynes pointed out, Roosevelt’s primary concern in the first years of his administration was achieving market stability through market reform, not taking a larger financial stake in the economy to speed recovery.  In fact, in some cases, his initiatives gave private corporations even greater control over market activity.

The Roosevelt administration response to worker demands for jobs and a more humane system of welfare was also far from transformative.  Determined to place limits on federal spending, its major initiatives required substantial participation from struggling state governments.  They also did little to challenge the punitive and inadequate relief systems operated by state governments.  The one exception was the CWA, which mandated wage-paying federally directed employment.  And that was the one program, despite its popularity, that was quickly terminated.

Of course, there was a Second New Deal, which included a number of important and more progressive initiatives, including the Works Progress Administration, the Social Security Act, and the National Labor Relations Act.  However, as I will discuss in the next post in this series, this Second New Deal was largely undertaken in response to the growing strength of the unemployed movement and workplace labor militancy.   And as we shall see, even these initiatives fell short of what many working people demanded.

One lesson to be learned from this history for those advocating a Green New Deal is that major policy transformations do not come ready-made or emerge fully developed.  Even during a period of exceptional crisis, the Roosevelt administration was hesitant to pursue truly radical experiments.  And the evolution of its policy owed far more to political pressure than to the maturation of its administrative capacities or a newfound determination to experiment.

If we hope to win a Green New Deal we will have to build a movement that is not only powerful enough to push the federal government to take on new responsibilities with new capacities, but also has the political maturity required to appreciate the contested nature of state policy and the vision necessary to sustain its forward march.

What the New Deal can teach us about winning a Green New Deal: Part I–Confronting Crisis

The New Deal has recently become a touchstone for many progressive efforts, illustrated by Bernie Sanders’ recent embrace of its aims and accomplishments and the popularity of calls for a Green New Deal.  The reasons are not hard to understand. Once again, growing numbers of people have come to the conclusion that our problems are too big to be solved by individual or local efforts alone, that they are structural and thus innovative and transformative state-led actions will be needed to solve them.

The New Deal was indeed a big deal and, given contemporary conditions, it is not surprising that people are looking back to that period for inspiration and hope that meaningful change is possible.  However, inspiration, while important, is not the same as seeking and drawing useful organizing and strategic lessons from a study of the dynamics of that period.

This is the first of a series of posts in which I will try to illuminate some of those lessons.  In this first post I start with the importance of crisis as a motivator of change.  What the experience of the Great Depression shows is that years of major economic decline and social devastation are not themselves sufficient to motivate business and government elites to pursue policies likely to threaten the status quo.  It was only after three and a half years of organizing had also created a political crisis that the government began taking halting steps toward serious change, marked by the policies associated with the First New Deal.  In terms of contemporary lessons, this history should serve to dispel any illusions that simply establishing the seriousness of our current multifaceted crisis will be enough to win elite consideration of a transformative Green New Deal.

The Great Depression

The US economy expanded rapidly throughout the 1920s, a period dubbed the Roaring Twenties. It was a time of rapid technological change, business consolidation, and wealth concentration.  It was also a decade when many traditional industries struggled, such as agriculture, textiles, coal, and shipbuilding, as did most of those who worked in them.  Growth was increasingly sustained by consumer demand underpinned by stock market speculation and debt.

The economy suffered a major downturn in 1920-21, and then mild recessions in 1924 and 1927.  And there were growing signs of the start of another recession in summer 1929, months before the October 1929 stock market collapse, which triggered the beginning of the Great Depression.  The collapse quickly led to the unraveling of the US economy.

The Dow Jones average dropped from 381 in September 1929 to 41 in mid-1932.  Manufacturing output fell by roughly 40 percent between 1929 and 1933.  The number of full-time workers at United States Steel went from 25,000 in 1929 to zero in 1933.  Five thousand banks failed over the same period.  Steve Fraser captured the extent and depth of the decline as follows: “In early 1933, thirty-six of forty key economic indicators had arrived at the lowest point they were to reach during the whole eleven grim years of the Great Depression.”

The resulting crisis hit working people hard.   Between 1930 and 1932, the number of unemployed grew from 3 million to 15 million, or approximately 25 percent of the workforce.  The unemployment rate for those outside the agricultural sector was close to 37 percent.  As Danny Lucia describes:

Workers who managed to hold onto their jobs faced increased exploitation and reduction in wages and hours, which made it harder for them to help out jobless family and friends. The social fabric of America was ripped by the crisis: One-quarter of children suffered malnutrition, birth rates dropped, suicide rates rose. Many families were torn apart. In New York City alone, 20,000 children were placed in institutions because their parents couldn’t support them. Homeless armies wandered the country on freight trains; one railroad official testified that the number of train-hoppers caught by his company ballooned from 14,000 in 1929 to 186,000 in 1931.

“Not altogether a bad thing”

Strikingly, despite the severity of the economic and social crisis, business leaders and the federal government were in no hurry to act.  There was certainly no support for any meaningful federal relief effort.  In fact, business leaders initially tended to downplay the seriousness of the crisis and were generally optimistic about a quick recovery.

As the authors of Who Built America (volume 2) noted:

when the business leaders who made up the National Economic League were asked in January 1930 what the country’s ‘paramount economic problems’ were, they listed first, ‘administration of justice,’ second, ‘Prohibition,’ and third, ‘lawlessness.’ Unemployment was eighteenth on their list!

Some members of the Hoover administration tended to agree. Treasury Secretary Andrew Mellon thought the crisis was “not altogether a bad thing.”  “People,” he argued, “will work harder, live a more moral life.  Values will be adjusted, and enterprising people will pick up the wrecks from less competent people.”

President Hoover repeatedly stated that the economy was “on a sound and prosperous basis.”  The solution to the crisis, he believed, was to be found in restoring business confidence and that was best achieved through maintaining a balanced budget.  When it came to relief for those unemployed or in need, Hoover believed that the federal government’s main role was to encourage local government and private efforts, not initiate programs of its own.

At the time of the stock market crash, relief for the poor was primarily provided by private charities, which relied on donations from charitable and religious organizations.  Only 8 states had any type of unemployment insurance.  Not surprisingly, this system was inadequate to meet popular needs.  As the authors of Who Built America explained:

by 1931 most local governments and many private agencies were running out of money for relief.  Sometimes needy people were simply removed from the relief rolls.  According to one survey, in 1932 only about one-quarter of the jobless were receiving aid.  Many cities discriminated against nonwhites.  In Dallas and Houston, African-Americans and Mexican-Americans were denied any assistance.

It was not until January 1932 that Congress made its first move to strengthen the economy, establishing the Reconstruction Finance Corporation (RFC) to provide support to financial institutions, corporations, and railroads.  Six months later, in July, it approved the Emergency Relief and Construction Act, which broadened the scope of the RFC, allowing it to provide loans to state and local governments for both public works and relief.  However, the Act was structured in ways that undermined its effectiveness. For example, the $322 million allocated for public works could only be used for projects that would generate revenue sufficient to pay back the loans, such as toll bridges and public housing.  The $300 million allocated for relief also had to be repaid.  Already worried about debt, many local governments refused to apply for the funds.

Finally, as 1932 came to a close, some business leaders began considering the desirability of a significant federal recovery program, but only for business.  Most of their suggestions were modeled on World War I programs and involved government-business partnerships designed to regulate and stabilize markets.  There was still no interest in any program involving sustained and direct federal relief to the millions needing jobs, food, and housing.

By the time of Roosevelt’s inauguration in March 1933, the economy, as noted above, had fallen to its lowest point of the entire depression.  Roosevelt had won the presidency promising “a new deal for the American people,” yet his first initiatives were very much in line with the policies of the previous administration. Two days after his inauguration he declared a national bank holiday, which shut down the entire banking system for four days and ended a month-long run on the banks. The “holiday” gave Congress time to approve a new law empowering the Federal Reserve Board to supply unlimited currency to reopened banks, reassuring the public about the safety of their accounts.

Six days after his inauguration, Roosevelt, who had campaigned for the presidency in part on a pledge to balance the federal budget, submitted legislation to Congress that would have cut $500 million from the $3.6 billion federal budget.  He proposed eliminating government agencies, reducing the pay of civilian and military federal workers (including members of Congress), and slashing veterans’ benefits by 50 percent.  In the face of congressional opposition, the bill that finally passed cut spending by “only” $243 million.

Lessons

It is striking that some 3½ years after the start of the Great Depression, despite the steep decline in economic activity and the enormous pain and suffering felt by working people, business and government leaders were still not ready to support any serious federal program of economic restructuring or direct relief.  That history suggests that even a deep economic and social crisis cannot be counted on to push elites toward policies that might upset existing structures of production or relations of power.  This is an important insight for those hoping that recognition of the seriousness of our current environmental crisis will, by itself, make business or government receptive to transformative new policies.

Of course, we do know that in May 1933 Roosevelt finally began introducing relief and job creation programs as part of his First New Deal.  And while many factors might have contributed to such a dramatic change in government policy, one of the most important was the growing movement of the unemployed and their increasingly militant and collective action in defense of their interests.  Their activism was a clear refutation of business and elite claims that prosperity was just around the corner.  It also revealed a growing radical spark, as more and more people openly challenged the legitimacy of the police, courts, and other state institutions.  As a result, what was an economic and social crisis also became a political crisis.  As Adolf Berle, an important member of Roosevelt’s “Brain Trust,” wrote, “we may have anything on our hands from a recovery to a revolution.”

In Part II, I will discuss the rise and strategic orientation of the unemployed movement, highlighting the ways it was able to transform the political environment and thus encourage government experimentation.  And I will attempt to draw out some of the lessons from this experience for our own contemporary movement-building efforts.

The 1933 programs, although important for breaking new ground, were exceedingly modest.  And, as I will discuss in a future post, it was only the rejuvenated labor movement that pushed Roosevelt to implement significantly more labor-friendly policies in the Second New Deal starting in 1935.  Another post will focus more directly on the development and range of New Deal policies in order to shed light on the forces driving state policy as well as the structural dynamics that tend to limit its progressive possibilities, topics of direct relevance to contemporary efforts to envision and advance a Green New Deal agenda.

The Trump Administration: Lots of Noise But Nothing Changed For US TNCs

President Trump has long pointed to the US balance of payments deficit as a sign of US economic weakness. Of course, his nation-state framing is misleading, as is his claim that trade deficits with countries such as China and Mexico are the result of unfair trading practices that benefit foreign business and workers at the expense of US business and workers.  These deficits owe much to cross-border production networks controlled by US transnational corporations (TNCs), networks that have boosted US corporate power and profits largely at the expense of workers in all three countries.

Criticizing past administrations for selling out America, President Trump has pursued a series of policies—renegotiated trade agreements, tariff wars, public shaming of corporate disinvestment, and tax reform—all of which are supposed to help rebuild the US economy by encouraging US firms to modernize and expand their US operations. These policies have all failed to achieve their stated aim.  In fact, they have, largely by design, only served to strengthen existing corporate-dominated patterns of international production and value capture.

As a result, there has been little change in US trade patterns.  The US trade deficit in goods, as shown below, has continued to grow every year of the Trump presidency.

Strengthening TNC power and profits

After first threatening to dissolve NAFTA, President Trump eventually pursued a rewrite of the NAFTA agreement.  However, his proposed changes to the agreement primarily speak to corporate needs, especially the new chapters that increase protection for intellectual property rights and promote greater cross-border freedom for electronic commerce and digital trade.  Similarly, the Trump tariff “war” against China appears primarily aimed at forcing the Chinese government to tighten regulations protecting US intellectual property rights and open new sectors of its economy to US foreign investment, especially the finance sector.

President Trump has also engaged in occasional Twitter “wars” against corporate decisions to close domestic operations or relocate parts of them abroad.  Initially, corporate leaders felt pressure to modify or delay their decisions.  Now, no doubt reassured by the general policy direction of the Trump administration, they no longer appear worried about his periodic outbursts.  For example, both GM and Harley-Davidson recently announced plans to shut domestic plants in favor of overseas production and have largely ignored Trump’s tweets critical of their globalization activities.

Much has been written about these efforts, but little about the consequences of the last policy, the tax cuts, for US TNC decision-making.  The “Tax Cuts and Jobs Act,” signed into law on Dec. 22, 2017, was promoted as a way to encourage US transnational corporations to bring back funds held outside the country and boost their domestic investment.  However, as a Bank of France blog post by Cristina Jude and Francesco Pappadà makes clear, this initiative, like the others, has done nothing to change US corporate behavior, although the lower tax rates have made that behavior more profitable.

Jude and Pappadà focus on profit hoarding and profit shifting.  Profit hoarding refers to the accumulation of “non-repatriated earnings” by US TNCs.  Economists estimated that US firms held approximately $2.5 trillion outside the country at the end of 2017, and the Trump administration predicted that a large share would be brought back thanks to the one-time lower tax rate included in the 2017 act.  Apple alone is said to hold $252 billion in offshore accounts.

Although economists speak of corporate earnings held abroad, in fact most of those earnings are held in the US.  As long as those funds are not used for certain purposes, such as paying dividends to shareholders, financing domestic acquisitions, guaranteeing loans, or making investments in physical capital in the US, they can be invested in US financial assets tax free.

As we can see in the chart below, US companies did respond to the one-time lower tax rate by “repatriating” some funds.  Dividend payouts went up, which resulted in a period of negative “reinvested earnings” in foreign affiliates.

However, as Jude and Pappadà explain:

Despite the permanent cut of the standard corporate tax rate from 35 percent to 21 percent, the adjustment of repatriated dividends and reinvested earnings appears limited to the first and second quarters of 2018. Indeed, dividends decrease substantially in quarter three, whereas reinvested earnings return to positive as they were before the tax reform.

The response of US companies to the corporate tax reform mainly consisted in the partial repatriation of previously accumulated stocks of earnings (around 20 percent of the total) due to the temporary lower tax. This firms’ behavior is similar to the one observed in 2005 when another law granted US multinationals a one-year tax holiday to repatriate foreign profits at a 5.25 percent tax rate.

Thus, the tax change produced a one-time shift in a relatively small share of the non-repatriated earnings held by leading US TNCs, with stock owners the primary beneficiaries. Moreover, this shift did not change the overall size of income receipts from US foreign direct investment, as the increase in dividends was offset by the negative reinvested earnings.
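
This offset is largely mechanical.  In the balance of payments, income receipts from US foreign direct investment are, roughly speaking, the sum of dividends repatriated from foreign affiliates and the earnings reinvested in them; a stylized sketch of the accounting is:

```latex
\text{FDI income receipts} \;\approx\; \underbrace{D}_{\text{repatriated dividends}} \;+\; \underbrace{E - D}_{\text{reinvested earnings}} \;=\; E
```

If an affiliate earns E in a quarter and pays out D in dividends, reinvested earnings are E − D; when D exceeds E, as it did while firms drew down accumulated past earnings, reinvested earnings turn negative, but total receipts still equal E.  Repatriation reshuffles the components without changing the total.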

If the “Tax Cuts and Jobs Act” is to have a long-lasting effect on the US trade balance, it needs to stop the corporate practice of profit shifting, which is how TNCs generated the huge sum of money held as non-repatriated earnings.  Profit shifting refers to the corporate strategy of using various means, such as transfer pricing (often built on intellectual property rights over patents and trademarks), to book profits generated from US activities in a lower-tax jurisdiction.  As Jude and Pappadà point out, “six small jurisdictions (Bermuda, Ireland, Luxembourg, the Netherlands, Singapore and Switzerland), which count for less than 1 percent of the world’s population, hold 63 percent of the overall profits earned abroad by US multinationals.”

Google is, as Tim Hyde explains, one of the firms that makes good use of this strategy:

it is able to claim billions of profits in Bermuda each year (corporate tax rate: 0 percent) even though it has no office building there and not even any employees on the island. . . . this is legitimate because the rights to Google’s search and advertising technologies are technically owned by a subsidiary called Google Holdings housed in Bermuda, thanks in part to a trick called the Double Irish Dutch Sandwich. Other Google subsidiaries pay billions in royalties to the Bermudian company Google Holdings for the rights to use its technology, which was originally invented by Google employees in California and sold to Google Holdings in 2001. Those billions of profits are reclassified as Bermudian rather than American or Irish and thus not taxed.
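
The arithmetic behind this kind of royalty-based shifting is simple.  The sketch below uses purely illustrative figures and the current 21 percent US statutory rate; the numbers and the helper function are hypothetical, not a reconstruction of Google’s actual accounts.

```python
# Stylized profit shifting via an intra-company royalty (illustrative numbers only).
# A parent books operating profit in a standard-tax country; a tax-haven subsidiary
# owns the intellectual property and charges a royalty that absorbs most of that profit.

def total_tax(operating_profit, royalty, home_rate=0.21, haven_rate=0.0):
    """Tax paid at home plus tax paid in the haven, all figures in $ millions."""
    home_taxable = operating_profit - royalty   # the royalty is deductible where the profit is earned
    home_tax = max(home_taxable, 0) * home_rate
    haven_tax = royalty * haven_rate            # the royalty income is booked in the haven
    return home_tax + haven_tax

print(round(total_tax(1_000, royalty=0), 1))    # 210.0 -- no shifting: all $1,000m taxed at 21%
print(round(total_tax(1_000, royalty=900), 1))  # 21.0  -- a $900m royalty leaves only $100m taxable at home
```

The profit has not moved in any physical sense; only the paperwork locating it has.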

If US firms booked their earnings in the US, rather than in a foreign tax haven, foreign direct investment receipts would decline, net US service exports would increase, and the overall trade deficit would narrow.

An Oxfam study of profit shifting by leading pharmaceutical companies shows just how important this strategy is to US TNCs and how much we lose from it:

Abbott, Johnson & Johnson, Merck, and Pfizer—systematically stash their profits in overseas tax havens. As a result, these four corporate giants appear to deprive the United States of $2.3 billion annually and deny other advanced economies of $1.4 billion. And they appear to deprive the cash-strapped governments of developing countries of an estimated $112 million every year—money that could be spent on vaccines, midwives, or rural clinics.

Pharma corporations’ “profit-shifting” may take the form of “domiciling” a patent or rights to its brand not where the drug was actually developed or where the firm is headquartered, but in a tax haven, where a company’s presence may be as little as a mailbox. That tax haven subsidiary then charges hefty licensing fees to subsidiaries in other countries. The fees are a tax-deductible expense in the jurisdictions where taxes are standard, while the fee income accrues to the subsidiary in the tax haven, where it is taxed lightly or not at all. Loans from tax-haven subsidiaries and fees for their “services” are other common strategies to avoid taxes. . . .

Further opportunities for avoiding taxes involve locating corporate brand or patents in tax havens, and fees for marketing, finance, or management services. For example, a pharmaceutical corporation may bill much of its R&D costs on products consumed around the globe to a subsidiary in a tax haven where R&D rights are registered, even though not a single researcher is based there. That immediately creates a cost in the country where the product is consumed, which minimizes the tax bill, and an artificial profit in the tax havens, where almost no taxes are paid in return.

As a result of this practice:

Pfizer posted losses on US operations of 8 percent in 2013, 25 percent in 2014, and 31 percent in 2015. The pattern has continued, with Pfizer posting losses of 32 percent in 2016 and 26 percent in 2017. Meanwhile, Pfizer’s international operations earned 56–58 percent in 2013–2015 and even more in the two years since (64 and 72 percent). The story is similar though less extreme for Abbott and Johnson & Johnson.

The pharmaceutical industry is no outlier.  According to a study by three economists, Thomas Tørsløv, Ludvig Wier, and Gabriel Zucman, “close to 40 percent of multinational profits were artificially shifted to tax havens in 2015.”

And, as the chart below reveals, there is no sign that passage of the Tax Cuts and Jobs Act has produced any change in US TNC profit-shifting activities.  As Jude and Pappadà discuss:

in Chart 2, we observe a change in the composition of foreign direct investment income, but the balance remains stable at its pre-reform level. Moreover, this is not associated with an increase in net exports of services. In particular, the decomposition of the services trade balance in Chart 3 shows that there has not been any increase in intellectual property charges, for which profit shifting is more relevant. At the moment, it is too soon to assess the full impact of the reform as US multinationals may take time to adjust the location of their assets and activities. However, the profit shifting decisions of multinational firms do not seem to be affected so far.

In sum, for all of Trump’s bluster, his administration has done nothing to change TNC business practices or improve the health of the US economy.  In fact, quite the opposite is true, as almost all of his initiatives have been designed, above all, to expand the reach and profitability of leading US corporations.

The Trump Tax Plan Proves A Bonanza For Business

Every time a progressive policy captures the public imagination, like the Green New Deal, opponents are quick to raise the revenue question in an effort to discredit it.  While higher taxes on the wealthy and leading corporations should be an obvious starting point in any response, until recently elites have been remarkably successful in winning tax reductions, spinning the argument that cuts are the best way to stimulate private investment and create jobs.  And they have enjoyed a double gain: not only do the cuts benefit them financially, but the loss of public revenue also encourages people to think small when it comes to public policy.

However, there are signs that the times might be changing.  Alexandria Ocasio-Cortez’s proposal to tax annual incomes over $10 million at a marginal rate of 70 percent has won significant public support. Strong popular opposition in New York to a plan to heavily subsidize a new Amazon headquarters forced the company to withdraw its proposal. And then there is the negative lesson of the Wisconsin fiasco, where the state showered Foxconn with massive tax breaks and other subsidies in an effort to land a new manufacturing facility, only to have the company walk back its commitments after significant state expenditures.

But there is still important education as well as political work that remains to be done to win majority support for the kind of tax reform we so desperately need. President Trump’s “Tax Cuts and Jobs Act,” which was signed into law on Dec. 22, 2017, is one example of what we are up against.

The “American model”

President Trump’s signature tax law included significant benefits for the wealthy as well as most major corporations.  Looking just at the business side, the law:

  • lowered the US corporate tax rate from 35 percent to 21 percent and eliminated the corporate Alternative Minimum Tax.
  • changed the federal tax system from a global to a territorial one.  Under the previous global tax system, US multinational corporations were supposed to pay the 35 percent US tax rate for income earned in any country in which they had a subsidiary, less a credit for the income taxes they paid to that country. However, the tax payment could be deferred until the earnings were repatriated.  Under the new territorial tax system, each corporate subsidiary only has to pay the tax rate of the country in which it is legally established; foreign profits face no additional US taxes.
  • established a new “global minimum” tax of 10.5 percent that is applied only to total foreign earnings greater than a newly established “normal rate of return” on tangible investments in plant and equipment (set at 10 percent); a rough numerical sketch of how this works appears just after this list.
  • offered multinational corporations a one-time special lower tax rate (8 percent on illiquid holdings and 15.5 percent on cash) on repatriated funds that were held overseas by corporate subsidiaries, including those in tax-haven countries.
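
As flagged above, here is a rough numerical sketch of how that minimum tax works.  The figures are invented for illustration, and the actual rules (foreign tax credits, expense allocation, and so on) are considerably more complicated.

```python
# Rough sketch of the new "global minimum" tax described above: only foreign earnings
# above a 10% "normal return" on tangible assets abroad are taxed, at 10.5%.
# All numbers are invented for illustration, in $ millions.

def minimum_tax(foreign_earnings, tangible_assets_abroad, normal_return=0.10, rate=0.105):
    exempt = tangible_assets_abroad * normal_return   # earnings treated as a "normal return"
    taxable = max(foreign_earnings - exempt, 0)       # only the excess is subject to the minimum tax
    return taxable * rate

# $500m of foreign earnings on a $2,000m base of foreign plant and equipment:
print(round(minimum_tax(500, 2_000), 1))   # 31.5 -- $200m exempt, $300m taxed at 10.5%

# Doubling the offshore tangible-asset base shields another $200m of earnings:
print(round(minimum_tax(500, 4_000), 1))   # 10.5 -- a built-in reward for expanding plant abroad
```

As the second call suggests, the larger a firm’s tangible investment abroad, the more of its foreign profit escapes even this minimum tax, a point taken up below.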

Of course, President Trump sold these changes as a means to rebuild the American economy, predicting a massive return of overseas money and an increase in domestic investment.  As he explained:

For too long, our tax code has incentivized companies to leave our country in search of lower tax rates. My administration rejects the offshoring model, and we have embraced a brand new model. It’s called the American model. We want companies to hire and grow in America, to raise wages for the American workers, and to help rebuild American cities and towns.

The same old story

Not surprisingly, the so-called new American model looks a lot like the old one, with corporations–and their managers and stockholders–gaining at the public expense.

Corporate investment has not been limited by a lack of money.  Rather, corporate profits have steadily increased while investment in plant and equipment has remained weak.  Instead of investing, corporations have used their surplus to finance dividend payments, stock repurchases, and mergers and acquisitions. Instead of stimulating new productive investment, the tax cut only gave firms more money to use for the same purposes.

The new territorial tax system, which was supposed to promote domestic investment and production, actually continues to encourage the globalization of production since it lowers the taxes corporations have to pay on profits generated outside the country. The new global minimum tax does much the same.  Although its supporters claimed that it would ensure that corporations pay some US tax on their foreign profits, as structured it encourages foreign investment.  The minimum tax rate remains far below the US domestic rate, and the larger the capital base of the foreign subsidiary, the greater the foreign profits the parent firm can shield from taxation.

As for the one-time tax break on repatriated profits, the fact is that most of the money supposedly held abroad was already in the country, sitting in accounts protected from taxation.  Moreover, since firms remain reluctant to invest, the one-time break only served to give firms the opportunity to channel more money into nonproductive uses at a special lower tax rate.

Tax realities

According to the Treasury Department, corporate income tax receipts fell by 31 percent in fiscal year 2018.  As a Peter G. Peterson Foundation blog post explains:

The 31 percent drop in corporate income tax receipts last year is the second largest since at least 1934, which is the first year for which data are available. Only the 55 percent decline from 2008 to 2009 was larger. While that decrease can be explained by the Great Recession, the drop from 2017 to 2018 can be explained by tax policy decisions.

The Tax Cuts and Jobs Act, enacted in December 2017, is responsible for the plunge in corporate income tax receipts in 2018. Those changes include a reduction in the statutory rate from 35 percent to 21 percent and the expanded ability to immediately deduct the full value of equipment purchases. The Congressional Budget Office points out that about half of the 2018 decline occurred since June, which includes estimated tax payments made by corporations in June and September that reflected the new tax provisions.

Ben Foldy, writing for Bloomberg News, highlights the spoils that went to the banking sector:

Major U.S. banks shaved about $21 billion from their tax bills last year — almost double the IRS’s annual budget — as the industry benefited more than many others from the Republican tax overhaul. . . .

On average, the banks saw their effective tax rates fall below 19 percent from the roughly 28 percent they paid in 2016. And while the breaks set off a gusher of payouts to shareholders, firms cut thousands of jobs and saw their lending growth slow. . . .

Tax savings contributed to a banner year for banks, with the six largest surpassing $120 billion in combined profits for the first time. Dividends and stock buybacks at the 23 [largest] lenders surged by an additional $28 billion from 2017 — even more than their tax savings.

The stability and profitability of global corporate networks

US firms also continue to take advantage of overseas tax havens.  As Brad Setser, writing in the New York Times, points out:

despite Mr. Trump’s proud rhetoric regarding tax reform . . . there is no wide pattern of companies bringing back jobs or profits from abroad. The global distribution of corporations’ offshore profits — our best measure of their tax avoidance gymnastics — hasn’t budged from the prevailing trend.

Well over half the profits that American companies report earning abroad are still booked in only a few low-tax nations — places that, of course, are not actually home to the customers, workers and taxpayers facilitating most of their business. A multinational corporation can route its global sales through Ireland, pay royalties to its Dutch subsidiary and then funnel income to its Bermudian subsidiary — taking advantage of Bermuda’s corporate tax rate of zero.

The chart below makes this quite clear, showing that US profits are disproportionately booked in countries where there is little or no actual productive activity.

In fact, as Setser notes, “the new [tax] law encourages firms to move ‘tangible assets’ — like factories — offshore.”

The chart below, from a Fortune magazine post, provides an overview of the large cash holdings of some of America’s largest corporations and the share held “outside” the country.

Economists estimated that US firms held approximately $2.6 trillion outside the country and the Trump administration predicted that a large share would be brought back, funding new productive investments, thanks to the one-time lower tax rate included in the 2017 tax reform act.  Government officials and the media talked about this money in a way that gave the impression that it was actually sitting outside the country. But it wasn’t.

Adam Looney, in a Brookings blog post, clarifies:

“repatriation” is not a geographic concept, but refers to a set of rules defining when corporations have to pay taxes on their earnings. For instance, paying dividends to shareholders triggers a tax bill, but simply bringing the cash to the U.S. does not. Indeed, nearly all of the $2.6 trillion is already invested in the U.S. . . .

U.S. multinational corporations can defer paying tax on profits they earn abroad indefinitely by agreeing not to use the earnings for certain purposes, like paying dividends to shareholders, financing domestic acquisitions, guaranteeing loans, or making investments in physical capital in the U.S. In short, the rules prohibit a company from using pre-tax money in transactions that benefit shareholders. No one believes this is rational or efficient, and it is certainly onerous for shareholders, who would rather have that cash in their pockets than held by the corporation. But those rules don’t place requirements on the geographic location of the cash. Multinational firms are allowed to bring those dollars back to the U.S. and to invest them in our financial system.

Indeed, that’s exactly what they do. Don’t take my word for it, the financial statements of the companies with large stocks of overseas earnings, like Apple, Microsoft, Cisco, Google, Oracle, or Merck describe exactly where their cash is invested. Those statements show most of it is in U.S. treasuries, U.S. agency securities, U.S. mortgage backed securities, or U.S. dollar-denominated corporate notes and bonds.

Of course, these firms could easily have used their tax deferred dollar assets as collateral to borrow to finance any investment projects they found attractive.  Their lack of interest in doing so provides additional evidence that low corporate rates of investment are not due to funding constraints.  Rather, corporations have only a limited interest in undertaking productive investments in the US.

Thus, it should come as no surprise that the one-time tax break resulted in a one-time, modest “repatriation,” and that the money was largely used for financial rather than productive purposes. The New York Times reports:

 JPMorgan Chase analysts estimate that in the first half of 2018, about $270 billion in corporate profits previously held overseas were repatriated to the United States and spent as a result of changes to the tax code. Some 46 percent of that, JPMorgan Chase analysts said, was spent on $124 billion in stock buybacks.

The flow of repatriated corporate cash is just one tributary in what has become a flood of payouts to shareholders, both as buybacks and dividends. Such payouts are expected to hit almost $1.3 trillion this year, up 28 percent from 2017, according to estimates from Goldman Sachs analysts.

In sum, thanks to the Trump tax plan, trillions of dollars that could have been used to transform our transportation and energy infrastructure, industrial structure, and system of social services are instead being transferred to big businesses, which use them for speculative activities and to further enrich their already wealthy managers and stockholders.

Given current realities, we can expect growing popular interest in and support for new public initiatives like the Green New Deal and a new progressive system of taxation to help finance it.  Hopefully, exposing the workings of our current tax system and the lies our government and business leaders tell about whose interests it serves will help speed this development.