Learning from history: community-run child-care centers during World War II

We face many big challenges, and we will need strong, bold policies to meaningfully address them. Solving our child-care crisis is one of those challenges, and a study of World War II government efforts to ensure accessible and affordable high-quality child care points the way to the kind of bold action we need.

The child care crisis

A number of studies have established that high-quality early childhood programs provide significant community and individual benefits.  One found that “per dollar invested, early childhood programs increase present value of state per capita earnings by $5 to $9.”  Universal preschool programs have also been shown to offer significant benefits to all children, even producing better outcomes for the most disadvantaged children than means-tested programs.  Yet, even before the pandemic, most families struggled with a lack of desirable child-care options.    

The pandemic has now created a child-care crisis. As Lisa Dodson and Mary King point out: “By some estimates, as many as 4.5 million child-care ‘slots’ may be permanently lost and as many as 40 percent of child-care providers say they will never reopen.”  The lack of child care is greatly hindering our recovery from the pandemic.  Women suffered far greater job losses than men during 2020, including as child-care workers, and the child-care crisis has made it difficult for many working mothers to return to the labor force.  The cost goes beyond the immediate family hardship from lost income; there is strong evidence that a sustained period without work, the so-called employment gap, will result in significantly lower lifetime earnings and reduced retirement benefits.  

To his credit, President Biden has recognized the importance of strengthening our care economy.  His proposed American Families Plan includes some $225 billion in tax credits to help make child care more affordable for working families.  According to a White House fact sheet, families would “receive a tax credit for as much as half of their spending on qualified child care for children under age 13, up to a total of $4,000 for one child or $8,000 for two or more children. . . . The credit can be used for expenses ranging from full-time care to after school care to summer care.”

But tax credits don’t ensure the existence of convenient, affordable, high-quality child-care facilities staffed by well-paid and trained child-care providers.  And if that is what we really want, we will need to directly provide it.  That is what the government did during World War II.  While its program was far from perfect, in part because it was designed to be short-term, it provides an example of the type of strong, bold action we will need to overcome our current child-care crisis. 

Federal support for child care

During World War II the United States government financed a heavily subsidized child-care program.  From August 1943 through February 1946, the Federal Works Agency (FWA), using Lanham Act funds, provided some $52 million in grants for child-care services (equal to more than $1 billion today) to any approved community group that could demonstrate a war-related need for the service.  At its July 1944 peak, 3,102 federally subsidized child-care centers, with some 130,000 children enrolled, operated throughout the country.  There was at least one center in every state but New Mexico, which decided against participation in the program.  By the end of the war, between 550,000 and 600,000 children had received some care from Lanham Act-funded child-care programs.

Communities were allowed to use the federal grant money to cover most of the costs involved in establishing and running their centers, including facilities construction and upkeep, staff wages, and most other daily operating costs.  They were required to provide some matching funds, most of which came from fees paid by the parents of children enrolled in the program.  However, these fees were capped. In the fall of 1943, the FWA established a ceiling on fees of 50 cents per child per day (about $7 now), which was raised to 75 cents in July 1945. And those fees included snacks, lunch, and in some cases dinner as well. Overall, the federal subsidy covered two-thirds of the total cost of maintaining and operating the centers.

The only eligibility requirement for enrollment was a mother’s employment status: she had to be working at a job considered important to the war effort, and this was not limited to military production. Center hours varied, but many accommodated the round-the-clock manufacturing schedule, staying open 24 hours a day, 6 days a week. 

The centers served preschoolers (infants, toddlers, and children up to 5 years of age) and school-age children (6 to 14 years of age). In July 1944, approximately 53,000 preschoolers and 77,000 school-age children were enrolled.  School-age enrollment always climbed during summer vacation.  However, in most months, preschoolers made up the majority of the children served by Lanham Act-funded centers. Enrollment of preschoolers peaked at some 74,000 in May 1945. 

Some 90 percent of the centers were housed in public schools, with newly constructed housing projects providing the next most common location. Although local school boards were free to decide program standards (including staff-child ratios, worker qualifications, and facility design), state boards of education were responsible for program supervision. The recommended teacher-child ratio was 10-to-1, and most centers complied.  According to Chris M. Herbst,

Anecdotal evidence suggests that preschool-aged children engaged in indoor and outdoor play; used educational materials such as paints, clay, and musical instruments; and took regular naps. . . . Programs for school-aged children included . . . outdoor activities, participation in music and drama clubs, library reading, and assistance with schoolwork.

Children at a child-care center sit for “story time.” (Gordon Parks / Library of Congress / The Crowley Company)

While quality did vary (largely the result of differences in community support for public child care, the willingness of cities to provide additional financial support, and the ability of centers to hire trained professionals to develop and oversee program activities), the centers did their best to deliver a high-quality early childhood education.  As Ruth Pearson Koshuk, the author of a 1947 study of the developmental records of 500 children, 2 to 5 years of age, at two Los Angeles County centers, describes:

In these two . . . schools, as elsewhere, the program has developed since 1943, toward recognized standards of early childhood education. The aim has been to apply the best of existing standards, and to maintain as close contact with the home as possible. In-service training courses carrying college credit have been given, for the teaching staff, and a mutually helpful parent education program carried on in spite of difficulties inherent in a child care situation.

There has been a corresponding development in the basic records. A pre-entrance medical examination has been required by state law since the first center opened. In December 1943 a developmental record was added, which is filled out by the director during an unhurried interview with the mother just before a child enters. One page is devoted to infancy experience; the four following cover briefly the child’s development history, with emphasis on emotional experience, behavior problems he has presented to the parents, if any, and the control methods used, as well as the personal-social behavior traits which they value and desire for the child. After entrance, observational notes and semester reports are compiled by the teachers. Intelligence testing has been limited to cases where it seemed especially indicated. A closing record is filled out, in most cases, by the parent when a child is withdrawn. These records are considered a minimum. They have proved indispensable as aids to the teachers in guiding the individual children and as a basis for conferences on behavior in the home.

A 2013 study of the long-term effects on mothers and children from use of Lanham centers found a substantial increase in maternal employment, even five years after the end of the program, and “strong and persistent positive effects on well-being” for their children.

In short, despite many shortcomings, these Lanham centers, as Thalia Ertman sums up,

broke ground as the first and, to date, only time in American history when parents could send their children to federally-subsidized child care, regardless of income, and do so affordably. . . .

Additionally, these centers are seen as historically important because they sought to address the needs of both children and mothers. Rather than simply functioning as holding pens for children while their mothers were at work, the Lanham child care centers were found to have a strong and persistent positive effect on the well-being of children.

The federal government also supported some private employer-sponsored child care during the war. The best-known example is the two massive centers built by the Kaiser Company in Portland, Oregon, to provide child care for the children of workers at its Portland Yards and Oregon Shipbuilding Corporation. The centers were located right at the front of the shipyards, making it easy for mothers to drop their children off and pick them up, and were operated on a 24-hour schedule.  They were also large, each caring for up to 1,125 children between 18 months and 6 years of age. The centers had their own medical clinic, cafeteria, and large play areas, and employed highly trained staff.  Parents paid $5 for a six-day week for one child and $3.75 for each additional child.  For a small additional fee, the centers also prepared dinner for parents to pick up at the end of their working day.

While the Kaiser Company received much national praise as well as appreciation from its employees with young children, these centers were largely paid for by the government.  Government funds directly paid for their construction, and a majority of the costs of running the centers, including staff salaries, were covered through the company’s cost-plus contracting with the military.

Political dynamics

There was considerable opposition to federal financing of group child care, especially for children younger than 6 years of age.  The sentiment is captured in this quote from a 1943 New York Times article: “The worst mother is better than the best institution when it is a matter of child care, Mayor La Guardia declared.”  Even the War Manpower Commission initially opposed the employment of mothers with young children outside the home, including in service of the war effort, stating that “The first responsibility of women with young children, in war as in peace, is to give suitable care in their own homes to their children.”

But on-the-ground realities made this an untenable position for both the government and business. Women sought jobs, whether out of economic necessity or patriotism.  The government, as its Rosie the Riveter campaign highlighted, was eager to encourage their employment in industries producing for the war effort.  And, despite public sentiment, a significant number of those women were mothers with young children.

Luedell Mitchell and Lavada Cherry working at a Douglas Aircraft plant in El Segundo, Calif. Circa 1944. Credit: National Archives photo no. 535811

The growing importance of women in the workplace, and especially mothers with young children, is captured in employment trends in Portland, Oregon.  Women began moving into the defense workforce in great numbers starting in 1942, with the number employed in local war industries climbing from 7,000 in November 1942 to 40,000 in June 1943.  An official with the state child-care committee reported that “a check of six shipyards reveals that the number of women employed in the shipyards has increased 25 percent in one month and that the number is going to increase more rapidly in the future.” 

The number of employed mothers was also rapidly growing.  According to the Council of Social Agencies, “Despite the recommendations of the War Manpower Commission . . . thousands of young mothers in their twenties and thirties have accepted jobs in war industries and other businesses in Multnomah County. Of the 8,000 women employed at the Oregon Shipyards in January, 1943, 32 percent of them had children, 16 percent having pre-school children.”

Portland was far from unique.  During the war, for the first time, married women workers outnumbered single women workers.  Increasingly, employers began to recognize the need for child care to address absenteeism problems.  As a “women’s counselor” at the Bendix Aviation Corporation in New Jersey explained to reporters in 1943, child care was one of the biggest concerns for new hires. “We feel a mother should be with her small baby if possible. But many of them have to come back. Their husbands are in the service and they can’t get along on his allotment.”  Media stories (many unsubstantiated) of children left in parked cars outside workplaces or fending for themselves at home also contributed to a greater public acceptance of group child care.

An image of Rosie the Riveter that appeared in a 1943 issue of the magazine Hygeia

Finally, the government took action.  The Federal Works Agency was one of two new super agencies established in 1939 to oversee the large number of agencies created during the New Deal period.  In 1940 President Roosevelt signed into law the Lanham Act, which authorized the FWA to fund and supervise the construction of needed public infrastructure, such as housing, hospitals, water and sewer systems, police and firefighting facilities, and recreation centers, in communities experiencing rapid growth because of the defense buildup. In August 1942, the FWA decided, without any public debate, that public infrastructure also meant child care, and it began its program of support for the construction and operation of group child-care facilities.

The Federal Security Agency, the other super agency, whose oversight responsibilities included the Children’s Bureau and the U.S. Office of Education, opposed the FWA’s new child-care initiative.  It did so not only because it believed that child care fell under its mandate, but also because the leadership of the Children’s Bureau and Office of Education opposed group child care.  The FWA won the political battle, and in July 1943, Congress authorized additional funding for the FWA’s child-care efforts.

And, as William M. Tuttle, Jr. describes, public pressure played an important part in the victory:

the proponents of group child care organized a potent lobbying effort. The women’s auxiliaries of certain industrial unions, such as the United Electrical Workers and the United Auto Workers, joined with community leaders and FWA officials in the effort. Also influential were the six women members of the House of Representatives. In February 1944, Representative Mary T. Norton presented to the House “a joint appeal” for immediate funds to expand the wartime child day care program under the FWA.

Termination and a step back

Congressional support for group child care was always tied to wartime needs, a position shared by most FWA officials.  The May 1945 Allied victory in Europe brought a drop in war production, and with it a reduction in FWA approvals and renewals of community child-care programs.  In August, after the Japanese surrender brought the war to a close, the FWA announced that it would end its funding of child-care centers as soon as possible, but no later than the end of October 1945.

Almost immediately thousands of individuals wrote letters, sent wires, and signed petitions calling for the continuation of the program.  Officials in California, the location of many war-related manufacturing sites and nearly 25 percent of all children enrolled in Lanham Act centers in August 1945, also weighed in, strongly supporting the call.  Congress yielded, largely influenced by the argument that since it would be months before all the “men” in the military returned to the country, mothers had no choice but to continue working and needed the support of the centers to do so.  It approved new funds, but only enough to keep the centers operating until the end of February 1946.

The great majority of centers closed soon after the termination of federal support, with demonstrations following many of the closings.  The common assumption was that women would not mind the closures, since most would be happy to return to homemaking.  Many women were, in fact, forced out of the labor force, suffering disproportionately from post-war industrial layoffs.  But by 1947, women’s labor force participation was again on the rise, and a new push began for a renewal of federal support for community child-care centers. Unfortunately, the government refused to change its position. During the Korean War, Congress did approve a public child-care bill, but then refused to authorize any funding.

After WWII, parents organized demonstrations, like this one in New York on Sept. 21, 1947, calling for continued funding of the centers. The city’s welfare commissioner dismissed the protests as “hysterical.” Credit: The New York Times

Finally, in 1954, as Sonya Michel explains, “Congress found an approach to child care it could live with: the child-care tax deduction.”  While the child-care tax deduction did offer modest financial relief to some families, it did nothing to ensure the availability of affordable, high-quality child care.  The history of child care during World War II makes clear that this turn to market-based tax policy to solve child-care problems represented a big step back for working women and their children.  And this was well understood by most working people at the time.

Sadly, this history has been forgotten, and Biden’s commitment to expand the child-care tax credit is now seen as an important step forward.  History shows we can and need to do better.

The latest argument against federal relief: business claims that workers won’t work

A growing number of business and political leaders have found yet another argument to use against federal pandemic relief programs, especially those that provide income support for workers: they hurt the economic recovery by encouraging workers not to work.

In the words of Senate Minority Leader Mitch McConnell, as reported by Business Insider:

“We have flooded the zone with checks that I’m sure everybody loves to get, and also enhanced unemployment,” McConnell said from Kentucky. “And what I hear from businesspeople, hospitals, educators, everybody across the state all week is, regretfully, it’s actually more lucrative for many Kentuckians and Americans to not work than work.”

He went on: “So we have a workforce shortage and we have rising inflation, both directly related to this recent bill that just passed.”

In line with business claims that they can’t find willing workers despite their best efforts at recruitment, the governors of Montana, South Carolina, Alabama, Arkansas, and Mississippi have all announced that they will no longer allow the unemployed in their respective states to collect the $300-a-week federal supplemental unemployment benefit and will once again require that those receiving unemployment benefits demonstrate they are actively looking for work.

In reality there is little support for the argument that expanded unemployment benefits have created an overly worker-friendly labor market, leaving companies unable to hire and, by extension, meet growing demand.  But of course, if enough people accept the argument, corporate leaders and their political allies will have achieved their shared goal, which is to weaken worker bargaining power as corporations seek to position themselves for a profitable post-pandemic economic recovery.

Wage trends

If companies were aggressively seeking workers, we would expect the resulting competition to push up wages.  The following figure shows real weekly earnings of production and nonsupervisory workers—approximately 85 percent of the workforce—on a year-over-year basis.  As we can see, those earnings were actually lower in April 2021 than they were in April 2020.

In short, companies may want more workers, but it is hard to take their cries of anguish seriously if they remain unwilling to offer higher real wages to attract them.  Real average weekly earnings of production and nonsupervisory workers in April 2021 stood at roughly $875.  Multiplying weekly earnings by 50 gives an estimated annual salary of $43,774.  That total is actually 5.7 percent below the similarly calculated peak in October 1972.
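As a quick check of the arithmetic, here is a minimal sketch; the $875 weekly figure is rounded, which is why the product differs slightly from the stated annual total:

```python
# Convert real average weekly earnings into a rough annual salary,
# using the article's simplifying assumption of 50 paid weeks a year.
weekly_earnings = 875            # April 2021, rounded (as stated above)
weeks_per_year = 50
annual_salary = weekly_earnings * weeks_per_year
print(f"Estimated annual salary: ${annual_salary:,}")   # -> $43,750 (~$43,774 before rounding)

# The stated 5.7 percent shortfall implies an October 1972 peak of roughly:
implied_1972_peak = 43_774 / (1 - 0.057)
print(f"Implied 1972 peak (2021 dollars): ${implied_1972_peak:,.0f}")  # -> ~$46,420
```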

Over the last three months, the only sector experiencing significant wage growth due to labor shortages is the leisure and hospitality sector (which includes arts, entertainment, and recreation as well as accommodations and food services).  Wages in that sector grew at an annualized rate of nearly 18 percent relative to the previous three months (the annualization arithmetic is sketched after the quote below).  But, as Josh Bivens and Heidi Shierholz explain,

There is very little reason to worry that labor shortages in leisure and hospitality will soon spill over into other sectors and drive economywide “overheating.”  For example, jobs in leisure and hospitality have notably low wages and fewer hours compared to other sectors. Weekly wages of production and nonsupervisory workers in leisure and hospitality now equate to annual earnings of just $20,628, and total wages in leisure and hospitality account for just 4% of total private wages in the U.S. economy. . . . [Moreover] this sector seems notably segmented off from much of the rest of the economy.
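For readers unfamiliar with annualized rates, the sketch below shows the compounding involved. The 4.2 percent three-month figure is illustrative, an assumption chosen to reproduce the “nearly 18 percent” result, not a number from the EPI analysis:

```python
# A growth rate g over three months, repeated for four quarters,
# compounds to an annualized rate of (1 + g)**4 - 1.
three_month_growth = 0.042                      # ~4.2 percent, illustrative
annualized = (1 + three_month_growth) ** 4 - 1
print(f"Annualized wage growth: {annualized:.1%}")  # -> 17.9%, i.e. "nearly 18 percent"
```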

Job openings and labor turnover

The figure below, drawn from the Bureau of Labor Statistics’ Job Openings and Labor Turnover Survey (JOLTS), shows the monthly movement in job openings, hires, quits, and layoffs and discharges, with solid lines showing their six-month moving averages.

As we can see, despite business complaints, monthly hiring (green line) still remains greater than during the last years of the pre-pandemic expansion.  And although job openings (blue line) are growing sharply while the number of hires is falling, the gap between openings and hires is still smaller than it was during the last years of the previous expansion.  In addition, the number of quits (light blue line), an indicator of labor market tightness, remains below its level in the last years of the previous expansion and is fairly stable.  In short, there is nothing in the data to suggest that business is facing a dysfunctional labor market marked by an unreasonable worker unwillingness to work.

Even with the additional financial support in Biden’s American Rescue Plan, many workers and their families continue to struggle to afford food, housing, and health care.  Many workers remain reluctant to re-enter the labor market because of Covid-related health concerns and care responsibilities.  Moreover, as Heidi Shierholz points out:

there are far more unemployed people than available jobs in the current labor market. In the latest data on job openings, there were nearly 40% more unemployed workers than job openings overall, and more than 80% more unemployed workers than job openings in the leisure and hospitality sector.

While there are certainly fewer people looking for jobs now than there would be if Covid weren’t a factor . . . without enough job openings to even come close to providing work for all job seekers, it again stretches the imagination to suggest that labor shortages are a core dynamic in the labor market.

We need to discredit this attempt by the business community and its political allies to generate opposition to policies that help workers survive this period of crisis.  And we need to redouble our own efforts to strengthen worker rights and build popular support for truly transformative economic policies, ones that go beyond the stopgap fixes currently promoted.

Time to put the spotlight on corporate taxes

A battle is slowly brewing in Washington, D.C., over whether to raise corporate taxes to help finance new infrastructure investments.  While higher corporate taxes cannot generate all the funds needed, the coming debate over whether to raise them gives us an opportunity to challenge the still-strong popular identification of corporate profitability with the health of the economy and, by extension, worker well-being.

According to the media, President Biden’s advisers are hard at work on two major proposals with a combined $3 trillion price tag.  The first aims to modernize the country’s physical infrastructure and is said to include funds for the construction of roads, bridges, rail lines, ports, electric vehicle charging stations, and affordable and energy-efficient housing, as well as rural broadband, improvements to the electric grid, and worker training programs.  The second targets social infrastructure and would provide funds for free community college education, universal prekindergarten, and a national paid leave program.

To pay for these proposals, Biden has been talking up the need to raise corporate taxes, at least to offset some of the costs of modernizing the country’s physical infrastructure.  Not surprisingly, Republican leaders in Congress have voiced their opposition to corporate tax increases.  And corporate leaders have drawn their own line in the sand.  As the New York Times reports:

Business groups have warned that corporate tax increases would scuttle their support for an infrastructure plan. “That’s the kind of thing that can just wreck the competitiveness in a country,” Aric Newhouse, the senior vice president for policy and government relations at the National Association of Manufacturers, said last month [February 2021].

Regardless of whether Biden decides to pursue his broad policy agenda, this appears to be a favorable moment for activists to take advantage of media coverage surrounding the proposals and their funding to contest these kinds of corporate claims and demonstrate the anti-working-class consequences of corporate profit-maximizing behavior.  

What do corporations have to complain about?

To hear corporate leaders talk, one would think that they have been subjected to decades of tax increases.  In fact, quite the opposite is true.  The figure below shows the movement in the top corporate tax rate.  As we can see, it peaked in the early 1950s and has been falling ever since, with a big drop in 1986, and another in 2017, thanks to congressionally approved tax changes.

One consequence of this corporate-friendly tax policy, as the following figure shows, is a steady decline in federal corporate tax payments as a share of GDP.  These payments fell from 5.6 percent of GDP in 1953 to 1.5 percent in 1982, and to a still lower 1.0 percent in 2020.  By contrast, there has been very little change in individual income tax payments as a share of GDP; they stood at 7.7 percent of GDP in 2020.

Congressional tax policy has certainly been good for the corporate bottom line.  As the next figure illustrates, both pre-tax and after-tax corporate profits have risen as a share of GDP since the early 1980s.  But the rise in after-tax profits has been the most dramatic, soaring from 5.2 percent of GDP in 1980 to 9.1 percent in 2019, before dipping slightly to 8.8 percent in 2020.   To put recent after-tax profit gains in perspective, the 2020 after-tax profit share is greater than the profit share in every year from 1930 to 2005.

What do corporations do with their profits?

Corporations claim that higher taxes would hurt U.S. competitiveness, implying that they need their profits to invest and keep the economy strong.  Yet, despite ever higher after-tax rates of profit, private investment in plant and equipment has been on the decline.

As the figure below shows, gross private domestic nonresidential fixed investment as a share of GDP has been trending down since the early 1980s.  It fell from 14.8 percent in 1981 to 13.4 percent in 2020.

Rather than investing in new plant and equipment, corporations have been using their profits to fund an aggressive program of stock repurchases and dividend payouts.  The figure below highlights the rise in corporate stock buybacks, which have helped drive up stock prices, enriching CEOs and other top wealth holders. In fact, between 2008 and 2017, companies spent some 53 percent of their profits on stock buybacks and another 30 percent on dividend payments.

It should therefore come as no surprise that CEO compensation is also exploding, with the CEO-to-worker compensation ratio growing from 21-to-1 in 1965 to 61-to-1 in 1989, 293-to-1 in 2018, and 320-to-1 in 2019.  As we see in the next figure, the growth in CEO compensation has actually been outpacing the rise in the S&P 500.

In sum, the system is not broken.  It continues to work as it is supposed to work, generating large profits for leading corporations that then find ways to generously reward their top managers and stockholders.  Unfortunately, investing in plant and equipment, creating decent jobs, or supporting public investment are all low on the corporate profit-maximizing agenda.  

Thus, if we are going to rebuild and revitalize our economy in ways that meaningfully serve the public interest, working people will have to actively promote policies that will enable them to gain control over the wealth their labor produces.  One example: new labor laws that strengthen the ability of workers to unionize and engage in collective and solidaristic actions.  Another is the expansion of publicly funded and provided social programs, including for health care, housing, education, energy, and transportation. 

And then there are corporate taxes.  Raising them is one of the easiest ways we have to claw back funds from the private sector to help finance some of the investment we need.  Perhaps more importantly, the fight over corporate tax increases provides us with an important opportunity to make the case that the public interest is not well served by reliance on corporate profitability.

The U.S. recovery on pause: December brings new job losses

A meaningful working-class recovery from the recession seems far away.

After seven months of job gains, though diminishing ones to be sure, we are again losing jobs.  As the chart below shows, the number of jobs fell by 140,000 in December.

We are currently about 9.8 million jobs down from the February 2020 employment peak, having recovered only 55 percent of the jobs lost.  And, as the following chart illustrates, the percentage of jobs lost remains greater, even now after months of job growth, than it was at any point during the Great Recession. 

If the job recovery continues at its current pace, some analysts predict, it will likely take more than three years just to get back to pre-pandemic employment levels.  However, this might well be too rosy a projection.  One reason is that the early assumption that many of the job losses were temporary, and that those unemployed would soon be recalled to their jobs, is turning out to be wrong.  A rapidly growing share of the unemployed are remaining unemployed for an extended period.

As we see below, in October, almost one-third of the unemployed had been unemployed for 27 weeks or longer.  According to the December jobs report, that percentage is now up to 37 percent, four times what it was before the pandemic.  And that figure seriously understates the problem, since many workers have given up looking for work; having dropped out of the workforce, they are no longer counted as unemployed.  The labor force participation rate is now 61.5 percent, down from 63.3 percent in February.

Dean Baker, quoted in a recent Marketplace story, underscores the importance of this development:

“This is obviously a story of people losing their job at the beginning of the crisis in March and April and not getting it back,” said Dean Baker, co-founder and senior economist with the Center for Economic and Policy Research.

Those out of work for 27 weeks or more make up a growing share of the unemployed, and that could have enduring consequences, Baker said.

“After people have been unemployed for more than six months, they find it much harder to get a job,” he said. “And if they do get a job, their labor market prospects could be permanently worsened.”

And tragically, the workers who have suffered the greatest job losses during this crisis are those who earned the lowest wages.

It is no wonder that growing numbers of working people are finding it difficult to meet their basic needs.

There is no way to sugarcoat this situation.  We need a significant stimulus package, a meaningful increase in the minimum wage, real labor law reform, a robust national single-payer health care system, and an aggressive Green New Deal-inspired public-sector investment and jobs program.  And there is no getting around the fact that it is going to take hard organizing and mutually supportive community and workplace actions to move the country in the direction it needs to go.

Defunding police and challenging militarism, a necessary response to their “battle space”

The excessive use of force and killings of unarmed Black Americans by police has fueled a popular movement for slashing police budgets, reimagining policing, and directing freed funds to community-based programs that provide medical and mental health care, housing, and employment support to those in need.  This is a long overdue development.

Police are not the answer

Police budgets rose steadily from the 1990s to the Great Recession and, despite the economic stagnation that followed, have remained largely unchanged.  This trend is highlighted in the figure below, which shows real median per capita spending on police in the 150 largest U.S. cities.  That spending grew, adjusted for inflation, from $359 in 2007 to $374 in 2017.  The contrast with state and local government spending on social programs is dramatic.  From 2007 to 2017, median per capita spending on housing and community development fell from $217 to $173, while spending on public welfare programs fell from $70 to $47.

Thus, as economic developments over the last three decades left working people confronting weak job growth, growing inequality, stagnant wages, declining real wealth, and rising rates of mortality, funding priorities meant that the resulting social consequences would increasingly be treated as policing problems.  And, in line with other powerful trends that shaped this period (especially globalization, privatization, and militarization), police departments were encouraged to meet their new responsibilities by transforming themselves into small, heavily equipped armies whose purpose was to wage war against those they were supposed to protect and serve.

The military-to-police pipeline

The massive, unchecked militarization of the country and its associated military-to-police pipeline was one of the more powerful factors promoting this transformation.  The Pentagon, overflowing with military hardware and eager to justify a further modernization of its weaponry, initiated a program in the early 1990s that allowed it to provide surplus military equipment free to law enforcement agencies, allegedly to support their “war on drugs.”  As a Forbes article explains:

Since the early 1990s, more than $7 billion worth of excess U.S. military equipment has been transferred from the Department of Defense to federal, state and local law enforcement agencies, free of charge, as part of its so-called 1033 program. As of June [2020], there are some 8,200 law enforcement agencies from 49 states and four U.S. territories participating. 

The program grew dramatically after September 11, 2001, justified by government claims that the police needed to strengthen their ability to combat domestic terrorism.  As an example of the resulting excesses, the Los Angeles Times reported in 2014 that the Los Angeles Unified School District and its police officers were in possession of three grenade launchers, 61 automatic military rifles, and a Mine-Resistant Ambush Protected (MRAP) armored vehicle. Finally, in 2015, President Obama took steps to place limits on the items that could be transferred; tracked armored vehicles, grenade launchers, and bayonets were among the items that were to be returned to the military.

President Trump removed those limits in 2017, and the supplies are again flowing freely, including armored vehicles, riot gear, explosives, battering rams, and yes, once again bayonets.  According to the New York Times, “Trump administration officials said that the police believed bayonets were handy, for instance, in cutting seatbelts in an emergency.”

Outfitting police departments for war also encouraged different criteria for recruiting and training. For example, as Forbes notes, “The average police department spends 168 hours training new recruits on firearms, self-defense, and use of force tactics. It spends just nine hours on conflict management and mediation.”  Arming and training police for military action leads naturally to the militarization of police relations with community members, especially Black, Indigenous, and other people of color, who come to play the role of the enemy that needs to be controlled or, if conditions warrant, destroyed.

In fact, the military has become a major cheerleader for domestic military action.  President Trump, on a call with governors after the start of demonstrations protesting the May 25, 2020, killing of George Floyd in police custody, exhorted them to “dominate” the street protests.

As the Washington Examiner reports:

“You’ve got a big National Guard out there that’s ready to come and fight like hell,” Trump told governors on the Monday call, which was leaked to the press.

[Secretary of Defense] Esper lamented that only two states called up more than 1,000 Guard members of the 23 states that have called up the Guard in response to street protests. The National Guard said Monday that 17,015 Guard members have been activated for civil unrest.

“I agree, we need to dominate the battle space,” Esper said after Trump’s initial remarks. “We have deep resources in the Guard. I stand ready, the chairman stands ready, the head of the National Guard stands ready to fully support you in terms of helping mobilize the Guard and doing what they need to do.”

The militarization of the federal budget

The same squeeze on social spending and support for militarization is being played out at the federal level.  As the National Priorities Project highlights in the following figure, the United States has a military budget greater than those of the next ten countries combined.

Yet, this dominance has done little to slow the military’s growing hold over federal discretionary spending.  At $730 billion, military spending accounts for more than 53 percent of the federal discretionary budget.  A slightly broader notion, what the National Priorities Project calls the militarized budget, actually accounts for almost two-thirds of the discretionary budget.  The militarized budget:

includes discretionary spending on the traditional military budget, as well as veterans’ affairs, homeland security, and law enforcement and incarceration. In 2019, the militarized budget totaled $887.8 billion – amounting to 64.5 percent of discretionary spending. . . . This count does not include forms of militarized spending allocated outside the discretionary budget, including mandatory spending related to veterans’ benefits, intelligence agencies, and interest on militarized spending.

The militarized budget has been larger than the non-militarized budget every year since 1976.  But the gap between the two has grown dramatically over the last two decades. 

In sum, the critical ongoing struggle to slash police budgets and reimagine policing needs to be joined to a larger movement against militarism more generally if we are to make meaningful improvements in majority living and working conditions.

Racism, COVID-19, and the fight for economic justice

While the Black Lives Matter protests sweeping the United States were triggered by recent police murders of unarmed African Americans, they are also helping to encourage popular recognition that racism has a long history with punishing consequences for black people that extend beyond policing.  Among the consequences are enormous disparities between black and white well-being and security.  This post seeks to draw attention to some of these disparities by highlighting black-white trends in unemployment, wages, income, wealth, and security. 

A common refrain during this pandemic is that “We are all in it together.”  Although this is true in the sense that almost all of us find our lives transformed for the worse because of COVID-19, it is also not true in some very important ways.  For example, African Americans are disproportionately dying from the virus.  They account for 22.4 percent of all COVID-19 deaths despite making up only 12.5 percent of the population.
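Those two shares imply a sizable overrepresentation; a minimal check of the arithmetic:

```python
# How overrepresented are African Americans among COVID-19 deaths?
share_of_deaths = 0.224       # share of all COVID-19 deaths
share_of_population = 0.125   # share of the U.S. population
overrepresentation = share_of_deaths / share_of_population
print(f"{overrepresentation:.1f}x their population share")   # -> 1.8x
```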

One reason is that African Americans also disproportionately suffer from serious preexisting health conditions, a lack of health insurance, and inadequate housing, all of which increased their vulnerability to the virus.  Another reason is that black workers are far more likely than white workers to work in “front-line” jobs, especially low-wage ones, forcing them to risk their health and that of their families.  While black workers comprise 11.9 percent of the labor force, they make up 17 percent of all front-line workers.  They represent an even higher percentage in some key front-line industries: 26 percent of public transit workers; 19.3 percent of child care and social service workers; and 18.2 percent of trucking, warehouse, and postal service workers.

African Americans have also disproportionately lost jobs during this pandemic.  The black employment-to-adult-population ratio fell from 59.4 percent before the start of the pandemic to a record low of 48.8 percent in April.  Not surprisingly, recent surveys find, as the Washington Post reports, that:

More than 1 in 5 black families now report they often or sometimes do not have enough food — more than three times the rate for white families. Black families are also almost four times as likely as whites to report they missed a mortgage payment during the crisis — numbers that do not bode well for the already low black homeownership rate.

This pandemic has hit African Americans especially hard precisely because they were forced to confront it from a position of economic and social vulnerability, as the following trends help to demonstrate.

Unemployment

The Bureau of Labor Statistics began collecting separate data on African American unemployment in January 1972.  Since then, as the figure below shows, the African American unemployment rate has largely stayed at or above twice the white unemployment rate. 

As Olugbenga Ajilore explains:

Between strides in civil rights legislation, desegregation of government, and increases in educational attainment, employment gaps should have narrowed by now, if not completely closed. Yet as [the figure above] shows, this has not been the case.

Wages

The figure below, from an Economic Policy Institute study, shows the black-white wage gap for workers at different earnings percentiles, by education level, and regression-adjusted (to control for age, gender, education, and regional differences).  As we can see, the wage gap has grown over time regardless of the measure used.

Elise Gould summarizes some important take-aways from this study:

The black–white wage gap is smallest at the bottom of the wage distribution, where the minimum wage serves as a wage floor. The largest black–white wage gap, as well as the one with the most growth since the Great Recession, is found at the top of the wage distribution, explained in part by the pulling away of top earners generally as well as continued occupational segregation, the disproportionate likelihood for white workers to occupy positions in the highest-wage professions.

It’s clear from the figure that education is not a panacea for closing these wage gaps. Again, this should not be shocking, as increased equality of educational access—as laudable a goal as it is—has been shown to have only small effects on class-based wage inequality, and racial wealth gaps have been almost entirely unmoved by a narrowing of the black–white college attainment gap . . . . And after controlling for age, gender, education, and region, black workers are paid 14.9% less than white workers.

Income

The next figure shows that while median household income has generally stagnated for all races/ethnicities over the period 2000 to 2017, only blacks have suffered an actual decline.  The median income for black households fell from $42,348 to $40,258 over this period.  As a consequence, the black-white income gap has grown.  The median black household in 2017 earned just 59 cents for every dollar of income earned by the median white household, down from 65 cents in 2000.

Moreover, as Valerie Wilson points out, “Based on [Economic Policy Institute] imputed historical income values, 10 years after the start of the Great Recession in 2007, only African American and Asian households have not recovered their pre-recession median income.”  Median household income for African American households fell 1.9 percent, or $781, over the period 2007 to 2017.  While the decline was greater for Asian households (3.8 percent), they continued to have the highest median income.

Wealth

The wealth gap between black and white households also remains large.  In 1968, median black household wealth was $6,674 compared with median white household wealth of $70,768.  In 2016, as the figure below shows, it was $13,024 compared with $149,703.

As the Washington Post summarizes:

“The historical data reveal that no progress has been made in reducing income and wealth inequalities between black and white households over the past 70 years,” wrote economists Moritz Kuhn, Moritz Schularick and Ulrike I. Steins in their analysis of U.S. incomes and wealth since World War II.

As of 2016, the most recent year for which data is available, you would have to combine the net worth of 11.5 black households to get the net worth of a typical white U.S. household.
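The 11.5-to-1 figure follows directly from the 2016 medians cited above; a minimal check:

```python
# Ratio of median white to median black household wealth, 2016.
white_median_wealth = 149_703
black_median_wealth = 13_024
ratio = white_median_wealth / black_median_wealth
print(f"{ratio:.1f} black households per typical white household")  # -> 11.5
```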

The self-reinforcing nature of racial discrimination is well illustrated in the next figure, which shows median household wealth by education level, defined by the education level of the head of household.

As we see, black median household wealth is below white median household wealth at every education level, with the gap growing with the level of education.  In fact, the median black household headed by someone with an advanced degree has less wealth than the median white household headed by someone with only a high school diploma.  The primary reason is that wealth is passed on from generation to generation, and the history of racism has made it difficult for black families to accumulate wealth, much less pass it on to future generations.

Security

The dollar value of household ownership of liquid assets is one measure of economic security.  The greater the value, the easier it is for a household to weather difficult times, not to mention unexpected crises such as today’s pandemic.  And as one might expect in light of the above income and wealth trends, black households have far less security than do white households.

As we can see in the following figure, the median black household held only $8,762 in liquid assets (defined as the sum of all cash, checking and savings accounts, and directly held stocks, bonds, and mutual funds).  In comparison, the median white household held $49,529 in liquid assets.  And the black-white gap is dramatically larger for households headed by someone with a bachelor’s degree or higher.

Hopeful possibilities

The fight against police violence toward African Americans, now being advanced in the streets, will eventually have to be expanded and the struggle for racial justice joined to a struggle for economic justice.  Ending the disparities highlighted above will require nothing less than a transformational change in the organization and workings of our economy.

One hopeful sign is the widespread popular support for and growing participation in the Black Lives Matter-led movement that is challenging not only racist policing but the idea of policing itself and is demanding that the country acknowledge and confront its racist past.  Perhaps the ways in which our current economic system has allowed corporations to so quickly shift the dangers and costs of the pandemic onto working people, following years of steady decline in majority working and living conditions, are helping whites better understand the destructive consequences of racism and encouraging this political awakening.

If so, perhaps we have arrived at a moment where it will be possible to build a multi-racial working class-led movement for structural change that is rooted in and guided by a commitment to achieving economic justice for all people of color. One can only hope that is true for all our sakes.

Coronavirus: a return to normal is not good enough

We shouldn’t be satisfied with a return to normalcy. We need a “new normal.”

We are now in a recession, one triggered by government-ordered closures of businesses producing nonessential goods and services, an action taken to limit the spread of the coronavirus. In response, Congress has approved three stimulus measures that legislators hope will keep the economy afloat until the virus is contained and companies can resume business as usual.

Many people, rightly criticizing the size, speed, and aims of these measures, have called for a new, improved stimulus package.  But what is getting far less attention, and may be the most important thing to criticize, is the notion that we should view a return to normalcy as our desired goal.  The fact is we also need a new economy.

The old normal only benefited a few

The media, even those critical of the Trump administration, all too often showcase economic experts who, while acknowledging the severity of the current crisis, reassure us that economic activity will return to normal before too long.  But since our economy increasingly worked to benefit a small minority, that is no cause for celebration.

Rarely mentioned is the fact that our economy was heading into a recession before the coronavirus hit. Or that living and working conditions for the majority of Americans were declining even during the past years of expansion. Or that the share of workers in low-wage jobs was growing over the last fifteen years.  Or that Americans are facing a retirement crisis.  Or that life expectancy fell from 2014 to 2017 because of the rise in mortality among young and middle-aged adults of all racial groups due to drug overdoses, suicides, and alcoholism.  If existing patterns of ownership and production remain largely unchanged, we face a future of ever greater instability, inequality, and poverty.

The economic crisis

The failings of our current system are only accentuated by the crisis. Many analysts are predicting an unprecedented single-quarter GDP decline of 8 to 10 percent in the second quarter of this year.  The overall decline for the year may well be in the 5 to 7 percent range, the steepest annual drop in growth since 1946.

The unemployment rate is soaring and may reach 20 percent before the year is out.  A recent national survey found that 52 percent of workers under the age of 45 have already lost their job, been placed on leave, or had their hours cut because of the pandemic-caused downturn.

As a consequence, many people are finding it difficult to pay rent.  Survey results show that only 69 percent of renters paid their rent during the first week of April compared with over 80 percent during the first week of March.  And this includes renters who made partial payments.  Homeowners are not in much better shape.

Our unemployment insurance system has long been deficient: benefits are inadequate, they last for only a short period of time, and eligibility restrictions leave many workers uncovered. As of year-end 2019, the average unemployment insurance check was only $378 a week, the average duration of benefits was less than 15 weeks, and fewer than one-third of those unemployed were drawing benefits.

Now, the system is overwhelmed by people seeking to file new claims, leaving millions unable even to start their application.  Although recent federal legislation allows states to expand their unemployment insurance eligibility and benefits, a very large share of those losing their jobs will find this part of our safety net not up to the task.

A better crafted stimulus is needed

In response to the crisis, policymakers have struggled to approve three so-called stimulus measures, the March 2020 Coronavirus Aid, Relief, and Economic Security (CARES) Act being the largest and most recent.  Unfortunately, these efforts have been disappointing.  For example, most of the provisions in the CARES Act include set termination dates untied to economic or health conditions. Approved spending amounts for individuals are also insufficient, notwithstanding Treasury Secretary Mnuchin’s claim that the $1,200 provided to most Americans under the CARES Act will be enough to tide them over for 10 weeks (that works out to $120 a week).

Also problematic is that not all CARES Act funds are directed to where they are most needed.  For example, no money was allocated to help states maintain their existing Medicaid program eligibility and benefit standards or expand health care coverage to uninsured immigrants and those who lose their job-based insurance.  And no money was allocated to state and local governments to help them maintain existing services in the face of declining tax revenues. Perhaps not surprisingly, the largest share of spending approved under the CARES Act is earmarked for corporate rescues, without any requirement that the funds be used to save jobs or wages.  In sum, we need another, better stimulus measure if we hope to minimize the social costs of our current crisis.

Creating a new normal

Even a better stimulus measure would leave our economy largely unchanged.  Yet, ironically, our perilous situation has encouraged countless expressions of social trust and solidarity that reveal ways to move toward a more humane, egalitarian, and sustainable economy.  This starts with the growing recognition by many Americans that social solidarity, not competitive individualism, should shape our policies. People have demonstrated strong support for free and universal access to health care during this crisis, and we can build on that to push for an expansive Medicare for All health care system.  People have also shown great solidarity with the increasingly organized struggles of mail carriers, health care workers, bus drivers, grocery shoppers, cashiers, and warehouse workers to keep themselves safe while they brave the virus for our benefit.  We can build on that solidarity to push for new labor laws that strengthen the ability of all workers to form strong, democratic unions.

There is also growing support for putting social well-being before the pursuit of profit.  Many people have welcomed government action mandating that private corporations convert their production to meet social needs, such as the production of ventilators and masks.  We can build on this development to encourage the establishment of publicly owned and operated industries to ensure the timely and affordable production of critical goods like pharmaceuticals and health care equipment. And many people are coming to appreciate the importance of planning for future crises.  This appreciation can be deepened to encourage support for the needed transformation of our economy to minimize the negative consequences of the growing climate crisis.

We should not discount our ability to shape the future we want.

The Green New Deal and the State: Lessons from World War II—Part II

There is growing interest in a Green New Deal, but far too little discussion among supporters about the challenging nature of the required economic transformation, the necessary role of public planning and ownership in shaping it, or the strategies necessary to institutionalize a strong worker-community voice in the process and final outcome. In this two-part series I draw on the experience of World War II, when the state was forced to direct a rapid transformation from civilian to military production, to help encourage and concretize that discussion.

In Part I, I first discussed the need for a rapid Green New Deal-inspired transformation and the value of studying the U.S. experience during World War II to help us achieve it. Then, I examined the evolution, challenges, and central role of state planning in the wartime conversion to alert us to the kind of state agencies and capacities we will need to develop. Finally, I highlighted two problematic aspects of the wartime conversion and postwar reconversion which we must guard against: the ability of corporations to strengthen their dominance and the marginalization of working people from any decision-making role in conversion planning.

Here in Part II, I discuss the efforts of labor activists to democratize the process of transformation during the war period in order to sharpen our thinking about how best to organize a labor-community movement for a Green New Deal.  During this period, many labor activists struggled against powerful political forces to open up space for new forms of economic planning with institutionalized worker-community involvement.  The organizing and movement-building efforts of District 8 leaders of the United Electrical, Radio & Machine Workers of America (UE), as described by Rosemary Feurer in her book Radical Unionism in the Midwest, 1900-1950, stand out in this regard.  Although their success was limited, there is much that we can learn from their efforts.

Organizing for a worker-community planned conversion process

District 8 covered Missouri, Iowa, Kansas, Arkansas, southern Indiana, and southern and western Illinois, and UE contracts in that area were heavily weighted toward small and medium-sized firms producing mechanical and electrical products.  As the government began its wartime economic conversion in 1941, its policy of suppressing civilian goods and rewarding big corporations with defense contracts hit the firms that employed UE members hard.

The UE response was to build a labor and community-based effort to gain control over the conversion process. In Evansville, Indiana, the UE organized a community campaign titled “Prevent Evansville from Becoming a Ghost Town.”  As Feurer explains,

District 8’s tentative proposal called upon union and civic and business leaders to request the establishment of a federal program that would “be administered through joint and bona fide union-management-government cooperation” at the local level. It would ensure that before reductions in the production of consumer goods were instituted, government must give enough primary war contracts and subcontracts to “take up the slack” of unemployment caused in cities such as Evansville. It also proposed that laid-off workers would get “first claim on jobs with other companies in the community,” while excessive overtime would be eliminated until unemployment was reduced.

District 8 organizers pressed Evansville’s mayor to gather community, labor, and business representatives from all over the Midwest to discuss how to manage the conversion to save jobs.  They organized mass petition drives and won endorsements for their campaign from many community groups and small businesses.  Persuaded, Evansville’s mayor contacted some 500 mayors from cities with populations under 250,000 in eleven midwestern states, requesting that they send delegations of “city officials, labor leaders, managers of industry and other civic leaders” to a gathering in Chicago.  Some 1,500 delegates attended the September meeting.

The conference endorsed the UE’s call for a significant role for labor in conversion planning, specifically “equal participation of management and labor in determining a proper and adequate retraining program and allocation of primary and sub-contracts. . . [And that] all possible steps be taken to avoid serious dislocations in non-defense industries.”  A committee of seven, with two labor representatives, was chosen to draw up a more concrete program of action.

One result was that Evansville and Newton, Iowa (another city with a strong UE presence) were named “Priority Unemployment Plan” areas, and allowed to conduct “an experiment for community-based solving of unemployment and dislocations caused by war priorities.”  The plan restricted new plant construction if existing production capacity was considered sufficient, encouraged industry-wide and geographical-based pooling of production facilities to boost efficiency and stabilize employment, required companies to provide training to help workers upgrade their skills, and supported industry-wide studies to determine how to best adapt existing facilities for military production.

William Sentner, the head of District 8, called for labor to take a leading role in organizing community gatherings in other regions and creating regional planning councils. Unfortunately, CIO leaders did little to support the idea. Moreover, once the war started, unemployment stopped being a serious problem and the federal government took direct control over the conversion process.

Organizing for a worker-community planned reconversion process

As the war began to wind down, District 8 leaders once again took up the issue of conversion, this time conversion back to a peacetime economy.  In 1943, they got the mayor of St. Louis to create a community planning committee, with strong labor participation, to discuss future economic possibilities for the city.  In 1944, they organized a series of union conferences with elected worker representatives from each factory department in plants under UE contract throughout the district, along with selected guests, to discuss reconversion and postwar employment issues.

At these conferences District 8 leaders emphasized the importance of continued government planning to guarantee full employment, but also stressed that the new jobs should be interesting and fulfilling and the workweek should be reduced to 30 hours to allow more time for study, recreation, and family life.  They also discussed the importance of other goals: an expansion of workers’ rights in production; labor-management collaboration to develop and produce new products responsive to new needs; support for women who wanted to continue working, in part by the provision of nurseries; and the need to end employment discrimination against African Americans.

While these conferences were taking place, the Missouri River flooded, covering many thousands of acres of farmland with dirt and sand, and leaving thousands of people homeless.  The US Army Corps of Engineers rushed to take advantage of the situation, proposing a major dredging operation to deepen the lower Missouri River channel, an effort strongly supported by big shipping interests.  It became known as the Pick Plan. Not long after, the Bureau of Reclamation proposed a competing plan that involved building a series of dams and reservoirs in the upper river valley, a plan strongly supported by big agricultural interests. It became known as the Sloan Plan.

While lower river and upper river business interests battled, a grassroots movement grew across the region opposing both plans, seeing them, each in their own way, as highly destructive.  For example, building the dams and reservoirs would destroy the environment and require the flooding of hundreds of thousands of acres, much of it owned by small farmers, and leave tens of thousands of families without homes.

Influenced by the growing public anger, newspapers in St. Louis began calling for the creation of a new public authority, a Missouri Valley Authority (MVA), to implement a unified plan for flood control and development that was responsive to popular needs.  Their interest in an MVA reflected the popularity of the Tennessee Valley Authority (TVA), an agency created in 1933 and tasked with providing cheap electricity to homes and businesses and addressing many of the region’s other development challenges, such as flooding, land erosion, and population out-migration.  In fact, during the 1930s, a number of bills were submitted to Congress to establish other river-based regional authorities.  Roosevelt endorsed seven of them, but they all died in committee as Congress grew more conservative and war planning took center stage in Washington, DC.

District 8, building on its desire to promote postwar regional public planning, eagerly took up the idea of an MVA.  It issued a pamphlet titled “One River, One Plan” that laid out its vision for the agency.  As a public agency, it was to be responsive to a broad community steering committee; have the authority to engage in economic and environmental planning for the region; and, like the TVA, directly employ unionized workers to carry out much of its work.  Its primary tasks would be the electrification of rural areas and flood control through soil and water conservation projects and reforestation.  The pamphlet estimated that five hundred thousand jobs could be created within five years as a result of these activities and the greater demand for goods and services flowing from electrification and the revitalization of small farms and their communities.

District 8 used its pamphlet to launch a community-based grassroots campaign for its MVA, which received strong support from many unions, environmentalists, and farm groups.  And, in August 1944, Senator James Murray from Montana submitted legislation to establish an MVA, written largely with the help of District 8 representatives.  A similar bill was submitted in the House.  Both versions called for a two-year planning period with the final plan to be voted on by Congress.

District 8 began planning for a bigger campaign to win Congressional approval.  However, their efforts were dealt a major blow when rival supporters of the Pick and Sloan plans settled their differences and coalesced around a compromise plan.  Congress quickly approved the Pick-Sloan Flood Control Act in late December 1944, but Senator Murray succeeded in removing the act’s anti-MVA provisions, giving MVA supporters some hope that they could still prevail.

District 8 leaders persuaded their national union to assign staff to help them establish a St. Louis committee, a nine-state committee, and a national committee to support the MVA. The St. Louis committee was formed in January 1945 with a diverse community-based steering committee.  Its strong outreach effort was remarkably successful, even winning support from the St. Louis Chamber of Commerce.  Feurer provides a good picture of the breadth and success of the effort:

By early 1945, other city-based committees were organizing in the nine-state region. A new national CIO committee for an MVA laid plans for “reaching every CIO member in the nine-state region on the importance of regionally administered MVA.”  In addition, other state CIO federations pledged to organize for an MVA and to disseminate material on the MVA through local unions to individual members.  Further, the seeds planted in 1944 among AFL unions were beginning to develop into a real coalition.  In Kansas City, the AFL was “circulating all the building trades unions in the nine states for support” to establish a nine-state building trades MVA committee. Both the AFL and CIO held valley-wide conferences on the MVA to promote and organize for it.

Murray submitted a new bill in February 1945, which included new measures on soil conservation and the protection of wild game, water conservation, and forest renewal. It also gave the MVA responsibility for the “disposal of war and defense factories to encourage industrial and business expansion.”

But the political tide had turned.  The economy was expanding, the Democratic Party was moving rightward, and powerful forces were promoting a growing fear of communism.  Murray’s new bill was shunted to a hostile committee, and big business mounted an unrelenting and successful campaign to kill it, arguing that the MVA would establish an undemocratic “super-government,” was a step toward “state socialism,” and was now unnecessary given passage of the Pick-Sloan Flood Control Act.

Drawing lessons

A careful study of District 8’s efforts, especially its campaign for an MVA, can help us think more creatively and effectively about how to build a labor-community coalition in support of a Green New Deal.  In terms of policy, there are many reasons to consider following District 8 in advocating for regionally based public entities empowered to plan and direct economic activity as a way to begin the national process of transformation.  For example, many of the consequences of climate change are experienced differently depending on region, which makes it far more effective to plan regional responses.  And many of the energy and natural resources that need to be managed during a period of transformation are shared by neighboring states.  Moreover, state governments, unions, and community groups are more likely to have established relations with their regional counterparts, making conversation and coordination easier to achieve.  Also, regionally organized action would make it much harder for corporations to use inter-state competition to weaken initiatives.

Jonathan Kissam, UE’s Communication Director and editor of the UE News, advocates just such an approach:

UE District 8’s Missouri Valley Authority proposal could easily be revived and modernized, and combined with elements of the British proposal for a National Climate Service. A network of regional Just Transition Authorities, publicly owned and accountable to communities and workers, could be set up to address the specific carbon-reduction and employment needs of different regions of the country.

The political lessons are perhaps the most important.  District 8’s success in building significant labor-community alliances around innovative plans for war conversion and then peacetime reconversion highlights the pivotal role unions can, or perhaps must, play in a progressive transformation process.  Underpinning this success was District 8’s commitment to sustained internal organizing and engagement with community partners.  Union members embraced the campaigns because they could see how a planned transformation of regional economic activity was the only way to secure meaningful improvements in workplace conditions, and such a transformation could only be won in alliance with the broader community.  And community allies, and eventually even political leaders, were drawn to the campaigns because they recognized that joining with organized labor gave them the best chance to win structural changes that also benefited them.

We face enormous challenges in attempting to build a similar kind of working class-anchored movement for a Green New Deal-inspired economic transformation.  Among them: weakened unions; popular distrust of the effectiveness of public planning and production; and weak ties between labor, environmental, and other community groups.  Overcoming these challenges will require our own sustained conversations and organizing to strengthen the capacities of, and connections between, our organizations and to develop a shared and grounded vision of a Green New Deal, one that can unite and empower the broader movement for change we so desperately need.

Climate Change, The Green New Deal, and the Struggle for Climate Justice

Most calls for a Green New Deal correctly emphasize that it must include a meaningful commitment to climate justice.  That is because climate change—for reasons of racism and capitalist profit-making—disproportionately punishes frontline communities, especially low-income communities and communities of color.

A study published in 2020 on redlining (“the historical practice of refusing home loans or insurance to whole neighborhoods based on a racially motivated perception of safety for investment”) and urban heat islands helps shed light on the process.  The authors of the study, Jeremy S. Hoffman, Vivek Shandas, and Nicholas Pendleton, examined temperature patterns in 108 US urban areas and found that 94 percent of them displayed “consistent city-scale patterns of elevated land surface temperatures in formerly redlined areas relative to their non-redlined neighbors by as much as 7 degrees Celsius (or 13 degrees Fahrenheit).”

As one of the authors explained in an interview:

“We found that those urban neighborhoods that were denied municipal services and support for home ownership during the mid-20th century now contain the hottest areas in almost every one of the 108 cities we studied,” Shandas said. “Our concern is that this systemic pattern suggests a woefully negligent planning system that hyper-privileged richer and whiter communities. As climate change brings hotter, more frequent and longer heat waves, the same historically underserved neighborhoods — often where lower-income households and communities of color still live — will, as a result, face the greatest impact.”

Urban heat islands

Climate scientists have long been aware of the existence of urban heat islands, localized areas of excessive land surface heat.  The urban heat island effect can cause temperatures to vary by as much as 10 degrees C within a single urban area.  As heat extremes become more common, and last longer, the number of associated illnesses and even deaths can be expected to rise.  Already, as Hoffman, Shandas, and Pendleton note,

extreme heat is the leading cause of summertime morbidity and has specific impacts on those communities with pre-existing health conditions (e.g., chronic obstructive pulmonary disease, asthma, cardiovascular disease, etc.), limited access to resources, and the elderly. Excess heat limits the human body’s ability to regulate its internal temperature, which can result in increased cases of heat cramps, heat exhaustion, and heatstroke and may exacerbate other nervous system, respiratory, cardiovascular, genitourinary, and diabetes-related conditions.

Studies have identified two clear causes of urban heat extremes.  One is the density of impervious surface area: the greater the density, the hotter the land surface temperature.  The other is the tree canopy: the greater the canopy, the cooler the land surface temperature.  And as the three authors observe, “emerging research suggests that many of the hottest urban areas also tend to be inhabited by resource-limited residents and communities of color, underscoring the emerging lens of environmental justice as it relates to urban climate change and adaptation.” What their study helps us understand is that the process by which communities of color and the poor came to live in areas with more impervious surface area and fewer green spaces was to a large degree the “result of racism and market forces.”

Racism and redlining

Racism in housing has a long history.  Kale Williams, writing in the Oregonian newspaper, highlights the history in Portland, Oregon:

Exclusionary covenants, legal clauses written into property deeds, prohibited people of certain races, specifically African Americans and people of Asian descent, from purchasing homes. In 1919, the Portland Realty Board adopted a rule declaring it unethical to sell a home in a white neighborhood to an African American or Chinese person. The rules stayed in place until 1956.

In 1924, Portland voters approved the city’s first zoning policies. More than a dozen upscale neighborhoods were zoned for single-family homes. The policy, pushed by homeowners under the guise of protecting their property values, kept apartment buildings and multi-family homes, housing options more attainable for low-income residents, in less-desirable areas.

Portland was no isolated case; racism shaped national housing policy as well.  In 1933, Congress, as part of the New Deal, passed the Home Owners’ Loan Act, which established the Home Owners’ Loan Corporation (HOLC).  The purpose of the HOLC was to help homeowners refinance mortgages currently in default to prevent foreclosure and, of course, reduce stress on the financial system. It did that by issuing bonds, using the funds to purchase housing loans from lenders, and then refinancing the original mortgages, offering homeowners easier terms.

Between 1935 and 1940, the HOLC drew residential “security” maps for 239 cities across the United States.  These maps were made to assess the long-term value of real estate now owned by the Federal Government and the health of the banking industry. They were based on input from local appraisers, neighborhood surveys, and neighborhood demographics.

As Hoffman, Shandas, and Pendleton describe, the HOLC:

created color-coded residential maps of 239 individual US cities with populations over 40,000. HOLC maps distinguished neighborhoods that were considered “best” and “hazardous” for real estate investments (largely based on racial makeup), the latter of which was outlined in red, leading to the term “redlining.” These “Residential Security” maps reflect one of four categories ranging from “Best” (A, outlined in green), “Still Desirable” (B, outlined in blue), “Definitely Declining” (C, outlined in yellow), to “Hazardous” (D, outlined in red).

This identification of problem neighborhoods with the racial makeup of the neighborhood was no accident.  And because the maps were widely distributed to other government bodies and private financial institutions, they served to guide private mortgage lending as well as government urban planning in the years that followed.  Areas outlined in red were almost always majority African-American.  And as a consequence of the rating system, those who lived in them had more difficulty getting home loans or upgrading their existing homes. Redlined neighborhoods were also targeted as prime locations for development of multi-unit buildings, industrial use, and freeway construction.

As one might expect, a 2019 paper by three researchers with the Chicago Federal Reserve Bank found:

a significant and persistent causal effect of the HOLC maps on the racial composition and housing development of urban neighborhoods. These patterns are consistent with the hypothesis that the maps led to reduced credit access and higher borrowing costs which, in turn, contributed to disinvestment in poor urban American neighborhoods with long-run repercussions.

What Hoffman, Shandas, and Pendleton establish in their paper is that this racially influenced mapping has also had real climate consequences.  Urban heat islands are not just randomly distributed through an urban area—they are more often than not located in redlined areas.  And those extra degrees of heat have real health and financial consequences. As Hoffman explains, the impact on residents of those heat islands is serious and wide-ranging:

“They are not only experiencing hotter heat waves with their associated health risks but also potentially suffering from higher energy bills, limited access to green spaces that alleviate stress and limited economic mobility at the same time,” Hoffman said. “Our study is just the first step in identifying a roadmap toward equitable climate resilience by addressing these systemic patterns in our cities.”

Redlining and climate change

Hoffman, Shandas, and Pendleton condensed the 239 HOLC maps into a database of 108 US cities.  They excluded cities that were not mapped with all four HOLC security rating categories, and in some cases they had to remove overlapping security rating boundaries or merge boundaries that had been drawn in different years.  The map below shows the location of the 108 cities.

They then used land surface temperature (LST) maps generated during the summer months of 2014 through 2017 to estimate LSTs in all four color-coded neighborhood categories in each of the 108 cities, in order to determine whether there was a relationship between LST and neighborhood rating in each city.
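
In essence, the comparison reduces to computing, for each city, the gap between mean LST in D-rated and A-rated areas.  The following minimal sketch in Python shows that calculation; the table, column names, and temperature values are illustrative placeholders, not the authors’ actual data or code.

```python
# A minimal sketch of the study's core comparison: per-city mean land
# surface temperature (LST) in D-rated vs. A-rated areas.
# All values below are illustrative placeholders, not the study's data.
import pandas as pd

lst = pd.DataFrame({
    "city":  ["Portland, OR", "Denver, CO", "Example City"],
    "lst_a": [28.0, 27.5, 30.1],  # mean summer LST (deg C), A-rated areas
    "lst_d": [35.0, 34.4, 31.0],  # mean summer LST (deg C), D-rated areas
})

# Intra-urban heat gap: positive values mean formerly redlined (D) areas run hotter
lst["d_minus_a"] = lst["lst_d"] - lst["lst_a"]

share_hotter = (lst["d_minus_a"] > 0).mean()  # fraction of cities where D > A
print(lst[["city", "d_minus_a"]])
print(f"Cities with hotter D-rated areas: {share_hotter:.0%}")
```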

They found that present-day temperatures were noticeably higher in D-rated areas relative to A-rated areas in approximately 94 percent of the 108 cities.  The results are illustrated below. Figure a shows the LST difference between ranked neighborhoods for the country as a whole.  The four other figures do the same for each designated region of the country.

Portland, Oregon and Denver, Colorado had the greatest D to A temperature differences, with their D-rated areas some 7 degrees Celsius warmer than their A-rated areas (or some 13 degrees warmer in Fahrenheit).  For the nation as a whole, D-rated areas are now on average 2.6 degrees Celsius warmer than A-rated areas. Thus, as the authors note, “current maps of intra-urban heat echo the legacy of past planning policies.”   Moreover,

[These] indicators of and/or higher intra-urban LSTs have been shown to correlate with higher summertime energy use, and excess mortality and morbidity. The fact that residents living in formerly redlined areas may face higher financial burdens due to higher energy and more frequent health bills further exacerbates the long-term and historical inequities of present and future climate change.

As this study so clearly shows, we are not all in the same boat when it comes to climate change; racial and class dimensions matter.  The poor and people of color suffer the most from global warming, largely because of the way racism and profit-making combined to shape urbanization in the United States.  But this is only one example.  A transformative Green New Deal must bring to light the ways in which this dynamic has shaped countless other processes, and it must embrace and support the economic and climate struggles of frontline communities.

Flying Above the Clouds: the US Military and Climate Change

Climate change is occurring, highlighted by dramatically shifting weather patterns and ever more deadly storms, floods, droughts, and wildfires.  And the evidence is overwhelming that it is driven by the steady increase in greenhouse gases in our atmosphere, especially carbon dioxide and methane, produced by our fossil fuel-based economic system.

Aware of global warming’s deadly human consequences, millions of people have taken to the streets to demand that governments take action to end our use of fossil fuels as part of a massive system-wide economic transformation that would also be designed to ensure a just transition for all communities and workers.

As movements here in the US take aim at the fossil fuel industry and government leaders that continue to resist efforts to promote more sustainable and egalitarian forms of energy generation and distribution, transportation, agriculture, and housing, the largest generator of greenhouse gas emissions continues to fly above the clouds and largely out of public view.  As Neta Crawford, Co-Director of Brown University’s Costs of War Project, states in her recently published study of Pentagon fuel use and climate change, “the Department of Defense is the world’s largest institutional user of petroleum and correspondingly, the single largest producer of greenhouse gases in the world.”

Flying above the clouds

We know that we have an enormous military budget.  US military spending is greater than the total military spending of the next seven countries combined: China, Saudi Arabia, India, France, Russia, United Kingdom, and Germany.  The budget of the Department of Defense alone commands more than half of all US federal discretionary spending each year.  Add in spending on national security activities and weapons included in other departmental budgets, like that of the Department of Energy, and the military’s budget share approaches two-thirds of all discretionary spending.

This kind of information is readily available.  The US military’s contribution to global warming is not.  One reason is that, because of US government pressure, the governments negotiating the Kyoto Protocol agreed that emissions generated by military activity would not count as national emissions and would not have to be reported.  That exemption remained in the agreement even though the US government never signed on to the Kyoto Protocol.  Perhaps as a consequence, the Intergovernmental Panel on Climate Change also does not include national military emissions in its calculations.  Although the Paris Accord removed the exemption, the US government is committed to withdrawing from that agreement in 2020.

Uncovering the carbon costs of the US military

Although the US military does not publicly disclose its fuel use, four researchers—Oliver Belcher, Benjamin Neimark, Patrick Bigger, and Cara Kennelly—have recently published an article, based on multiple Freedom of Information Act requests to the US Defense Logistics Agency, that provides a good estimate.

The Defense Logistics Agency (DLA) is charged with overseeing the supply chain that supports all military activities, including its warfighting, peacekeeping, and base operations.  The Defense Logistics Agency–Energy (DLA-E), a unit within the DLA, has responsibility for managing the military’s energy requirements.  In the words of Belcher, Neimark, Bigger, and Kennelly, “the DLA-E is the one-stop shop for fueling purchases and contracts within the US military both domestically and internationally, and acts as the US military’s internal market for all consumables, including fuel.”

In simple terms, the military needs fuel—to fly its jets and bombers on surveillance or attack missions, to deliver troops and weapons to bases and areas of conflict, to power ships on maneuvers, to run the vehicles used by patrols and fighting forces, and to maintain base operations here and around the world. And because it is the DLA-E that secures and distributes the required fuel, the four researchers used “Freedom of Information Act requests to compile a database of DLA-E records for all known land, sea, and aircraft fuel purchases, as well as fuel contracts made with US operators in military posts, camps, stations, and ship bunkers abroad from FY 2013 to 2017.”  The resulting calculation of total fuel purchases and use served as the basis for the authors’ estimate of the military’s production of greenhouse gas emissions.

The US military runs on fuel

The fuel dependence of the US military has grown dramatically over time, largely as a consequence of the nature of its continually evolving weapons systems and warfighting strategies.  For example, average fuel use per soldier grew from one gallon a day during World War II, to nine gallons a day by the end of the Vietnam War, to 22 gallons a day in the wars currently being fought in Afghanistan and Iraq.

One reason for this upward trajectory is that the US military has come to depend ever more on airpower to directly threaten or attack its enemies as well as support its heavily armored ground forces operating in foreign countries. As Crawford explains, the US military consumes so much energy because “its fighting ‘tooth’ employs equipment that guzzles fuel at an incredible rate . . . [and its] logistical ‘tail’ and the installations that support operations are also extremely fuel intensive.”

For example, Crawford reports that the fuel consumption of a B-2 Bomber is 4.28 gallons to the mile.  Read that carefully: that is gallons to the mile, not the more common miles to the gallon.  The fuel consumption of an F-35A Fighter bomber is 2.37 gallons to the mile, and that of a KC-135R Refueling Tanker (loaded with transfer fuel) is 4.9 gallons to the mile.  “Even the military’s non-armored vehicles are notoriously inefficient. For instance, the approximately 60,000 HUMVEEs remaining in the US Army fleet get between four to eight miles per gallon of diesel fuel.”
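
Since gallons-to-the-mile figures are easy to misread, here is a quick arithmetic check in Python that converts the numbers cited above into the more familiar miles per gallon (simply the reciprocal):

```python
# Convert the cited fuel-burn figures from gallons per mile into
# the more familiar miles per gallon (mpg), which is the reciprocal.
fuel_burn_gal_per_mile = {
    "B-2 Bomber": 4.28,
    "F-35A Fighter": 2.37,
    "KC-135R Tanker (loaded)": 4.9,
}

for aircraft, gpm in fuel_burn_gal_per_mile.items():
    print(f"{aircraft}: {gpm} gal/mile = {1 / gpm:.2f} mpg")
# The B-2, for example, gets roughly 0.23 miles per gallon.
```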

Needless to say, an active military will burn through a lot of fuel.  And as Belcher, Neimark, Bigger, and Kennelly point out, the US military has indeed been busy: “Between 2015 and 2017, the US military was active in 76 countries, including seven countries on the receiving end of air/drone strikes, 15 countries with ‘boots on the ground,’ 44 overseas military bases, and 56 countries receiving counter-terrorism training.”

The carbon footprint of the US military

Belcher, Neimark, Bigger, and Kennelly determined that “the US military consumes more liquid fuels and emits more CO2e (carbon-dioxide equivalent) than many medium-sized countries.”  Comparing 2014 country liquid fuel consumption with US military liquid fuel consumption revealed that the US military, if treated as a country, would rank between Peru and Portugal.  The US military’s 2014 greenhouse gas emissions, just from its use of fuel, were roughly equal “to total–not just fuel–emissions from Romania.”  That year, the US military, again just from its fuel use, was the 47th largest emitter of greenhouse gases in the world, not far behind a host of other countries.

The US military’s ranking would be higher if its other emissions were included, such as from the electricity and food the military consumes, or the land use changes from military operations.  And of course, none of this includes the emissions from the many corporations engaged in producing weapons for the military. In 2017, the US military purchased about 269,230 barrels of oil a day and emitted 25,375.8 kt-CO2e by burning those fuels.
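
To put those two 2017 figures on the same footing, the short sketch below simply unpacks the arithmetic implied by the numbers in the text; the per-barrel emissions factor it derives is a back-of-the-envelope inference, not a figure reported by the authors, and the true factor varies with the fuel mix:

```python
# Back-of-the-envelope unpacking of the reported 2017 figures.
barrels_per_day = 269_230        # reported average daily fuel purchases
reported_kt_co2e = 25_375.8      # reported annual emissions, kt CO2e

annual_barrels = barrels_per_day * 365             # ~98.3 million barrels
t_co2e_per_barrel = reported_kt_co2e * 1_000 / annual_barrels

print(f"Annual purchases: {annual_barrels:,} barrels")
print(f"Implied average:  {t_co2e_per_barrel:.2f} t CO2e per barrel burned")
```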

One reason that the US military is such a large greenhouse gas emitter is that most of its fuel is jet fuel procured for use by the Air Force or Navy.  Their planes burn the fuel at extremely high altitudes, which “produces different kinds of chemical reactions, resulting in warming 2–4 times greater than on the ground.”

The military’s response to climate change

The military is well aware of the dangers of climate change—in contrast to many of our leading politicians.  One reason is that climate change threatens its operational readiness. As Crawford explains:

In early 2018, the DOD reported that about half of their installations had already experienced climate change related effects. A year later, the DOD reported that the US military is already experiencing the effects of global warming at dozens of installations. These include recurrent flooding (53 installations), drought (43 installations), wildfires (36 installations) and desertification (6 installations).

But most importantly, the military sees climate change as a threat to US national security.  For years, the military has considered the impact of climate change in its defense planning because, as a recent report from the Office of the Director of National Intelligence puts it, “global environmental and ecological degradation, as well as climate change, are likely to fuel competition for resources, economic distress, and social discontent through 2019 and beyond.”  Of course, in planning responses to possible climate-generated threats to US interests, the military remains committed to strengthening its capacity for action, even though doing so adds to the likelihood of greater climate chaos.

In short, people are right to demand that governments take meaningful and immediate steps to stop global warming.  And those steps need to include significant reductions in military spending as well as in overseas bases and interventions.  Since the US military is the single largest producer of greenhouse gases in the world, the fight to rein in militarism in this country is especially important.  As an added benefit, the money freed up could be put to good use helping to finance the broader system-wide transformation required to create an ecologically responsive economy.