Victory: Ohio’s plan to deny workers their unemployment insurance is shelved

Some stories are just so satisfying that they deserve to be shared.  Here is one.

In early May, Ohio Republican Governor Mike DeWine began reopening the state economy.  And to support business and slash state expenses, both at workers’ expense, he had a “COVID-19 Fraud” form put up on the Ohio Department of Job and Family Services website where employers could confidentially report employees “who quit or refuse work when it is available due to COVID-19.”  Inspectors would then investigate whether the reported workers should lose their unemployment benefits and possibly be charged with unemployment fraud.

Significantly, as Sarah Ingles, the board president of the Central Ohio Worker Center, noted in a statement quoted by the Intercept, the form “does not define what constitutes a ‘good cause’ exemption, and by doing so, may exclude many Ohio workers who have justifiable reasons for not returning to work and for receiving unemployment insurance benefits.”  In other words, “while the state did not take the time to define what a ‘good cause’ exemption includes or does not include, it did have time to develop an online form where employers could report employees.”

However, thanks to the work of an anonymous hacker, the form has now been taken down. In officialese, “The previous form is under revision pending policy references.”  Most importantly, as Janus Rose, writing for Motherboard, reports:

“No benefits are being denied right now as a result of a person’s decision not to return to work while we continue to evaluate the policy,” ODJFS Director Kimberly Hall told Cleveland.com.

According to Rose, the hacker developed a script that overwhelmed the system:

The script works by automatically generating fake information and entering it into the form. For example, the companies are taken from a list of the top 100 employers in the state of Ohio—including Wendy’s, Macy’s, and Kroger—and names and addresses are randomly created using freely-available generators found online. Once all the data is entered, the script has to defeat a CAPTCHA-like anti-spam measure at the end of the form. Unlike regular CAPTCHAs, which display a grid of pictures and words that the user must identify, the security tool used by the form is merely a question-and-answer field. By storing a list of common questions and their respective answers, the script can easily defeat the security measure by simply hitting the “switch questions” button until it finds a question it can answer.

To make the code more accessible, software engineer David Ankin repackaged the script into a simple command line tool which allows users to run the script in the background of their computer, continuously submitting fake data to the Ohio website.

“If you get several hundred people to do this, it’s pretty hard to keep your data clean unless you have data scientists on staff,” Ankin told Motherboard.

The hacker told Motherboard they viewed their effort as a form of direct action against the exploitation of working people during the COVID-19 crisis.  Score one for working people.

The 1930s and Now: Looking Back to Move Forward

My article What the New Deal Can Teach Us About Winning a Green New Deal is in the latest issue of the journal Class, Race and Corporate Power.  As I say in the abstract,

While there are great differences between the crises and political movements and possibilities of the 1930s and now, there are also important lessons that can be learned from the efforts of activists to build mass movements for social transformation during the Great Depression. My aim in this paper is to illuminate the challenges faced and choices made by these activists and draw out some of the relevant lessons for contemporary activists seeking to advance a Green New Deal.

Advocates of a Green New Deal often point to the New Deal and its government programs to demonstrate the possibility of a progressive, state-directed process of economic change.  I wrote my article to show that the New Deal was a response to growing mass activity that threatened the legitimacy and stability of the existing economic and political order, rather than a product of elite goodwill, and to examine the movement building process that generated that activity.

Depression-era activists were forced to organize in a period of economic crisis, mass unemployment and desperation, and state intransigence. While they fell short of achieving their goal of social transformation, they did build a movement of the unemployed and spark a wave of militant labor activism that was powerful enough to force state policy-makers to embrace significant, although limited, social reforms, including the creation of programs of public employment and systems of social security and unemployment insurance.

Differences between that time period and this one are shrinking, and the lessons we can learn from studying the organizing strategies and tactics of those activists are becoming ever more relevant.  The US economy is now in a deep recession, one that will be more devastating than the Great Recession.  US GDP shrank at a 4.8 percent annualized rate in the first quarter of this year and will likely contract at a far greater 25 percent annualized rate in the second quarter.  While most analysts believe the economy will begin growing again in the third quarter, their predictions are for an overall yearly decline in the 6-8 percent range.   As for the years ahead—no one can really say.  The Economist, for example, is talking about a 90 percent economy for years after the current lockdown ends.  In other words, life will remain hard for most working people for some time.
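A quick arithmetic aside may help in reading these figures (my own illustration of the standard compounding convention, not any agency’s exact method): an annualized rate states what the change would be if a single quarter’s pace continued for a full year,

\[ r_{\text{annual}} = (1 + r_{\text{quarter}})^{4} - 1. \]

So a quarterly contraction of about 1.2 percent annualizes to roughly 4.8 percent, and a 25 percent annualized decline corresponds to a quarterly drop of only about 7 percent. That is how a catastrophic-sounding second quarter can still be consistent with a full-year decline in the 6-8 percent range, once a partial second-half recovery is assumed.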

Not surprisingly, given the size of the economic contraction, unemployment has also exploded. According to the Economic Policy Institute, “In the past six weeks, nearly 28 million, or one in six, workers applied for unemployment insurance benefits across the country.”  More than a quarter of the workforce in the following states has filed for benefits: Hawaii, Kentucky, Georgia, Rhode Island, Michigan, and Nevada.  And tragically, millions of other workers have been prevented from applying because of outdated state computer systems and punitive regulations as well as overworked employment department staff.  Even at its best, the US unemployment system, established in 1935 as part of the New Deal reforms, was problematic, paying too little, for too short a time period, and with too many eligibility restrictions.  Now, it is collapsing under the weight of the crisis.

Yet, at the same time, worker organizing and militancy are growing. Payday Report has a strike tracker that has already identified over 150 strikes, walkouts, and sickouts since early March across a range of sectors and industries, including retail, fast food, food processing, warehousing, manufacturing, the public sector, health care, and the gig economy.  As an Associated Press story points out:

Across the country, the unexpected front-line workers of the pandemic — grocery store workers, Instacart shoppers and Uber drivers, among them — are taking action to protect themselves. Rolling job actions have raced through what’s left of the economy, including Pittsburgh sanitation workers who walked off their jobs in the first weeks of lockdown and dozens of fast-food workers in California who left restaurants last week to perform socially distant protests in their cars.

Rather than defending workers, governments are now becoming directly involved in suppressing their struggles. For example, after meatpacker walkouts, triggered by an alarming rise in the number of workers testing positive for the virus, closed at least 22 meat plants and threatened the operation of many others, President Trump signed an executive order requiring companies to remain open and fully staffed. It remains to be seen how workers will respond.  In Pennsylvania, the Governor responded to walkouts by nurses at nursing homes and long-term care facilities protesting a lack of protective equipment by sending National Guard members to replace them.

Activists throughout the country are now creatively exploring ways to support those struggling to survive the loss of employment and those engaged in workplace actions to defend their health and well-being.  Many are also seeking ways to weave the many struggles and current expressions of social solidarity together into a mass movement for radical transformation.  Despite important differences in political and economic conditions, activists today are increasingly confronting challenges similar to those faced by activists in the 1930s, and there is much we can learn from a critical examination of their efforts.  My article highlights what I believe are some of the most important lessons.

Coronavirus: a return to normal is not good enough

We shouldn’t be satisfied with a return to normalcy. We need a “new normal.”

We are now in a recession, one triggered by government-ordered closures of businesses producing nonessential goods and services, an action taken to limit the spread of the coronavirus. In response, Congress has approved three stimulus measures that legislators hope will keep the economy afloat until the virus is contained and companies can resume business as usual.

Many people, rightly criticizing the size, speed, and aims of these measures, have called for a new, improved stimulus package.  But what is getting far less attention, and may be the most important thing to criticize, is the notion that we should view a return to normalcy as our desired goal.  The fact is we also need a new economy.

The old normal only benefited a few

The media, even those critical of the Trump administration, all too often showcase economic experts who, while acknowledging the severity of the current crisis, reassure us that economic activity will return to normal before too long.  But since our economy increasingly worked to benefit a small minority, that is no cause for celebration.

Rarely mentioned is the fact that our economy was heading into a recession before the coronavirus hit. Or that living and working conditions for the majority of Americans were declining even during the past years of expansion. Or that the share of workers in low-wage jobs was growing over the last fifteen years.  Or that Americans are facing a retirement crisis.  Or that life expectancy fell from 2014 to 2017 because of the rise in mortality among young and middle-aged adults of all racial groups due to drug overdoses, suicides, and alcoholism.  If existing patterns of ownership and production remain largely unchanged, we face a future of ever greater instability, inequality, and poverty.

The economic crisis

The failings of our current system are only accentuated by the crisis. Many analysts are predicting an unprecedented single-quarter decline in GDP of 8 to 10 percent in the second quarter of this year.   The overall yearly decline may well be in the 5-7 percent range, the steepest annual drop in output since 1946.

The unemployment rate is soaring and may reach 20 percent before the year is out.  A recent national survey found that 52 percent of workers under the age of 45 have already lost their jobs, been placed on leave, or had their hours cut because of the pandemic-caused downturn.

As a consequence, many people are finding it difficult to pay rent.  Survey results show that only 69 percent of renters paid their rent during the first week of April compared with over 80 percent during the first week of March.  And this includes renters who made partial payments.  Homeowners are not in much better shape.

Our unemployment insurance system has long been deficient: benefits are inadequate, they last for only a short period of time, and eligibility restrictions leave many workers uncovered. As of year-end 2019, the average unemployment insurance check was only $378 a week, the average duration of benefits was less than 15 weeks, and fewer than one-third of those unemployed were drawing benefits.

Now, the system is overwhelmed by people seeking to file new claims, leaving millions unable to even start their application process.  Although recent federal legislation allows states to expand their unemployment insurance eligibility and benefits, a very large share of those losing their jobs will find this part of our safety net not up to its assigned job.

A better-crafted stimulus is needed

In response to the crisis, policy-makers have struggled to approve three so-called stimulus measures, the March 2020 Coronavirus Aid, Relief, and Economic Security (CARES) Act being the largest and most recent.  Unfortunately, these efforts have been disappointing.  For example, most of the provisions in the CARES Act include set termination dates untied to economic or health conditions. Approved spending amounts for individuals are also insufficient, notwithstanding Treasury Secretary Mnuchin’s claim that the $1,200 provided to most Americans as part of the CARES Act will be enough to tide them over for 10 weeks.

Also problematic is that not all CARES Act funds are directed to where they are most needed.  For example, no money was allocated to help states maintain their existing Medicaid program eligibility and benefit standards or expand health care coverage to uninsured immigrants and those who lose their job-based insurance.  And no money was allocated to state and local governments to help them maintain existing services in the face of declining tax revenues. Perhaps not surprisingly, the largest share of CARES Act spending is earmarked for corporate rescues without any requirement that the funds be used for saving jobs or wages.  In sum, we need another, better stimulus measure if we hope to minimize the social costs of our current crisis.

Creating a new normal

Even a better stimulus measure leaves our economy largely unchanged.  Yet, ironically, our perilous situation has encouraged countless expressions of social trust and solidarity that reveal ways to move forward to a more humane, egalitarian, and sustainable economy.  This starts with the growing recognition by many Americans that social solidarity, not competitive individualism, should shape our policies. People have demonstrated strong support for free and universal access to health care during this crisis, and we can build on that to push for an expansive Medicare for All health care system.  People also have shown great solidarity with the increasingly organized struggles of mail carriers, health care workers, bus drivers, grocery shoppers, cashiers, and warehouse workers to keep themselves safe while they brave the virus for our benefit.  We can build on that solidarity to push for new labor laws that strengthen the ability of all workers to form strong, democratic unions.

There is also growing support for putting social well-being before the pursuit of profit.  Many people have welcomed government action mandating that private corporations convert their production to meet social needs, such as the production of ventilators and masks.  We can build on this development to encourage the establishment of publicly owned and operated industries to ensure the timely and affordable production of critical goods like pharmaceuticals and health care equipment. And many people are coming to appreciate the importance of planning for future crises.  This appreciation can be deepened to encourage support for the needed transformation of our economy to minimize the negative consequences of the growing climate crisis.

We should not discount our ability to shape the future we want.

The Green New Deal and the State: Lessons from World War II—Part II

There is growing interest in a Green New Deal, but far too little discussion among supporters about the challenging nature of the required economic transformation, the necessary role of public planning and ownership in shaping it, or the strategies necessary to institutionalize a strong worker-community voice in the process and final outcome. In this two-part series I draw on the experience of World War II, when the state was forced to direct a rapid transformation from civilian to military production, to help encourage and concretize that discussion.

In Part I, I first discussed the need for a rapid Green New Deal-inspired transformation and the value of studying the U.S. experience during World War II to help us achieve it. Then, I examined the evolution, challenges, and central role of state planning in the wartime conversion to alert us to the kind of state agencies and capacities we will need to develop. Finally, I highlighted two problematic aspects of the wartime conversion and postwar reconversion which we must guard against: the ability of corporations to strengthen their dominance and the marginalization of working people from any decision-making role in conversion planning.

Here in Part II, I discuss the efforts of labor activists to democratize the process of transformation during the war period in order to sharpen our thinking about how best to organize a labor-community movement for a Green New Deal.  During this period, many labor activists struggled against powerful political forces to open up space for new forms of economic planning with institutionalized worker-community involvement.  The organizing and movement building efforts of District 8 leaders of the United Electrical, Radio & Machine Workers of America (UE), as described by Rosemary Feurer in her book Radical Unionism in the Midwest, 1900-1950, stand out in this regard.  Although their success was limited, there is much that we can learn from their efforts.

Organizing for a worker-community planned conversion process

District 8 covered Missouri, Iowa, Kansas, Arkansas, southern Indiana, and southern and western Illinois, and UE contracts in that area were heavily weighted towards small and medium-sized firms producing mechanical and electrical products.  As the government began its wartime economic conversion in 1941, its policy of suppressing civilian goods and rewarding big corporations with defense contracts hit the firms that employed UE members hard.

The UE response was to build a labor and community-based effort to gain control over the conversion process. In Evansville, Indiana, the UE organized a community campaign titled “Prevent Evansville from Becoming a Ghost Town.”  As Feurer explains,

District 8’s tentative proposal called upon union and civic and business leaders to request the establishment of a federal program that would “be administered through joint and bona fide union-management-government cooperation” at the local level. It would ensure that before reductions in the production of consumer goods were instituted, government must give enough primary war contracts and subcontracts to “take up the slack” of unemployment caused in cities such as Evansville. It also proposed that laid-off workers would get “first claim on jobs with other companies in the community,” while excessive overtime would be eliminated until unemployment was reduced.

District 8 organizers pressed Evansville’s mayor to gather community, labor, and business representatives from all over the Midwest to discuss how to manage the conversion to save jobs.  They organized mass petition drives and won endorsements for their campaign from many community groups and small businesses.  Persuaded, Evansville’s mayor contacted some 500 mayors from cities with populations under 250,000 in eleven midwestern states, requesting that they send delegations of “city officials, labor leaders, managers of industry and other civic leaders” to a gathering in Chicago.  Some 1500 delegates attended the September meeting.

The conference endorsed the UE’s call for a significant role for labor in conversion planning, specifically “equal participation of management and labor in determining a proper and adequate retraining program and allocation of primary and sub-contracts. . . [And that] all possible steps be taken to avoid serious dislocations in non-defense industries.”  A committee of seven, with two labor representatives, was chosen to draw up a more concrete program of action.

One result was that Evansville and Newton, Iowa (another city with a strong UE presence) were named “Priority Unemployment Plan” areas, and allowed to conduct “an experiment for community-based solving of unemployment and dislocations caused by war priorities.”  The plan restricted new plant construction if existing production capacity was considered sufficient, encouraged industry-wide and geographical-based pooling of production facilities to boost efficiency and stabilize employment, required companies to provide training to help workers upgrade their skills, and supported industry-wide studies to determine how to best adapt existing facilities for military production.

William Sentner, the head of District 8, called for labor to take a leading role in organizing community gatherings in other regions and creating regional planning councils. Unfortunately, CIO leaders did little to support the idea. Moreover, once the war started, unemployment stopped being a serious problem and the federal government took direct control over the conversion process.

Organizing for a worker-community planned reconversion process

As the war began to wind down, District 8 leaders once again took up the issue of conversion, this time conversion back to a peacetime economy.  In 1943, they got the mayor of St. Louis to create a community planning committee, with strong labor participation, to discuss future economic possibilities for the city.  In 1944, they organized a series of union conferences with elected worker representatives from each factory department in plants under UE contract throughout the district, along with selected guests, to discuss reconversion and postwar employment issues.

At these conferences District 8 leaders emphasized the importance of continued government planning to guarantee full employment, but also stressed that the new jobs should be interesting and fulfilling and the workweek should be reduced to 30 hours to allow more time for study, recreation, and family life.  They also discussed the importance of other goals: an expansion of workers’ rights in production; labor-management collaboration to develop and produce new products responsive to new needs; support for women who wanted to continue working, in part by the provision of nurseries; and the need to end employment discrimination against African Americans.

While these conferences were taking place, the Missouri River flooded, covering many thousands of acres of farmland with dirt and sand, and leaving thousands of people homeless.  The US Army Corps of Engineers rushed to take advantage of the situation, proposing a major dredging operation to deepen the lower Missouri River channel, an effort strongly supported by big shipping interests.  It became known as the Pick Plan. Not long after, the Bureau of Reclamation proposed a competing plan that involved building a series of dams and reservoirs in the upper river valley, a plan strongly supported by big agricultural interests. It became known as the Sloan Plan.

While lower river and upper river business interests battled, a grassroots movement grew across the region opposing both plans, seeing them, each in their own way, as highly destructive.  For example, building the dams and reservoirs would destroy the environment and require the flooding of hundreds of thousands of acres, much of it owned by small farmers, and leave tens of thousands of families without homes.

Influenced by the growing public anger, newspapers in St. Louis began calling for the creation of a new public authority, a Missouri Valley Authority (MVA), to implement a unified plan for flood control and development that was responsive to popular needs.  Their interest in an MVA reflected the popularity of the Tennessee Valley Authority (TVA), an agency created in 1933 and tasked with providing cheap electricity to homes and businesses and addressing many of the region’s other development challenges, such as flooding, land erosion, and population out-migration.  In fact, during the 1930s, several bills were submitted to Congress to establish other river-based regional authorities.  Roosevelt endorsed seven of them, but they all died in committee as the Congress grew more conservative and war planning took center stage in Washington DC.

District 8, building on its desire to promote postwar regional public planning, eagerly took up the idea of an MVA.  It issued a pamphlet titled “One River, One Plan” that laid out its vision for the agency.  As a public agency, it was to be responsive to a broad community steering committee; have the authority to engage in economic and environmental planning for the region; and, like the TVA, directly employ unionized workers to carry out much of its work.  Its primary tasks would be the electrification of rural areas and flood control through soil and water conservation projects and reforestation.  The pamphlet estimated that five hundred thousand jobs could be created within five years as a result of these activities and the greater demand for goods and services flowing from electrification and the revitalization of small farms and their communities.

District 8 used its pamphlet to launch a community-based grassroots campaign for its MVA, which received strong support from many unions, environmentalists, and farm groups.  And, in August 1944, Senator James Murray from Montana submitted legislation to establish an MVA, written largely with the help of District 8 representatives.  A similar bill was submitted in the House.  Both versions called for a two-year planning period with the final plan to be voted on by Congress.

District 8 began planning for a bigger campaign to win Congressional approval.  However, their efforts were dealt a major blow when rival supporters of the Pick and Sloan plans settled their differences and coalesced around a compromise plan.  Congress quickly approved the Pick-Sloan Flood Control Act in late December 1944, but, giving MVA supporters some hope that they could still prevail, Senator Murray succeeded in removing the act’s anti-MVA provisions.

District 8 leaders persuaded their national union to assign staff to help them establish a St. Louis committee, a nine-state committee, and a national committee to support the MVA. The St. Louis committee was formed in January 1945 with a diverse community-based steering committee.  Its strong outreach effort was remarkably successful, even winning support from the St. Louis Chamber of Commerce.  Feurer provides a good picture of the breadth and success of the effort:

By early 1945, other city-based committees were organizing in the nine-state region. A new national CIO committee for an MVA laid plans for “reaching every CIO member in the nine-state region on the importance of regionally administered MVA.”  In addition, other state CIO federations pledged to organize for an MVA and to disseminate material on the MVA through local unions to individual members.  Further the seeds planted in 1944 among AFL unions were beginning to develop into a real coalition.  In Kansas City, the AFL was “circulating all the building trades unions in the nine states for support” to establish a nine-state buildings trades MVA committee. Both the AFL and CIO held valley wide conferences on the MVA to promote and organize for it.

Murray submitted a new bill in February 1945, which included new measures on soil conservation and the protection of wild game, water conservation, and forest renewal. It also gave the MVA responsibility for the “disposal of war and defense factories to encourage industrial and business expansion.”

But the political tide had turned.  The economy was in expansion, the Democratic Party was moving rightward, and powerful forces were promoting a growing fear of communism.  Murray’s new bill was shunted to a hostile committee and big business mounted an unrelenting and successful campaign to kill it, arguing that the MVA would establish an undemocratic “super-government,” was a step toward “state socialism,” and was now unnecessary given passage of the Pick-Sloan Flood Control Act.

Drawing lessons

A careful study of District 8’s efforts, especially its campaign for an MVA, can help us think more creatively and effectively about how to build a labor-community coalition in support of a Green New Deal.  In terms of policy, there are many reasons to consider following District 8 in advocating for regionally based public entities empowered to plan and direct economic activity as a way to begin the national process of transformation.  For example, many of the consequences of climate change are experienced differently depending on region, which makes it far more effective to plan regional responses.  And many of the energy and natural resources that need to be managed during a period of transformation are shared by neighboring states.  Moreover, state governments, unions, and community groups are more likely to have established relations with their regional counterparts, making conversation and coordination easier to achieve.  Also, regionally organized action would make it much harder for corporations to use inter-state competition to weaken initiatives.

Jonathan Kissam, UE’s Communication Director and editor of the UE News, advocates just such an approach:

UE District 8’s Missouri Valley Authority proposal could easily be revived and modernized, and combined with elements of the British proposal for a National Climate Service. A network of regional Just Transition Authorities, publicly owned and accountable to communities and workers, could be set up to address the specific carbon-reduction and employment needs of different regions of the country.

The political lessons are perhaps the most important.  District 8’s success in building significant labor-community alliances around innovative plans for war conversion and then peacetime reconversion highlights the pivotal role unions can, or perhaps must, play in a progressive transformation process.  Underpinning this success was District 8’s commitment to sustained internal organizing and engagement with community partners.  Union members embraced the campaigns because they could see how a planned transformation of regional economic activity was the only way to secure meaningful improvements in workplace conditions, and such a transformation could only be won in alliance with the broader community.  And community allies, and eventually even political leaders, were drawn to the campaigns because they recognized that joining with organized labor gave them the best chance to win structural changes that also benefited them.

We face enormous challenges in attempting to build a similar kind of working class-anchored movement for a Green New Deal-inspired economic transformation.  Among them: weakened unions; popular distrust of the effectiveness of public planning and production; and weak ties between labor, environmental, and other community groups.  Overcoming these challenges will require our own sustained conversations and organizing to strengthen the capacities of, and connections between, our organizations and to develop a shared and grounded vision of a Green New Deal, one that can unite and empower the broader movement for change we so desperately need.

The Green New Deal and the State: Lessons from World War II—Part I

There is growing interest in a Green New Deal, but far too little discussion among supporters about the challenging nature of the required economic transformation, the necessary role of public planning and ownership in shaping it, or the strategies necessary to institutionalize a strong worker-community voice in the process and final outcome.  In this two-part series I draw on the experience of World War II, when the state was forced to direct a rapid transformation from civilian to military production, to help encourage and concretize that discussion.

In this post, I first discuss the need for a rapid Green New Deal-inspired transformation and the value of studying the US experience during World War II to help us achieve it.  Next, I examine the evolution, challenges, and central role of state planning in the wartime conversion of the US economy to alert us to the kind of state agencies and capacities  we will need to develop. Finally, I highlight two problematic aspects of the wartime conversion and postwar reconversion which must be avoided if we hope to ensure a conversion to a more democratic and solidaristic economy.

In the post to follow, I will highlight the efforts of labor activists to democratize the process of transformation during the war period in order to sharpen our thinking about how best to organize a labor-community mass movement for a Green New Deal.

The challenge of transformation

We are already experiencing a climate crisis, marked by extreme weather conditions, droughts, floods, warming oceans, rising sea levels, fires, ocean acidification, and soil deterioration.  The Special Report on Global Warming of 1.5°C by the Intergovernmental Panel on Climate Change underscores the importance of limiting the increase in the global mean temperature to 1.5 degrees Celsius above pre-industrial levels by 2100 if we are to avoid ever worsening climate disasters and “global scale degradation and loss of ecosystems and biodiversity.” The report makes clear that achieving this goal requires reducing global net carbon dioxide emissions by 45 percent from 2010 levels by 2030 and then reaching net zero emissions by 2050.

Tragically, despite the seriousness of the crisis, we are on track for a far higher global mean temperature.  Even big business is aware of what is at stake. Two researchers employed by JP Morgan, the world’s largest financier of fossil fuels, recently published an internal study that warns of the dangers of climate inaction. According to the Guardian, which obtained a copy of the report, “the authors say policymakers need to change direction because a business-as-usual climate policy ‘would likely push the earth to a place that we haven’t seen for many millions of years,’ with outcomes that might be impossible to reverse.”

It is easy to see why growing numbers of people are attracted to the idea of a Green New Deal.  The Green New Deal promises a rapid and dramatic curtailing of fossil fuel use as part of a broader transformation to a more sustainable, egalitarian, and socially responsive economy.  Such a transformation will, by necessity, involve massive new investments to promote the production and distribution of clean renewable energy, expand energy efficient public transit systems, support regenerative agriculture, and retrofit existing homes, offices, and factories.  The Green New Deal also promises new, publicly funded programs designed to ensure well-paid and secure employment for all; high-quality universal health care; affordable, safe public housing; clean air; and healthy and affordable food.

Unfortunately, the proposed Green New Deal investments and programs, as attractive and as needed as they may be, are unlikely on their own to achieve the required reduction in carbon emissions.  It is true that many Green New Deal investments and programs can be expected to lower overall energy demand, thereby making it easier for rapidly growing supplies of clean energy to support economic activity.  But even though renewable energy production is growing rapidly in the US, it still accounts for less than 15 percent of total US energy consumption and less than 20 percent of electricity generation.  And based on the experience of other countries, increasing the production of renewable energy does not, by itself, guarantee a significant decline in the production and use of fossil fuels, especially when they remain relatively cheap and plentiful.

Rapid decarbonization will also require direct government action to force down the production of fossil fuels and make their use prohibitively expensive.  And this action will have significant consequences.  For example, limiting fossil fuel production will leave fossil fuel companies with enormous unused and therefore worthless assets.  Raising the price of fossil fuels will sharply increase the cost of flying, with negative consequences for the large manufacturers of airplanes and their subcontractors.  It will also increase the cost of gasoline, with negative consequences for automobile companies that produce gas guzzling cars.  Other major industries will also be affected, for example, the home building industry that specializes in large suburban homes, and the financial sector that has extended loans to firms in all these industries.

Thus, any serious attempt to rapidly force down fossil fuel use can be expected to negatively affect important sectors of the economy. Proposed Green New Deal investments and social policy initiatives will lay the foundation for a new economy, helping to boost employment and absorb some of the newly created excess capacity, but given the need for a speedy transformation to head off climate catastrophe, the process, if left unplanned, could easily end up dragging the economy down.

As difficult as this process appears, we do have historical experience to draw upon that can help us prepare for some of the challenges we can expect to face: the experience of World War II, when the US government was forced to initiate a rapid transformation of the US economy from civilian to military production.  New planning bodies were created to direct resources away from civilian use, retrain workers, encourage retooling of parts of the civilian economy to produce military goods and services, and direct massive investments to build new facilities to expand production or produce new goods needed for the war effort.  While far from a model to be recreated, advocates of a Green New Deal can learn much from studying the US war-time experience.

World War II planning

The shift to a war economy began gradually in 1939, some two years before the US actually entered the war. In June 1939, the Congress passed the Strategic and Critical Materials Stockpiling Act, which called for establishing reserves of strategic materials necessary for defense.  In August 1939, President Roosevelt established the War Resources Board to help the Joint Army and Navy Munitions Board develop plans for mobilizing the economic resources of the country in the event of war.

In June 1940, a National Roster of Scientific and Specialized Personnel was created.  In August 1940, the Defense Plant Corporation was created and charged with planning how to expand the nation’s ability to produce military equipment.  And in September 1940, the Congress approved the Selective Training and Service Act of 1940, which required all men between the ages of 21 and 45 to register for the draft.

In January 1941, President Roosevelt created the Office of Production Management to centralize all federal procurement programs concerned with the country’s preparation for war.  Shortly after the US entered the war, this office was replaced by the War Production Board (WPB), which was tasked with directing the conversion of industries from civilian to military work; the allocation of scarce materials; and the establishment of priorities for the distribution of goods and services, including those to be rationed.

The conversion to a war economy, and the end of the depression, roughly dates to the second half of 1941, when defense spending sharply accelerated.  Federal spending on goods and services for national defense rose from 2.2 percent of GNP in 1940 to 11 percent of GNP in 1941. This was the last year that military-generated activity was compatible with growing civilian production. In 1942, military spending soared to 31 percent of GNP.  From then to the end of the war, civilian production was suppressed in order to secure the desired growth in military production.

For example, real consumer durable expenditures reached $24.7 billion (in 1972 dollars), or 6.2 percent of GNP, in 1941.  The following year they fell to $16.3 billion, or 3.6 percent of GNP.  Real personal consumption, which grew by 6.2 percent in 1941, fell absolutely the following year.  Between 1940 and 1944, the total production of non-war goods and services fell from $180 billion to $164 billion (in 1950 dollars).  In contrast, real federal purchases of military commodities grew from $18 billion in 1941 to $88 billion in 1944 (in 1947 dollars), accounting for approximately one-half of all commodities produced that year.

No doubt, the high level of unemployment that existed at the start of the conversion made it easier to ramp up military production—but the military itself soon absorbed a large share of the male working age population.  Moreover, the challenge facing planners was not just that of ramping up production in a depressed economy, but of converting the economy to produce different goods, often in new locations.  This required the recruitment, training, and placement of millions of workers in accordance with ever changing industrial, occupational, and geographic requirements.

In the period of preparation for war, perhaps the biggest challenge was training.  It was largely met thanks to vocational training programs organized by the Employment Division.  These training programs made use of ongoing New Deal programs such as the Civilian Conservation Corps, Works Progress Administration, and National Youth Administration; the existing network of schools and colleges; and a Training-Within-Industry program.  Once the war began, the War Manpower Commission continued the effort.  Altogether, some 7 million people went through training programs, almost half through Training-Within-Industry programs.

The hard shift from a civilian-driven economy into a military-dominated one was, to a large degree, forced on the government by corporate concerns over future profitability.  In brief, most large corporations were reluctant to expand their productive capacity for fear that doing so would leave them vulnerable to a post-war collapse in demand and depression.  Among the most resistant were leading firms in the following industries: automobile, steel, oil, electric power, and railroads.  At the same time, these firms also opposed the establishment of government-owned enterprises; they feared such enterprises might become post-war competitors or, even worse, encourage popular interest in socialism.

Unwilling to challenge business leaders, the government took the path of least resistance—it agreed to support businesses’ efforts to convert their plant and equipment from civilian to military production; offer businesses engaged in defense work cost-plus contracting; and suppress worker wages and their right to strike.  And, if the government did find it necessary to invest and establish new firms to produce critical goods, it agreed to allow private businesses to run them, with the option to purchase the new plant and its equipment at a discounted price at the war’s conclusion.  As a consequence, big business did quite well during the war and was well positioned to be highly profitable in the years following the end of the war.

Business reluctance to invest in expanding capacity, including in industries vital to the military, meant that the government had to develop a number of powerful new planning bodies to ensure that the limited output was allocated correctly and efficiently across the military-industrial supply chain.  For example, raw steel production grew only 8 percent from 1941 to the 1944 wartime peak.  Crude petroleum refining capacity grew only 12 percent between 1941 and 1945.  Leading firms in the auto industry were also reluctant to give up sales or engage in conversion to military production, initially claiming that no more than 15 percent of their machine tools were convertible.  But, once the war started and US planners regulated steel use, giving priority to military production, the auto industry did retool and produce a range of important military goods, including tanks, jeeps, trucks, and parts and subassemblies for the aircraft industry, including engines and propellers.

In many cases, corporate foot-dragging forced the government to establish its own production. Thus, while steel ingot capacity expanded by a modest 17 percent from 1940 to 1945, almost half of that increase came from government-owned firms.  The role of government production was probably greatest in the case of synthetic rubber.  The US had relied on imports for some 90 percent of its supply of natural rubber, mostly from countries that fell under Japanese control.  Desperate for synthetic rubber to maintain critical civilian and military production, the government pursued a massive facility construction program. Almost all of the new capacity was financed and owned by the government and then leased to private operators for $1 per year. Thanks to this effort, synthetic rubber output rose from 22,434 long tons in 1942 to 753,111 long tons in 1944.  The Defense Plant Corporation ended up financing and owning approximately one-third of all the plant and equipment built during the war.

The War Production Board, created by presidential executive order in January 1942, was the country’s first major wartime planning agency.   Roosevelt chose Donald M. Nelson, a Sears Roebuck executive, to be its chairperson. Other members of the board were the Secretaries of War, Navy, and Agriculture, the lieutenant general in charge of War Department procurement, the director of the Office of Price Administration, the Federal Loan Administrator, the chair of the Board of Economic Warfare, and the special assistant to the President for the defense aid program.

The WPB managed twelve regional offices, and operated some one hundred twenty field offices throughout the country.  Their work was supported by state-level war production boards, which were responsible for keeping records on the firms engaged in war production in their respective states, including whether they operated under government contract.

However, despite its vast information gathering network, the WPB was never able to take command of the conversion of the economy. To some extent that was because Nelson proved to be a weak leader. But a more important reason was that the WPB had to contend with a number of other powerful agencies that were each authorized to direct the output of a specific critical industry.  The result was a kind of free-for-all when it came to developing and implementing a unified plan.

Perhaps the most powerful independent agency was the Army-Navy Munitions Board.  And early on, the WPB ceded its authority over the awarding of military contracts to it. The Army and Navy awarded more contracts than could be fulfilled, creating problems in the supply chain as firms competed to obtain needed materials.  Turf fights among government agencies led to other problems. For example, the Office of Defense Transportation and the Petroleum Administration for War battled over who could decide petroleum requirements for transportation services.  And the Office of Price Administration fought the Solid Fuels Administration over who would control the rationing of coal.

A Bureau of the Budget history of the period captures some of the early chaos:

Locomotive plants went into tank production when locomotives were more necessary than tanks . . . Truck plants began to produce airplanes, a change that caused shortages of trucks later on . . . Merchant ships took steel from the Navy, and the landing craft cut into both. The Navy took aluminum from aircraft.  Rubber took valves from escort vessels, from petroleum, from the Navy.  The pipe-lines took steel from ships, new tools, and the railroads. And at every turn there were foreign demands to be met as well as requirements for new plants.

In response to the chaos, Roosevelt established another super-agency in May 1943, the Office of War Mobilization (OWM).  This agency, headed by James F. Byrnes, a former senator and Supreme Court justice, was given authority over the WPB and the other agencies.  In fact, Byrnes’ authority was so great that he was often called the “assistant President.”

The OWM succeeded in installing a rigorous system of materials control and bringing order to the planning process.  As a result, civilian production was efficiently suppressed and military production steadily increased.  Over the period 1941 to 1945, the US was responsible for roughly 40 percent of the world’s production of weapons and supplies, with little increase in the nation’s capital stock.

The experience highlighted above shows the effectiveness of planning and demonstrates that a contemporary economic conversion based on Green New Deal priorities, in which fossil fuel-dependent industries are suppressed in favor of more sustainable economic activity, can be achieved.  It also shows that a successful transformation will require the creation of an integrated, multi-level system of planning, and that the process of transformation can be expected to generate challenges that will need to be handled with flexibility and patience.

World War II planning: cautionary lessons

The war-time conversion experience also holds two important cautionary lessons for a Green New Deal-inspired economic transformation.  The first is the need to remain vigilant against the expected attempt by big business to use the planning process to strengthen its hold on the economy.  If we are to achieve our goal of creating a sustainable, egalitarian, and solidaristic economy, we must ensure a dominant and ongoing role for public planning of economic activity and an expansive policy of public ownership, both taking over firms that prove resistant to the transformation and retaining ownership of newly created firms.

Unfortunately, the federal government was all too willing to allow big corporations to dominate the war-time conversion process as well as the peacetime reconversion, thereby helping them boost their profits and solidify their post-war economic domination.  For example, the Army and Navy routinely awarded their defense contracts to a very few large companies.  And these companies often chose other big companies as their prime subcontractors.  Small and medium-sized firms also struggled to maintain their production of civilian goods because planning agencies often denied them access to needed materials.

Harold G. Vatter highlights the contract preference given to big firms during the war, noting that:

of $175 billion of primary contracts awarded between June 1940 and September 1944, over one-half went to the top 33 corporations (with size measured by value of primary supply contracts received). The smallest 94 percent of prime supply contract corporations (contracts of $9 million or less) got 10 percent of the value of all prime contracts in that period.

The same big firms disproportionally benefited from the reconversion process.  In October 1944, the OWM was converted into the Office of War Mobilization and Reconversion (OWMR), with Byrnes remaining as head.  The OWMR embraced its new role and moved quickly to achieve the reconversion of the economy.  It overcame opposition from the large military contractors, who were reluctant to give up their lucrative business, by granting them early authorization to begin production of civilian goods, thereby helping them dominate the emerging consumer markets.

The OWMR was also generous in its post-war distribution of government assets. The government, at war’s end, owned approximately $17 billion of plant and equipment.  These holdings, concentrated in the chemical, steel, aluminum, copper, shipbuilding, and aircraft industries, were estimated to be about 15 percent of the country’s total postwar plant capacity.  The government also owned “surplus” war property estimated to be worth somewhere between $50 billion and $70 billion.

Because of the way government wartime investment had been structured, there was little question about who would get the lion’s share of these public assets.  Most government-owned plants were financed under terms specifying that the private firms operating them would be given the right to purchase them at war’s end if desired.  Thus, according to one specialist, roughly two-thirds of the $17 billion of government plant and equipment was sold to 87 large firms.  The “bulk of copolymer synthetic rubber plants went to the Big Four in rubber; large chemical plants were sold to the leading oil companies, and U.S. Steel received 71 percent of government-built integrated steel plants.”

The second cautionary lesson is the need to resist efforts by the government, justified in the name of efficiency, to minimize the role of unions, and working people more generally, in the planning and organization of the economic conversion. The only way to guarantee that a Green New Deal-inspired transformation will create an economy responsive to the needs of working people and their communities is to create institutional arrangements that secure popular participation in decision-making at all levels of economic activity.

While organized labor had at least an advisory role in prewar planning agencies, once the war began, it was quickly marginalized, and its repeated calls for more participation rejected.  For example, Sidney Hillman (head of the Amalgamated Clothing Workers) was appointed to be one of two chairs of the Office of Production Management, which was established in January 1941 to oversee federal efforts at national war preparation. The other was William S. Knudsen (president of General Motors).  The OPM also included a Labor Bureau, also led by Hillman, which was to advise it on labor recruitment, training, and mobilization issues, as well as Labor Advisory Committees attached to the various commodity and industry branches that reported to the OPM.

The labor presence was dropped from the War Production Board, which replaced the OPM in January 1942; Roosevelt appointed a businessman, Donald M. Nelson, to be its sole chair.  Hillman was appointed director of the board’s Labor Division, but that division was soon eliminated and its responsibilities transferred to the newly created War Manpower Commission in April 1942.

More generally, as organized labor found itself increasingly removed from key planning bodies, workers found themselves increasingly asked to accept growing sacrifices.  Prices began rising in 1940 and 1941 as the economy slowly recovered from the depression and began its transformation to war production.  In response, workers pushed for significant wage increases which the government, concerned about inflation, generally opposed.  In 1940, there were 2500 strikes producing 6.7 million labor-days idle.  The following year there were 4300 strikes with 23.1 million labor-days idle.

Hillman called for a national policy of real wage maintenance based on inflation indexing that would also allow the greatest wage gains to go to those who earned the least, but the government took no action.  As war mobilization continued, the government sought a number of concessions from the unions.  For example, it wanted workers to sacrifice their job rights, such as seniority, when they were transferred from nondefense to defense work.  Union leaders refused.  Union leaders also demanded, unsuccessfully, that military contracts not be given to firms found to violate labor laws.

Worried about disruptions to war production, Roosevelt established the War Labor Board by executive order in January 1942.  The board was given responsibility for stabilizing wages and resolving disputes between workers and managers at companies considered vital to the war effort. The board’s hard stand on wage increases was set in July, when it developed its so-called “Little Steel Formula.” Ruling in a case involving the United Steelworkers and the four so-called “Little Steel” companies, the board decided that although steelworkers deserved a raise, it had to be limited to the amount that would restore their real earnings to their prewar level, which it set as of January 1, 1941.  Adding insult to injury, the board relied on a faulty price index that underestimated the true rate of inflation since the beginning of 1941.
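In schematic terms (my own hedged rendering of the indexing logic described above, not the board’s published text), the ceiling amounted to scaling the January 1, 1941 wage by the measured rise in the cost of living since that date:

\[ W_{\text{max}} = W_{\text{Jan 1, 1941}} \times \frac{C_{\text{ruling date}}}{C_{\text{Jan 1, 1941}}}, \]

where C is the board’s cost-of-living index. With the index the board used showing an increase of roughly 15 percent, a worker earning $1.00 an hour in January 1941 could be awarded at most about $1.15, however much prices had actually risen.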

Thus, while corporations were able to pursue higher profits, workers would have to postpone their “quest for an increasing share of the national income.”  Several months later, Roosevelt instructed the War Labor Board to use a similar formula, although with a different baseline, in all its future rulings.  Not surprisingly, the number of strikes continued to rise throughout the war years despite a December 1941 pledge by AFL and CIO leaders not to call strikes for the duration of the war.

In June 1943, with strikes continuing, especially in the coal fields, Congress passed the War Labor Disputes Act.  The act gave the president the power to seize and operate privately owned plants when an actual or threatened strike interfered with war production. Subsequent strikes in plants seized by the government were prohibited. The act was invoked more than 60 times during the war. The act also included a clause that made it illegal for unions to contribute to candidates for office in national elections, clearly an attempt to weaken labor’s political influence.

Although wage struggles drew most attention, union demands were far more expansive.  As Vatter describes:

Organized labor wanted wartime representation and participation in production decision-making at all levels, not merely the meaningless advisory role allotted to it during the preparedness period. But from the outset, management maintained a chronic hostile stance on the ground that management-labor industry councils such as proposed by Walter Reuther and CIO President Philip Murray in 1940 would, under cover of patriotism, undermine management’s prerogatives and inaugurate a postwar “sovietization” of American industry.

Unions often pointed to the chaos of early planning, as captured by the Budget Bureau history, arguing that their active participation in production decisions would greatly improve overall efficiency.  The government’s lack of seriousness about union involvement is best illustrated by the WPB’s March 1942 decision to establish a special War Production Drive Division that was supposed to encourage the voluntary creation of labor-management plant committees.  However, the committees were only allowed to address specific physical production problems, not broader labor-management issues or production coordination across firms. Most large firms didn’t even bother creating committees.

Significantly, there was only one time that the government encouraged and supported popular participation in wartime decision-making, and that effort proved a great success.  Inflation was a constant concern of the government throughout the war years, largely because it was a trigger for strikes that threatened wartime production.  The Office of Price Administration tried a variety of voluntary and bureaucratic controls to limit price increases on consumer goods and services, especially food, with little success.  Finally, beginning in mid-1943, and over the strong opposition of business, it welcomed popular participation in the operation of its price control system.

Tens of thousands of volunteers were formally authorized to visit retail locations throughout the country to monitor business compliance with the controls, and tens of thousands more were chosen to serve on price boards empowered to fine retailers found in violation of the controls.  As a result, prices remained relatively stable from mid-1943 until early 1946, when the government abruptly ended the system of controls.  This was an incredible achievement considering that the production of civilian goods and services declined over those years, while consumer purchasing power and the money supply rose.


Part II, on organizing and movement building lessons.


Health check: US manufacturing is in trouble

President Trump is all in, touting his success in rebuilding US manufacturing.  For example, in his State of the Union address he claimed:

We are restoring our nation’s manufacturing might, even though predictions were that this could never be done. After losing 60,000 factories under the previous two administrations, America has now gained 12,000 new factories under my administration.

Apparently, it is a family achievement.  Joe Ragazzo, writing at TPM Café, reports:

In a towering act of sycophantry, the National Association of Manufacturers announced Friday that it will be giving Ivanka Trump the organization’s first ever Alexander Hamilton Award for “extraordinary support of manufacturing in America.” The organization made the outrageous claim that “no one”  — no one! — has ever “provided singular leadership and shown an unwavering commitment to modern manufacturing in America” like she has.

Unfortunately, US manufacturing is far from healthy.

Production

The reality is that manufacturing output has been relatively flat over the last two decades, thanks in large part to the globalization of US production.  In 2019, despite the growth of the overall economy, the manufacturing sector actually fell into recession.  In contrast to Trump’s claim, manufacturing output fell 1.3 percent over the year.

The figure below shows real manufacturing output indexed to 2012.  We see slow but steady growth from 2000 to 2007, relatively flat growth from 2010 to 2018, and then a decline in real output in 2019.

The manufacturing sector’s woes in 2019 are on display in the following figure.

Investment

In line with these trends, as we can see below, real investment in new structures by manufacturing firms has also been falling.  Real investment fell from $73.7 billion in 2015 to $56.4 billion in 2019, a decline of roughly 23 percent.

Productivity

Manufacturing productivity is at a standstill.  The following figure shows manufacturing output per worker hour, indexed to 2012.  As we can see, it was largely unchanged from 2010 to 2014.  Since then it has trended downward.

Employment

The employment story is even worse.  As the next figure shows, manufacturing employment, in millions of jobs, took a nose dive beginning in the late 1990s and has yet to make a meaningful recovery.

Some 5 million manufacturing jobs have been lost since the late 1990s, along with nearly 90,000 US factories.  And in line with manufacturing’s current recession, manufacturing employment, as we see below, is again falling.

Earnings

Equally alarming, the average hourly earnings of production workers employed in manufacturing have now fallen below the average hourly earnings of private sector production and nonsupervisory workers.  Thus, even if US policy were to succeed in bringing back manufacturing jobs or spurring the creation of new ones, they would likely not be the living-wage jobs of the past.

As the Monthly Labor Review explains:

Although manufacturing industries had a reputation for stable, well-paying jobs for much of the 20th century, shifts within the industry in the last several decades have considerably altered that picture. Since 1990, average hourly earnings trends in the various manufacturing industries have been disparate, with a few industries showing strong growth but many others showing growth rates that are lower than those of the total private sector. In fact, average hourly earnings of production and nonsupervisory workers in the total private sector have recently surpassed those of their counterparts in the relatively high-paying durable goods portion of manufacturing.

As we can see in the following figure, in 1990 average hourly earnings of production workers in manufacturing were greater than those of production and nonsupervisory workers in the total private sector (by about 6 percent).  However, by 2007, average hourly earnings in the private sector had surpassed average hourly earnings in manufacturing.  And by 2015, average hourly earnings in the private sector surpassed those of manufacturing workers in the more highly compensated durable goods sector.  In 2018, the average hourly earnings of private sector production and nonsupervisory workers were approximately 5 percent greater than those of their manufacturing counterparts.

Workers in the auto industry have taken an especially big hit.

Trade

The overall trade deficit, which reflects the combined balances on trade in goods and services, slightly improved in 2019—falling by 1.7 percent or $10.9 billion.  So did the deficit in the trade of goods, falling by 2.4 percent or $21.4 billion. However, those improvements were mostly driven by the rapid growth in US exports of petroleum products.  The trade deficit in non-oil goods, mostly manufactures, actually increased by 1.8 percent in 2019, as can be seen in the following figure.

As Robert E. Scott remarks,

The small decline in overall US trade deficits follows an 18.3 percent increase in the goods trade deficit in the first two years of the Trump administration. Taken altogether, the US goods trade deficit increased $116.2 billion (15.5 percent) in the first three years of the Trump Administration. . . .

Meanwhile, the deficit in non-oil goods trade has nearly tripled since 2000, rising from $317.2 billion in 2000 to $852.3 billion in 2019, an all-time high. For the past two years, the non-oil goods trade deficit also reached record territory as a share of GDP, reaching or exceeding 4.0% of GDP. This growing U.S. trade deficit in non-oil goods is largely responsible for the loss of 5 million U.S. manufacturing jobs since 1998.
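A rough consistency check (simple arithmetic on the figures cited above, not additional data): a $10.9 billion decline equal to 1.7 percent implies a 2018 overall deficit of roughly $640 billion; a $21.4 billion decline equal to 2.4 percent implies a 2018 goods deficit of roughly $890 billion; and the 1.8 percent rise in the non-oil goods deficit, given the $852.3 billion 2019 level Scott cites, works out to an increase of roughly $15 billion.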


As we can see, talk of a manufacturing renaissance is nonsense.  And there is no reason, based on the Trump administration’s economic policies, to expect one.

Climate Change, The Green New Deal, and the Struggle for Climate Justice

Most calls for a Green New Deal correctly emphasize that it must include a meaningful commitment to climate justice.  That is because climate change—for reasons of racism and capitalist profit-making—disproportionately punishes frontline communities, especially low-income communities and communities of color.

A study published in 2020 on redlining (“the historical practice of refusing home loans or insurance to whole neighborhoods based on a racially motivated perception of safety for investment”) and urban heat islands helps shed light on the process.  The authors of the study, Jeremy S. Hoffman, Vivek Shandas, and Nicholas Pendleton, examined temperature patterns in 108 US urban areas and found that 94 percent of them displayed “consistent city-scale patterns of elevated land surface temperatures in formerly redlined areas relative to their non-redlined neighbors by as much as 7 degrees Celsius (or 13 degrees Fahrenheit).”

As one of the authors explained in an interview:

“We found that those urban neighborhoods that were denied municipal services and support for home ownership during the mid-20th century now contain the hottest areas in almost every one of the 108 cities we studied,” Shandas said. “Our concern is that this systemic pattern suggests a woefully negligent planning system that hyper-privileged richer and whiter communities. As climate change brings hotter, more frequent and longer heat waves, the same historically underserved neighborhoods — often where lower-income households and communities of color still live — will, as a result, face the greatest impact.”

Urban heat islands

Climate scientists have long been aware of the existence of urban heat islands, localized areas of excessive land surface heat.  The urban heat island effect can cause temperatures to vary by as much as 10 degrees C within a single urban area.  As heat extremes become more common, and last longer, the number of associated illnesses and even deaths can be expected to rise.  Already, as Hoffman, Shandas, and Pendleton note,

extreme heat is the leading cause of summertime morbidity and has specific impacts on those communities with pre-existing health conditions (e.g., chronic obstructive pulmonary disease, asthma, cardiovascular disease, etc.), limited access to resources, and the elderly. Excess heat limits the human body’s ability to regulate its internal temperature, which can result in increased cases of heat cramps, heat exhaustion, and heatstroke and may exacerbate other nervous system, respiratory, cardiovascular, genitourinary, and diabetes-related conditions.

Studies have identified some clear causes of urban heat extremes.  One is the density of impervious surface area: the greater the density, the hotter the land surface temperature.  The other is the tree canopy: the greater the canopy, the cooler the land surface temperature.  And as the three authors observe, “emerging research suggests that many of the hottest urban areas also tend to be inhabited by resource-limited residents and communities of color, underscoring the emerging lens of environmental justice as it relates to urban climate change and adaptation.” What their study helps us understand is that the process by which communities of color and the poor came to live in areas with more impervious surface area and fewer green spaces was to a large degree the “result of racism and market forces.”

Racism and redlining

Racism in housing has a long history.  Kale Williams, writing in the Oregonian newspaper, highlights the Portland, Oregon history:

Exclusionary covenants, legal clauses written into property deeds, prohibited people of certain races, specifically African Americans and people of Asian descent, from purchasing homes. In 1919, the Portland Realty Board adopted a rule declaring it unethical to sell a home in a white neighborhood to an African American or Chinese person. The rules stayed in place until 1956.

In 1924, Portland voters approved the city’s first zoning policies. More than a dozen upscale neighborhoods were zoned for single-family homes. The policy, pushed by homeowners under the guise of protecting their property values, kept apartment buildings and multi-family homes, housing options more attainable for low-income residents, in less-desirable areas.

Portland was no isolated case; racism shaped national housing policy as well.  In 1933, Congress, as part of the New Deal, passed the Home Owners’ Loan Act, which established the Home Owners’ Loan Corporation (HOLC).  The purpose of the HOLC was to help homeowners refinance mortgages in default, both to prevent foreclosure and, of course, to reduce stress on the financial system. It did this by issuing bonds, using the funds to purchase housing loans from lenders, and then refinancing the original mortgages on terms easier for homeowners.

Between 1935 and 1940, the HOLC drew residential “security” maps for 239 cities across the United States.  These maps were made to assess the long-term value of real estate then owned by the federal government and the health of the banking industry. They were based on input from local appraisers, neighborhood surveys, and neighborhood demographics.

As Hoffman, Shandas, and Pendleton describe, the HOLC:

created color-coded residential maps of 239 individual US cities with populations over 40,000. HOLC maps distinguished neighborhoods that were considered “best” and “hazardous” for real estate investments (largely based on racial makeup), the latter of which was outlined in red, leading to the term “redlining.” These “Residential Security” maps reflect one of four categories ranging from “Best” (A, outlined in green), “Still Desirable” (B, outlined in blue), “Definitely Declining” (C, outlined in yellow), to “Hazardous” (D, outlined in red).

This identification of problem neighborhoods with the racial makeup of the neighborhood was no accident.  And because the maps were widely distributed to other government bodies and private financial institutions, they served to guide private mortgage lending as well as government urban planning in the years that followed.  Areas outlined in red were almost always majority African-American.  And as a consequence of the rating system, those who lived in them had more difficulty getting home loans or upgrading their existing homes. Redlined neighborhoods were also targeted as prime locations for development of multi-unit buildings, industrial use, and freeway construction.

As expected, a 2019 paper by three researchers at the Federal Reserve Bank of Chicago found:

a significant and persistent causal effect of the HOLC maps on the racial composition and housing development of urban neighborhoods. These patterns are consistent with the hypothesis that the maps led to reduced credit access and higher borrowing costs which, in turn, contributed to disinvestment in poor urban American neighborhoods with long-run repercussions.

What Hoffman, Shandas, and Pendleton establish in their paper is that this racially influenced mapping has also had real climate consequences.  Urban heat islands are not just randomly distributed through an urban area—they are more often than not located in redlined areas.  And those extra degrees of heat have real health and financial consequences. As Hoffman explains, the impact on residents of those heat islands is serious and wide-ranging:

“They are not only experiencing hotter heat waves with their associated health risks but also potentially suffering from higher energy bills, limited access to green spaces that alleviate stress and limited economic mobility at the same time,” Hoffman said. “Our study is just the first step in identifying a roadmap toward equitable climate resilience by addressing these systemic patterns in our cities.”

Redlining and climate change

Hoffman, Shandas, and Pendleton condensed the 239 HOLC maps into a database of 108 US cities.  They excluded cities that were not mapped with all four HOLC security-rating categories, and in some cases they had to remove or merge overlapping security-rating boundaries that had been drawn in different years.  The map below shows the location of the 108 cities.

They then used land surface temperature (LST) maps generated during the summer months of 2014 through 2017 to estimate land surface temperatures in each of the four color-coded neighborhood categories in each of the 108 cities, allowing them to determine whether there was a relationship between LST and neighborhood rating within each city.
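To make the mechanics concrete, here is a minimal sketch (not the authors’ code) of how such a city-by-city comparison could be computed in Python, assuming a table of neighborhood-level mean summer land surface temperatures already joined to HOLC grades; the column names and toy values are hypothetical.

import pandas as pd

def d_minus_a_by_city(df: pd.DataFrame) -> pd.Series:
    """For each city, return the mean LST of D-rated areas minus that of A-rated areas."""
    grade_means = (
        df.groupby(["city", "holc_grade"])["mean_lst_c"]
          .mean()
          .unstack("holc_grade")   # one column per HOLC grade (A through D)
    )
    return grade_means["D"] - grade_means["A"]

# Toy example: positive values mean formerly redlined (D) areas run hotter than A areas.
toy = pd.DataFrame({
    "city": ["Portland", "Portland", "Denver", "Denver"],
    "holc_grade": ["A", "D", "A", "D"],
    "mean_lst_c": [28.0, 35.0, 30.5, 37.2],
})
print(d_minus_a_by_city(toy))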

They found that present-day temperatures were noticeably higher in D-rated areas relative to A-rated areas in approximately 94 percent of the 108 cities.  The results are illustrated below. Figure a shows the LST difference between ranked neighborhoods for the country as a whole.  The four other figures do the same for each designated region of the country.

Portland, Oregon and Denver, Colorado had the greatest D to A temperature differences, with their D-rated areas some 7 degrees Celsius warmer than their A-rated areas (or some 13 degrees warmer in Fahrenheit).  For the nation as a whole, D-rated areas are now on average 2.6 degrees Celsius warmer than A-rated areas. Thus, as the authors note, “current maps of intra-urban heat echo the legacy of past planning policies.”   Moreover,

indicators of and/or higher intra-urban LSTs have been shown to correlate with higher summertime energy use, and excess mortality and morbidity. The fact that residents living in formerly redlined areas may face higher financial burdens due to higher energy and more frequent health bills further exacerbates the long-term and historical inequities of present and future climate change.

As this study so clearly shows, we are not all in the same boat when it comes to climate change; racial and class dimensions matter.  The poor and people of color suffer disproportionately from global warming, largely because of the way racism and profit-making combined to shape urbanization in the United States.  But this is only one example.  A transformative Green New Deal must bring to light the ways this dynamic has shaped countless other processes, and it must embrace and support the economic and climate struggles of frontline communities.

When it comes to pay, US business leaders are world champs

US CEOs not only draw the highest salaries in the world (including bonuses, equity awards, and the like), they are also kings of the hill when it comes to lording it over their employees, as illustrated by the high ratio of CEO to worker earnings.

And this record-breaking performance is no one-off.  The share of net wealth held by the top 0.1 percent has been steadily climbing and now rivals that of the bottom 90 percent.

Who cares that wages stagnate, life expectancy falls, economic insecurity grows, social services are gutted in favor of militarism, and climate-generated crises multiply?  Not those at the top, who are doing just fine.

Another sign of the deepening social crisis: The decline in US life expectancy

US life expectancy is on the decline, falling from 2014 to 2017—the first years of decline in life expectancy in over twenty years.  And according to Steven H. Woolf and Heidi Schoomaker, authors of the recently published “Life Expectancy and Mortality Rates in the United States, 1959-2017” in the Journal of the American Medical Association, “A major contributor has been an increase in mortality from specific causes (e.g., drug overdoses, suicides, organ system diseases) among young and middle-aged adults of all racial groups, with an onset as early as the 1990s and with the largest relative increases occurring in the Ohio Valley and New England.”

Declining life expectancy

In 1960, the US had the highest life expectancy of any country in the world.  By 2017 US life expectancy significantly trailed that of other comparable countries, as illustrated below.

In 1980, the difference between average life expectancy in the US and that of comparable countries was not large: 73.7 years versus 74.5 years.  However, as we can see in the next figure, the gap steadily grew over the following years.  The US gained 4.9 years in average life expectancy from 1980 to 2017; comparable countries gained 7.8 years on average.

As researchers for the Kaiser Family Foundation point out, “The U.S. and most comparable countries experienced a slight decline in life expectancy in 2015. By 2016, life expectancy for these comparable countries rebounded to pre-2015 numbers, but in the US, such a bounce back did not occur.”  After reaching 78.9 years in 2014, average life expectancy in the US fell to 78.7 years in both 2015 and 2016, and then dropped again in 2017 to 78.6 years. These declines mark the first decreases in US life expectancy in more than 20 years.

Moreover, this growing gap and outright decline in average life expectancy holds for both US males and females, as we see from the following figure.

The growing social crisis

Woolf and Schoomaker drew upon 50 years’ worth of data from the US Mortality Database and the US Centers for Disease Control and Prevention’s WONDER database in an attempt to explain why US life expectancy has not kept pace with that of other wealthy countries and is now falling.  Their primary finding, as noted above, is that US life expectancy is being dragged down by “an increase in mortality from specific causes (eg, drug overdoses, suicides, organ system diseases) among young and middle-aged adults of all racial groups.”

More specifically, while infant mortality, mortality rates among children and early adolescents (1-14 years of age), and age-adjusted mortality rates among adults 65-84 all declined over the period 1999-2017, individuals aged 25-64 “experienced retrogression” beginning in 2010, as we can see in the following figures taken from their article.  Between 2010 and 2017, these midlife adults experienced a 6 percent total increase in their mortality rate. This increase overwhelmed the gains experienced by the other age cohorts, dragging down overall US average life expectancy.

Woolf and Schoomaker concluded that there were multiple causes for this rise in mortality rates among individuals aged 25-64.  However, they highlighted drug overdoses, alcohol abuse, and suicide as among the most important.  This age cohort experienced a nearly four-fold increase in fatal drug overdoses between 1999 and 2017, and its suicide rate rose by nearly 40 percent over the same period. The rate of alcohol-related disease deaths soared by almost 160 percent for those aged 25-34.

In an interview with Business Insider, Woolf wrestled with the question of why the country is experiencing such a dramatic rise in mortality rates among young and middle-aged adults. “It’s a quandary of why this is happening when we spend so much on healthcare,” Woolf said, adding: “But my betting money is on the economy.”

That seems like a pretty safe answer.  It also raises a question: how do we help working people understand the increasingly toxic workings of the US economy and build the ties of solidarity necessary to advance the struggle for system transformation?

The Harsh Reality of Job Growth in America

The current US economic expansion, which began a little over a decade ago, is now the longest in US history.  But while commentators celebrate the slow but steady growth in economic activity, and the wealthy toast continuing strong corporate profits, lowered taxes, and record highs in the stock market, things are not so bright for the majority of workers, despite record low levels of unemployment.

The fact is, despite the long expansion, the share of workers in low-wage jobs remains substantial. To make matters worse, the share of low-quality jobs in total employment seems likely to keep growing. And although US workers are not unique in facing hard times, the downward pressure on worker well-being in the US has been more punishing than in many other advanced capitalist countries, leaving the typical US worker poorer in absolute terms than their counterparts in several of them.

The low wage reality

According to a recent Brookings report by Martha Ross and Nicole Bateman, titled Meet the Low-wage Workforce,

Low-wage workers comprise a substantial share of the workforce.  More than 53 million people, or 44 percent of all workers ages 18 to 64 in the United States, earn low hourly wages. More than half (56 percent) are in their prime working years of 25-50, and this age group is also the most likely to be raising children (43 percent).

Ross and Bateman draw upon the Census Bureau’s 2012-2016 American Community Survey 5-year Public Use Microdata Sample to identify low-wage workers.  Although their work does not incorporate the small increase in wages between 2017 and 2019, they are confident that doing so would not significantly change their findings.

Their workforce definition started with all civilian, non-institutionalized individuals, 18 to 64 years of age, who worked at some point in the previous year (during the survey period) and remained in the labor force (either employed or unemployed).  They then removed graduate and professional students and traditional high school and college students, as well as those who reported being self-employed or earning self-employment income and those who worked without pay in a family business or farm.  This left them with a total of 122 million workers.

Their definition of a low-wage worker started with the often-employed threshold of two-thirds the median hourly wage of a full-time/full-year worker, with one major modification: they used the male median wage because they wanted to establish a threshold that was not affected by gender inequality.  They then identified anyone earning a lower hourly wage as a low-wage worker.

The average national threshold across their five years of data, in 2016 real dollars, was $16.03.  They then adjusted this value, using the Bureau of Economic Analysis’s Regional Price Parities, to take into account variations in the cost of living in individual metropolitan areas.  The adjusted thresholds ranged from $12.54 in Beckley, West Virginia to $20.02 in San Jose, California.  Using these thresholds, the authors found that 44 percent of the workforce, some 53 million workers, were low-wage workers.
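As a rough sketch of the threshold logic just described (not Ross and Bateman’s actual code; the column names and the regional price parity values are purely illustrative):

import pandas as pd

# Two-thirds of the national median hourly wage of full-time/full-year (male) workers,
# in 2016 dollars, per the figure reported above.
NATIONAL_THRESHOLD = 16.03

def flag_low_wage(workers: pd.DataFrame) -> pd.DataFrame:
    """Scale the national threshold by each metro's regional price parity
    (an index where 1.0 = the national average) and flag workers below it."""
    workers = workers.copy()
    workers["local_threshold"] = NATIONAL_THRESHOLD * workers["metro_rpp"]
    workers["low_wage"] = workers["hourly_wage"] < workers["local_threshold"]
    return workers

example = pd.DataFrame({
    "metro": ["Beckley, WV", "San Jose, CA"],
    "metro_rpp": [0.78, 1.25],      # illustrative values, not the BEA's actual parities
    "hourly_wage": [13.00, 19.50],
})
print(flag_low_wage(example)[["metro", "local_threshold", "low_wage"]])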

These low-wage workers were a racially diverse group.  Fifty-two percent were white, 25 percent Latinx, 15 percent Black, and 5 percent Asian American. Both Latinx and Black workers were over-represented relative to their share of the total workforce.

Strikingly, 57 percent of low-wage workers worked full time year-round.  And half of all low-wage workers “are primary earners or contribute substantially to family living expenses. Twenty-six percent of low-wage workers are the sole earners in their families, with median family earnings of $20,400.”

Finally, as the authors also note, the economic mobility of low-wage workers appears quite limited. They cite one study that “found that, within a 12-month period, 70 percent of low-wage workers stayed in the same job, 6 percent switched to a different low-wage job, and just 5 percent found a better job.”

The growing share of low-wage jobs

The downward movement of a new monthly index, the job quality index (JQI), makes clear that economic growth alone will not solve the problem of too many workers employed in low-wage work.  The index measures the ratio of high-quality jobs (those that pay more than the average weekly income) to low-quality jobs (those that pay less than the average weekly income).  It has declined steadily over the past three decades, during periods of expansion as well as recession, from 94.9 in 1990 to 79.0 as of July 2019 (as illustrated below).

The construction of the index and its usefulness are described in a recent paper authored by Daniel Alpert, Jeffrey Ferry, Robert C. Hockett, and Amir Khaleghi.  The index itself is maintained by a group of researchers from Cornell University Law School, the Coalition for a Prosperous America, the University of Missouri-Kansas City, and the Global Institute for Sustainable Prosperity.  As the authors note, the most prominent factor underlying the three-decade fall in the ratio is the “relative devaluation” of US labor.

The index tracks private sector jobs provided by third-party employers, which excludes self-employed workers, and, for now, covers only production and nonsupervisory (P&NS) positions, which account for approximately 82 percent of total private sector jobs in the country.

The index draws on the BLS’s Current Employment Statistics, which provide average weekly hours, average hourly wages, and total employment for 180 distinct job categories organized into industry groups.  As the authors explain:

JQI itself is a fairly simple measure. The index divides all categories of jobs in the US into high and low quality by calculating the mean weekly income (hourly wages multiplied by hours worked) of all P&NS jobs and then calculates the number of P&NS jobs that are above or below that mean. An index reading of 100 would indicate an even distribution, as between high- and low-quality jobs. Readings below 100 indicate a greater concentration in lower quality (those below the mean) positions, and a reading above 100 would indicate a greater concentration in high quality (above the mean) positions.

Recognizing that some groups are quite large and include a wide range of jobs hovering around the mean, the JQI is further adjusted by disaggregating those particular groups into subgroups. The average income of each of those subgroups is then compared with the mean weekly income derived from the entire sample to determine whether the positions should be classified as high or low quality jobs.
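Here is a minimal sketch of the calculation as described (not the authors’ own code or data); the job categories and numbers are invented solely to show the mechanics.

def job_quality_index(jobs):
    """Return 100 * (employment in job categories whose average weekly income is at or
    above the employment-weighted mean) / (employment in categories below the mean)."""
    weekly_income = {k: hours * wage for k, (emp, hours, wage) in jobs.items()}
    total_emp = sum(emp for emp, _, _ in jobs.values())
    mean_income = sum(weekly_income[k] * emp for k, (emp, _, _) in jobs.items()) / total_emp
    high = sum(emp for k, (emp, _, _) in jobs.items() if weekly_income[k] >= mean_income)
    low = sum(emp for k, (emp, _, _) in jobs.items() if weekly_income[k] < mean_income)
    return 100 * high / low

# Toy categories: (employment in thousands, average weekly hours, average hourly wage)
toy_jobs = {
    "restaurants": (12000, 25.0, 12.00),
    "warehousing": (1500, 38.0, 18.00),
    "durable_mfg": (8000, 41.0, 22.00),
}
print(round(job_quality_index(toy_jobs), 1))   # prints 79.2 for these invented numbers

A reading of 100 would mean employment split evenly between above- and below-mean categories; readings below 100, like the toy result here, register a tilt toward lower-paying work.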

As noted above, the JQI fell from 94.9 in 1990 to 79.0 as of July 2019.  As for the significance of this decline:

The decline confirms sustained and steadily mounting dependence of the U.S. employment situation on private P&NS jobs that are below the mean level of weekly wages. . . .

Notably, movements in the JQI are not particularly correlated with recession; it is important to note that the first big decline occurred during the expansion of the late 1990s. The index was steady during the 2001 recession, and its second big decline occurred during and after the Great Recession. There is admittedly some cyclical patterning evidenced in the JQI output, but this is overwhelmed by a larger secular phenomenon.

Losing ground

Not only are US workers facing a labor market increasingly oriented toward low-wage employment, but the resulting downward pressure on wages also appears to be proceeding at a more rapid pace in the US than in other countries.  As a consequence, a majority of US workers are now poorer, in real terms, than many of their counterparts in other countries.

For example, in a study comparing income inequality in France and the US, the economists Thomas Piketty, Emmanuel Saez, and Gabriel Zucman found that the average pre-tax national income of adults in the bottom 50 percent of the income distribution is now greater in France than in the United States.  “While the bottom 50 percent of incomes were 11 percent lower in France than in the US in 1980, they are now 16 percent higher.”  Moreover,

The bottom 50 percent of income earners makes more in France than in the US even though average income per adult is still 35 percent lower in France than in the US (partly due to differences in standard working hours in the two countries). Since the welfare state is more generous in France, the gap between the bottom 50 percent of income earners in France and the US would be even greater after taxes and transfers.

A recent study by the Center for the Study of Living Standards finds that growing numbers of US workers are also falling behind their Canadian counterparts.  More specifically, “the study compares incomes in every percentile of the income distribution, and finds that up through the 56th percentile Canadians are better off than their U.S. counterparts.”

The study’s author, Simon Lapointe, in words that echo the comments of Piketty, Saez, and Zucman, adds:

Our income estimates may actually underestimate the economic well-being of Canadians relative to Americans. Indeed, Canadians usually receive more in-kind benefits from their governments, including notably in health care. Had these benefits been included in the estimates, the median augmented household income in Canada would likely surpass the American median by a greater margin. While these benefits also come with higher taxes, the progressivity of the income tax system is such that the median household is most likely a net beneficiary.

The takeaway

There are many reasons for those at the top of the US income distribution to celebrate the performance of the US economy and tout the superiority of current US economic and political institutions and policies.  Unfortunately, there is a strong connection between the continuing gains for those at the top and the steadily deteriorating employment conditions experienced by growing numbers of workers.  Hopefully, this economic reality will become far better understood, leading to a more widespread recognition of the need for collective action to transform the US economy in ways that are responsive to majority interests.