Pandemic economic woes continue, but so do deep structural problems, especially the long-term growth in the share of low wage jobs

Many are understandably alarmed about what the September 4th termination of several special federal pandemic unemployment insurance programs will mean for millions of workers.  Twenty-five states ended their programs months earlier, with government and business leaders claiming that their termination would spur employment and economic activity.  However, several studies have disproved their claims.

One study, based on the experience of 19 of these states, found that for every 8 workers who lost benefits, only one found a new job.  Consumer spending in those states fell by $2 billion, with every lost $1 of benefits leading to a fall in spending of 52 cents.  It is hard to see how anything good can come from the federal government’s willingness to allow these programs to expire nationwide.

The Biden administration appears to believe that adoption of its physical infrastructure bill and $3.5 trillion spending plan will ensure that those left without benefits will find new jobs.  But the chances for Congressional approval are growing dim.  Even more important, and largely overlooked in the debate over whether the time is right to replace the pandemic unemployment insurance programs with new spending measures, is the fact that an increasing share of the jobs created by economic growth are low-wage, and thus inadequate to ensure workers and their families an acceptable standard of living.

For example, according to another study, the share of low-wage jobs has been steadily growing since 1979.  More specifically, the share of workers (18-64 years of age) with a low-wage job rose from 39.1 percent in 1979 to 45.2 percent in 2017.  For workers 18 to 34 without a college degree, the share soared from 46.9 percent to 61.6 percent over the same years.  Thus, a meaningful improvement in worker well-being will require far more than a return to “normal” labor market conditions.  It will require building a movement able to directly challenge and transform the way the US economy operates.

The importance of government programs

The figure below provides some sense of how important government programs have been to working people.  Government support was truly a lifeline, delivering a significant boost to total monthly personal income (relative to the February 2020 start of the pandemic-triggered recession), especially during the first months.  Even now, despite the fact that the recession has officially been declared over, government support still accounts for approximately half the increase in total monthly income.

The government’s support of personal income was anchored by three special unemployment insurance programs–the Federal Pandemic Unemployment Compensation (FPUC), Pandemic Emergency Unemployment Compensation (PEUC), and Pandemic Unemployment Assistance (PUA). 

The FPUC was authorized by the March 2020 CARES Act and renewed by subsequent legislation and a presidential order. It originally provided $600 per week in extra unemployment benefits to unemployed workers in states that opted in to the program. In August 2020, the extra payment was lowered to $300.

The PEUC was also established by the CARES Act. It provided up to 13 weeks of extended unemployment compensation to individuals who had exhausted their regular unemployment insurance compensation.  This was later extended to 24 weeks and then by a further 29 weeks, allowing for a total of 53 weeks.  The PUA allowed states to provide unemployment assistance to the self-employed and those seeking part-time employment, or who otherwise did not qualify for regular unemployment compensation.

Tragically, the federal government allowed all three programs to expire on September 4th. Months earlier, in June 2021, 25 states actually ended these programs for their unemployed workers, eliminating benefits for over 2 million people.  Several studies, as we see next, have documented the devastating cost of that decision.

The cost of state program termination

Beginning in April 2021, a number of business analysts and politicians began to aggressively argue that federally provided unemployment benefit programs were no longer needed.  In fact, according to them, the programs were actually keeping workers from pursuing available jobs, thereby holding back the country’s economic recovery. Using these arguments as cover, in June, 25 states ended their participation in one or more of these programs. 

For example, Henry McMaster, the governor of South Carolina, announced his decision to end his state’s participation in the federal programs, saying: “This labor shortage is being created in large part by the supplemental unemployment payments that the federal government provides claimants on top of their state unemployment benefits.”

Similarly, Tate Reeves, the governor of Mississippi, stated in a May 2021 tweet:

It has become clear to me that we cannot have a full economic recovery until we get the thousands of available jobs in our state filled. . . . Therefore, I have informed the Department of Employment Security to direct the Biden Administration that Mississippi will be opting out of the additional federal unemployment benefits as early as federal law allows—June 12, 2021.

The argument that these special federal unemployment benefit programs hurt employment and economic activity was tested and found wanting.  Business Insider highlights the results of several studies:

Economist Peter Ganong, who co-authored a paper that found the disincentive effect of benefits was small, told the [Wall Street] Journal: “If the question is, ‘Is UI [unemployment insurance] the key thing that’s holding back the labor market recovery?’ The answer is no, definitely not, based on the available data.” 

That aligns with other early research on the impact of benefits ending. CNBC reports that analyses from payroll firms UKG and Homebase both found that employment didn’t go up in the states cutting off the benefits; in fact, that Homebase analysis found that employment declined in the states opting out of federal benefits, while it went up in states that chose to retain benefits. In June, Indeed’s Hiring Lab found that job searches in states ending benefits were below April’s baseline.

In July, Arindrajit Dube, an economics professor at University of Massachusetts Amherst, found that ending benefits didn’t make workers rush back. “Even as there was a clear reduction in the number of people who were receiving unemployment benefits — and a clear increase in the number of people who said that they were having difficulty paying their bills — that didn’t seem to translate, at least in the short run, into an uptick in overall employment rates,” Dube told Insider at the time.

Dube, along with five other researchers, examined “the effect of withdrawing pandemic UI on the financial and employment trajectories of unemployed workers in [19] states that withdrew benefits, compared to workers with the same unemployment duration in states that retained these benefits.” 

They found, as noted above, that for every 8 workers who lost their benefits, only 1 found a new job.  And for every $1 of reduced benefits, spending fell by 52 cents—only 7 cents of new income was generated for each dollar of lost benefits. “Extrapolating to all UI recipients in the early withdrawal states, we estimate these states eliminated $4 billion in unemployment benefits paid by federal transfers as of August 6 [2021].  Spending fell by $2 billion and earnings rose by $270 million.  These states therefore saw a much larger drop in federal transfers than gains from job creation.”
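The arithmetic behind these headline numbers is easy to verify.  Here is a minimal back-of-the-envelope sketch in Python, using only the figures quoted above; treating the per-dollar effects as uniform across every dollar of eliminated benefits is a simplification for illustration, not part of the study’s method.

```python
# Back-of-the-envelope check of the figures quoted above.
# Assumption (for illustration only): the per-dollar effects apply
# uniformly to every dollar of eliminated benefits.

benefits_eliminated = 4.0e9   # UI benefits cut in early-withdrawal states, per the study

spending_per_dollar = 0.52    # each $1 of lost benefits reduced spending by 52 cents
earnings_per_dollar = 0.07    # and generated only 7 cents of new earnings

spending_drop = benefits_eliminated * spending_per_dollar
earnings_gain = benefits_eliminated * earnings_per_dollar

print(f"Spending fell by about ${spending_drop / 1e9:.1f} billion")   # ~$2.1 billion
print(f"Earnings rose by about ${earnings_gain / 1e6:.0f} million")   # ~$280 million
```

The results line up with the study’s reported $2 billion drop in spending and $270 million rise in earnings; the small gaps reflect rounding in the quoted per-dollar figures.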

An additional 8 million workers have now lost benefits because of the federal termination of these special unemployment insurance programs.  It is hard to be optimistic about what awaits them, given the experience of the early termination states.  And equally important, even if the “optimists” are proven right, and those workers are able to find employment, there is still reason for concern about the likely quality of those jobs given long-term employment trends.

The lack of decent jobs

There is no agreed-upon definition of a low-wage job.  David R. Howell and Arne L. Kalleberg note two of the most popular in their study of declining job quality in the United States.  One is to define low-wage jobs as those that pay less than two-thirds of the median hourly wage.  The other, used by the OECD, is to define low-wage jobs as those that pay less than two-thirds of the median hourly wage for full-time workers.

Howell and Kalleberg find both inadequate.  Instead, they define low-wage jobs as those that pay less than two-thirds of the mean hourly wage for full-time prime-age workers (35-59).  Their definition sets the dividing line between low-wage and what they call “decent” wage jobs at $17.50 in 2017.  As they explain:

This wage is well above the wage that would make a full-time (or near full-time) worker eligible for food stamps and several dollars above the basic needs budget for a single adult in most American cities, but is conservative in that the basic needs budget for a single adult with one child ranges from $22 to $30.
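For readers who want to see how the competing definitions differ in practice, here is a minimal sketch in Python.  The wage sample is synthetic and every number in it is invented for illustration; only the three formulas come from the discussion above.

```python
import numpy as np

# Sketch of the three low-wage definitions discussed above, applied to a
# synthetic wage sample. All data here are invented for illustration;
# only the formulas follow the text.

rng = np.random.default_rng(0)
n = 10_000
hourly_wage = rng.lognormal(mean=2.9, sigma=0.5, size=n)   # synthetic hourly wages
age = rng.integers(18, 65, size=n)                          # synthetic ages, 18-64
full_time = rng.random(n) < 0.8                             # synthetic full-time flag
prime_age = (age >= 35) & (age <= 59)                       # Howell & Kalleberg's prime-age band

# Definition 1: two-thirds of the median hourly wage, all workers
threshold_median_all = (2 / 3) * np.median(hourly_wage)

# Definition 2 (OECD): two-thirds of the median hourly wage, full-time workers only
threshold_median_ft = (2 / 3) * np.median(hourly_wage[full_time])

# Definition 3 (Howell & Kalleberg): two-thirds of the MEAN hourly wage of
# full-time prime-age workers. In a right-skewed wage distribution the mean
# exceeds the median, so this threshold is the highest of the three.
threshold_mean_ft_prime = (2 / 3) * np.mean(hourly_wage[full_time & prime_age])

for label, t in [("2/3 median, all workers", threshold_median_all),
                 ("2/3 median, full-time (OECD)", threshold_median_ft),
                 ("2/3 mean, full-time prime-age (H&K)", threshold_mean_ft_prime)]:
    share = (hourly_wage < t).mean()
    print(f"{label}: threshold ${t:5.2f}, low-wage share {share:.1%}")
```

Because wage distributions are right-skewed, basing the cutoff on the mean of full-time prime-age wages, as Howell and Kalleberg do, yields the highest threshold of the three and hence the largest measured low-wage share.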

The figure below, based on their definition, shows the growth in low wage jobs for workers 18-34 years of age without a college degree (in blue), all workers 18-64 years of age (in gold), and prime age workers 35-59 years of age (in green).  Their dividing line between low wage and decent wage jobs, equivalent to $17.50 in 2017, is far from a generous wage.  Yet, all three groupings show an upward trend in the share of low wage jobs.  

The authors then divide their low wage and decent wage categories into upper and lower tiers.   The lower tier of the low wage category includes jobs that pay less than two-thirds of the median wage for full-time workers, which equaled $13.33 in 2017.  As the authors report:

Based on evidence from basic needs budgets, this is a wage that, even on a full-time basis, would make it extremely difficult to support a minimally adequate standard of living for even a single adult anywhere in the country. This wage threshold ($13.33) is just above the wage cutoff for food stamps ($12.40) and Medicaid ($12.80) for a full-time worker (thirty-five hours per week, fifty weeks per year) with a child; full-year work at thirty hours per week would make a family of two eligible for the food stamps with a wage as high as $14.46 and as high as $14.94 for Medicaid.  For this reason, we refer to this as the poverty-wage threshold.

The lower tier of the decent-wage category includes jobs paying up to 50 percent above the decent-job threshold, a ceiling that equaled $26.50 in 2017.  The figure below shows the overall job distribution in 2017.

The following table shows the changing distribution of jobs over the years 1979 to 2017 for all workers 18 to 64, for workers 18-34 without a college degree, and for workers 18-34 with a college degree.

While the share of upper-tier decent jobs held by workers 18 to 64 has remained relatively stable, there has been a notable decline in the share of workers with lower-tier decent jobs.  Also worth noting is the rise in the share of poverty-level low wage jobs. 

Perhaps most striking is the large decline in the share of decent jobs held by workers 18 to 34, those with and those without a college degree.  The share of poverty level jobs held by those without a college degree soared from 35.7 percent to 53.5 percent.  The share of low wage jobs also spiked for those with a college degree, rising from 22 percent to 39.1 percent, with an increase in the share of both low-wage tiers.

This long-term decline in job quality will not reverse on its own.  And, not surprisingly, corporate leaders remain largely opposed to policies that might threaten the status quo.

So, do we need a better unemployment insurance system? For sure.  Do we need a better funded and more climate resilient social and physical infrastructure?  Definitely.  But we also need a dramatically different economy, one that, in sharp contrast to our current system, is grounded in greater worker control over both the organization and aims of production.  Lots of work ahead.

Playing the capitalist game: heads they win, tails you lose

According to an Economic Policy Institute report, between 28 and 47 percent of U.S. private sector workers are subject to noncompete agreements.  In brief, noncompete agreements (or noncompetes) are provisions in an employment contract that ban workers from leaving their job to work for a “competitor” that operates in the same geographic area, for a given period of time.  In a way, it’s an attempt to recreate the power dynamics of the employer-dominated company towns of old—with workers unable to change employers if they want to continue working in the same industry.

It is not just top executives who are forced to accept a noncompete agreement.  Companies also use them to restrict the employment freedom of many low-wage workers, including janitors, security guards, fast food workers, warehouse workers, personal care aides, and room cleaners.  In fact, the Economic Policy Institute estimates that almost a third of all businesses require all of their workers to sign noncompetes, regardless of their job duties or pay.

As for the impact of these agreements, a number of studies have found that noncompetes lower wages for all workers in the industry, even those not subject to noncompetes.  And then there is this from CBS News:

“In the context of the pandemic, which caused millions of people to be laid off, it’s safe to say at least a share of those workers are constrained [by noncompetes] in pursuing other opportunities during this crisis,” said John Lettieri, head of the Economic Innovation Group, a think tank that advocates against noncompetes. 

Indeed, at least four employers — including an accounting firm and a real estate brokerage — have tried to enforce noncompetes against workers they’ve laid off, with the lawsuits making their way through the courts.

On July 9, 2021 President Biden signed an executive order on “Promoting Competition in the American Economy” that, among other things, calls upon the Chair of the Federal Trade Commission (FTC) to work “with the rest of the Commission to exercise the FTC’s statutory rulemaking authority under the Federal Trade Commission Act to curtail the unfair use of non-compete clauses and other clauses or agreements that may unfairly limit worker mobility.”  While it seems likely that the FTC will take some action, the scope of that action remains uncertain.

Noncompetes and their use

There are no federal rules governing the use of noncompetes.  It is up to the states to decide how to regulate their use.  California, North Dakota, and Oklahoma are the only states with outright bans on their use; Washington DC also outlaws them.  Several states have placed limits on the use of non-competition agreements.  Illinois, Maryland, Nevada, Oregon, and Virginia all prohibit the use of noncompetes with low wage workers.  Washington state banned noncompetes for those earning under $100,000. Hawaii has prohibited noncompetes for tech workers only.  On the other hand, there are some states, like Idaho, which have actually passed laws making it easier for companies to enforce noncompete agreements.

Most workers live in states where there are few if any restrictions on the use of noncompete agreements.  And as the results of a national survey that included firms with at least 50 employees show, the use of noncompetes is common in workplaces with low pay (see the table below).  As the Economic Policy Institute report points out, although “the use of noncompetes tends to be higher for higher-wage workplaces than lower-wage workplaces . . . it is striking that more than a quarter—29.0%—of responding establishments where the average wage is less than $13.00 use noncompetes for all their workers.”

Popular outrage has sometimes forced companies to change their policies or state authorities to intervene on behalf of workers.  An example of the former: in 2015 Amazon began requiring its warehouse workers to sign noncompetes.  As The Verge reported:

The work is repetitive and physically demanding and can pay several dollars above minimum wage, yet Amazon is requiring these workers — even seasonal ones — to sign strict and far-reaching noncompete agreements. The Amazon contract, obtained by The Verge, requires employees to promise that they will not work at any company where they “directly or indirectly” support any good or service that competes with those they helped support at Amazon, for a year and a half after their brief stints at Amazon end. Of course, the company’s warehouses are the beating heart of Amazon’s online shopping empire, the extraordinary breadth of which has earned it the title of “the Everything Store,” so Amazon appears to be requiring temp workers to foreswear a sizable portion of the global economy in exchange for a several-months-long hourly warehouse gig.

The company has even required its permanent warehouse workers who get laid off to reaffirm their non-compete contracts as a condition of receiving severance pay. 

The company eventually ended the practice after its actions were widely reported in the media, generating bad publicity.

Jimmy John’s offers an example of state action. In 2016, the attorneys general of New York and Illinois, reacting to public anger, forced Jimmy John’s to stop its franchises from using noncompetes that forbade employees from working at any other sandwich shop within a 3-mile radius of the franchise for two years.

The cost of noncompetes to workers

When noncompetes are banned, worker pay rises.  One of the most detailed and complete studies of the wage consequences of such a change is based on Oregon’s 2008 decision to ban noncompetes (NCAs) for hourly wage workers.  As the authors of the study explain:

We find that banning NCAs for hourly workers increased hourly wages by 2-3% on average. Since only a subset of workers sign NCAs, scaling this estimate by the prevalence of NCA use in the hourly-paid population suggests that the effect on employees actually bound by NCAs may be as great as 14-21%, though the true effect is likely lower due to labor market spillovers onto those not bound by NCAs. While the positive wage effects are found across the age, education and wage distributions, they are stronger for female workers and in occupations where NCAs are more common. The Oregon low-wage NCA ban also improved average occupational status in Oregon, raised job-to-job mobility, and increased the proportion of salaried workers without affecting hours worked.

Earlier studies of the consequence of changes in the use of noncompetes in other states produced similar results. For example, a study of Hawaii’s 2015 decision to ban noncompetes for tech workers showed a 4.2% pay bump for new hires and a 12% increase in worker mobility.

But even a change in law doesn’t necessarily bring an end to the practice, as highlighted by the California experience.  California courts will not enforce a noncompete contract, but that hasn’t stopped many California businesses from including them in their employment contracts.  One reason according to worker advocates, as reported by CBS News, is that most workers don’t know that noncompetes are banned in California: 

As a result, employers in California use these restrictive contracts just as much as employers elsewhere in the U.S., and they have their desired effect: scaring workers away from leaving for better jobs. 

“There’s no disincentive for the employer to include it in the employment contract. The worst thing that would happen is a court would declare [the noncompete] void,” said Harvard’s Gerstein. “There needs to be a disincentive to employer overreach.”

Possible federal action

President Biden pledged during his campaign to “eliminate all non-compete agreements, except the very few that are absolutely necessary to protect a narrowly defined category of trade secrets.”  On the other hand, his executive order speaks only of “curtailing” their use.  The best outcome would be an FTC ban on the use of noncompetes in all situations and for all workers; noncompetes are just another tool that businesses can use to exploit their workers.

But it may be that the FTC will instead seek to place limits on the use of such agreements, perhaps outlawing their use with low wage workers or establishing federal regulations that restrict their scope and duration.  Although such a step would be an improvement over the current situation, where most states do little to restrict the use of noncompetes, it may well result in an unsatisfying patchwork regulatory framework, much like that of our current unemployment system.

No matter how the FTC rules on the use of noncompete agreements, there are two other actions it should take that would significantly strengthen worker rights. Currently, many workers only learn they are subject to a noncompete agreement after they have already accepted a job.  The FTC should mandate that employers include any noncompete requirements in all job postings.

And as the California experience shows, companies will continue to use noncompetes even when they are not enforceable, relying on ignorance, intimidation, and the financial costs of court proceedings to get workers to accept their terms.  Therefore, the FTC should also allow workers to sue for damages if a business illegally attempts to enforce a noncompete agreement.

In the meantime, while we await FTC action, the greater the public knowledge about, and voiced opposition to, the use of noncompetes, the better.

Learning from history: community-run child-care centers during World War II

We face many big challenges.  And we will need strong, bold policies to meaningfully address them.  Solving our child-care crisis is one of those challenges, and a study of World War II government efforts to ensure accessible and affordable high-quality child care points the way to the kind of bold action we need. 

The child care crisis

A number of studies have established that high-quality early childhood programs provide significant community and individual benefits.  One found that “per dollar invested, early childhood programs increase present value of state per capita earnings by $5 to $9.”  Universal preschool programs have also been shown to offer significant benefits to all children, even producing better outcomes for the most disadvantaged children than means-tested programs.  Yet, even before the pandemic, most families struggled with a lack of desirable child-care options.    

The pandemic has now created a child-care crisis. As Lisa Dodson and Mary King point out: “By some estimates, as many as 4.5 million child-care ‘slots’ may be permanently lost and as many as 40 percent of child-care providers say they will never reopen.”  The lack of child care is greatly hindering our recovery from the pandemic.  Women suffered far greater job losses than men during 2020, including as child-care workers, and the child-care crisis has made it difficult for many working mothers to return to the labor force.  The cost goes beyond the immediate family hardship from lost income; there is strong evidence that a sustained period without work, the so-called employment gap, will result in significantly lower lifetime earnings and reduced retirement benefits.  

To his credit, President Biden has recognized the importance of strengthening our care economy.  His proposed American Families Plan includes some $225 billion in tax credits to help make child care more affordable for working families.  According to a White House fact sheet, families would “receive a tax credit for as much as half of their spending on qualified child care for children under age 13, up to a total of $4,000 for one child or $8,000 for two or more children. . . . The credit can be used for expenses ranging from full-time care to after school care to summer care.”

But tax credits don’t ensure the existence of convenient, affordable, high-quality child-care facilities staffed by well-paid and trained child-care providers.  And if that is what we really want, we will need to directly provide it.  That is what the government did during World War II.  While its program was far from perfect, in part because it was designed to be short-term, it provides an example of the type of strong, bold action we will need to overcome our current child-care crisis. 

Federal support for child care

During World War II the United States government financed a heavily-subsidized child-care program.  From August 1943 through February 1946, the Federal Works Agency (FWA), using Lanham Act funds, provided some $52 million in grants for child-care services (equal to more than $1 billion today) to any approved community group that could demonstrate a war-related need for the service.  At its July 1944 peak, 3,102 federally subsidized child-care centers, with some 130,000 children enrolled, operated throughout the country.  There was at least one center in every state but New Mexico, which decided against participation in the program.  By the end of the war, between 550,000 and 600,000 children received some care from Lanham Act funded child-care programs.  

Communities were allowed to use the federal grant money to cover most of the costs involved in establishing and running their centers, including facilities construction and upkeep, staff wages and most other daily operating costs.  They were required to provide some matching funds, most of which came from fees paid by the parents of children enrolled in the program.  However, these fees were capped. In the fall of 1943, the FWA established a ceiling on fees of 50 cents per child per day (about $7 now), which was raised to 75 cents in July 1945. And those fees included snacks, lunch, and in some cases dinner as well. Overall, the federal subsidy covered two-thirds of the total maintenance and operation of the centers.

The only eligibility requirement for enrollment was a mother’s employment status: she had to be working at a job considered important to the war effort, and this was not limited to military production. Center hours varied, but many accommodated the round-the-clock manufacturing schedule, staying open 24 hours a day, 6 days a week. 

The centers served preschoolers (infants, toddlers, and children up to 5 years of age) and school-age children (6 to 14 years of age). In July 1944, approximately 53,000 preschoolers and 77,000 school-age children were enrolled.  School-age enrollment always climbed during summer vacation.  However, in most months, preschoolers made up the majority of the children served by Lanham Act-funded centers. Enrollment of preschoolers peaked at some 74,000 in May 1945. 

Some 90 percent of the centers were housed in public schools, with newly constructed housing projects providing the next most common location. Although local school boards were free to decide program standards–including staff-child ratios, worker qualifications, and facility design–state boards of education were responsible for program supervision. The recommended teacher-child ratio was 10-to-1, and most centers complied.  According to Chris M. Herbst,

Anecdotal evidence suggests that preschool-aged children engaged in indoor and outdoor play; used educational materials such as paints, clay, and musical instruments; and took regular naps. . . . Programs for school-aged children included . . . outdoor activities, participation in music and drama clubs, library reading, and assistance with schoolwork.

Children at a child-care center sit for “story time.” (Gordon Parks / Library of Congress / The Crowley Company)

While quality did vary–largely the result of differences in community support for public child care, the willingness of cities to provide additional financial support, and the ability of centers to hire trained professionals to develop and oversee program activities–the centers did their best to deliver a high-quality childhood education.  As Ruth Pearson Koshuk, the author of a 1947 study of the developmental records of 500 children, 2 to 5 years of age, at two Los Angeles County centers, describes:

In these two . . . schools, as elsewhere, the program has developed since 1943, toward recognized standards of early childhood education. The aim has been to apply the best of existing standards, and to maintain as close contact with the home as possible. In-service training courses carrying college credit have been given, for the teaching staff, and a mutually helpful parent education program carried on in spite of difficulties inherent in a child care situation.

There has been a corresponding development in the basic records. A pre-entrance medical examination has been required by state law since the first center opened. In December 1943 a developmental record was added, which is filled out by the director during an unhurried interview with the mother just before a child enters. One page is devoted to infancy experience; the four following cover briefly the child’s development history, with emphasis on emotional experience, behavior problems he has presented to the parents, if any, and the control methods used, as well as the personal-social behavior traits which they value and desire for the child. After entrance, observational notes and semester reports are compiled by the teachers. Intelligence testing has been limited to cases where it seemed especially indicated. A closing record is filled out, in most cases, by the parent when a child is withdrawn. These records are considered a minimum. They have proved indispensable as aids to the teachers in guiding the individual children and as a basis for conferences on behavior in the home.

A 2013 study of the long-term effects on mothers and children from use of Lanham centers found a substantial increase in maternal employment, even five years after the end of the program, and “strong and persistent positive effects on well-being” for their children.

In short, despite many shortcomings, these Lanham centers, as Thalia Ertman sums up,

broke ground as the first and, to date, only time in American history when parents could send their children to federally-subsidized child care, regardless of income, and do so affordably. . . .

Additionally, these centers are seen as historically important because they sought to address the needs of both children and mothers. Rather than simply functioning as holding pens for children while their mothers were at work, the Lanham child care centers were found to have a strong and persistent positive effect on the well-being of children.

The federal government also supported some private employer-sponsored child care during the war. The most well-known example is the two massive centers built by the Kaiser Company in Portland, Oregon to provide child care for the children of workers at their Portland Yards and Oregon Shipbuilding Corporation. The centers were located right at the front of the shipyards, making it easy for mothers to drop their children off and pick them up, and were operated on a 24-hour schedule.  They were also large, each caring for up to 1,125 children between 18 months and 6 years of age. The centers had their own medical clinic, cafeteria, and large play areas, and employed highly trained staff.  Parents paid $5 for a six-day week for one child and $3.75 for each additional child.  For a small additional fee, the centers also prepared a small dinner for parents to pick up at the end of their working day.

While the Kaiser Company received much national praise as well as appreciation from its employees with young children, these centers were largely paid for by the government.  Government funds directly covered their construction, and a majority of the costs of running the centers, including staff salaries, were folded into the company’s cost-plus contracting with the military.

Political dynamics

There was considerable opposition to federal financing of group child care, especially for children younger than 6 years of age.  The sentiment is captured in this quote from a 1943 New York Times article: “The worst mother is better than the best institution when it is a matter of child care, Mayor La Guardia declared.”  Even the War Manpower Commission initially opposed mothers with young children working outside the home, even in service of the war effort, stating that “The first responsibility of women with young children, in war as in peace, is to give suitable care in their own homes to their children.”

But on-the-ground realities made this an untenable position for both the government and business. Women sought jobs, whether out of economic necessity or patriotism.  The government, highlighted by its Rosie the Riveter campaign, was eager to encourage their employment in industries producing for the war effort.  And, despite public sentiment, a significant number of those women were mothers with young children. 

Luedell Mitchell and Lavada Cherry working at a Douglas Aircraft plant in El Segundo, Calif. Circa 1944. Credit: National Archives photo no. 535811

The growing importance of women in the workplace, and especially mothers with young children, is captured in employment trends in Portland, Oregon.  Women began moving into the defense workforce in great numbers starting in 1942, with the number employed in local war industries climbing from 7,000 in November 1942 to 40,000 in June 1943.  An official with the state child-care committee reported that “a check of six shipyards reveals that the number of women employed in the shipyards has increased 25 percent in one month and that the number is going to increase more rapidly in the future.” 

The number of employed mothers was also rapidly growing.  According to the Council of Social Agencies, “Despite the recommendations of the War Manpower Commission . . . thousands of young mothers in their twenties and thirties have accepted jobs in war industries and other businesses in Multnomah County. Of the 8,000 women employed at the Oregon Shipyards in January, 1943, 32 percent of them had children, 16 percent having pre-school children.”

Portland was far from unique.  During the war, for the first time, married women workers outnumbered single women workers.  Increasingly, employers began to recognize the need for child care to address absenteeism problems.  As a “women’s counselor” at the Bendix Aviation Corporation in New Jersey explained to reporters in 1943, child care was one of the biggest concerns for new hires. “We feel a mother should be with her small baby if possible. But many of them have to come back. Their husbands are in the service and they can’t get along on his allotment.”  Media stories, many unsubstantiated, of children left in parked cars outside workplaces or fending for themselves at home, also contributed to a greater public acceptance of group child care.

An image of Rosie the Riveter that appeared in a 1943 issue of the magazine Hygeia

Finally, the government took action.  The Federal Works Agency was one of two new super agencies established in 1939 to oversee the large number of agencies created during the New Deal period.  In 1940 President Roosevelt signed into law the Lanham Act, which authorized the FWA to fund and supervise the construction of needed public infrastructure, such as housing, hospitals, water and sewer systems, police and firefighting facilities, and recreation centers, in communities experiencing rapid growth because of the defense buildup. In August 1942, the FWA decided, without any public debate, that public infrastructure also meant child care, and it began its program of support for the construction and operation of group child-care facilities.

The Federal Security Agency, the other super agency, whose oversight responsibilities included the Children’s Bureau and the U.S. Office of Education, opposed the FWA’s new child-care initiative.  It did so not only because it believed that child care fell under its mandate, but also because the leadership of the Children’s Bureau and Office of Education opposed group child care.  The FWA won the political battle, and in July 1943, Congress authorized additional funding for the FWA’s child-care efforts.

And, as William M. Tuttle, Jr. describes, public pressure played an important part in the victory:

the proponents of group child care organized a potent lobbying effort. The women’s auxiliaries of certain industrial unions, such as the United Electrical Workers and the United Auto Workers, joined with community leaders and FWA officials in the effort. Also influential were the six women members of the House of Representatives. In February 1944, Representative Mary T. Norton presented to the House “a joint appeal” for immediate funds to expand the wartime child day care program under the FWA.

Termination and a step back

Congressional support for group child care was always tied to wartime needs, a position shared by most FWA officials.  The May 1945 Allied victory in Europe brought a drop in war production, and a reduction in FWA community child care approvals and renewals.  In August, after the Japanese surrender brought the war to a close, the FWA announced that it would end its funding of child-care centers as soon as possible, but no later than the end of October 1945.

Almost immediately thousands of individuals wrote letters, sent wires, and signed petitions calling for the continuation of the program.  Officials in California, the location of many war-related manufacturing sites and nearly 25 percent of all children enrolled in Lanham Act centers in August 1945, also weighed in, strongly supporting the call.  Congress yielded, largely influenced by the argument that since it would be months before all the “men” in the military returned to the country, mothers had no choice but to continue working and needed the support of the centers to do so.  It approved new funds, but only enough to keep the centers operating until the end of February 1946.

The great majority of centers closed soon after the termination of federal support, with demonstrations following many of the closings.  The common assumption was that women would not mind the closures, since most would be happy to return to homemaking.  Many women were, in fact, forced out of the labor force, disproportionately suffering from post-war industrial layoffs.  But by 1947, women’s labor force participation was again on the rise and a new push began for a renewal of federal support for community child-care centers. Unfortunately, the government refused to change its position. During the Korean War, Congress did approve a public child-care bill, but then it refused to authorize any funding.

After WWII, parents organized demonstrations, like this one in New York on Sept. 21, 1947, calling for the continuing funding of the centers. The city’s welfare commissioner dismissed the protests as “hysterical.” Credit: The New York Times

Finally, in 1954, as Sonya Michel explains, “Congress found an approach to child care it could live with: the child-care tax deduction.”  While the child-care tax deduction did offer some financial relief to some families, it did nothing to ensure the availability of affordable, high-quality child care.  The history of child care during World War II makes clear that this turn to market-based tax policy to solve child-care problems represented a big step back for working women and their children.  And this was well understood by most working people at the time. 

Sadly, this history has been forgotten, and Biden’s commitment to expand the child-care tax credit is now seen as an important step forward.  History shows we can and need to do better.

Time to put the spotlight on corporate taxes

A battle is slowly brewing in Washington DC over whether to raise corporate taxes to help finance new infrastructure investments.  While higher corporate taxes cannot generate all the funds needed, the coming debate over whether to raise them gives us an opportunity to challenge the still strong popular identification of corporate profitability with the health of the economy and, by extension, worker wellbeing.

According to the media, President Biden’s advisers are hard at work on two major proposals with a combined $3 trillion price tag.  The first aims to modernize the country’s physical infrastructure and is said to include funds for the construction of roads, bridges, rail lines, ports, electric vehicle charging stations, and affordable and energy efficient housing as well as rural broadband, improvements to the electric grid, and worker training programs.  The second targets social infrastructure and would provide funds for free community college education, universal prekindergarten, and a national paid leave program. 

To pay for these proposals, Biden has been talking up the need to raise corporate taxes, at least to offset some of the costs of modernizing the country’s physical infrastructure.  Not surprisingly, Republican leaders in Congress have voiced their opposition to corporate tax increases.  And corporate leaders have drawn their own line in the sand.  As the New York Times reports:

Business groups have warned that corporate tax increases would scuttle their support for an infrastructure plan. “That’s the kind of thing that can just wreck the competitiveness in a country,” Aric Newhouse, the senior vice president for policy and government relations at the National Association of Manufacturers, said last month [February 2021].

Regardless of whether Biden decides to pursue his broad policy agenda, this appears to be a favorable moment for activists to take advantage of media coverage surrounding the proposals and their funding to contest these kinds of corporate claims and demonstrate the anti-working-class consequences of corporate profit-maximizing behavior.  

What do corporations have to complain about?

To hear corporate leaders talk, one would think that they have been subjected to decades of tax increases.  In fact, quite the opposite is true.  The figure below shows the movement in the top corporate tax rate.  As we can see, it peaked in the early 1950s and has been falling ever since, with a big drop in 1986, and another in 2017, thanks to Congressionally approved tax changes.

One consequence of this corporate-friendly tax policy is, as the following figure shows, a steady decline in federal corporate tax payments as a share of GDP.  These payments fell from 5.6 percent of GDP in 1953 to 1.5 percent in 1982, and a still lower 1.0 percent in 2020.  By contrast, there has been very little change in individual income tax payments as a share of GDP; they stood at 7.7 percent in 2020.

Congressional tax policy has certainly been good for the corporate bottom line.  As the next figure illustrates, both pre-tax and after-tax corporate profits have risen as a share of GDP since the early 1980s.  But the rise in after-tax profits has been the most dramatic, soaring from 5.2 percent of GDP in 1980 to 9.1 percent in 2019, before dipping slightly to 8.8 percent in 2020.   To put recent after-tax profit gains in perspective, the 2020 after-tax profit share is greater than the profit share in every year from 1930 to 2005.

What do corporations do with their profits?

Corporations claim that higher taxes would hurt U.S. competitiveness, implying that they need their profits to invest and keep the economy strong.  Yet, despite ever higher after-tax rates of profit, private investment in plant and equipment has been on the decline.

As the figure below shows, gross private domestic nonresidential fixed investment as a share of GDP has been trending down since the early 1980s.  It fell from 14.8 percent in 1981 to 13.4 percent in 2020.

Rather than investing in new plant and equipment, corporations have been using their profits to fund an aggressive program of stock repurchases and dividend payouts.  The figure below highlights the rise in corporate stock buybacks, which have helped drive up stock prices, enriching CEOs and other top wealth holders. In fact, between 2008 and 2017, companies spent some 53 percent of their profits on stock buybacks and another 30 percent on dividend payments.

It should therefore come as no surprise that CEO compensation is also exploding, with CEO-to-worker compensation growing from 21-to-1 in 1965, to 61-to-1 in 1989, 293-to-1 in 2018, and 320-to-1 in 2019.  As we see in the next figure, the growth in CEO compensation has actually been outpacing the rise in the S&P 500.

In sum, the system is not broken.  It continues to work as it is supposed to work, generating large profits for leading corporations that then find ways to generously reward their top managers and stockholders.  Unfortunately, investing in plant and equipment, creating decent jobs, or supporting public investment are all low on the corporate profit-maximizing agenda.  

Thus, if we are going to rebuild and revitalize our economy in ways that meaningfully serve the public interest, working people will have to actively promote policies that will enable them to gain control over the wealth their labor produces.  One example: new labor laws that strengthen the ability of workers to unionize and engage in collective and solidaristic actions.  Another is the expansion of publicly funded and provided social programs, including for health care, housing, education, energy, and transportation. 

And then there are corporate taxes.  Raising them is one of the easiest ways we have to claw back funds from the private sector to help finance some of the investment we need.  Perhaps more importantly, the fight over corporate tax increases provides us with an important opportunity to make the case that the public interest is not well served by reliance on corporate profitability.

The U.S. recovery on pause, December brings new job losses

A meaningful working-class recovery from the recession seems far away.

After seven months of job gains, albeit steadily diminishing ones, we are again losing jobs.  As the chart below shows, the number of jobs fell by 140,000 in December.

We are currently about 9.8 million jobs down from the February 2020 employment peak, having recovered only 55 percent of the jobs lost.  And, as the following chart illustrates, the percentage of jobs lost remains greater, even now after months of job growth, than it was at any point during the Great Recession. 

If the job recovery continues at its current pace, some analysts predict that it will likely take more than three years just to get back to pre-pandemic employment levels.  However, this might well be too rosy a projection.  One reason is that the early assumption that many of the job losses were temporary, and that those unemployed would soon be recalled to employment, is turning out to be wrong.  A rapidly growing share of the unemployed are remaining unemployed for an extended period.

As we see below, in October, almost one-third of the unemployed had been unemployed for 27 weeks or longer.  According to the December jobs report, that percentage is now up to 37 percent, four times what it was before the pandemic.  And that figure seriously understates the problem, since many workers have given up looking for work; having dropped out of the workforce, they are no longer counted as unemployed.  The labor force participation rate is now 61.5 percent, down from 63.3 percent in February.

Dean Baker, quoted in a recent Marketplace story, underscores the importance of this development:

“This is obviously a story of people losing their job at the beginning of the crisis in March and April and not getting it back,” said Dean Baker, co-founder and senior economist with the Center for Economic and Policy Research.

Those out of work for 27 weeks or more make up a growing share of the unemployed, and that could have enduring consequences, Baker said.

“After people have been unemployed for more than six months, they find it much harder to get a job,” he said. “And if they do get a job, their labor market prospects could be permanently worsened.”

And tragically, the workers who have suffered the greatest job losses during this crisis are those who earned the lowest wages.

It is no wonder that growing numbers of working people are finding it difficult to meet their basic needs.

There is no way to sugarcoat this situation.  We need a significant stimulus package, a meaningful increase in the minimum wage, real labor law reform, a robust national single-payer health care system, and an aggressive Green New Deal-style program of public sector investment and jobs.  And there is no getting around the fact that it is going to take hard organizing and mutually supportive community and workplace actions to move the country in the direction it needs to go.

Profits over people: frontline workers during the pandemic

It wasn’t that long ago that the country celebrated frontline workers by banging pots in the evening to thank them for the risks they took doing their jobs during the pandemic. One national survey found that health care workers were the most admired (80%), closely followed by grocery store workers (77%), and delivery drivers (73%). 

Corporate leaders joined in the celebration. Supermarket News quoted Dacona Smith, executive vice president and chief operating officer at Walmart U.S., as saying in April:

We cannot thank and appreciate our associates enough. What they have accomplished in the last few weeks has been amazing to watch and fills everyone at our company with enormous pride. America is getting the chance to see what we’ve always known — that our people truly do make the difference. Let’s all take care of each other out there.

Driven by a desire to burnish their public image, deflect attention from their soaring profits, and attract more workers, many of the country’s leading retailers, including Walmart, proudly announced special pandemic wage increases and bonuses.  But as a report by Brookings points out, although their profits continued to roll in, those special payments didn’t last long.

There are three important takeaways from the report: First, don’t trust corporate PR statements; once people stop paying attention, corporations do what they want.  Second, workers need unions to defend their interests.  Third, there should be some form of federal regulation to ensure workers receive hazard pay during health emergencies like pandemics, similar to the laws requiring time and a half for overtime work.

The companies and their workers

In Windfall Profits and Deadly Risks, Molly Kinder, Laura Stateler, and Julia Du look at the compensation paid to frontline workers at, and profits earned by, 13 of the 20 biggest retail companies in the United States.  The 13, listed in the figure below, “employ more than 6 million workers and include the largest corporations in grocery, big-box retail, home improvement, pharmacies, electronics, and discount retail.” The seven left out “either did not have public financial information available or were in retail sectors that were hit hard by the pandemic (such as clothing) and did not provide COVID-19 compensation to workers.”

Pre-pandemic, the median wages for the main frontline retail jobs (e.g., cashiers, salespersons, and stock clerks) at these 13 companies generally ranged from $10 to $12 per hour (see the grey bar in the figure below).  The exceptions at the high end were Costco and Amazon, both of which had a minimum starting wage of $15 before the start of the pandemic. The exception at the low end was Dollar General, which the authors estimate had a starting wage of only $8 per hour.  

Clearly, these companies thrive on low-wage work.  And it should be added, disproportionately the work of women of color.  “Women make up a significantly larger share of the frontline workforce in general retail stores and at companies such as Target and Walmart than they do in the workforce overall. Amazon and Walmart employ well above-average shares of Black workers (27% and 21%, respectively) compared to the national figure of 12%.”

Then came the pandemic

Eager to take advantage of the new pandemic-driven business coming their way, all 13 companies highlighted in the report quickly offered some form of special COVID-19-related compensation in an effort to attract new workers (as highlighted in the figure below).  “Commonly referred to as ‘hazard pay,’ the additional compensation came in the form of small, temporary hourly wage increases, typically between $2 and $2.50 per hour, as well as one-off bonuses. In addition to temporary hazard pay, a few companies permanently raised wages for workers during the pandemic.”

Unfortunately, as the next figure reveals, these special corporate payment programs were short-lived.  Of the 10 companies that offered temporary hourly wage increases, 7 ended them before the beginning of July and the start of a new wave of COVID-19 infections. Moreover, even with these programs, nine of the 13 companies continued to pay wages below $15 an hour.  Only three companies instituted permanent wage hikes.  While periodic bonuses are no doubt welcome, they are impossible to count on and of limited dollar value compared with an increase in hourly wages.  So much for corporate caring!

Don’t worry about the companies

As the next figure shows, while the leading retail companies highlighted in the study have been stingy when it comes to paying their frontline workers, the pandemic has treated them quite well.  As the authors point out:

Across the 13 companies in our analysis, revenue was up an average of 14% over last year, while profits rose 39%. Excluding Walgreens—whose business has struggled during the pandemic—profits rose a staggering 46%. Stock prices rose on average 30% since the end of February. In total, the 13 companies reported 2020 profits to date of $67 billion, which is an additional $16.9 billion compared to last year.

Looking just at the six companies that made public the total cost of their extra compensation to workers, the authors found that the numbers “paint a picture of most companies prioritizing profits and wealth for shareholders over investments in their employees. On average, the six companies’ contribution to compensating workers was less than half of the additional profit earned during the pandemic compared to the previous year.”

This kind of scam, where companies publicly celebrate their generosity only to quietly withdraw it a short time later, is a common one.  And because it is hard to follow corporate policies over a period of months, companies are often able to convince the public that they really do care about the well-being of their workers.  That is why this study is important—it makes clear that relying on corporations to do the “right thing” is a losing proposition for workers.

COVID-19 Economic Crisis Snapshot

 Workers in the United States are in the midst of a punishing COVID-19 economic crisis.  Unfortunately, while a new fiscal spending package and an effective vaccine can bring needed relief, a meaningful sustained economic recovery will require significant structural changes in the operation and orientation of the economy.

The unemployment problem

Many people blame government mandated closure orders for the decline in economic activity and spike in unemployment.  But the evidence points to widespread concerns about the virus as the driving force.  As Emily Badger and Alicia Parlapiano describe in a New York Times article, and as illustrated in the following graphic taken from the article:

In the weeks before states around the country issued lockdown orders this spring, Americans were already hunkering down. They were spending less, traveling less, dining out less. Small businesses were already cutting employment. Some were even closing shop.

People were behaving this way — effectively winding down the economy — before the government told them to. And that pattern, apparent in a range of data looking back over the past two months, suggests in the weeks ahead that official pronouncements will have limited power to open the economy back up.

As the graphic shows, economic activity nosedived around the same time regardless of whether state governments were quick to mandate closings, slow to mandate closings, or unwilling to issue stay-at-home orders.

The resulting sharp decline in economic activity caused unemployment to soar. Almost 21 million jobs were lost in April, at the peak of the crisis.  The unemployment rate hit a high of 14.7 percent.  By comparison, the highest unemployment rate during the Great Recession was 10.6 percent in January 2010.

Employment began to recover the next month, with a gain of 2.8 million jobs in May.  In June, payrolls grew by an even larger number, 4.8 million.  But things have slowed dramatically since.  In July, only 1.8 million jobs came back.  In August it was 1.5 million.  And in September it was only 661,000.  To this point, only about half of the jobs lost have returned, and current trends are far from encouraging.
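
A quick back-of-the-envelope check, in Python, supports the “only half” claim.  Note that the text above cites only April’s losses; the March figure used here is an added assumption of roughly 1.4 million jobs lost:

# Rough check on "only half of the jobs lost have returned."
jobs_lost = 21.0 + 1.4                          # April losses plus assumed March losses, millions
jobs_regained = 2.8 + 4.8 + 1.8 + 1.5 + 0.661   # May through September, millions
print(f"{jobs_regained / jobs_lost:.0%}")       # ~52%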

The unemployment rate fell to 7.9 percent in September, a significant decline from April.  But a large reason for that decline is that millions of workers have given up working or looking for work and are no longer counted as being part of the labor force.  And, as Alisha Haridasani Gupta writes in the New York Times:

A majority of those dropping out were women. Of the 1.1 million people ages 20 and over who left the work force (neither working nor looking for work) between August and September, over 800,000 were women, according to an analysis by the National Women’s Law Center. That figure includes 324,000 Latinas and 58,000 Black women. For comparison, 216,000 men left the job market in the same time period.

The relationship between the fall in the unemployment rate and the worker exodus from the labor market is illustrated in the next figure, which shows both the unemployment rate and the labor force participation rate (LFPR).  The LFPR is calculated by dividing the number of people 16 and over who are either employed or actively seeking employment by the civilian noninstitutional population 16 and over.

The figure shows that even the relatively “low” September unemployment rate of 7.9 percent is still high by historical standards.  It also shows that the recent decline was aided by a drop in the LFPR to a level not seen since the mid-1970s.  If those who left the labor market were to once again seek employment, pushing the LFPR back up, the unemployment rate would also be pushed up to a much higher level unless the economic environment changed dramatically.
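
To make the mechanics concrete, here is a minimal Python sketch using hypothetical round numbers (not actual BLS data) of how the return of discouraged workers alone could push the unemployment rate back up:

# Hypothetical round numbers, for illustration only.
population = 260_000_000      # civilian noninstitutional population, 16 and over
lfpr = 0.614                  # assumed labor force participation rate
unemployment_rate = 0.079     # September 2020-style headline rate

labor_force = population * lfpr
unemployed = labor_force * unemployment_rate

# Suppose 4 million former participants re-enter the labor force
# and none find jobs right away.
returners = 4_000_000
new_rate = (unemployed + returners) / (labor_force + returners)
print(f"headline rate: {unemployed / labor_force:.1%}")  # 7.9%
print(f"rate after re-entry: {new_rate:.1%}")            # ~10.2%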

Beyond the aggregate figures is the fact, as Heather Long, Andrew Van Dam, Alyssa Fowers and Leslie Shapiro explain in a Washington Post article, that “No other recession in modern history has so pummeled society’s most vulnerable.”

As we can see in the above graphic, the 1990 recession was a relatively egalitarian affair, with all income groups suffering roughly similar declines in employment.  That changed during the recessions of 2001 and 2008, when the lowest-earning cohort suffered the most.  But, as the authors of the Washington Post article state, “even that inequality is a blip compared with what the coronavirus inflicted on low-wage workers this year.”  By the end of the summer, the employment crisis was largely over for the highest earners, while employment was still down more than 20 percent for low-wage workers and around 10 percent for middle-wage workers.

Poverty is on the rise

In line with this disproportionate hit suffered by low wage workers, the poverty rate has been climbing.  Five Columbia University researchers, using a monthly version of the Supplemental Poverty Measure (SPM), provide estimates of the monthly poverty rate from October 2019 through September 2020.  They found, as illustrated below, “that the monthly poverty rate increased from 15% to 16.7% from February to September 2020, even after taking the CARES Act’s income transfers into account. Increases in monthly poverty rates have been particularly acute for Black and Hispanic individuals, as well as for children.”

The standard poverty measure used by the federal government is an annual one, based on whether a family’s total annual income falls below a specified level.  It doesn’t allow for monthly calculations and is widely criticized for using an extremely low emergency food budget to set its poverty line.  The SPM, by contrast, uses a more complete and accurate measure of family resources, a more expansive definition of the family, and the cost of a broader basket of necessities, and it adjusts for differences in the cost of living across metro areas.

As we can see in the above figure, the $2.2 trillion Coronavirus Aid, Relief, and Economic Security (CARES) Act, which was passed by Congress and signed into law on March 27th, 2020, has had a positive effect on poverty levels.  For example, without it, the poverty rate would have jumped to 19.4 percent in April. “Put differently, the CARE Act’s income transfers directly lifted around 18 million individuals out of poverty in April.”
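
A back-of-the-envelope calculation shows what these numbers imply for the April poverty rate with the CARES Act in place; the population figure below is a rough assumption, not taken from the study:

# Implied April 2020 poverty rate with CARES transfers (rough sketch).
counterfactual_rate = 0.194   # without CARES, per the study
lifted_out = 18e6             # individuals lifted out of poverty in April
us_population = 328e6         # assumed U.S. population
implied_rate = counterfactual_rate - lifted_out / us_population
print(f"{implied_rate:.1%}")  # ~13.9%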

However, as we can also see, the positive effects of the CARES Act have gradually dissipated.  The Economic Impact Payments (“Recovery Rebates”) were one-time payments.  The $600 per week unemployment supplement expired at the end of July.  Thus, the gap between the monthly SPM with and without the CARES Act has gradually narrowed.  And, with job creation dramatically slowing, without a new federal stimulus measure it is likely we will not see much improvement in the poverty rate in the coming months.  In fact, if working people continue to leave the labor market out of discouragement and the pressure of home responsibilities, there is a good chance the poverty rate will climb again.

It is also important to note that the rise in monthly rates of poverty, even with the CARES Act, differs greatly by race/ethnicity as illustrated in the following figure.

The need to do more

Republican opposition to a new stimulus ensures that there will be no follow-up to the CARES Act before the upcoming election.  Opponents claim that the federal government has already done enough and that the economy is well on its way to recovery.

As for the size of the stimulus, the United States has been a laggard when it comes to its fiscal response to the pandemic.  The OECD recently published an interim report titled “Coronavirus: Living with uncertainty.”  One section of the report examines fiscal support as a percent of 2019 GDP for nine countries.  As the following figure shows, the United States trails every country but Korea when it comes to direct support for workers, firms, and health care.

A big change is needed

While it is natural to view COVID-19 as responsible for our current crisis, the truth is that our economic problems are longer-term.  The U.S. economy has been steadily weakening for years.  In the figure below, the “trend” line is based on the 2.1% average rate of growth in real per capita GDP from 1970 to 2007, the year before the Great Recession.  Not surprisingly, real per capita GDP took a big hit during the Great Recession.  But as we can also see, real per capita GDP has yet to return to its historical trend.  In fact, the gap has grown larger despite the record-long recovery that followed.

As Doug Henwood explains:

Since 2009, the growth rate has averaged 1.6%. Last year [2019], which Trump touted as the greatest economy ever, it managed to get back to the pre-2008 average of 2.1%, an average that includes two deep recessions (1973–1975 and 1981–1982).

At the end of 2019, actual [real GDP per capita] was 13% below trend. At the end of the 2008–2009 recession it was 9% below trend. Remarkably, despite a decade-long expansion, it fell further below trend in well over half the quarters since the Great Recession ended. The gap is now equal to $10,200 per person—a permanent loss of income, as economists say. 

The pre-coronavirus period of expansion (June 2009 to February 2020), although the longest on record, was also one of the weakest.  It was marked by slow growth, weak job creation, deteriorating job quality, declining investment, rising debt, declining life expectancy, and narrowing corporate profit margins.  In other words, the economy was heading toward recession even before the start of state-mandated lockdowns.  The manufacturing sector actually spent much of 2019 in recession.

Thus, there is strong reason to believe that a meaningful sustained recovery from the current COVID-19 economic crisis is going to require more than the development of an effective vaccine and a responsive health care system to ensure its wide distribution.  Also needed is significant structural change in the operation and orientation of the economy.

Times remain hard, especially for low-wage workers

The current economic crisis has hit workers hard.  Unemployment remains high, and total weekly initial claims for unemployment insurance benefits continue to grow.  Recent reports of a sharp rise in median earnings for full-time workers appear to complicate the picture.  However, a more detailed examination of worker earnings and employment not only sharpens our understanding of the devastating nature of the current crisis for working people, but also makes clear that low-wage workers have been the hardest hit.

Earnings growth

The Labor Department recently published data showing wages skyrocketing.  As Federal Reserve Bank of San Francisco researchers reported in a recent Economic Letter:

Recent data show that median usual weekly earnings of full-time workers have grown 10.4 percent over the four quarters preceding the second quarter of 2020. This is a 6.4 percentage point acceleration compared with the fourth quarter of 2019. The median usual weekly earnings measure that we focus on here is not an exception. Other measures of wage growth—like average hourly earnings and compensation per hour—show similar spikes.

The spike can be seen in the movement of the blue line in the figure below (taken from the Economic Letter).  As we can see, nominal median weekly earnings of full-time workers grew by 10.4 percent between the spring of 2019 and the spring of 2020, the fastest rate of growth in nearly 40 years.

While this earnings trend suggests a strong labor market, it is, as the researchers correctly note, highly misleading.  The reason is that the measure has been distorted by the massive loss of jobs disproportionately suffered by low-wage full-time workers.  The decline in the number of full-time low-wage workers has been large enough to change the earnings distribution, steadily raising the median earnings of the full-time workers who remain.

In other words, the spike in median earnings is not the result of currently employed workers enjoying significant wage gains.  This becomes clear when we adjust for the decline in employment by considering only the nominal median earnings of those workers who remained employed full-time throughout the past year.  As the downward movement of the green line in the above figure shows, the gains in median earnings for those continuously employed have been small and falling.
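
A toy simulation makes this composition effect concrete.  The earnings figures below are invented for illustration; nothing here comes from the Economic Letter itself:

import statistics

# Stylized weekly earnings for ten full-time workers (hypothetical dollars).
earnings = [400, 450, 500, 600, 700, 800, 900, 1000, 1200, 1500]
print(statistics.median(earnings))   # 750.0

# Suppose the pandemic pushes the three lowest-paid workers out of
# full-time work, while nobody who stays employed gets a raise.
survivors = earnings[3:]
print(statistics.median(survivors))  # 900 -- the median "rises" 20% anyway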

Disproportionate job losses for full-time low-wage workers

The researchers confirmed that low-wage workers have disproportionately suffered job losses by calculating the earnings distribution of the full-time workers forced to exit each month over the past two decades to what they call “nonemployment,” by which they mean either unemployment or nonparticipation in the labor force.

They began by estimating the yearly share of full-time workers exiting to unemployment and nonparticipation.  As we see in the figure above, in non-recession years about 7 percent of those with full-time jobs become nonemployed each year—2 percent become unemployed and 5 percent leave the labor force.  During the Great Recession, nonemployment peaked in August 2009 at 11 percent, with most of the increase driven by a sharp rise in unemployment (shown by the big bump in the green area).  There was little change in the rate at which full-time workers dropped out of the labor force.

The severity of our current crisis is captured by the dramatic rise in the share of workers exiting full-time employment beginning in March 2020.  Exits to nonemployment peaked in May 2020 at 17 percent, with 9 percent moving to unemployment and 8 percent to nonparticipation.  Not only is this almost twice as high as during the Great Recession; the extremely challenging state of the labor market is also underscored by the fact that the share of exiting workers who chose nonparticipation, and thus left the labor force entirely, was almost as great as the share who remained in the labor force and were classified as unemployed.

The next figure shows the share of workers exiting to nonemployment by their position in the wage distribution. The three areas depict exits by workers in the lowest quarter of the earnings distribution, the second lowest quarter, and the top half, respectively.

As the researchers explain,

In the months following the onset of COVID-19, workers in the bottom 25 percent of the earnings distribution made up about half of the exits to nonemployment. In contrast, the top half of the distribution only accounted for about a third of the exits. . . .

Therefore, the recent spike in aggregate nominal wage growth does not reflect the benefits of pay raises and a strong labor market for workers. Instead, it is the result of the high levels of job loss among low-income workers since the start of the pandemic.

Tragically, low-wage workers have not only suffered disproportionate job losses during this pandemic.  Those who remain employed are increasingly being victimized by wage theft.  As Igor Derysh, writing in Salon, notes: “A paper released this week by the . . . Washington Center for Equitable Growth found that minimum wage violations have roughly doubled compared to the period before the pandemic.”

These are indeed hard times for almost all working people but, perhaps not surprisingly, those at the bottom of the wage distribution are suffering the most.

Big tech support for racial justice is more talk than action

In the month following the May 25th death of George Floyd, the largest technology companies collectively pledged more than a billion dollars in support of racial justice.  Sounds like a lot of money, but for these companies it is pocket change.  And, despite the accompanying corporate statements of support for structural change to fight racism, there is little indication that they plan to back up their words with meaningful action.

Big tech is riding high

In early June Apple announced the launch of a $100 million Racial Equity and Justice Initiative to “promote racial equality for people of color with a focus on ‘education, economic equality, and criminal justice reform.’”  But, as Jay Peters, writing in The Verge, makes clear, the amount doesn’t sound so impressive when you consider Apple’s earnings.

Apple is now the world’s most valuable company.  Apple made $6.3 million in profit every single hour in 2019, which means that its initiative cost it about 16 hours of business on one day of the year.
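
The arithmetic is easy to verify from the figures Peters cites:

# Hours of Apple profit represented by the $100 million pledge.
profit_per_hour = 6.3e6   # Apple's 2019 profit per hour, per Peters
pledge = 100e6            # Racial Equity and Justice Initiative
print(f"{pledge / profit_per_hour:.0f} hours")  # ~16 hours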

And despite the current recession, big tech appears set to earn more this year than last. “Right now, it’s big tech’s world and everyone else is paying rent,” said Wedbush Securities analyst Dan Ives. “They are consumer staples now and this crisis has bought their growth forward by about two years.”

Combined, Amazon, Apple, Alphabet and Facebook reported revenue of $206 billion and net income of $29 billion in the three months ending in late June 2020.  As the New York Times summarized:

Amazon’s sales were up 40 percent from a year ago and its profit doubled. Facebook’s profit jumped 98 percent. Even though the pandemic shuttered many of its stores, Apple increased sales of all its products in every part of the world and posted $11.25 billion in profit. Advertising revenue dropped for Alphabet, the laggard of the bunch, but it still did better than Wall Street had expected.

Very modest giving

To put tech company racial justice donations in perspective, Peters calculated what the equivalent giving would be for a person earning the median U.S. salary of $63,179.  The calculation was based on the size of the corporate donation relative to company revenue, not profits, since the $63,179 is the median worker’s gross salary rather than disposable income.  As the following figure shows, recent corporate donations are indeed quite modest.

If someone earning the median U.S. salary donated the same percentage of their salary to racial justice as Amazon, that person would be contributing just $4.17 a year.  The equivalent annual donation would also be under $5 for Dell, Intel, Disney, and Verizon.  Even for Facebook, the biggest giver, the equivalent would be only $100.  It would take Dell 6 minutes to recoup its pledge, Intel 35 minutes, and Disney and Verizon less than 5 hours.
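
Peters’s method can be sketched in a few lines of Python.  The pledge and revenue figures below are hypothetical placeholders, not any particular company’s numbers:

# Scale a corporate pledge down to a median earner's salary.
MEDIAN_US_SALARY = 63_179

def equivalent_donation(pledge: float, annual_revenue: float) -> float:
    return pledge / annual_revenue * MEDIAN_US_SALARY

# Illustrative only: a $10 million pledge against $280 billion in revenue.
print(f"${equivalent_donation(10e6, 280e9):.2f}")  # ~$2.26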

And as highlighted above, the reason for such modest giving is not low profits.  The figure below shows the pledged amount for racial justice by major U.S. tech companies and their annual profit.

As Peters commented:

Frankly, a lot of these contributions seem even tinier when you consider how much these companies tend to spend on other things. AT&T reportedly spent $73 million on a single campaign to advertise its fake 5G network, which is more than three times its commitment to Black lives. At $7 to $11 million per episode, Amazon would have been hard-pressed to produce three episodes of its alternate reality Nazi-fighting show The Man in the High Castle with the money it’s pledged since Floyd’s death. Microsoft spent over $100 million trying to reinvent the Xbox gamepad only to wind up nearly all the way back where it started.

Money isn’t everything

Of course, there are other things companies can do to promote racial equality. One is to change their hiring policies.  For example, the share of Black employees is just 3 percent at Google and 9 percent at Apple.  And beyond increasing numbers, it is essential that tech companies also reconsider how they organize and compensate the work of their Black employees.

An even more important action tech companies could take would be to listen to their workers and BIPOC leaders and reconsider the nature of the goods and services they choose to develop and sell.  Johana Bhuiyan, writing in the LA Times, highlights the contrast between corporate statements in opposition to racism and corporate profit-driven production priorities to illustrate what is at stake.  Here is her portrait of Amazon:

What [Amazon] said: “The inequitable and brutal treatment of black people in our country must stop. Together we stand in solidarity with the black community — our employees, customers, and partners — in the fight against systemic racism and injustice.”

What the record shows: At the center of the protests demanding justice for Floyd are calls for police reform and an end to racist policing. Amazon has several contracts with law enforcement agencies. Of particular note, Ring, Amazon’s home surveillance company, has partnerships with at least 200 police departments across the country, as Motherboard has reported. As part of its contract with some police departments, Ring incentivized police to encourage citizens to adopt the company’s neighborhood watch app — which has reported issues with racial profiling. After reviewing more than 100 posts on the app, Motherboard found that the majority of people who users deemed “suspicious” were people of color.

“Given the reality of police violence, with impunity, impacting primarily people of color in the United States, these kinds of acts threaten the lives of third parties who are simply, in some cases, doing their jobs or living in their own neighborhoods,” Shahid Buttar, director of grass-roots advocacy for the Electronic Frontier Foundation, told Motherboard.

Amazon also licenses facial-recognition software, called Rekognition, to law enforcement agencies. A study by the MIT Media Lab found that the software performed worse at identifying the gender of individuals with dark faces, although Amazon contested the validity of the findings. Other facial-recognition algorithms have struggled to accurately identify non-white faces.

We shouldn’t forget that it is the strength of the Black Lives Matter movement that pushed corporations to project themselves as supporters of racial justice and make their well-publicized donations.  And it is better to have them promoting racial equality than opposing it.  But to this point, corporate actions remain largely limited to public relations statements.  Since real change will require a fundamental rethinking of the organization and aims of corporate production, we shouldn’t count on CEOs going beyond that in any meaningful sense in the near future.  At the same time, as the movement for change grows both inside leading tech companies and in the broader community, we shouldn’t discount the possibility of winning meaningful shifts in corporate policy.

Defunding police and challenging militarism, a necessary response to their “battle space”

The excessive use of force and killings of unarmed Black Americans by police have fueled a popular movement for slashing police budgets, reimagining policing, and directing the freed funds to community-based programs that provide medical and mental health care, housing, and employment support to those in need.  This is a long overdue development.

Police are not the answer

Police budgets rose steadily from the 1990s to the Great Recession and, despite the economic stagnation that followed, have remained largely unchanged.  This trend is highlighted in the figure below, which shows real median per capita spending on police in the 150 largest U.S. cities.  That spending grew, adjusted for inflation, from $359 in 2007 to $374 in 2017.  The contrast with state and local government spending on social programs is dramatic.  From 2007 to 2017, median per capita spending on housing and community development fell from $217 to $173, while spending on public welfare programs fell from $70 to $47.
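
Expressed as percentage changes, using only the figures just cited, the contrast is stark:

# Real median per-capita spending, 150 largest U.S. cities (dollars).
changes = {
    "police": (359, 374),
    "housing and community development": (217, 173),
    "public welfare": (70, 47),
}
for category, (y2007, y2017) in changes.items():
    pct = (y2017 - y2007) / y2007 * 100
    print(f"{category}: {pct:+.0f}% from 2007 to 2017")
# police: +4%, housing: -20%, public welfare: -33%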

Thus, as economic developments over the last three decades left working people confronting weak job growth, growing inequality, stagnant wages, declining real wealth, and rising mortality rates, these funding priorities ensured that the resulting social problems would increasingly be treated as policing problems.  And, in line with other powerful trends that shaped this period–especially globalization, privatization, and militarization–police departments were encouraged to meet their new responsibilities by transforming themselves into small, heavily equipped armies whose purpose was to wage war against those they were supposed to protect and serve.

The military-to-police pipeline

The massive, unchecked militarization of the country and its associated military-to-police pipeline was one of the more powerful factors promoting this transformation.  The Pentagon, overflowing with military hardware and eager to justify a further modernization of its weaponry, initiated a program in the early 1990s that allowed it to provide surplus military equipment free to law enforcement agencies, allegedly to support their “war on drugs.”  As a Forbes article explains:

Since the early 1990s, more than $7 billion worth of excess U.S. military equipment has been transferred from the Department of Defense to federal, state and local law enforcement agencies, free of charge, as part of its so-called 1033 program. As of June [2020], there are some 8,200 law enforcement agencies from 49 states and four U.S. territories participating. 

The program grew dramatically after September 11, 2001, justified by government claims that the police needed to strengthen their ability to combat domestic terrorism.  As an example of the resulting excesses, the Los Angeles Times reported in 2014 that the Los Angeles Unified School District and its police officers were in possession of three grenade launchers, 61 automatic military rifles and a Mine Resistant Ambush Protected armored vehicle. Finally, in 2015, President Obama took steps to place limits on the items that could be transferred; tracked armored vehicles, grenade launchers, and bayonets were among the items that were to be returned to the military.

President Trump removed those limits in 2017, and the supplies are again flowing freely, including armored vehicles, riot gear, explosives, battering rams, and yes, once again bayonets.  According to the New York Times, “Trump administration officials said that the police believed bayonets were handy, for instance, in cutting seatbelts in an emergency.”

Outfitting police departments for war also encouraged different criteria for recruiting and training.  For example, as Forbes notes, “The average police department spends 168 hours training new recruits on firearms, self-defense, and use of force tactics. It spends just nine hours on conflict management and mediation.”  Arming and training police for military action leads naturally to the militarization of police relations with community members, especially Black, Indigenous, and other people of color, who come to play the role of an enemy that must be controlled or, if conditions warrant, destroyed.

In fact, the military has become a major cheerleader for domestic military action.  President Trump, on a call with governors after the start of demonstrations protesting the May 25, 2020 killing of George Floyd while in police custody, exhorted them to “dominate” the street protests.

As the Washington Examiner reports:

“You’ve got a big National Guard out there that’s ready to come and fight like hell,” Trump told governors on the Monday call, which was leaked to the press.

[Secretary of Defense] Esper lamented that only two states called up more than 1,000 Guard members of the 23 states that have called up the Guard in response to street protests. The National Guard said Monday that 17,015 Guard members have been activated for civil unrest.

“I agree, we need to dominate the battle space,” Esper said after Trump’s initial remarks. “We have deep resources in the Guard. I stand ready, the chairman stands ready, the head of the National Guard stands ready to fully support you in terms of helping mobilize the Guard and doing what they need to do.”

The militarization of the federal budget

The same squeeze of social spending and support for militarization is being played out at the federal level.  As the National Priorities Project highlights in the following figure, the United States has a military budget greater than the next ten countries combined.

Yet this dominance has done little to slow the military’s growing hold over federal discretionary spending.  At $730 billion, military spending accounts for more than 53 percent of the federal discretionary budget.  A slightly broader measure, what the National Priorities Project calls the militarized budget, actually accounts for almost two-thirds of the discretionary budget.  The militarized budget:

includes discretionary spending on the traditional military budget, as well as veterans’ affairs, homeland security, and law enforcement and incarceration. In 2019, the militarized budget totaled $887.8 billion – amounting to 64.5 percent of discretionary spending. . . . This count does not include forms of militarized spending allocated outside the discretionary budget, including mandatory spending related to veterans’ benefits, intelligence agencies, and interest on militarized spending.
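
Taken together, the report’s own numbers pin down the size of what remains for everything else:

# Total discretionary budget implied by the militarized share.
militarized = 887.8e9   # 2019 militarized discretionary spending
share = 0.645           # militarized share of discretionary spending
total = militarized / share
print(f"total: ${total / 1e9:.0f} billion")                          # ~$1,376 billion
print(f"non-militarized: ${(total - militarized) / 1e9:.0f} billion")  # ~$489 billion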

The militarized budget has been larger than the non-militarized budget every year since 1976.  But the gap between the two has grown dramatically over the last two decades. 

In sum, the critical ongoing struggle to slash police budgets and reimagine policing needs to be joined to a larger movement against militarism in general if we are to make meaningful improvements in the living and working conditions of the majority.