Pandemic economic woes continue, but so do deep structural problems, especially the long-term growth in the share of low-wage jobs

Many are understandably alarmed about what the September 4th termination of several special federal pandemic unemployment insurance programs will mean for millions of workers.  Twenty-five states ended their programs months earlier, with government and business leaders claiming that their termination would spur employment and economic activity.  However, several studies have disproved their claims.

One study, based on the experience of 19 of these states, found that for every 8 workers who lost benefits, only one found a new job.  Consumer spending in those states fell by $2 billion, with every lost $1 of benefits leading to a fall in spending of 52 cents.  It is hard to see how anything good can come from the federal government’s willingness to allow these programs to expire nationwide. 

The Biden administration appears to believe that adoption of its physical infrastructure bill and $3.5 trillion spending plan will ensure that those left without benefits will find new jobs.  But chances for Congressional approval are growing dim.  Even more importantly, and largely overlooked in the debate over whether the time is right to replace the pandemic unemployment insurance programs with new spending measures, is that an increasing share of the jobs created by economic growth are low-wage, and thus inadequate to ensure workers and their families an acceptable standard of living. 

For example, according to another study, the share of low-wage jobs has been growing steadily since 1979.  More specifically, the share of workers (18-64 years of age) with a low-wage job rose from 39.1 percent in 1979 to 45.2 percent in 2017.  For workers 18 to 34 without a college degree, the share soared from 46.9 percent to 61.6 percent over the same years. Thus, a meaningful improvement in worker well-being will require far more than a return to “normal” labor market conditions.  It will require building a movement able to directly challenge and transform the way the US economy operates.  

The importance of government programs

The figure below provides some sense of how important government programs have been to working people.  Government support was truly a lifeline, delivering a significant boost to total monthly personal income (relative to the February 2020 start of the pandemic-triggered recession), especially during the first months.  Even now, although the recession has officially been declared over, that support still accounts for approximately half the increase in total monthly income.   

The government’s support of personal income was anchored by three special unemployment insurance programs–the Federal Pandemic Unemployment Compensation (FPUC), Pandemic Emergency Unemployment Compensation (PEUC), and Pandemic Unemployment Assistance (PUA). 

The FPUC was authorized by the March 2020 CARES Act and renewed by subsequent legislation and a presidential order. It originally provided $600 per week in extra unemployment benefits to unemployed workers in states that opted in to the program. In August 2020, the extra payment was lowered to $300.

The PEUC was also established by the CARES Act. It provided up to 13 weeks of extended unemployment compensation to individuals who had exhausted their regular unemployment insurance compensation.  This was later extended to 24 weeks, and then by a further 29 weeks, allowing for a total of 53 weeks.  The PUA allowed states to provide unemployment assistance to the self-employed and those seeking part-time employment, or who otherwise did not qualify for regular unemployment compensation.

Tragically, the federal government allowed all three programs to expire on September 4th. Months earlier, in June 2021, 25 states ended these programs for their unemployed workers, eliminating benefits for over 2 million people.  Several studies, as we see next, have documented the devastating cost of that decision. 

The cost of state program termination

Beginning in April 2021, a number of business analysts and politicians began to aggressively argue that federally provided unemployment benefit programs were no longer needed.  In fact, according to them, the programs were actually keeping workers from pursuing available jobs, thereby holding back the country’s economic recovery. Using these arguments as cover, in June, 25 states ended their participation in one or more of these programs. 

For example, Henry McMaster, the governor of South Carolina, announced his decision to end his state’s participation in the federal programs, saying: “This labor shortage is being created in large part by the supplemental unemployment payments that the federal government provides claimants on top of their state unemployment benefits.”

Similarly, Tate Reeves, the governor of Mississippi, stated in a May 2021 tweet:

It has become clear to me that we cannot have a full economic recovery until we get the thousands of available jobs in our state filled. . . . Therefore, I have informed the Department of Employment Security to direct the Biden Administration that Mississippi will be opting out of the additional federal unemployment benefits as early as federal law allows—June 12, 2021.

The argument that these special federal unemployment benefit programs hurt employment and economic activity was tested and found wanting.  Business Insider highlights the results of several studies:

Economist Peter Ganong, who co-authored a paper that found the disincentive effect of benefits was small, told the [Wall Street] Journal: “If the question is, ‘Is UI [unemployment insurance] the key thing that’s holding back the labor market recovery?’ The answer is no, definitely not, based on the available data.” 

That aligns with other early research on the impact of benefits ending. CNBC reports that analyses from payroll firms UKG and Homebase both found that employment didn’t go up in the states cutting off the benefits; in fact, that Homebase analysis found that employment declined in the states opting out of federal benefits, while it went up in states that chose to retain benefits. In June, Indeed’s Hiring Lab found that job searches in states ending benefits were below April’s baseline.

In July, Arindrajit Dube, an economics professor at University of Massachusetts Amherst, found that ending benefits didn’t make workers rush back. “Even as there was a clear reduction in the number of people who were receiving unemployment benefits — and a clear increase in the number of people who said that they were having difficulty paying their bills — that didn’t seem to translate, at least in the short run, into an uptick in overall employment rates,” Dube told Insider at the time.

Dube, along with five other researchers, examined “the effect of withdrawing pandemic UI on the financial and employment trajectories of unemployed workers in [19] states that withdrew benefits, compared to workers with the same unemployment duration in states that retained these benefits.” 

They found, as noted above, that for every 8 workers who lost their benefits, only 1 found a new job.  And for every $1 of reduced benefits, spending fell by 52 cents—only 7 cents of new income was generated for each dollar of lost benefits. “Extrapolating to all UI recipients in the early withdrawal states, we estimate these states eliminated $4 billion in unemployment benefits paid by federal transfers as of August 6 [2021].  Spending fell by $2 billion and earnings rose by $270 million.  These states therefore saw a much larger drop in federal transfers than gains from job creation.”
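For readers who want to check the arithmetic, here is a minimal sketch, in Python, that reproduces the study’s headline extrapolation from the per-dollar estimates quoted above; the variable names and rounding are mine, not the authors’.

```python
# Back-of-the-envelope reproduction of the extrapolation quoted above.
# All inputs come from the study as cited; this is an illustration of the
# arithmetic, not the authors' actual estimation procedure.

benefits_cut = 4.0e9                     # benefits eliminated as of Aug 6, 2021
spending_change = -0.52 * benefits_cut   # 52 cents of spending lost per $1 cut
earnings_change = 0.07 * benefits_cut    # 7 cents of new earnings per $1 cut

print(f"Spending fell by about ${-spending_change / 1e9:.2f} billion")  # ~$2.1B
print(f"Earnings rose by about ${earnings_change / 1e6:.0f} million")   # ~$280M
```

The small gaps between these outputs and the reported $2 billion and $270 million reflect rounding in the published per-dollar figures.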

An additional 8 million workers have now lost benefits because of the federal termination of these special unemployment insurance programs.  It is hard to be optimistic about what awaits them, given the experience of the early termination states.  And equally important, even if the “optimists” are proven right, and those workers are able to find employment, there is still reason for concern about the likely quality of those jobs given long-term employment trends.

The lack of decent jobs

There is no agreed-upon definition of a low-wage job.  David R. Howell and Arne L. Kalleberg note two of the most popular in their study of declining job quality in the United States.  One is to define low-wage jobs as those that pay less than two-thirds of the median hourly wage.  The other, used by the OECD, is to define low-wage jobs as those that pay less than two-thirds of the median hourly wage for full-time workers.

Howell and Kalleberg find both inadequate.  Instead, they define low-wage jobs as those that pay less than two-thirds of the mean hourly wage for full-time prime-age workers (35-59).  Their definition sets the dividing line between low-wage and what they call “decent” wage jobs at $17.50 in 2017.  As they explain:

This wage is well above the wage that would make a full-time (or near full-time) worker eligible for food stamps and several dollars above the basic needs budget for a single adult in most American cities, but is conservative in that the basic needs budget for a single adult with one child ranges from $22 to $30.

The figure below, based on their definition, shows the growth in low-wage jobs for workers 18-34 years of age without a college degree (in blue), all workers 18-64 years of age (in gold), and prime-age workers 35-59 years of age (in green).  Their dividing line between low-wage and decent-wage jobs, equivalent to $17.50 in 2017, is far from a generous wage.  Yet, all three groupings show an upward trend in the share of low-wage jobs.  

The authors then divide their low-wage and decent-wage categories into upper and lower tiers.  The lower tier of the low-wage category includes jobs that pay less than two-thirds of the median wage for full-time workers, which equaled $13.33 in 2017.  As the authors report:

Based on evidence from basic needs budgets, this is a wage that, even on a full-time basis, would make it extremely difficult to support a minimally adequate standard of living for even a single adult anywhere in the country. This wage threshold ($13.33) is just above the wage cutoff for food stamps ($12.40) and Medicaid ($12.80) for a full-time worker (thirty-five hours per week, fifty weeks per year) with a child; full-year work at thirty hours per week would make a family of two eligible for the food stamps with a wage as high as $14.46 and as high as $14.94 for Medicaid.  For this reason, we refer to this as the poverty-wage threshold.

The lower tier of the decent-wage category includes jobs that pay no more than 50 percent above the decent-job threshold, a cutoff that equaled $26.50 in 2017.  The figure below shows the overall job distribution in 2017.
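To keep the resulting four categories straight, here is a minimal sketch, using the 2017 cutoffs reported above, of how a given hourly wage would be classified.  The function and tier labels are my own illustration, not Howell and Kalleberg’s code.

```python
# Illustrative classifier built from the 2017 thresholds reported above.
# The cutoffs follow Howell and Kalleberg as described in the text; the
# function itself is a sketch, not the authors' own implementation.

POVERTY_WAGE = 13.33   # lower-tier low-wage cutoff (the poverty-wage threshold)
DECENT_WAGE = 17.50    # dividing line between low-wage and decent-wage jobs
UPPER_DECENT = 26.50   # cutoff between lower- and upper-tier decent jobs

def job_tier(hourly_wage: float) -> str:
    """Assign a 2017 hourly wage to one of the four tiers described above."""
    if hourly_wage < POVERTY_WAGE:
        return "low wage, lower tier (poverty wage)"
    if hourly_wage < DECENT_WAGE:
        return "low wage, upper tier"
    if hourly_wage < UPPER_DECENT:
        return "decent wage, lower tier"
    return "decent wage, upper tier"

for wage in (11.00, 15.00, 20.00, 30.00):
    print(f"${wage:.2f}/hr -> {job_tier(wage)}")
```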

The following table shows the changing distribution of jobs over the years 1979 to 2017 for all workers 18 to 64, for workers 18-34 without a college degree, and for workers 18-34 with a college degree.

While the share of upper-tier decent jobs held by workers 18 to 64 has remained relatively stable, there has been a notable decline in the share of workers with lower-tier decent jobs.  Also worth noting is the rise in the share of poverty-level low-wage jobs. 

Perhaps most striking is the large decline in the share of decent jobs held by workers 18 to 34, both those with and those without a college degree.  The share of poverty-level jobs held by those without a college degree soared from 35.7 percent to 53.5 percent.  The share of low-wage jobs also spiked for those with a college degree, rising from 22 percent to 39.1 percent, with an increase in the share of both low-wage tiers.

This long-term decline in job quality will not reverse on its own.  And, not surprisingly, corporate leaders remain largely opposed to policies that might threaten the status quo.

So, do we need a better unemployment insurance system? For sure.  Do we need a better funded and more climate resilient social and physical infrastructure?  Definitely.  But we also need a dramatically different economy, one that, in sharp contrast to our current system, is grounded in greater worker control over both the organization and aims of production.  Lots of work ahead.

Learning from history: community-run child-care centers during World War II

We face many big challenges.  And we will need strong, bold policies to meaningfully address them.  Solving our child-care crisis is one of those challenges, and a study of World War II government efforts to ensure accessible and affordable high-quality child care points the way to the kind of bold action we need. 

The child care crisis

A number of studies have established that high-quality early childhood programs provide significant community and individual benefits.  One found that “per dollar invested, early childhood programs increase present value of state per capita earnings by $5 to $9.”  Universal preschool programs have also been shown to offer significant benefits to all children, even producing better outcomes for the most disadvantaged children than means-tested programs.  Yet, even before the pandemic, most families struggled with a lack of desirable child-care options.    

The pandemic has now created a child-care crisis. As Lisa Dodson and Mary King point out: “By some estimates, as many as 4.5 million child-care ‘slots’ may be permanently lost and as many as 40 percent of child-care providers say they will never reopen.”  The lack of child care is greatly hindering our recovery from the pandemic.  Women suffered far greater job losses than men during 2020, including as child-care workers, and the child-care crisis has made it difficult for many working mothers to return to the labor force.  The cost goes beyond the immediate family hardship from lost income; there is strong evidence that a sustained period without work, the so-called employment gap, will result in significantly lower lifetime earnings and reduced retirement benefits.  

To his credit, President Biden has recognized the importance of strengthening our care economy.  His proposed American Families Plan includes some $225 billion in tax credits to help make child care more affordable for working families.  According to a White House fact sheet, families would “receive a tax credit for as much as half of their spending on qualified child care for children under age 13, up to a total of $4,000 for one child or $8,000 for two or more children. . . . The credit can be used for expenses ranging from full-time care to after school care to summer care.”

But tax credits don’t ensure the existence of convenient, affordable, high-quality child-care facilities staffed by well-paid and trained child-care providers.  And if that is what we really want, we will need to directly provide it.  That is what the government did during World War II.  While its program was far from perfect, in part because it was designed to be short-term, it provides an example of the type of strong, bold action we will need to overcome our current child-care crisis. 

Federal support for child care

During World War II the United States government financed a heavily-subsidized child-care program.  From August 1943 through February 1946, the Federal Works Agency (FWA), using Lanham Act funds, provided some $52 million in grants for child-care services (equal to more than $1 billion today) to any approved community group that could demonstrate a war-related need for the service.  At its July 1944 peak, 3,102 federally subsidized child-care centers, with some 130,000 children enrolled, operated throughout the country.  There was at least one center in every state but New Mexico, which decided against participation in the program.  By the end of the war, between 550,000 and 600,000 children received some care from Lanham Act funded child-care programs.  

Communities were allowed to use the federal grant money to cover most of the costs involved in establishing and running their centers, including facilities construction and upkeep, staff wages and most other daily operating costs.  They were required to provide some matching funds, most of which came from fees paid by the parents of children enrolled in the program.  However, these fees were capped. In the fall of 1943, the FWA established a ceiling on fees of 50 cents per child per day (about $7 now), which was raised to 75 cents in July 1945. And those fees included snacks, lunch, and in some cases dinner as well. Overall, the federal subsidy covered two-thirds of the total maintenance and operation of the centers.

The only eligibility requirement for enrollment was a mother’s employment status: she had to be working at a job considered important to the war effort, and this was not limited to military production. Center hours varied, but many accommodated the round-the-clock manufacturing schedule, staying open 24 hours a day, 6 days a week. 

The centers served preschoolers (infants, toddlers, and children up to 5 years of age) and school-age children (6 to 14 years of age). In July 1944, approximately 53,000 preschoolers and 77,000 school-age children were enrolled.  School-age enrollment always climbed during summer vacation.  However, in most months, preschoolers made up the majority of the children served by Lanham Act-funded centers. Enrollment of preschoolers peaked at some 74,000 in May 1945. 

Some 90 percent of the centers were housed in public schools, with newly constructed housing projects providing the next most used location. Although local school boards were free to decide program standards–including staff-child ratios, worker qualifications, and facility design–state boards of education were responsible for program supervision. The recommended teacher-child ratio was 10-to-1, and most centers complied.  According to Chris M. Herbst,

Anecdotal evidence suggests that preschool-aged children engaged in indoor and outdoor play; used educational materials such as paints, clay, and musical instruments; and took regular naps. . . . Programs for school-aged children included . . . outdoor activities, participation in music and drama clubs, library reading, and assistance with schoolwork. 

Children at a child-care center sit for “story time.” (Gordon Parks / Library of Congress / The Crowley Company)

While quality did vary–largely the result of differences in community support for public child care, the willingness of cities to provide additional financial support, and the ability of centers to hire trained professionals to develop and oversee program activities–the centers did their best to deliver a high-quality childhood education.  As Ruth Pearson Koshuk, the author of a 1947 study of the developmental records of 500 children, 2 to 5 years of age, at two Los Angeles County centers, describes:

In these two . . . schools, as elsewhere, the program has developed since 1943, toward recognized standards of early childhood education. The aim has been to apply the best of existing standards, and to maintain as close contact with the home as possible. In-service training courses carrying college credit have been given, for the teaching staff, and a mutually helpful parent education program carried on in spite of difficulties inherent in a child care situation.

There has been a corresponding development in the basic records. A pre-entrance medical examination has been required by state law since the first center opened. In December 1943 a developmental record was added, which is filled out by the director during an unhurried interview with the mother just before a child enters. One page is devoted to infancy experience; the four following cover briefly the child’s development history, with emphasis on emotional experience, behavior problems he has presented to the parents, if any, and the control methods used, as well as the personal-social behavior traits which they value and desire for the child. After entrance, observational notes and semester reports are compiled by the teachers. Intelligence testing has been limited to cases where it seemed especially indicated. A closing record is filled out, in most cases, by the parent when a child is withdrawn. These records are considered a minimum. They have proved indispensable as aids to the teachers in guiding the individual children and as a basis for conferences on behavior in the home.

A 2013 study of the long-term effects on mothers and children from use of Lanham centers found a substantial increase in maternal employment, even five years after the end of the program, and “strong and persistent positive effects on well-being” for their children.

In short, despite many shortcomings, these Lanham centers, as Thalia Ertman sums up,

broke ground as the first and, to date, only time in American history when parents could send their children to federally-subsidized child care, regardless of income, and do so affordably. . . .

Additionally, these centers are seen as historically important because they sought to address the needs of both children and mothers. Rather than simply functioning as holding pens for children while their mothers were at work, the Lanham child care centers were found to have a strong and persistent positive effect on the well-being of children.

The federal government also supported some private employer-sponsored child care during the war. The most well-known example is the two massive centers built by the Kaiser Company in Portland, Oregon to provide child care for the children of workers at their Portland Yards and Oregon Shipbuilding Corporation. The centers were located right at the front of the shipyards, making it easy for mothers to drop their children off and pick them up, and were operated on a 24-hour schedule.  They were also large, each caring for up to 1,125 children between 18 months and 6 years of age. The centers had their own medical clinic, cafeteria, and large play areas, and employed highly trained staff.  Parents paid $5 for a six-day week for one child and $3.75 for each additional child.  For a small additional fee, the centers also prepared a small dinner for parents to pick up at the end of their working day.

While the Kaiser Company received much national praise as well as appreciation from its employees with young children, these centers were largely paid for by the government.  Government funds directly paid for their construction, and a majority of the costs of running the centers, including staff salaries, were covered through the company’s cost-plus contracting with the military.

Political dynamics

There was considerable opposition to federal financing of group child care, especially for children younger than 6 years of age.  The sentiment is captured in this quote from a 1943 New York Times article: “The worst mother is better than the best institution when it is a matter of child care, Mayor La Guardia declared.”  Even the War Manpower Commission initially opposed mothers with young children working outside the home, even in service of the war effort, stating that “The first responsibility of women with young children, in war as in peace, is to give suitable care in their own homes to their children.”

But on-the-ground realities made this an untenable position for both the government and business. Women sought jobs, whether out of economic necessity or patriotism.  The government, highlighted by its Rosie the Riveter campaign, was eager to encourage their employment in industries producing for the war effort.  And, despite public sentiment, a significant number of those women were mothers with young children. 

Luedell Mitchell and Lavada Cherry working at a Douglas Aircraft plant in El Segundo, Calif. Circa 1944. Credit: National Archives photo no. 535811

The growing importance of women in the workplace, and especially mothers with young children, is captured in employment trends in Portland, Oregon.  Women began moving into the defense workforce in great numbers starting in 1942, with the number employed in local war industries climbing from 7,000 in November 1942 to 40,000 in June 1943.  An official with the state child-care committee reported that “a check of six shipyards reveals that the number of women employed in the shipyards has increased 25 percent in one month and that the number is going to increase more rapidly in the future.” 

The number of employed mothers was also rapidly growing.  According to the Council of Social Agencies, “Despite the recommendations of the War Manpower Commission . . . thousands of young mothers in their twenties and thirties have accepted jobs in war industries and other businesses in Multnomah County. Of the 8,000 women employed at the Oregon Shipyards in January, 1943, 32 percent of them had children, 16 percent having pre-school children.”

Portland was far from unique.  During the war, for the first time, married women workers outnumbered single women workers.  Increasingly, employers began to recognize the need for child care to address absenteeism problems.  As a “women’s counselor” at the Bendix Aviation Corporation in New Jersey explained to reporters in 1943, child care is one of the biggest concerns for new hires. “We feel a mother should be with her small baby if possible. But many of them have to come back. Their husbands are in the service and they can’t get along on his allotment.”  Media stories, many unsubstantiated, of children left in parked cars outside workplaces or fending for themselves at home, also contributed to a greater public acceptance of group child care. 

An image of Rosie the Riveter that appeared in a 1943 issue of the magazine Hygeia

Finally, the government took action.  The Federal Works Agency was one of two new super agencies established in 1939 to oversee the large number of agencies created during the New Deal period.  In 1940 President Roosevelt signed into law the Lanham Act, which authorized the FWA to fund and supervise the construction of needed public infrastructure, such as housing, hospitals, water and sewer systems, police and firefighting facilities, and recreation centers, in communities experiencing rapid growth because of the defense buildup. In August 1942, the FWA decided, without any public debate, that public infrastructure also meant child care, and it began its program of support for the construction and operation of group child-care facilities.

The Federal Security Agency, the other super agency, whose oversight responsibilities included the Children’s Bureau and the U.S. Office of Education, opposed the FWA’s new child-care initiative.  It did so not only because it believed that child care fell under its mandate, but also because the leadership of the Children’s Bureau and Office of Education opposed group child care.  The FWA won the political battle, and in July 1943, Congress authorized additional funding for the FWA’s child-care efforts. 

And, as William M. Tuttle, Jr. describes, public pressure played an important part in the victory:

the proponents of group child care organized a potent lobbying effort. The women’s auxiliaries of certain industrial unions, such as the United Electrical Workers and the United Auto Workers, joined with community leaders and FWA officials in the effort. Also influential were the six women members of the House of Representatives. In February 1944, Representative Mary T. Norton presented to the House “a joint appeal” for immediate funds to expand the wartime child day care program under the FWA.

Termination and a step back

Congressional support for group child care was always tied to wartime needs, a position shared by most FWA officials.  The May 1945 Allied victory in Europe brought a drop in war production, and a reduction in FWA community child care approvals and renewals.  In August, after the Japanese surrender brought the war to a close, the FWA announced that it would end its funding of child-care centers as soon as possible, but no later than the end of October 1945.

Almost immediately thousands of individuals wrote letters, sent wires, and signed petitions calling for the continuation of the program.  Officials in California, the location of many war-related manufacturing sites and nearly 25 percent of all children enrolled in Lanham Act centers in August 1945, also weighed in, strongly supporting the call.  Congress yielded, largely influenced by the argument that since it would be months before all the “men” in the military returned to the country, mothers had no choice but to continue working and needed the support of the centers to do so.  It approved new funds, but only enough to keep the centers operating until the end of February 1946.

The great majority of centers closed soon after the termination of federal support, with demonstrations following many of the closings.  The common assumption was that women would not mind the closures, since most would be happy to return to homemaking.  Many women were, in fact, forced out of the labor force, disproportionately suffering from post-war industrial layoffs.  But by 1947, women’s labor force participation was again on the rise and a new push began for a renewal of federal support for community child-care centers. Unfortunately, the government refused to change its position. During the Korean War, Congress did approve a public child-care bill, but then it refused to authorize any funding.

After WWII, parents organized demonstrations, like this one in New York on Sept. 21, 1947, calling for the continuing funding of the centers. The city’s welfare commissioner dismissed the protests as “hysterical.” Credit: The New York Times

Finally, in 1954, as Sonya Michel explains, “Congress found an approach to child care it could live with: the child-care tax deduction.”  While the child-care tax deduction did offer some financial relief to some families, it did nothing to ensure the availability of affordable, high-quality child care.  The history of child care during World War II makes clear that this turn to market-based tax policy to solve child-care problems represented a big step back for working women and their children.  And this was well understood by most working people at the time. 

Sadly, this history has been forgotten, and Biden’s commitment to expand the child-care tax credit is now seen as an important step forward.  History shows we can and need to do better.

Time to put the spotlight on corporate taxes

A battle is slowly brewing in Washington DC over whether to raise corporate taxes to help finance new infrastructure investments.  While higher corporate taxes cannot generate all the funds needed, the coming debate over whether to raise them gives us an opportunity to challenge the still strong popular identification of corporate profitability with the health of the economy and, by extension, worker wellbeing.

According to the media, President Biden’s advisers are hard at work on two major proposals with a combined $3 trillion price tag.  The first aims to modernize the country’s physical infrastructure and is said to include funds for the construction of roads, bridges, rail lines, ports, electric vehicle charging stations, and affordable and energy efficient housing as well as rural broadband, improvements to the electric grid, and worker training programs.  The second targets social infrastructure and would provide funds for free community college education, universal prekindergarten, and a national paid leave program. 

To pay for these proposals, Biden has been talking up the need to raise corporate taxes, at least to offset some of the costs of modernizing the country’s physical infrastructure.  Not surprisingly, Republican leaders in Congress have voiced their opposition to corporate tax increases.  And corporate leaders have drawn their own line in the sand.  As the New York Times reports:

Business groups have warned that corporate tax increases would scuttle their support for an infrastructure plan. “That’s the kind of thing that can just wreck the competitiveness in a country,” Aric Newhouse, the senior vice president for policy and government relations at the National Association of Manufacturers, said last month [February 2021].

Regardless of whether Biden decides to pursue his broad policy agenda, this appears to be a favorable moment for activists to take advantage of media coverage surrounding the proposals and their funding to contest these kinds of corporate claims and demonstrate the anti-working-class consequences of corporate profit-maximizing behavior.  

What do corporations have to complain about?

To hear corporate leaders talk, one would think that they have been subjected to decades of tax increases.  In fact, quite the opposite is true.  The figure below shows the movement in the top corporate tax rate.  As we can see, it peaked in the early 1950s and has been falling ever since, with a big drop in 1986, and another in 2017, thanks to Congressionally approved tax changes.

One consequence of this corporate-friendly tax policy is, as the following figure shows, a steady decline in federal corporate tax payments as a share of GDP.  These payments fell from 5.6 percent of GDP in 1953 to 1.5 percent in 1982, and a still lower 1.0 percent in 2020.  By contrast, there has been very little change in individual income tax payments as a share of GDP; they were 7.7 percent of GDP in 2020.

Congressional tax policy has certainly been good for the corporate bottom line.  As the next figure illustrates, both pre-tax and after-tax corporate profits have risen as a share of GDP since the early 1980s.  But the rise in after-tax profits has been the more dramatic, soaring from 5.2 percent of GDP in 1980 to 9.1 percent in 2019, before dipping slightly to 8.8 percent in 2020.  To put recent after-tax profit gains in perspective, the 2020 after-tax profit share is greater than the profit share in every year from 1930 to 2005.

What do corporations do with their profits?

Corporations claim that higher taxes would hurt U.S. competitiveness, implying that they need their profits to invest and keep the economy strong.  Yet, despite ever higher after-tax rates of profit, private investment in plant and equipment has been on the decline.

As the figure below shows, gross private domestic nonresidential fixed investment as a share of GDP has been trending down since the early 1980s.  It fell from 14.8 percent in 1981 to 13.4 percent in 2020.

Rather than investing in new plant and equipment, corporations have been using their profits to fund an aggressive program of stock repurchases and dividend payouts.  The figure below highlights the rise in corporate stock buybacks, which have helped drive up stock prices, enriching CEOs and other top wealth holders. In fact, between 2008 and 2017, companies spent some 53 percent of their profits on stock buybacks and another 30 percent on dividend payments.

It should therefore come as no surprise that CEO compensation is also exploding, with the ratio of CEO-to-worker compensation growing from 21-to-1 in 1965, to 61-to-1 in 1989, 293-to-1 in 2018, and 320-to-1 in 2019.  As we see in the next figure, the growth in CEO compensation has actually been outpacing the rise in the S&P 500.

In sum, the system is not broken.  It continues to work as it is supposed to work, generating large profits for leading corporations that then find ways to generously reward their top managers and stockholders.  Unfortunately, investing in plant and equipment, creating decent jobs, or supporting public investment are all low on the corporate profit-maximizing agenda.  

Thus, if we are going to rebuild and revitalize our economy in ways that meaningfully serve the public interest, working people will have to actively promote policies that will enable them to gain control over the wealth their labor produces.  One example: new labor laws that strengthen the ability of workers to unionize and engage in collective and solidaristic actions.  Another is the expansion of publicly funded and provided social programs, including for health care, housing, education, energy, and transportation. 

And then there are corporate taxes.  Raising them is one of the easiest ways we have to claw back funds from the private sector to help finance some of the investment we need.  Perhaps more importantly, the fight over corporate tax increases provides us with an important opportunity to make the case that the public interest is not well served by reliance on corporate profitability.

The failings of our unemployment insurance system are there by design

Our unemployment insurance system has failed the country at a moment of great need.  With tens of millions of workers struggling just to pay rent and buy food, Congress was forced to pass two emergency spending bills, providing one-time stimulus payments, special weekly unemployment insurance payments, and temporary unemployment benefits to those not covered by the system.  And, because of their limited short-term nature, President Biden must now advocate for a third.

The system’s shortcomings have been obvious for some time, but little effort has been made to improve it.  In fact, those shortcomings were baked into the system at the beginning, as President Roosevelt wanted, not by accident.  While we must continue to organize to ensure working people are able to survive the pandemic, we must also start the long process of building popular support for a radical transformation of our unemployment insurance system.  The history of struggle that produced our current system offers some useful lessons.

Performance

Our unemployment insurance system was designed during the Great Depression.  It was supposed to shield workers and their families from the punishing costs of unemployment, thereby also helping to promote both political and economic stability.  Unfortunately, as Eduardo Porter and Karl Russell reveal in a New York Times article, that system has largely failed working people.

The chart below shows the downward trend in the share of unemployed workers receiving benefits and the replacement value of those benefits.  Benefits now replace less than one-third of prior wages, some eight percentage points below the level in the 1940s.  Benefits aside, it is hard to celebrate a system that covers fewer than 30 percent of those struggling with unemployment.

A faulty system

Although every state has an unemployment insurance system, they all operate independently.  There is no national system.  Each state separately generates the funds it needs to provide unemployment benefits and is largely free, subject to some basic federal standards, to set the conditions under which an unemployed worker becomes eligible to receive benefits, the waiting period before benefits will be paid, the length of time benefits will be paid, the benefit amount, and requirements to continue receiving benefits.

Payroll taxes paid by firms generate the funds used to pay unemployment insurance benefits.  The size of the taxes to be paid depends on the value of employee earnings that is made taxable (the base wage) and the tax rate.  States are free to set the base wage as they want, subject to a federally mandated floor of $7,000 established in the 1970s.  They are likewise free to set the tax rate.  Not surprisingly, in the interest of supporting business profitability, states have generally sought to keep both the base wage and the tax rate low.  For example, Florida, Tennessee, and Arizona continue to set their base wage at the federal minimum value.  And, as the figure below shows, insurance tax rates have been trending down for some time.
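The mechanics here are simple enough to express directly.  The sketch below, in Python, shows how a state’s annual UI tax bill per employee follows from the base wage and the tax rate; the 2.7 percent rate and the earnings figure are hypothetical examples, with only the $7,000 federal floor taken from the text.

```python
# Stylized illustration of state UI payroll tax mechanics: the tax owed per
# employee is the tax rate applied to earnings up to the taxable base wage.
# The rate and earnings below are hypothetical; only the $7,000 federal
# floor on the base wage comes from the discussion above.

FEDERAL_BASE_WAGE_FLOOR = 7_000  # federally mandated minimum base wage

def ui_tax_per_employee(base_wage: float, tax_rate: float,
                        annual_earnings: float) -> float:
    """Annual UI tax owed on one employee."""
    taxable = min(annual_earnings, max(base_wage, FEDERAL_BASE_WAGE_FLOOR))
    return taxable * tax_rate

# A state at the federal floor with a hypothetical 2.7 percent rate collects
# at most $189 per worker per year, no matter how much the worker earns:
print(ui_tax_per_employee(7_000, 0.027, 50_000))  # 189.0
```

A low base wage thus caps trust-fund revenue per worker, which is why the combination of a $7,000 base and falling rates leaves states so exposed when claims surge.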

While such a policy might help business, lowering the tax rate means that states have less money in their trust funds to pay unemployment benefits.  Thus, when times are hard, and unemployment claims rise, many states find themselves hard pressed to meet their required obligations.  In fact, as Porter and Russell explain:

Washington has been repeatedly called on to provide additional relief, including emergency patches to unemployment insurance after the Great Recession hit in 2008. Indeed, it has intervened in response to every recession since the 1950s.

This is far from a desirable outcome for those states forced to borrow, since the money has to be paid back with interest by imposing higher future payroll taxes on employers.  Thus, growing numbers of states have sought to minimize the likelihood of this happening, or at least the amount to be borrowed, by raising eligibility standards, reducing benefits, and shortening the duration of coverage, all in hopes of reducing both the number of people drawing unemployment benefits and the amount and length of time they receive them.

Porter and Russell highlight some of the consequences of this strategy:

In Arizona, nearly 70 percent of unemployment insurance applications are denied. Only 15 percent of the unemployed get anything from the state. Many don’t even apply. Tennessee rejects nearly six in 10 applications.

In Florida, only one in 10 unemployed workers gets any benefits. The state is notably stingy: no more than $275 a week, roughly a third of the maximum benefit in Washington State. And benefits run out quickly, after as little as 12 weeks, depending on the state’s overall unemployment rate.

And the growing stagnation of the US economy, which has made employment ever more precarious, only makes this strategy appear ever more fiscally “intelligent.”  For example, as the following figure shows, a growing percentage of the unemployed are remaining jobless for a longer time.  Such a trend, absent state actions to restrict access to benefits, would mean financial trouble for state officials.

Adding to the system’s structural shortcomings is the fact that growing numbers of workers, for example the many workers who have been reclassified as independent contractors, are not covered by it.  In addition, since eligibility for benefits requires satisfying a minimum earnings and hours of work requirement over a base year, the growth in irregular low-wage work means that many of those most in need of the system’s financial support during periods of unemployment find themselves declared ineligible for benefits.

By design, not by mistake

Our current unemployment insurance system and its patchwork set of state standards and benefits dates back to the Great Depression. While President Roosevelt gets credit for establishing our unemployment insurance system as part of the New Deal, the fact is that he deliberately sidelined a far stronger program that, had it been approved, would have put working people today in a far more secure position. 

The Communist Party (CP) began pushing an unemployment and social insurance bill in the summer of 1930 and, along with the numerous Unemployed Councils that existed in cities throughout the country, worked hard to promote it over the following years.  On March 4, 1933, the day of Roosevelt’s inauguration, they organized demonstrations stressing the need for action on unemployment insurance.

Undeterred by Roosevelt’s lack of action, the CP-authored “Workers Unemployment and Social Insurance Bill” was introduced in Congress in February 1934 by Representative Ernest Lundeen of the Farmer-Labor Party.  In broad brush, the bill mandated the payment of unemployment insurance to all unemployed workers and farmers equal to average local full-time wages, with a guaranteed minimum of $10 per week plus $3 for each dependent. Those forced into part-time employment would receive the difference between their earnings and the average local full-time wage.  The bill also created a social insurance program that would provide payments to the sick and elderly, and maternity benefits to be paid eight weeks before and eight weeks after birth.  All these benefits were to be financed by unappropriated funds in the Treasury and taxes on inheritances, gifts, and individual and corporate incomes above $5,000 a year.
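To make the bill’s benefit structure concrete, here is a rough sketch of the payment rule as summarized above.  It is an illustrative reading of that summary, not the bill’s actual text; in particular, how the dependent allowance applied to part-time workers is my assumption.

```python
from typing import Optional

# Sketch of the benefit rule in the Workers Unemployment and Social Insurance
# Bill as summarized above: the fully unemployed receive the average local
# full-time wage, with a guaranteed floor of $10/week plus $3 per dependent;
# those forced into part-time work receive the gap between their earnings
# and the average wage.  An illustrative reading, not the bill's actual text.

def lundeen_weekly_benefit(avg_local_wage: float, dependents: int = 0,
                           parttime_earnings: Optional[float] = None) -> float:
    if parttime_earnings is not None:
        # Forced part-timers: benefit fills the gap up to the average wage
        # (how dependents figured in here is an assumption, not in the summary).
        return max(avg_local_wage - parttime_earnings, 0.0)
    # Fully unemployed: the average local full-time wage, subject to the floor.
    return max(avg_local_wage, 10.0 + 3.0 * dependents)

print(lundeen_weekly_benefit(18.0, dependents=2))             # 18.0
print(lundeen_weekly_benefit(8.0, dependents=2))              # 16.0 (floor binds)
print(lundeen_weekly_benefit(18.0, parttime_earnings=11.0))   # 7.0 (top-up)
```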

The bill enjoyed strong support among workers—employed and unemployed—and it was soon endorsed by 5 international unions, 35 central labor bodies, and more than 3000 local unions.  Rank and file worker committees also formed across the country to pressure members of Congress to pass it.

When Congress refused to act on the bill, Lundeen reintroduced it in January 1935. Because of public pressure, the bill became the first social insurance plan to be recommended by a congressional committee, in this case the House Labor Committee.  However, it was soon voted down in the full House of Representatives, 204 to 52.

Roosevelt strongly opposed the Lundeen bill, and it was to provide a counter to it that he pushed for an alternative, one that offered benefits far short of what the Workers Unemployment and Social Insurance Bill promised and that was strongly opposed by many workers and all organizations of the unemployed.  Roosevelt appointed a Committee on Economic Security in July 1934, charging it with developing a social security bill, covering both unemployment insurance and old-age security, that he could present to Congress in January 1935.  An administration-approved bill was introduced right on schedule in January, and Roosevelt called for quick congressional action. 

Roosevelt’s bill was revised in April by a House committee and given a new name, “The Social Security Act.”  After additional revisions, the Social Security Act was signed into law on August 14, 1935. It was a complex piece of legislation, including what we now call Social Security, a federal old-age benefit program; a program of unemployment insurance administered by the states; and a program of federal grants to states to fund benefits for the needy elderly and aid to dependent children. 

The unemployment system established by the Social Security Act was structured in ways unfavorable to workers (as was the federal old-age benefit program).  Rather than a progressively funded, comprehensive national system of unemployment insurance that paid benefits commensurate with worker wages, the act established a federal-state cooperative system that gave states wide latitude in determining standards.

More specifically, the act levied a uniform national payroll tax on covered employers of 1 percent in 1936, 2 percent in 1937, and 3 percent in 1938.  Covered employers were defined as those with eight or more employees for at least twenty weeks, not including government employers and employers in agriculture.  Only workers employed by a covered employer could receive benefits.

The act left it to the states to decide whether to enact their own plans, and if so, to determine eligibility conditions, the waiting period to receive benefits, benefit amounts, minimum and maximum benefit levels, duration of benefits, disqualifications, and other administrative matters. It was not until 1937 that programs were established in every state as well as the then-territories of Alaska and Hawaii.  And it was not until 1938 that most began paying benefits.

In the early years, most states required eligible workers to wait 2 to 4 weeks before drawing benefits, which were commonly set at half recent earnings (subject to weekly maximums) for a period ranging from 12 to 16 weeks. Ten state laws called for employee contributions as well as employer contributions; three still do today.

Over the following years the unemployment insurance system has been improved in a number of ways, including by broadening coverage and boosting benefits.  However, its basic structure remains largely intact, a structure that is overly complex, with a patchwork set of state eligibility requirements and miserly benefits. And we are paying the cost today.

This history makes clear that nothing will be given to us.  We need and deserve a better unemployment insurance system. And to get it, we are going to have to fight for it, and not be distracted by the temporary, although needed, band-aids Congress is willing to provide.  The principles shaping the Workers Unemployment and Social Insurance Bill can provide a useful starting point for current efforts.

The U.S. recovery on pause, December brings new job losses

A meaningful working-class recovery from the recession seems far away.

After seven months of job gains, albeit steadily diminishing ones, we are again losing jobs.  As the chart below shows, the number of jobs fell by 140,000 in December.

We are currently about 9.8 million jobs down from the February 2020 employment peak, having recovered only 55 percent of the jobs lost.  And, as the following chart illustrates, the percentage of jobs lost remains greater, even now after months of job growth, than it was at any point during the Great Recession. 
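Those two figures together pin down the scale of the remaining hole.  A quick back-of-the-envelope check in Python, assuming the 55 percent recovery figure is measured against total pandemic job losses:

```python
# If 55% of lost jobs have been recovered and the remaining shortfall is
# 9.8 million, the implied totals follow directly.  A rough consistency
# check on the figures quoted above, not numbers from the jobs report.

shortfall = 9.8e6
share_recovered = 0.55

total_lost = shortfall / (1 - share_recovered)   # ~21.8 million jobs lost
recovered = total_lost * share_recovered         # ~12.0 million recovered

print(f"Implied total jobs lost: {total_lost / 1e6:.1f} million")
print(f"Implied jobs recovered:  {recovered / 1e6:.1f} million")
```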

If the job recovery continues at its current pace, some analysts predict that it will likely take more than three years just to get back to pre-pandemic employment levels.  However, this might well be too rosy a projection.  One reason is that the early assumption that many of the job losses were temporary, and that those unemployed would soon be recalled to employment, is turning out to be wrong.  A rapidly growing share of the unemployed are remaining unemployed for an extended period. 

As we see below, in October, almost one-third of the unemployed had been unemployed for 27 weeks or longer.  According to the December jobs report, that percentage is now up to 37 percent, four times what it was before the pandemic.  And that figure seriously understates the problem, since many workers have given up looking for work; having dropped out of the workforce, they are no longer counted as unemployed.  The labor force participation rate is now 61.5 percent, down from 63.3 percent in February.

Dean Baker, quoted in a recent Marketplace story, underscores the importance of this development:

“This is obviously a story of people losing their job at the beginning of the crisis in March and April and not getting it back,” said Dean Baker, co-founder and senior economist with the Center for Economic and Policy Research.

Those out of work for 27 weeks or more make up a growing share of the unemployed, and that could have enduring consequences, Baker said.

“After people have been unemployed for more than six months, they find it much harder to get a job,” he said. “And if they do get a job, their labor market prospects could be permanently worsened.”

And tragically, the workers who have suffered the greatest job losses during this crisis are those who earned the lowest wages. 

It is no wonder that growing numbers of working people are finding it difficult to meet their basic needs.

There is no way to sugarcoat this situation.  We need a significant stimulus package, a meaningful increase in the minimum wage, real labor law reform, a robust national single-payer health care system, and an aggressive Green New Deal-inspired public sector investment and jobs program.  And there is no getting around the fact that it is going to take hard organizing and mutually supportive community and workplace actions to move the country in the direction it needs to go.

America’s labor crisis

We face a multifaceted labor crisis. One of the most important aspects of this crisis is the U.S. economy’s diminishing capacity to provide employment. This development is highlighted in the chart below, which shows the trend in civilian employment over the last thirty years.  Civilian employment includes all individuals who worked at least one hour for a wage or salary, or were self-employed, or were working at least 15 unpaid hours in a family business or on a family farm, during the week including the 12th of the month when surveys are taken.

As we can see, it took approximately 4 years to bring civilian employment back to its pre-crisis peak after the 2001 recession, and a much longer 6.5 years after the 2008 recession.  The number of years it will take to regain the pre-crisis peak employment level after the end of this recession (which remains ongoing) can be expected to be far greater, with some analysts predicting it could take a decade or more. And of course, new people will be entering the labor force over that decade, generating a serious unemployment problem.

The following chart, which shows the trend in the civilian labor force participation rate, offers additional evidence of the economy’s declining job creating potential. The civilian labor force participation rate is calculated by dividing the sum of all workers who are employed or actively seeking employment by the total noninstitutionalized, civilian working-age population.
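That definition reduces to a one-line calculation, sketched below with round illustrative numbers rather than actual BLS figures.

```python
# The labor force participation rate, per the definition above: the labor
# force (employed plus those actively seeking work) divided by the civilian
# noninstitutionalized working-age population.  Inputs are illustrative.

def participation_rate(employed: float, seeking_work: float,
                       working_age_population: float) -> float:
    labor_force = employed + seeking_work
    return 100 * labor_force / working_age_population

# e.g., 150 million employed plus 11 million seeking work, out of a
# 260 million working-age population:
print(f"{participation_rate(150e6, 11e6, 260e6):.1f}%")  # 61.9%
```

Note that discouraged workers who stop searching drop out of the numerator but stay in the denominator, which is why the rate can fall even when measured unemployment does not rise.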

As we can see, this measure has been in sharp decline for many years, including over the years of expansion that followed the 2008 recession.  With growing numbers of working-age people, including prime-age workers, forced to drop out of the labor force even during so-called “good times,” there is little reason to expect a significant improvement in employment opportunities in the years following the end of this recession.

These charts make clear that without a significant change in the workings of the economy, working people are facing a future of declining employment possibilities. And it certainly appears that there is no enthusiasm for major economic changes among the most powerful and wealthy in the United States.  According to a recent report, U.S. billionaires saw their fortunes soar by $434 billion during the nation’s lockdown between mid-March and mid-May. And MarketWatch reported that the S&P 500 and Nasdaq just booked their best postelection day gains in history.  The reason:

Wall Street warmed to the possibility of a divided U.S. government and further political gridlock in Washington following a contentious election, potentially keeping Trump administration’s tax cuts in place no matter who sits in the White House.

In sum, if we want a meaningful economic recovery, one that serves majority needs, we will have to fight for it.  Among other things, this means finding new ways to strengthen labor-community coalitions and engage people in sustained conversation about the class-contradictory nature of our economic system.

There is a union difference: mortality rates from COVID-19 are lower in unionized nursing homes

We need strong unions, all of us.  Tragically, even during the pandemic, businesses continue to aggressively resist worker attempts at unionization. And recent decisions by the NLRB only add to worker difficulties.

Here is one example of what is at stake: a recently published study of New York State nursing homes found that mortality rates from COVID-19 were 30 percent lower in unionized nursing homes than in facilities without health care worker unions.  By gaining better protection for themselves, unionized workers were also able to better protect the health of those they served.

Although the pandemic makes organizing and solidarity actions more difficult, it is essential that we find effective ways to support worker struggles for strong unions.

Work during the pandemic

Many workers, especially those now celebrated as “essential” or “frontline,” don’t feel safe at work, and for good reason.  Many have been denied needed personal protective equipment (PPE) or even information about the health status of their coworkers.

While surveys find that many employers have implemented new workplace cleaning procedures, they also find that a large percentage of workers continue to work without access to PPE, especially masks and gloves.  Strikingly, according to one study,

If [worker] access to PPE was limited in our data, policies mandating that workers wear protective gear were even more uncommon. Around a third of workers in restaurants, fast food, coffee shops, and hotels and motels reported requirements to wear gloves. This share was dramatically lower (around 12%) in big-box stores, department stores, retail stores, grocery stores, and pharmacies. The share of workers required to wear gloves was even lower in warehouses, fulfillment centers, and in delivery. Mask requirements were vanishingly uncommon across workplaces, at between 2% and 7% in convenience stores, coffee shops, fast food, restaurants, grocery stores, retail, department stores, and big-box stores. Just 12% of those in fulfillment centers reported a mask requirement, which was significantly higher than the 5% of warehouse and delivery workers.

Adding to the danger, many companies are aggressively trying to keep information about worker infections secret from coworkers and the public.  As a Bloomberg Law post explains:

U.S. businesses have been on a silencing spree. Hundreds of U.S. employers across a wide range of industries have told workers not to share information about Covid-19 cases or even raise concerns about the virus, or have retaliated against workers for doing those things, according to workplace complaints filed with the NLRB and the Occupational Safety and Health Administration (OSHA).

Workers at Amazon.com, Cargill, McDonald’s, and Target say they were told to keep Covid cases quiet. The same sort of gagging has been alleged in OSHA complaints against Smithfield Foods, Urban Outfitters, and General Electric. In an email viewed by Bloomberg Businessweek, Delta Air Lines told its 25,000 flight attendants to “please refrain from notifying other crew members on your own” about any Covid symptoms or diagnoses. At Recreational Equipment Inc., an employee texted colleagues to say he’d tested positive and that “I was told not to tell anybody” and “to not post or say anything on social media.”

These policies may help the corporate bottom line, but they endanger workers and those they serve, and thereby help to spread the pandemic.

Without unions, workers have limited ways to force their employers to create a safe work environment.  One is to file a complaint with the Occupational Safety and Health Administration.  And, despite fears of retaliation, many workers have done just that.  As a Brookings blog post reports:

Using data from the Occupational Safety and Health Administration (OSHA), [the figure below] shows the cumulative number of COVID-19 related workplace safety complaints. Between April 20 and August 20, total COVID-19 related workplace safety complaints rose over 350 percent.

Unfortunately, these complaints have achieved little.  According to the Bloomberg Law post, “Many thousands of OSHA complaints about coronavirus safety issues have yielded citations against just two companies—a health-care company and a nursing home—totaling about $47,000.” OSHA has still not issued any regulations that address the pandemic.

OSHA rarely sends out inspectors to investigate complaints.  The Bloomberg Law post describes one case in which a mechanic at Maid-Rite, a company that supplies frozen meat products to military bases, nursing homes, and schools, wrote to OSHA describing unsafe conditions:

The mechanic says OSHA called him to say it would be sending Maid-Rite a letter instead of coming to inspect the plant, and that was the last he ever heard from the agency about his complaint. Letters between OSHA and Maid-Rite show OSHA told Maid-Rite in April to investigate worker allegations itself, and Maid-Rite wrote back saying that it was providing and mandating masks and that 6-foot distancing sometimes wasn’t feasible.

No changes were made, so other workers followed up with more complaints over the following weeks, finally leading OSHA to send an inspector to the plant.  However,

in a break from typical protocol, [the inspector] gave the company a heads-up. “OSHA is here, so do everything right!” a supervisor told staff during the inspection, the mechanic later wrote in an affidavit. Fifteen minutes later, the supervisor returned to say “Never mind,” because the visit was over, the mechanic wrote: “As soon as OSHA left, everything went exactly back to the way it was.”

Unions can help

Unions are far from perfect, but they are one of the most effective means workers have to protect their interests, and by extension those they serve.  That point is highlighted by the above-noted study, which found that mortality rates from COVID-19 are lower in unionized New York State nursing homes.  This is significant because approximately 43% of all reported COVID-19 deaths in the United States have occurred in nursing homes.

The three authors–Adam Dean, Atheendar Venkataramani, and Simeon Kimmel–focused on nursing homes in New York State, which has had over 6,500 COVID-19 nursing home deaths, second only to New Jersey.  The authors built a model that attempted to explain the variation in confirmed COVID-19 deaths at these New York State nursing homes with an eye to determining if the presence of a health care union made a difference.  They used “proprietary data from 1199SEIU United Healthcare Workers East, the International Brotherhood of Teamsters, and the Communication Workers of America (CWA), as well as publicly-available data from the New York State Nurses Association (NYSNA) to determine if a labor union represented health care workers in each facility.”

Their cross-section regression model also included a range of nonunion variables as possible explanations for the variation.  These variables included: whether or not a facility had an adequate supply of PPE, including masks, eye shields, gowns, gloves, and hand sanitizer; the average age of residents; the Resource Utilization Group Nursing Case Mix Index of resident acuity, which classifies patient care needs based on diagnosis, proposed treatment, and level of needed assistance with activities of daily living; occupancy rates; staff-hours-to-resident-days ratios for RNs, CNAs, and licensed practical nurses; the percent of residents whose primary support comes from Medicaid or Medicare; the Overall 5-Star Rating; whether the nursing home was part of a chain; whether the nursing home was for-profit or nonprofit; and county-level data on confirmed cases of COVID-19 and population.

Their main regression result, confirmed by several sensitivity tests, was that, taking all the other variables into account, the presence of a health care labor union was associated with a 30% relative decrease in the COVID-19 mortality rate compared to facilities without a health care labor union.

In examining possible reasons for this result, they ran two other regressions.  One found that the presence of a health care labor union was associated with a 13.8% relative increase in access to N95 masks and a 7.3% relative increase in access to eye shields. Labor union status was not a significant predictor of access to other types of PPE.  The other regression found that the presence of a health care labor union was associated with a 42% relative decrease in the COVID-19 infection rate.
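To make the setup concrete, here is a minimal sketch of the kind of facility-level, cross-sectional regression described above, written in Python with statsmodels.  All file and column names are hypothetical, and the published study’s data and estimation details differ; this is an illustration of the approach, not the authors’ code.

```python
# Illustrative sketch of a cross-sectional regression of facility-level
# COVID-19 mortality on union status plus controls. Hypothetical data;
# not the study's actual code or estimator.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nursing_homes.csv")  # hypothetical facility-level extract

model = smf.ols(
    "mortality_rate ~ union + ppe_adequate + resident_age"
    " + case_mix_index + occupancy + rn_hours + cna_hours + lpn_hours"
    " + medicaid_share + star_rating + chain + for_profit"
    " + county_cases + county_pop",
    data=df,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

# The coefficient on `union` gives the adjusted association between
# union presence and facility mortality, holding the controls fixed.
print(model.summary())
```

The same template, with access to N95 masks or the infection rate as the dependent variable, corresponds to the two secondary regressions just described.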

The struggle ahead

There is good reason to believe that the union benefits found by Dean, Venkataramani, and Kimmel in their study are not limited to New York State nursing homes.  Unions are one of the most effective ways for workers to ensure access to critical PPE and the implementation of safety measures, things that, as noted above, workers desperately seek.

But of course, corporations don’t want to pay the higher costs that come with unionization.  They prefer the status quo, where working people are forced to pay far greater costs, individually and collectively.  And even in the midst of the pandemic, the NLRB continues to issue new rules making it ever more difficult for workers to unionize.

Workers are increasingly coming to understand that they cannot rely on OSHA or the NLRB to defend their interests. Thus, growing numbers of workers are bravely engaging in direct action, risking their jobs, to fight for their rights and the safety of their co-workers.  We need to find ways to support them and improve the broader environment for organizing and unionizing. A recent Gallup poll offers one hopeful sign: approval of unions continues to grow.

The pandemic, technology, and remote work: the corporate push for greater control over workers’ lives

The U.S. economy is undergoing a major transformation largely driven by the coronavirus pandemic.  One hallmark of that transformation is the explosion in what is called “remote” work.

In 2017, according to a Census Bureau study, only 3 percent of full-time workers in the United States reported that they primarily worked from home.  Today, in response to the pandemic, some 42 percent of the U.S. labor force is working from home—with only 26 percent still working on-site.

Corporate leaders appear to have embraced this shift to at-home work and are pursuing the use of new technologies designed to increase managerial control over the remote work process. The response of workers to these changes is still evolving.

The pandemic and the corporate embrace of at-home work

Although most corporations initially viewed the shift to remote work as a necessary short-term response to government mandated closures and consumer and worker health concerns, a number are now planning for a permanent, post-pandemic increase in its use. As the New York Times reports:

Facebook expects up to half its workers to be remote as soon as 2025. The chief executive of Shopify, a Canadian e-commerce company that employs 5,000 people, tweeted in May that most of them “will permanently work remotely. Office centricity is over.” Walmart’s tech chief told his workers that “working virtually will be the new normal.”

Quora, a question-and-answer site, said last week that “all existing employees can immediately relocate to anywhere we can legally employ them.” Those who do not want to go anywhere can still use the Silicon Valley headquarters, which would become a co-working space.

And these large firms are not alone.  As Luke Savage, writing in Jacobin, notes:

With the lockdown still only a few weeks old, a survey of company CFOs by PricewaterhouseCoopers found that almost 30 percent were already planning to reduce their business’s physical footprint, with an April study by Gartner suggesting that some three-quarters were planning to shift at least some employees to remote work on a permanent basis.

It’s a different world

Of course, this is not the first time that corporations have embraced remote work.  A number—including such major companies as IBM, Aetna, Best Buy, Bank of America, Yahoo, AT&T and Reddit—actively promoted telecommuting as recently as 15 years ago.  But they all eventually reversed course, concluding that employee productivity, loyalty, and innovation suffered.  Tech companies, in particular, responded by building expansive and expensive new facilities that offered a range of free on-site benefits such as communal cafeterias and gyms to keep employees motivated and loyal.

Because of this history, some analysts doubt that the current corporate celebration of remote work will last long.  But there is reason to believe that this time is different.  Certainly, early indications are that at-home workers remain focused and hard at work.  Savage cites a Globe and Mail article that leads with this headline: “Employers used to believe remote workers were happier but less productive. Turns out it’s the opposite.”  The Globe and Mail article goes on to say:

One fear about shifting to a work-from-home culture is that it would lead to operational chaos: missed meetings, spotty WiFi, games of broken telephone (both figurative and literal). Instead, even companies with tens of thousands of employees are finding that the IT infrastructure is holding up and so are lines of authority. Workers are responding to their emails and joining Zoom calls at approximately the right time. Everyone is always reachable.

The Globe and Mail is not alone in finding evidence of high worker productivity.  For example, the New York Times quotes John Sullivan, a professor of management:

“The data over the last three months is so powerful,” he said. “People are shocked. No one found a drop in productivity. Most found an increase. People have been going to work for a thousand years, but it’s going to stop and it’s going to change everyone’s life.”  Innovation, Dr. Sullivan added, might even catch up eventually.

And Bloomberg came to much the same conclusion, reporting that corporate executives at several different finance and investment companies all see evidence of gains in productivity.

Underlying these gains are three potentially long-lasting developments that suggest the current corporate commitment to expanding remote work needs to be taken seriously. The first is the availability of relatively low-cost, easy-to-use online communication platforms like Zoom that allow managers to communicate easily with their workers and workers to collaborate when necessary.  The online infrastructure for corporate communication continues to improve.

The second is the recent and ongoing development of technologies that allow management to monitor and evaluate the online work effort of their employees.  As the New York Times explains: “Demand has surged for software that can monitor employees, with programs tracking the words we type, snapping pictures with our computer cameras and giving our managers rankings of who is spending too much time on Facebook and not enough on Excel.”

Of course, corporations have long used technology to monitor and direct work, and large companies like Amazon have pioneered the development and use of software for directing and intensifying the work pace of warehouse workers.  Josh Dzieza, writing in The Verge, offers an example:

Every Amazon worker I’ve spoken to said it’s the automatically enforced pace of work, rather than the physical difficulty of the work itself, that makes the job so grueling. Any slack is perpetually being optimized out of the system, and with it any opportunity to rest or recover. A worker on the West Coast told me about a new device that shines a spotlight on the item he’s supposed to pick, allowing Amazon to further accelerate the rate and get rid of what the worker described as “micro rests” stolen in the moment it took to look for the next item on the shelf.

But as Dzieza makes clear, there is also growing availability and use of new software that makes it possible for corporations to easily oversee the work effort of their online workers.  One example is WorkSmart.  Dzieza describes the experience of a software engineer in Bangladesh who was required to download the software as a condition of his employment with Austin-based Crossover Technologies.  Among other things:

The software tracked his keystrokes, mouse clicks, and the applications he was running, all to rate his productivity. He was also required to give the program access to his webcam. Every 10 minutes, the program would take three photos at random to ensure he was at his desk. If [he] wasn’t there when WorkSmart took a photo, or if it determined his work fell below a certain threshold of productivity, he wouldn’t get paid for that 10-minute interval.
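A stripped-down sketch of that logic (random snapshots within each 10-minute interval, plus a productivity cutoff that gates payment) might look like the following.  It is a simplified illustration of the behavior described above, not WorkSmart’s actual code; the threshold value and function names are invented.

```python
# Simplified illustration of interval-based monitoring as described
# above; not WorkSmart's actual code. All names and values are invented.
import random
import time

INTERVAL_SECONDS = 600        # 10-minute pay interval
PHOTOS_PER_INTERVAL = 3
PRODUCTIVITY_THRESHOLD = 0.6  # hypothetical cutoff

def run_interval(capture_photo, measure_activity):
    """Return True if the worker is paid for this interval.

    capture_photo() should return True if the worker is at the desk;
    measure_activity() should return a 0-1 score built from keystroke
    and mouse-click counts over the interval.
    """
    start = time.time()
    # Take photos at random moments within the interval.
    for t in sorted(random.uniform(0, INTERVAL_SECONDS)
                    for _ in range(PHOTOS_PER_INTERVAL)):
        time.sleep(max(0.0, start + t - time.time()))
        if not capture_photo():
            return False              # away from desk: interval unpaid
    return measure_activity() >= PRODUCTIVITY_THRESHOLD
```

The point of the sketch is how mechanical the rule is: miss a snapshot or fall below the score and the 10-minute block simply goes unpaid.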

Other recently developed software programs currently in use to monitor the work of call center employees could easily be used to monitor home-based employees doing the same work. Recording the number and length of calls is old hat.  These new programs, using artificial intelligence, can now evaluate the “emotional” tone of the worker’s voice during their conversations with customers.  Some programs can even “coach workers in real time, telling them to speak more slowly or with more energy or to express empathy.” The growing corporate interest in remote work can be expected to spur the development of ever more sophisticated products that will allow even tighter control over at-home work and more detailed evaluation of at-home workers.

The nature of the ongoing transformation of the economy is the third reason that this period may well mark the start of a major shift in the location of work.  Simply stated: unemployment is now high and, when possible, workers welcome a safe alternative to on-site employment.

In the past, on-site work was the standard corporate practice, and most workers preferred it.  Thus, workers were generally able to undermine individual corporate attempts to push them into working from home.  Now, not only is remote work the new norm; because of the virus, it has actually become the desired alternative for many.  With fear of the virus likely to remain for some time, corporations are in a far stronger position than in the past to normalize remote work and win worker acceptance of new work relations even after the pandemic is brought under control.

Benefits and costs

It is easy to understand why corporations are excited about increasing their use of remote work.  One reason is that it will allow them to greatly reduce their spending on facilities.  Gains on the labor side are likely even larger.  Companies will be able to expand their job search, hiring workers who may live thousands of miles away from the location of corporate operations with no need to pay moving expenses and with the possibility of cheapening the cost of labor by paying salaries commensurate with local living costs.  And, as a bonus, the more a company’s labor force is geographically separated and isolated, the harder it will be for its workers to build the bonds of solidarity needed to challenge management demands.

Remote work also opens the door to even greater labor savings by allowing companies to reclassify new hires as independent contractors.  After all, many remote workers are already paying for the equipment they need (desks, chairs, computers, webcams), the supporting technological infrastructure (high-speed Wi-Fi), and office maintenance (cleaning).

Of course, most workers also viewed at-home work positively, at least initially.  They appreciated being able to remain employed and work safely from their homes during the pandemic. But the costs of remote work, as currently structured, are mounting for workers.

As a Bloomberg article summarizes, “We log longer hours. We attend more meetings with more people. And, we send more emails.”  The article highlights a recently published study by the National Bureau of Economic Research, based on data from some 3 million people at more than 21,000 companies across 16 cities in North America, Europe, and the Middle East.  The researchers:

compared employee behavior over two 8 week periods before and after Covid-19 lockdowns. Looking at email and meeting meta-data, the group calculated the workday lasted 48.5 minutes longer, the number of meetings increased about 13% and people sent an average of 1.4 more emails per day to their colleagues.
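The core measurement here is easy to picture: a person’s “workday” is simply the span between their first and last email or meeting of the day.  A toy version of that calculation, with hypothetical field names (the study’s actual pipeline is far more involved), might look like this:

```python
# Toy workday-span calculation from communication metadata, in the
# spirit of the study described above; field names are hypothetical.
from collections import defaultdict
from datetime import datetime

def workday_minutes(events):
    """events: iterable of (user_id, ISO-8601 timestamp) pairs for
    emails and meetings. Returns {(user_id, date): minutes between
    the first and last event of that day}."""
    by_user_day = defaultdict(list)
    for user, ts in events:
        dt = datetime.fromisoformat(ts)
        by_user_day[(user, dt.date())].append(dt)
    return {key: (max(times) - min(times)).total_seconds() / 60
            for key, times in by_user_day.items()}

events = [("alice", "2020-04-06T08:55:00"),
          ("alice", "2020-04-06T18:40:00")]
print(workday_minutes(events))  # 585 minutes: a 9-hour-45-minute span
```

Comparing such spans, averaged over matched pre- and post-lockdown windows, is what yields the 48.5-minute figure reported above.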

An online survey of 20,262 people in 10 countries by the technology company Lenovo Group Ltd. found that “A disturbing 71% of those working from home due to Covid-19 have experienced a new or exacerbated ailment caused by the equipment they now must use. . . the most common symptoms [being] back pain, poor posture (e.g., hunched shoulders), neck pain, eye irritation, insomnia and headaches.”

Looking just at the United States, a study done by NordVPN, based on tracking when at-home workers connected and disconnected from its service, found that at-home workers logged three hours more per day on the job than before the start of city and state lockdowns.  And a survey of 1,001 U.S. employees by Eagle Hill Consulting found that “By early April, about 45% of workers said they were burned out. Almost half attributed the mental toll to an increased workload, the challenge of juggling personal and professional life, and a lack of communication and support from their employer.”

Given the direction of corporate planning, it is likely that the costs of remote work for workers—physical and emotional—will only increase.  As one public relations executive explained when discussing why his company now views remote work so positively: The technology is better. Moreover, “we have rules now,” he said. “You have to be available between 9 a.m. and 5:30 p.m. You can’t use this as child care.”

Challenges ahead

For many workers, it is the pandemic, with its forced isolation of families in small housing units, that has made remote work so difficult and emotionally wearing.  And, for many, the experience of on-site work before the pandemic forced closures was also far from ideal.  Thus surveys show, as the New York Times reports,

Most American office workers are in no hurry to return to the office full time, even after the coronavirus is under control. But that doesn’t mean they want to work from home forever. The future for them, a variety of new data shows, is likely to be workweeks split between office and home.

For example, a mid-June survey by the company Morning Consult found that:

Overall, 73 percent of U.S. adults who have careers where remote work is possible report that the pandemic has made them feel more positively about the prospect of remote work. And given the option, three quarters of these workers say they would like to work from home at least 1-2 days a week once the pandemic is under control.

At issue, then, is who will decide the place of work and perhaps even more importantly, the conditions of work, including remote work.  Current indications are that corporations plan to push workers into more remote work than surveys suggest they want, and definitely under conditions of surveillance and evaluation that they will find objectionable.  It is less clear whether those working remotely or threatened with remote work will be able to organize rapidly enough to force corporations to bargain with them over both the location of work and the work process, on- and off-site, including the aim and uses of new technology.

If there is a reason for optimism it is that there appears to be a growing solidarity between white- and blue-collar workers in the tech industry that includes support for unionization, especially at some of the large firms like Google and Amazon. As Tyler Sonnemaker and Allana Akhtar, writing for Business Insider, describe:

Even a year ago, the idea that tech’s cafeteria workers and office workers were on the same page about forming a labor union would have seemed unthinkable.

The recent wave of employee activism and organizing efforts represents a widening rift between the industry’s rank-and-file employees and its executives. For the first time, developers and product managers with higher pay and closer ties to management are siding with their lower-paid colleagues in warehouses, cafeterias, and contract gigs. . . .

Frequent leaks to the media – notable given the historically tight-knit culture at tech companies – and the emergence of groups like Rideshare Drivers United, Tech Workers Coalition, Athena, and Amazonians United are just two signs of the rise in employee activism in recent years. But over the past few months, emboldened by the pandemic and racial justice protests, workers at startups like Away and giants like Facebook have become a vocal chorus of critics.

Passively allowing management to use technology to shape the work process and the resulting final product is a recipe for ever worsening working and living conditions for the great majority of working people. Hopefully, the ongoing worker agitation and organizing in the United States will continue regardless of the unpredictable nature of the pandemic, producing a shared critique of profit-driven work and support for new organizational forms, including unions, that can fight for a more humane economic system.

Racism, COVID-19, and the fight for economic justice

While the Black Lives Matter protests sweeping the United States were triggered by recent police murders of unarmed African Americans, they are also helping to encourage popular recognition that racism has a long history with punishing consequences for black people that extend beyond policing.  Among the consequences are enormous disparities between black and white well-being and security.  This post seeks to draw attention to some of these disparities by highlighting black-white trends in unemployment, wages, income, wealth, and security. 

A common refrain during this pandemic is that “We are all in it together.”  Although this is true in the sense that almost all of us find our lives transformed for the worse because of COVID-19, it is not true in some very important ways.  For example, African Americans are disproportionately dying from the virus.  They account for 22.4 percent of all COVID-19 deaths despite making up only 12.5 percent of the population.

One reason is that African Americans also disproportionately suffer from serious preexisting health conditions, a lack of health insurance, and inadequate housing, all of which increase their vulnerability to the virus.  Another reason is that black workers are far more likely than white workers to work in “front-line” jobs, especially low-wage ones, forcing them to risk their health and that of their families.  While black workers comprise 11.9 percent of the labor force, they make up 17 percent of all front-line workers.  They represent an even higher percentage in some key front-line industries: 26 percent of public transit workers; 19.3 percent of child care and social service workers; and 18.2 percent of trucking, warehouse, and postal service workers.

African Americans have also disproportionately lost jobs during this pandemic.  The black employment to adult population ratio fell from 59.4 percent before the start of the pandemic to a record low of 48.8 percent in April.  Not surprisingly, recent surveys find, as the Washington Post reports, that:

More than 1 in 5 black families now report they often or sometimes do not have enough food — more than three times the rate for white families. Black families are also almost four times as likely as whites to report they missed a mortgage payment during the crisis — numbers that do not bode well for the already low black homeownership rate.

This pandemic has hit African Americans especially hard precisely because they were forced to confront it from a position of economic and social vulnerability as the following trends help to demonstrate.

Unemployment

The Bureau of Labor Statistics began collecting separate data on African American unemployment in January 1972.  Since then, as the figure below shows, the African American unemployment rate has largely stayed at or above twice the white unemployment rate. 

As Olugbenga Ajilore explains:

Between strides in civil rights legislation, desegregation of government, and increases in educational attainment, employment gaps should have narrowed by now, if not completely closed. Yet as [the figure above] shows, this has not been the case.

Wages

The figure below, from an Economic Policy Institute study, shows the black-white wage gap for workers at different earnings percentiles, by education level, and regression-adjusted (to control for age, gender, education, and regional differences).  As we can see, the wage gap has grown over time regardless of the measure used.

Elise Gould summarizes some important take-aways from this study:

The black–white wage gap is smallest at the bottom of the wage distribution, where the minimum wage serves as a wage floor. The largest black–white wage gap, as well as the one with the most growth since the Great Recession, is found at the top of the wage distribution, explained in part by the pulling away of top earners generally as well as continued occupational segregation, the disproportionate likelihood for white workers to occupy positions in the highest-wage professions.

It’s clear from the figure that education is not a panacea for closing these wage gaps. Again, this should not be shocking, as increased equality of educational access—as laudable a goal as it is—has been shown to have only small effects on class-based wage inequality, and racial wealth gaps have been almost entirely unmoved by a narrowing of the black–white college attainment gap . . . . And after controlling for age, gender, education, and region, black workers are paid 14.9% less than white workers.
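For readers unfamiliar with the term, “regression-adjusted” simply means estimating the gap after holding the listed characteristics constant.  Here is a minimal sketch of such an estimate, using a hypothetical survey extract and column names (not EPI’s actual code or data):

```python
# Minimal sketch of a regression-adjusted wage gap. Hypothetical
# microdata and column names; not EPI's actual estimation code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("wage_survey.csv")   # hypothetical extract
df["log_wage"] = np.log(df["hourly_wage"])

# `black` is a 0/1 indicator; the controls mirror those named above.
fit = smf.ols(
    "log_wage ~ black + age + I(age**2) + C(gender) + C(education) + C(region)",
    data=df,
).fit()

gap = np.expm1(fit.params["black"])   # convert log points to a percent gap
print(f"Adjusted black-white wage gap: {gap:.1%}")
```

A coefficient of roughly -0.16 on `black` would correspond to the 14.9% adjusted gap Gould reports.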

Income

The next figure shows that while median household income has generally stagnated for all races/ethnicities over the period 2000 to 2017, only blacks have suffered an actual decline.  The median income for black households actually fell from $42,348 to $40,258 over this period.  As a consequence, the black-white income gap has grown.  The median black household in 2017 earned just 59 cents for every dollar of income earned by the white median household, down from 65 cents in 2000.

Moreover, as Valerie Wilson points out, “Based on [Economic Policy Institute] imputed historical income values, 10 years after the start of the Great Recession in 2007, only African American and Asian households have not recovered their pre-recession median income.”  Median household income for African American households fell 1.9 percent, or $781, over the period 2007 to 2017.  While the decline was greater for Asian households (3.8 percent), they continued to have the highest median income.

Wealth

The wealth gap between black and white households also remains large.  In 1968, median black household wealth was $6,674 compared with median white household wealth of $70,768.  In 2016, as the figure below shows, it was $13,024 compared with $149,703.

As the Washington Post summarizes:

“The historical data reveal that no progress has been made in reducing income and wealth inequalities between black and white households over the past 70 years,” wrote economists Moritz Kuhn, Moritz Schularick and Ulrike I. Steins in their analysis of U.S. incomes and wealth since World War II.

As of 2016, the most recent year for which data is available, you would have to combine the net worth of 11.5 black households to get the net worth of a typical white U.S. household.
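That multiple is easy to verify from the 2016 medians cited above: $149,703 ÷ $13,024 ≈ 11.5.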

The self-reinforcing nature of racial discrimination is well illustrated in the next figure.  It shows the median household wealth by education level as defined by the education level of the head of household. 

As we see, black median household wealth is below white median household wealth at every education level, with the gap growing with the level of education.  In fact, the median black household headed by someone with an advanced degree has less wealth than the median white household headed by someone with only a high school diploma.  The primary reason is that wealth is passed on from generation to generation, and the history of racism has made it difficult for black families to accumulate wealth, much less pass it on to future generations.

Security

The dollar value of household ownership of liquid assets is one measure of economic security.  The greater the value, the easier it is for a household to weather difficult times, not to mention unexpected crises such as today’s pandemic.  And as one might expect in light of the above income and wealth trends, black households have far less security than white households.

As we can see in the following figure, the median black household held only $8,762 in liquid assets (defined as the sum of all cash, checking and savings accounts, and directly held stocks, bonds, and mutual funds).  In comparison, the median white household held $49,529 in liquid assets.  And the black-white gap is dramatically larger for households headed by someone with a bachelor’s degree or higher.

Hopeful possibilities

The fight against police violence against African Americans, now being advanced in the streets, will eventually have to be expanded and the struggle for racial justice joined to a struggle for economic justice.  Ending the disparities highlighted above will require nothing less than a transformational change in the organization and workings of our economy.

One hopeful sign is the widespread popular support for, and growing participation in, the Black Lives Matter-led movement that is challenging not only racist policing but the idea of policing itself, and is demanding that the country acknowledge and confront its racist past.  Perhaps the ways in which our current economic system has allowed corporations to so quickly shift the dangers and costs of the pandemic onto working people, following years of steady decline in majority working and living conditions, are helping whites better understand the destructive consequences of racism and encouraging this political awakening.

If so, perhaps we have arrived at a moment where it will be possible to build a multi-racial working class-led movement for structural change that is rooted in and guided by a commitment to achieving economic justice for all people of color. One can only hope that is true for all our sakes.

Victory: Ohio’s plan to deny workers their unemployment insurance is shelved

Some stories are just so satisfying that they deserve to be shared.  Here is one.

In early May, Ohio Republican Governor Mike DeWine began reopening the state economy.  And to support business and slash state expenses, both at worker expense, he had a “COVID-19 Fraud” form put up on the Ohio Department of Job and Family Services website where employers could confidentially report employees “who quit or refuse work when it is available due to COVID-19.”  Inspectors would then investigate whether the reported workers should lose their unemployment benefits and possibly be charged with unemployment fraud.

Significantly, as Sarah Ingles, the board president of the Central Ohio Worker Center, noted in a statement quoted by the Intercept, the form “does not define what constitutes a ‘good cause’ exemption, and by doing so, may exclude many Ohio workers who have justifiable reasons for not returning to work and for receiving unemployment insurance benefits.”  In other words, “while the state did not take the time to define what a ‘good cause’ exemption includes or does not include, it did have time to develop an online form where employers could report employees.”

However, thanks to the work of an anonymous hacker, the site has now been taken down. In officialese, “The previous form is under revision pending policy references.”  Most importantly, as Janus Rose writing for Motherboard reports:

“No benefits are being denied right now as a result of a person’s decision not to return to work while we continue to evaluate the policy,” ODJFS Director Kimberly Hall told Cleveland.com.

According to Rose, the hacker developed a script that overwhelmed the system:

The script works by automatically generating fake information and entering it into the form. For example, the companies are taken from a list of the top 100 employers in the state of Ohio—including Wendy’s, Macy’s, and Kroger—and names and addresses are randomly created using freely-available generators found online. Once all the data is entered, the script has to defeat a CAPTCHA-like anti-spam measure at the end of the form. Unlike regular CAPTCHAs, which display a grid of pictures and words that the user must identify, the security tool used by the form is merely a question-and-answer field. By storing a list of common questions and their respective answers, the script can easily defeat the security measure by simply hitting the “switch questions” button until it finds a question it can answer.

To make the code more accessible, software engineer David Ankin repackaged the script into a simple command line tool which allows users to run the script in the background of their computer, continuously submitting fake data to the Ohio website.

“If you get several hundred people to do this, it’s pretty hard to keep your data clean unless you have data scientists on staff,” Ankin told Motherboard.
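For a sense of how simple such a script can be, here is a schematic reconstruction of the logic described in the reporting: generate plausible fake entries, rotate the anti-spam question until a known one appears, and submit.  Everything here is hypothetical, and the submit step is a harmless print stand-in rather than any real endpoint; this illustrates the description above, not the hacker’s actual code.

```python
# Schematic reconstruction of the reported logic; all names are
# hypothetical and submit() is a print stand-in, not a real endpoint.
import random

EMPLOYERS = ["Wendy's", "Macy's", "Kroger"]   # from a list of large Ohio employers
FIRST_NAMES = ["Alex", "Jordan", "Sam"]
LAST_NAMES = ["Smith", "Jones", "Rivera"]

# The form's anti-spam tool was a question-and-answer field; the script
# kept a list of common questions and their answers.
KNOWN_ANSWERS = {"What color is the sky?": "blue",
                 "What is 2 + 2?": "4"}

def next_question():
    """Stand-in for hitting the form's 'switch questions' button."""
    return random.choice(list(KNOWN_ANSWERS) + ["An unrecognized question?"])

def solve_antispam():
    while True:                       # rotate until a known question appears
        question = next_question()
        if question in KNOWN_ANSWERS:
            return question, KNOWN_ANSWERS[question]

def fake_report():
    return {"employer": random.choice(EMPLOYERS),
            "employee": f"{random.choice(FIRST_NAMES)} {random.choice(LAST_NAMES)}"}

def submit(payload):
    print("Would submit:", payload)   # stand-in; no network activity

report = fake_report()
report["antispam"] = solve_antispam()
submit(report)
```

Run in a loop by many people, entries like these would be nearly indistinguishable from genuine reports, which is exactly the point Ankin makes above.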

The hacker told Motherboard they viewed their effort as a form of direct action against the exploitation of working people during the COVID-19 crisis.  Score one for working people.