The United States is an employment “at-will” country. That means that, absent a union contract, a boss can fire a worker for almost any reason, or for no reason at all, and without advance notice. Well, with one exception: Montana. As the state’s employment division explains: “Montana is not an ‘at-will’ state. . . generally, once an employee has completed the established probationary period, the employer needs to have good cause for termination.”
While Montana is the exception in the United States, the United States is the exception among developed capitalist economies. In those other countries, most workers can only be dismissed for “just cause,” with just cause statutorily or judicially defined. For example, German workers employed for more than six months by a company with more than ten workers cannot simply be dismissed. The company must have a valid business or personal conduct reason. Moreover, the company is also required to notify the employee in advance, and in writing, of their termination. Many employees also receive severance pay proportional to their length of employment.
So, how big a deal is employment at-will in the United States? According to a recent survey by the National Employment Law Project (NELP), carried out by YouGov, more than two out of three workers who have been discharged were given either no reason or an unfair reason for their termination. Almost three out of four received no warning before discharge.
With the Federal Reserve pushing up interest rates, we appear headed for a new recession. Sadly, our unemployment insurance system remains broken: too few unemployed receive benefits and the benefits are far too low. As a result, the next recession, when it comes, will again bring unnecessary suffering to millions of workers and their families.
It doesn’t have to be this way. Federal action during the recent pandemic crisis shows how our unemployment system can be dramatically improved. The problem is that many business and political leaders are content with the system as it is now. That means it is up to us to start agitating for reform, and the sooner the better.
President Calvin Coolidge, in a January 1925 speech to newspaper editors, asserted that “the chief business of the American people is business.” The claim, although far from true, did capture the short-lived success of business leaders in structuring the country’s social institutions for the benefit of the wealthy.
Tragically, we appear well into another period when business needs and desires are promoted as consistent with American values and enshrined into law. The pro-business orientation of the current Roberts Supreme Court highlights this reality. As Lee Epstein and Mitu Gulati show in their paper “A Century of Business in the Supreme Court, 1920-2020”:
the Roberts Court may be the most pro-business Court in a century. The win rate for business in the Roberts Court, 63.4 percent, is 15 percentage points higher than the next highest rate of business wins over the past century.
Slip slidin’ away—that is what tends to happen to pro-worker reforms in our economic system. Things are structured so that without constant vigilance and struggle on our part, gains are gradually undone. A case in point: overtime pay.
It wasn’t that long ago that most workers in the US were eligible for time-and-a-half pay for every hour worked beyond a 40-hour work week. Employers didn’t agree to overtime pay out of the goodness of their hearts. They did it because worker organizing and activism pressured Congress to pass a labor law requiring, with some important exceptions, the payment of overtime wages. Now, a significant number of workers no longer have the right to overtime pay. For example, in 1975 more than 60 percent of salaried workers automatically qualified for time-and-a-half pay. That share fell to a low of 4 percent in 2000 before slowly rising to 15 percent in 2020.
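The time-and-a-half rule itself is simple arithmetic. A minimal sketch of the calculation, using an assumed $18 hourly rate for illustration (the rate is not from the article):

```python
def weekly_pay(hourly_rate: float, hours: float, threshold: float = 40.0) -> float:
    """Gross weekly pay with time-and-a-half for hours beyond the threshold."""
    regular = min(hours, threshold) * hourly_rate
    overtime = max(hours - threshold, 0.0) * hourly_rate * 1.5
    return regular + overtime

# A worker at an assumed $18/hour who puts in 48 hours:
# 40 * 18 = 720 regular, plus 8 * 18 * 1.5 = 216 overtime
print(weekly_pay(18.0, 48))  # 936.0
```

For the growing share of salaried workers classified as exempt, that extra $216 for the same 48-hour week simply never appears in the paycheck.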
President Biden’s 2022 State of the Union Address included a call for a $15 federal minimum wage. According to an Economic Policy Institute study, a phased increase to a $15 federal minimum wage by 2025 would raise the earnings of 32 million workers—21 percent of the workforce, no small thing.
The current federal minimum wage is $7.25. The federal minimum wage was established in 1938 as part of the Fair Labor Standards Act. Congress has voted to raise it nine times since then, most recently in 2007. That last vote mandated a three-step increase that brought the wage to its current level in July 2009.
It has been 13 years since the last increase in the federal minimum wage, the longest period since its establishment without an increase. Taking inflation into account, workers paid the federal minimum wage in 2021 earned 21 percent less than what their counterparts earned in 2009, and prices keep rising. Outrageously, this eroding federal minimum wage continues to set the wage floor in 20 states. Where is the justice in that?
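The 21 percent figure follows from standard CPI deflation. A back-of-the-envelope check, using approximate CPI-U annual averages (the values of roughly 214.5 for 2009 and 271.0 for 2021 are my assumptions here, not figures from the article):

```python
# Deflate the unchanged nominal minimum wage into 2009 dollars via the CPI ratio.
CPI_2009, CPI_2021 = 214.5, 271.0  # assumed approximate CPI-U annual averages
nominal_wage = 7.25                # unchanged since July 2009

real_2021_wage_in_2009_dollars = nominal_wage * CPI_2009 / CPI_2021
pct_loss = (1 - real_2021_wage_in_2009_dollars / nominal_wage) * 100
print(f"${real_2021_wage_in_2009_dollars:.2f} in 2009 dollars, "
      f"{pct_loss:.0f}% less purchasing power")
```

In other words, a 2021 minimum-wage worker took home roughly $5.74 an hour in 2009 purchasing power, about 21 percent less than in 2009.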
According to defenders of the status quo, the best response to our most serious problems is to let markets work their magic; government regulation of private business activity only makes things worse. That is certainly the line that big finance is pushing when it comes to our ever-worsening climate crisis.
A case in point is the growing popularity of ESG (Environmental, Social, and Governance) investing. If you want to save the world, the pitch goes, put your money in ESG funds, which, according to money managers, will guarantee that your money rewards those companies that value sustainability as well as human and worker rights. What could be simpler?
One problem with this strategy is that ESG investing is largely a fraud, one that allows leading asset management companies to dramatically boost their profits, and the rest of the business community to continue on with their destructive business practices without fear of bad publicity or public action. The end result—the planetary crisis continues unabated and the investing public gets fleeced.
The Green New Deal has become a rallying cry for activists seeking to build a mass movement capable of addressing our ever-worsening, and increasingly interrelated, climate and social crises. Building such a movement is no simple task, but I believe that our organizing efforts can greatly benefit from a careful study of the rapid transformation of the US economy from civilian to military production during World War II.
In two recent publications, with links below, I describe and evaluate the planning process responsible for the wartime transformation and offer my thoughts on some of the key lessons to be learned. In what follows I highlight some of the reasons why I believe Green New Deal advocates would benefit from careful study of the wartime experience.
There appears to be a growing consensus among economists and policymakers that inflation is now the main threat to the US economy and that the Federal Reserve Board needs to start ratcheting up interest rates to slow economic activity. While these so-called inflation hawks are quick to highlight the cost of higher prices, they rarely, if ever, mention the costs associated with the higher-interest-rate policy they recommend, costs that include higher unemployment and lower wages for working people.
The call for tightening monetary policy is often buttressed by claims that labor markets have now tightened to such an extent that continued expansion could set off a wage-price spiral. However, the rapid decline in the unemployment rate to historically low levels, a development often cited in support of this call for austerity, is far from the best indicator of labor market conditions. In fact, even leaving aside issues of job quality, the US employment situation, as we see below, remains problematic. In short: the US economy continues to operate in ways that fall far short of what workers need.
Are you searching for a way to highlight the negative consequences of racism? Try this: Justin M. Feldman and Mary T. Bassett, in a recently published study, found that if everyone living in the United States aged 25 years or older had died of COVID-19 at the same rate as college-educated non-Hispanic white people did in 2020, 48 percent fewer people would have died, 71 percent fewer people of color would have died, and 89 percent fewer people of color aged 25-64 would have died.
Pretty much everyone accepts that inequality is a big problem in the US. But it is doubtful that most people truly grasp how successfully US elites have captured the benefits of economic growth and, as a result, how much the resulting inequality has cost everyone else. Here is one estimate of that cost—according to Carter C. Price and Kathryn A. Edwards, authors of a Rand Education and Labor study on income trends:
[the] aggregate income for the population below the 90th percentile . . . would have been $2.5 trillion (67 percent) higher in 2018 had income growth since 1975 remained as equitable as it was in the first two post-War decades. From 1975 to 2018, the difference between the aggregate taxable income for those below the 90th percentile and the equitable growth counterfactual totals $47 trillion.
That $2.5 trillion was enough to give each and every worker in the bottom nine income deciles an additional $1,144 a month, every month of the year. That is life-changing money for tens of millions—and that is only a partial measure of the costs of inequality.
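The per-worker figure can be checked against the aggregate. Working backward from the article’s numbers (the implied worker count is my inference, not a figure the Rand study states directly):

```python
# Back out the implied worker count from the article's two figures.
aggregate_gap = 2.5e12       # 2018 income shortfall below the 90th percentile
monthly_per_worker = 1_144   # additional monthly income per worker

implied_workers = aggregate_gap / (monthly_per_worker * 12)
print(f"{implied_workers / 1e6:.0f} million workers")  # ~182 million
```

Dividing $2.5 trillion across a year of monthly payments of $1,144 implies roughly 182 million workers in the bottom nine deciles, which is consistent in scale with the study’s population below the 90th percentile.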