What the New Deal can teach us about winning a Green New Deal: Part III—the First New Deal

In Part I and Part II of this series on lessons to be learned from the New Deal, I argued that despite the severity of the Great Depression, sustained organizing was required to transform the national political environment and force the federal government to accept direct responsibility for financing relief and job creation programs. In this post, I begin an examination of the evolution and aims of New Deal programs in order to highlight the complex and conflictual nature of a state-directed reform process.

The New Deal is often talked about as if it were a set of interconnected programs that were introduced at one moment in time to reinvigorate national economic activity and ameliorate the hardships faced by working people.  Advocates for a Green New Deal, which calls for a new state-led “national, social, industrial, and economic mobilization” to confront our multiple interlocking problems, tend to reinforce this view of the New Deal.  It is easy to understand why: state action is desperately needed, and pointing to a time in history when it appears that the state rose to the occasion, developing and implementing the programs necessary to solve a crisis, makes it easier for people to envision and support another major effort.

Unfortunately, this view misrepresents the experience of the New Deal.  And, to the extent it influences our approach to shaping and winning a Green New Deal, it weakens our ability to successfully organize and promote the kind of state action we want.

The New Deal actually encompasses two distinct periods: the First New Deal began in 1933, the Second New Deal in 1935.  In both periods, the programs designed to respond to working class concerns fell far short of popular demands.  In fact, it was continued mass organizing, spearheaded by an increasingly unified unemployed movement and an invigorated trade union movement, that pushed the Roosevelt administration to initiate its Second New Deal, which included new and significantly more progressive initiatives.

Unfortunately, as those social movements lost energy and vision in the years that followed, pressure on the state for further change largely abated, leaving the reforms that had been won compromised and vulnerable to future attack.  The lesson from this history for those advocating for a Green New Deal is clear: winning a Green New Deal requires, in addition to carefully constructed policy demands, an approach to movement building that prepares people for a long struggle to overcome expected state efforts to resist the needed transformative changes.

The First New Deal

Roosevelt’s initial policies were largely consistent with those of the previous Hoover administration.  Like Hoover, he sought to stabilize the banking system and balance the budget.  On his first day in office Roosevelt declared a national bank “holiday,” dismissing Congressional sentiment for bank nationalization.  He then rushed through a new law, the Emergency Banking Act, which gave the Comptroller of the Currency, the Secretary of the Treasury, and the Federal Reserve new powers to ensure that reopened banks would remain financially secure.

On his sixth day in office, he requested that Congress cut $500 million from the $3.6 billion federal budget, eliminate government agencies, reduce the salaries of civilian and military federal workers, and slash veterans’ benefits by 50 percent.  Congressional resistance led to spending cuts of “only” $243 million.

Roosevelt remained committed, against the advice of many of his most trusted advisers, to balanced budget policies for most of the decade.  While his administration did boost government spending to nearly double the levels of the Hoover administration, it also collected sufficient taxes to keep deficits low.  It wasn’t until 1938 that Roosevelt proposed a Keynesian-style deficit spending plan.

At the same time, facing escalating demands for action from the unemployed as well as many elected city leaders, Roosevelt also knew that the status quo was politically untenable.  And, in an effort to halt the deepening depression and growing militancy of working people, he pursued a dizzying array of initiatives, most within his first 100 days in office.  The great majority were aimed at stabilizing or reforming markets, which Roosevelt believed was the best way to restore business confidence, investment, and growth.  This emphasis is clear from the following list of some of his most important initiatives.

  • The Agricultural Adjustment Act (May 1933). The act sought to boost the prices of agricultural goods. The government bought livestock and paid subsidies to farmers in exchange for reduced planting. It also created the Agricultural Adjustment Administration to manage the payment of subsidies.
  • The Securities Act of 1933 (May 1933). The act sought to restore confidence in the stock market by requiring that securities issuers disclose all information necessary for investors to be able to make informed investment decisions.
  • The Home Owners’ Loan Act of 1933 (June 1933). The act sought to stabilize the finance industry and housing industry by providing mortgage assistance to homeowners. It created the Home Owners Loan Corporation which was authorized to issue bonds and loans to help homeowners in financial difficulties pay their mortgages, back taxes, and insurance.
  • The Banking Act of 1933 (June 1933). The act separated commercial and investment banking and created the Federal Deposit Insurance Corporation to insure bank deposits, curb bank runs, and reduce bank failures.
  • Farm Credit Act (June 1933). The act established the Farm Credit System as a group of cooperative lending institutions to provide low cost loans to farmers.
  • National Industrial Recovery Act (June 1933). Title I of the act suspended anti-trust laws and required companies to write industrywide codes of fair competition that included wage and price fixing, the establishment of production quotas, and restrictions on market entry.  It also gave workers the right to organize unions, although without legal protection.  Title I also created the National Recovery Administration to encourage business compliance.  The Supreme Court ruled the suspension of anti-trust laws unconstitutional in 1935.  Title II, which established the Federal Emergency Administration of Public Works or Public Works Administration, is discussed below.

Roosevelt also pursued several initiatives in response to working class demands for jobs and a humane system of relief.  These include:

  • The Emergency Conservation Work Act (March 1933). The act created the Civilian Conservation Corps which employed jobless young men to work in the nation’s forests and parks, planting trees, reducing erosion, and fighting fires.
  • The Federal Emergency Relief Act of 1933 (May 1933). The act created the Federal Emergency Relief Administration to provide work and cash relief for the unemployed.
  • The Federal Emergency Administration of Public Works or Public Works Administration (June 1933). Established under Title II of the National Industrial Recovery Act, the Public Works Administration was a federally funded public works program that financed private construction of major public projects such as dams, bridges, hospitals, and schools.
  • The Civil Works Administration (November 1933).  Established by executive order, the Civil Works Administration was a short-lived jobs program that employed jobless workers at mostly manual-labor construction jobs.

This is without doubt an impressive record of accomplishments, and it doesn’t include other noteworthy actions, such as the establishment of the Tennessee Valley Authority, the ending of prohibition, and the removal of the US from the gold standard.  Yet, when looked at from the point of view of working people, this First New Deal was sadly lacking.

Roosevelt’s pursuit of market reform rather than deficit spending meant a slow recovery from the depths of the depression.  In fact, John Maynard Keynes wrote Roosevelt a public letter in December 1933, pointing out that the Roosevelt administration appeared more concerned with reform than recovery or, to be charitable, was confusing the former with the latter.  Primary attention, he argued, should be on recovery, and that required greater government spending financed by loans to increase national purchasing power.

Roosevelt also refused to address one of the unemployed movement’s major policy demands: the establishment of a federal unemployment insurance fund financed by taxes on the wealthy.  Finally, as we see next, even the New Deal’s early job creation and relief initiatives were deliberately designed in ways that limited their ability to meaningfully address their targeted social concerns.

First New Deal employment and relief programs

The Roosevelt administration’s first direct response to the country’s massive unemployment was the Civilian Conservation Corps (CCC).  Its enrollees, as Roosevelt explained, were to be “used in simple work, not interfering with normal employment and confining itself to forestry, the prevention of soil erosion, flood control, and similar projects.”  The project was important for establishing a new level of federal responsibility, as employer of last resort, for boosting employment.  Over its nine-year lifespan, its participants built thousands of miles of hiking trails, planted millions of trees, and fought hundreds of forest fires.

However, the program was far from meeting the needs of the tens of millions of jobless people and their dependents.  Participation in the program was limited to unmarried male citizens, 18 to 25 years of age, whose families were on local relief, and who were able to pass a physical exam.  By law, maximum enrollment in the program was limited to 300,000.

Moreover, although the CCC provided its participants with shelter, clothing, and food, the wages it paid, $30 a month ($25 of which had to be sent home to their families), were low.  And, while white and black enrollees were supposed to be housed together in the CCC camps, where participants lived under Army supervision, many of the camps were segregated, with whites given preference for the best jobs.

Two months later, the Roosevelt administration launched the Federal Emergency Relief Administration (FERA), the first program of direct federal financing of relief.  Under the Hoover administration, the federal government had restricted its support of state relief efforts to the offer of loans.  Because of the precariousness of their own financial situation, many states were unable to take on new debt, and were thus left with no choice but to curtail their relief efforts.

FERA, in contrast, offered grants as well as loans, providing approximately $3 billion in grants over its 2½-year lifespan. The grants allowed state and local governments to employ people who were on relief rolls to work on a variety of public projects in agriculture, the arts, construction, and education.  FERA grants supported the employment of over 20 million people, or about 16 percent of the total population of the United States.

However, the program suffered from a number of shortcomings.  FERA provided funds to the states on a matching basis, with states required to contribute three dollars for every federal dollar.  This restriction meant that a number of states, struggling with budget shortfalls, either refused to apply for FERA grants or kept their requests small.

Also problematic was the program’s requirement that participants be on state relief rolls.  This meant that only one person in a family was eligible for FERA work.  And the amount of pay or relief was determined by a social worker’s evaluation of the extent of the family’s financial need.  Many states had extremely low standards of necessity, resulting in either low wages or inadequate relief payments which could sometimes be limited to coupons exchangeable only for food items on an approved list.

Finally, FERA was not directly involved in the administration and oversight of the projects it funded. This meant that compensation for work and working conditions differed across states.  It also meant that in many states, white males were given preferential treatment.

A month later, the Public Works Administration (PWA) was created as part of the National Industrial Recovery Act.  The PWA was a federal public works program that financed private construction of major long-term public projects such as dams, bridges, hospitals, and schools.  Administrators at PWA headquarters planned the projects and then gave funds to appropriate federal agencies to enable them to help state and local governments finance the work. The PWA played no role in hiring or production; private construction companies carried out the work, hiring workers on the open market.

The program lasted for six years, spent $6 billion, and helped finance a number of important infrastructure projects.  It also gave federal administrators valuable public policy planning experience, which was put to good use during World War II.  However, as was the case with FERA, PWA projects required matching contributions from state and local governments, and given their financial constraints, the program never spent as much money as was budgeted.

These programs paint a picture of a serious but limited effort on the part of the Roosevelt administration to help workers weather the crisis.  In particular, the requirement that states match federal contributions to receive FERA and PWA funds greatly limited their reach.  And, the participant restrictions attached to both the CCC and FERA meant that program benefits were far from adequate.  Moreover, because all of these were new programs, it often took time for administrators to get funds flowing, projects developed, participants chosen, and benefits distributed.  Thus, despite a flurry of activity, millions of workers and their families remained in desperate conditions with winter approaching.

Pressed to do more, the Roosevelt administration launched its final First New Deal jobs program in November 1933, the Civil Works Administration (CWA), under the umbrella of FERA.  It was designed to be a short-term program, and it lasted only six months, with most employment creation ending after four months.  The jobs created were primarily low-skilled construction jobs, improving or constructing roads, schools, parks, airports, and bridges. The CWA gave jobs to some 4 million people.

This was a dramatically different program from those discussed above.  Most importantly, employment was not limited to those on relief, greatly enlarging the number of unemployed who could participate.  At the end of Hoover’s term in office, only one unemployed person out of four was on a relief roll.  It also meant that participants would not be subject to the relief system’s humiliating means tests or have their wages tied to their family’s “estimated budgetary deficit.”  Also significant was the fact that although many of the jobs were inherited from current relief projects, CWA administrators made a real effort to employ their workers in new projects designed to be of value to the community.

For all of these reasons, jobless workers flocked to the program, seeking an opportunity to do, in the words of the time, “real work for a real wage.”   As Harry Hopkins, the program’s chief administrator, summed up in a talk shortly after the program’s termination:

When we started Civil Works we said we were going to put four million men to work.  How many do you suppose applied for those four million jobs? About ten million. Now I don’t say there were ten million people out of work, but ten million people walked up to a window and stood in line, many of them all night, asking for a job that paid them somewhere between seven and eighteen dollars a week.

In point of fact, there were some fifteen million people unemployed.  And as the demand for CWA jobs became clear, Roosevelt moved to end the program.   As Jeff Singleton describes:

In early January Hopkins told Roosevelt that CWA would run out of funds sooner than expected.  According to one account, Roosevelt “blew up” and demanded that Hopkins begin phasing out the program immediately.  On January 18 Hopkins ordered weekly wages cut (through a reduction in hours worked) and hinted that the program would be terminated at the beginning of March.  The cutback, coming at a time when the program had just reached its promised quota, generated a storm of protest and a movement in Congress to continue CWA through the spring of 1934.  These pressures helped the New Deal secure a new emergency relief appropriation of $950 million, but the CWA was phased out in March and April.

Lessons

The First New Deal did represent an important change in the economic role of the federal government.  In particular, the Roosevelt administration broke new ground in acknowledging federal responsibility for job creation and relief.  Yet, the record of the First New Deal also makes clear that the Roosevelt administration was reluctant to embrace the transformative role that many now attribute to it.

As Keynes pointed out, Roosevelt’s primary concern in the first years of his administration was achieving market stability through market reform, not a larger financial stake in the economy to speed recovery.  In fact, in some cases, his initiatives gave private corporations even greater control over market activity.

The Roosevelt administration’s response to worker demands for jobs and a more humane system of welfare was also far from transformative.  Determined to place limits on federal spending, its major initiatives required substantial participation from struggling state governments.  They also did little to challenge the punitive and inadequate relief systems operated by state governments.  The one exception was the CWA, which mandated wage-paying federally directed employment.  And that was the one program, despite its popularity, that was quickly terminated.

Of course, there was a Second New Deal, which included a number of important and more progressive initiatives, including the Works Progress Administration, the Social Security Act, and the National Labor Relations Act.  However, as I will discuss in the next post in this series, this Second New Deal was largely undertaken in response to the growing strength of the unemployed movement and workplace labor militancy.   And as we shall see, even these initiatives fell short of what many working people demanded.

One lesson to be learned from this history for those advocating a Green New Deal is that major policy transformations do not come ready-made or emerge fully developed.  Even during a period of exceptional crisis, the Roosevelt administration was hesitant to pursue truly radical experiments.  And the evolution of its policy owed far more to political pressure than to the maturation of its administrative capacities or a newfound determination to experiment.

If we hope to win a Green New Deal we will have to build a movement that is not only powerful enough to push the federal government to take on new responsibilities with new capacities, but also has the political maturity required to appreciate the contested nature of state policy and the vision necessary to sustain its forward march.

What the New Deal can teach us about winning a Green New Deal: Part II—Movement Building

In Part I of this series on lessons to be learned from the New Deal, I described the enormous economic and social costs of the first years of the Great Depression and the reluctance of business and government leaders to pursue policies likely to threaten the status quo.  I did so to demonstrate that we should not assume that simply establishing the seriousness of our current multifaceted crisis, especially one that has yet to directly threaten capitalist profitability, will be enough to win elite consideration of a transformative Green New Deal.

I also argued that it was the growth of an increasingly militant political movement openly challenging the legitimacy of the police, courts, and other state institutions that finally transformed the national political environment and pushed Roosevelt to change course and introduce his early New Deal employment and relief programs.  In this post, I examine the driving force of this movement, the movement of the unemployed.

The growth and effectiveness of the unemployed movement owes much to the organizing and strategic choices of the US Communist Party (CP).  While there is much to criticize about CP policies and activities, especially its sectarianism and aggressive antagonism towards other groups, there is also much we can learn about successful organizing from its work with the unemployed in the early years of the depression.

The party faced the challenge of building a mass movement powerful enough to force a change in government policy. Although its initial victory was limited, the policy breakthrough associated with the programs of the First New Deal led to new expectations and demands, culminating in Roosevelt’s adoption of far more extensive employment and relief policies as part of his Second New Deal, only two years later.

We face a similar challenge today; we need to build a mass movement capable of forcing the government to begin adopting policies that help advance a Green New Deal.  Therefore, it is well worth our time to study how party activists built a national organization of the unemployed that helped the unemployed see that their hard times were the result of structural rather than personal failure; encouraged local, collective, and direct action in defense of immediate shared basic needs; and connected local actions to a broader national campaign for government action.

The CP and the unemployed movement

The CP made its decision to organize the unemployed even before the start of the Great Depression.  In August 1929, two months before the stock market crash, the CP established the Trade Union Unity League (TUUL) as an alternative to the AFL and called on that body to assist in the creation of a nation-wide organization of Unemployed Councils (UCs).

The CP was following the lead of the Communist International, which had, in 1928, declared the start of the so-called Third Period, said to mark the beginning of capitalism’s terminal stage, and called on all communist parties to end their joint work with other organizations and prepare for the coming revolutionary struggle.  This early start meant that as unemployment exploded, those without work had the benefit of an existing organization to give them a voice and an instrument of action.  Unfortunately, it also led to destructive attacks on other political tendencies and efforts to build organizations of the unemployed, thereby weakening the overall effort.

The CP’s first big effort directed towards the unemployed was the March 6, 1930 demonstrations against unemployment and for relief, which drew some 500,000 people in twenty-five cities and were organized under the banner of “International Day for Struggle against Worldwide Unemployment.”  The New York City demonstration, the largest, was met by police repression, with many demonstrators beaten and arrested.  But another New York City protest by the unemployed in October produced a victory, with the city agreeing to boost relief spending by $1 million.  These actions created visibility for the CP’s fledgling national network of UCs and helped to build its membership.

The Unemployed Councils of the USA held its founding convention in early July 1930.  The following month it issued a statement calling on Congress to adopt its “Workers Unemployment Insurance Bill.” The bill called for “payment of $35 per week for each unemployed worker plus an additional $5 per week per dependent and the creation of a ‘National Unemployment Insurance Fund’ to be generated through a tax on all property valued in excess of $25,000 and incomes of more than $5,000.” A new Workers’ Commission, to be elected by working people, was to control the distribution of funds.

To this point, the Unemployed Councils of the USA was dominated by the CP, and its general program and demands largely echoed those of the CP, often including foreign policy declarations expressing support for the Soviet Union.  However, in November, finally acknowledging that this dominance was limiting recruitment, the party agreed to give its organizers more independence and freedom to focus on the issues of most direct concern to the unemployed.  In the months that followed, “a wave of rent strikes, eviction fights, and hunger marches involving an estimated 250,000 workers in seventy-five cities and six states swept the country. The Unemployed Councils had become a force to be reckoned with.”

The party’s focus on building a confrontational movement operating both locally and nationally led it to reject a variety of other efforts embraced by some unemployed.  As Franklin Folsom describes:

Early in 1931, some leaders of Unemployed Councils had recommended setting up food kitchens, and Communists helped organize food collections. These were humane acts of assistance to people who needed something to eat immediately. In a few months, however, both the Communists and the Unemployed Councils abandoned the idea, saying it had nothing to do with solving the basic problems of the unemployed.  Similarly, Communist and council policy on the subject of looting varied depending on time and place.  In the early days of mass unemployment some Communists encouraged the direct appropriation of food.  Later the practice was frowned on because it solved no long-term problem and could provoke very costly counteraction.

Many unemployed also turned to self-help activities to survive.  The so-called “productive enterprise” movement, in which unemployed workers sought to create their own enterprises to produce either for the market or for barter, spread rapidly.  According to one study, by the end of 1932 this movement was active in thirty-seven states, with the largest group in California.  The CP and UCs opposed this effort from the start, calling it a self-starvation movement.

The organization and activity of the UCs

Most UCs were neighborhood centered, since the unemployed generally spent most of their time in the neighborhoods where they lived. The basic unit of the UC was the block committee, which comprised all unemployed local residents and their family members.  Each block committee elected delegates to a neighborhood unemployed council, and these councils, in turn, elected delegates to county or city unemployed councils.

The block committee office served as a social center, where the unemployed could gather and build relationships.  Through conversation and even more importantly action they were also able to develop a new radical understanding of the cause of their unemployment as well as appreciation for collective power.  As Steve Nelson, a leader of the Chicago UC movement, explained, it was important for the unemployed to “see that unemployment was not the result of their own or someone else’s mistake, that it was a worldwide phenomenon and a natural product of the system.” Thus, “unemployed agitation was as much education as direct action.”

With time on their hands, the unemployed were generally eager to act in defense of their neighbors, especially around housing and relief.  Here is Christine Ellis, a UC organizer, talking about what happened at one UC meeting in a black neighborhood on the west side of Chicago:

We spoke simply, explained the platform, the demands and activities of the unemployed council. And then we said, “Are there any questions?”…. Finally an elderly Black man stood up and said, “What you folks figure on doing about that colored family that was thrown out of their house today?… They’re still out there with their furniture on the sidewalk.” So the man with me said, “Very simple. We’ll adjourn the meeting, go over there, and put the furniture back in the house. After that, anyone wishing to join the unemployed council and build an organization to fight evictions, return to this hall and we’ll talk about it some more.” That’s what we did…everybody else pitched in, began to haul in every last bit of furniture, fix up the beds…and when that was all done, went back to the hall. The hall was jammed!

Carl Winder, another UC activist, describes the response of the councils in New York to attempted evictions for nonpayment of rent:

Squads of neighbors were organized to bar the way to the dispossessing officers.  Whole neighborhoods were frequently mobilized to take part in this mutual assistance.  Where superior police force prevailed, it became common practice for the Unemployed Councils to lead volunteer squads in carrying the displaced furniture and belongings back into the home after the police had departed.  Council organizers became adept in fashioning meter-jumps to restore disconnected electric service and gas.

Hosea Hudson, a UC activist in Alabama, tells how landlords in Birmingham would sometimes allow tenants to stay even without paying rent “because if they put a family out, the unemployed workers would wreck the house and take it away for fuel by night…. This was kind of a free-for-all, a share-the-wealth situation.”

No Work, No Rent! was the common chant at UC anti-eviction actions.  And because UCs were part of a national organization, successful strategies in one area were quickly shared with UCs in another, spurring new actions.  According to one account, UCs had practically stopped evictions in Detroit by March 1931.  It was estimated that in 1932, 77,000 New York City families were moved back into their homes by UCs.  At the same time, these were costly actions. The police would often arrest many of those involved as well as use force to end resistance, leading to serious injuries and in some cases deaths.

UCs also mobilized to help people who were turned down for relief assistance.  Normally, UC organizers would gather a large crowd outside the relief agency and send in an elected committee to demand a meeting to reverse the decision.  Here is Hosea Hudson again, explaining the approach of the Birmingham UC:

If someone get out of food and been down to the welfare two or three times and still ain’t got no grocery order…. We’d go to the house of the person that’s involved, the victim, let her tell her story. Then we’d ask all the people, “What do you all think could be done about it?” We wouldn’t just jump up and say what to do. We let the neighbors talk about it for a while, and then it would be some of us in the crowd, we going to say, “If the lady wants to go back down to the welfare, if she wants, I suggest we have a little committee to go with her and find out what the condition is.”

In New York, UC members would often organize sit-ins at the relief office and refuse to leave until the center reversed a negative decision.  Intimidated by the aggressive protests, local relief officials throughout the country increasingly gave ground and approved relief requests.

This kind of activism directly challenged business and elite claims that prosperity was just around the corner.  It also revealed a growing radical spark, as more and more people openly challenged the legitimacy of the police, the court system, and state institutions.

With demands for relief escalating, cash-strapped relief agencies began pressing city governments for additional funds.  But city budgets were shrinking too.  As Danny Lucia reports in his study of unemployed organizing, this was an explosive situation.  In 1932, with Chicago’s unemployment rate at 40 percent, “Mayor Anton Cermak told Congress to send $150 million today or federal troops in the future.”

Thus, the militancy of the unemployed movement was now pushing mayors and even some business leaders to also press for federal action.  This development served to amplify the UCs’ own state and national campaigns demanding direct job creation and a program of federal relief.  These campaigns, by design, also helped generate publicity and support for local UC actions.

For example, in January 1931, a gathering of the Unemployed Councils of America and the TUUL decided to launch a national petition drive aimed at forcing Congress to pass a Federal Unemployment Insurance bill.  The UCs then began door-to-door canvassing for signatures.  Approximately a month later a delegation of 140 people was sent to Washington DC to deliver the petition to Congress on National Unemployment Insurance Day.  Demonstrations in support of the petition, organized by UCs, were held in most major cities on the same day.

Not long after, the CP set up a new organization, the Unemployed Committee for the National Hunger March, to coordinate a national hunger march on Washington DC to demand federal unemployment insurance and “the granting of emergency winter relief for the unemployed in the form of a lump-sum payment of $150 per unemployed worker, with an additional $50 for each dependent” as well as “a 7-hour workday, establishment of a union wage pay scale for unemployed workers, payment of a soldiers’ bonus to veterans of World War I, and an end to discrimination against black American and foreign-born workers.”  Local conferences selected 1,670 delegates, who converged on Washington in four separate columns in December 1931.  Their trip across the country was supported by local UCs.

Not surprisingly, the delegates were denied entrance to the Capitol to present their demands.  They stayed two days and then started back, holding mass meetings across the country on the return trip to talk about their demands and the need for mass action to win them.

Another National Hunger March took place the following year.  This time 3,000 delegates came to Washington DC to again present their demands for winter relief and unemployment insurance.  These marches not only helped strengthen the movement of the unemployed but also greatly increased the pressure on elected officials to take some action to restore popular confidence in the government.

Underpinning the strategic orientation of the UCs’ work was the CP’s determination to build solidarity between the labor movement and the unemployed, and to forge anti-racist unity.  The first is highlighted by struggles in Detroit, where most unemployment was the result of auto factory layoffs.  There, the UCs and the Young Communist League led several marches to auto plants to protest the inadequate benefits given to laid-off workers.  Organizers would also read statements aimed at the workers still employed in the plants, pledging that the unemployed would not scab if workers struck for improved conditions.

As for anti-racism work, the CP “made sure that all of its agitation in the unemployed councils included protests against racial discrimination by relief agencies, landlords, and local and federal government.  On a more individual level, the Communists’ emphasis on multiracial organizing created situations in which whites and Blacks worked together for a common purpose and created personal bonds.”

Other organizing efforts

The CP was not the only left organization working to build a movement of the unemployed.  Both the Socialist Party and the Conference of Progressive Labor Action (CPLA), led by A.J. Muste, also created unemployed organizations that mobilized hundreds of thousands of jobless workers in local and national protests.  The Socialist Party created affiliated committees in a number of cities, the largest in Chicago and New York.  These committees were, like the UCs, generally oriented towards direct action in response to local conditions, but they also engaged in electoral efforts.

The CPLA organized a number of Unemployed Citizen Leagues (UCLs) following the model of the Seattle Unemployed Citizens League. Established in the summer of 1931, the Seattle UCL quickly grew to a membership of 80,000 by 1933.  The UCLs initially focused on self-help through barter and labor exchange.  For example, members of the Seattle league:

persuaded farmers to let them harvest the fruit and potatoes for which there was no market, and they borrowed trucks to transport this produce.  Women exchanged sewing for food.  Barbers cut hair for canned berries.  This practice of barter spread and was highly organized. . . . Some men collected firewood from cutover forested areas; in all, they cut, split, and hauled 11,000 cords.  The products of these labors were shared by UCL members.  Some members repaired houses or worked in shoe repair shops, while others did gardening.  There were also child welfare and legal aid projects in which lawyers contributed their services.

The UCLs were also active in local elections, supporting candidates and legislation in favor of extended relief aid and unemployment insurance.  However, after a few years, most abandoned their focus on self-help, finding that “the needs of the jobless greatly exceeded the ability of a mutual aid program to meet them,” and turned instead to more direct-action protests similar to those of the UCs.  Although the CPLA failed to develop a national presence, its leagues were important in the Midwest, especially in Ohio.

The CP was hostile to these organizations and their organizing efforts.  In line with its Third Period strategy, the CP considered them a danger to the movement it was trying to build and their leaders to be “social-fascists.”  Party opposition went beyond denouncing these groups: UC activists were encouraged to undermine their work, sometimes by physical force, other times by infiltrating and disrupting their meetings.  This sectarianism clearly weakened the overall strength of the unemployed movement.  At the same time, local UC activists would sometimes ignore CP and UC leadership directives and find ways to build solidarity around joint actions on behalf of the unemployed.

The unemployed were not the only group whose organizing threatened the status quo.  As Steve Fraser pointed out: “Farmers took to the fields and roads in shocking displays of lawlessness. All across the corn belt, rebels banded together to forcibly prevent evictions of fellow farmers.” The Farm Holiday Association, an organization of midwestern farmers founded in 1932, not only mobilized its members to resist evictions, it also supported a progressive income tax, federal relief for the urban unemployed, and federal government control of the banks.  “In the South, tenants and sharecroppers unionized and conducted what a Department of Labor study called a ‘miniature civil war.’”

Veterans also organized.  World War I veterans from around the country, many with their families, traveled to Washington DC in the summer of 1932.  The call for a national Bonus March, although made by a largely anti-communist leadership, was inspired by the CP-organized First National Hunger March.  The veterans had been promised a bonus to compensate for their low wartime pay, but Congress had delayed payment until 1945.  The veterans wanted their money immediately and set up camps near the Capitol to pressure Congress to act.  Their camps were destroyed and the veterans violently dispersed by troops led by Douglas MacArthur and Dwight Eisenhower.

In short, the political trajectory was one that concerned a growing number of political and business leaders.  Working people, largely anchored by a left-promoted, mass-based movement of the unemployed, were becoming increasingly militant and dismissive of establishment calls for patience.  Continued federal inaction was becoming ever more dangerous.  Recognizing the need for action to preserve existing structures of power, Roosevelt took only three months to drop his commitment to balanced budget orthodoxy in favor of New Deal experimentation.

Lessons

The multifaceted crisis we face today is significantly different from the crisis activists faced in the first years of the Great Depression.  But there is no question that, much like then, we will need to build a powerful mass movement for change if we hope to harness state power to advance a Green New Deal.

The First New Deal was not the result of administration concerns over the economic and social costs of the Great Depression.  Rather, it was political pressure that forced Roosevelt to begin experimenting with programs responsive to the concerns of working people.  And, not surprisingly, these experiments were, as will be discussed in the next post in this series, quite limited. It took new organizing to push Roosevelt to implement more progressive programs as part of his Second New Deal.

There are also lessons to be learned from the period about movement building itself, specifically the CP’s organizing and strategic choices in targeting the unemployed and building a national movement of the unemployed anchored by a network of UCs.  The UCs helped transform how people understood the cause of their hard times.  They also created a local, collective, and direct outlet for action in defense of immediate shared basic needs.  The CP also emphasized the importance of organizing those actions in ways designed to overcome important divisions among working people.  Finally, the party and the UCs created broader campaigns for public policies on the national level that were directly responsive to local concerns and actions.  Thus, organizing helped create a momentum that built political awareness, leadership capacity, class unity, and national weight around demands for new public initiatives.

The call for a Green New Deal speaks to a variety of crises and the need for change in many different sectors, including food production, energy generation, transportation, manufacturing, social and physical infrastructure, housing, health care, and employment creation.  It also projects a vision of a new more sustainable, egalitarian, and democratic society.  While it would be a mistake to equate the organizing work in the early years of the depression, which focused on employment and relief, with what is required today given the multifaceted nature of our crisis, we would do well to keep the organizing experience highlighted above in mind as we seek to advance the movement building process needed to win a Green New Deal.  It offers important insights into some of the organizational and political challenges we can expect to face and helpful criteria for deciding how best to respond to them.

For example, it challenges us to think carefully about how to ensure that our organizing work both illuminates the roots of our current multifaceted crises, building anti-capitalist consciousness, and challenges existing racial, ethnic, and gender divisions, strengthening working class unity.  It also challenges us to think about how to ensure that our efforts in different geographic areas and around different issues will connect to build a national presence and organizational form that strengthens and unites our various efforts and also projects our overall vision of a restructured society.  And it challenges us to think about how we should engage the state itself, envisioning and preparing for the ways it can be expected to seek to undermine whatever reforms are won.

What the New Deal can teach us about winning a Green New Deal: Part I–Confronting Crisis

The New Deal has recently become a touchstone for many progressive efforts, illustrated by Bernie Sanders’ recent embrace of its aims and accomplishments and the popularity of calls for a Green New Deal.  The reasons are not hard to understand.  Once again, growing numbers of people have come to the conclusion that our problems are too big to be solved by individual or local efforts alone, that they are structural, and that innovative and transformative state-led actions will be needed to solve them.

The New Deal was indeed a big deal and, given contemporary conditions, it is not surprising that people are looking back to that period for inspiration and hope that meaningful change is possible.  However, inspiration, while important, is not the same as drawing useful organizing and strategic lessons from a study of the dynamics of that period.

This is the first of a series of posts in which I will try to illuminate some of those lessons.  In this first post I start with the importance of crisis as a motivator of change.  What the experience of the Great Depression shows is that years of major economic decline and social devastation are not themselves sufficient to motivate business and government elites to pursue policies likely to threaten the status quo.  It was only after three and a half years of organizing had also created a political crisis that the government began taking halting steps toward serious change, marked by the policies associated with the First New Deal.  In terms of contemporary lessons, this history should serve to dispel any illusions that simply establishing the seriousness of our current multifaceted crisis will be enough to win elite consideration of a transformative Green New Deal.

The Great Depression

The US economy expanded rapidly throughout the 1920s, a period dubbed the Roaring Twenties. It was a time of rapid technological change, business consolidation, and wealth concentration.  It was also a decade when many traditional industries struggled, such as agriculture, textiles, coal, and shipbuilding, as did most of those who worked in them.  Growth was increasingly sustained by consumer demand underpinned by stock market speculation and debt.

The economy suffered a major downturn in 1920-21, and then mild recessions in 1924 and 1927.  And there were growing signs of the start of another recession in summer 1929, months before the October 1929 stock market collapse, which triggered the beginning of the Great Depression.  The collapse quickly led to the unraveling of the US economy.

The Dow Jones average dropped from 381 in September 1929 to 41 at the start of 1932.  Manufacturing output fell by roughly 40 percent between 1929 and 1933.  The number of full-time workers at United States Steel went from 25,000 in 1929 to zero in 1933.  Five thousand banks failed over the same period.  Steve Fraser captured the extent and depth of the decline as follows: “In early 1933, thirty-six of forty key economic indicators had arrived at the lowest point they were to reach during the whole eleven grim years of the Great Depression.”

The resulting crisis hit working people hard.  Between 1930 and 1932, the number of unemployed grew from 3 million to 15 million, or approximately 25 percent of the workforce.  The unemployment rate for those outside the agricultural sector was close to 37 percent.  As Danny Lucia describes:

Workers who managed to hold onto their jobs faced increased exploitation and reduction in wages and hours, which made it harder for them to help out jobless family and friends. The social fabric of America was ripped by the crisis: One-quarter of children suffered malnutrition, birth rates dropped, suicide rates rose. Many families were torn apart. In New York City alone, 20,000 children were placed in institutions because their parents couldn’t support them. Homeless armies wandered the country on freight trains; one railroad official testified that the number of train-hoppers caught by his company ballooned from 14,000 in 1929 to 186,000 in 1931.

“Not altogether a bad thing”

Strikingly, despite the severity of the economic and social crisis, business leaders and the federal government were in no hurry to act.  There was certainly no support for any meaningful federal relief effort.  In fact, business leaders initially tended to downplay the seriousness of the crisis and were generally optimistic about a quick recovery.

As the authors of Who Built America (volume 2) noted:

when the business leaders who made up the National Economic League were asked in January 1930 what the country’s ‘paramount economic problems’ were, they listed first, ‘administration of justice,’ second, ‘Prohibition,’ and third, ‘lawlessness.’ Unemployment was eighteenth on their list!

Some members of the Hoover administration tended to agree. Treasury Secretary Andrew Mellon thought the crisis was “not altogether a bad thing.”  “People,” he argued, “will work harder, live a more moral life.  Values will be adjusted, and enterprising people will pick up the wrecks from less competent people.”

President Hoover repeatedly stated that the economy was “on a sound and prosperous basis.”  The solution to the crisis, he believed, was to be found in restoring business confidence and that was best achieved through maintaining a balanced budget.  When it came to relief for those unemployed or in need, Hoover believed that the federal government’s main role was to encourage local government and private efforts, not initiate programs of its own.

At the time of the stock market crash, relief for the poor was provided primarily by private charities, which relied on donations from charitable and religious organizations.  Only eight states had any type of unemployment insurance.  Not surprisingly, this system was inadequate to meet popular needs.  As the authors of Who Built America explained:

by 1931 most local governments and many private agencies were running out of money for relief.  Sometimes needy people were simply removed from the relief rolls.  According to one survey, in 1932 only about one-quarter of the jobless were receiving aid.  Many cities discriminated against nonwhites.  In Dallas and Houston, African-Americans and Mexican-Americans were denied any assistance.

It was not until January 1932 that Congress made its first move to strengthen the economy, establishing the Reconstruction Finance Corporation (RFC) to provide support to financial institutions, corporations, and railroads.  Six months later, in July, it approved the Emergency Relief and Construction Act, which broadened the scope of the RFC, allowing it to provide loans to state and local governments for both public works and relief.  However, the Act was structured in ways that undermined its effectiveness. For example, the $322 million allocated for public works could only be used for projects that would generate revenue sufficient to pay back the loans, such as toll bridges and public housing.  The $300 million allocated for relief also had to be repaid.  Already worried about debt, many local governments refused to apply for the funds.

Finally, as 1932 came to a close, some business leaders began considering the desirability of a significant federal recovery program, but only for business.  Most of their suggestions were modeled on World War I programs and involved government-business partnerships designed to regulate and stabilize markets.  There was still no interest in any program involving sustained and direct federal relief to the millions needing jobs, food, and housing.

By the time of Roosevelt’s inauguration in March 1933, the economy, as noted above, had fallen to its lowest point of the entire depression.  Roosevelt had won the presidency promising “a new deal for the American people,” yet his first initiatives were very much in line with the policies of the previous administration.  Two days after his inauguration he declared a national bank holiday, which shut down the entire banking system for four days and ended a month-long run on the banks.  The “holiday” gave Congress time to approve a new law empowering the Federal Reserve Board to supply unlimited currency to reopened banks, reassuring the public about the safety of their accounts.

Six days after his inauguration, Roosevelt, who had campaigned for the Presidency, in part, on a pledge to balance the federal budget, submitted legislation to Congress which would have cut $500 million from the $3.6 billion federal budget.  He proposed eliminating government agencies, reducing the pay of civilian and military federal workers (including members of Congress), and slashing veterans’ benefits by 50 percent.  Facing Congressional opposition, the final bill cut spending by “only” $243 million.

Lessons

It is striking that some three and a half years after the start of the Great Depression, despite the steep decline in economic activity and the incredible pain and suffering felt by working people, business and government leaders were still not ready to support any serious federal program of economic restructuring or direct relief.  That history certainly suggests that even a deep economic and social crisis cannot be counted on to encourage elites to explore policies that might upset existing structures of production or relations of power, an important insight for those hoping that recognition of the seriousness of our current environmental crisis might encourage business or government receptivity to new transformative policies.

Of course, we do know that in May 1933 Roosevelt finally began introducing relief and job creation programs as part of his First New Deal.  And while many factors might have contributed to such a dramatic change in government policy, one of the most important was the growing movement of the unemployed and their increasingly militant and collective action in defense of their interests.  Their activism was a clear refutation of business and elite claims that prosperity was just around the corner.  It also revealed a growing radical spark, as more and more people openly challenged the legitimacy of the police, courts, and other state institutions.  As a result, what was an economic and social crisis also became a political crisis.  As Adolf Berle, an important member of Roosevelt’s “Brain Trust,” wrote, “we may have anything on our hands from a recovery to a revolution.”

In Part II, I will discuss the rise and strategic orientation of the unemployed movement, highlighting the ways it was able to transform the political environment and thus encourage government experimentation.  And I will attempt to draw out some of the lessons from this experience for our own contemporary movement building efforts.

The 1933 programs, although important for breaking new ground, were exceedingly modest.  And, as I will discuss in a future post, it was only the rejuvenated labor movement that pushed Roosevelt to implement significantly more labor friendly policies in the Second New Deal starting in 1935.  Another post will focus more directly on the development and range of New Deal policies in order to shed light on the forces driving state policy as well as the structural dynamics which tend to limit its progressive possibilities, topics of direct relevance to contemporary efforts to envision and advance a Green New Deal agenda.

Forgotten Workers And The US Expansion

There is a lot of celebrating going on in mainstream policy circles.  The economy is said to be running at full steam with the unemployment rate now below 4 percent.  As Clive Crook puts it in Bloomberg Businessweek, “The U.S. expansion has put millions of people back to work and economists agree that the economy is now at or close to full employment.”

Forgotten in all this celebration is the fact that wages remain stagnant.  Also forgotten are the millions of workers who are no longer counted as part of the labor force and thus not counted as unemployed.

Forgotten workers

One of the best indicators of the weakness of the current recovery is the labor market status of what is called the core workforce, those ages 25-54.  Their core status stems from the fact that, as Jill Mislinski explains, “This cohort leaves out the employment volatility of the high-school and college years, the lower employment of the retirement years and also the age 55-64 decade when many in the workforce begin transitioning to retirement … for example, two-income households that downsize into one-income households.”

The unemployment rate of those 25-54 reached a peak of 9 percent in 2009 before falling steadily to a low of 3.2 percent as of July 2018.  However, the unemployment rate alone can be a very misleading indicator of labor market conditions.  That is certainly true when it comes to the labor market status of today’s core workforce.

A more revealing measure is the Labor Force Participation Rate, which is defined as the Civilian Labor Force (i.e. the sum of those employed and unemployed) divided by the Civilian Noninstitutional Population (i.e. those of working age who are not in the military or institutionalized). Because there can be significant monthly swings in both the numerator and denominator of this measure, the Labor Force Participation Rate shown in the chart below is calculated using a 12-month moving average.
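The arithmetic behind this measure can be sketched in a few lines; the counts below are illustrative round numbers, not actual BLS data:

```python
def participation_rate(employed, unemployed, population):
    """Labor Force Participation Rate (percent):
    (employed + unemployed) / civilian noninstitutional population."""
    return 100.0 * (employed + unemployed) / population

def trailing_average(series, window=12):
    """Trailing moving average used to smooth monthly swings."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

# Illustrative monthly counts for the 25-54 cohort (thousands), not BLS figures
print(round(participation_rate(100_000, 3_300, 126_000), 1))  # → 82.0
```

The chart's plotted series would be `trailing_average` applied to twelve or more such monthly rates.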

As we can see, the Labor Force Participation Rate for the 25-54 core cohort has declined sharply, from a mid-2000 high of 84.2 percent to 81.9 percent as of July 2018.  Mislinski calculates that:

Based on the moving average, today’s age 25-54 cohort would require 1.6 million additional people in the labor force to match its interim peak participation rate in 2008 and 2.9 million to match the peak rate around the turn of the century.
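Head counts like these follow from simple arithmetic: the number of missing participants equals the rate gap times the cohort's civilian noninstitutional population. A sketch, where the cohort population and peak rate are assumed round numbers for illustration, not Mislinski's exact inputs:

```python
def labor_force_shortfall(current_rate, peak_rate, population):
    """People who would need to (re)join the labor force for the current
    participation rate (percent) to match the peak rate (percent)."""
    return (peak_rate - current_rate) / 100.0 * population

# Assumed for illustration: ~126 million prime-age people, 81.9% now vs. an 83.2% peak
shortfall = labor_force_shortfall(81.9, 83.2, 126_000_000)
print(f"{shortfall / 1e6:.1f} million")  # on the order of 1.6 million
```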

A related measure of labor market conditions is the Employment-to-Population Ratio, which is defined as the Civilian Employed divided by the Civilian Noninstitutional Population.  As we can see in the next chart, the Employment-to-Population Ratio of our core cohort has also declined from its mid-2000 peak.

Again, according to Mislinski,

First the good news: This metric began to rebound from its post-recession trough in late 2012. However, the more disturbing news is that the current age 25-54 cohort would require an increase of 1.2 million employed prime-age participants to match its ratio peak in 2007. To match its mid-2000 peak would require a 3.1 million participant increase.

The takeaway

Both the Labor Force Participation Rate and the Employment-to-Population Ratio are useful measures of the employment intensity of the economy.  And in a healthy economy we should expect to see high values for both measures for the 25-54 age cohort. That is especially true for a country like the United States, where the non-market public provision of education, health care, and housing is quite limited, and an adequate retirement depends upon private savings.  In other words, people need paid employment to live and these are prime work years.

The decline, over the business cycle, in both the Labor Force Participation Rate and the Employment-to-Population Ratio for our core cohort strongly suggests that our economy is undergoing a profound structural change, with business increasingly organizing its activities in ways that require fewer workers. More specifically, the lower values in these measures mean that millions of prime age workers are being sidelined, left outside the labor market.

It is hard to know what will become of these workers and by extension their families and communities.  Moreover, this is not a problem only of the moment.  This cohort is still relatively young, and the social costs of being sidelined from employment—and here we are not even considering the quality of that employment—will only grow with age.  We can only hope that workers of all ages will eventually recognize that our growing employment problems are the result not of individual failings but of an increasingly problematic economic system, and begin pushing for its structural transformation.

Magical Bootstraps And The Struggles Of Working Americans

A recession is coming, sooner or later.  Once it hits, we can expect articles bemoaning the fact that working people didn’t build up their savings during this record expansion to help them through the hard times.  If only they had pinched pennies here and there, skipped a new TV or smartphone, they could have generated some capital that could have been invested . . . Ah, the missed opportunities.

Of course, the reality is quite different.  One reason is that the current so-called good times have not been very good for working people.  For example, as Jonathan Spicer points out, “the rise in median expenditures has outpaced before-tax income for the lower 40 percent of earners in the five years to mid-2017 while the upper half has increased its financial cushion, deepening income disparities.” In other words, a significant percentage of workers have had to run down their savings or borrow to survive; wealth accumulation has been out of the question.

The bootstrap theory of success

The notion that under capitalism each individual has the ability, without outside help, to “pull themselves up by their bootstraps” has a powerful hold on popular consciousness.  And its message of self-reliance and individual responsibility serves capitalist interests well by deflecting attention away from the systemic causes of current economic problems.

The irony is that the phrase itself originally referred to something that was physically impossible to achieve.  As Caroline Bologna explains:

The concept is simple: To pull yourself up by your bootstraps means to succeed or elevate yourself without any outside help.

But when you examine this expression and its current meaning, it doesn’t seem to make much sense.

To pull yourself up by your bootstraps is actually physically impossible. In fact, the original meaning of the phrase was more along the lines of “to try to do something completely absurd.”

Etymologist Barry Popik and linguist and lexicographer Ben Zimmer have cited an American newspaper snippet from Sept. 30, 1834 as the earliest published reference to lifting oneself up by one’s bootstraps. A month earlier, a man named Nimrod Murphree announced in the Nashville Banner that he had “discovered perpetual motion.” The Mobile Advertiser picked up this tidbit and published it with a snarky response ridiculing his claim: “Probably Mr. Murphree has succeeded in handing himself over the Cumberland river, or a barn yard fence, by the straps of his boots.”

“Bootstraps were a typical feature of boots that you could pull on in the act of putting your boots on, but of course bootstraps wouldn’t actually help you pull yourself over anything,” Zimmer told HuffPost. “If you pulled on them, it would be physically impossible to get yourself over a fence. The original imagery was something very ludicrous, as opposed to what we mean by it today of being a self-made man.” . . .

Beyond the Murphree example, versions of the phrase appeared in many published texts to describe something ridiculous. Popik has documented several of these examples on his blog.

Leaving aside questions about why the phrase “pulling oneself up by one’s bootstraps” is no longer used as a way to dismiss an impossibility or absurdity, its original meaning captures capitalist realities far better than its current meaning does.  Quite simply, there are no magical bootstraps that enable working people to “pull themselves up” to economic security and well-being by dint of their own hard work.  The problem is that far too many Americans still believe in their existence and thus blame themselves for their economic situation.

The struggles of working Americans

In a Reuters article, Jonathan Spicer illustrates the fact that “behind the headlines of roaring job growth and consumer spending . . . the boom continues in large part by the poorer half of Americans fleecing their savings and piling up debt.”

The figure below shows the median income for each of five groups of Americans based on their before-tax income.

The next figure shows, for 2017, the difference between expenses and pre-tax income for each of the five groups.  As one can see, expenses (red circle) outstrip income (blue circle) for the bottom two groups, or 40 percent of the population.  Those in the third group are barely keeping their heads above water.

The last figure below shows that 2017 was no aberration.  Despite the longest expansion in post-war US history, most Americans are struggling to meet expenses.  As Spicer comments, “lower-earners have been sinking deeper into red over the last five years.”

It is no wonder that the Federal Reserve, in its Report on the Economic Well-Being of US Households in 2017, found that forty percent of American adults don’t have enough savings to cover a $400 emergency expense such as an unexpected medical bill, car problem or home repair.

One important reason for these depressing trends is that there has been little growth in wages.   And as Jared Bernstein explains in the New York Times, that outcome is largely due to the exercise of class power:

The United States labor market is closing in on full employment in an economic expansion that just began its 10th year, and yet the real hourly wage for the working class has been essentially flat for two years running. Why is that?

Economists ask this question every month when the government reports labor statistics. We repeatedly get solid job growth and lower unemployment, but not much to show for wages. Part of that has to do with inflation, productivity and remaining slack in the labor market.

But stagnant wages for factory workers and non-managers in the service sector — together they represent 82 percent of the labor force — is mainly the outcome of a long power struggle that workers are losing. Even at a time of low unemployment, their bargaining power is feeble, the weakest I’ve seen in decades. Hostile institutions — the Trump administration, the courts, the corporate sector — are limiting their avenues for demanding higher pay.

It matters how Americans understand their situation and the broader dynamics that shape it.  Challenging the ideology that misleads popular understandings, and that includes fanciful notions of what pulling on bootstraps can accomplish, is an important part of the movement building process needed to achieve any meaningful social change.

US Militarism Marches On

Republicans and Democrats like to claim that they are on opposite sides of important issues.  Of course, depending on which way the wind blows, they sometimes change sides, like over support for free trade and federal deficits.  Tragically, however, there is no division when it comes to militarism.

For example, the federal budget for fiscal year 2018 (which ends on September 30, 2018), included more money for the military than even President Trump requested.  Trump had asked for a military budget of $603 billion, a sizeable $25 billion increase over fiscal year 2017 levels; Congress approved $629 billion.  Trump had also asked for $65 billion to finance current war fighting, a bump of $5 billion; Congress approved $71 billion.  The National Defense Authorization Act of 2018, which set the target budget for the Department of Defense at this high level, was approved by the Senate in a September 2017 vote of 89-9.

In the words of the New York Times: “In a rare act of bipartisanship on Capitol Hill, the Senate passed a $700 billion defense policy bill . . . that sets forth a muscular vision of America as a global power, with a Pentagon budget that far exceeds what President Trump has asked for.”
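As a quick consistency check, using only the figures cited above and assuming the bill's headline total is simply the base budget plus war funding, the two approved amounts do add up to the number in the Times quote:

```python
# FY2018 amounts approved by Congress, in billions of dollars (figures cited above).
base_budget = 629   # Department of Defense target budget
war_funding = 71    # financing for current war fighting (overseas operations)

total = base_budget + war_funding
print(f"${total} billion")  # $700 billion, matching the "defense policy bill" headline
```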

That Act also called for a further increase in military spending of $16 billion for fiscal year 2019 (which begins October 1, 2018).  And, in June 2018, the Senate voted 85 to 10 to authorize that increase, boosting the Defense Department’s fiscal year 2019 total to $716 billion.

This bipartisan embrace of militarism comes at enormous cost for working people.  This cost includes cuts in funding for public housing, health care and education; the rebuilding of our infrastructure; basic research and development; and efforts to mitigate climate change.  It also includes the militarization of our police, since the military happily transfers its excess or outdated equipment to willing local police departments.

And it also includes a belligerent foreign policy.  A case in point: Congress has made clear its opposition to the Trump administration’s decision to meet with North Korean leader Kim Jong-un and halt war games directed against North Korea, apparently preferring the possibility of a new Korean War.  Congress is also trying to pass a law that would restrict the ability of the President to reduce the number of US troops stationed in South Korea.

In brief, the US military industrial complex, including the bipartisan consensus which helps to promote militarism’s popular legitimacy, is one of the most important and powerful foes we must overcome if we are to seriously tackle our ever-growing social, economic, and ecological problems.

The military is everywhere

The US has approximately 800 formal military bases in 80 countries, with 135,000 soldiers stationed around the globe.  Putting this in perspective, Alice Slater reports that:

only 11 other countries have bases in foreign countries, some 70 altogether. Russia has an estimated 26 to 40 in nine countries, mostly former Soviet Republics, as well as in Syria and Vietnam; the UK, France, and Turkey have four to 10 bases each; and an estimated one to three foreign bases are occupied by India, China, Japan, South Korea, Germany, Italy, and the Netherlands.

US special forces are deployed in even more countries.  According to Nick Turse, as of 2015, these forces were operating in 135 countries, an 80 percent increase over the previous five years.  “That’s roughly 70 percent of the countries on the planet. Every day, in fact, America’s most elite troops are carrying out missions in 80 to 90 nations practicing night raids or sometimes conducting them for real, engaging in sniper training or sometimes actually gunning down enemies from afar.”

This widespread geographic deployment not only represents an aggressive projection of US elite interests, it also provides a convenient rationale for those who want to keep the money flowing.  The military, and those who support its funding, always complain that more funds are needed to carry out its mission.  Of course, the additional funds enable the military to expand the reach of its operations, thereby justifying another demand for yet more money.

The US military is well funded 

It is no simple matter to estimate how much we spend on military related activities.  The base military budget is the starting point.  It represents the amount of the discretionary federal budget that is allocated to the Department of Defense.  Then there is the overseas contingency operations fund, a separate pool of money sitting outside any budgetary restrictions that the military receives yearly from Congress to cover the costs of its ongoing warfare.

It is the combination of the two that most analysts cite when talking about the size of the military budget. Using this combined measure, the Stockholm International Peace Research Institute finds that the United States spends more on its military than the next seven largest military spenders combined, which are China, Russia, Saudi Arabia, India, France, the UK, and Japan.

As the following chart shows, US military spending (base budget plus overseas contingency operations fund), adjusted for inflation, has been on the rise for some time, and is now higher than at any time other than during the height of the Iraq war.  Jeff Stein, writing in the Washington Post, reports that the military’s base budget will likely be “the biggest in recent American history since at least the 1970s, adjusting for inflation.”

As big as it is, the above measure of military spending grossly understates the total.  As JP Sottile explains:

The Project on Government Oversight (POGO) tabulated all “defense-related spending” for both 2017 and 2018, and it hit nearly $1.1 trillion for each of the two years. The “defense-related” part is important because the annual National Defense Authorization Act, a.k.a. the defense budget, doesn’t fully account for all the various forms of national security spending that gets peppered around a half-dozen agencies.

William Hartung, an expert on military spending, went agency by agency to expose all the various military-related expenses that are hidden in different parts of the budget.  As he points out:

You might think that the most powerful weapons in the U.S. arsenal — nuclear warheads — would be paid for out of the Pentagon budget.   And you would, of course, be wrong.  The cost of researching, developing, maintaining, and “modernizing” the American arsenal of 6,800 nuclear warheads falls to an obscure agency located inside the Department of Energy, the National Nuclear Security Administration, or NNSA. It also works on naval nuclear reactors, pays for the environmental cleanup of nuclear weapons facilities, and funds the nation’s three nuclear weapons laboratories, at a total annual cost of more than $20 billion per year.

Hartung’s grand total, which includes, among other things, the costs of Homeland Security, foreign military aid, intelligence services, the Veterans Administration, and the interest on the debt generated by past spending on the military, is $1.09 trillion, roughly the same as the POGO total cited above.  In short, our political leaders are far from forthcoming about the true size of our military spending.

Adding insult to injury, the military cannot account for how it spends a significant share of the funds it is given.  A Reuters’ article by Scott Paltrow tells the story:

The United States Army’s finances are so jumbled it had to make trillions of dollars of improper accounting adjustments to create an illusion that its books are balanced.

The Defense Department’s Inspector General, in a June [2016] report, said the Army made $2.8 trillion in wrongful adjustments to accounting entries in one quarter alone in 2015, and $6.5 trillion for the year. Yet the Army lacked receipts and invoices to support those numbers or simply made them up.

As a result, the Army’s financial statements for 2015 were “materially misstated,” the report concluded. The “forced” adjustments rendered the statements useless because “DoD and Army managers could not rely on the data in their accounting systems when making management and resource decisions.” . . .

The report affirms a 2013 Reuters series revealing how the Defense Department falsified accounting on a large scale as it scrambled to close its books. As a result, there has been no way to know how the Defense Department – far and away the biggest chunk of Congress’ annual budget – spends the public’s money.

The new report focused on the Army’s General Fund, the bigger of its two main accounts, with assets of $282.6 billion in 2015. The Army lost or didn’t keep required data, and much of the data it had was inaccurate, the IG said.

“Where is the money going? Nobody knows,” said Franklin Spinney, a retired military analyst for the Pentagon and critic of Defense Department planning. . . .

For years, the Inspector General – the Defense Department’s official auditor – has inserted a disclaimer on all military annual reports. The accounting is so unreliable that “the basic financial statements may have undetected misstatements that are both material and pervasive.”

Military spending is big for business

Almost half of the US military budget goes to private military contractors.  These military contracts are the lifeblood of many of the largest corporations in America.  Lockheed Martin and Boeing rank first and second on the list of companies that get the most money from the government.  In 2017 Lockheed Martin reported $51 billion in sales, with $35.2 billion coming from the government.  Boeing got $26.5 billion.  The next three in line are Raytheon, General Dynamics, and Northrop Grumman.  These top five firms captured some $100 billion in Pentagon contracts in 2016.

And, as Hartung describes,

The Pentagon buys more than just weapons. Health care companies like Humana ($3.6 billion), United Health Group ($2.9 billion), and Health Net ($2.6 billion) cash in as well, and they’re joined by, among others, pharmaceutical companies like McKesson ($2.7 billion) and universities deeply involved in military-industrial complex research like MIT ($1 billion) and Johns Hopkins ($902 million).

Not surprisingly, given how lucrative these contracts are, private contractors work hard to ensure the generosity of Congress. In 2017, for example, 208 defense companies spent almost $100 million to deploy 728 reported lobbyists.  Lobbying is made far easier by the fact that more than 80 percent of top Pentagon officials have worked for the defense industry at some point in their careers, and many will go back to work in the defense industry.

Then there are arms sales to foreign governments. Lawrence Wittner cites a study by the Stockholm International Peace Research Institute that found that sales of weapons and military services by the world’s largest 100 corporate military suppliers totaled $375 billion in 2016. “U.S. corporations increased their share of that total to almost 58 percent, supplying weapons to at least 100 nations around the world.”

Eager to promote the arms industry, government officials work hard on their behalf.  As Hartung explains: “From the president on his trips abroad to visit allied world leaders to the secretaries of state and defense to the staffs of U.S. embassies, American officials regularly act as salespeople for the arms firms.”

More for the military and less for everything else

The federal budget is divided into three categories: mandatory spending (primarily Social Security and Medicare), discretionary spending, and interest on the debt.  Two trends in discretionary spending, the component of the budget set each year at the discretion of Congress, offer a window on how militarism is squeezing out funding for programs that serve majority needs.

The first noteworthy trend is the growing Congressional support for defense (the base military budget) over non-defense programs.  In 2001, the majority of discretionary funds went to non-defense programs.  However, that soon changed, as we see in the chart below, thanks to the “war on terror.”  In the decade following September 11, 2001, military spending increased by 50 percent, while spending on every other government program increased by only 13.5 percent.

In the 2018 federal budget, 54 percent of discretionary funds are allocated to the military (narrowly defined): $700 billion to the military and $591 billion to non-military programs.  The chart below shows President Trump’s discretionary budgetary request for fiscal year 2019.  As we can see, the share of funds for the military would rise to 61 percent of the total.
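The 54 percent figure follows directly from the two dollar amounts; a one-line arithmetic sketch, assuming the discretionary total is simply the sum of the two figures cited:

```python
# FY2018 discretionary allocations, in billions of dollars (figures cited above).
military = 700
non_military = 591

military_share = military / (military + non_military)
print(f"{military_share:.0%}")  # rounds to 54%
```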

According to the National Priorities Project, “President Trump’s proposals for future spending, if accepted by Congress, would ensure that, by 2023, the proportion of military spending [in the discretionary budget] would soar to 65 percent.”  Of course, militarism’s actual share is much greater, since the military is being defined quite narrowly.  For example, Veterans’ Benefits is included in the non-defense category.

The second revealing trend is the decline in non-defense discretionary spending relative to GDP.  Thus, not only is the military base budget growing more rapidly than the budget for nondefense programs, spending on discretionary non-defense programs is not even keeping up with the growth in the economy.  This trend translates into a declining public capacity to support research and development and infrastructure modernization, as well as meet growing needs for housing, education, health and safety, disaster response . . . the list is long.

The 2018 bipartisan budget deal increased discretionary spending for both defense and non-defense programs, but the deal did little to reverse this long run decline in non-defense discretionary spending relative to the size of the economy.  A Progressive Policy Institute blog post by Ben Ritz explains:

The Budget Control Act of 2011 (BCA) capped both categories of discretionary spending as part of a broader effort to reduce future deficits. When Congress failed to reach a bipartisan agreement on taxes and other categories of federal spending, the BCA automatically triggered an even deeper, across-the-board cut to discretionary spending known as sequestration. While the sequester has been lifted several times since it first took effect, discretionary spending consistently remained far below the original BCA caps.

That trend ended with the Bipartisan Budget Act of 2018 (BBA). This budget deal not only lifted discretionary spending above sequester levels – it also went above and beyond the original BCA caps for two years. Nevertheless, projected domestic discretionary spending for Fiscal Year 2019 is significantly below the historical average as a percentage of gross domestic product. Moreover, even if policymakers extended these policy changes beyond the two years covered by the BBA, we project that domestic discretionary spending could fall to just 3 percent of GDP within the next decade – the lowest level in modern history [see dashed black line in chart below].

The story is similar for defense spending. Thanks to the pressure put on by the sequester, defense discretionary spending fell to just under 3.1 percent of GDP in FY2017. Under the BBA, defense spending would increase to 3.4 percent of GDP in FY2019 before falling again [see dashed black line in following chart]. Unlike domestic discretionary spending, however, defense would remain above the all-time low it reached before the 2001 terrorist attacks throughout the next decade.

In sum, Congress appears determined to squeeze non-defense programs, increasingly privileging defense over non-defense spending in the discretionary budget and allowing non-defense spending as a share of GDP to fall to record lows.  The ratio of discretionary defense spending relative to GDP appears to be stabilizing, although at levels below its long-term average.  However, discretionary defense spending refers only to the base budget of the Department of Defense and as such is a seriously understated measure of the costs of US militarism.  Including the growing costs of Homeland Security, foreign military aid, intelligence services, the Veterans Administration, the interest on the debt generated by past spending on the military, and the overseas contingency operations fund, would result in a far different picture, one that would leave no doubt about the government’s bipartisan commitment to militarism.

The challenge ahead

Fighting militarism is not easy.  Powerful political and business forces have made great strides in converting the United States into a society that celebrates violence, guns, and the military.  The chart below highlights one measure of this success.  Sadly, 39 percent of Americans polled support strengthening our national defense, while 46 percent think it is just about right.  Only 13 percent think it is stronger than it needs to be.

Polls, of course, just reveal individual responses at a moment in time to questions that, in isolation, often provide respondents with no meaningful context or alternatives, and thus tell us little about people’s true thoughts.  At the same time, results like this show just how important it is for us to work to create space for community conversations that are informed by accurate information on the extent and aims of US militarism and its enormous political, social, economic, and ecological costs for the great majority of working people.

Living On The Edge: Americans In A Time Of “Prosperity”

These are supposed to be the good times—with our current economic expansion poised to set a record as the longest in US history. Yet, according to the Federal Reserve’s Report on the Economic Well-Being of US Households in 2017, forty percent of American adults don’t have enough savings to cover a $400 emergency expense such as an unexpected medical bill, car problem or home repair.

The problem with our economy isn’t that it sometimes hits a rough patch.  It’s that people struggle even when it is setting records.

The expansion is running out of steam

Our current economic expansion has already lasted 107 months.  Only one expansion has lasted longer: the one from March 1991 to March 2001, which ran 120 months.

A CNBC Market Insider report by Patti Domm quotes Goldman Sachs economists as saying: “The likelihood that the expansion will break the prior record is consistent with our long-standing view that the combination of a deep recession and an initially slow recovery has set us up for an unusually long cycle.”

The Goldman Sachs model, according to Domm:

shows an increased 31 percent chance for a U.S. recession in the next nine quarters. That number is rising. But it’s a good news, bad news story, and the good news is there is now a two-thirds chance that the recovery will be the longest on record. . . . The Goldman economists also say the medium-term risk of a recession is rising, “mainly because the economy is at full employment and still growing above trend.”

The chart below highlights the growing recession risk based on a Goldman Sachs model that looks at “lagged GDP growth, the slope of the yield curve, equity price changes, house price changes, the output gap, the private debt/GDP ratio, and economic policy uncertainty.”

Sooner or later, the so-called good times are coming to an end.  Tragically, a large percentage of Americans are still struggling at a time when our “economy is at full employment and still growing above trend.”  That raises the question: what’s going to happen to them, and millions of others, when the economy actually turns down?

Living on the edge

The Federal Reserve’s report was based on interviews with a sample of over 12,000 people that was “designed to be representative of adults ages 18 and older living in the United States.”  One part of the survey dealt with unexpected expenses.  Here is what the report found:

Approximately four in 10 adults, if faced with an unexpected expense of $400, would either not be able to cover it or would cover it by selling something or borrowing money. The following figure shows that the share of Americans facing financial insecurity has been falling, but it is still alarming that the percentage remains so high this late in a record setting expansion.

Strikingly, the Federal Reserve survey also found, as shown in the table below, that “(e)ven without an unexpected expense, 22 percent of adults expected to forgo payment on some of their bills in the month of the survey. Most frequently, this involves not paying, or making a partial payment on, a credit card bill.”

And, as illustrated in the figure below, twenty-seven percent of adult Americans skipped necessary medical care in 2017 because they were unable to afford its cost.  The table that follows shows that “dental care was the most frequently skipped treatment, followed by visiting a doctor and taking prescription medicines.”

Clearly, we need more and better jobs and a stronger social safety net.  Achieving those will require movement building.  Needed first steps include helping those who are struggling to see that their situation is not unique or the consequence of some individual failing, but rather the result of the workings of a highly exploitative system that suffers from ever stronger stagnation tendencies.  And this requires creating opportunities for people to share experiences and develop their will and capacity to fight for change.  In this regard, there may be much to learn from the operation of the Councils of the Unemployed during the 1930s.

It also requires creating opportunities for struggle.  Toward that end we need to help activists build connections between ongoing labor and community struggles, such as the ones that education and health care workers are making as they fight for improved conditions of employment and progressive tax measures to fund a needed expansion of public services.  This is the time, before the next downturn, to lay the groundwork for a powerful movement for social transformation.

______________

This post was updated May 31, 2018.  The original post misstated the length of the current expansion.

Class, Race, and US Wealth Inequality

People tend to have a distorted picture of US capitalism’s operation, believing that the great majority of Americans are doing well, benefiting from the system’s long-term growth and profit generation.  Unfortunately, this is not true.  Median wealth has been declining, leaving growing numbers of working people increasingly vulnerable to the ups and downs of economic activity and poorly positioned to enjoy a secure retirement.  Moreover, this general trend masks a profound racial wealth divide, with people of color disproportionately suffering from a loss of wealth and insecurity.

A distorted picture of wealth inequality

In a 2011 article, based on 2005 national survey data, Michael I. Norton and Dan Ariely demonstrate how little Americans know about the extent of wealth inequality.  The figure below (labeled Fig. 2) shows the actual distribution of wealth in that year compared to what survey respondents thought it was, as well as their ideal wealth distribution.  As the authors explain:

respondents vastly underestimated the actual level of wealth inequality in the United States, believing that the wealthiest quintile held about 59% of the wealth when the actual number is closer to 84%. More interesting, respondents constructed ideal wealth distributions that were far more equitable than even their erroneously low estimates of the actual distribution, reporting a desire for the top quintile to own just 32% of the wealth. These desires for more equal distributions of wealth took the form of moving money from the top quintile to the bottom three quintiles, while leaving the second quintile unchanged, evincing a greater concern for the less fortunate than the more fortunate.

The next figure reveals that respondents tended to have remarkably similar perceptions of wealth distribution regardless of their income, political affiliation, or gender.  Moreover, all the groups embraced remarkably similar ideal distributions that were far more egalitarian than their estimated ones.

Capitalist wealth dynamics

Wealth inequality has only grown worse since 2005.  As I previously posted, in 2016, the top 10 percent of the population owned 77.1 percent of the nation’s wealth, while the bottom 10 percent owned -0.5 percent (they are net debtors).  Even these numbers understate the degree of wealth concentration: the top 1 percent actually owned 38.5 percent of the wealth, more than the bottom 90 percent combined. This was a sharp rise from the 29.9 percent share they held in 1989.
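A short arithmetic sketch, using only the 2016 shares quoted above, makes the top-1-percent comparison explicit: if the top 10 percent holds 77.1 percent of all wealth, the bottom 90 percent holds the remainder, which is indeed less than the top 1 percent’s share.

```python
# 2016 wealth shares cited above (percent of total US household wealth).
top_10_share = 77.1
top_1_share = 38.5

bottom_90_share = round(100 - top_10_share, 1)  # 22.9 percent
print(bottom_90_share)
print(top_1_share > bottom_90_share)  # True: top 1 percent owns more than the bottom 90 percent combined
```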

Perhaps more importantly, median household wealth is not only quite small–not nearly enough to provide financial stability and security–but is actually growing smaller over time.  In fact, median household wealth in 2016 was 8 percent below what it had been in 1998.


The racial wealth divide

Of course, not all families receive equal treatment or are given similar opportunities for advancement.  While US capitalism works to transfer wealth upwards to the very rich, it has disproportionately exploited families of color.  This is made clear by the results of a 2017 study titled The Road to Zero Wealth by Dedrick Asante-Muhammad, Chuck Collins, Josh Hoxie, and Emanuel Nieves.

As we saw above, median household wealth has been on the decline since 2007, despite the growth in overall economic activity and corporate profits.  The figure below shows median wealth trends for White, Black, and Latino households.

As of 2013, median White household wealth was less than it had been in 1989. However, the wealth decline has been far worse for Black and Latino families.  More specifically, as the authors write:

Since 1983, the respective wealth of Black and Latino families has plunged from $6,800 and $4,000 in 1983 to $1,700 and $2,000 in 2013. These figures exclude durable goods like automobiles and electronics, as these items depreciate quickly in value and do not hold the same liquidity, stability or appreciation of other financial assets like a savings account, a treasury bond or a home.

Education is supposed to be the great equalizer, with higher levels of education translating into more income, and then wealth.  But as we see in the figure below, the combination of class policies on top of a history of discrimination and exclusion has left families of color at a significant disadvantage.  For example, the median wealth of a family of color whose head of household has a four-year degree is far less than the median wealth of a White family whose head of household has only a high school diploma or GED.

The authors have created their own measure of “middle class wealth,” which they define:

using median White household wealth since it encompasses the full potential of the nation’s wealth-building policies, which have historically excluded households of color. More specifically, we use median White wealth in 1983 ($102,200 in 2013 dollars) as the basis for developing an index that would encompass “middle-class wealth” because it establishes a baseline prior to when increases in wealth were concentrated in a small number of households. Using this approach and applying Pew Research Center’s broad definition of the middle class, this study defines “middle class wealth” as ranging from $68,000 to $204,000.

As we can see in the figure above, only Black and Latino households with an advanced degree make it into that range. Moreover, trends suggest that, without major changes in policy, we can expect further declines in median wealth for households of color.  In fact,

By 2020, if current trends continue as they have been, Black and Latino households at the median are on track to see their wealth decline by 17% and 12% from where they respectively stood in 2013. By then, median White households would see their wealth rise by an additional three percent over today’s levels. In other words, at a time when it’s projected that children of color will make up most of the children in the country, median White households are on track to own 86 and 68 times more wealth, respectively, than Black and Latino households. . . .

Looking beyond 2043, the situation for households of color looks even worse. . . . If unattended, trends at the median suggest Black household wealth will hit zero by 2053. In that same period, median White household wealth is expected to climb to $137,000. The situation isn’t much brighter for Latino households, whose median wealth is expected to reach zero by 2073, just two decades after Black wealth is projected to hit zero. . . . Wealth is an intergenerational asset—its benefits passed down from one generation to the next—and the consequences of these losses will reverberate deeply in the lives of the children and grandchildren of today’s people of color.

Of course, knowledge of the fact that capitalism’s growth largely benefits capitalists, and that people of color pay some of the greatest costs to sustain its forward motion, does not automatically lead to class solidarity and popular opposition to existing accumulation dynamics.  Still, such knowledge does, at a minimum, help people understand that the forces pressing down on them are not the result of individual failure or lack of effort, but rather have systemic roots.  And that is an important step in the right direction.

The Bipartisan Militarization Of The US Federal Budget

The media likes to frame the limits of political struggle as between the Democratic and Republican parties, as if each side upholds a radically different political vision. However, in a number of key areas, leaders of both parties are happy to unite around an anti-worker agenda.  Support for the military and an aggressive foreign policy is one such area.

On September 18, 2017, the US Senate approved the National Defense Authorization Act (NDAA) of 2018.  Donald Trump had proposed increasing the military budget by $54 billion.  The Senate voted 89-9 to increase it by $37 billion more than Trump sought.  In the words of the New York Times:  “In a rare act of bipartisanship on Capitol Hill, the Senate passed a $700 billion defense policy bill on Monday that sets forth a muscular vision of America as a global power, with a Pentagon budget that far exceeds what President Trump has asked for.”

The NDAA calls for giving $640 billion to the Pentagon for its basic operations and another $60 billion for war operations in other countries, including Iraq, Syria, and Afghanistan.  The House passed its own version of the bill, which included a smaller increase over Trump’s request as well as new initiatives such as the creation of a Space Corps not supported by the Senate.  Thus, the House and Senate need to reconcile their differences before the bill goes to President Trump for his signature.

It is clear that Democratic Party opposition to Trump does not include opposition to US militarism and imperialism. As Ajamu Baraka points out:

Opposition to Trump has been framed in ways that supports the agenda of the Democratic Party—but not the anti-war agenda. Therefore, anti-Trumpism does not include a position against war and U.S. imperialism.

When the Trump administration proposed what many saw as an obscene request for an additional $54 billion in military spending, we witnessed a momentary negative response from some liberal Democrats. The thinking was that this could be highlighted as yet another one of the supposedly demonic moves by the administration and it was added to the talking points for the Democrats. That was until 117 Democrats voted with Republicans in the House—including a majority of the Congressional Black Caucus—to not only accept the administration’s proposal, but to exceed it by $18 billion. By that point, the Democrats went silent on the issue.

It is important to keep in mind that, as William D. Hartung shows, “there are hundreds of billions of dollars in ‘defense’ spending that aren’t even counted in the Pentagon budget.” Hartung goes agency by agency to show the “hidden” spending.  As he notes:

You might think that the most powerful weapons in the U.S. arsenal — nuclear warheads — would be paid for out of the Pentagon budget.   And you would, of course, be wrong.  The cost of researching, developing, maintaining, and “modernizing” the American arsenal of 6,800 nuclear warheads falls to an obscure agency located inside the Department of Energy, the National Nuclear Security Administration, or NNSA. It also works on naval nuclear reactors, pays for the environmental cleanup of nuclear weapons facilities, and funds the nation’s three nuclear weapons laboratories, at a total annual cost of more than $20 billion per year.

Hartung’s grand total, which includes, among other things, the costs of Homeland Security, foreign military aid, intelligence services, the Veterans Administration, and the interest on the debt generated by past spending on the military, is $1.09 trillion.  In short, our political leaders are far from forthcoming about the true size of our military spending.

Militarization comes home

Opponents of this huge military budget are right to stress how it greatly increases the dangers of war and the harm our military interventions do to people in other countries, but the costs of militarism are also felt by those living in the United States.

For example, ever-escalating military budgets fund ever newer and deadlier weapons, and much of the outdated equipment is transferred to police departments, contributing to the militarization of our police and the growing use of force against domestic opponents of administration policies, the poor, and communities of color.  As Lisa Wade explains:

In 1996, the federal government passed a law giving the military permission to donate excess equipment to local police departments. Starting in 1998, millions of dollars worth of equipment was transferred each year, as shown in the figure below. Then, after 9/11, there was a huge increase in transfers. In 2014, they amounted to the equivalent of 796.8 million dollars.

Those concerned about police violence worried that police officers in possession of military equipment would be more likely to use violence against civilians, and new research suggests that they’re right.

Political scientist Casey Delehanty and his colleagues compared the number of civilians killed by police with the monetary value of transferred military equipment across 455 counties in four states. Controlling for other factors (e.g., race, poverty, drug use), they found that killings rose along with increasing transfers. In the case of the county that received the largest transfer of military equipment, killings more than doubled.

Militarization squeezes nondefense social spending 

Growing military spending also squeezes spending on vital domestic social services, including housing, health, education, and employment protections, as critical programs and agencies are starved for funds in the name of fiscal responsibility.

The federal budget is made up of nondiscretionary and discretionary spending.  Nondiscretionary spending is mandated by existing legislation, for example, interest payments on the national debt.  Discretionary spending is not, and thus its allocation among programs clearly reveals Congressional priorities.  The biggest divide in the discretionary budget is between defense and nondefense discretionary spending.

The nondefense discretionary budget is, as explained by the Center on Budget and Policy Priorities:

the main budget area that invests in the nation’s future productivity, supporting education, basic research, job training, and infrastructure.  It also supports priorities such as providing housing and child care assistance to low- and moderate-income families, protecting against infectious diseases, enforcing laws that protect workers and consumers, and caring for national parks and other public lands.  A significant share of this funding comes in the form of grants to state and local governments.

As we see below, nondefense discretionary appropriations have fallen dramatically in real terms and could fall to a low of $516 billion if Congress does not waive the sequestration caps established in 2011.

The decline is even more dramatic when measured relative to GDP.  Under the caps and sequestration currently in place, nondefense spending in 2017 equaled 3.2 percent of GDP, just 0.1 percentage point above the lowest percentage on record going back to 1962.  According to the Center on Budget and Policy Priorities, “That percentage will continue to fall if the caps and sequestration remain unchanged, equaling the previous record low of 3.1 percent in 2018 and then continuing to fall (see the figure below).”

Looking ahead

As the next figure shows, the proposed Trump budget would intensify the attack on federal domestic social programs and agencies.

If approved, it “would take nondefense discretionary spending next year to its lowest level in at least six decades as a percentage of the economy and, by 2027, to its lowest on that basis since the Hoover Administration — possibly even earlier.”  Of course, some categories of the proposed nondefense discretionary budget, such as veterans’ affairs and homeland security, are slated for growth, which means that the squeeze on other programs would be worse than the aggregate numbers suggest.

No doubt the Democratic Party will mount a fierce struggle to resist the worst of Trump’s proposed cuts, and they are likely to succeed.  But the important point is that the trend of militarizing our federal budget and society more generally will likely continue, a trend encouraged by past Democratic as well as Republican administrations.

If we are to advance our movement for social change, we need to build a stronger grassroots movement in opposition to militarism.  Among other things, that requires communicating more effectively all the ways in which militarism sets us back: how it promotes racism and social division, globalization and economic decay, and the deterioration of our environment and quality of life, as well as death abroad and at home, all in the interest of corporate profits.  In other words, we have to find more effective ways of drawing together our various struggles for peace, jobs, and justice.

State Conservatives Block City Progressives

Recently, organizers in a number of cities have helped build strong local coalitions that successfully won passage, either through ballot measures or votes by elected officials, of measures that improved living and working conditions for the majority.  Examples include higher minimum wages as well as fair scheduling, paid leave, and improved prevailing wage laws.

Now, conservative forces, organized by groups such as ALEC, are using their influence in state legislatures to pass preemption laws to block this progressive city strategy and, in some cases, roll back past gains. This development is well described by Marni von Wilpert in a recent Economic Policy Institute report titled “City governments are raising standards for working people—and state legislators are lowering them back down.”

Preemption and the rise of the right

Preemption allows a higher level of government to restrict the power of a lower level of government in areas where it believes that lower-level action conflicts, or might conflict, with its own.  In terms of state politics, state governments can use preemption to restrict the rights of city governments.

A case in point, as described by von Wilpert:

In 2015, the Birmingham City Council passed an ordinance raising the city’s minimum wage to $8.50 effective July 2016 and to $10.10 effective July 2017. At the beginning of the 2016 session, the Alabama state legislature fast-tracked a minimum wage preemption law, which Governor Robert Bentley signed 16 days after the bill was first introduced, nullifying Birmingham’s ordinance and knocking the minimum wage back down to $7.25

At one time, preemption was used by more liberal state governments to keep more conservative city governments from undercutting social standards.  However, as von Wilpert explains, “Now that the Republican Party controls 33 governorships and has majority representation in both chambers of most state legislatures, conservative state legislators have increasingly used preemption laws to strike down local government efforts to increase the quality of life for working people in their municipalities.”

Preemption and minimum wage laws

The federal minimum wage has not been increased since 2009. In 2017, the federal minimum wage of $7.25 was worth 12 percent less, in real terms, than when it was last raised, and is 27 percent below its peak value in 1968.  Working people have therefore pushed hard to get their states and/or localities to take action, and with growing success at the local level.  “Before 2012, only five localities had enacted their own local minimum wage laws, but as of 2017, forty counties and cities have done so.”

But now, as the following figure from the EPI report makes clear, conservative state lawmakers are fighting back, using preemption to restrict local action.  Twenty-five states now have preemption laws denying local governments the right to set their own minimum wages; more than half of these laws were passed beginning in 2013.

Preemption and paid leave

State level right-wing forces have also taken aim at paid leave laws, which generally include the right to paid sick and family medical leave.  There is no federal law guaranteeing workers the right to paid leave, and, as with minimum wage gains, workers have been most successful in winning paid leave at the local level.  However, as we see in the following figure, state legislatures, since 2013, have been busy denying local governments the right to implement their own higher standards.  Twenty states now have preemption laws covering paid leave.

Preemption and fair scheduling

There are currently no federal laws that ensure workers basic fairness and predictability in scheduling.  As von Wilpert describes,

While waiting for the federal government to act, four cities and two states have passed various forms of fair work schedules legislation. But in the last few years, as local governments have begun to innovate in the arena of fair scheduling, state governments have stripped local governments’ abilities to do so—[as we see in the following figure] at least nine states have passed work scheduling preemption laws since 2015.

Preemption and prevailing wage/project labor agreements

Prevailing wage and project labor agreements require private contractors to treat workers fairly, including paying all their workers the prevailing wage, when doing work under government contract.  Such agreements keep private contractors from competing for public work at the expense of their workers.

And, as in the other areas of labor rights discussed above, we see a similar explosion in state action to restrict the right of localities to set higher standards for public contracting.  At least 12 states now have preemption laws, all but one of which were passed beginning in 2013.

What’s next?

The current right-wing strategy highlighted above greatly reduces what working people can win at the city level in many states.  Of course, there are still many states where local initiatives can bring real improvement, and these should obviously continue.  At the same time, it seems clear that the political environment is changing, and not for the better, in terms of what local efforts can produce.

While far from easy, organizers have little choice but to deepen and extend their work.  Among other things, this means linking local and city coalitions in order to strengthen their influence at the state level.  It also means putting more emphasis on building organizations and alliances of working people around a vision of good jobs for all, a strong and accountable public sector serving human needs, and healthy cities and communities, a vision to be won through organizing and direct action as well as electoral work.  Above all, this will require seeking and sharing creative ways to strengthen working class solidarity, which is key to overcoming the existing divisions that allow right-wing forces to set the terms of our political choices.