Directing public money to private organizations to look after the common good is a story as old as the American republic, and a practice inherited from colonial times.
When the Great Depression overwhelmed the country's patchwork system of private charities and fraternal benefit organizations, President Franklin Roosevelt launched a new era of public policy by establishing a federal role in the welfare of American citizens, and forging the mechanisms to deliver it. The unfinished business of the New Deal, like federal aid to primary and secondary schools or national health insurance, preoccupied reformers for a generation, until Lyndon Johnson signed important precedents for both into law.
This well-known story is often poorly told. As I have argued in two books on very different subjects, the ambition and institutional capacity of both the federal and state governments grew in tandem, not at each other's expense. The old and animated debates over federalism are inadequate narrators of the state: the real story of the twentieth century is the expansion of government power at all levels. The federal income tax, for instance, first assessed on most Americans in peacetime only after World War II, and the "federal adjusted gross income" figure derived from it, gave states a way to calculate an income tax for all of their wage-earning residents, rather than negotiate one with only a few. During that same period, the World War II GI Bill, which delegated administrative authority for its famed education benefit to the states, added large numbers of staff and new expectations to individual state departments of education.
In a variety of ways, standardized federal processes and expanded federal programs coaxed and cajoled growth and renovation throughout the states. As the late legal scholar Bill Stuntz once observed, the Bill of Rights, the first 10 amendments to the Constitution, widely celebrated as a restraint on state power, may have endured precisely because they are not. Though much political rhetoric suggests otherwise, Americans are not governed by a "weak state," even when measured by social policy alone.
The survival, and recently expanded role, of private organizations in the conduct of public policy is a better-known story, but one subject to its own distortions. Many date the privatization of federal functions back to Ronald Reagan, but in fact, seminal New Deal policies made concessions to both state governments and pre-existing private providers. Depending upon the program, some private providers receive money directly from the government; many others are compensated in the form of deductions from their tax bill. Legislators reluctant to raise taxes in the wake of the Reagan revolution have availed themselves of the latter funding method so thoroughly that anyone looking to understand the nature and scope of U.S. government intervention in the lives of its citizens must inspect the tax code and revenue collection well before looking to federal and state budgets.
Contrary to the run of political discourse, we have not lived in an era governed by deficit politics -- that is, a reluctance to spend money on the public good. Such a notion should be absurd on its face, given the extraordinary $2.7 trillion the United States has expended on a futile war in Iraq. But the observation holds even without the military: according to the Organization for Economic Cooperation and Development, the United States hosts the second largest welfare state in the world. Instead of austerity, what we've had is a political establishment that invokes the rhetoric of austerity in order to dictate how, and for whom, public money should be spent.
Certainly the majority of Americans are not the primary beneficiaries of this enormous and expensive web of activity. There is no national or single-payer healthcare in the United States; nor is there a defined-benefit public pension that allows Americans to plan for and live comfortably in retirement. Indeed, unlike other wealthy nations, which sponsor an array of cradle-to-grave benefits as their social safety net, the United States ranks only 23rd in direct government spending on social welfare. To put it differently, New Deal public policy as it is traditionally understood, and against which conservatives ostensibly premise their revolution, accounts for only a portion of US investment in social policy.
This election has produced consistent -- albeit neglected -- evidence that young people especially are no longer satisfied with how the political equation of public policy is balanced in this country. The unimpressive performance and results of neoliberalism, or the withdrawal of government from the public sector, have driven insurgent candidacies in this presidential election cycle no less than disadvantageous trade agreements. Young people's hunger for better public policy has most often been recognized in our national conversation only indirectly, usually as derision for its projected costs. But in point of fact, the United States already has a strong social policy state and robust social welfare investments; over time, we have simply strayed from placing these in the service of the American people.
Of course the issue of privatization extends far beyond welfare provision: outsourcing formerly public functions to the private sector has fundamentally altered the US military, as well as how government builds infrastructure and performs maintenance. The residual and delegated powers vested in state and local government bear significance beyond the safety net as well. Both traditional and newly assigned forms of autonomy have ushered in a home and charter school revolution, often at the expense of local public schools; they have facilitated divestment from public higher education and driven an unprecedented expansion in incarceration; local and state control has also enabled disparate treatment on a range of issues, from policing to access to healthcare.
Many Americans have begun to ask whether this public policy model performs as efficiently, or as equitably, as it should. Without detailed knowledge or adequate exposition of the inner workings of policy, they nonetheless gravitate toward proposals that implement policy through government budgets, via public mechanisms, and according to the principle of universal entitlement entailed in federal policy. Instead of an obscure and unevaluated "submerged state" of tax transactions, or private corporations enriched by but unaccountable to the public, millions of Americans seem willing to consider the proposition that government exists for a reason, and is itself a reasonable, sometimes superior option.
Among Democrats, President Obama's most celebrated domestic accomplishment is the Affordable Care Act, a "kludgey" piece of legislation designed to defer to private corporate interests and individual state governments in order to establish some basic rules of insurance coverage. Yet even in the face of this acclaim, a recent poll found that a majority of all Americans (58 percent) prefer a single-payer or "Medicare for All" health care system.
Broadly speaking, in detectable but mostly unexamined ways, the public option is back on the table. This is itself a stunning rebuke of the political establishment, given that a case for government has not been championed by either major party; if anything, it has been disparaged and dismissed. In spite of the ridicule, one of its most concrete expressions, single-payer healthcare, pulls in significantly higher approval ratings than either presidential frontrunner. The idea of a government working on behalf of its people is the true underdog running in this race; if it were placed on the ballot, it would win. Not that it wouldn't face opposition: in Colorado a single-payer health policy proposal will be voted on in November, and both parties, including Democratic Governor John Hickenlooper, have mobilized to defeat it.