TORONTO — Age of Fracture is a "big book." It was penned by Daniel Rodgers, an intellectual historian of America and professor emeritus at Princeton University. In 2012, Rodgers' tome made him co-winner of the Bancroft Prize, a prestigious award for books about the Americas or diplomacy. Since its 2011 publication, a mini-industry of criticism and learned discussion has surrounded Age of Fracture, and the accolades and intellectual scrutiny are justified. Rodgers deploys an unparalleled ability to look from on high at discernible shifts in American cultural discourse from approximately 1980 until 2010.
My purpose here is to review the central strands of Rodgers' powerful argument while considering current trends that sometimes buttress and sometimes challenge his thesis. Needless to say, Rodgers cannot be held accountable for analysis after the 2011 publication of his book. However, its strength and influence legitimate mulling its arguments some five years after its initial publication.
As the 1960s began, elites in the United States forged a broad consensus over the country's mission in the world and its duties at home. American elites overwhelmingly agreed that the Soviet Union posed a security threat and a denial of democratic mores. Such Americans largely believed in American exceptionalism, of the "shining city on a hill." They even concurred about some American foibles. For example, there was a broad consensus about the need to end racism at home. All of that began to fray with the debate over Viet Nam and the explosion of racial tension in the aftermath of the assassination of Martin Luther King Jr. in 1968.
Rodgers argues that the American failure in Viet Nam, followed by the perceived weakness of President Jimmy Carter, gave rise to a counter-revolution in American thought. Make no mistake, this is an American book. It delves profoundly and cleverly into the interplay of political, media and academic discourse in a period that Rodgers frames as an "anti-Keynesian counter-revolution."
Rodgers asserts that God was re-born in the form of the free market as the Jimmy Carter presidency foundered and former Hollywood actor and California governor Ronald Reagan strode into the public sphere. With Reagan’s election, Rodgers asserts, a coalition of right-wing think tanks, White House advisors and fellow travellers in academia re-framed the argument for American exceptionalism. The country had lost its soul, so the argument went, in weak-kneed military policy and ineffectual social democratic policies in welfare, education and race relations that fostered dependence.
Through Rodgers' eyes the ascendancy of a revitalized right-wing, market-loving intelligentsia surrounding the Reagan presidency amounted to an "anti-Keynesian counter-revolution." Its evangelists believed the market could do no wrong. American satellites such as Chile and Argentina needed to get with the programme and undergo "shock therapy" to wean themselves from social democracy or, worse, Latin American variants of Marxism.
Margaret Thatcher became a revered icon. Ronald Reagan, despite some transparent wobbles towards accommodation during his term as governor of California and in his presidency, was put on a pedestal as an unwavering champion of unflinching, inexorable market forces which, according to its zealots, would result in greater wealth for all.
Rodgers takes no prisoners in his scathing review of presidential advisers such as the political advertising and media guru Michael Deaver and anti-Keynesian economists, like Milton Friedman, who achieved almost cult status among worshippers at the altar of free markets. Rodgers describes the period as a victory for an elitist counter-intelligentsia.
This particular trahison des clercs (betrayal of the intellectuals) — to use Julien Benda's memorable phrase about the unthinking embrace of nationalism by intellectuals in the early twentieth century — eventually foundered. America would back off the market-as-religion ideology, to some extent, with the defeat of President George H. W. Bush by Bill Clinton in 1992. Economists like Joseph Stiglitz regained favour, Stiglitz with his highly developed and resounding put-down of "market Bolsheviks." Seizing on a mild recession of the George H. W. Bush presidency, Clinton advisors proclaimed "It's the economy, stupid" and steered the campaign to echo an earlier period when activist government had legitimacy. However, the anti-Keynesian impulse was more than a matter of the Republican-Democrat divide; under Clinton, free market ideology granted big favours to Wall Street that contributed to the dot-com crash that immediately followed Bubba's second term.
Impressively, Rodgers does not merely debunk and ascribe blame. For instance, he is attuned to ways in which the right, sometimes intelligently, spoke to Americans in a way the left had abandoned. Rodgers invokes the classic Tory, indeed Burkean, concept of "little platoons" to describe how both Presidents Bush appealed to the family, religious groups and local organisations to nurture community self-help as an antidote to big government fixes. George H. W. Bush evoked "a thousand points of light" as an alternative to the intrusion of a nanny state.
The broad acceptance of the charter school movement by Republicans and Democrats alike is an example of how the right’s ideology moved the needle of American acceptance of a retreating state. Even President Obama’s successful campaign for an expansion of available health care, by enlisting and nudging private insurers while creating profit opportunities for them, is also perhaps a legacy of the age of fracture Rodgers describes.
There are a number of tentative observations to be made and questions to raise by virtue of writing five years following the publication of Rodgers' book. He ends with the attacks of September 11, 2001 and their patriotic aftermath. America, he argues, was temporarily re-united in a surge of patriotism that transcended party affiliation and ideology. I wonder how he would now assess the polemic between champions of the security-state apparatus that emerged in the "War on Terror" and its opponents, such as Glenn Greenwald, Laura Poitras and Edward Snowden.
One of the best things in the book is how its author brilliantly describes a period of ascendancy for the American right, which raises the question: How would he account for the unlikely rise of Bernie Sanders, an unvarnished American democratic socialist, who is waging a credible campaign for the Democratic presidential nomination against establishment figure Hillary Clinton? Has Sanders tapped into "little platoons" of American environmentalists, post-secondary students, local economy advocates and anti-imperialists that would otherwise remain outside the traditional political arena? At this juncture, Hillary Clinton's anticipated coronation might face a serious threat from the Sanders campaign. Speaking of fracture in the US, while Sanders is making an impact on the left, what remains of a Republican centrist establishment has been overwhelmed by the unpredictable insurgencies of Donald Trump and Ted Cruz in the quest to find a Republican presidential candidate. Fractures to the left and right, Dr. Rodgers!
And, looking beyond the US, how to account for the recent success of socialist Jeremy Corbyn as leader of the British Labour Party? After all, Margaret Thatcher had inspired Reagan, and Tony Blair was labelled 'Bush’s poodle' over the invasion of Iraq. Could an electorally significant segment of UK opinion now be veering away from American political leadership? And here in the colonies, what would Rodgers make of the unlikely rise of a social democratic government with environmentalist leanings led by one Rachel Notley in the oil rich (and oil dependent) province of Alberta?
If one extends Rodgers' gaze into the present and regards the US and countries profoundly influenced by American thought, such as Canada and the United Kingdom, one sees that the fracture and disaggregation of which Rodgers wrote has continued. The fissures transcend Cold War ideologies. They speak to disagreements about climate change, inequities of income, the struggle between security and privacy in a digital age, and the response to the flight of hundreds of thousands, if not millions, of people across Europe's borders as the Middle East disintegrates. Responses to these dilemmas will be at once ideological and unpredictable. Perhaps in five to ten years a worthy successor to Daniel Rodgers will train a penetrating gaze on the years from 9/11 up to our present crises.
This article was previously published in The Journal of Wild Culture, July 26, 2016.
JAMES CULLINGHAM is a journalism professor at Seneca College and documentary filmmaker in Toronto. He is the director and producer of In Search of Blind Joe Death — The Saga of John Fahey (2012) and executive producer of The Pass System (2015), an historical documentary film about segregation of Canadian First Nations people.
Since the founding of the Republic, "American Exceptionalism" has been a guiding principle for candidates seeking high office. Next only to a professed belief in God, a firm, if often unfounded, insistence on our benign and beneficent superiority to all other nations and cultures is indispensable for aspiring leaders. From Lincoln's elegant warning that America is "the last, best hope of earth" to the religious overtones of Reagan's "shining city on a hill," our leaders embrace this notion of Exceptionalism with a fervor that often seems at odds with our more provincial aspirations, such as lowering taxes or reducing bureaucratic red tape.
But while American Exceptionalism guides candidates, the concept itself has two very distinct historical lineages. A more hands-off interpretation of our Exceptionalism, famously espoused by Washington in his Farewell Address but most eloquently explained by John Quincy Adams, goes like this: "America… abstain(s) from interference in the concerns of others…. Wherever the standard of freedom has been… unfurled, there will be her heart…. But she goes not abroad, in search of monsters to destroy." This so-called "isolationist" brand of American Exceptionalism, however, is rarely espoused today. Of all the erstwhile candidates this presidential cycle, only Bernie Sanders and Rand Paul fully embraced it.
The other, more popular strain of American Exceptionalism is the Wilsonian brand. Wilson was not the first to espouse a more messianic American Exceptionalism, but he was, at least until George W. Bush, its most prominent and blundering advocate. Most of the Republican presidential candidates this year and Hillary Clinton fall squarely into this camp, whose members fervently believe there are few problems in today's world that can be solved without U.S. involvement. They embrace the mythology of "interventionist" exceptionalism as fervently as the leaders of any bygone empire that sought to bring civilization and justice to a barbaric world. As former Secretary of State Albright, unabashedly preening, put it: "If we have to use force, it is because we are America. We are the indispensable nation. We stand tall. We see further into the future."
Why the Good War was So Bad
Understanding why the Washingtonian brand of Exceptionalism has fallen out of favor and the Wilsonian messianic brand of Exceptionalism has taken hold in the American psyche is not possible without understanding America’s view of history. The nations of the world don’t agree on much, but one thing on which there is unanimity is the belief that Americans have no sense of history. Everywhere one looks from Asia to Europe to the Middle East there prevails an unshakeable belief that Americans are so busy with the future, they do not adequately value the past.
But they are mistaken. While it is certainly true that Americans have a wonderful facility for not allowing grievances from the distant past to control their outlook (how many Americans, for example, even know that the British burned Washington DC in 1814? And how many of those who do know think it was such a bad idea?), we are not completely clueless about history. The real problem is that most of us are only comfortable with a single decade of history—from 1938 to 1948—from Kristallnacht and Munich to the onset of the Cold War. For most of the American public and certainly for most of our political leaders, the other 5,000 years of history are no more than a quaint and irrelevant footnote.
And because we are perpetually stuck in this time warp, we see foreign affairs through a strange prism: every attempt at compromise is "appeasement," every dictator is "another Hitler," every effort at nation-building must succeed just like the Marshall Plan did, every aggression is a move toward world domination, and most importantly, every enemy is an existential threat who only understands brute strength and firm resolve. So, when North Korea marched south it was like the Nazis storming into Poland, and when the North Vietnamese infiltrated the south Ho Chi Minh was another Hitler, and when Iraq took over Kuwait it was like the Soviets taking over Eastern Europe, and Saddam again was yet another Hitler, and the annexation of the Crimea was the Sudetenland all over again, and any effort to broker an agreement with Iran on nuclear matters must be appeasement.
Reassessing American Exceptionalism
As a people we Americans are in urgent need of a more nuanced, more mature view of history. We must come to see that not every competitor is an adversary and that every adversary need not be an enemy, and that even among our now myriad enemies, not all pose existential threats to us and our way of life. But this will take wisdom and courage, and a realization that most of history has been a bloody mess of misunderstandings and arrogant mistakes.
The madness of World War I almost brought us to our senses. It briefly convinced many in the West that war usually is a lose-lose proposition for civilization, but then Hitler “saved” us from peace and reason. Confronted with a genuine existential threat and appalled by the brutality and ruthlessness of the Nazi regime and Japanese Empire, we realized that waging war was necessary in that unique set of circumstances. But given the disastrous consequences of our repeated interventions over the last few decades, the histories of World War I and the Peloponnesian Wars are arguably better cautionary tales for the modern world than our simplistic embrace of the lessons learned from World War II and its immediate aftermath. Yet we will not forsake that cursed decade of triumph and clarity easily; we yearn for it and it beckons to us, offering heroic purpose and a comforting certainty about who we think we are as a nation.
The interventionists—both Democrats and Republicans—have gotten it wrong almost every time they have interceded in foreign conflicts. Since World War II going "abroad to destroy monsters" is exactly what we do, breathlessly, unthinkingly, blindly. We are no longer just a "bright beacon to the world"; for the last seventy years our exceptionalism has been imbued with a Messianic fervor and a Manichean perspective of absolute Good countering absolute Evil that threatens to destroy the very ideals we seek to promote.
But even fools can make wise observations: while the interventionists have been wrong over and over in getting us involved in military conflicts of questionable necessity, they have been right that John Quincy Adams' brand of American Exceptionalism is outdated and inadequate for the modern age. The interventionists are correct that the modern world with all its technological advancements is too small, too interconnected, and too interdependent to allow us to isolate ourselves completely and merely cheer from the sidelines. It is doubtful that even Washington and Adams, if alive today, would have much confidence in the size of the oceans to ward off our enemies, and it is equally doubtful given the complexity of foreign relations today that they would be so disparaging of all foreign entanglements. But it is time—past time—for the pendulum to swing back after seventy years of hyper-interventionism to a more balanced brand of Exceptionalism that is neither reflexively interventionist nor irrationally isolationist.
What is to Be Done
First, use the label sparingly and prudently. Admittedly this will feel akin to amputating a leg or tearing out one's own heart. A passionate belief in our Exceptionalism is something we have all been hand-fed since infancy, and accepting that we are something less than unique and special will be painful. But like every other extremist "ism" we denounce and distrust and have sought to destroy, American Exceptionalism can sometimes embody a threat to republican principles. Imbued with a strong sense of Exceptionalism, our leaders sometimes justify policies and practices that we would generally find loathsome. Just as Marxism and Fascism and Islamic Extremism enable their followers to commit deplorable acts for some greater future good, so too American Exceptionalism—despite all the good and inspiring aspects of it—has made it easier for us to do similarly appalling acts. Embracing American Exceptionalism with thoughtful humility rather than boastful pride would be salutary.
Second, be consistent. From the very outset, post-World War II interventionism has suffered from a notable hypocrisy of purpose. We unabashedly rail against the tyrannies in Syria and Iran, but continue to ignore the equally tyrannical and arguably more dangerous governments in Saudi Arabia, Pakistan, and elsewhere. At one time this approach, while never morally defensible, made some sense from a practical perspective. It was arguably practical because we needed certain resources such as oil from the Middle East, and it was also practical because we were waging a decades-long cold war, so we cleverly delineated between totalitarian regimes (our enemies) and mere authoritarian regimes (our friends) despite there being little difference in how they ran their governments and mistreated their citizens.
But these practicalities no longer apply. The end of the Cold War and American technological ingenuity have freed us from compromising our principles, yet political inertia keeps us stuck in old thought patterns. An uninformed realism still pervades our thinking and we hesitate to make a clean break with the flawed strategy of the post-WW II era. This uninformed realism is worsened paradoxically by an overwrought sentimentalism in our formulation of foreign policy. How else can we plausibly explain our decades-long reluctance to accept the reality in Cuba or our refusal to ever discern a difference between American interests and Israeli interests? Combined, this misapplied realism and misplaced sentimentalism have led much of the world to see our interventions as hypocritical and self-serving. We would do well to recall Washington’s warning that a “nation which indulges towards another a habitual hatred or a habitual fondness is in some degree a slave. It is a slave to its animosity or to its affection, either of which is sufficient to lead it astray from its duty and its interest.”
Third, embrace as allies only those who share our ideals and interests. Ally is an overused and underappreciated relationship. To be called a “friend and ally” of America should mean something special. It frankly does not apply to many nations which today we profess to be allied with—much to our own detriment. There are few countries outside of Europe that would genuinely qualify as allies, and even some within Europe should be put on notice that an allied relationship with America is not always a given.
Fourth, distance ourselves from those who do not seek to emulate our ideals and principles. This does not mean we need to intervene in those countries such as Saudi Arabia or Pakistan or even pressure them to change. What it does mean is that we do not give economic support to them, we do not give them most-favored nation trade status, and we do not share with them sophisticated military and civilian technology. If American Exceptionalism means anything it should mean that we do not enable and protect those who have interests and ways of life that threaten our own principles and way of life.
Fifth, limit unilateral military interventions to only those rare instances when an immediate and compelling threat to the United States is clear. And then respond with overwhelming force and leave. Nation-building, generally an unrealistic fairytale born of the success of the Marshall Plan, should be avoided at least until all hostilities have ended. Moreover, we should not support in any manner any third country military conflicts, such as the ongoing war in Yemen, where all sides are blameworthy and committing atrocities.
Finally, clearly define the criteria for interventions of a humanitarian nature. While we do have a legal and moral obligation to prevent genocide and to safeguard civilian populations, such interventions should never be done unilaterally or only with allies. Only UN-sanctioned humanitarian interventions should be supported.
There are plenty of arguments against these proposals. Don’t we risk another Rwanda if we wait for the United Nations to act? Don’t we risk worsening the economic plight of innocent populations by not compromising with tyrannical regimes? Don’t we risk our own self-interest by not finding common ground with despotic nations like Saudi Arabia and China? But the record of the last seventy years doesn’t bode well for the next seventy if we continue along the same path. Over time a more hands-off policy toward those not like us and an even stronger embrace of those who share our values would eventually benefit both the US and much of the world. The UN should become more active and competent; innocent populations could rise up against their corrupt governments or those governments could see the benefits of change; unallied states such as Saudi Arabia and even China may eventually find it beneficial to institute political reforms and improve human rights.
As the last century began we sought to liberate Cuba, civilize the Philippines, and capture that terrorist Pancho Villa. These exercises in Exceptionalism culminated in our disastrous entry into World War I. In less than a year it will be the centennial of our entry into that war to safeguard democracy. As we learned at great cost, our intervention not only didn’t make the world safe for democracy, it did make the world safe for fascism and other forms of totalitarianism. We are at a similar crossroads today, striving to make the modern world safer—and unintentionally also safer for terrorism and lawlessness—as we engage more and more in counterproductive struggles around the world. We have already stumbled badly and repeatedly in the first sixteen years of this century. Our misuse of American Exceptionalism in Afghanistan, Iraq, Syria, Libya, Ukraine and elsewhere does not bode well for the rest of the century, but there is no need to keep repeating the same mistakes of the past if our leaders choose a different path. A foreign policy firmly based on consistency, restraint, and adherence to our founding principles would ultimately achieve what most of us mistakenly believe we already possess: an American Exceptionalism admired and envied by the world.
Published: Oct 6, 2016
Joseph Mussomeli served for almost thirty-five years as an American diplomat, including tours in Egypt, Afghanistan, Morocco, and the Philippines. He was the U.S. ambassador to the Republic of Slovenia and the Kingdom of Cambodia. Before entering the U.S. Foreign Service in 1980, he worked as a Deputy Attorney General in New Jersey. Mr. Mussomeli is the author of The UnChristmas Story: The One Who Said No.