ON A BALMY THURSDAY NIGHT in Huntington, West Virginia, several thousand Trump fans lined up early outside the Big Sandy Superstore Arena. Some had traveled hundreds of miles; when a summer squall off the Ohio River drenched them in a downpour, nobody budged. It was August 2017, nearly nine months after Trump had won the White House, but the rallies continued. They had long since evolved past the point of drumming up votes; they had become something closer to a sacrament—a ritual that established what should be believed and what should not. For both the crowd and the performer, the evening promised a moment of escape. The city of Huntington was a place of faded glory; since the 1960s, the population had sunk by more than a third, to 49,000 people—smaller than Greenwich—and it made news, in most cases, for its distinction as a hot spot in the opioid crisis. Trump, for his part, was having a difficult week: in recent days, Republicans had finally acknowledged that they had been falsely promising their supporters that they could overturn Obamacare; moreover, his communications director had resigned after lambasting his colleagues in vivid profanity; and Trump himself had been roundly condemned for telling a crowd of police officers “please don’t be too nice” to people under arrest. But inside the arena, all of that seemed far away. The president, in a dark suit, white shirt, and striped blue tie, luxuriated in the usual chants—“Trump! Trump! Trump!” and “Lock her up!” He did his classic bits about “the swamp,” the “fake news,” and the second coming of “beautiful, clean coal.” He claimed, falsely, to have brought back “hundreds of thousands” of manufacturing jobs, and repeated a list of bogus talking points about the economy, immigration, and Hillary Clinton. For a demonstration of fealty, Trump invited up Jim Justice, the richest man in West Virginia, who was also the governor. Justice, a Democrat, announced that he was becoming a Republican. 
He offered Trump fulsome praise: “He cares about us in West Virginia. And most importantly of all, you know what, he has made us, as common everyday Americans, feel good and be proud of who we are.” But all of that was preamble to Trump’s main purpose of the night— to remind his people not to trust what they heard from the press, the Justice Department, or Congress. He taunted the special counsel Robert Mueller, who was investigating Russia’s involvement in the election. “Have you seen any Russians in West Virginia or Ohio or Pennsylvania?” he asked the crowd. “Are there any Russians here tonight?” Then, so subtly that it was easy to miss, the president eased from playful to conspiratorial. “They’re trying to cheat you out of the leadership that you want,” he declared, “with a fake story that is demeaning to all of us—and most importantly demeaning to our country and demeaning to our constitution.” Trump’s mode of deception had been so essential to his politics from the beginning that it was easy to lose sight of the accumulating effects it had on his audiences. To outsiders, it was tempting to write it off as nothing but a circus. But that was a mistake. Since the ancient Greeks, effective politics had combined spectacle and substance, and Trump had discovered that with the right spectacle, people would follow him almost anywhere on the substance. Did any of his words really matter? What were they giving people? Where would it lead? To understand the pressures that were building inside American political culture, I looked further into the past, to another uncertain and incendiary moment. In the summer of 1858, the American experiment was careening toward war, and two foes—Abraham Lincoln and Stephen A. Douglas—met in northern Illinois for the first in a series of debates on the future of slavery. Lincoln, who was challenging Douglas for his seat in the Senate, loomed a foot taller than his opponent, a squat, tenacious debater celebrated as the Little Giant. 
The men embodied both sides of America’s fatal divide: Douglas, who warned that Lincoln would make the prairie as “black as night,” advocated “popular sovereignty,” which would hasten the spread of slavery into the western territories—a prospect that Lincoln could not abide. By the standards of politics today, the debates—seven in all—were an exhibit of unrecognizable democratic rigor. In each one, either Lincoln or Douglas spoke first for an hour; then the other responded for an hour and a half; finally, the first spoke for another half hour. (In a previous encounter, they had held forth for seven hours straight.) Lincoln-Douglas was “the best circus in town,” as a reporter on the scene described it. Before the speakers began, bands played and liquor flowed. But if the debates were social occasions, they were not trivial ones. Thousands of people crowded around to listen, without the comfort of chairs or shade or electric amplification. Politics was mostly reserved for white, wealthy males, but on the edges of the crowd were women, European immigrants, and semiliterate frontiersmen. Attendees were so desperate to hear the debaters that they climbed onto a wooden platform, which collapsed under their weight. They shouted encouragement (“Hit him again!”) and hung banners with taunting nicknames (“Douglas the Dead Dog—Lincoln the Living Lion”). Week after week, the debaters traversed Illinois. Lincoln, short on cash, traveled by coach and ferry, while Douglas, a wealthy man whose wife owned slaves, journeyed on a private train, announcing his arrival by firing a cannon marked “Popular Sovereignty.” At times, the discourse onstage neared combustion. When Douglas falsely accused Lincoln of a conspiracy to abolish slavery, Lincoln leaped from his seat and advanced on his opponent until a colleague pulled him back. But the event stayed in the realm of persuasion. 
As Lincoln had put it, “Reason—cold, calculating, unimpassioned reason—must furnish all materials for our future support and defense.” For Lincoln, the debates became the venue for the full expression of his humanism. He sought to be progressive but electable, “radical without sounding too damned radical,” in the words of his biographer David S. Reynolds. Lincoln’s boldest comments came in the final encounter, when he made a stark distinction between “one class that looks upon the institution of slavery as a wrong, and another class that does not look upon it as a wrong.” Framing the issue in clear moral terms, he said, “It is the eternal struggle between these two principles—right and wrong—throughout the world. They are the two principles that have stood face to face from the beginning of time.” Lincoln lost his race for the Senate, but his performance in the debates made him famous. In the presidential contest of 1860, he won the North, which included all the states in which Black men could vote and also the six states in which the Lincoln-Douglas debates had been published. The Lincoln-Douglas debates came to be regarded as a preeminent example of American political discourse in the nineteenth century—a fierce clash of ideas, sustained by the close attention of the public. But they also came to represent a darker lesson: for all their eloquence, they could not avert the Civil War or protect Lincoln from assassination. American political culture was bounded by a contest between reason and violence—a seesawing battle that continues to this day, between the aspiration to persuade fellow citizens to accept your views and the raw instinct to force them to comply. In its most idealized form, the original ambition of the United States was to fashion a system that improved on “what kings and popes had decreed,” the Stanford historian Caroline Winterer wrote in her book American Enlightenments. 
“Wielding the gleaming razor of human reason, sharpened by empirical evidence, common sense, and withering sarcasm, they would slash away at traditions that rested on nothing but the dust of convention and privilege,” she observed. Early Americans formed literary salons, subscription libraries, and scientific societies, animated by the spirit of the Enlightenment. Benjamin Franklin gathered what he called “ingenious Acquaintances” into a “Club for mutual Improvement.” Known as the Junto, it was devoted to rigor, training, and the spread of the printed word, an ethic that the club called “Reason’s eye.” By the mid-nineteenth century, the country was in the midst of a vibrant literary outpouring. In Washington, orators such as Henry Clay and Daniel Webster gained influence through speeches that drew huge crowds. “Eloquence, in this empire, is power,” a journalist observed. A generation of thinkers and politicians—Margaret Fuller, Elizabeth Peabody, Frederick Douglass, Walt Whitman—produced impassioned writings and speeches that they hoped would reform the young republic, giving rise to what the scholar James Perrin Warren later called a “culture of eloquence.” They traveled from town to town on the lyceum circuit, an adult-education campaign offering lectures on everything from physical exercise to the moral crisis of slavery. Alfred Bunn, an Englishman visiting in 1853, said that it was “a matter of wonderment” to see “the over-tired artisan, the worn-out factory girl” rush from work to “the hot atmosphere of a crowded lecture room.” Even as the country slid toward the Civil War, the lectures continued, rooted in the belief in what Warren called “the word as a means toward reform.” At the same time, however, America was embarking on a surge of political violence, much of it directed at Black people, immigrants, Native Americans, and abolitionists. Between the 1830s and the outbreak of war, there were at least thirty-five major riots in the Northeast. 
One of them began in June 1857, when three nativist gangs—the Chunkers, the Rip-Raps, and the Plug Uglies—attacked Catholic immigrants in Washington, D.C., as they tried to cast ballots. But the most ominous sign for the republic was the growing brutality among some of the country’s most powerful people: members of Congress. In The Field of Blood, the Yale history professor Joanne B. Freeman examined scores of previously unstudied attacks and melees, often initiated by southern lawmakers who regarded opposition to slavery as a threat to their property and their power. In the 1840s, Representative John Dawson of Louisiana threatened to cut a colleague’s throat “from ear to ear,” and was stopped from shooting another only by the intervention of other congressmen. Freeman described a legislature guided by the ethics of professional wrestling: “Punching. Pistols. Bowie knives. Congressmen brawling in bunches while colleagues stood on chairs to get a good look.” The fighting escalated to the point that a southern lawmaker threatened to lead an assault on the Capitol, and British diplomats came to regard the House floor as too dangerous to visit. Benjamin Brown French, a genial New Englander who served as clerk of the House of Representatives, stopped socializing with southerners and ultimately took to carrying a pistol. When I asked Freeman how violence and the cult of reason could coexist, she said that they sprang from a shared motive: “How did you prove that you were a leader in that period, to a vast audience? How did you earn support? Maybe through aggressive oratory. Maybe by making, and keeping, promises for your constituents, state, and section of the Union. 
And, for a time, maybe by displaying your domination of the political playing field with bullying and aggression.” Freeman’s history of congressional violence is an account of how some of the most privileged members of a society began to see their counterparts as enemies, and eventually as existential threats. Once political leaders lost trust in one another, the public was doomed to follow. “Unable to turn to the government for resolution, Americans North and South turned on one another,” she wrote. The enduring tension between violence and politics attracted the attention of Richard Hofstadter, the historian best known for his work on what he called the “arena for uncommonly angry minds,” including anti-intellectualism and “the paranoid style.” In 1970, near the end of his life, Hofstadter became fascinated by the juncture of democracy and force. It had swept through American society in recent years, producing assassinations and riots. Working with a coauthor, Michael Wallace, who collected two thousand cases of violence—massacres, rebellions, vigilantism—he hoped to address what he called the American paradox: “There is far more violence in our national heritage than our proud, sometimes smug, national self-image admits of.” Hofstadter noted that in America, unlike the rest of the world, political violence rarely involved poor citizens rising up against a powerful state; more often, citizens attacked one another, and, usually, the attackers were established Americans—white Protestants, in many cases—turning on minorities, immigrants, “Catholics, radicals, workers and labor organizers.” Hofstadter made note of “verbal and ideological violence” that laid the foundation for actual harm. He also fretted about a “rising mystique of violence on the left.” By 1969, the Student Nonviolent Coordinating Committee, a civil rights group cofounded by John Lewis a decade earlier, had elected new leadership and dropped “Nonviolent” from its name. 
The usually staid New York Review of Books had featured an instructional diagram for making a Molotov cocktail. In the age of television, Hofstadter sensed, practitioners had figured out that what played well on TV was often the language and the imagery of force. On both the left and the right, he wrote, politics was giving way to a culture of self-expression in which the “distinction between politics and theatre has been deliberately blurred.” That blurring of distinction, between politics and theater, was a remarkably fitting preamble to the observations, in the decade that followed, by Neil Postman in Amusing Ourselves to Death. He watched the 1984 presidential debates, between Ronald Reagan and Walter Mondale, and lamented the hollow dodges, casual deceptions, and abbreviated answers. With a level of concern that now looks quaint, he bemoaned Reagan’s easy laugh lines and wrote, “The men were less concerned with giving arguments than with ‘giving off’ impressions, which is what television does best.” It would be three decades before the host of a reality show entered a bid for the presidency. But Postman had already sensed that “the demarcation line between what is show business and what is not becomes harder to see with each passing day.” For as long as Americans had strained to cultivate “reason’s eye,” they had fretted about the perennial perils of ignorance. Less than a generation after the founding of the country, Thomas Jefferson wrote, “If a nation expects to be ignorant and free, in a state of civilization, it expects what never was and never will be.” But by the early years of the twenty-first century, Americans were no longer surprised by annual reports that showed our students falling behind other countries’. In a 2005 survey, two thirds of Americans could not name the three branches of government. Scarcely a third of high school seniors read at or above the level of proficiency. 
Every turn in technology carried potential for not only liberation but also deceit and degradation. In a radio address in 1931, the philosopher and educator John Dewey warned of a growing vulnerability in the very medium he was using. “Democracy will be a farce,” Dewey said, “unless individuals are trained to think for themselves, to judge independently, to be critical, to be able to detect subtle propaganda and the motives which inspire it.” Human beings, as Dewey knew, were better at absorbing new information than at defending against lies. In a classic study, people shown a list of novel pieces of information were warned that some of it was bogus; but, quizzed later, they tended to remember the information and forget which pieces were false. Advertisers have known for decades that people are not good at resisting even naked efforts to distort their decision-making. In a study by the marketing professors Gavan Fitzsimons, of Duke, and Baba Shiv, of Stanford, a group of subjects were told simply that cake “may have some major health benefits.” As vague as that was, when they were later offered a choice of cake or fruit, those subjects were nearly twice as likely to choose the cake as other people were. (In follow-up interviews, the cake eaters roundly denied that their behavior had been affected by the suggestions.) Manipulation is an ancient feature of politics, but in the modern age it has acquired new effectiveness, not only because the technologies of influence have been refined but also because the economic stakes have grown. In the late 1960s, at the same time that libertarians such as Bill Middendorf in Greenwich were growing concerned about the environmental and consumer-protection movements, executives were embarking on a creative campaign to defend their industries. In 1969, as the cigarette industry faced stricter regulations, a memo circulated among executives at Brown & Williamson tobacco. 
“Doubt is our product,” it explained, “since it is the best means of competing with the ‘body of fact’ that exists in the minds of the general public. It is also the means of establishing a controversy.” That strategy—the mass production of doubt, to compete against “the body of fact”—was a quiet revolution. The tools for generating the political theater of skepticism—think tanks, campaign finance, dubious science—would eventually be deployed against a broad range of political targets. On one front, Republican lawmakers promoted a campaign against “voter fraud,” calling for stricter ID laws that tended to reduce turnout among Democratic-leaning voters—mainly minorities, students, and the poor. Researchers found that, of one billion votes cast in all American elections between 2000 and 2014, election officials detected a total of thirty-one possible cases of impersonation fraud. But the controversy was the point. In a candid email in 2011, collected during a campaign-finance investigation, a Republican lobbyist in Wisconsin wrote to colleagues even before votes had been counted: “Do we need to start messaging ‘widespread reports of election fraud’ so we are positively set up for the recount regardless of the final number?” (By the fall of 2020, polls showed that nearly half of registered American voters believed in the existence of widespread voter fraud.) Over time, contempt for facts came to be an ideology. In 2004, an aide to George W. Bush (widely identified as Karl Rove, though he denied it) dismissed the “reality-based community,” by which he meant people who insist on inconvenient facts. “We’re an empire now,” the aide told the journalist Ron Suskind, “and when we act, we create our own reality.” Magical thinking was taking its place on the main stage of politics. 
Bill Moyers, in a speech on end-times rhetoric in evangelical politics, lamented, “One of the biggest changes in politics in my lifetime is that the delusional is no longer marginal.” In the 2008 book The Age of American Unreason, Susan Jacoby declared, “America is now ill with a powerful mutant strain of intertwined ignorance, anti-rationalism and anti-intellectualism.” Fact-checking, as a mode of thinking, became suspect. In 2012, after the Obama campaign said Romney’s claims about welfare were “blatantly false,” Romney’s pollster, Neil Newhouse, memorably responded, “We’re not going to let our campaign be dictated by fact checkers.” Even as industries were getting bolder in their manipulations of the public sphere, the die-off of traditional sources of information was accelerating. In February 2018, the Charleston Gazette-Mail, West Virginia’s most powerful newspaper, filed for bankruptcy. In August, in a milestone for the decline of local news, the Pittsburgh Post-Gazette announced that it would no longer print on Tuesdays and Saturdays—making Pittsburgh, a straight shot up the highway from Clarksburg, the largest city in America without a daily paper. In Clarksburg, The Exponent Telegram was scrambling to survive. In 2018, the local Kmart closed, followed by Sears—and they had been two of the biggest advertisers in town. Brian Jarvis, the young lawyer who had gone into the newspaper business, told me, “All of a sudden, they’re gone. We lost more than half of what we had in 2012.” He landed on an idea: he bought some other tiny papers—the Garrett Republican, the Weston News—and combined their back offices. He adopted the language of new media. The old office in Clarksburg became, in Jarvis’s words, a “creative hub.” At his cluttered desk in the creative hub, he told me, “We’ve got ninety-seven employees, thirty-eight content generators, and close to three hundred stories a week.” He smiled. 
“I’m only thirty-six, so I’m just going to keep going until somebody tells me to stop, or the good Lord does.” It was up to John Miller, the editor, to figure out how to generate those stories. He prodded his reporters to meet a formidable threshold: each was expected to write ten stories a week. He knew how difficult that was, but, as usual, he found a noble spin for it. “If you want to be a writer, you’ve got to practice the trade,” he told them. The blunt fact was that, as the business shrank, the paper could hardly afford to be as bold as it was in the days when the old editor, Bill Sedivy, was railing on the editorial page against mountaintop mines. “When everyone else is closing, it’s difficult for small newspapers to do things that are very controversial,” Julie Cryser, the former city editor, told me. “Newspapers are less likely to go across the line and really push the envelope, and to ask questions and look deeper.” Cryser worried that the disappearance of strong local papers was forcing people toward television and dubious sources on the Web. “There’s this big void where nobody is getting alternative opinions or ideas,” she told me. “There’s nothing that combats the concepts you have about the way the world works.” She brought up the KKK rally that came to town in 1999. “If, twenty years later, the KKK would come to Clarksburg today, I don’t know that it would have been the same as it was. People would be charged up by whatever they read on their iPhones or their Samsungs, and it would have been a lot more divisive and explosive than what it was.” Not far up the highway from Clarksburg, David Efaw, the miner who used to work for Patriot Coal, had received two newspapers at his house for years. “My mother-in-law preferred one, and I preferred the other, and then we’d swap,” he said. Even if he didn’t like it, he enjoyed seeing what the other had to say. For a time, even after his mother-in-law died, he kept reading them both. 
But eventually, they got so thin that he got more and more of his news from his computer’s homepage—which was set, out of habit, to MSN.com, a remnant from Microsoft’s heyday, which aggregated news stories from around the Web. “I used to read a lot more newsmagazines than I do now,” he added. But many of them—Newsweek, U.S. News—scarcely existed anymore. Most of all, Efaw said, he didn’t know what sources to believe anymore. He flicked back and forth between CNN and Fox but remained chronically unsatisfied. “So much of it is politically biased,” he said. It was exhausting to be so alert to deception. “You’ve got to look around you and think, Are they telling you the truth or not?” Of all the sentiments I heard as I ventured from place to place in those years, that was the most frequent—a constant, defensive anxiety that people felt unsure whom to believe. Once, in an unintended fashion, I found myself at the receiving end of that paranoia. At home on a Sunday in 2019, I went looking for the iPad that we used for watching movies—and for distracting kids on car trips. Ollie, who was three, loved the iPad, no surprise, and whenever he got ahold of it, he entered random numbers into the passcode screen to try to unlock it. When I found it that afternoon, Ollie had made an especially diligent run at it. It wasn’t clear how many codes he had tried, but by the time he gave up, the screen said “iPad is disabled, try again in 25,536,442 minutes.” That worked out to about forty-eight years. I took a picture of it with my phone, wrote a tweet asking if anyone knew how to fix it, and went downstairs to dinner. I didn’t think much about the iPad again until the next morning, when I received an email from the news division of CTV, a Canadian television network, asking for information about “the locked iPad.” Three minutes later, another email arrived, this time from the Daily Mail Online, in London. I ignored them. Soon I was hearing from CNN and USA Today. 
A British friend sent a text of condolences on the locked iPad and a screenshot of another article headlined “LOCK SHOCK: Baffled dad locked out of iPad for 25 million minutes after son, 3, tried to guess password.” (It was featured beside pregnancy photos of Meghan Markle.) When I went online, I discovered that the tweet had taken flight and generated thousands of reactions. Some people were scolding: “I wonder why a 3yo is in reach of an iPad. Deserved this tbh.” Others were heartfelt: “Obviously, your offspring has the dogged, unfailing persistence required for a future career in a research field.” But I was intrigued, above all, by a subset of readers who had scoured the photo like the Zapruder film and pronounced it a conspiracy or a fraud: “Its display is not Retina and the wallpaper is from the first iOS series,” someone wrote. “Great work of deceiving people!” I contacted the guy who wrote that—a Pakistani teacher named Khalid Syed—and he was happy to chat. “Sorry for my terrible English,” he said. He and friends had seen the story about my iPad on CNN and suspected it was corporate dark arts by rivals of Apple. Or maybe, he suspected, “you want to be popular on Twitter.” He said, “People are so after money. And can do anything to get money. I have seen it.” The emails and tweets kept arriving for several more days. The more I read, the more they reminded me of what Hannah Arendt called a “peculiar kind of cynicism” that settles into societies that allow the “consistent and total substitution of lies for factual truth.” To get through the day, she wrote, people eventually embrace “the absolute refusal to believe the truth of anything.” I had first jotted that line down a few years earlier, to make sense of my life in China. I had not expected it would be relevant once I came home. Often, when I talked to Reese Clark, he had a theory that he wanted to test out on me. The theories varied in reliability. 
At the most exotic end, he suspected the government was planning to impose martial law, beginning in Black neighborhoods. He pointed out the pattern of traffic circles that were common in residential enclaves on the South Side. Those, he said, were not intended to prevent cars from speeding; they were “checkpoints” in the event that the military ever moved into the neighborhoods. “And they made the expressway wide as hell—for the tanks,” he said. Most of the time, though, his anxieties seemed to reflect an effort to find a logic in the cruelties and frustrations around him. He suspected the federal government funneled guns into the Black community in order to keep people fighting one another. “We’ve got military-grade bullets. How did we get them?” He had a memory from his drug-dealing days when an older white addict beckoned him into a garage to look inside a wooden crate. “It had hay in it, and it had damn AK-47s. Military shit. And, I said, ‘Hell no, I’m gone. I don’t want no part of it.’” He had come to believe that politicians and police allowed Black neighborhoods to deteriorate on purpose; if people left, he figured, it would be easier to gentrify and profit from the real estate. “They’re trying to get Blacks out,” he said. “How do you get ’em out? You close down stores.” He added, “It’s not about being Black or white no more. It’s the haves and the have-nots.” Most of all, he said, he was convinced that Donald Trump had only won in 2016 by cheating. “The voting was rigged,” he told me. “I was watching the vote. They had the map of the United States, and they showed the blue parts, how much the Democrats won. Then they went to commercial, and when they came back, more of the states were red, and they said Trump won the motherfucking White House!” At its core, Trump’s project transcended simply dismissing the “reality-based community.” His administration was undermining the notion of verifiable reality itself. 
After less than twenty-four hours in office, his spokesman Sean Spicer accused the media of deliberately underestimating the size of the crowd for Trump’s inauguration. He called it the “largest audience to ever witness an inauguration—period—both in person and around the globe.” When Spicer was mocked and condemned, Kellyanne Conway went on television and serenely defended Spicer’s rant as an example of “alternative facts.” Her term produced a chorus of ridicule, but she was undeterred; in a radio interview, she framed the criticism as a condescending hang-up of the liberal elite. “Americans are their own fact checkers,” she said. “People know, they have their own facts and figures, in terms of meaning which facts and figures are important to them.” Within a month, Trump had declared the press the “enemy of the people,” and that attack became a standard piece of his presentations. By the time Trump was performing before the crowd in Huntington that summer, his supporters had already been conditioned to expect that he rejected the facts that others told them to believe. The following year, he made it explicit. In a moment that could pass as mock-Orwell, he told a crowd, “Just remember: what you’re seeing and what you’re reading is not what’s happening.” (By the end of his term, he had made 30,573 false or misleading claims, according to The Washington Post.) But the effect of his deceptions was not just an assault on knowledge. In ways that were only becoming visible over time, he was dismantling not only the concept of a common truth but also the notion of a truly shared world. He was rejecting the very notion of an empirical commons—the idea that anything could be free from the abuses and cynicism of politics. For a population already atomized and disillusioned by false promises, subjected to relentless reminders that government was not “here to help,” it was its own call to arms. 
Trump was giving them not only permission to ignore the facts that seemed to be hoarded and treasured by the meritocratic elite; he was offering them an exhilarating antidote to loneliness—a new sense of solidarity defined, above all, by doubt. To govern, he relied on the manufacturing of doubt, titillation, and seductive fictions—a politics by peek-a-boo. In its clearest formulation, Steve Bannon, the former head of Breitbart News and chief strategist for Trump, told the writer Michael Lewis, “We got elected on Drain the Swamp, Lock Her Up, Build a Wall.” He said, “This was pure anger. Anger and fear is what gets people to the polls.” Bannon added, “The Democrats don’t matter. The real opposition is the media. And the way to deal with them is to flood the zone with shit.” The forces arrayed on the side of truth were ludicrously outmatched. In March 2018, Google pledged $300 million over three years to “help journalism thrive in the digital age.” The company planned to train journalists in artificial intelligence and other technologies, and to help publishers enhance their digital products. It was a worthy initiative, though the impact was slight; the investment equated to less than the company earned in profits in a week. At the same time that Americans were becoming less mobile—socially and geographically—they were being liberated from the old boundaries of information. Ideas—at their best and worst—were becoming only more mobile, racing around the country without the friction once imposed by gatekeepers such as John Miller in Clarksburg. The arbiters who used to weigh what was verified and important enough to merit attention, and what deserved to be shunted to the margins, had lost their mandate. Americans were voting in smaller and smaller numbers for mayors and county commissioners while adopting fiercer and more strident positions on issues thousands of miles away, from transgender bathrooms in North Carolina to gas pipelines in North Dakota. 
The extraordinary political and economic changes of the last half century had exerted stresses on what people believed. In 1968, Peter Drucker, the American management consultant, predicted an “age of discontinuity” as globalization and technology made some jobs extinct and created new surges of wealth. In 1992, the political scientist Francis Fukuyama—who was often too quickly cast by critics as a triumphalist—warned that, after the Cold War, people in the West might well “struggle for the sake of struggle. They will struggle, in other words, out of a certain boredom: for they cannot imagine living in a world without struggle,” he wrote. “And if the greater part of the world in which they live is characterized by peaceful and prosperous liberal democracy, then they will struggle against that peace and prosperity, and against democracy.” There was a prescient truth in it. Trump, the Tea Party, the NRA—they all made use of that rising unease of Americans who could not quite put a name to the anxieties they felt about the disordering of their world, about the puncturing of American invincibility, the browning of America, the vanishing of jobs to automation, the stagnation of their incomes. The language of force gained ground. Sarah Palin, in her appearances at Tea Party rallies and online, made frequent use of metaphors from the Revolutionary War and the world of guns. “Don’t retreat, reload,” she liked to say. By the end of the Obama years, Americans were ideologically restless. Socialism was growing on the left, especially among young people who had been disillusioned by the Democratic Party. They had watched bankers escape punishment after the financial crisis, even as their own prospects for wealth and safety and opportunity declined. “You can make it if you try,” Obama liked to say, but it was becoming ever harder to see that as a fact. On the right, meanwhile, nativism was growing. 
For those who were already stewing in economic or racial resentment, it had to do less with ideology than with a rootlessness of the mind—a loss of purpose, inspiration, and community. For people who felt excluded from the commanding heights of American life, it was tempting to hunt for explanations in conspiracies and superstitions, even when they bordered on the supernatural. By November 2016, the demonization of Clinton and her advisers was so intense that even the most outlandish allegations about pedophilia, murder, and the occult found an audience. On Reddit, 4chan, and other forums, people hunted through emails leaked from the account of her campaign chairman, John Podesta. In their search for anything sinister, self-styled sleuths conjured up the suggestion of code words and hidden meanings: “pizza” could be code for child pornography; “pasta” meant little boy; “sauce” was an orgy. The delusions gained attention when they were amplified, on Twitter, by prominent figures such as retired general Michael Flynn, Trump’s designated national security advisor. An especially deranged thread on Reddit tied it together into a theory that Podesta was a pedophile running a child sex-trafficking ring with Clinton from the basement of a pizzeria in Washington called Comet Ping Pong. (Never mind that the building had no basement.) An anonymous poster wrote, “Everyone associated with the business is making semi-overt, semi-tongue-in-cheek, and semi-sarcastic inferences towards sex with minors.” “Pizzagate,” as it became known, seemed to crest on a Sunday afternoon a month after the election, when an armed twenty-eight-year-old believer who had binge-watched YouTube videos about Pizzagate walked into Comet Ping Pong on a mission to save children. As people fled the building, he fired several rounds from his AR-15 rifle into a closet full of office equipment, hunting for a child-sex dungeon. 
After he surrendered to police, he told them he had come from his home in rural North Carolina to “self-investigate.” Over the next two years, the Pizzagate delusion continued to morph and spread like a pathogen, feeding into new conspiracy theories that fed even larger communities of belief. The most popular was QAnon, which emerged around a set of anonymous Web posts purported to be the work of a Trump loyalist, a U.S. government official with a high-level, or “Q-level,” security clearance. The QAnon posts drew people in by unspooling gnomic clues suggesting the existence of a hidden cabal of Satan-worshipping, cannibalistic pedophiles plotting against Trump. It spread from fringe message boards to mainstream platforms, and on Facebook it eventually found millions of followers across thousands of groups and pages. Trump encouraged the fantasy—retweeting messages about it and praising its believers as “people who love our country.” Anne Applebaum, an American author and journalist who lived in Poland, had watched a wave of delusions sweep through Polish politics, and she came to recognize that each delusion “offered a new reason to distrust the politicians, businesspeople, and intellectuals,” an explanation for hatred of the elites, she wrote. “It explains away complex phenomena, accounts for chance and accidents, offers the believer the satisfying sense of having special, privileged access to the truth.” Often, it didn’t matter how acquainted with the real world of politics you were; the satisfactions of the fantasy were powerful. In 2018, Kelly Johnston, a former secretary of the Senate who had helped oversee the day-to-day work of the chamber, adopted the fantasy that George Soros’s Open Society Foundations was secretly organizing the caravans of migrants that Trump had put at the center of his fearmongering. 
Johnston, who had become the vice president of government affairs for the Campbell Soup Company, tweeted photos of migrants in Mexico, which he appended with his imaginings about Soros: “See those vans on the right? What you don’t see are the troop carriers and the rail cars taking them north.” (Campbell Soup disavowed his comments, and he exited the company.) The more Trump struggled to control the government, the more he leaned on the register of force. He stoked racial hostility, white identity politics, and fantastical fears of marauding immigrants. Technology, of course, was linking right-wing populist believers together across vast distances. In economically struggling parts of the country, people were using technology not only to amplify their discontent but also to forge a sense of solidarity, to see validation in one another—a sign that they were not suffering alone. It allowed them to feel a sense of shared passion and fears and culture in ways that the nation’s founders had not predicted. In Federalist No. 10, James Madison had argued that the scale of the United States would make it difficult for any faction to dominate all others. Earlier movements had made use of technology as well. After World War II, the civil rights movement that germinated in the rural South achieved broad national impact through the print and broadcast media, which attracted northern activists and eventually pressured politicians to respond. Later, the conservative movement used its own media networks, in newsletters, talk radio, and cable television. For Trump, the movement was growing on yet another generation of technology—Twitter, Facebook, 4chan, Reddit. Watching people slip deeper into Trump’s fantasies and conspiracy theories, I was struck by how much it reminded me of my years in China— where people were sometimes desperate to find causes that inspired them. 
Once when I wrote about a surge of toxic nationalism there, an observant young writer and translator named Lu Han told me, “Growing up in China, there are very few chances for you to feel like that—to be lifted spiritually, to be working on something bigger than yourself, more important than your immediate, ordinary life circle.” Now I was seeing it around me in America. Boredom, in Fukuyama’s sense, was not a lack of stimulation; it was a grasping for meaning and recognition. In 2019, a team of political scientists who study the flow of information online discovered that an obscure segment of the American electorate was rapidly gaining influence through their use of social media. The scholars—Michael Bang Petersen, Kevin Arceneaux, and Mathias Osmundsen—called those users “marginalized status-seekers”; typically, they were male, languishing in jobs that they considered beneath them, and acutely sensitive to slights or condescension from elites and political celebrities. With few other ways to have an impact, they adopted what the researchers called a “strategy of last resort”—amplifying the most outrageous and incendiary information they could find: “conspiracy theories, fake news, discussions of political scandals and negative campaigns.” They had no ideology to advance; they were not sharing rumors because they actively believed them; it was, the authors wrote, “simply a tool to create havoc.” They were the superspreaders behind the madness of the birther faction, of Pizzagate, of Alex Jones’s delusions that the Sandy Hook Elementary School shooting was staged to promote gun control. In surveys, they asked people if they agreed with apocalyptic statements such as “When I think about our political and social institutions, I cannot help thinking ‘just let them all burn.’” They were startled to find that 40 percent of their respondents agreed. 
For the moment, the authors interpreted these “chaotic motivations” not as a sign that Americans were preparing for “actual fights with the police or to commit other forms of political violence.” Instead, they wrote, it was a window into the “thoughts and behaviors that people are motivated to entertain when they sit alone (and lonely) in front of the computer, answering surveys or surfing social media platforms.” Part of America’s predicament was that its political parties magnified the intensity of factions, rather than serving to negotiate their differences toward a compromise. Ideally, parties pull people into blocs that help bridge their racial, religious, and professional differences; they provide an alternative collective identity. But America’s parties were doing precisely the opposite: they compounded and amplified the differences. In the five years since McConnell had used the bumper sticker “COAL. GUNS. FREEDOM,” the identities had become even more distinct. The latest popular T-shirt on sale at rallies declared: “I support Donald Trump. I love freedom. I drink beer. I turn wrenches. I protect my family. I eat meat & I own guns. If you don’t like it, MOVE.” At the Fund for Peace, a think tank in Washington, researchers ranked the political “cohesion” of various countries between 2008 and 2018; they measured the entrenchment of factions, trust in the security forces, and the level of popular discontent. The United States recorded the largest drop in cohesion among any of the countries studied, including Libya, Mali, and Bahrain. In a paper presented in 2018, the political scientists Nathan Kalmoe and Lilliana Mason found that 15 percent of Republicans and 20 percent of Democrats believed that the United States would be better off if large numbers of the opposing party “just died.” The culture of political warfare was about more than guns or fringe conspiracy theories. 
It was a mutant version of a mainstream ethos: a survival mindset derived from a sense of zero-sum contests, in which only one side can prevail. The weaker the public felt, the more they grasped for gestures of force; as in Freeman’s portrait of antebellum violence, Americans were coming to believe that they could no longer afford to abide by the old norms. Freeman told me that violence was filling a void left by America’s eroded democracy: “The current moment has reams of people who feel unheard and unrepresented amidst multiple crises, people who have been stewing in that gripe for years. They sense that the tides of demographics and culture are turning against them.” She continued, “Cloak that in the rhetoric of democracy, and it has a real appeal.” On the left, the sense of an existential showdown was finding its own acute form. In the years since Trump entered politics, far-left vigilantes, operating under the loose label of Antifa, for “anti-fascist,” adopted confrontational tactics inspired by the European anarchist tradition. The term originated in the 1930s, when German leftists brawled with Nazis in the streets; in the 1980s, members of the British punk scene tried to purge racism and hypernationalism from their ranks, sometimes with street-level violence. In America, Antifa protesters, often wearing black clothing and bandannas or masks over their faces, became visible on the edges of protests against Trump and white nationalism. Some employed violence in the belief that it was “preemptive” self-defense. Antifa gained wider public attention on Trump’s inauguration day, when the white nationalist Richard Spencer was giving a television interview on a street corner and a masked protester, dressed in black, punched him in the head. Republicans and commentators took to condemning Antifa as a symbol of leftist excess and chaos, and a justification for tougher police tactics. By the fall of 2018, the tempo of political violence was at a turning point. 
On both coasts, members of the Proud Boys, a self-described violent “Western chauvinist” group, fought in the streets with activists who identified with Antifa or Black Lives Matter. In Portland and New York City, police broke up melees, and some of Trump’s allies expanded their preoccupation with violence in Chicago to blame it on their political opponents. Jeff Sessions, the attorney general, said, “If you want more shootings, more death, then listen to the ACLU, Antifa, Black Lives Matter, and groups who do not know the reality of policing.” The language of mortal confrontation was permeating politics so thoroughly that it no longer attracted notice. More than a year after white supremacists in Charlottesville chanted “You will not replace us,” that message had been taken up by mainstream conservative commentators. In December 2018, Tucker Carlson told his audience, “It’s like, shut up, you’re dying. We’re going to replace you.” On Fox, Ann Coulter said, “You can’t shoot Americans. You can shoot invaders.” On August 3, 2019, on 8chan, a far-right forum, a commenter named Patrick Crusius combined those ideas and committed himself to “defending my country from cultural and ethnic replacement brought on by an invasion.” Several minutes later, he walked into a Walmart in El Paso and killed twenty-three people. In a single week in October 2018: A gunman in Louisville killed two Black senior citizens in a grocery store, telling a bystander “whites don’t shoot whites.” A gunman in Pittsburgh killed eleven people at a synagogue, the deadliest attack on Jews in American history. And, in Florida, a man who lived in a van plastered with Trump signs sent pipe bombs to a dozen people, including Soros and two former presidents. (Reporters later discovered that his house had been foreclosed in 2009 by a bank whose principal owner and chairman was Trump’s treasury secretary, Steven Mnuchin.) 
At the end of that bloody seven-day span, even Trump seemed to sense that he had unleashed forces that could, in an instant, burn beyond his control. At a nighttime rally in Murphysboro, Illinois, a small city in the rural southern reaches of the state, he said, “If you don’t mind, I’m going to tone it down—just a little.” But it was too late. The crowd roared back with a resounding “No!”
