After dragging its feet for three years, the Biden administration finally seems to be applying pressure to get the DEA to move
Is it opening the door to a new era? The Biden administration seems to have suddenly decided to follow up on its 2020 campaign promises. But does the sense of urgency reflect not only a need to engage younger voters but something else? Is the administration racing to reschedule by 4/20? President Biden brought up federal rescheduling as part of his proclamation declaring April to be “Second Chance Month.” Coming on the heels of his mention of it in the State of the Union, this should be a signal to the Drug Enforcement Administration (DEA) to act on the rescheduling recommendation it received from federal health officials.
Having made the promise, Biden barely acknowledged the cannabis industry for almost the first three years of his tenure. This despite the fact that industry sales continue to grow and over 50% of the country now has legal access to products. Those under 40 have an entirely different take on marijuana, with Gen Z drifting away from alcohol and moving to weed. Beer sales have mirrored the societal shift, while the administration has remained out of step with the public.
Biden is struggling with younger voters. Media outlets like the New York Times have been piling on, highlighting why he is losing them and gradually making it a much bigger issue. The campaign is concerned and seeks to reengage this demographic. Biden is viewed favorably by only 31% of people ages 18 through 29, much worse than he fares with other age groups, according to a recent Economist/YouGov poll.
The White House begrudgingly started the rescheduling process last year. Currently, cannabis is classified as having zero medical benefit and is lumped into the same category as heroin and LSD. Neither alcohol nor tobacco is boxed into this category, despite having zero health benefits and causing a litany of problems.
Rescheduling would be an immediate benefit to an industry struggling with a host of issues, including tough business rules tied to the classification, chaos in the New York and California markets, and dropping flower prices. Some older senators, including James Risch (R-ID) and Pete Ricketts (R-NE), are making a last-ditch effort to stop the process.
While 4/20 has long been a wink-wink nod to marijuana use for those in the know, thanks to legal sales it is now another big media day. Like the 4th of July or Drinksgiving/Green Wednesday, it is a time when cannabis brands can get the most engagement with the public.
Why would the most notoriously cash-strapped man in America waste money on frivolous lawsuits?
On Monday, Donald Trump—whose lawyers recently announced that he can’t come up with the money to post a $454 million bond in his civil fraud case—fired off yet another suit against a news organization that reported facts he didn’t like. The targets this time are ABC News and its anchor George Stephanopoulos, who Trump alleges defamed him by stating that Trump had been found liable for raping E. Jean Carroll.
The case looks like a sure loser. Trump was technically found liable under New York law for sexual abuse, not for rape, but the judge in the civil case ruled that, by forcibly penetrating Carroll’s vagina with his fingers, “Mr. Trump in fact did ‘rape’ Ms. Carroll as that term commonly is used and understood.” But no matter. The Stephanopoulos suit slots into a well-worn groove for Trump, who for years has lodged periodic lawsuits against alleged purveyors of “fake news” about him. Targets have included The Washington Post, The New York Times, CNN, Bob Woodward, and a Wisconsin TV station that ran an attack ad against him during the 2020 campaign. Trump has even gone after the board of the Pulitzer Prizes for awarding Pulitzers to the Post and the Times for their coverage of his connections to Russia.
Filing these suits has been costly for Trump—or rather, for donors to his campaign and affiliated political action committees, who have footed millions of dollars in legal fees. Not one of Trump’s media lawsuits has ever succeeded, nor is one ever likely to, given both the underlying facts and the towering bar a president or former president faces in proving defamation. In one case against The New York Times, a judge found Trump’s argument so flimsy that he ordered Trump to pay the Times’ legal fees. In other cases, such as the one involving the Wisconsin station, the suit was quietly withdrawn a few months after it was filed.
So why does he keep doing it? On a basic level, this appears to be just Trump being Trump—peevish, headstrong, and narcissistic. For decades, his love-hate relationship with reporters has tended to flare into legal action, as it did in 2006 when he sued the writer Tim O’Brien over a few pages in a book that questioned Trump’s personal wealth. As Trump told me in an interview in 2016, he knew he couldn’t win that suit (he didn’t) but brought it anyway to score a few points. “I spent a couple of bucks on legal fees, and [O’Brien’s publisher] spent a whole lot more,” he said then. “I did it to make his life miserable, which I’m happy about.”
But Trump’s quixotic legal crusades are not as irrational as they appear. Suing the press serves as a branding exercise and a fundraising tool. The lawsuits show his supporters that Trump is taking the fight to those lying journalists—so won’t you contribute a few dollars to the cause? They thus have become an end unto themselves, part of an infinite loop: sue, publicize the suit, solicit and collect donations, sue again. The cases may be weak on the legal merits, but they “further his narrative of being persecuted by the radical left media,” Brett Kappel, a campaign-finance lawyer who has researched Trump’s legal actions against the press, told me.
This narrative has been a fixture of Trump’s fundraising pitches for years. A few weeks after his inauguration, in 2017, one of his fundraising committees sent out an email urging donors “to do your part to fight back against the media’s attacks and deceptions” by sending contributions that would help “cut through the noise” of news reports. Even before Trump filed a lawsuit against CNN in August 2022 (for describing his election lies as “the Big Lie”), his campaign was using the nonexistent suit to drum up contributions. “I’m calling on my best and most dedicated supporters to add their names to stand with me in my impending LAWSUIT against Fake News CNN,” read a fundraising email. A second email sent out under Trump’s name a few hours later struck a sterner tone: “I’m going to look over the names of the first 45 Patriots who added their names to publicly stand with their President AGAINST CNN.”
When Trump got around to filing the suit two months later, the appeals began anew. “I am SUING the Corrupt News Network (CNN) for DEFAMING and SLANDERING my name,” the campaign email read, in a chaotic typographical style reminiscent of a ransom note. “They’ve called me a LIAR, and so far, I’ve been proven RIGHT about EVERYTHING. Remember, when they come after ME, they are really coming after YOU … I’m calling on YOU to rush in a donation of ANY AMOUNT RIGHT NOW to make a statement that you PROUDLY stand with me.” The suit was dismissed last year by a federal judge appointed by Trump. Trump is appealing.
Of course, the cost of suing news organizations is a pittance compared with what Trump’s donors are spending on his criminal defense. But it isn’t cheap. According to Federal Election Commission records culled by Kappel, the Trump-controlled Save America PAC shelled out nearly $500,000 to the firm that sued the Pulitzer Prize board on Trump’s behalf in 2022. It paid $211,000 last year to another law firm that handled Trump’s litigation against CNN, among other matters, and an additional $203,000 to the firm handling the appeal.
The biggest recipient, by far, has been the attorney Charles Harder, the defamation specialist who represented Hulk Hogan in his successful suit against Gawker Media in 2016. From early 2018 to May 2021, according to FEC records, Harder took $4.4 million in fees from Trump-affiliated organizations. At one point in 2020, Harder’s Beverly Hills firm received more money than any other firm doing work for Trump.
Harder’s work on Trump’s behalf didn’t produce anything close to his career-making Hogan verdict, which resulted in a $140 million award that drove Gawker into bankruptcy. Harder took the lead in Trump’s effort to suppress publication of Michael Wolff’s book Fire and Fury in 2018; he sent cease-and-desist letters to Wolff and his publisher, Henry Holt and Co., before the book’s release, claiming that it contained libelous passages. The book was released as scheduled and became a best seller, and Trump didn’t sue. In 2020, Harder handled Trump’s lawsuit against the Times, alleging that an opinion piece by the former Times editor Max Frankel was defamatory. A judge dismissed that suit in 2021. (Harder, who no longer represents Trump, declined to comment for this story.)
Whether Trump’s beat-the-press strategy is a net financial winner, once all the donations are collected and the attorney fees are subtracted, is hard to say. But Trump’s filing of another hopeless lawsuit this week suggests that the math may be in his favor. Why bother paying lawyers millions of dollars to sue and appeal if the return on investment is less than zero? Trump may be petty and irrational, but he has never been accused of neglecting his own financial interests. (A Trump spokesperson didn’t return a request for comment.)
At the moment, of course, Trump has much bigger headaches. As of this writing, he’s days away from having his assets seized to satisfy that civil-fraud judgment. His overall fundraising has lagged President Joe Biden’s. And he is burning through his supporters’ money to pay for his criminal defense. Despite all that, he still finds a way to keep filing lawsuits against the media. You almost have to admire the commitment.
COLLEGE POINT, Queens (WABC) — Pro-Palestinian protesters swarmed the New York Times printing facility in Queens, one of the largest facilities in the nation.
Some popular newspapers will likely be delivered on a delay Thursday morning due to the commotion at the facility.
Police say that at around 1 a.m. Thursday, protesters prevented trucks from accessing the 300,000-square-foot building by blocking the roads with debris.
Many lay down in a chain, connecting to each other with tubes. They held signs that read, “Stop the presses. Free Palestine” and “Consent for genocide is manufactured here.”
This facility is responsible for printing the New York Times, USA Today, Wall Street Journal, Newsday, and the New York Post. There are 27 printing facilities across the country.
Law enforcement was called to clear the protesters. No arrests were made.
The trucks eventually gained access to the building.
What was the worst moment for the American economy in the past half century? You might think it was the last wheezing months of the 1970s, when oil prices more than doubled, inflation reached double digits, and the U.S. sank into its second recession of the decade. Or the 2008 financial collapse and Great Recession. Or perhaps it was when COVID hit and millions of people abruptly lost their jobs. All good guesses—and all wrong, if surveys of the American public are to be believed. According to the University of Michigan Surveys of Consumers, the most widely cited measure of consumer sentiment, that moment was actually June 2022.
Inflation hit 9 percent that month, and no one knew if it would go higher still. A recession seemed imminent. Objectively, it’s hard to claim that the economy was in worse shape that month than it had been at those other cataclysmic times. But substantial pessimism was nonetheless explicable.
Over the next 18 months, however, the economy improved rapidly, and in nearly every way: Inflation plummeted to near its pre-pandemic level, unemployment reached historic lows, GDP boomed, and wages rose. The turnaround, by most standard economic measures, was unprecedented. Yet the American people continued to give the economy the kind of approval ratings traditionally reserved for used-car salesmen. Last June, the White House launched a campaign to celebrate “Bidenomics”—the administration’s strong job-creation record and big investments in manufacturing and clean energy. The effort flopped so badly that, within months, Democrats were begging the president to abandon it altogether.
More recently, consumer sentiment has improved. After falling for months, it suddenly rebounded in December and January, posting its largest two-month gain in more than 30 years—even though the economy itself barely changed at all. Yet as of this writing, sentiment remains low by historical standards—nothing like the sunny outlook that prevailed before the pandemic.
What’s going on? The question involves the psychology of money—and of politics. Its answer will shape the outcome of the presidential election in November.
The toll of inflation on the American psyche is undoubtedly part of the story. That people hate high inflation is not a novel observation: The Federal Reserve has long been obsessed with preventing another ’70s-style inflationary spiral; its patron saint is Paul Volcker, the former Fed chair who famously broke that spiral by jacking up interest rates, which plunged the economy into a recession. But although experts and political leaders know that inflation matters, the way they understand the phenomenon is very different from how ordinary people experience it—and that alone may explain why sentiment stayed low for so long, and has only now begun to rise.
When economists talk about inflation, they are often referring to an index of prices meant to represent the goods and services a typical household buys in a year. Each item in the index is weighted by how much is spent on it annually. So, for instance, because the average household spends about a third of its income on housing, the price of housing (an amalgam of rents and home prices) determines a third of the inflation rate. But the goods that people spend the most money on tend to be quite different from those that they pay the most attention to. Consumers are reminded of the price of food every time they visit a supermarket or restaurant, and the price of gas is plastered in giant numbers on every street corner. Also, the purchase of these items can’t be postponed. Things like a new couch or flatscreen TV, in contrast, are purchased so rarely that many people don’t even remember how much they paid for one, let alone how much they cost today.
The irony is that consumers spend a lot more, on average, on expensive, big-ticket items than they do on groceries or takeout, which means the prices we pay the most attention to don’t contribute very much to overall inflation numbers. (Less than a tenth of the average consumer’s budget is spent at the supermarket.) Some measures of inflation—“core” and “supercore” inflation among them—exclude food and energy prices altogether. That is reasonable if you’re a Fed official focused on how to set interest rates, because energy and food prices are often extremely sensitive to temporary fluctuations (caused by, say, a drought that hurts grain harvests or an OPEC oil-supply cut). But in practice, these measures overlook the prices that matter most to consumers.
This dynamic alone goes a long way toward explaining the gap between “the economy” and Americans’ perception of it. Even as core inflation fell below 3 percent over the course of 2023, food prices increased by about 6 percent, twice as fast as they had grown over the previous 20 years. “I think that explains a huge part of the disconnect,” Paul Donovan, the chief economist at UBS Global Wealth Management, told me. “You won’t convince any consumer that inflation is under control when food prices are rising that fast.”
Consumers say as much when you ask them. In a recent poll commissioned by The Atlantic, respondents were asked what factors they consider when deciding how the national economy is doing. The price of groceries led the list, and 60 percent of respondents placed it among their top three—more, even, than the share that chose “inflation.” This isn’t exactly a new development. In 2002, Donovan told me, Italian consumers were convinced that prices were soaring by nearly 20 percent even though actual inflation was a stable 2 percent. It turned out that people were basing their estimates on the cost of a cup of espresso, which had abruptly risen as coffee makers rounded their prices up after the introduction of the euro.
What’s more, most people don’t care about the inflation rate so much as they care about prices themselves. If inflation runs at 10 percent for a year, and then suddenly shrinks to 2 percent, the damage of the past year has not been undone. Prices are still dramatically higher than they were. Overall, prices are nearly 20 percent higher now than they were before the pandemic (grocery prices are 25 percent higher). When asked in a survey last fall what improvement in the economy they would most like to see, 64 percent of respondents said “lower prices on goods, services, and gas.”
What about wages? Even adjusted for inflation, they have been rising since June 2022, and recently surpassed their pre-pandemic levels, meaning that the typical American’s paycheck goes further than it did prior to the inflation spike. But wages haven’t increased faster than food prices. And most people think about wage and price increases very differently. A raise tends to feel like something we’ve earned, Betsey Stevenson, an economist at the University of Michigan, told me. Then we go to the grocery store, and “it feels like those just rewards are being unfairly taken away.”
If inflation is in fact the main reason the American people have been so down on the economy—and its future—then the story is likely to have a happy ending, and soon. My great-grandmother loved to reminisce about the days when a can of Coke cost a nickel. She didn’t, however, believe that the country was on the verge of economic calamity because she now had to spend a dollar or more for the same beverage. Just as surely as people despise price increases, we also get used to them in the end. A recent analysis by Ryan Cummings and Neale Mahoney, two Stanford economists and former policy advisers in the Biden administration, found that it takes 18 to 24 months for lower inflation to fully show up in consumer sentiment. “People eventually adjust,” Mahoney told me. “They just don’t adjust at the rate that statistical agencies produce inflation data.”
Mahoney and Cummings posted their study on December 4, 2023—18 months after inflation peaked in June 2022. As if on cue, consumer sentiment began surging that month. (Perhaps helping matters, food inflation had finally fallen below 3 percent in November 2023.)
There is another story you can tell about consumer sentiment today, however, one that has less to do with what’s happening in grocery stores and more to do with the peculiarities of tribal identity.
It’s well established that partisans on both sides become more negative about the economy when the other party controls the presidency, but this phenomenon is not symmetrical: In a November analysis, Mahoney and Cummings found that when a Democrat occupies the White House, Republicans’ economic outlook declines by more than twice as much as Democrats’ does when the situation is reversed. Consumer-sentiment data from the polling firm Civiqs and the Pew Research Center show that Republicans’ view of the economy has barely budged since hitting an all-time low in the summer of 2022.
Meanwhile, although sentiment among Democrats has recovered to nearly where it stood before inflation began to rise in 2021, it remains well below its level at the end of the Obama administration. It may never return to its previous heights. Over the past decade, the belief that the economy is rigged in favor of the rich and powerful has become central to progressive self-identity. Among Democrats ages 18 to 34, who tend to be more progressive than older Democrats, positive views of capitalism fell from 56 to 40 percent between 2010 and 2019, according to Gallup. Dim views of the broader economic system may be limiting how positively some Democrats feel about the economy, even when one of their own occupies the Oval Office. According to a CNN poll in late January, 63 percent of Democrats ages 45 and older believed that the economy was on the upswing—but only 35 percent of younger Democrats believed the same. To fully embrace the economy’s strength would be to sacrifice part of the modern progressive’s ideological sense of self.
The media may be contributing to economic gloom for people of every political stripe. According to Mahoney, one possible explanation for Republicans’ disproportionate economic negativity when a Democrat is in office is the fact that the news sources many Republicans consume—namely, right-wing media like Fox News—tend to be more brazenly partisan than the sources Democrats consume, which tend to be a balance of mainstream and partisan media. But mainstream media have also gotten more negative about the economy in recent years, regardless of who’s held the presidency. According to a new analysis by the Brookings Institution, from 1988 to 2016, the “sentiment” of economic-news coverage in mainstream newspapers tracked closely with measures such as inflation, employment, and the stock market. Then, during Donald Trump’s presidency, coverage became more negative than the economic fundamentals would have predicted. After Joe Biden took office, the gap widened. Journalists have long focused more on surfacing problems than on highlighting successes—bringing problems to light is an essential part of the job—but the more recent shift could be explained by the same economic pessimism afflicting many young liberals (many newspaper journalists, after all, are liberals themselves). In other words, the media’s negativity could be both a reflection and a source of today’s economic pessimism.
What happens to consumer sentiment in the coming months will depend on how much it is still being dragged down by frustration with higher prices, which will likely dissipate, as opposed to how much it is being limited by a combination of Republican partisanship and Democratic pessimism, which are less likely to change.
Will the place that it finally settles in come November matter to the election? How people say they are feeling about the economy in an election year—alongside more direct measures of economic health, such as GDP growth and disposable income—has in the past been a good predictor of whom voters choose as president; a healthy economy and good sentiment strongly favor the incumbent. Despite all the abnormalities of 2020—a pandemic, national protests, a uniquely polarizing president—economic models that factored in both economic fundamentals and sentiment predicted the result and margin of that year’s presidential election quite accurately (and much more so than polling), according to an analysis by the political scientists John Sides, Chris Tausanovitch, and Lynn Vavreck.
It is of course possible that consumer sentiment is becoming a more performative metric than it used to be—a statement about who you are rather than how you really feel—and perhaps less reliable as a result. Still, the story that voters have in their heads about the economy clearly matters. If that story were influenced solely by the prices at the pump and the grocery store or the number of well-paying jobs, then—absent another crisis—we could expect the mood to be buoyant this fall, significantly helping Biden’s prospects for reelection. But the stories we tell ourselves are shaped by everything from the news we read to the political messages we hear to the identities we adopt. And, for better or worse, those stories have yet to be fully written.
February marks a century since the death of Woodrow Wilson. Of all America’s presidents, none has suffered so rapid and total a reversal of reputation.
Wilson championed—and came to symbolize—progressive reform at home and liberal internationalism abroad. So long as those causes commanded wide support, Wilson’s name resonated with the greats of American history. In our time, however, the American left has subordinated the causes of reform and internationalism to the politics of identity, while the American right has rejected reform and internationalism altogether. Wilson’s standing has been crushed in between.
Wilson’s fellow presidents esteemed him too. Harry Truman wrote, “In many ways, Wilson was the greatest of the greats.” Richard Nixon admired Wilson even more extravagantly. He hung Wilson’s portrait in his Cabinet room, and used as his personal desk an antique that he believed—mistakenly, it turns out—had been used by Wilson.
Yet over the past half decade, Wilson’s name has been scrubbed from schools and memorials across the country. Wilson’s own Princeton, which he elevated from mediocrity to greatness in his eight years as university president, has removed his name from its school of public policy and a dormitory. “We have taken this extraordinary step,” the university announced in June 2020, “because we believe that Wilson’s racist thinking and policies make him an inappropriate namesake for a school whose scholars, students, and alumni must be firmly committed to combatting the scourge of racism in all its forms.”
These acts of obloquy are endorsed across the spectrum of liberal and progressive opinion. The New York Times editorial board had urged the renaming and damned Wilson as “an unrepentant racist.” In his recent history, American Midnight, the eminent liberal writer Adam Hochschild accuses Wilson of culpability for the unjust imprisonment, illegal abuse, and outright murder of trade unionists and anti-war dissenters. Here at The Atlantic, the historian Timothy Naftali described Wilson as “an awful man who presided over an apartheid system in the nation’s capital.”
Unlike other historical figures criticized by American progressives, such as Robert E. Lee and Christopher Columbus, Wilson has found few countervailing defenders among American conservatives. If anything, contemporary conservatives revile Wilson even more than progressives do.
The columnist George Will spices his speeches with a favorite joke about Wilson’s trajectory from the loser in an academic fight at Princeton to the president who “ruined the 20th century.” In his 2007 book, Liberal Fascism, Jonah Goldberg (then an editor at National Review) condemned Wilson as “the twentieth century’s first fascist dictator.” Glenn Beck regularly fulminated against Wilson on his Fox News show in the early 2010s. Beck called Wilson an “evil SOB” and a “dirtbag racist.” He summed up: “I hate this guy. I don’t even want to show his picture.”
Anti-Wilson animus has even swayed the conservative jurists of the U.S. Supreme Court. In 2022, the Court delivered a ruling in West Virginia v. Environmental Protection Agency that dramatically curtailed greenhouse-gas regulations in the United States. To support his concurrence with the decision, Justice Neil Gorsuch devoted a footnote entirely to damning Wilson as an antidemocratic bigot. Wilson was one of the first American scholars to study the emerging administrative state, and conservatives like Gorsuch imagine that if they can discredit him, they can discredit it as well—and doom environmental regulations by association.
Wilson’s bigotries were very real. As a historian, he made the case that freedmen had too hastily been given the franchise following the Civil War. All his life, he accepted a subordinate status for Black Americans. As a politician, he enforced and extended it. In private, he told demeaning jokes in imitated dialect and delighted in minstrel shows. He was said to have praised D. W. Griffith’s film The Birth of a Nation—originally titled The Clansman—as “like writing history with lightning,” though this at least is almost certainly untrue: Wilson viewed the movie in silence, according to a witness at the time. He may have been annoyed because an inter-title within the movie quoted Wilson’s A History of the American People as seeming to praise the Ku Klux Klan. The relevant section had in fact rebuked the Klan for its lawless violence. But Wilson objected only to the Klan’s means, not its ends. He wholeheartedly endorsed the extinguishing of Reconstruction-era reforms by state legislatures and white-dominated courts.
Wilson’s bigotries were shared by his predecessors and immediate successors in the presidency. In his 1909 inaugural address, William Howard Taft repudiated equal voting rights for Black Americans and justified the exclusion of immigrants from China. Taft’s predecessor, Theodore Roosevelt, enthusiastically promoted the pseudoscience of racial hierarchy that placed white Europeans at the top. The segregation of the federal civil service that Wilson’s administration instituted was maintained by the four presidents who followed him: Warren Harding, Calvin Coolidge, Herbert Hoover, and FDR.
My point is not to acquit Wilson of the charges against him, nor to minimize those charges by blaming the times, rather than him. Historical figures are responsible for their beliefs, words, and actions. But if one man is judged the preeminent villain of his era for bigotries that were common among people of his place, time, and rank, that singular fixation demands explanation. Why Wilson rather than Taft or Coolidge?
It is hard to avoid the conclusion that Wilson must be brought low because he stood so high. He is scorned now because of our weakening attachment to what was formerly regarded as good and great.
Here’s the story that once would have been told about Wilson by the liberal-minded.
After winning the presidential election of 1912, Wilson broke four decades of conservative domination of U.S. politics to lead the most dramatic social-reform program since the 1860s.
He and his party’s majority in both houses of Congress lowered the tariffs that had loaded the cost of government onto working people. In place of those high tariffs, Wilson and the Democrats enacted an income tax, a first step toward a more redistributive fiscal policy in the United States—and among the gravest of his sins in the eyes of conservative critics.
They also gave the U.S. a central banking system, the Federal Reserve, to counter the deflationary effect of the gold standard, which often favored lenders at the expense of borrowers. They ensured that the Fed would represent the interests of the public, and not be controlled by large private banks, as many Republicans of the day preferred. They introduced the first federal regulation of wages and hours in the United States. Wilson and his congressional majority passed laws against abusive corporate practices and created the Federal Trade Commission to enforce those laws.
Wilson supported women’s suffrage during his presidency. He opposed alcohol prohibition, albeit with less success. He twice vetoed literacy tests for immigrants, which were an early harbinger of the ethnically discriminatory immigration restrictions of the 1920s. He nominated the first Jew to serve on the Supreme Court, Louis Brandeis. (Earlier, as governor of New Jersey, Wilson had also appointed the first Jew to that state’s supreme court.) After the U.S. entered the First World War, Wilson’s administration nationalized the country’s railway system. It simplified the route network, streamlined operations, and improved pay and working conditions in the huge and crucial industry—then rapidly returned the rails to private ownership.
Wilson’s most impressive innovations came in the realm of foreign affairs. He granted substantial autonomy to the Philippines, America’s largest colonial possession, and opened a path to full independence. Wilson negotiated payment to Colombia for the loss of Panama in a revolution that had been fomented by Theodore Roosevelt. He resisted military intervention in the Mexican Revolution, and he tried to mediate a negotiated end to World War I. When at last forced into that war, Wilson sought a generous and enduring peace for all of the combatants. He put his hopes in the League of Nations; even if that project largely failed, it paved the way for the more successful forms of collective security created after 1945. Sumner Welles, perhaps FDR’s most trusted foreign-policy adviser, wrote in 1944 that Wilson’s vision of world order had excited his own generation “to the depths of our intellectual and emotional being.”
Even at the zenith of Wilson’s repute, his most sophisticated admirers attached important caveats to their story. Wilson had wanted to stay out of the war in Europe. He failed. He then tried to negotiate peace. He failed again. His commitment to self-determination did not apply to the small countries of this hemisphere: A U.S. intervention he ordered in Haiti in 1915 extended into a 20-year occupation.
Wilson’s admirers also could not deny that each of those failures was in great part his own fault. In his earlier academic writings, Wilson had praised compromise and concession. As president, his early concessions to white southerners cost him the support of some northern African Americans who had flipped from the Republican Party to back him in 1912. One of those who endorsed Wilson was W. E. B. Du Bois. The next year, Du Bois lamented his decision in an editorial for The Crisis, the magazine of the NAACP: “Not a single act and not a single word of yours since election has given anyone reason to infer that you have the slightest interest in the colored people or desire to alleviate their intolerable position.” Wilson met with disillusioned Black former supporters once in 1913, then again in 1914. That second meeting ended in a rare eruption of Wilson’s temper. He ordered his visitors out of his office and never received them again. As he settled into the presidency, Wilson became more rigid, more convinced of his own righteousness and his adversaries’ wickedness.
Wilson’s offenses multiplied after a disabling stroke in 1919. He clung to office, barely able to move or communicate, his condition concealed by his wife and his doctor. (The Twenty-Fifth Amendment, ratified in 1967, offered a solution to the Wilson problem—a president who cannot do his job but will not resign.) Many of the darkest acts of his administration occurred during this period of feebleness: mass deportations of foreign-born political radicals; passivity in the face of the murderous anti-Black pogroms that flared across America’s big cities; a de facto granting of permission to the most repressive and reactionary tendencies in U.S. society.
In the era of liberal academic hegemony, historians sought to weigh Wilson’s errors and misdeeds against his administration’s accomplishments, reaching a range of conclusions. But that era has closed. We live now in a more polarized time, one of ideological extremes on both left and right. Learned Hand, a celebrated federal judge of Wilson’s era, praised “the spirit which is not too sure that it is right.” Our contemporaries have exorcised that spirit. We are very sure that we are right. We have little tolerance for anyone who seems in any degree wrong.
In our zeal, we refuse to understand past generations as they understood themselves. We expect them to have organized their mental categories the way we organize ours—and we are greatly disappointed when we discover that they did not.
Today, we tend to think of economic and racial egalitarianism as closely yoked causes. One hundred years ago, this was far from the case. In the late 19th and early 20th centuries, many of those Americans most skeptical of corporate power were also the most hostile to racial equality, while those Americans who most adamantly rejected economic reform hoped to mobilize racial minorities as allies.
The leading proponent of racial segregation in Wilson’s administration was his postmaster general, a Texan named Albert Sidney Burleson. Before 1913, about 4,000 of the Post Office’s more than 200,000 employees were Black. Burleson dismissed Black postmasters across the South. At postal headquarters, in Washington, D.C., he grouped the facility’s seven Black clerks together and screened them off from white employees. Burleson segregated dining rooms and bathrooms too. When the U.S. declared war against Germany, Burleson used his powers to bar dissenting magazines and newspapers from the mail, which for most small periodicals was their only way to reach their audiences—no hearings, no appeals, just his whim and will.
From this sorry history, you might infer that Burleson was an all-around reactionary. But no.
Elected to the U.S. House of Representatives in 1898, Burleson immediately showed himself to be a progressive and a reformer. He fiercely opposed the use of federal injunctions against striking trade unionists. He advocated for lower tariffs and a redistributive income tax. He rejected the gold standard. Burleson and his wife, Adele, were ardent proponents of women’s suffrage in the state of Texas. One of their daughters, Laura, was elected to the Texas legislature in 1928, only the fourth woman to reach that chamber.
The seeming contradiction between Burleson the white supremacist and Burleson the social reformer recurred again and again in Wilson’s administration. Wilson’s Navy secretary, Josephus Daniels, was an even more virulent racist than Burleson. As a newspaper editor in Raleigh, Daniels incited the 1898 insurrection that crushed the vestiges of Black political rights in North Carolina. Daniels supported railroad regulation and greater investment in public education. FDR would later appoint him ambassador to Mexico. In that post, Daniels opposed U.S. action to undo the Mexican nationalization of the oil industry and sympathized with the anti-Franco side of the Spanish Civil War.
The disconnect between race and reform operated in reverse, too. Wilson’s most effective and hated political rival was Henry Cabot Lodge, the leader of the Senate Republicans after 1918. Lodge was in most respects deeply conservative: a champion of corporate prerogatives, the gold standard, and high tariffs. Lodge, an enthusiastic imperialist, had called for the annexation of the Philippines and Puerto Rico. Lodge despised and distrusted the new immigrants from Eastern and Southern Europe. When 11 Italian immigrants were lynched in New Orleans in 1891, he published an article justifying and excusing the crime. Yet Lodge was also the author and lead sponsor of an important 1890 House bill to protect Black voting rights in the South, the last such effort in Congress until the modern civil-rights era.
In the time of Woodrow Wilson, issues and ideas were clustered very differently from today. Champions of Black political rights could display bitter animosity toward Catholic immigrants. Many exponents of women’s suffrage also held racist views. Some defenders of labor rights also supported bans on teaching evolution. Heroes of free academic inquiry were fascinated by the project of eugenics. Early advocates of sexual autonomy were attracted to fascism or communism or—as George Bernard Shaw was—both.
What are you to do with this information once you have it? The leading men and women of America’s past were frequently tainted by bigotries and misjudgments that appear repulsive now. Yet if repulsion is all we feel, we do a great injustice both to them and to ourselves. The good and great country that you inhabit today was inherited from imperfect leaders such as Wilson, as uncomfortable as that may make some on the left. And the gradual progress that the U.S. has made since 1787 has all depended on the respect Wilson and other leaders had for the original plan, as much as some on the right insist that they betrayed it. Demand that Americans preserve their collective past unchanged, and you doom the whole structure to decay and ultimate collapse. Teach Americans to despise their collective past, and their future will hold only a struggle for power, pitting group against group, without rules or restraints.
“It would be the irony of fate if my administration had to deal chiefly with foreign affairs.” Woodrow Wilson spoke those famous words to a friend shortly before his inauguration. That irony of fate of course came true.
Wilson is one of the very few presidents to have bequeathed an ism. There is no Washingtonism, there is no Lincolnism, there is no Rooseveltism, but there is “Wilsonianism.” Wilsonianism is almost universally regarded in a negative light—as, at worst, bad and dangerous or, at best, sweetly naive but sadly unrealistic.
But Wilson was far from naive. He grew up in the ruined landscape of the post–Civil War South. His prepresidential writing often cautioned against too much confidence in human beings and too much certainty about human institutions.
In his message to Congress on April 2, 1917, when he called for a declaration of war, Wilson insisted that “the world must be made safe for democracy.” Modern-day Americans commonly interpret those words as a vow to convert the whole world to democracy. What Wilson meant, however, was that the nation could no longer hope to find security in the “detached and distant situation” of its geographic location, as Washington described it in his farewell address. The United States had grown too big; distances of time and space had narrowed too much for it to be unaffected by the actions of once-remote countries. The menace to “peace and freedom,” Wilson saw, “lies in the existence of autocratic governments backed by organized force which is controlled wholly by their will, not by the will of their people.” Not all nations would or could be democratic, but from then on, American peace and freedom would be safeguarded not by geography but by “a partnership of democratic nations.”
Recoiling from Wilson’s vision of mutual international benefit, many of his present-day critics yearn for a foreign policy that relies on dominating a small number of client states and ignoring the rest of the world from behind border walls and trade protections.
People who take this view call themselves “America First,” perhaps unaware that Wilson himself seized the phrase as a campaign slogan in 1916 to condemn both the ethnic lobbies he regarded as too pro-German and the industrial and financial interests he mistrusted as too pro-Allies. In the 1930s and early ’40s, the slogan was appropriated by the isolationists and Axis sympathizers of the America First Committee. The outrage of Pearl Harbor and the horror of Auschwitz then discredited “America First” for a long time—but not forever.
Now, in the 21st century, we see the strange sight of political partisans using Wilson’s own “America First” phrase to attack Wilson’s highest ideals. In February 2023, one of the harshest critics of U.S. support for democratic Ukraine spoke at the Heritage Foundation. At the core of Senator Josh Hawley’s remarks was an attack on Wilson:
Woodrow Wilson, as you may remember, was a dedicated internationalist. He was a dedicated globalist on principle, by the way. I mean, he thought that “we should make the world safe for democracy.” That was his line that he famously used. And I think what you saw is after the Cold War, you had a whole generation of American policy makers who said the Wilsonian moment has now arrived. Borders don’t matter. American uniqueness doesn’t matter. We’re going to make all of the world more like America and we’re going to make America more like the world and there’ll be this great global integration.
Wilson believed almost none of those things. What Wilson did believe was that American security had become inseparable from the security of others, and that American power would be accepted only if guided by universal values. Wilson argued this case most explicitly in a January 1918 address to Congress. The speech is famous for the 14 points he enumerated as U.S. war aims. But more important than any specific aim was the logic undergirding them all:
What we demand in this war, therefore, is nothing peculiar to ourselves. It is that the world be made fit and safe to live in; and particularly that it be made safe for every peace-loving nation which, like our own, wishes to live its own life, determine its own institutions, be assured of justice and fair dealing by the other peoples of the world as against force and selfish aggression. All the peoples of the world are in effect partners in this interest, and for our own part we see very clearly that unless justice be done to others it will not be done to us.
Wilson was the first world leader to perceive security as a benefit that could be shared by like-minded nations. Until then, each great power had clambered over others to field bigger armies, float bigger navies, and accumulate more colonies. This competition had culminated in the disastrous outbreak of the Great War. Wilson glimpsed the possibility of a different way: that shared values might provide a more stable basis for peace among advanced nations than the quest for military dominance.
Only the U.S. possessed the wealth and power to make the vision work. Tragically, neither the U.S. nor the world was ready for this vision in Wilson’s lifetime. The president himself lacked the skill, expertise, and tact to realize it. But the vision lay dormant, waiting for a future chance.
I am not personally a thorough admirer of Wilson’s. A famous quip attributed to Winston Churchill (about another political moralist) might have applied to Wilson’s austere personality: “He has all the virtues I dislike and none of the vices I admire.” An evening with Theodore Roosevelt would have been fun, but most of us would have wished to bid an early good night to Wilson—especially once he’d revealed that his favorite form of humor was mildly smutty limericks.
Wilson’s bigotry was as chilly as his wit. He started his teaching career at Bryn Mawr. One of his associates there, the daughter of an abolitionist minister, remarked to an early biographer that Wilson was the first southern white man she’d ever met with no personal warmth for any individual Black person.
Wilson’s tariff, banking, and regulatory reforms were driven more by a quest for rationality and efficiency than by empathy and compassion. The British Liberal governments that held power from 1905 to the outbreak of World War I introduced that country’s first old-age pensions and unemployment insurance. In the United States, broad programs of social insurance would have to await the New Deal of the 1930s.
As a war leader, Wilson deferred absolutely to professional soldiers’ advice, even though those soldiers had learned their trade in small wars against weak enemies. That approach cost many American lives when the top U.S. military commander, John Pershing, rebuffed British and French efforts to teach American troops the painful lessons they had learned from prior years of Western Front experience. Americans went into battle in 1918 still using the human-wave tactics that had cost the British and French so dearly.
Wilson’s gravest failures were in his chosen mission as a peacemaker. As the former U.S. diplomat Philip Zelikow details in his damning book The Road Less Traveled, Wilson personally bungled a real opportunity to reach peace in the second half of 1916. All of the principal combatants yearned for such a peace, but none dared be the first to ask for it. All were looking for the U.S. to lead, as it had led the peace negotiations after the Russo-Japanese War of 1904–05. Wilson fatally hesitated to apply such leadership, nor did he delegate the task to anybody who might have succeeded.
When the war instead ended with the German collapse in 1918, Wilson never grasped or even paid much attention to the problems of postwar economic recovery, domestic or international. He was a man of ideas and ideals, not one of ledgers and accounts; of words, not numbers. The United States plunged into a severe economic depression in 1920. War-scarred and hungry Europe suffered even more. Voters emphatically rejected Wilson’s party in the 1920 elections.
The Republican congressional majorities of the 1920s returned to the high-tariff policies of the 19th century, dooming any hope that Germany, Britain, France, Belgium, Italy, and other former combatants might export their way to economic normality. Instead, the United States insisted on collecting war debts from former allies. To repay the U.S., the former allies were left no choice but to squeeze Germany for reparations. To finance reparations, Germany massively borrowed from U.S. private-sector lenders. This cycle of tariff-driven debt helped set in motion the catastrophe of the Great Depression.
The post-Wilson Democrats bitterly split along regional and cultural lines. It took them 103 ballots to nominate a presidential candidate at their convention in New York City in 1924. The Republicans would win that year’s election decisively, and 1928’s too, by running against Wilson’s war and the depression that followed. Only after another war, even more terrible than the one that came before it, was Wilson’s foreign-policy legacy at last rehabilitated. As Americans and their allies developed institutions of collective security, free trade, and global governance after 1945, Wilson’s best ideals were realized at last.
This is the Wilson who remains to this day the founder and definer of American world leadership. Henry Kissinger, who despised Wilson and (I suspect) inwardly hoped to displace his intellectual primacy, ultimately had to admit in his 1994 book, Diplomacy: “It is above all to the drumbeat of Wilsonian idealism that American foreign policy has marched since his watershed presidency, and continues to march to this day.” I very much believe that the United States has been a force for good in the world in the 20th and 21st centuries. If you do also, then our appreciation must begin with the foundational achievement of the president who first exerted that force.
You do not need to withhold any single criticism of Woodrow Wilson, the man and the president, to regret the harm done by the unbalanced and totalizing censure that has been heaped upon him over the past decade. Wilson was a great domestic reformer. He was the first American president to perceive and explain how American power could anchor the peace of a future democratic world.
His ideas and ideals still undergird American foreign policy at its most generous and successful. His words still reverberate more than a century later, long after those of his contemporary critics have lapsed into obscurity. When the United States rallies to the defense of Ukraine against Russian invasion or of Guyana against Venezuelan threats, when it seeks peace through free-trade agreements and joins with allies to deter aggression, it is speaking in the language originally chosen by Woodrow Wilson.
So how should we comprehend the people of bygone times when their principles and prejudices diverge from those that now prevail? In a speech delivered in 1896, Wilson declared:
Nothing is easier than to falsify the past. Lifeless instruction will do it. If you rob it of vitality, stiffen it with pedantry, sophisticate it with argument, chill it with unsympathetic comment, you render it as dead as any academic exercise … Your real and proper object, after all, is not to expound, but to realize it, consort with it, and make your spirit kin with it, so that you may never shake the sense of obligation off.
Modern America owes just such an obligation to Wilson. He showed the way to the modern world. He did not reach his hoped-for destination, but neither yet have we. Cancel Wilson, and you empower those who seek to discredit the high goals for which he worked. Those are goals still worth working toward. To realize them, supporters of American global leadership cannot dispense with the practical and moral legacy of Woodrow Wilson.
Acknowledge his flaws and failures. Then restore Wilson’s name to the places of honor from which it was hastily and wrongly purged.
This article appears in the March 2024 print edition with the headline “In Defense of Woodrow Wilson.”
Some days Wordle is easier to solve than others so if you’re struggling, Newsweek is here to help.
Wordle is a puzzle that goes live every day at midnight, your local time, in which players try to figure out the five-letter word of the day. Players get a maximum of six attempts to solve the puzzle by typing in their guesses.
The game gives you feedback on every guess: a letter turns green if it’s correct and in the right position, yellow if it’s the right letter in the wrong place, or gray if the letter isn’t in the day’s word at all.
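The coloring rule above can be sketched in a few lines of code. This is a hypothetical illustration of the feedback logic, not Wordle’s actual source; the function name `score_guess` is our own. The only subtlety is handling repeated letters: a letter should only show yellow if the answer still has an unmatched copy of it.

```python
from collections import Counter

def score_guess(guess: str, answer: str) -> list[str]:
    """Return per-letter feedback: 'green', 'yellow', or 'gray'."""
    result = ["gray"] * len(guess)
    # Count answer letters that were NOT matched exactly in place.
    remaining = Counter(a for g, a in zip(guess, answer) if g != a)
    for i, (g, a) in enumerate(zip(guess, answer)):
        if g == a:
            result[i] = "green"
        elif remaining[g] > 0:
            # Right letter, wrong spot -- consume one unmatched copy.
            result[i] = "yellow"
            remaining[g] -= 1
    return result

# Guessing "aloft" against the answer "aloof":
print(score_guess("aloft", "aloof"))
# ['green', 'green', 'green', 'yellow', 'gray']
```

Note that the "t" comes back gray rather than yellow, and the "f" is yellow because the answer’s unmatched "f" is still available — the two-pass counting is what keeps duplicate letters honest.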
Wordle took the world by storm not long after it was published online. Software engineer Josh Wardle created it to help keep his crossword-loving partner entertained during the COVID-19 lockdown; once he realized that other people might enjoy its daily brainteasers, he uploaded the puzzle online so it was accessible to everyone.
This photo illustration shows a person playing online word game “Wordle” on a mobile phone in Arlington, Virginia, on May 9, 2022. Software engineer Josh Wardle created the game to help keep his crossword-loving partner entertained. MICHAEL DRAPER/AFP via Getty Images
Following its debut, Wordle exploded from 90 users on November 1, 2021, to 300,000 on January 2, 2022, according to figures from Statista. Thanks to its popularity, The New York Times bought it in early 2022 for a seven-figure sum.
“The idea to impose a limitation came when my partner and I started getting into crosswords during the pandemic. In particular, The New York Times have this puzzle called ‘Spelling Bee,’ which has this once-a-day model that I thought was really effective,” Wardle said.
“I liked the idea that everyone around the world was trying to solve the exact same word at the exact same time. You keep them hooked without taking over their lives. It’s also interesting because this [notion] runs counter to a lot of what you expect from mobile games.
“The assumption is that they’re supposed to keep you engaged at all times, but most people can solve a Wordle puzzle in about five minutes and then forget about it.”
If you’re hoping to solve today’s brainteaser yourself, avoid scrolling to the end of this article, where the answer is revealed.
‘Wordle’ #951, Clues for Friday, January 26
Hint #1: Today’s answer contains two different vowels.
Hint #2: It starts with the letter “A.”
Hint #3: There is one repeated letter.
Hint #4: Today’s answer is an adjective and an adverb.
Hint #5: Synonyms include the words “distant” and “detached.”
‘Wordle’ #951 Answer for Friday, January 26
Today’s Wordle answer is “Aloof.”
According to Merriam-Webster, the adjective definition of aloof is “removed or distant either physically or emotionally.” The adverb definition is “at a distance.”
How did you do with today’s Wordle? Did you manage to figure it out? Congratulations if so, but no worries if you didn’t; there will be a new puzzle for you to try to solve on Saturday.
Do you have a good tip for the best starting words when playing Wordle? Share them with us by emailing: entertainment@newsweek.com.
Uncommon Knowledge
Newsweek is committed to challenging conventional wisdom and finding connections in the search for common ground.
If Donald Trump has benefited from one underappreciated advantage this campaign season, it might be that no one seems to be listening to him very closely anymore.
This is a strange development for a man whose signature political talent is attracting and holding attention. Consider Trump’s rise to power in 2016—how all-consuming his campaign was that year, how one @realDonaldTrump tweet could dominate news coverage for days, how watching his televised stump speeches in a suspended state of fascination or horror or delight became a kind of perverse national pastime.
Now consider the fact that it’s been 14 months since Trump announced his entry into the 2024 presidential race. Can you quote a single thing he’s said on the campaign trail? How much of his policy agenda could you describe? Be honest: When was the last time you watched him speaking live, not just in a short, edited clip?
It’s not that Trump has been forgotten. He remains an omnipresent fact of American life, like capitalism or COVID-19. Everyone is aware of him; everyone has an opinion. Most people would just rather not devote too much mental energy to the subject. This dynamic has shaped Trump’s third bid for the presidency. As Katherine Miller recently observed in The New York Times, “The path toward his likely renomination feels relatively muted, as if the country were wandering through a mist, only to find ourselves back where we started, except older and wearier, and the candidates the same.”
Perhaps we overlearned the lessons of that first Trump campaign. After he won, a consensus formed among his detractors that the news media had given him too much airtime, allowing him to set the terms of the debate and helping to “normalize” his rhetoric and behavior.
But if the glut of attention in 2016 desensitized the nation to Trump, the relative dearth in the past year has turned him into an abstraction. The major cable-news networks don’t take his speeches live like they used to, afraid that they’ll be accused of amplifying his lies. He’s skipped every one of the GOP primary debates. And since Twitter banned him in January 2021, his daily fulminations have remained siloed in his own obscure social-media network, Truth Social. These days, Trump exists in many Americans’ minds as a hazy silhouette—formed by preconceived notions and outdated impressions—rather than as an actual person who’s telling the country every day who he is and what he plans to do with a second term.
To rectify this problem, I propose a 2024 resolution for politically engaged Americans: Go to a Trump rally. Not as a supporter or as a protester, necessarily, but as an observer. Take in the scene. Talk to his fans. Listen to every word of the Republican front-runner’s speech. This might sound unpleasant to some; consider it an act of civic hygiene.
Yes, there are other ways to familiarize yourself with the candidate and the stakes of this election. (And, of course, some people might not feel safe at a Trump event.) But nothing quite captures the Trump ethos like his campaign rallies. This has been true ever since he held his first one at Trump Tower, in June 2015. Back then, he had to stack the crowd with paid actors, prompting many in the press (myself included) to dismiss the whole thing as an astroturf marketing stunt. But the rallies, like the campaign itself, soon took on a life of their own, with thousands of people flocking to Phoenix or Toledo or Daytona Beach to witness the once-in-a-generation spectacle firsthand. What would he do? What would he say? I still remember the night of the 2016 Nevada caucuses, standing in line for Trump’s victory rally at the Treasure Island Hotel and Casino and overhearing one gawker enthuse to another, “This is a cultural phenomenon. We have to see it.”
Regardless of your personal orientation toward Trump, attending one of his rallies will be a clarifying experience. You’ll get a tactile sense of the man who’s dominated American politics for nearly a decade, and of the movement he commands. People who comment on politics for a living—journalists, academics—might find certain premises challenged, or at least complicated. Opponents and activists might come away with new urgency (and maybe a dash of empathy for the people Trump has under his sway). The experience could be especially educational to Republican voters who are not Trump devotees but who see the other GOP candidates as lost causes and plan to vote for Trump over Joe Biden. Surely, they should see, before they cast their vote, what exactly they’re voting for.
I recently undertook this challenge myself. As a reporter, I’ve covered about 100 Trump rallies in my life. For a stretch in the fall of 2016, I spent more time in MAGAfied arenas and airplane hangars than I did sleeping in my own bed. What I remember most from that year is the unsettling, anything-might-happen quality of the events. The chaos. The violence. The glee of the candidate presiding over it all.
But with the commencement of a new election year, it occurred to me that I hadn’t been to a rally since 2019. The pandemic, followed by a book project and a series of story assignments unrelated to Trump, had kept me largely off the campaign trail. I was curious what it would be like to go back. Had anything changed? Was my impression of Trump still up-to-date? So, one night earlier this month, I parked my rental car on a scrap of frozen grass near the North Iowa Events Center in Mason City and made my way inside.
A line had formed hours before Trump was scheduled to speak, but the people trickling in from the cold through metal detectors were in good spirits. They chatted amiably about their holiday travel and arranged themselves in groups for selfies. An upbeat soundtrack played over the speakers—Michael Jackson, Adele, Panic! at the Disco—and people excitedly pointed out recognizable faces in the media section. “You’re that guy from CBS!” one attendee exclaimed to a TV-news correspondent.
I found the wholesome, church-barbecue vibe a little jarring. For months, my impression of the 2024 Trump campaign had been shaped by the apocalyptic rhetoric of the candidate himself—the stuff about Marxist “vermin” destroying America, and immigrants “poisoning the blood of our country.” The people here didn’t look like they were bracing for an existential catastrophe. Had I overestimated the radicalizing effect of Trump’s rhetoric?
Only once I started talking to attendees did I detect the darker undercurrent I remembered from past rallies.
I met Kris, a 71-year-old retired nurse in orthopedic sneakers, standing near the press risers. (She declined to share her last name.) She was smiley and spoke in a sweet, grandmotherly voice as she told me how she’d watched dozens of Trump rallies, streaming them on Rumble or FrankSpeech, a platform launched by the right-wing MyPillow founder Mike Lindell. (She waited until Lindell, who happened to be loitering near us, was out of earshot to confide that she preferred Rumble.) The conversation was friendly and unremarkable—until it turned to the 2020 election, which Kris told me she believes was “most definitely” stolen.
“You think Trump should still be president?” I asked.
“By all means,” she said. “And I think behind the scenes he maybe is doing a little more than what we know about.”
“What do you mean?”
“Military-wise,” she said. “The military is supposed to be for the people, against tyrannical governments,” she went on to explain. “I hope he’s guiding the military to be able to step in and do what they need to do. Because right now, I’d say government’s very tyrannical.” If the Democrats try to steal the election again in 2024, she told me, the Trump-sympathetic elements of the military might need to seize control.
Around 8 p.m., Trump took the stage and launched into his remarks, toggling back and forth between what he called “teleprompter stuff” (his prepared stump speech) and the unscripted riffs that he’s famous for. Seeing him speak in this setting after so many years was strange—both instantly familiar and still somehow shocking, like rewatching an old movie you saw a hundred times as a kid but whose most offensive jokes you’d forgotten.
When he talked about members of the Biden administration, he referred to them as “idiots” and “lunatics” and “bad people.” When he talked about the “invasion” of undocumented immigrants at the southern border, he punctuated the riff with ominous warnings for his mostly white audience: “They’re occupying schools … They’re sitting with your children.” When he mentioned Barack Obama, he made a point of using the former president’s middle name—“Barack Hussein Obama”—and then veered off into an appreciation of Rush Limbaugh, the late conservative talk-radio host who taught him this trick. “We miss Rush,” Trump said to enthusiastic cheers. “We need you, Rush!”
I’d forgotten how casually he swears from the podium—deriding, at one point, his Republican rival Nikki Haley’s recent statement on the Civil War as “three paragraphs of bullshit”—and how casually people in the crowd swear back. Throughout the speech, two young men near the front repeatedly screamed “Fuck Biden!”, prompting a wave of naughty giggles from others in the crowd.
If one thing has noticeably changed since 2016, it’s how the audience reacts to Trump. During his first campaign, the improvised material was what everyone looked forward to, while the written sections felt largely like box-checking. But in Mason City, the off-script riffs—many of which revolved around the 2020 election being stolen from him, and his personal sense of martyrdom—often turned rambly, and the crowd seemed to lose interest. At one point, a woman in front of me rolled her eyes and muttered, “He’s just babbling now.” She left a few minutes later, joining a steady stream of early exiters, and I wondered then whether even the most loyal Trump supporters might be surprised if they were to see their leader speak in person.
My own takeaway from the event was that there’s a reason Trump is no longer the cultural phenomenon he was in 2016. Yes, the novelty has worn off. But he also seems to have lost the instinct for entertainment that once made him so interesting to audiences. He relies on a shorthand legible only to his most dedicated followers, and his tendency to get lost in rhetorical cul-de-sacs of self-pity and anger wears thin. This doesn’t necessarily make him less dangerous. There is a rote quality now to his darkest rhetoric that I found more unnerving than when it used to command wall-to-wall news coverage.
These were my own impressions of the rally I attended; yours may very well be different. The only way to know is to see for yourself. Every four years, pundits try to identify the medium that will shape the presidential race—the “Twitter election,” the “cable-news election.” In 2024, with both parties warning of existential stakes for America, perhaps the best approach is to simply show up in real life.
Shortly before Trump began speaking, I met a friendly young dad in glasses who’d brought his 6-year-old son to the event. He’d never attended a Trump rally before and was excited to be there. When I asked if I could chat with him after Trump’s speech to see what he thought of the event, he happily agreed.
As Trump spoke, I glanced over at the man a few times from the press section. His expression was muted; he barely reacted to the lines that drove the crowd wild. The longer Trump spoke, I noticed, the further the man drifted backward toward the exits. Of course, I don’t know what was going through his head. Maybe he was just a stoic type. Or maybe his enthusiasm was tempered by the distraction of tending to a 6-year-old. All I know is that, halfway through the speech, he was gone.
The former president was ordered to pay the newspaper and three reporters $392,638 in legal fees after he lost a lawsuit over coverage of his tax records.
The New York Times played into Trump’s narrative by trying to “both sides” the 1/6 attack, but reader backlash made them change their headline.
Here are the two headlines side by side:
The New York Times wrote a cowardly “both sides” headline and — after getting slammed by me and many others — rewrote it to make explicit who’s lying.
Given the ample evidence of Trump’s behavior and what happened during the 1/6 attack, the first headline The New York Times went with could, in the most generous interpretation, be considered willfully blind.
More realistically, The Times played into the Trump narrative by ignoring facts about the 1/6 attack.
Corporate Media Like The New York Times Are Desperate To Have Trump Back In Power
Donald Trump was very good business for big corporate media companies. Trump lavished attention on old media dinosaurs like The New York Times and Washington Post. Trump’s criticism of corporate media elevated them and increased their profits.
With Biden in office, legacy media has hemorrhaged readers, viewers, and subscribers.
These corporations need Donald Trump and the danger that he places the country in back in the Oval Office.
They don’t care about democracy, freedom, facts, or accuracy.
Publications like The New York Times are secretly rooting for a Trump win, and their choices are a way to put their thumb on the scale to help Trump in 2024.
If you want media that is committed to democracy and truth, please consider donating to PoliticusUSA.
Jason is the managing editor. He is also a White House press pool and congressional correspondent for PoliticusUSA. Jason has a Bachelor’s Degree in Political Science. His graduate work focused on public policy, with a specialization in social reform movements.
“Serious Medical Errors Rose After Private Equity Firms Bought Hospitals” was the headline of a New York Times article looking at the findings of “a major study of the effects of such acquisitions on patient care in recent years” published in the December issue of JAMA. The paper was also written up in USA Today, MarketWatch, Common Dreams, and The Harvard Gazette.
“This is a big deal,” Ashish Jha, dean of the Brown University School of Public Health, told Times reporters Reed Abelson and Margot Sanger-Katz. “It’s the first piece of data that I think pretty strongly suggests that there is a quality problem when private equity takes over.”
Abelson, Sanger-Katz, and their fellow reporters misrepresented the findings of the study, which suffers from its own “quality problems.”
Even its premise is fuzzy. The authors never say what they mean by “private equity,” which has no formal definition. Half of the hospitals in the study were already privately owned, for-profit hospitals before they were acquired. The authors suggest that what they call “private equity” is characterized by excessive leverage and short horizons, but present no data on either factor. Times readers may interpret the phrase private equity to mean “evil Wall Street greedheads,” in which case it seems logical that patient care would deteriorate.
Even the paper’s lead author started with that assumption. “We were not surprised there was a signal,” Massachusetts General Hospital’s Sneha Kannan told the Times. “I will say we were surprised at how strong it was.”
Bias was built into the study design. Research that looks only at “adverse” events and outcomes is designed to dig up dirt and will tend to come up with meaningless conclusions. Serious investigators study all events and outcomes—good and bad—in search of accurate, balanced conclusions.
The study’s strongest finding shows that lives were saved in hospitals acquired by private equity—the opposite of what Kannan expected to find. Patient mortality, the most important measure, dropped a statistically significant 9 percent in the study group, which represents nearly 500 lives saved.
The paper could have been headlined “Patient Mortality Fell After Private Equity Firms Bought Hospitals,” except JAMA might not have published it, The New York Times certainly wouldn’t have bothered to write it up, and Common Dreams couldn’t have run with the headline, “We Deserve Medicare for All, But What We Get Is Medicare for Wall Street.” So the study authors fell over themselves to explain this finding away. They theorized, without any evidence, that maybe private equity hospitals routinely transfer out patients who are near death. The authors do raise legitimate reasons for skepticism that private equity acquisition saved patient lives, but those reasons apply equally to the negative findings that are trumpeted both in the study and the news write-ups.
Another one of the 17 measures the study authors looked at was length of stay. They found that at the private equity hospitals the duration of stays was a statistically significant 3.4 percent shorter, which was another finding the authors were quick to downplay.
Falls are the most common adverse events in hospitals, and the study found that they were more likely to occur in hospitals acquired by private equity. According to the Times, the “researchers reported…a 27 percent increase in falls by patients while staying in the hospital.”
This isn’t what the study says. The rate of falls at hospitals acquired by private equity stayed the same after acquisition, at 0.068 percent. Falls simply didn’t decline at those hospitals the way they did at hospitals in the control group—from 0.083 percent to 0.069 percent—which is where the 27 percent number came from.
In other words, the situation improved in the control group but didn’t get worse or better in hospitals acquired by private equity. So the authors assumed that there was some industrywide drop in hospital falls and that this positive trend didn’t take place at the private equity hospitals.
What this finding actually suggests is that the control hospitals were badly chosen and run worse (at least when it comes to preventing patient falls) than the acquired hospitals both before and after private equity acquisition. That falls could change by 27 percent without any cause (the control hospitals were not purchased by anyone) makes nonsense of claiming statistical significance for much smaller changes in other factors.
Let’s even assume that there was an industrywide decline in falls and that private equity hospitals didn’t see the improvement that would have taken place had their greedy new owners not been allowed to acquire them. If that improvement had taken place, there would have been 20 fewer falls in the study group. Doesn’t that matter less than the 500 deaths prevented—the stat that the authors chose to downplay?
The Times article mentions that bed sores increased at the private equity hospitals even though that wasn’t a statistically significant finding, meaning that there weren’t enough data included in the study to make that assertion. The study authors acknowledged that this finding wasn’t significant, but the Times journalists chose to report it anyway.
The study authors did claim that another one of their adverse findings was statistically significant: Bloodstream infections allegedly increased in private equity hospitals from about 65 cases to 99 cases. This is indeed serious, as such infections can easily be fatal. However, the finding had marginal statistical significance, meaning it was unlikely, but not completely implausible, to have arisen by random chance if private equity acquisition did not affect the rate of bloodstream infections. If the only hypothesis that the authors had tested was whether private equity acquisition increased bloodstream infections, then the finding would meet standard criteria for statistical significance.
If you run a fishing expedition for adverse events and outcomes, you are very likely to turn up some findings that arise by random chance. The authors were aware of this and adjusted the claimed significance of this result as if they had tested eight hypotheses. But the paper reported 17 measures, and the authors may have tested more. If we adjust for 17 hypotheses, the bloodstream infection result loses its statistical significance.
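The arithmetic behind this point is easy to sketch. A Bonferroni correction multiplies a raw p-value by the number of hypotheses tested and calls the result significant only if it stays below 0.05. The raw p-value below is hypothetical (the paper's exact figure isn't quoted here), chosen only to show how a finding can survive adjustment for 8 tests but fail it for 17:

```python
# Bonferroni correction: multiply the raw p-value by the number of
# hypotheses tested; the finding is significant only if the adjusted
# value stays below the alpha threshold (conventionally 0.05).

def bonferroni_significant(raw_p: float, n_hypotheses: int, alpha: float = 0.05) -> bool:
    """Return True if raw_p remains significant after Bonferroni adjustment."""
    adjusted_p = min(raw_p * n_hypotheses, 1.0)  # adjusted p-values are capped at 1
    return adjusted_p < alpha

# Hypothetical raw p-value for the bloodstream-infection finding,
# chosen purely for illustration:
raw_p = 0.004

print(bonferroni_significant(raw_p, 8))   # 0.004 * 8  = 0.032 -> True (significant)
print(bonferroni_significant(raw_p, 17))  # 0.004 * 17 = 0.068 -> False (not significant)
```

Any raw p-value between roughly 0.003 and 0.006 behaves this way, which is why the choice of denominator (8 versus 17 hypotheses) can flip the headline conclusion.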
The rigorous way to do studies is to pre-register hypotheses to ensure that the authors can’t go fishing in a large amount of data to pick out a few conclusions that they like that happen to appear statistically significant by random chance. The authors did not report pre-registration.
So what can we conclude from this study? The Times reporters seem to have gone on a second fishing expedition, this one for a scholar willing to conclude from the study’s findings that we need more government regulation, or perhaps a ban on private equity hospital acquisitions. To their credit, none of the experts they quoted fully delivered, forcing the reporters to blandly conclude that the study “leaves some important questions unanswered for policymakers.”
“This should make us lean forward and pay attention,” was the best Yale economist Zack Cooper was willing to give Abelson and Sanger-Katz, adding that it shouldn’t lead us to “introduce wholesale policies yet.” Rice economist Vivian Ho told the Times that she “was eager to see more evidence.”
Setting out to find “more evidence” of a conclusion that researchers already believe to be true, instead of going where the data lead, is what leads to such sloppy and meaningless research in the first place.
The New York Times has initiated a federal lawsuit against OpenAI and Microsoft, alleging copyright infringement for using its stories to train chatbots. The lawsuit, filed in Manhattan federal court, accuses both companies of unlawfully utilizing the Times’ content to enhance their AI technologies, which they claim compete with and threaten the Times’ own services. OpenAI and Microsoft have yet to respond to these allegations.
This legal action is part of a broader trend of individuals and publishers challenging OpenAI’s use of copyrighted material. The Times argues that OpenAI and Microsoft’s actions constitute a free ride on its journalistic investments. They have not specified the damages sought but are pushing for the destruction of GPT and other large language models that include their work.
Microsoft, a major investor in OpenAI since 2019, integrates OpenAI’s technology into its products. The increasing use of AI in media and other industries has sparked significant legal and ethical debates.
The lawsuit also highlights failed negotiations between the Times, OpenAI, and Microsoft. The Times had previously approached the companies in April, seeking fair compensation and a responsible approach to developing AI technology that supports a well-informed public. However, these discussions did not reach a resolution, leading to the current legal action.
The New York Police Department is yet again trying to shuffle the reporters who cover them—this time to a trailer outside their headquarters. For years, reporters have worked inside police headquarters at 1 Police Plaza, in a section of the building referred to as “the Shack.” There, you’ll find a warren of individual offices occupied by several news organizations—the New York Post, Newsday, The New York Times, CBS, Gothamist/WNYC, and The New York Daily News—and, on a crowded day, about a half dozen reporters dispersed among them. Rumors that Deputy Commissioner of Public Information Tarik Sheppard—the NYPD’s chief spokesperson—wanted to relocate reporters, purportedly to make more space for NYPD units, have been circulating for months. But on Monday, the idea seemed to become a reality, when two reporters who happened to be at HQ that day got an informal tour of their new digs. They were told they’d be moving the following Monday.
Reporters’ objections to the move are not a matter of comfort. The Shack itself is “pretty disgusting,” as one police reporter noted. “It’s not the Ritz.” But having a desk inside police headquarters has offered crucial access to key players that some fear will be cut off in the move outside. “The concern is: Is this a good faith attempt to make more space for whoever they need to make more space for? Or is this a slippery slope, where we’re going to be eventually pushed out altogether from this area?” said a second police reporter. Sheppard, I’m told, has previously mentioned to reporters that he doesn’t get a fair shake from the tabloids. The move to the trailer comes “against a backdrop of complaints about the coverage of crime,” one veteran crime reporter said, which has “raised everybody’s antenna.” A third police reporter added: “Everybody feels it’s somewhat troublesome, like this is a punitive thing for negative coverage—particularly tabloid coverage.”
The rollout of the move has been a major source of frustration among police reporters, who say that DCPI has not provided an official briefing to the group. Reporters who weren’t in on Monday didn’t realize a tour was even taking place. “There’s been no direct communication with all of us at the same time about what’s happening,” said the first police reporter. The line of reasoning for the move, they added, “has been all over the place.” Whether the move actually happens, or starts to happen, on Monday is somewhat unclear; a third police reporter told me that DCPI has pulled back from the Monday start date due to logistical matters.
In a statement, a DCPI spokesperson said the move will begin “early next week” and disputed the idea that reporters are in the dark about the transition. “Sheppard previously met and spoke with representatives from each media outlet that occupies the existing press area inside Police Headquarters and explained that the move is simply to accommodate additional outlets that have asked to cover the NYPD in the same manner,” the spokesperson said, adding that the new location is “much larger, contains private conference rooms and bathrooms,” and is “located literally feet from the building, still very much inside the secure perimeter of One Police Plaza.” (One of the reporters I spoke to admitted the trailer was “way better” than they expected. It resembles a “semi-permanent module attached to HQ. We’d still be able to go in and out, our badges would work from what I’m told,” they said, adding, “but again, we still don’t have anything official from DCPI.”)
The DCPI spokesperson also disputed the idea that the move is in any way a response to negative coverage. “Change is sometimes difficult for people, we understand. But this is hardly punitive by any stretch of the imagination. This is a planned move—in the works since the start of the current administration—toward greater NYPD transparency, to allow more access to more reporters from more media outlets that desire to cover the police department on an increasing basis.”
It’s not the first time that the future of the Shack, which has been at 1 Police Plaza since the building was erected in the 1970s, has hung in the balance. Other commissioners have tried to evict reporters, such as in 2009, under Police Commissioner Raymond Kelly. The removal of the press offices seemed so assured that The Times wrote an entire obituary for the hub, only for department officials to backtrack on the eviction. A few months later, reporters were relocated down the hall to what is the current Shack. “Over the years, as papers and the news media sort of contracted, people in the Shack diminished,” said the veteran crime reporter. “Outlets that had four or five reporters were down to two or one; some were no longer there.”
Lawyers representing the various media organizations with offices in police HQ have been communicating with each other in light of the impending move, according to several reporters. “What can they really do? It’s the NYPD’s property,” the second police reporter noted. The media lawyers’ role in this is more to “show resistance,” said the third police reporter, “so that the next move is not out on the street.”
“Reporters should be in a newsroom collaborating with their fellow reporters, or they should be in a statehouse, in city hall, in police departments,” said the first police reporter. “Meeting and greeting and talking to people and getting the buzz. Isolating people like this is just another way of siloing the public—and that’s who we are, we’re representatives of the public. I think that they forget that.”
The Trump years had a radicalizing effect on the American right. But, let’s be honest, they also sent many on the left completely around the bend. Some liberals, particularly upper-middle-class white ones, cracked up because other people couldn’t see what was obvious to them: that Trump was a bad candidate and an even worse president.
At first, liberals tried established tactics such as sit-ins and legal challenges; lawyers and activists rallied to protest the administration’s Muslim travel ban, and courts successfully blocked its early versions. Soon, however, the sheer volume of outrages overwhelmed Trump’s critics, and the self-styled resistance settled into a pattern of high-drama, low-impact indignation.
Rather than focusing on how to oppose Trump’s policies, or how to expose the hollowness of his promises, the resistance simply wished Trump would disappear. Many on the left insisted that he wasn’t a legitimate president, and that he was only in the White House because of Russian interference. Social media made everything worse, as it always does; the resistance became the #Resistance. Instead of concentrating on the hard work of door-knocking and community activism, its members tweeted to the choir, drawing no distinction between Trump’s crackpot comments and his serious transgressions. They fantasized about a deus ex machina—impeachment, the Twenty-Fifth Amendment, the pee tape, outtakes from The Apprentice—leading to Trump’s removal from office, and became ever more frustrated as each successive news cycle failed to make the scales fall from his supporters’ eyes. The other side got wise to this trend, and coined a phrase to encapsulate it: “Orange Man Bad.”
The Trump presidency was a failure of right-wing elites; the Republican Party underestimated his appeal to disaffected voters and failed to find a candidate who could defeat him in the primary. Once he became president, the party establishment was content to grumble in private and grovel in public. But the Trump years demonstrated a failure of the left, too. Trump created an enormous reservoir of political energy, but that energy was too often misdirected. Many liberals turned inward, taking comfort in self-help and purification rituals. They might have to share a country with people who would vote for the Orange Man, but they could purge their Facebook feeds, friendship circles, and perhaps even workplaces of conservatives, contrarians, and the insufficiently progressive. Feeling under intense threat, they wanted everyone to pick a side on issues such as taking the Founding Fathers’ names off school buildings and giving puberty blockers to minors—and they insisted that ambivalence was not an option. (Nor was sitting out a debate, because “silence is violence.”) Any deviation from the progressive consensus was seen as a moral failing rather than a political difference.
The cataclysms of 2020—the pandemic and the murder of George Floyd—might have snapped the left out of its reverie. Instead, the resisters buried their heads deeper in the sand. Health experts insisted that anyone who broke social-distancing rules was selfish, before deciding that attending protests (for causes they supported, at least) was more important than observing COVID restrictions. The summer of 2020 made a best seller out of a white woman’s book about “white fragility,” but negotiations around a comprehensive police-reform bill collapsed the following year. As conservative Supreme Court justices laid the ground for the repeal of Roe v. Wade, activist organizations became fixated on purifying their language. (By 2021, the ACLU was so far gone, it rewrote a famous Ruth Bader Ginsburg quote on abortion to remove the word woman.) Demoralized and disorganized, having given up hope of changing Trump supporters’ minds, the left flexed its muscles in the few spaces in which it held power: liberal media, publishing, academia.
If you attempted to criticize these tendencies, the rejoinder was simple whataboutism: Why not focus on Trump? The answer, of course, was that a bad government demands a strong opposition—one that seeks converts rather than hunting heretics. Many of the most interesting Democratic politicians to emerge during this time—the CIA veteran Abigail Spanberger, in Virginia; the Baptist pastor Raphael Warnock, in Georgia; Michigan Governor Gretchen Whitmer, who promised to “fix the damn roads”—were pragmatists who flipped red territories blue. When it came to the 2020 election, Democrats ultimately nominated the moderate candidate most likely to defeat Trump.
That Joe Biden would prevail as the party’s candidate was hardly a given, however. He defeated his more progressive rivals for the Democratic nomination only after staging a comeback in the South Carolina primary. He was 44 points ahead of his closest rival, Bernie Sanders, among the state’s Black voters, according to an exit poll. That is not a coincidence. These voters recognized that they had far more to gain from a candidate like Biden, who regularly talked about working with Republicans, than from the activist wing of the party. As Biden put it in August 2020, responding to civil unrest across American cities: “Do I look like a radical socialist with a soft spot for rioters?”
Biden is older now, and a second victory is far from assured. If he loses, the challenges to American democratic norms will be enormous. The withering of Twitter may impede Trump’s ability to hijack the news cycle as effectively as last time, but he’ll only be more committed to enriching himself and seeking revenge. I hope that the left has learned its lesson, and will look outward rather than inward: The battle is not for control of Bud Light’s advertising strategy, or who gets published in The New York Times, but against gerrymandering and election interference, against women being locked up for having abortions, against transgender Americans losing access to health care, against domestic abusers being able to buy guns, against police violence going unpunished, against the empowerment of white nationalists, and against book bans.
The path back to sanity in the United States lies in persuasion—in defending freedom of speech and the rule of law, in clearly and calmly opposing Trump’s abuses of power, and in offering an attractive alternative. The left cannot afford to go bonkers at the exact moment America needs it most.
This article appears in the January/February 2024 print edition with the headline “The Left Can’t Afford to Go Mad.”
Joe Biden’s presidential campaign accused the political press this past week of not shining “a bright enough light” on Donald Trump’s abortion record, taking specific aim at a New York Times piece that described the former president—who has boasted about his Supreme Court picks overturning Roe v. Wade—as now “employing vagueness and trying to occupy a middle ground of sorts” on the issue. “It’s time to meet the moment and responsibly inform the electorate of what their lives might look like if the leading GOP candidate for president is allowed back in the White House,” the campaign wrote. Biden campaign aides reiterated this critique on X. “Good to see folks have learned nothing from a decade of covering Donald Trump,” wrote deputy campaign manager Rob Flaherty.
The Biden team has long-running grievances with the Times, as well as with the broader news media—“a perpetual chip on their shoulders stemming from their belief that reporters consistently underestimate their boss, only focus on his negatives, and don’t give him enough credit for his legislative successes,” as Politico recently noted. The latest pushback, Politico wrote, “foreshadows a campaign in which these gripes are no longer a sideshow but a central element of the reelection strategy.”
Biden campaign officials seemed to confirm just that in an interview with Vanity Fair.
“The traditional media is sort of falling down on the job here, but I think the nice thing is we have the opportunity to take a little bit of this into our own hands,” Flaherty told me, adding that their pushback online is almost like “a distributed press secretary.”
Just look at the campaign’s Biden-Harris HQ account, which, alongside highlighting the administration’s accomplishments, takes aim at Trump, prominent Republicans, and, at times, the news media. The campaign has compared Bloomberg’s economic assessments one year apart, and when Democrats captured control of the Virginia state legislature in the 2023 elections, it trolled a Times op-ed headlined “Glenn Youngkin and the Lost Republican Art of Winning,” writing, “Oops.” It also posted a meme on election night.
Every presidential campaign works the refs, as they say, to try to get better coverage for its candidate—from spinning reporters to phoning editors—but such conversations typically happen in private. To the Biden campaign, the stakes are too high to just sit back and watch, so it’s increasingly taking such complaints public.
“In order to cure the problem, you have to…diagnose it first, and you have to treat the symptoms with the tools that you have,” said Michael Tyler, the campaign’s communications director. “We are diagnosing it through our communications efforts—calling it out—and then we are treating it both through communications, through digital, through our organizing apparatuses.” The strategy, said principal deputy campaign manager Quentin Fulks, hasn’t so much changed as ramped up with Trump’s return to the campaign trail. “When your opponent sort of sticks their neck out and gives you an opportunity, you have to react to it,” he said. “He has ramped up in his rhetoric,” but it is “not being covered in the way that we feel like it should.”
Just a day after ripping the Times for its abortion story, the campaign blasted out a release praising the paper’s reporting on Trump’s extreme second-term immigration plans—“Sweeping Raids, Giant Camps, and Mass Deportations,” read the headline—but accusing others in the media, particularly the major TV networks, of ignoring the story. That same day, the campaign also highlighted Axios’s reporting on how any Republican president—including Trump—could ban most abortions if elected, while suggesting that “most of the political press” was refusing “to shine a light on Trump’s extreme abortion agenda.”
Last week, when The New York Times and Siena College released a poll that showed President Joe Biden in trouble in battleground states, Democrats began to sound apocalyptic. The panic, turbocharged by social media, was disproportionate to what the surveys actually showed. Although the results in my home state, Nevada, were the worst for the president out of the six swing states that were polled, the findings are almost certainly not reflective of the reality here, at least as I’ve observed it and reported on it.
Nevertheless, they bring to the surface trends that should worry Democrats—and not just in Nevada.
The Times/Siena data show Donald Trump ahead of Biden in Nevada 52 percent to 41 percent, a much larger margin than the former president’s lead in the other battleground states. Could this be true? I’m skeptical, and I’m not alone. After the poll came out, I spoke with a handful of experts in both parties here, and none thinks Trump is truly ahead by double digits in the state, where he lost by about 2.5 points in the previous two presidential cycles. But Nevada is going to be competitive, perhaps more so than ever.
Some of the Times/Siena poll’s internal numbers gave me pause. Among registered voters in Clark County, where Las Vegas is located and where 70 percent of the electorate resides, the poll found Trump ahead of Biden 50–45. But Democrats make up 34 percent of active voters in the county, compared with Republicans’ 25 percent, and Biden won Clark by nine percentage points in 2020.
Other recent polls, not quite as highly rated as Times/Siena’s, have found the presidential race here to be much closer than the Times did. Last month, a CNN poll of registered Nevada voters found Biden and Trump virtually tied. Recent surveys from Emerson College, which has been unreliable in the state in the past, and Morning Consult/Bloomberg both had Trump up three points among likely voters. The Times/Siena polling outfit has a good reputation, but shortly before the 2020 election, it found Biden ahead of Trump in Nevada by six percentage points, more than double Biden’s eventual margin of victory.
Nevada is difficult to poll for a variety of reasons. Here as much as anywhere else, pollsters tend to underestimate the number of people they need to survey by cellphone to get a representative sample, and they generally don’t do enough bilingual polling in Nevada, where nearly a third of the population is Hispanic. Nevada also has a transient population, lots of residents working 24/7 shifts, and an electorate that’s less educated than most other states’. (“I love the poorly educated,” Trump said after winning Nevada’s Republican caucuses in 2016.) The polling challenge has become only more acute, because nonpartisan voters now outnumber Democrats and Republicans in Nevada, making it harder for pollsters to accurately capture the Democratic or Republican vote. (Since 2020, a state law has allowed voters to register at the DMV, and if they fail to do so, their party affiliation is defaulted to independent.)
Nevada matters in presidential elections, but we are also, let’s face it, a tad weird.
Still, Democrats have reasons to worry. Nevada was clobbered by COVID disproportionately to the rest of the country, because our economy is so narrowly focused on the casino industry. The aftereffects—unemployment, inflation—are still very much being felt here. Nevada’s jobless rate is the highest in the country, at 5.4 percent. That’s down dramatically from an astonishing 28.2 percent in April 2020, when the governor closed casinos for a few months. Although the situation has clearly improved, many casino workers still haven’t been rehired.
Democrat Steve Sisolak was the only incumbent governor in his party to lose in 2022, and his defeat was due at least partly to the fallout from COVID. Fairly or not, President Biden wears a lot of that too, as all presidents do when voters are unhappy with the economy. The Morning Consult/Bloomberg poll illuminated the bleak pessimism of Nevada voters, 76 percent of whom think the U.S. economy is going in the wrong direction.
Here, as elsewhere, voters are also concerned about Biden’s age, and that informs their broader views of him. Sixty-two percent of Nevadans disapprove of Biden’s performance, according to the Times, and only 40 percent have a favorable impression of him. Trump’s numbers, although awful—44 percent see him favorably—are better than Biden’s here, as well as in some blue or bluish states.
In Nevada, and in general, Biden is losing support among key groups—young and nonwhite voters. The Times/Siena poll found Biden and Trump tied among Hispanics in the state, despite the fact that Latinos have been a bedrock of the Democratic base here for a decade and a half. In the 2022 midterms, polls taken early in the race showed Catherine Cortez Masto, the first Latina elected to the U.S. Senate, losing Hispanic support, though her campaign managed to reverse that trend enough to win by a very slim margin.
Democratic presidential nominees have won Nevada in every election since 2008. Democrats also hold the state’s two U.S. Senate seats and three of the four House seats, and the party dominates both houses of the legislature. But the state has been slowly shifting to the right—not just in polling but in Election Day results. In 2020, Nevada was the only battleground state that saw worse Democratic performance compared with 2016, unless you include the more solidly red Florida. Nevada’s new Republican governor, Joe Lombardo, is building a formidable political machine. Republicans have made inroads with working-class white voters here, leaving Democrats with an ever-diminishing margin of error.
Abortion, an issue that was crucial to Cortez Masto’s narrow victory, could help Biden in Nevada. The Times/Siena poll showed that only a quarter of Nevadans think abortion should be always or mostly illegal. A 1990 referendum made abortion up to 24 weeks legal here, and the law can be changed only by another popular vote. Democrats in Nevada, though, want to take those protections a step further next year and are trying to qualify a ballot measure that would amend the state constitution to guarantee the right to abortion. As the off-year elections last week showed, that issue, more than the choice between Biden and Trump, could be what saves the president a year from now. Nevada also has a nationally watched Senate race in 2024, in which the incumbent Democrat, Jacky Rosen, has already signaled that she will mimic her colleague Cortez Masto and put abortion front and center in her campaign.
So many events could intervene between now and next November, foreign and/or domestic, and we have yet to see how effective the Trump and Biden campaigns will be, assuming that each man is his party’s nominee. Democratic Senator Harry Reid was deeply unpopular here in 2009, then got reelected by almost six percentage points; Barack Obama was thought to be in trouble in 2011, then won Nevada and reelection.
Democrats clearly hope that if Trump becomes the Republican nominee, many voters will see the election as a binary choice and will back Biden. But if the election instead becomes a referendum on Biden’s tenure, including the economy he has presided over, Trump could plausibly win Nevada—and the Electoral College.
It’s a year before the presidential election, and Democrats are panicking. Their incumbent is unpopular, and voters are refusing to give him credit for overseeing an economic rebound. Polls show him losing to a Republican challenger.
What’s true now was also true 12 years ago. Today, Democrats are alarmed by recent surveys finding that President Joe Biden trails Donald Trump in five key swing states. But they were just as scared in the fall of 2011, when President Barack Obama’s approval rating languished in the low 40s and a pair of national polls showed him losing to Mitt Romney, the former Massachusetts governor who would become the GOP nominee. Barely one-third of independent voters said Obama deserved a second term. A New York Times Magazine cover story asked the question on many Democrats’ minds: “Is Obama Toast?”
A year later, Obama beat Romney handily, by a margin of 126 in the Electoral College and 5 million in the popular vote. Those results are comforting to Democrats who want to believe that Biden is no worse off than Obama was at this point in his presidency. “This is exactly where we were with Obama,” Jim Messina, the former president’s 2012 campaign manager, told me by phone this week. For good measure, he looked up data from earlier elections and found that George W. Bush and Bill Clinton each trailed in the polls a year out from their reelection victories. Perhaps, Messina hoped, that would “calm my bed-wetting fucking Democratic friends down.”
Yet the comparison between Biden today and Obama in 2011 goes only so far. The most obvious difference is that Biden, who turns 81 this month, is nearly three decades older than Obama was at the time of his second presidential campaign. (He’s also much older than Clinton and Bush were during their reelection bids.) Voters across party lines cite Biden’s age as a top concern, and a majority of Democrats have told pollsters for the past two years that he shouldn’t run again. Obama was in the prime of his political career, an electrifying orator who could reenergize the Democratic base with a few well-timed speeches. Not even Biden’s biggest defenders would claim that he has the same ability. Put simply, he looks and sounds his age.
In a recent national CNN poll that showed Trump with a four-percentage-point lead over Biden, just a quarter of respondents said the president had “the stamina and sharpness to serve”; more than half said the 77-year-old Trump did. Privately, Democratic lawmakers and aides have fretted that the White House has kept the president too caged in for fear of a verbal or physical stumble. At the same time, they worry that a diminished Biden is unable to deliver a winning economic message to voters.
“The greatest concern is that his biggest liability is the one thing he can’t change,” David Axelrod, Obama’s longtime chief strategist, wrote on X (formerly Twitter) on the day that The New York Times and Siena College released polls showing Trump ahead of Biden by as much as 10 points in battleground states. “The age arrow only points in one direction.” Axelrod’s acknowledgment of a reality that many senior Democrats are hesitant to admit publicly, and his gentle suggestion that Biden at least consider the wisdom of running again, renewed concerns that the president and his party are ignoring a consistent message from their voters: Nominate someone else.
Tuesday’s election results, in which Democratic candidates and causes notched wins in Virginia, Kentucky, and Ohio, helped allay those concerns—at least for some in the party. “It’s way too early to either pop the champagne or hang the funeral crepe,” Steve Israel, the former New York representative who chaired the Democrats’ House campaign arm during Obama’s presidency, told me on Wednesday. “Biden has the advantage of time, money, a bully pulpit, and, based on last night’s results, the fact that voters in battleground areas seem to agree with Democrats on key issues like abortion.”
The Biden campaign embraced the victories as the continuation of a trend in which Democrats have performed better in recent elections than the president’s polling would suggest. “Time and again, Joe Biden beats expectations,” the campaign spokesperson Michael Tyler told reporters Thursday morning. “The bottom line is that polls a year out don’t matter. Results do.”
The Democrats’ strength in off-year elections, however, may not contradict Biden’s lackluster standing in a hypothetical matchup against Trump. The political realignment since Obama’s presidency—in which college-educated suburban voters have drifted left while working-class voters have joined Trump’s GOP—has given Democrats the upper hand in lower-turnout elections. The traditionally left-leaning constituencies that have soured on Biden, including younger and nonwhite voters, tend to show up only for presidential votes.
As Messina pointed out, the overall economy is better now than it was in late 2011 under Obama, when the unemployment rate was still over 8 percent—more than double the current rate of 3.9 percent. But voters don’t seem to feel that way. Their biggest economic preoccupation is not jobs but high prices, and although the rate of inflation has come down, costs have not. Polling by the Democratic firm Blueprint found a huge disconnect between what voters believe Biden is focused on—jobs—and what they care most about: inflation. “It’s very alarming,” Evan Roth Smith, who oversaw the poll, told reporters in a presentation of the findings this week. “It tells a lot of the story about why Bidenomics is not resonating, and is not redounding to the benefit of the president.”
Nothing stirs more frustration among Democrats, including some Biden allies, than the sense that the president is misreading the electorate and trying to sell voters on an economy that isn’t working for them. “It takes far longer to rebuild the middle class than it took to destroy the middle class,” Representative Ro Khanna of California, a former Bernie Sanders supporter who now serves on an advisory board for Biden’s reelection, told me. “No politician, president or incumbent, should be celebrating the American economy in the years to come until there is dramatic improvement in the lives of middle-class and working-class Americans.” Khanna said that Biden should be “much more aggressive” in drawing an economic contrast with Trump and attacking him in the same way that Obama attacked Romney—as a supplicant for wealthy and corporate interests who will destroy the nation’s social safety net. “Donald Trump is a much more formidable candidate than Mitt Romney,” Khanna said. “So it’s a harder challenge.”
Just how strong a threat Trump poses to Biden is a matter of dispute among Democrats. Although all of the Democrats I spoke with predicted that next year’s election would be close, some of them took solace in Trump’s weakness as a GOP nominee—and not only because he might be running as a convicted felon. “Donald Trump, for all of his visibility, is prone to making big mistakes,” Israel said. “A Biden-versus-Trump matchup will reveal Trump’s mistakes and help correct the current polling.”
The New York Times–Siena polls found that an unnamed “generic” Democrat would fare much better against Trump than Biden would. But they also found that a generic Republican would trounce Biden by an even larger margin. “Mitt Romney was a much harder candidate than Donald Trump,” Messina told me. (When I pointed out that Khanna had made the opposite assertion, he replied, “He’s in Congress. I’m not. I won a presidential election. He didn’t.”)
None of the Democrats I interviewed was pining for another nominee, or for Biden to drop out. Representative Dean Phillips of Minnesota hasn’t secured a single noteworthy endorsement since announcing his long-shot primary challenge. Vice President Kamala Harris is no more popular among voters, and all of the Democrats I spoke with expressed doubts that the candidacy of a relatively untested governor—say, Gavin Newsom of California, Gretchen Whitmer of Michigan, or Josh Shapiro of Pennsylvania—would make a Democratic victory more likely. Messina said that if Biden dropped out, a flood of ambitious Democrats would immediately enter the race, and a free-for-all primary could produce an even weaker nominee. “Are we sure that’s what we want?” Messina asked.
Others downplayed Biden’s poor polling, particularly the finding that Democrats don’t want him to run again. Their reasoning, however, hinted at a sense of resignation about the coming campaign. Israel compared the choice voters face to a person deciding whether or not to renew a lease on their car: “I’m not sure I want to extend the lease, until I looked at other models and realized I’m going to stick with what I have,” he explained. Senator Chris Murphy of Connecticut said that voters he talks to don’t bring up Biden’s age as an issue; only the media does. “I don’t know. He’s old, but he’s also really tall,” Murphy told me. “I don’t care about tall presidents if it doesn’t impact their ability to do the job. I don’t really care about presidents who are older if it doesn’t impact their ability to do the job either.” He was unequivocal: “I think we need Joe Biden as our nominee.”
For most Democrats, the debate over whether Biden should run again is now mostly academic. The president has made his decision, and top Democrats aren’t pressuring him to change his mind. Democrats are left to hope that the comparisons to Obama bear out and the advantages of incumbency kick in. Biden’s age—he’d be 86 at the end of a second term—is a fact of life. “You have to lean into it,” Israel told me. “You can’t ignore it.” How, I asked him, should Biden lean into the age issue? “I don’t know,” Israel replied. “That’s what a campaign is for.”
In a significant escalation of a criminal investigation into New York Mayor Eric Adams’s victorious 2021 campaign, FBI investigators seized at least two cell phones and an iPad from the mayor early last week, The New York Times reported Friday afternoon.
The investigation, which concerns whether the Adams campaign conspired with the Turkish government to solicit illegal donations via a Brooklyn-based construction company, burst into public view earlier this month when FBI agents raided the Crown Heights apartment of a former Adams intern and current chief fund-raiser, Brianna Suggs. The agents seized two laptops, three iPhones, a manila folder labeled “Eric Adams,” seven “contribution card binders,” and other physical materials, according to the search warrant obtained by the Times.
On Friday, Adams’s lawyer, Boyd Johnson, said in a statement that the mayor was cooperating with the FBI and had “proactively reported” at least one person who engaged in improper behavior. The statement did not say whether the reported conduct was related to the FBI seizure of Adams’s devices. Johnson said that Adams has not been accused of any wrongdoing and “immediately complied with the FBI’s request and provided them with electronic devices.”
In his statement, the mayor said, “As a former member of law enforcement, I expect all members of my staff to follow the law and fully cooperate with any sort of investigation — and I will continue to do exactly that.” Adams added that he had “nothing to hide.”
According to a source who spoke to the NYT, FBI agents climbed into Adams’s SUV after an event early last week and executed the search warrant. The cell phones and iPad were returned to the mayor after a few days, but investigators had the legal authority to copy data on seized devices.
On Wednesday—two days after the FBI had seized his devices, and two days before the seizure was reported to the public—Adams said he would be “shocked” if anyone on his campaign had done anything wrong. “I cannot tell you how much I start the day with telling my team, ‘We gotta follow the law. Gotta follow the law,’” Adams said. “Almost to the point that I’m annoying.”
When reporters asked whether the mayor was in touch with investigators following the raid of Suggs’s apartment, another Adams lawyer, Lisa Zornberg, preemptively answered the question. “The answer is yes, of course we are,” Zornberg told reporters. “The mayor has pledged his cooperation, and we’ve been in touch.” Zornberg failed to mention the FBI search.
During Wednesday’s press conference, Adams said he’d met with Turkish President Recep Tayyip Erdoğan just once, when the two “exchanged pleasantries” at an event during Adams’s tenure as Brooklyn borough president. But Adams has traveled to Turkey on numerous occasions, bragging last month, “I’m probably the only mayor in the history of this city that has not only visited Turkey once, but I think I’m on my sixth or seventh visit to Turkey.” Turkish entities reportedly paid for some of those visits.
On Friday, just before the news of the FBI seizure broke, The City reporter Katie Honan asked Adams about early speculation that, amidst this investigation, Adams will face several primary challengers in 2025. “Wait before you hate,” Adams cryptically replied.
The New York Times sent a letter to Sen. Tom Cotton (R-Ark.) on Friday condemning him for suggesting that its employees were involved in Hamas’ attack on Israel last month, calling him out for “parroting disinformation.”
The message came in response to a letter that Cotton sent to Times leadership Thursday citing “reports” that the newspaper’s journalists were “embedded with Hamas, knew about the attack, and … accompanied members of Hamas as they carried out the attack.”
The “reports” Cotton referenced are completely unverified and irresponsible to share as valid sources of information, Times counsel David McCraw said in response.
“As I am sure you agree, the spread of disinformation and incendiary rhetoric threatens the health of our democracy. Sadly, your letter to The New York Times of November 9 exacerbates those very problems,” McCraw wrote.
“[Y]ou are merely parroting disinformation harvested from the internet based on a website that has conceded it had no evidence for its claims,” the letter continued, adding: “Falsehoods circulated on the Internet are many things, but they are most certainly not ‘reports.’ They also should not be abused by a U.S. Senator to falsely accuse fellow Americans of crimes.”
In his letter, Cotton demanded that the Times say how many members of its staff have been embedded with Hamas, when the paper became aware of their involvement with the terrorist group and how much funding the Times has given to Hamas.
“To make it plain for you,” the Times responded, “the only connection The New York Times has to Hamas is that we report on the organization fearlessly and at times at great risk, bringing essential information to the public about the terrorist attacks in Israel and the ongoing conflict in Gaza.”
Cotton has made aggressive statements in support of Israel’s counterstrikes on Gaza, where members of the Hamas militant group are based.
“As far as I’m concerned, Israel can bounce the rubble in Gaza,” Cotton said on Fox News last month. The phrase refers to further damaging something that is already destroyed.
After members of Hamas stormed Israel, killing an estimated 1,200 people and taking roughly 240 people hostage, Israel launched an all-out attack on Gaza, killing at least 11,000 people so far in the Palestinian territory.
It’s been nearly a month since Hamas militants broke through the Gaza border and carried out a brutal attack that has resulted in 1,400 lives lost in Israel, triggering a military response with thousands of Palestinians killed and more than a million displaced. Among the latest developments, a UNICEF spokesperson this week described Gaza as a “graveyard for children,” as calls for a cease-fire mounted and the Biden administration urged a humanitarian “pause” in the conflict.
Through it all, the world has been grappling not only with the fog of war, but also misinformation, disinformation, propaganda, antisemitism, Islamophobia, heated debates over language, and a widening chasm between those whose foremost sympathies lie with the Israelis in one corner, or with the Palestinians in the other. With images of mass civilian casualties seared into our minds and talk of World War III lingering in the air, these are scary, depressing, fraught days, in which all it takes is one wrong word, gesture, or social media post to instantly and profoundly inflame the tensions.
In Gaza, there’s an escalating military campaign that threatens to spiral into a wider conflict, with a very limited number of journalists for major global news outlets able to bear witness on the ground. (I wrote about them last month.) Here in the US, Muslims are facing incendiary rhetoric from powerful conservatives in Congress and the media, while American Jews are facing a rise in antisemitism that has infiltrated daily life in a way many may have thought unfathomable just weeks ago.
Jodi Rudoren has been especially attuned to the latter story. She’s the editor in chief of the Forward, the age-old Jewish news organization that has been an essential source for many American Jews as they process the latest headlines. It’s a tricky job—the Forward is “independent” and “nonideological,” as Rudoren puts it, which means the publication’s largely but not exclusively Jewish readership runs the gamut from pro-Israel hard-liners to left-leaning Palestinian activists. Rudoren is also a former New York Times Jerusalem bureau chief who has lived in Israel and reported from Gaza, including coverage of past wars there. I was interested in her perspective on all of this, so we hopped on a Zoom. The following is a condensed and edited transcript of our conversation.
Vanity Fair: Give our readers a sense of your background covering Israel and Gaza.
Jodi Rudoren: I worked at The New York Times for 21 years and the LA Times for six years before that. In 2012 I became the Jerusalem bureau chief of The New York Times. I was in that job through the end of 2015, which means I covered two wars in Gaza. There was an eight-day war in 2012. I was on the ground in Gaza for that entire thing and a little bit after. And then in 2014 there was the summer war. I spent most of that war in Jerusalem and around Israel, but I was in Gaza some during that conflict. I’ve probably been to Gaza eight or 10 times over those almost four years, many of them peacetime. I left The New York Times in 2019 to become editor in chief of the Forward, which is a 126-year-old Jewish news organization. We’re a nonideological, independent, nonprofit newsroom. It’s news through a Jewish lens, for a largely Jewish audience, but not entirely.
Safe to say this is the biggest story in your time editing the Forward?
Of course. First of all, there’s the magnitude of the attack and the response. There’s the length [of the conflict], and that it seems likely to continue. And in the last week or so, I think we’ve come to understand, in a deeper way, the repercussions here in the American Jewish world.
Diana Henriques was first stricken in late 1996. A business reporter for The New York Times, she was in the midst of a punishing effort to bring a reporting project to fruition. Then one morning she awoke to find herself incapable of pinching her contact lens between her thumb and forefinger.
Henriques’s hands were soon cursed with numbness, frailty, and a gnawing ache she found similar to menstrual cramps. These maladies destroyed her ability to type—the lifeblood of her profession—without experiencing debilitating pain.
“It was terrifying,” she recalls.
Henriques would join the legions of Americans considered to have a repetitive strain injury (RSI), which from the late 1980s through the 1990s seized the popular imagination as the plague of the modern American workplace. Characterized at the time as a source of sudden, widespread suffering and disability, the RSI crisis reportedly began in slaughterhouses, auto plants, and other venues for repetitive manual labor, before spreading to work environments where people hammered keyboards and clicked computer mice. Pain in the shoulders, neck, arms, and hands, office drones would learn, was the collateral damage of the desktop-computer revolution. As Representative Tom Lantos of California put it at a congressional hearing in 1989, these were symptoms of what could be “the industrial disease of the information age.”
By 1993, the Bureau of Labor Statistics was reporting that the number of RSI cases had increased more than tenfold over the previous decade. Henriques believed her workplace injury might have had a more specific diagnosis, though: carpal tunnel syndrome. Characterized by pain, tingling, and numbness that results from nerve compression at the wrist, this was just one of many conditions (including tendonitis and tennis elbow) that were included in the government’s tally, but it came to stand in for the larger threat. Everyone who worked in front of a monitor was suddenly at risk, it seemed, of coming down with carpal tunnel. “There was this ghost of a destroyed career wandering through the newsroom,” Henriques told me. “You never knew whose shoulder was going to feel the dead hand next.”
But the epidemic waned in the years that followed. The number of workplace-related RSIs recorded per year had already started on a long decline, and in the early 2000s, news reports on the modern plague all but disappeared. Two decades later, professionals are ensconced more deeply in the trappings of the information age than they’ve ever been before, and post-COVID, computer use has spread from offices to living rooms and kitchens. Yet if this work is causing widespread injury, the evidence remains obscure. The whole carpal tunnel crisis, and the millions it affected, now reads like a strange and temporary problem of the ancient past.
So what happened? Was the plague defeated by an ergonomic revolution, with white-collar workers’ bodies saved by thinner, light-touch keyboards, adjustable-height desks and monitors, and Aeron chairs? Or could it be that the office-dweller spike in RSIs was never quite as bad as it seemed, and that the hype around the numbers might have even served to make a modest problem worse, by spreading fear and faulty diagnoses?
Or maybe there’s another, more disturbing possibility. What if the scourge of RSIs receded, but only for a time? Could these injuries have resurged in the age of home-office work, at a time when their prevalence might be concealed in part by indifference and neglect? If that’s the case—if a real and pervasive epidemic that once dominated headlines never really went away—then the central story of this crisis has less to do with occupational health than with how we come to understand it. It’s a story of how statistics and reality twist around and change each other’s shape. At times they even separate.
The workplace epidemic was visible only after specific actions by government agencies, employers, and others set the stage for its illumination. This happened first in settings far removed from office life. In response to labor groups’ complaints, the Occupational Safety and Health Administration began to look for evidence of RSIs within the strike-prone meatpacking industry—and found that they were rampant.
Surveillance efforts spread from there, and so did the known scope of the problem. By 1988, OSHA had proposed multimillion-dollar fines against large auto manufacturers and meatpacking plants for underreporting employees’ RSIs; other businesses, perhaps spooked by the enforcement, started documenting such injuries more assiduously. Newspaper reporters (and their unions) took up the story, too, noting that similar maladies could now be produced by endless hours spent typing at the by-then ubiquitous computer keyboard. In that way, what had started playing out in government enforcement actions and statistics morphed into a full-blown news event. The white-collar carpal tunnel crisis had arrived.
In the late 1980s, David Rempel, an expert in occupational medicine and ergonomics at UC San Francisco, conducted an investigation on behalf of California’s OSHA in the newsroom of The Fresno Bee. Its union had complained that more than a quarter of the paper’s staff was afflicted with RSIs, and Rempel was there to find out what was wrong.
The problem, he discovered, was that employees had been given new, poorly designed computer workstations, and were suddenly compelled to spend a lot of time in front of them. In the citation that he wrote up for the state, Rempel ordered the Bee to install adjustable office furniture and provide workers with hourly breaks from their consoles.
A computer workstation at The Fresno Bee in 1989 (Courtesy of David Rempel)
Similar injury clusters were occurring at many other publications, too, and reporters cranked out stories on the chronic pain within their ranks. More than 200 editorial employees of the Los Angeles Times sought medical help for RSIs over a four-year stretch, according to a 1989 article in that newspaper. In 1990, The New York Times published a major RSI story—“Hazards at the Keyboard: A Special Report”—on its front page; in 1992, Time magazine ran a major story claiming that professionals were being “Crippled by Computers.”
But ergonomics researchers like Rempel would later form some doubts about the nature of this epidemic. Research showed that people whose work involves repetitive and forceful hand exertions for long periods are more prone to developing carpal tunnel syndrome, Rempel told me—but that association is not as strong for computer-based jobs. “If there is an elevated risk to white-collar workers, it’s not large,” he said.
Computer use is clearly linked to RSIs in general, however. A 2019 meta-analysis in Occupational & Environmental Medicine found an increased risk of musculoskeletal symptoms with more screen work (though it does acknowledge that the evidence is “heterogeneous” and doesn’t account for screen use after 2005). Ergonomics experts and occupational-health specialists told me they are certain that many journalists and other professionals did sustain serious RSIs while using 1980s-to-mid-’90s computer workstations, with their fixed desks and chunky keyboards. But the total number of such injuries may have been distorted at the time, and many computer-related “carpal tunnel” cases in particular were spurious, with misdiagnoses caused in part by an unreliable but widely used nerve-conduction test. “It seems pretty clear that there wasn’t a sudden explosion of carpal tunnel cases when the reported numbers started to go up,” Leslie Boden, an environmental-health professor at the Boston University School of Public Health, told me.
Such mistakes were probably driven by the “crippled by computers” narrative. White-collar workers with hand pain and numbness might have naturally presumed they had carpal tunnel, thanks to news reports and the chatter at the water cooler; then, as they told their colleagues—and reporters—about their disabilities, they helped fuel a false-diagnosis feedback loop.
It’s possible that well-intentioned shifts in workplace culture further exaggerated the scale of the epidemic. According to Fredric Gerr, a professor emeritus of occupational and environmental health at the University of Iowa, white-collar employees were encouraged during the 1990s to report even minor aches and pains, so they could be diagnosed—and treated—earlier. But Gerr told me that such awareness-raising efforts may have backfired, causing workers to view those minor aches as harbingers of a disabling, chronic disease. Clinicians and ergonomists, too, he said, began to lump any pain-addled worker into the same bin, regardless of their symptoms’ severity—a practice that may have artificially inflated the reported rates of RSIs and caused unnecessary anxiety.
Henriques, whose symptoms were consistent and severe, underwent a nerve-conduction test not long after her pain and disability began; the result was inconclusive. She continues to believe that she came down with carpal tunnel syndrome as opposed to another form of RSI, but chose not to receive surgery given the diagnostic uncertainty. New York Times reporters with RSIs were not at risk of getting fired, as she saw it, but of ending up in different roles. She didn’t want that for herself, so she adapted to her physical limitations, mastering the voice-to-text software that she has since used to dictate four books. The most recent came out in September.
As it happens, a very similar story had played out on the other side of the world more than a decade earlier.
Reporters in Australia began sounding the alarm about the booming rates of RSIs among computer users in 1983, right at the advent of the computer revolution. Some academic observers dismissed the epidemic as the product of mass hysteria. Other experts figured that Australian offices might be more damaging to people’s bodies than those in other nations, with some colorfully dubbing the symptoms “kangaroo paw.” Andrew Hopkins, a sociologist at the Australian National University, backed a third hypothesis: that his nation’s institutions had merely facilitated acknowledgment—or stopped suppressing evidence—of what was a genuine and widespread crisis.
“It is well known to sociologists that statistics often tell us more about collection procedures than they do about the phenomenon they are supposed to reflect,” Hopkins wrote in a 1990 paper that compared the raging RSI epidemic in Australia to the relative quiet in the United States. He doubted that any meaningful differences in work conditions between the two nations could explain the staggered timing of the outbreaks. Rather, he suspected that different worker-compensation systems made ongoing epidemics more visible, or less, to public-health authorities. In Australia, the approach was far more labor-friendly on the whole, with fewer administrative hurdles for claimants to overcome, and better payouts to those who were successful. Provided with this greater incentive to report their RSIs, Hopkins argued, Australian workers began doing so in greater numbers than before.
Then conditions changed. In 1987, Australia’s High Court decided a landmark worker-compensation case involving an RSI in favor of the employer. By the late 1980s, the government had discontinued its quarterly surveillance report of such cases, and worker-comp systems became more hostile to them, Hopkins said. With fewer workers speaking out about their chronic ailments, and Australian journalists bereft of data to illustrate the problem’s scope, a continuing pain crisis might very well have been pushed into the shadows.
Now it was the United States’ turn. Here, too, attention to a workplace-injury epidemic swelled in response to institutional behaviors and incentives. And then here, too, that attention ebbed for multiple reasons. Improvements in workplace ergonomics and computer design may indeed have lessened the actual injury rate among desk workers during the 1990s. At the same time, the growing availability of high-quality scanners reduced the need for injury-prone data-entry typists, and improved diagnostic practices by physicians reduced the rate of false carpal tunnel diagnoses. In the blue-collar sector, tapering union membership and the expansion of the immigrant workforce may have pushed down the national number of recorded injuries, by making employees less inclined to file complaints and advocate for their own well-being.
But America’s legal and political climate was shifting too. Thousands of workers would file lawsuits against computer manufacturers during this period, claiming that their products had caused injury and disability. More than 20 major cases went to jury trials—and all of them failed. In 2002, the Supreme Court ruled against an employee of Toyota who said she’d become disabled by carpal tunnel as a result of working on the assembly line. (The car company was represented by John Roberts, then in private appellate-law practice.) Meanwhile, Republicans in Congress managed to jettison a new set of OSHA ergonomics standards before they could go into effect, and the George W. Bush administration ended the requirement that employers separate out RSI-like conditions in their workplace-injury reports to the government. Unsurprisingly, recorded cases dropped off even more sharply in the years that followed.
Blue-collar workers in particular would be left in the lurch. According to M. K. Fletcher, a safety and health specialist at the AFL-CIO, many laborers, especially those in food processing, health care, warehousing, and construction, continue to suffer substantial rates of musculoskeletal disorders, the term that’s now preferred over RSIs. Nationally, such conditions account for one-fifth to one-third of the estimated 8.4 million annual workplace injuries across the private sector, according to the union’s analysis of Bureau of Labor Statistics reports.
From what experts can determine, carpal tunnel syndrome in particular remains prevalent, affecting 1 to 5 percent of the overall population. The condition is associated with multiple factors unrelated to the workplace, including diabetes, hypothyroidism, obesity, arthritis, pregnancy, and age. In general, keyboards are no longer thought to be a major threat, but the hazards of repetitive work were always very real. In the end, the “crippled by computers” panic among white-collar workers of the 1980s and ’90s would reap outsize attention and perhaps distract from the far more serious concerns of other workers. “We engage in a disease-du-jour mentality that is based on idiosyncratic factors, such as journalists being worried about computer users, rather than prioritization by the actual rate and the impact on employment and life quality,” Gerr, the occupational- and environmental-health expert at the University of Iowa, told me.
As for today’s potential “hazards at the keyboard,” we know precious little. Almost all of the research described above was done prior to 2006, before tablets and smartphones were invented. Workplace ergonomics used to be a thriving academic field, but its ranks have dwindled. The majority of the academic experts I spoke with for this story are either in the twilight of their careers or already retired. A number of the researchers whose scholarship I’ve reviewed are dead. “The public and also scientists have lost interest in the topic,” Pieter Coenen, an assistant professor at Amsterdam UMC and the lead author of the meta-analysis from 2019, told me. “I don’t think the problem has actually resolved.”
So is there substantial risk to workers in the 2020s from using Slack all day, or checking email on their iPhones, or spending countless hours hunched at their kitchen tables, typing while they talk on Zoom? Few are trying to find out. Professionals in the post-COVID, work-from-home era may be experiencing a persistent or resurgent rash of pain and injury. “The industrial disease of the information age” could still be raging.