

mossback

A “mossback” is an extreme conservative, one so bound up in the past and resistant to forward motion that they are (figuratively speaking) covered in moss, like a stone.

The term mossback originally referred to draft dodgers during the Civil War: men from the Carolinas who hid to avoid being called up to serve as soldiers, reportedly willing to stay hidden “until moss grew on their backs.” The term later came to mean reactionaries and hidebound conservatives.

In 1896 the editor of the Emporia Gazette, William Allen White, used the term in his fiery editorial, “What’s the Matter with Kansas?” White later said that he regretted the editorial, which he wrote in a fit of pique after a run-in with a group of populists who backed William Jennings Bryan for president. The editorial lambasted Kansas, saying:

We all know; yet here we are at it again. We have an old mossback Jacksonian who snorts and howls because there is a bathtub in the State House; we are running that old jay for governor. We have another shabby, wild-eyed, rattle-brained fanatic who has said openly in a dozen speeches that “the rights of the user are paramount to the rights of the owner”; we are running him for Chief Justice, so that capital will come tumbling over itself to get into the state.

Years later, President Harry Truman was fond of deriding his political opponents as “mossbacks” who were trying to stand in the way of progress. A New York Times report from 1952 read, in part: “He [Truman] derided critics of his farm and other policies, describing them as ‘mossbacks’ and accusing them of spreading ‘just plain hokum.’ He repeated that these critics were mossbacks and said he could call them by name if anybody asked him to.”

In 1992, when President George H.W. Bush was running for re-election, much was made of his ongoing dispute with the Democratic-controlled Congress. Bush gave a number of campaign speeches in which he lashed out at Congressional spending and complained that Congress was in the hands of special interest groups; for his part, House Speaker Thomas Foley called Bush a “bystander president” and an obstructionist. Bush lamented that he had tried to get along with Congress, claiming that he had stretched out his hand in friendship at the beginning of his term but that “these old mossbacks bit it off.”

Mossback is not an especially common term, so when it appears, it is often for comic effect. The Baltimore Sun, for example, ran a piece discussing the idea that there is no real difference between American politicians of different political parties. The piece read:

Critics of American politics – generally from the left – often say there’s no difference between Democrats and Republicans.

Ralph Nader argued in 2000 that Al Gore and George W. Bush were virtually the same. Europeans like to say that all American politicians are conservative and that Barack Obama and Hillary Clinton would be mossbacks in Sweden or France…

missile gap


“Missile gap” is a phrase used during the Cold War, referring to the belief that the US lagged behind the Soviet Union in ballistic missile capability.

The US and the USSR were engaged in a high-stakes arms race by the 1950s. That race intensified after the Soviets successfully launched Sputnik I, the world’s first man-made satellite, in 1957. On its own, Sputnik did not pose any threat to the US or other nations. However, it was proof positive that the USSR possessed missile technology capable of launching an object into orbit – something the US could not yet do.

With the launch of Sputnik, the US government became convinced that the Soviet Union had the capability to threaten the continental United States with ballistic missiles. That same year, a presidentially-commissioned report on US nuclear policies drew some very pessimistic conclusions about America’s missile readiness compared to that of the Soviet Union. The so-called Gaither Report claimed that the Soviet Union could have a “significant” inter-continental missile capability within two years and that it might be able to strike at the Strategic Air Command’s bomber fleet. The report was classified as top secret, but some of its contents were leaked to the media, feeding the public perception that America faced an existential threat from Soviet missiles.

President John F. Kennedy is closely associated with the concept of the missile gap; JFK talked about the need for the US to build up its ballistic missile arsenal from 1958 onwards, when he was running for re-election to the Senate. Kennedy also criticized President Dwight D. Eisenhower for being supposedly lax on missile defense.

Once he was in office, JFK was informed that in fact there was no missile gap; the US was believed to be on a par with, or even superior to, the USSR as far as ballistic missile technology went. (The intelligence community had revised its earlier, overblown estimates of Russia’s capabilities by the time JFK came into office.) JFK did not publicly retract what he had said, but he joked about his mistake behind closed doors, telling his advisors that “a patriotic and misguided man” had “put that myth around.”

American anxiety about the missile gap reached a peak in 1962 with the Cuban missile crisis, an event that lasted just under two weeks but resonated in the public imagination for far longer. On October 16, 1962, the president was informed that the Soviet Union was building missile launch sites in Cuba. (This came in response to the Bay of Pigs invasion and to the American decision to install its own missile launch sites in Italy and Turkey.) In response, JFK imposed a naval blockade around Cuba which remained in place until November 20, when the US confirmed that all of the ballistic missile systems had been taken apart.

Behind the scenes, during those tumultuous two weeks, the US and the USSR reached a private agreement under which the USSR would dismantle its Cuban missile sites in return for the US doing the same to its own sites in Turkey.

military industrial complex


The “military industrial complex” is a term referring to all the components of a nation’s military establishment, including the private businesses involved in producing weapons and other military equipment.

The term was popularized by President Dwight D. Eisenhower, who used his last official speech to denounce the rise of the military industrial complex. Eisenhower warned that the growth of the defense industry posed a threat to democracy:

In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military industrial complex. The potential for the disastrous rise of misplaced power exists and will persist.

We must never let the weight of this combination endanger our liberties or democratic processes. We should take nothing for granted. Only an alert and knowledgeable citizenry can compel the proper meshing of the huge industrial and military machinery of defense with our peaceful methods and goals, so that security and liberty may prosper together.

Eisenhower tied the problem of the military-industrial complex to what he saw as a larger issue of big government overshadowing individual enterprise and research. He argued that the “technological revolution” was making it harder and harder for the “solitary inventor” to compete with massive research facilities and government laboratories:

Akin to, and largely responsible for the sweeping changes in our industrial-military posture, has been the technological revolution during recent decades.

In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.

Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.

The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present.

Eisenhower himself was a retired five-star general who served as Supreme Commander of the Allied forces in Europe during World War II, where he orchestrated the D-Day invasion of France.

During Eisenhower’s presidency, the US military expanded more than at any other time in the nation’s peacetime history. The military retained a large standing army even after the close of the Korean War; the Cold War between the US and the USSR led to a steady, ongoing increase in military spending on the part of both nations.

The phrase “military industrial complex” was later adopted by many on the left who criticized the power of the defense industry. On the far left, Noam Chomsky argued that Eisenhower had not gone far enough in his warning, and that in fact, the entire modern economy was built around government controls:

I think Eisenhower’s warning was appropriate, but either he didn’t understand or else commentators don’t understand, but the military-industrial complex, as he called it, is actually the core of the modern economy. It’s not specifically military. The reason we have computers, the Internet, telecommunications, lasers, satellites, an aeronautical industry, tourism, run down the list, is because of a technique to ensure that the U.S. is not a free-enterprise economy. Some are more extreme than others in this respect.

merchants of death


“Merchants of death” is a reference to the bankers and arms manufacturers who supplied and funded World War I. The phrase is also used for arms dealers in general, and it has been extended to other industries as well.

The term was first applied to an arms dealer named Basil Zaharoff. Relatively little is known of Zaharoff’s personal life. He was born Basileios Zacharias to poor Greek parents and spent some of his early life in Russia, before moving to Istanbul and then London; he was often referred to as the “mystery man of Europe.” By the end of the 19th century, Zaharoff was acting as one of the leading representatives for the Vickers Company, a British arms manufacturer. And by the time World War I broke out, Zaharoff had made himself a millionaire from arms sales. 

Zaharoff’s life and legacy were complex; as the Smithsonian magazine put it:

Few men have acquired so scandalous a reputation as did Basil Zaharoff, alias Count Zacharoff, alias Prince Zacharias Basileus Zacharoff, known to his intimates as “Zedzed.” Born in Anatolia, then part of the Ottoman Empire, perhaps in 1849, Zaharoff was a brothel tout, bigamist and arsonist, a benefactor of great universities and an intimate of royalty who reached his peak of infamy as an international arms dealer—a “merchant of death,” as his many enemies preferred it.

Zaharoff’s critics, in fact, accused him of plotting to start the Great War just for the sake of extending his profits. Anti-war activists on both sides of the political divide charged that Zaharoff, and other arms dealers, had brought about the war; the question went as far as the Senate.

In 1934, the Senate Munitions Committee met to formally investigate the question of whether arms dealers had, in fact, inappropriately influenced the US government’s decision to enter the first World War. The committee came to be known as the Nye committee, after its chair, Senator Gerald Nye. Nye, a North Dakota Republican, famously said that “when the Senate investigation is over, we shall see that war and preparation for war is not a matter of national honor and national defense, but a matter of profit for the few.” The committee never found any proof of undue influence, however.

In modern times, the term “merchant of death” has been applied to mercenary arms dealers, notably the Russian gunrunner Viktor Bout, who allegedly sold weapons to Colombian rebel groups and to dictators and rebels alike in much of South America, the Middle East, and Africa. Bout’s story was also the inspiration for the movie “Lord of War.”

The phrase “merchants of death” has also been applied to other industries, especially tobacco and oil. A book by Lawrence C. White was titled “Merchants of Death: The American Tobacco Industry.” Other writers have charged that lobbyists for big tobacco and oil firms deliberately cooperated to craft strategies to keep the public in the dark about the dangers involved in their industries.



McCarthyism

McCarthyism takes its name from Senator Joseph McCarthy, who led a campaign against supposed communists living in the United States. McCarthy dominated the so-called “Red Scare” of the 1940s and 1950s, a period when many Americans were afraid that communists had infiltrated the country’s institutions.

The term “McCarthyism” has also come to mean a tendency to make widespread, unfounded allegations against people for political reasons.  

Joseph McCarthy was a World War II veteran who was first elected to the US Senate in 1946. In 1950, the Wisconsin Senator made national headlines when he gave a speech at the Ohio County Women’s Republican Club in Wheeling, West Virginia. McCarthy brandished a piece of paper which, he said, held a list of 205 State Department employees who were communist sympathizers and who were, as he put it, “working and shaping policy” in the State Department.

McCarthy continued to make headlines as he spoke out against what he saw as the growing communist threat. After America entered the Korean War, McCarthy’s viewpoint became more widespread, and in 1953, he was made chairman of the Committee on Government Operations, which allowed him to investigate people, both inside and outside the Federal government, whom he believed to be communist sympathizers. McCarthy’s excesses became notorious. Decades later, Democrats in the Senate requested that the committee’s records from the McCarthy era be made public, in part so that people could learn from them. The request read, in part:

Senator McCarthy’s zeal to uncover subversion and espionage led to disturbing excesses. His browbeating tactics destroyed careers of people who were not involved in the infiltration of our government. His freewheeling style caused both the Senate and the Subcommittee to revise the rules governing future investigations, and prompted the courts to act to protect the Constitutional rights of witnesses at Congressional hearings. Senator McCarthy’s excesses culminated in the televised Army-McCarthy hearings of 1954, following which the Senate voted overwhelmingly for his censure.

The Army-McCarthy hearings were probably the pinnacle of McCarthy’s power, but those same hearings also led to his downfall. In 1954, the senator announced that he wanted to root out communist sympathizers within the US Army, and he did this by organizing televised hearings. During the hearings, McCarthy accused the Army’s lawyer, a man named Joseph Welch, of employing a man who had belonged to a communist group. Welch gave a resounding counter-attack, looking at McCarthy and asking, “Have you no sense of decency, sir? At long last, have you left no sense of decency?”

The televised interaction played in Welch’s favor, sinking McCarthy in public opinion. McCarthy’s image was further damaged when the journalist Edward R. Murrow ran an investigative piece on McCarthy and his methods on the CBS show, “See It Now.” Murrow concluded the show with a brief editorial, a call to conscience which read, in part:

The actions of the junior Senator from Wisconsin have caused alarm and dismay amongst our allies abroad, and given considerable comfort to our enemies. And whose fault is that? Not really his. He didn’t create this situation of fear; he merely exploited it — and rather successfully. Cassius was right. ‘The fault, dear Brutus, is not in our stars, but in ourselves.’

man in the street


“Man in the street” is used to evoke the idea of the average voter, with mainstream political opinions and interests. 

Merriam Webster notes that the phrase was first used in 1831, to mean an average or ordinary person. The phrase likely reflected an increased interest in the political opinions of average men, since they were becoming enfranchised for the first time. Historians estimate that by 1840, most white men in America had the right to vote; before that, most states only allowed white male property owners to cast votes at the ballot box.

Today, journalists often conduct “man on the street” interviews following major news events. Such interviews aim to capture how average citizens feel about major events, what their views and concerns are, and how much they know about recent news.

Some “man on the street” interviews also aim to collect a range of views about the major issues of the day, to see where popular sentiment lies.

In 2016, the late-night TV host Jimmy Kimmel provoked some controversy when he put out a series of “idiot on the street” interviews. Kimmel’s staff interviewed men and women walking down Hollywood Boulevard to see what they knew about the day’s politics, especially about Hillary Clinton and Donald Trump. Kimmel’s team focused on people who seemed to be deliberately lying about what they knew; the segment was known as “Lie Witness News.”

Kimmel’s fans argued that the interviews proved how little the average voter thought about politics, and how easy it was to get people to perpetuate lies. His critics argued that he was taking cheap shots at ordinary people and helping to give liberal voters a sense of superiority over conservatives.

A number of journalists have argued that man on the street interviews (also known as “vox populi” or “vox pop” interviews) are a thing of the past and have outlived their usefulness. The journalist Elise Czajkowski has written about her own mixed experiences, both as a journalist and as an interview subject; she concluded that the man on the street interview tends to be overly simplistic:

“News needs people. Characters, narratives, and commentary are valuable elements of journalism. But by abolishing the man on the street interview, we force ourselves to dig deeper, to find a voice that represents a point of view rather than thrusting Jane Doe into the role of community spokesperson. Engagement means knowing that you have sources to call on deadline who will provide you with those characters. It means letting your audience bring their best to your community reporting. And it means trusting your audience enough to care that you’ve talked to the right person, not just a person.”

Czajkowski is not alone in this opinion. A survey of journalists found that the majority have a poor opinion of vox pop, or man on the street, interviews. Interestingly, the same study found that news organizations continue to require such interviews, undeterred by reporters’ views.



mandate

Broadly, a mandate is the authority that voters confer on an elected official to act as their representative. Usually, though, a political mandate, or “popular mandate,” refers to the idea that a political official has been elected because the public strongly supports their platform and wants to see it enacted.

After Franklin Delano Roosevelt won re-election by a landslide in 1936, he felt confident that the American people had given him a mandate to expand his New Deal policies. As one account of his presidency explains, there was a tension between what businessmen and bankers wanted, on the one hand, and what Roosevelt believed his supporters wanted, on the other:

By 1935 the Nation had achieved some measure of recovery, but businessmen and bankers were turning more and more against Roosevelt’s New Deal program. They feared his experiments, were appalled because he had taken the Nation off the gold standard and allowed deficits in the budget, and disliked the concessions to labor. Roosevelt responded with a new program of reform: Social Security, heavier taxes on the wealthy, new controls over banks and public utilities, and an enormous work relief program for the unemployed.

In 1936 he was re-elected by a top-heavy margin. Feeling he was armed with a popular mandate, he sought legislation to enlarge the Supreme Court, which had been invalidating key New Deal measures. Roosevelt lost the Supreme Court battle, but a revolution in constitutional law took place. Thereafter the Government could legally regulate the economy.

In modern times, virtually every president claims to have been elected by a powerful mandate – no matter how close the actual election was. And, with every election, the press assesses the size of the new president’s mandate. When Obama won re-election in 2012, for example, NPR ran a piece titled “For Obama, Vindication, but not a Mandate.” The piece acknowledged that Obama felt differently about his own victory and believed himself to have a mandate:

Despite the close result in the popular vote nationwide, Obama wasted no time claiming vindication for his ideas. In his victory speech early Wednesday in Chicago, he tied his re-election to two centuries of American progress.

Just a few years later, Donald Trump’s election staggered the media and led to a rash of articles debating whether the new president had a mandate. For some, the mandate was clear; as the AP reported, then-House Speaker Paul Ryan called Trump’s victory “a repudiation of the status quo of failed liberal progressive policies.” Critics, however, argued that Trump did not have a mandate because he had won the electoral college but had failed to win the popular vote.

Sometimes, though, a mandate can be conferred long after a president is elected. In 2002, the Atlantic ran a piece arguing that President George W. Bush had finally won his mandate two years into his term. Bush made the midterm congressional elections a referendum on his presidency, and Republicans swept Congress in that election:

President Bush has finally won his mandate, the one he failed to get two years ago.

Presidents are supposed to have coattails when they get elected, not in a midterm election. Midterms are when presidents are supposed to see their parties suffer setbacks in Congress. That’s been the rule in virtually every midterm since the Civil War.

maiden speech


The first speech that an elected official makes in front of a legislature. The term is most commonly used in the UK and in Commonwealth countries, but it is also used in the US. A maiden speech is also known as an inaugural speech.

In the US, newly elected senators traditionally waited a few weeks or longer before delivering their maiden speech on the Senate floor. The rationale was that new senators should demonstrate humility, and that senior senators, in return, would respect them for their restraint. This tradition of waiting has largely disappeared, although the tradition of paying special attention to a legislator’s first, or maiden, speech remains.

The maiden speech is an opportunity for a legislator to make a bit of a splash and show a national audience precisely who he or she is. One of the most famous maiden speeches in American history is the speech Richard Nixon gave after being elected to the House of Representatives. Nixon used the speech to denounce Gerhart Eisler, an alleged Communist spy who, in Nixon’s words, was “a seasoned agent of the Communist International, who had been shuttling back and forth between Moscow and the United States from as early as 1933, to direct and master mind the political and espionage activities of the Communist Party in the United States.”

Nixon spent most of the speech describing the particulars of Eisler’s situation. But he also called, more broadly, for the federal government to crack down on communist sympathizers. He concluded:

I think that every Member of the House is in substantial agreement with the Attorney General in his recent statements on the necessity of rooting out Communist sympathizers from our American institutions. By the same token I believe that we must all agree that now is the time for action as well as words.

Over a decade later, Senator Ted Kennedy used his own maiden speech to call for civil rights legislation. Kennedy began his speech rather self-consciously, explaining that he had hesitated over whether he had the right to speak as such a new member of Congress. He was also aware that he was occupying his brother’s former seat in the Senate:

Mr. President, it is with some hesitation that I rise to speak on the pending legislation before the Senate: A freshman Senator should be seen, not heard; should learn, and not teach. This is especially true when the Senate is engaged in a truly momentous debate, in which we have seen displayed the most profound skills of the ablest Senators, in both parties, on both sides of the issue.

Kennedy described himself as being forced into speech by circumstances – he could not remain silent, he said, on an issue as important as this one:

I had planned, about this time in the session, to make my maiden speech in the Senate on issues affecting industry and employment in my home State. I still hope to discuss these questions at some later date. But I could not follow this debate for the last 4 weeks—I could not see this issue envelop the emotions and the conscience of the Nation—without changing my mind. To limit myself to local issues in the face of this great national question, would be to demean the seat in which I sit, which has been occupied by some of the most distinguished champions of the cause of freedom.

machine politics


“Machine politics” is a phenomenon of urban politics, one especially prominent in the 19th and 20th centuries.

Political machines are characterized by tight organization and a strong centralized leadership, typically in the form of a “boss.” They operate by dominating the political landscape. The “machine” gets its name from its ability to reliably, even mechanically, turn out the votes needed to get its members elected and its measures passed.

One of the most famous examples of political machines was Tammany Hall, which operated in New York City from 1789 until some time in the 1950s. Tammany Hall opened its doors as a benevolent institution, and in theory the group was concerned with helping out immigrants and the needy; in fact, it acted as a vote-collecting arm of the local Democratic party, playing on ethnic divisions in the city to help get out the vote for the right candidates. 

Like other political machines, Tammany Hall was rife with corruption. It is known for leaders like George Washington Plunkitt, who famously held forth on the difference between honest and dishonest graft (Plunkitt was all for honest graft). Tammany Hall’s best-known leader was probably William Magear Tweed, usually referred to as Boss Tweed.

Tweed, born on the Lower East Side of Manhattan, began his political career as an alderman and quickly worked his way up. He served a term in Congress but was more interested in local New York politics. By 1860, he had established control over the nomination process for every significant elected office in the city and commanded the loyalty of Democratic leaders. Tweed used his connections to exact payments from businesses and ordinary citizens; he had countless schemes involving faked leases, padded bills, and trumped-up fees.

Tweed’s corruption was both notorious and fairly standard; machine politics and corruption often go hand in hand. At the same time, Tammany Hall did act as a safety net for many New Yorkers, providing social services that the government wasn’t equipped to provide. Tammany Hall’s leaders sent baskets of food to the poor; they also helped out any of their supporters who ran into legal trouble. Tammany Hall also helped win the vote for white men who did not own property. The result was a loyal base of voters who could be relied on to support any of Tammany Hall’s favored politicians.

In a book titled “Machine Made: Tammany Hall and the Creation of Modern American Politics,” the historian Terry Golway takes a more positive view of Tammany Hall than is usually seen. Golway argued that Tammany Hall’s critics were often motivated by anti-Irish sentiment. He also argued that Tammany Hall did a lot of good, smoothing the way for the city’s poor and especially for newly arrived immigrants.

Golway told NPR:

Every history of Tammany Hall is told as a true-crime novel, and what I’m trying to suggest is that there’s this other side. I’m arguing, yes, the benefits that Tammany Hall brought to New York and to the United States [do] outweigh the corruption with which it is associated. I’m simply trying to complicate that story… Tammany Hall was there for the poor immigrant who was otherwise friendless in New York.

lunatic fringe

The “lunatic fringe” is the wing of a political or social group that holds more extreme views than the rest of the group, often expressed with greater fervor and fanaticism.

Merriam Webster notes that the phrase was first used in 1913 and is generally used as a pejorative term. Politicians from both parties like to warn the public that their opponent is being controlled by the “lunatic fringe.”

In 2020, President Donald Trump told audiences that his Democratic challenger, Joe Biden, was a puppet of the radical left and that he was being controlled by extremists. “He’s a candidate that will destroy this country,” Trump said. “And he may not do it himself. He will be run by a radical fringe group of lunatics that will destroy our country.”

A few years earlier, when Barack Obama was president, some analysts on the right called the president’s nuclear policy “lunatic fringe” stuff. Speaking to the right wing publication Newsmax, Frank Gaffney of the Center for Security Policy slammed Obama’s plan to drastically cut America’s nuclear arsenal. He said,

“The American people, I trust, believe, pray, have enough common sense to realize this is nuts. This is lunatic, fringe, leftist stuff and it’s likely to make the world a much more dangerous place, rather than less … Exercising his responsibilities as commander in chief, taken to its logical, or illogical, extreme, he could disarm the United States entirely.”

On the left, meanwhile, pundits claimed that a “lunatic fringe” in the Republican party hated President Obama with so much fervor that they had managed to infect the rest of their party. Progressives warned that the lunatic fringe must be reined in – or else the country would suffer:

In a country with our history of assassinations, this mostly subterranean sense of rage in a nation awash with guns and with more than its share of deranged souls is scary, dangerous stuff. Worse still, few of the “mainstream” conservatives have raised their voices against the lunatic fringe while some have even abetted it…. The climate that has been created by those who compare Obama with Hitler, wave provocative signs, or show up at presidential events brandishing weapons is more than worrisome. This nation has been traumatized by too many tragedies to sit back and allow such sentiments to be whipped up by fanatics and lunatics. It’s time to demand that responsible Republicans denounce such tactics.

“Lunatic fringe” is one of the rare political words which is used just as often by ordinary people as it is by professional commentators. Letters to the editor often mention the “lunatic fringe.” This one, for example, denounces the entire Democratic Party as one big lunatic fringe:

The lunatic fringe, also known as Democrats, has been asleep at the wheel since Donald Trump’s election. Instead of planning for a pandemic, they were out of sight and out of their minds. Their two biggest fantasies, Russian collusion and the impeachment fiasco, were an effort in futility.

loneliest job in the world

The “loneliest job in the world” is a reference to the presidency of the United States, supposedly a supremely lonely and isolating job because of the enormous responsibility that it entails.

William Howard Taft, upon handing over power to Woodrow Wilson, warned the newly-elected president that the job would leave him feeling isolated. “This is the loneliest place in the world,” Taft said, referring to the White House. The warning, apparently, didn’t quite hit home; Wilson said later that he had been taken by surprise by the feeling of aloneness. “I never dreamed such loneliness and desolation of heart possible,” Wilson went on to write. 

A few decades later, then-president Harry Truman was asked whether he agreed with Taft’s assessment of the White House. Truman said that he did, and went on to describe all the work involved in the presidency. His day, he said, began at dawn and went on until midnight:

“I get up at 5:30 in the morning and get to that desk at 6 o’clock, and I stay there until eight. Then I come over here and sit at this desk until one o’clock or so, go over there and have lunch with the family – if they are at home – and go back up there and transact some business, and try my best to take a thirty minute nap if I can. I get back over here at 3 o’clock and stay here until the business is wound up…”

The phrase is also closely associated with a 1961 photo of John F. Kennedy by the New York Times photographer George Tames. The photo shows JFK, just months after coming into office, standing at his desk in the Oval Office with his head bowed. The photo’s caption originally read, “Awaiting the arrival of French Ambassador Herve Alphand, the President ‘as is his habit’ snatches a moment to read an official document, leaning over the table.”

Writing in Lawfare, Quinta Jurecic argued that the view of the presidency as lonely grew naturally out of the “Hamiltonian” view of the office, which stressed the importance of an energetic, determined president in order to keep the country on track. Hamilton wrote, “Energy in the Executive is a leading character in the definition of good government. It is essential to the protection of the community against foreign attacks. . . . A feeble Executive implies a feeble execution of the government. A feeble execution is but another phrase for a bad execution; and a government ill executed, whatever it may be in theory, must be, in practice, a bad government.”

Jurecic noted that then-president Barack Obama almost seemed to play up his experience of loneliness and responsibility, something she contrasted with then-candidate Donald Trump — but added that this may have been inevitable:

This Hamiltonian vision of the presidency lends itself well to Obama’s public expressions of anguish and moral seriousness. After all, if Hamilton is right about the presidency, Obama should feel that he, and he alone, is responsible. He should feel deeply the weight of his terrible burden. Perhaps the nature of the office as “the loneliest job” is intrinsic to the singular structure of the presidency itself.

little tin box

“Little Tin Box” is the title of a song in the 1959 musical, “Fiorello,” which told the story of one of New York City’s most famous mayors. Fiorello LaGuardia, a progressive politician who was a strong supporter of the New Deal, was mayor of New York for three terms, serving from 1933 to 1945.

LaGuardia was known for his pro-labor, anti-monopoly stance and for his reformist agenda. During his tenure as New York’s mayor, he created a new city charter, overhauled the city’s police and fire departments, expanded welfare services, and carried out programs aimed at revitalizing the slums. He was also known for public works, notably LaGuardia Airport.

LaGuardia was loved for his quirks, too. During a city-wide strike in which newspapers were not being delivered, the mayor complained that the city’s children were being deprived of their usual comic strips. So, he took to the radio and read each day’s “funnies” out loud.

The musical “Fiorello” won a Pulitzer prize for drama in 1960; it won a number of Tony awards in the same year, including Best Musical and Best Director of a Musical. The musical’s writers, Jerry Bock and Sheldon Harnick, are probably better known today for writing Fiddler on the Roof. Like “Fiddler,” “Fiorello” is concerned with the lives of ordinary people. The musical depicts LaGuardia as a tireless friend of the underdog, who rises from obscurity to become a Congressman, and then mayor of New York.

The Guide to Musical Theater sums it up this way:

Fiorello joins the workers of Nifty Shirtwaists, who are on strike, and convinces them to desert their picket lines and join him at his headquarters to discuss election tactics. There he gives a rousing lecture on the deplorable social conditions of the city among the working classes – sweat shop labour, tyrannical bosses, long hours, low wages, etc. He promises them legal backing should their protests result in arrest.

Running for office, LaGuardia pursues a vigorous campaign by going to all the different ethnic groups that make up the city. He speaks to them in their own language so they are left in no doubt of his resolve to better their conditions. He succeeds in creating an electoral upset when he becomes the first Republican the district has ever sent to Washington.

The best-known song in the musical is “Little Tin Box,” which skewers the corrupt politicians who dominated New York politics when LaGuardia was running for office. The setting is a trial, in which the politicians are depicted as corrupt, dishonest, and endlessly greedy, as the lyrics reveal.

The “little tin box” — like the deduct box — is the piggy bank where the corrupt politicians claim they have been saving their pennies so that they can buy luxuries:

Mr. X, may we ask you a question?
It’s amazing, is it not,
That the city pays you slightly less than fifty bucks a week,
Yet you’ve purchased a private yacht?
“I am positive your Honor must be joking!
Any working man can do what I have done.
For a month or two I simply gave up smoking,
And I put my extra pennies one by one
“Into a little tin box,
A little tin box
That a little tin key unlocks.
There is nothing unorthodox
About a little tin box.

little old ladies in tennis shoes

“Little old ladies in tennis shoes” is a derisive reference to members of the John Birch Society.

In 1961, the California Attorney General’s office investigated the ultra-conservative John Birch Society and determined that the group was paranoid and authoritarian but, ultimately, not dangerous to American society. The report called the Birch Society “pathetic” and described its members as being chiefly governed by fear:

The cadre of the John Birch society seems to be formed primarily of wealthy businessmen, retired military officers and little old ladies in tennis shoes. They are bound together by an obsessive fear of ‘communism,’ a word which they define to include any ideal differing from their own, even though these ideas may differ even more markedly with the ideas of Marx, Engels, Lenin and Khrushchev. In response to this fear they are willing to give up a large measure of the freedoms guaranteed them by the United States constitution in favor of accepting the dictates of their “Founder.” They seek, by fair means or foul, to force the rest of us to follow their example. They are pathetic.

In fact, by the early 1960s the Birch Society had a membership of at least 100,000 Americans and an annual budget of several million dollars. The group was well-known enough that Bob Dylan wrote a song about it, “John Birch Paranoid Blues.” The organization was founded in 1958 by Robert Welch, a retired candy maker from Boston. Welch wanted the group to fight back against what he saw as the threat of Communist infiltration of American society. He named the group after John Birch, an American missionary killed in China in 1945; Welch saw Birch as the first American casualty of global communism.

The Birch Society says that its mission is to “bring about less government, more responsibility, and – with God’s help – a better world by providing leadership, education, and organized volunteer action in accordance with moral and Constitutional principles.” The group opposes US membership in the United Nations and is strongly opposed to the Federal Reserve; the Birch Society also opposes illegal immigration and amnesty programs. 

Membership in the Birch Society declined after its heyday in the 1960s; however, some reports say that the group has been on the rise in recent years. In 2017, a Birch Society spokesperson also told Politico that the organization was seeing its membership rise, although he declined to give specific figures. “There definitely is an increase in [our] activity, particularly in Texas, because Americans are seeking answers, but they can’t quite put their finger on what some of the real problems are,” the spokesperson said.

The Southern Poverty Law Center, which described the organization’s members as “conspiracy theory-loving, U.N.-hating, federal government-despising, Ron Paul-supporting, environmentalist-bashing, Glenn Beck-watching true believers,” also said that some “Bircher” ideas had made it into mainstream Republican thought. The SPLC, citing a long-time researcher named Chip Berlet, argued that those ideas include

the belief that big government leads to collectivism which leads to tyranny; that liberal elites are treacherous; that the U.S. has become a nation of producers versus parasites; that the U.S. is losing its sovereignty to global treaties; that the “New World Order” is an actual plan by secret elites promoting globalization; and that multiculturalism is a conspiracy of “cultural Marxism.”

little group of willful men

“Little group of willful men” is a reference to President Woodrow Wilson’s dispute with a group of anti-war senators in the lead-up to America’s entry into World War One.

The dispute led to the introduction of a cloture rule in the US Senate.

In early 1917, American sentiment was increasingly in favor of entering the war in Europe. A bill which would arm American merchant ships was making its way through Congress; the law would give the ships the power to defend themselves against German submarine attacks. The bill easily passed the House, but when it reached the Senate it was blocked by a small group opposed to the war.

The group was led by Senator Robert La Follette, of Wisconsin, and by Senator George Norris, of Nebraska. On March 4, 1917, those senators organized a filibuster so that the bill could not come up for a vote. That’s when President Wilson spoke up. Angered, the president said that the “Senate of the United States is the only legislative body in the world which cannot act when its majority is ready for action. A little group of willful men, representing no opinion but their own, have rendered the great government of the United States helpless and contemptible.”

Wilson closed his statement by urging the Senate to adopt a cloture rule, which would set a limit on the length of debate so that bills could not be blocked indefinitely by filibuster. Wilson said, “the only remedy is that the rules of the Senate shall be so altered that it can act. The country can be relied upon to draw the moral. I believe that the Senate can be relied on to supply the means of action and save the country from disaster.”

Days later, on March 8, 1917, the Senate met in a special session and agreed to a compromise: a rule that would preserve debate but would allow for cloture in the case of a supermajority. The rule allowed the Senate to end debate only when a two-thirds majority agreed to do so. Invoking cloture remained rare for decades, although it has become far more common in the 21st century. The rules on cloture have also changed, so that only 60 senators are now required to end debate.

The practice of filibustering dates back at least as far as ancient Rome. The Roman senator Cato the Younger famously spoke until the sun went down in order to stall votes on issues he opposed – notably, Cato filibustered a vote that would have allowed Julius Caesar to return to Rome in 60 BC.

Writing in the Atlantic, Rob Goodman and Jimmy Soni argued that the Founding Fathers were aware of the Roman filibuster and that they saw it as a threat; for them, the filibuster was yet another way for a minority to dominate a majority. Goodman and Soni wrote,

“when the filibuster starts to become the rule, rather than the exception, the minority may find itself with more and more power in a Congress that matters less and less. Minority rule will ultimately mean more power for the presidency, the lawyers who draft executive orders, unelected judges, and the federal bureaucracy. Placing limits on the filibuster is the wisest course for any senator who cares about the institution’s future.” 

kitchen cabinet

“Kitchen cabinet” is a reference to a president’s informal circle of advisers, as opposed to the official members of his cabinet.

The term was first used during the presidency of Andrew Jackson. Jackson took office in 1829, after a bruising and divisive election. The president found his cabinet members ineffectual (some say that Jackson, in a kind of power play, purposely appointed lackluster men to cabinet positions). As a result, Jackson turned to his own trusted friends when he wanted advice on politics.

The so-called kitchen cabinet lasted until 1831. In that year, a series of scandals within the administration led to the resignations of both Secretary of State Martin Van Buren and Secretary of War John Eaton. The president ordered the entire cabinet to resign and appointed new, more trusted men to fill their places. As a result, the kitchen cabinet declined in importance.

The concept of the “kitchen cabinet” persisted long after Andrew Jackson’s presidency. Abraham Lincoln had his own close circle of advisers, many of whom were not actually political figures. Lincoln corresponded with newspaper editors like Horace Greeley of the New York Tribune; James Gordon Bennett, of the New York Herald; and Henry Raymond, of the New York Times. These men all gave the president counsel and came to be known as Lincoln’s own kitchen cabinet.

Much later, John F. Kennedy had his own kitchen cabinet. JFK’s advisers included members of his own family – notably his brother, Robert Kennedy. Ted Sorensen, a lawyer and speechwriter, was another of the president’s closest advisers, although he did not serve in the cabinet. Years later, Sorensen described his close relationship with Kennedy, which he saw as a friendship and a meeting of minds:

“Despite all our surface differences—he was a millionaire’s son, a Roman Catholic, a war hero, a Harvard graduate—and I was at the opposite end of almost all of those.  Nevertheless, we found that we wanted this to be a better country, we both believed in public service, we both were interested in public policy, and we both wanted to see a peaceful world.”

In the 21st century, many of Donald Trump’s critics grumbled that the president was listening too closely to his “kitchen cabinet” and that he was isolating himself from the kinds of highly experienced policy makers who could have given him better advice.

Writing in Foreign Policy, Dov Zakheim, a former deputy undersecretary of defense, argued that Trump was relying on “amateurs” rather than on experts. The piece, titled “Beware Trump’s Kitchen Cabinet,” claimed that while other presidents have had kitchen cabinets in the past, the Trump presidency had pushed the executive agencies further away than any previous administration. Zakheim wrote,

“When President Donald Trump has met with, or spoken by phone to, foreign leaders, National Security Advisor Mike Flynn has not always been at hand. But Bannon, the president’s political consigliere; Jared Kushner, the president’s son-in-law; and Steve Miller, Bannon’s deputy and protégé, have always been present. They are clearly the policy advisors of last resort, and, presumably, are in a position to invalidate, or for that matter block, any other inputs the president might otherwise have received.”

Johnson treatment

President Lyndon Baines Johnson was famous for his ability to coerce members of Congress into supporting his legislation. LBJ’s combination of charm, persuasion, and sheer intimidation came to be known as the “Johnson treatment.” 

Johnson was notoriously aggressive, especially when he wanted to achieve one of his political goals. LBJ managed to rise from his simple beginnings in the tiny, rural town of Johnson City, Texas and reach the White House. Historians say that he managed this ascent by ruthlessly seeking out power and never losing sight of his goals. Even while he was a college student, Johnson reportedly declared that he was only interested in dating girls with “rich daddies,” who could presumably give him access to power and money.

LBJ held on to this drive for power throughout his career, even once he had reached the White House. He was highly skilled at identifying people’s insecurities and worming his way into their confidence. As a negotiator, he knew exactly how to wheel and deal, offering each member of Congress just the right concessions to get them on his side. Both over the telephone and in face-to-face meetings, Johnson was a persistent, tireless dealmaker, who juggled a detailed knowledge of the law with an intuitive understanding of what made people tick. And, when all else failed, Johnson did not hesitate to use his physical presence to frighten his opponents.

Congressman Richard Bolling, who experienced the Johnson treatment himself, described Johnson as a man with no natural boundaries, who didn’t hesitate to be rough and even animalistic when it could help him. Bolling said, “I wouldn’t say Johnson was vulgar — he was barnyard.” Johnson had no sense of personal space and treated conversation as a creepy hands-on affair. The biographer Merle Miller learned from Washington Post editor Ben Bradlee that, “You really felt as if a St. Bernard had licked your face for an hour, had pawed you all over.”

New York Times reporter Tom Wicker wrote, years after the fact, about his own experiences with the Johnson treatment, during the years that he spent covering the White House for the Times. Wicker described a time when he was called into LBJ’s office after writing something unfavorable. The president was having his hair cut and stared Wicker down:

I had thought I was on easy terms with the senator, then the vice president. But was this the same garrulous man I had known — this silent, staring president? Whoever it was, I was quickly intimidated, unnerved, reduced to a sort of nothingness by those unblinking eyes, that jowly familiar face turned implacable, that motionless form under the barber sheet, the brooding silence in which I was being regarded, or perhaps measured.

I shuffled and writhed. He still said nothing. Finally I knew I was beaten, and to my shame I mumbled some banality about the nation’s good fortune in having such a man to take over. Only then, as if just noticing my presence, he whipped off the barber sheet, stood up and spoke, as if those interminable moments had never happened.

Forty years later, whenever I remember that first interview with a new president, I still feel diminished by my small experience of the Johnson Treatment.

John Q. Public

John Q. Public is a reference to the ordinary man or woman. His name is used as a shorthand for popular opinion and a personification of the general public.

Katy Waldman described John Q. Public as follows:

He’s an upstanding sort who shovels the ice off his stretch of sidewalk, writes a check to his local ASPCA, and tries to be a loving dad to his 2½ kids. He sits in traffic. He has a particular order in which he reads the newspaper. Pace Hollywood, he looks nothing like Denzel Washington, though occasionally in the morning, freshly shaven, adjusting his tie in front of the mirror, he thinks to himself that he’s not too bad.  

Waldman characterized John Q. Public as a “square who cares,” an upstanding citizen who is reasonably prosperous and is probably a pillar of his community. He can be contrasted with other “everyman” names, like Joe Blow, or Joe Schmo, which usually refer to an ordinary person who’s somewhat down on their luck. 

John Q. Public was invented in 1922; he was the creation of the cartoonist Vaughn Shoemaker. Shoemaker’s John Q. Public was a symbol for the “beleaguered American taxpayer,” according to the New York Times; he first appeared in the Chicago Daily News. The cartoon was later syndicated to over 75 newspapers around the country.

Shoemaker’s cartoons were likely influenced by an earlier cartoon character, “Mr. Common Man,” which first appeared early in the 20th century and was the creation of the political cartoonist Frederick Opper. Opper invented Mr. Common Man while he was working for William Randolph Hearst’s New York Journal, during the presidential campaign of 1900, at a time when the Hearst newspapers were going all-out to critique monopolies and “trusts.” Opper created an alphabet of characters, each representing a fat trust who blithely kicked and beat a hatless little man known as Mr. Common Man.

It’s the economy, stupid

“It’s the economy, stupid” was a phrase coined by James Carville in 1992, when he was advising Bill Clinton in his successful run for the White House.

In 1992, the US was experiencing an economic recession and the incumbent president, George H.W. Bush, was perceived as out of touch with the needs of ordinary Americans. Carville told campaign staffers to hammer on the importance of the economy at every chance they got – he even went so far as to hang a sign in campaign headquarters reading, in part, “the economy, stupid.”

The phrase became a mantra for the Clinton campaign. Since then, it’s turned into a catchphrase which pops up whenever analysts are discussing an upcoming election. The phrase has endless possible variations; it could be “it’s the schools, stupid,” or “it’s the environment, stupid,” or almost anything else. The slogan serves to highlight one key issue and to make it the central focus of a campaign.

In 2004, many Americans were angered by the US invasion of Iraq. “It’s the war, stupid,” read some demonstrators’ signs. News sites ran editorials with headlines screaming the same thing, pointing out that the war was the biggest issue on many voters’ minds. However, Democrats failed to rally behind the slogan. They failed to capitalize on the growing anti-war sentiment, as the New York Times noted:

Agree with him or not, the president does stand for something. He led, and the Democrats followed. The polls, far from rationalizing the Democrats’ timidity, suggest they might have won a real debate had they staged one… the Democratic leaders never united around a substantive alternative vision to the administration’s pre-emptive war against the thug of Baghdad. That isn’t patriotism, it’s abdication.

Today, newspapers still love to use Carville’s old slogan whenever an election comes up. “It’s the economy, stupid” is a reminder that appears in headlines every four years.

In 2019, the Wall Street Journal advised President Trump to “adopt James Carville’s mantra” and talk more about the economy – especially to audiences of color. The piece argued that Trump had done a fine job bolstering the nation’s economy, but that he just hadn’t done enough to tell voters about it:

Mr. Trump has spent a fair amount of time in front of mostly white audiences boasting about what he’s done for black people on the jobs front, and it’s not a good look. A more fruitful approach might be for the president to visit some low-income minority communities in places like Detroit, Milwaukee and Philadelphia, listen to their concerns, engage them in a way that Democrats are not, and talk about what strong economic growth has enabled blacks and others to do for themselves. If there was ever a time for Donald Trump to be channeling James Carville, it’s now.

In 2020, the Boston Globe ran a piece urging Democrats to challenge Trump on the economy; the Globe argued that Democrats need to seize control of the issue, which had so far been in the president’s hands:

No matter what, it’s clear that when it comes to the economy, the Democrats have not yet created a message — or a plan — that will motivate voters to shift how they think about their pocketbooks. Perception is everything, and right now, if it’s the economy, stupid, it’s time for some candidate to get smart.

iron curtain

During the Cold War, the division between western Europe and the Soviet bloc countries was called the “iron curtain.” The iron curtain was never a physical barrier, but served as a metaphor to describe the limit of Soviet influence.

The phrase “iron curtain” may have existed as early as the 19th century. But British prime minister Winston Churchill was the first to use it in its modern sense. In a speech at Westminster College, Missouri in 1946, Churchill declared:

From Stettin in the Baltic to Trieste in the Adriatic, an iron curtain has descended across the Continent. Behind that line lie all the capitals of the ancient states of Central and Eastern Europe. Warsaw, Berlin, Prague, Vienna, Budapest, Belgrade, Bucharest and Sofia, all these famous cities and the populations around them lie in what I must call the Soviet sphere, and all are subject in one form or another, not only to Soviet influence but to a very high and, in many cases, increasing measure of control from Moscow. Athens alone – Greece with its immortal glories – is free to decide its future at an election under British, American and French observation.

Historians generally agree that Churchill’s speech marked the opening of the Cold War. Not only did Churchill clearly lay out the demarcation of Soviet and western power, he also talked about the need to use maximum military strength to defeat the Soviets. Churchill also warned that there were “fifth columns” in the west aligned with the Soviet Union and conspiring to take down western democracies.

Churchill’s speech was enthusiastically received by the US president, Harry Truman, and by his administration. Truman had already begun to argue that the Soviet Union needed to be confronted with a show of strength and that it must be contained. In Russia, Churchill’s speech was described as both imperialist and racist by leaders in the Kremlin.

The iron curtain was, for the most part, a metaphorical division. However, Berlin’s Brandenburg Gate, which stood at the dividing line between communist East Berlin and West Berlin, served as a physical symbol of that division. President Ronald Reagan played on that symbol when he delivered a rousing speech from the western side of the gate in 1987. Reagan implored Mikhail Gorbachev, the Soviet leader, to open the gate, tear down the Berlin Wall and, by doing so, bring the Cold War to an end:

Behind me stands a wall that encircles the free sectors of this city, part of a vast system of barriers that divides the entire continent of Europe. . . . Standing before the Brandenburg Gate, every man is a German, separated from his fellow men. Every man is a Berliner, forced to look upon a scar. . . . As long as this gate is closed, as long as this scar of a wall is permitted to stand, it is not the German question alone that remains open, but the question of freedom for all mankind. . . .

General Secretary Gorbachev, if you seek peace, if you seek prosperity for the Soviet Union and Eastern Europe, if you seek liberalization, come here to this gate.
Mr. Gorbachev, open this gate!
Mr. Gorbachev, tear down this wall!

infrastructure week

In 2017, President Donald Trump announced plans for an “infrastructure week,” a series of high-profile events which were aimed at building support for the president’s trillion-dollar plan to rebuild the country’s highways and bridges.

Said Trump economic adviser Gary Cohn: “We’ve had some achievements to date … but we’re really formally launching the things we’re doing. Next week we’re going to announce a few very interesting things.”

Vice President Mike Pence promised that the week would usher in a “new era for American infrastructure in the United States.” 

However, as The Hill noted later, the whole week was overshadowed by ex-FBI Director James Comey’s gripping testimony on Capitol Hill:

Much of the derailment on the infrastructure rollout has been of President Trump’s own making. He repeatedly veered off message in tweets and during infrastructure-themed speeches, flouting some of White House staffers’ carefully laid plans.

Throughout the Trump administration, the idea of an “infrastructure week” turned into a bit of a running joke, as the White House announced one infrastructure week after another. The Week noted that there had been six “infrastructure weeks” in the period between 2017 and 2019, with little visible output from any of the events. The Week declared:

The words “Infrastructure Week” have become synonymous with any unsuccessful or clumsy attempts to get an actual policy off the ground, as well as with the administration’s odd tendency of pushing infrastructure whenever unfavorable headlines start appearing in the news.

The Week wasn’t alone in its derision. Infrastructure week has become an event that journalists love to hate. In 2019, the New York Times wrote:

At this point in the Trump presidency, “Infrastructure Week” is less a date on the calendar than it is a “Groundhog Day”-style fever dream doomed to be repeated.

And in fact, one “infrastructure week” after another has been derailed by other news stories. The first week, back in 2017, was overshadowed by Comey’s testimony, of course. The following event was overshadowed by the “Unite the Right” rally in Charlottesville, and a later infrastructure week was buried in the headlines about the resignation of White House aide Rob Porter.

influence peddler

An influence peddler is one who uses their political influence to try to win favors for others. An influence peddler is a bit of a wheeler-dealer, trading access in exchange for payment in one form or another.

A similar word for an “influence peddler” might be “lobbyist.” Like lobbyist, influence peddler is a very negative term. Influence peddlers are almost universally despised, and one politician after another has promised sweeping reforms aimed at limiting their powers.

Back in 1980, the Senate issued a report criticizing what it said was undue influence peddling on the part of Billy Carter, who was the brother of then-president Jimmy Carter. William Safire, writing in the New York Times, called the Senate report “an emphatic condemnation of White House venality.” 

And in 1988 Michael Dukakis, the Democratic candidate for the presidency, pledged that he would crack down on lobbyists by making it much harder for retired government officials to use their connections to lobby the government. The Chicago Tribune reported at the time,

“Dukakis promised, if elected, to issue an executive order forbidding former senior officials in his administration to lobby the federal government at any time before he leaves office.

To forestall conflicts of interest, Dukakis’ proposal would expand the scope of current influence-peddling laws, which set delays of one to three years before someone who leaves government service may lobby various agencies.”

A few decades later, Barack Obama campaigned on a promise to increase transparency and to end the backdoor deals and influence peddling which, critics said, were a feature of the Bush administration. In Obama’s first days as president, he issued a series of orders aimed at just that, as the Washington Post reported:

In two executive orders and three presidential directives, Obama laid out stringent lobbying limits that will bar any appointees from seeking lobbying jobs while he is president and will ban gifts from lobbyists to anyone in the administration. He also ordered agencies to presume that records should be publicly released unless there are compelling reasons not to do so, and he loosened restrictions on the release of records related to former presidents and vice presidents.

Politicians tend to accuse the opposing party of influence peddling. It’s unusual for the party in power to admit that lobbyists are a serious and ongoing problem for them. Journalists also like to write about influence peddling, but most of the time they take aim at just one political party at a time.

Once in a while, though, newspapers publish a sort of “plague on all their houses” editorial, condemning the all-around culture of influence peddling. That’s what the LA Times did, for example, in a 2019 article about the relatives of powerful politicians cashing in on their influence. The Times argued that Hunter Biden had traded on his father’s (Joe Biden’s) influence, but that Trump’s children had done much the same thing – as had others: 

Hunter Biden is only the latest in a long line of relatives of elected leaders who appear to have used their names to open doors. In recent decades, Presidents Nixon, Carter, Clinton and George W. Bush all had troublesome family members.

Which brings us to three other children: Donald Trump Jr., Ivanka Trump and Eric Trump.

incumbent



One who already holds a political office. Usually, in US politics, the word incumbent refers to the sitting official who is running for re-election.

A lot of ink has been spilled about whether the incumbent has a better chance of winning elections than a challenger does. In presidential races, at least, the incumbent has historically had a strong advantage. In the course of US history, only ten presidents have run for re-election and lost. 

In the 20th century, the presidents who tried and failed were William Howard Taft, Herbert Hoover, Gerald Ford, Jimmy Carter, and George H.W. Bush. So far, every president in the 21st century has won re-election.

It’s only natural that the incumbent should have an edge over the competition, experts say. Allan Lichtman, a presidential historian at American University, told NPR:

Incumbents have the following advantages. Name recognition; national attention, fundraising and campaign bases; control over the instruments of government; successful campaign experience; a presumption of success; and voters’ inertia and risk-aversion.

The website Open Secrets has also pointed out that incumbent members of Congress have a strong advantage when it comes to being re-elected. In the House of Representatives, re-election rates hover somewhere between 90 and 100 percent most of the time, only occasionally dipping down to around 80 percent. In the Senate, rates are a little bit more volatile, with a few major dips during politically charged years (as in 1980, for example, when President Reagan swept into the White House). Still, even accounting for those dips, the incumbent is overwhelmingly favored.

Most pundits agree that when it comes to electing the president, voters are overwhelmingly making their decisions based on the economy. As NPR noted in early 2020, that fact may hand President Trump the advantage:

Every presidential election revolves around this simple question: Are you better off now than you were four years ago? For most people — at least in terms of the economy — the answer is yes. Unemployment has continued to tick down the last few years, and the stock market is booming. Gross domestic product growth is not what Trump wanted, but fears of a recession have not materialized. And notably, wages have risen faster for low-income workers since 2018 than for others.

Of course, NPR’s piece came out well before the coronavirus pandemic, and its impact on the US and global economy.

It’s worth noting that the presidents who lost re-election in the 20th century almost all presided over struggling economies. Herbert Hoover, of course, came into office in 1929, the year the stock market crashed and ushered in the Great Depression. Jimmy Carter presided over a period of “stagflation” – high inflation combined with stagnant growth – and soaring oil prices, which made it difficult for the government to continue most of its social spending programs.

Last but not least, George H.W. Bush presided during a recession; inflation and oil prices were not a huge problem, but unemployment soared. The president was also widely seen as oblivious to the country’s economic woes – and Bill Clinton, his Democratic challenger, never missed a chance to bring it up. 

imperial presidency

An imperial presidency is one characterized by greater powers than are clearly provided for in the Constitution.

The historian Arthur Schlesinger popularized the term with his book, The Imperial Presidency, published in 1973. Schlesinger’s book focused on what he saw as the abuses of the Nixon administration, and called on Congress to impeach the president for going so far beyond the bounds of his constitutional powers.

Schlesinger argued that, with the end of World War II and the onset of the Cold War, the United States had turned into the most powerful nation on earth. By extension, the US president had become a kind of elected world emperor. More specifically, Schlesinger complained that Nixon was abusing war powers which should have been reserved for Congress.

Since 1973, the term “imperial presidency” has been applied pretty routinely to many administrations, both Republican and Democratic. In 2001, for example, the Cato Institute summed up President Clinton’s tenure by calling him an imperial president. The group argued that Clinton had been “Nixonian” in his foreign policy, and that he had completely bypassed Congress in his bombing of the Balkans and in his threats to invade Haiti:

As President Clinton’s tenure ends, pundits are trying to define the “Clinton Legacy.” Many have focused on the Lewinsky scandal and impeachment, but Clinton may find his legacy in a less sordid but no less shameful aspect of his presidency: his abuse of executive authority in foreign affairs.

Undeclared wars and contempt for constitutional limits on presidential power mark Clinton’s foreign policy. Future historians may well remember Clinton as the man who ensured that the “Imperial Presidency” would not vanish with the end of the Cold War.

A few years later, the New York Times was applying roughly the same language to the George W. Bush administration:

The war is hardly the only area where the Bush administration is trying to expand its powers beyond all legal justification. But the danger of an imperial presidency is particularly great when a president takes the nation to war, something the founders understood well. In the looming showdown, the founders and the Constitution are firmly on Congress’s side.

President Obama’s critics also accused him of abusing executive powers, especially when it came to immigration, relations with Iran, and natural gas. (His critics on the left complained about his allegedly illegal use of drone strikes, too.) As it happens, President Trump is the first president in recent memory to not face accusations of being “too strong.” Still, Trump’s critics complain that, even when the president is in a weak position, he may be misusing the powers of the presidency. At least, that’s what one op-ed in the New York Times suggested:

The president may seem weak, but the presidency remains strong. Mr. Trump has illustrated that even a feeble commander in chief can impose his will on the nation if he lacks any sense of restraint or respect for political norms and guardrails. True, Mr. Trump has not been able to run roughshod over Congress or ignore the constraints of the federal courts. But he has been able to inflict extensive damage on our political institutions and public culture.

I'd rather be right

I’d rather be right…

Henry Clay was a U.S. congressman who eventually served as Secretary of State under John Quincy Adams. Clay also ran for the presidency three times, losing each time. Today, he is probably best remembered for a speech in which he said, “I’d rather be right than be president.”

Clay’s complex stance on slavery probably cost him his chance at winning the presidency. In 1839, Clay was running for the presidency for the third time. After his two failed presidential runs, Clay believed that he might finally have a shot in the 1840 presidential cycle. However, he was having trouble positioning himself on the divisive issue of slavery. Clay considered himself a moderate – he claimed to dislike the institution of slavery, but he also disagreed with the abolitionist movement. As a result, Clay had enemies on both sides of the political divide, and he was often painted as an “extremist” in one direction or another.

In February of 1839, in an effort to prove that he was not a wild-eyed abolitionist, Clay gave a speech on the Senate floor expressing his opposition to the abolitionist movement. Ironically, the speech cost Clay the support of anyone opposed to slavery. Clay must have realized that the speech was costing him any chance he might have had at winning the presidency, which is why he said, “I’d rather be right than be president.” 

Clay was known as the Great Pacificator, or, sometimes, the Great Compromiser. A slave owner himself, Clay described slavery as a great evil and a dark stain on the United States. He called for a gradual end to slavery and advocated relocating freed slaves to Africa. Clay was a member of the American Colonization Society, or the American Society for Colonizing the Free People of Color of the United States. The organization held that freed slaves could never be successfully integrated into American society and that former slaves should be relocated to western Africa.

Clay also brokered the series of compromises — the Missouri Compromise of 1820, the Tariff Compromise of 1833, and the Compromise of 1850 — which put off, for a while, the national crisis over slavery. 

However, Clay was also fiercely opposed to the abolitionist movement, which he saw as extremist and dangerous. And when it came to his own slaves, Clay was unwilling to compromise. In 1829, a woman named Charlotte Dupuy, a slave in Clay’s household, sued for her freedom. Dupuy claimed that her former master had promised her freedom, and she filed a suit with the U.S. Circuit Court. Clay described himself as “shocked and angered” and fought against Dupuy’s claim with all his might.

Decades later, “I’d Rather Be Right” was the title of a Broadway musical. The 1937 musical told the story of a young New York couple who wanted to get married but couldn’t quite afford it. Luckily, President Franklin Roosevelt appears in the play, along with his entire cabinet and a host of his New Deal programs, ready to save the young couple and, by implication, the entire nation.

I am the law

“I am the law” is a phrase attributed to Frank Hague, the mayor of Jersey City from 1917 until he retired in 1947. He is remembered as the ultimate political boss, in an era when bosses ruled local politics.

Hague was famous for bending the law to his own purposes and wielding absolute power over his small corner of the world. In one famous story, he declared, “I am the law.” There are a few different variations of the “I am the law” story, but they all boil down to the same rough idea:

A boy, not yet 16 years old, was caught skipping school. The truant officer hauled him in again and again, but the boy refused to go to class. Eventually, the boy was brought to Hague, who asked him why he wasn’t going to school. The boy explained that he wanted to get a job so that he could help his mother make ends meet. 

Hague was understanding. He turned and asked one of his aides to get the boy a job so that he could earn some money. The aide explained that since the boy wasn’t yet 16, it was against the law for him to work full time.

“Against the law?” Hague is supposed to have hollered. “In this case, I AM THE LAW! Now get this young man a job!” 

Hague himself was expelled from school at the age of 13 for bad behavior. The son of poor Irish Catholic immigrants, he grew up in the rough streets of Jersey City’s “Horseshoe” neighborhood, an area which had been carefully gerrymandered to maximize the Democratic vote. 

Hague won his first election at age 21, when he became a constable. His campaigning style was telling; Hague borrowed 75 dollars from a local bar owner and used it to win friends and votes. Hague’s years as mayor were marked by both a reform-minded agenda, aimed at helping the city’s many immigrants, and also a general disregard for the niceties of the law.

Of course, Hague was not the only politician to set himself above the law. His famous declaration, “I am the law,” is a nice echo of the French absolutist king Louis XIV. Louis once announced, “L’état, c’est moi,” or, “I am the state,” as a way of describing his oneness with the law and the government.

A few decades after Hague’s time, President Richard Nixon expressed a similar sentiment, reasoning that the president had an executive prerogative that allowed him to break the law, or rather to change the definition of the law, at least in wartime or in the case of a national security crisis. “When the president does it, that means it is not illegal,” Nixon told the journalist David Frost. Nixon went on to explain,

I do not mean to suggest the president is above the law … what I am suggesting, however, what we have to understand, is, in wartime particularly, war abroad, and virtually revolution in certain concentrated areas at home, that a president does have under the Constitution extraordinary powers…

hustings



Hustings are the speeches and campaign events associated with an election cycle. “On the hustings” is a synonym for being on the campaign trail.

The word itself derives from the Old Norse word “husthings,” or “house assembly,” which was a meeting of all the men in the household of a nobleman or a king. The word entered Old English as “husting,” where it was used to mean any meeting or tribunal. By the early 18th century, the word was used to mean a “temporary platform for political speeches;” the meaning eventually shifted to include the election process as a whole.

“Hustings” is probably more commonly used in the UK and in Canada than it is in the United States. The BBC notes that the word has a very rough and tumble connotation, implying combat and improvisation:

“The most famous election in literature, at Eatanswill, in Dickens’s The Pickwick Papers, sees the unfortunate Mr Pickwick accidentally pushed up on to the hustings platform where he looks down on a scene ‘from whence arose a storm of groans, and shouts, and yells, and hootings, that would have done honour to an earthquake’.” 

In America, too, the word “hustings” connotes energy and activity, rather than staid fundraisers or TV ads. The word is used to imply that a candidate is busy being a go-getter. In 2016, the Huffington Post described Bernie Sanders in this way:

Bernie had a good week on the hustings, pulling in a whopping 27,000 people to a rally in Washington Square Park, and chalking up his first Senate endorsement to boot.

Ross Douthat, writing about Ted Cruz in the same year, used the phrase to evoke Cruz’s hard working, tireless style:

Cruz will be back, no doubt. He’s young, he’s indefatigable, and he can claim — and will claim, on the 2020 hustings — that True Conservatism has as yet been left untried. But that will be a half-truth; it isn’t being tried this year because the Republican Party’s voters have rejected him and it, as they rejected another tour for Bushism when they declined to back Rubio and Jeb.

In 2020, the Washington Post argued that “sweat equity” plays a major role in elections. Victory often goes to the candidate who’s willing to put in the time to rally voters, the Post argued; this might give President Trump an edge over Joe Biden in the upcoming election:

Biden-in-the-basement has worked well so far, but he may not be able to compete with a fully unleashed Trump on the hustings. Trump is part showman, part chief marketing officer, part bomb-thrower.

Hustings are, in general, an opportunity for throwing political barbs, slinging mud, and casting aspersions on one’s opponents. When a candidate is on the hustings, he or she is not being genteel. The Business Times used the term to evoke the scrappy atmosphere of the 2020 Democratic primary season:

As Senator Kamala Harris pointed out during the hustings, Mr Biden has a long record of supporting legislation which many believe was socially divisive, and which the powerful Black Lives Matter movement may find objectionable.

honeymoon period

honeymoon period

A “honeymoon period” is a period of popularity enjoyed by a new leader. Usually, the term refers to an incoming president. Traditionally, both Congress and news outlets give presidents a bit of a break at the start of their first terms, so that they can ease into the office.

As the FiveThirtyEight blog has noted, incoming presidents tend to be popular – after all, they were just elected by a plurality of Americans. Researchers have found that this popularity translates into political power early in a president’s first term. A new president enters office with a mandate, and Congress is likely to respect this mandate, at least during the first few months of the first term. This means that a president’s first 100 days in office are the ideal time to pass legislation.

Gallup has found that presidential honeymoon periods are getting shorter and shorter. By the last few decades of the 20th century, the typical honeymoon period had shrunk to an average of seven months, down from an average of 26 months for presidents from Truman through Nixon:

“Presidents typically enjoy positive approval ratings during the early stages of their presidencies, commonly known as the “honeymoon” period. Barack Obama is no exception, with ratings that have generally been above 60%. But recent presidents’ honeymoons have typically ended much sooner than those of their predecessors. Whereas presidents from Harry Truman through Richard Nixon spent an average of 26 months above the historical average 55% presidential job approval rating after they took office, presidents from Gerald Ford to George W. Bush spent an average of just seven months above this norm.”

Interestingly, some two-term presidents may actually enjoy two honeymoon periods, benefitting from a bounce in their popularity after being elected to a second term. The Washington Post noted that this had happened to Obama, at least:

President Obama is enjoying a sort of second political honeymoon in the wake of his re-election victory last November with a series of national polls showing his job approval rating climbing from the middling territory where it lagged for much of the last several years…Obama approval is at 52 percent while his disapproval is at 43 percent. That may not seem like much but it marks a significant improvement over where he was for much of 2010 and 2011.

Many pundits claim that President Trump never had any honeymoon at all; the 45th president, they say, faced conflict and criticism from the moment he stepped into office. Opinions are divided as to who carries the blame for that. The New York Times blamed the president, arguing that he had squandered any good will that should have been coming to him by refusing to go out on the road to rally his followers, as previous presidents had done:

President Trump has become a virtual homebody during his first few months in office, largely sitting out the honeymoon period that other presidents have used to hit the road and rally support for their priorities.

The Miller Center noted that Trump had come into office at a time of unprecedented polarization in the country, and that his party held only a slim majority in the House; as a result, the incoming president faced gridlock in Congress. He had also won a majority of the electoral votes but had failed to win the popular vote, which automatically put him at a disadvantage and diminished his “honeymoon” period.

Hizzoner



“Hizzoner” is a nickname used by journalists to refer to big city mayors, especially in New York City. Hizzoner is a contraction of “his honor,” the mayor’s formal title.

Merriam-Webster notes that the term was first used in 1882. William Safire has said that the term was first popularized during the mayoralty of Fiorello LaGuardia; LaGuardia was definitely not a formal figure, so a nickname which played with his office’s formal title sat well with him.

In 2019, when New York mayor Bill de Blasio was running for the presidency, CBS created a feature called “Where’s Hizzoner?” The regular segment tracked de Blasio’s movements and was a response to criticism that the mayor was spending too much time on the campaign trail and not enough governing the city.

New York’s tabloids aren’t generally known for their subtlety. Headlines about the mayor include items like “Hizzoner, the humongous hypocrite for sale,” which ran in the Daily News in 2018 and read, in part, 

“Here’s what Bill de Blasio, campaign-finance-reform champion and world-class hypocrite, said to the guy breaking the law and those systems to get bribe money to him: “Listen, I don’t know, I don’t want to know. Just do whatever you got to do.”

When Michael Bloomberg, the billionaire co-founder of Bloomberg LP, was running for mayor of New York, his own media outlets had to chronicle his campaign. This made for some puzzling writing. “Hizzoner, Mayor Bloomberg?” read a headline in Bloomberg Businessweek:

“So what makes a billionaire media executive who runs a company with 7,200 employees think he can manage the Big Apple and its 8 million citizens? Truth is, Michael R. Bloomberg thinks he can do pretty much anything. Especially when everyone else thinks he can’t. Once he officially announces his candidacy in two months or so, Bloomberg will sell himself to the voters as a political outsider with vision and managerial expertise. “I won’t be beholden to anybody,” he vows.”

Chicago mayors can also be called “Hizzoner” – so can any big-city mayor. For example, the Wall Street Journal reported on a plan to lure businesses to Chicago by offering lower taxes:

“The bright idea comes from Chicago Mayor Richard Daley, who is looking to lure employers from Oregon after that state’s voters approved a huge tax increase last week. The tax hike in Oregon “will help our economic development immediately. You’d better believe it,” Hizzoner told the Chicago Sun Times late last week. “We’ll be out in Oregon enticing corporations to relocate to Chicago.”

In Los Angeles, the Los Angeles Times ran a slightly snarky item titled “Hizzoner Talks!” to announce an upcoming talk by mayor Eric Garcetti, who had been in office just a few months at the time.

A number of plays and movies have used the title “Hizzoner,” usually to recount the biography of one mayor or another. Hizzoner was also the name of a very short-lived sitcom starring Kathy Cronkite and Mickey Deems. The show focused on the ups and downs of life as the mayor of a small midwestern town and ran for one season in 1979.

heartbeat away from the presidency

heartbeat away from the presidency

The phrase “heartbeat away from the presidency” refers to the fact that the vice president will automatically succeed to the presidency in the case of the president’s death, disability, or resignation.

The vice presidency is not a powerful position in itself. The Senate’s own website calls the job “the least understood, most ridiculed, and most often ignored constitutional office in the federal government.” Franklin Roosevelt’s first vice president, John Nance Garner, once said that the title wasn’t worth “a bucket of warm spit.” The position has grown in importance over the years, with Dick Cheney (vice president under George W. Bush) arguably elevating the job to one of real significance. Still, the job doesn’t come with its own powers.

The vice president presides over the Senate and may cast a tie-breaking vote in the case of a deadlock; the vice president may also advise the president on policy matters. Broadly, though, the vice president’s job is to be prepared to take over if anything happens to the president. As one official description of the office puts it:

The primary responsibility of the Vice President of the United States is to be ready at a moment’s notice to assume the Presidency if the President is unable to perform his duties. This can be because of the President’s death, resignation, or temporary incapacitation, or if the Vice President and a majority of the Cabinet judge that the President is no longer able to discharge the duties of the presidency.

Over the course of US history, a total of nine vice presidents have succeeded presidents in the middle of their terms. Eight of those occurred because of a president’s death. One president, Richard Nixon, resigned and was replaced by his vice president, Gerald Ford. Considering that the nation has had only 45 presidents to date, this means that 20% of America’s presidents have been succeeded mid-term by their vice presidents.

Presidential candidates like to jab at the other side’s vice-presidential pick whenever possible. In 2008, when Barack Obama was running against John McCain, McCain surprised many by picking a little-known Alaskan politician as his running mate. Sarah Palin, Alaska’s first-term governor and the former mayor of the small town of Wasilla, was a newcomer to national politics. The Obama campaign released this curt statement when McCain announced that she was joining his ticket:

Today, John McCain put the former mayor of a town of 9,000 with zero foreign policy experience a heartbeat away from the presidency. Governor Palin shares John McCain’s commitment to overturning Roe v. Wade, the agenda of Big Oil and continuing George Bush’s failed economic policies — that’s not the change we need, it’s just more of the same.

The conservative commentator William Kristol jumped to Sarah Palin’s defense. Writing in the New York Times, Kristol reminded his readers that throughout US history, politicians have been fretting needlessly about vice presidential experience. William McKinley’s campaign advisers worried that McKinley’s running mate, Theodore Roosevelt, was far too young and inexperienced to succeed McKinley. In the event, of course, Roosevelt proved them wrong.

Sarah Palin, Kristol argued, was an all-American “Wal-Mart mom” – and that might be a good thing. He wrote, “A Wasilla Wal-Mart Mom a heartbeat away? I suspect most voters will say, No problem. And some – perhaps a decisive number – will say, It’s about time.”

hat in the ring

hat in the ring

Throwing one’s hat in the ring means announcing one’s intention to compete in a contest. In politics, it means running for political office.

The phrase originally comes from boxing, where contestants would literally throw their hats into the boxing ring as a signal that they wanted to join the fight. In boxing, the expression dates back at least to the beginning of the 19th century. 

An article in The Morning Chronicle of London, dated November 30, 1804, read in part:

The fight which we stated a few days since to be about to take place between Tom Belcher, brother to the champion of that name, and Bill Ryan, son of the late noted pugilist, who fought with Johnson some years since, was yesterday decided at Wilsdon Green, on the Edgware Road, the spot where the hard battle was fought between Blake and Holmes, a twelvemonth since, and where Pictoun beat Will Wood in June last. A council was held among the gentry of the fist on Tuesday last, when the misunderstanding respecting the purse to be fought for was adjusted, and the champions agreed that the fight should take place yesterday, instead of Monday next. The champions arrived at Wilsdon Green at eleven o’clock in two hackney coaches. Belcher first threw his hat into the ring over the heads of the spectators, as an act of defiance to his antagonist, who received him in the ring with a welcome smile.

In modern politics, candidates often wait until the last minute to throw their hats into the ring. Their announcements are awaited eagerly, and the press speculates about whether they will or won’t eventually enter the contest. For example, former New York mayor Michael Bloomberg waited months to officially declare that he was running for the 2020 Democratic presidential nomination. The long wait led to seemingly endless press coverage, like this Forex piece which said, wryly,

“For what it is worth, Democrat Michael Bloomberg is still mulling whether to throw his hat in the ring as a Democratic alternative to Elizabeth Warren/Joseph Biden vs. Pres. Trump. This is nothing really new.”

Former vice president Joe Biden also spent months deliberating before finally throwing his hat in the ring and announcing his White House run in April 2019.

Biden first considered running for the presidency in 2016, but decided against it after his son’s death. As the Wilmington News Journal has reported, Biden began reconsidering as soon as President Trump was elected. Essentially, the former vice president spent a few years testing the waters and weighing his chances before deciding to throw his hat in the ring:

By May 2017, he started a political action committee to support Democrats in the upcoming midterm elections. He solicited donors — something he’s never enjoyed — and began mapping out a plan to be a prominent player in the Democratic bid to regain the House and defend difficult seats in the Senate.

As the midterms neared, Biden started getting the feedback he hoped for. In August 2018, he boarded a flight from Washington to New York and a string of passengers encouraged him to run in 2020.

hatchet man

A “hatchet man” is an operative in charge of doing political dirty work — or dirty tricks — both during a campaign and sometimes as part of normal government functions.

The term was popularized during the Watergate scandal. Several of Richard Nixon’s aides, notably Charles Colson and H.R. Haldeman, were known as the president’s hatchet men, charged with taking care of his dirty work.

As the New York Times reported, Colson “caught the president’s eye” and rose in the administration quickly, thanks to his apparent ruthlessness.

His “instinct for the political jugular and his ability to get things done made him a lightning rod for my own frustrations,” Nixon wrote in his memoir, RN: The Memoirs of Richard Nixon. In 1970, the president made Colson his “political point man” for “imaginative dirty tricks.” “When I complained to Colson, I felt confident that something would be done,” Nixon wrote. “I was rarely disappointed.”

Colson hired E. Howard Hunt, a former CIA operative, to spy on Nixon’s political opponents. He also admitted to conspiring to destroy the reputation of Daniel Ellsberg, the former National Security Council member who leaked the Pentagon Papers. He served time in prison, where he said he had experienced a religious awakening, eventually becoming an evangelical leader and forging a coalition of Republican Protestants and Catholics.

happy days are here again

Happy Days Are Here Again

“Happy Days Are Here Again” was Franklin Delano Roosevelt’s official campaign song in 1932. The song remained the unofficial anthem of the Democratic Party for many years.

In 1932, America was mired in the Great Depression. “Happy Days Are Here Again,” with its upbeat lyrics and melody, helped set the mood for FDR’s optimistic candidacy and his promise that good times were coming again.

The song’s lyrics are not subtle. The chorus sings,

Happy days are here again
The skies above are clear again
So let’s sing a song of cheer again
Happy days are here again

The song, written by Jack Yellen and Milton Ager, was first performed by the George Olsen orchestra on Black Thursday, at the very onset of the stock market crash of 1929. Ironically, Yellen, who wrote the song’s lyrics, considered himself a Republican. Yellen and Ager wrote the song for a movie titled “Chasing Rainbows,” about World War I; the song was supposed to evoke the soldiers’ joy when they heard that peace had been made. However, the studio delayed release, so Yellen and Ager shopped it around to different performers. That’s how it came to be performed at New York’s Pennsylvania Hotel, in front of a crowd of ruined stock speculators.

As Time Magazine has pointed out, “Happy Days” became FDR’s campaign song almost by accident. The campaign was originally planning to use “Anchors Aweigh,” the fight song of the US Navy, as its theme. However, the man who introduced FDR at the 1932 Democratic convention delivered a strikingly dull speech and then walked off stage to the strains of “Anchors Aweigh.” FDR’s team desperately wanted to change the mood before the candidate walked onstage, so they asked for a new song. The one chosen was, of course, “Happy Days Are Here Again.”

This was the first time that a pre-existing pop song had been chosen for a political campaign’s theme music. Prior to 1932, campaigns usually hired musicians to write songs for them. William Howard Taft’s campaign, for example, came up with a tune called “Get on a Raft with Taft,” extolling the candidate as

“The man to lead
Our strong and mighty craft
Through storm at sea
To victory…
It’s William Howard Taft.”

The old campaign songs may seem hokey now, but they were certainly full of drama. James Madison’s campaign used a song called “Huzzah for Madison” to tout their candidate and to warn voters that Satan was always on the prowl:

And should the Tories all unite
And join again with British foes;
Though Satan might applaud the sight,
The heavens would soon interpose.

While Jefferson to shade retires
And Madison like morn appears
Fresh confidence and hope inspires
And light again the nation cheers.

In 1964, Lyndon Johnson split the difference between pop hit and commissioned original. His campaign took the new and popular “Hello, Dolly!” and reimagined it as “Hello, Lyndon,” urging the candidate to “promise you’ll stay with us in ’64.”

happy warrior

A politician who is undaunted and cheerful, even in the face of adversity, is said to be a “happy warrior.” 

The phrase comes from an 1806 poem by William Wordsworth, titled “Character of the Happy Warrior.” Wordsworth described the “happy warrior” as a brave, generous, and moral man, who was able to remain virtuous even in the midst of distress. More than anything, though, Wordsworth’s happy warrior is able to stay optimistic and to thrive amidst conflict. As Wordsworth sees it, adversity makes the happy warrior even more joyful:

“if he be called upon to face
Some awful moment to which Heaven has joined
Great issues, good or bad for human kind,
Is happy as a Lover; and attired
With sudden brightness, like a Man inspired…”

Hubert Humphrey, who served as vice president under Lyndon Johnson and later as Senator from Minnesota, was often known as the “happy warrior.” Humphrey is remembered as a progressive and a champion of civil rights. He is also remembered as a sunny and upbeat personality. Decades after his death, Minnesota senator Amy Klobuchar said in a statement, “You can go down the list of landmark federal legislation from the past 60 years, and Hubert Humphrey’s fingerprints are there: civil rights, Medicare, nuclear arms control, the Peace Corps, and countless others. But I think the most important thing about Hubert Humphrey is that he was an optimist, and he believed in America and believed in our democracy.”

A few decades later, Ronald Reagan was also called a “happy warrior.” Even his political opponents grudgingly noted his charm and likeability. After Reagan passed away, the Guardian noted, “Ronald Reagan was a happy warrior whose easy-going “Aw, shucks” style could make people smile who never voted for him. “Wake me up in an emergency,” he used to say, “even if I’m in a cabinet meeting.” Reagan himself liked the phrase “happy warrior”; in 1985 he told CPAC attendees, “We’ve made much progress already. So, let us go forth with good cheer and stout hearts — happy warriors out to seize back a country and a world to freedom.”

The happy warrior phrase gets used on both sides of the political divide. In 2012, Barack Obama won re-election and referred to Joe Biden as a “happy warrior.” In 2018, Rick Perry said the same of Donald Trump: “The onslaught that goes at him, the forces of evil that are arrayed against him, it’s stunning,” Perry said. “And let me tell you, he is a happy warrior. We talk about Ronald Reagan being a happy warrior. Ronald Reagan ain’t got nothing on Donald Trump. This guy is fascinating. His stamina. Watch him on TV. He is amazing.”


handler

In politics, a handler manages a candidate during an election. 

A handler can fill a variety of roles. At the lowest end of the spectrum, a handler can take care of the candidate’s basic needs, fetching cups of coffee or take-out meals. Further up the totem pole, a handler can manage a candidate’s interactions with the media or give advice on the direction the campaign ought to take. Often, “handler” is used interchangeably with “PR expert.”

A piece in New York Magazine’s “Workplace Confidential” described some of the work involved in “handling” a candidate:

A lot of the day-to-day work is helping the candidate improve. Is he or she getting sharper on the stump? It’s about practice and a willingness by the candidate to literally watch themselves and watch other people. Oppo[sition] research is one of the fun parts of the game. It’s easier these days to get the stuff out there, for sure. There are so many outlets, and somebody’s going to run with it. You just need to make sure the reporter you give it to isn’t going to waste it on a tweet.

Of course, the term “handler” is often seen as pejorative, probably because it’s closely associated with animals. An animal handler is responsible for every aspect of an animal’s welfare, from feeding and exercise to proper training and socialization. An animal handler might also show a dog or a horse in a competition. Unsurprisingly, most people don’t want to be compared to zoo animals, and so avoid describing anyone on their staff as a “handler.”

The term “handler” is also associated with boxing. The person (usually a man) who trains a prizefighter and coaches them while they’re in the ring is known as a handler. But politicians also don’t want to be compared to prizefighters, most of the time. This is probably why, as William Safire has pointed out, most politicians refer to their handlers as “consultants,” or “advisers,” or “aides.” 

Generally, the term “handler” is aimed at the other side: politicians talk about their opponents as having handlers, and journalists write, often critically, about a particular politician’s handlers. It’s an attention-grabbing word with a subtly negative flavor; it has the added virtue of being short enough to fit in a headline. So, journalists can write headlines like “Trump is ‘a full-blown lunatic’, says ex-handler Scaramucci.” Or, authors can give their books titles like “Obama Unmasked: Did Slick Hollywood Handlers Create the Perfect Candidate?”

Sometimes, politicians would rather just not have any handlers at all. In 1988, Dan Quayle was running for vice president and was facing criticism for his overly scripted and “robotic” public appearances. Quayle, frustrated, announced that he was breaking free from his political handlers, as the New York Times reported:

“I just said, Lookit,” he recalled today when asked to explain the change to the reporters traveling with him. “I said I’ve done it their way this far and now it’s my turn. I’m my own handler. Any questions? Ask me.”

An advance man is one type of handler.

gutter flyer

A “gutter flyer” is a political attack ad, traditionally distributed in paper form. It is also typically anonymous, so that nobody can be held accountable for it or asked to verify the information contained in it.

Gutter flyers are a prime example of mudslinging and negative campaigning. 

In 1963, opponents of President John F. Kennedy distributed around 5,000 copies of a flyer to people in Dallas, Texas. (The distribution came just ahead of a presidential visit to Dallas.) The flyer read “Wanted for Treason” and accused JFK of a long list of crimes, including “betraying the Constitution,” being lax on communism, and appointing “anti-Christians” to federal jobs.

Attack ads and gutter flyers are almost as old as the United States itself. In 1828, Andrew Jackson ran for president in what historians have called the first modern American campaign. Jackson’s opponents circulated a flyer decorated with coffins and depicting the former general as a killer responsible for the deaths of his own men, in addition to those of Native Americans and lawbreakers. In response, Jackson’s supporters depicted John Quincy Adams as a political insider who had served as the Czar’s pimp while he was a diplomat in Russia.

Because gutter flyers are usually anonymous, they can be an effective way to spread dirt about a candidate without getting anyone’s hands dirty. Politicians normally distance themselves from political mudslinging, wanting to give the impression that they are above the fray. Gutter flyers are a fixture in American elections at every level, from the city council to the presidency. However, they’re also routinely denounced. Virtually every election cycle includes a good deal of hand-wringing about how politicians are slinging mud at one another more than ever before.

All of this explains why it’s relatively rare for a politician to take responsibility for a gutter flyer. It does sometimes happen, of course. In 2000, Bill Bradley was running for the Democratic presidential nomination against Al Gore. Bradley told CNN that he was angry at the way Gore’s supporters had been behaving; Bradley claimed that they had literally thrown mud at Bob Kerrey, who was campaigning on Bradley’s behalf. Bradley was especially indignant because, he said, he had owned up to his own campaign’s missteps. “When my campaign in New Hampshire put out a flyer that I didn’t like, I took responsibility for it and apologized,” he said. “When this kind of incident occurs, you have to take responsibility for it and apologize.” 

In the United States, libel laws tend to favor the defendant, making it difficult to sue anyone for the content of a political flyer. In many other countries, however, libel laws tend to favor the plaintiff. In Canada, for example, the mayor of Dieppe, New Brunswick was able to sue two of his constituents after he found that they were behind a brochure that criticized him. Mayor Yvon Lapierre sued the two men for defamation and eventually settled; the pamphlet they circulated claimed that he had broken the Municipalities Act and that he was mismanaging money and contracts.

guns before butter

“Guns before butter” refers to the debate over how governments should use their revenue: should resources be used to build up the military, or should they be spent on domestic programs?

The concept of “guns before butter” was probably first laid out by William Jennings Bryan, the populist Democrat who ran unsuccessfully for the presidency three times. Bryan served as Secretary of State in the cabinet of Woodrow Wilson but resigned in protest in 1915, convinced that Wilson was choosing military preparedness over peace. (Wilson was responding to the sinking of the Lusitania and the build-up to World War I.)

Decades later, the Nazi party had its own twist on the question of guns and butter. “Guns will make us powerful; butter will only make us fat,” declared Hermann Göring, who directed the Nazi rearmament economy. Göring took charge of economic planning after Hitler sidelined Hjalmar Schacht. Schacht had implemented large-scale public works and had overseen a dramatic improvement in Germany’s economy, but he ran afoul of Hitler because he was critical of the country’s ever-increasing military spending.

The phrase was further popularized by the economist Paul Samuelson, the first American to win the Nobel Prize in economics. Samuelson was the author of a widely-used textbook in which he explained, among other things, that resources are finite and that budgets are a series of decisions about priorities. What you spend on guns, you won’t be able to spend on butter, in other words.

Over the years, “guns before butter” has become a shorthand to express the federal government’s dilemma over how to allocate funds. In 2014, Reuters ran a blog titled “Obama learns LBJ’s tough lesson: You can have guns or butter, not both.” The piece argued that Obama had run into the same problem as President Johnson had, decades earlier: his ambitious social programs had come into conflict with military reality. A few years later, Slate complained that the Trump administration’s budget was “all guns, no butter.” Slate grumbled that the spending far exceeded the actual needs of the military, and that the money would be better spent funding the State Department so that aims could be achieved through diplomacy, rather than through war.

Of course, similar conflicts exist throughout the economy. In 2018, the Economist pointed out that California’s wine growers were being hurt by the legalization of marijuana; as a result, local governments were moving to restrict cannabis production. The magazine wrote,

“Booze and drugs usually belong together like Fred and Ginger. But not, it seems, in California’s wine region. Wine-makers are fretting that recreational marijuana use, which became legal in the state in January, could challenge their dominance of what is delightfully known as people’s “intoxication budgets”. They also complain that they can no longer afford seasonal labour to harvest their grapes because workers have better-paid, year-round jobs on cannabis farms. Sonoma County, one of the state’s main wine-producing regions, recently imposed restrictions on who may grow weed, and where.”

gunboat diplomacy

The practice of backing up diplomatic efforts with a visible show of military might. A nation using gunboat diplomacy is making use of implicit military threats to achieve its policy objectives. 

A gunboat was a relatively small, easily maneuvered ship that could navigate shallow waters and was fitted with heavy weapons.

The most obvious examples of gunboat diplomacy come from the 19th and early 20th centuries. In 1854, Japan and the United States signed the Treaty of Kanagawa, opening up trade between the two nations for the first time in 200 years. The agreement came about after Commodore Matthew Perry led a naval squadron to Tokyo Bay. As the US State Department has put it,

“Perry arrived in Japanese waters with a small squadron of U.S. Navy ships, because he and others believed the only way to convince the Japanese to accept western trade was to display a willingness to use its advanced firepower.”

President Theodore Roosevelt is often credited with expanding America’s use of gunboat diplomacy. Roosevelt famously said that his diplomatic motto was to “speak softly and carry a big stick,” which, he said, meant that the nation had to be ready to back up words with force. Roosevelt built up the US military and made a routine practice of showing off the nation’s might as a way to pre-empt potential challenges. 

In order to show off America’s naval power, Roosevelt sent a naval fleet around the world, on a tour which lasted 14 months. The fleet was known as the Great White Fleet (its ships were painted white, instead of the usual gray), and consisted of 16 battleships manned by 14,000 sailors. The fleet set out on December 16, 1907 and concluded its journey on February 22, 1909. The Great White Fleet called at Hawaii, New Zealand, Australia, Japan, and Egypt, before continuing on to Italy and Gibraltar. (Along the way, the sailors provided assistance to victims of an earthquake in Sicily.)

In theory, the era of gunboat diplomacy ended with Franklin Roosevelt’s first term. FDR announced his “good neighbor” policy in his first inaugural address, vowing that “in the field of world policy I would dedicate this nation to the policy of the good neighbor—the neighbor who resolutely respects himself and, because he does so, respects the rights of others.”

In reality, of course, the US has never completely abandoned the show of force. In 2011, the New York Times summed up the Obama administration’s activities in Asia:

“the Obama administration has been an active practitioner of gunboat diplomacy, a term that refers to achieving foreign-policy objectives through vivid displays of naval might. Last fall, Mr. Obama sent the aircraft carrier George Washington to the Yellow Sea for joint exercises with South Korea, sending a message to both North Korea and its key backer, China. The move echoed the Clinton administration’s decision in 1996 to send the Seventh Fleet to warn China against attacking Taiwan.”

More recently, the Heritage Foundation noted approvingly that the Trump Administration was using gunboat diplomacy toward Iran:

“The U.S. is not the world’s policeman or its babysitter, but it doesn’t want to be blindsided by bad actors who think Washington is so preoccupied elsewhere that they can take advantage of the situation. Thus, the U.S. has to demonstrate it is present and capable of acting where it needs to.

The deployment to the Gulf will be a deterrent to conflict because it shows the world that the U.S. will act wherever necessary to protect its vital interests. Testing the U.S. is the last step any adversary should want to take, Tehran included.”


gridlock

In politics, gridlock is a situation in which the government is unable to pass new legislation, often because the presidency and the Congress are controlled by different political parties.

As the Brookings Institution has pointed out, gridlock has been around for as long as the United States, if not longer. Alexander Hamilton complained bitterly about the trouble the Continental Congress had in coming to an agreement; the debates between the Federalists and the Republicans were as fierce as any debates today.

The late Supreme Court justice Antonin Scalia argued that gridlock has gotten a bad rap; in fact, Scalia said, gridlock is just one more necessary part of the founding fathers’ plan. “Gridlock is what our system is designed for,” he told the president of the Newseum. At the same time, Scalia did point out that the Supreme Court operates more smoothly than the rest of the federal government. “We have to act. We can’t just say, ‘We haven’t decided about this case, so go away.’ Sooner or later you gotta vote, so there it is. Congress doesn’t have to do that…That’s the principal reason people don’t accuse us of gridlock. They accuse us of a lot of other stuff.”

Few people seem to share Scalia’s sunny view of gridlock. Journalists and politicians periodically complain that gridlock is making it impossible to solve the most serious problems of our day. In 2017, the Daily Beast went so far as to argue that political gridlock is “killing us, literally.” The blog argued that gridlock and lack of political will were allowing politicians to dodge dealing with issues like gun control and the soaring national debt: “our political system is grinding to a halt and producing more demagoguery than governance. Political gridlock is killing us. Literally.”

Similarly, in 2019, the Brookings Institution issued a report warning that gridlock was likely to destroy the US economy, or at least to put a major dent in it. When parties can’t reach political compromises, the report said, it means hold-ups on issues like tariffs, infrastructure projects, and budget balancing: “Put bluntly, when political discord leads to infrastructure failure, it doesn’t just deepen our distrust of government—it also takes our economy down with it.”

Gridlock, unsurprisingly, increases with the rise of partisanship. As America grows more politically divided, so does the federal government, making it tougher to reach compromises. Business Insider reported, for example, that the Trump administration was unable to pass any legislation that required bipartisan support. Without the support of any Congressional Democrats, the administration relied heavily on executive orders and on other actions that didn’t require Congress.

The American public is deeply pessimistic about the prospects of breaking gridlock. A Pew Research Center poll carried out in 2018 (just after the midterm elections) found that a majority of Americans believed that the president and Congress would fail to get legislation passed because of persistent gridlock. Most of the people surveyed believed that partisanship was likely to either stay at the same level or get worse over the coming years. Ironically, Pew also reported that most Americans were happy with the results of the midterm elections.

Great Society

The Great Society was a sweeping set of proposals for social reform, put forward by President Lyndon Johnson in 1964 and aimed at improving access to education, good jobs, and healthcare for ordinary Americans.

Johnson had already proposed a “War on Poverty” during his State of the Union address, warning that “many Americans live on the outskirts of hope — some because of their poverty, and some because of their color, and all too many because of both. Our task is to help replace their despair with opportunity.”

A few months after declaring war on poverty, Johnson set out his vision for a “Great Society” in two speeches delivered in 1964, at Ohio University and the University of Michigan. He argued that the nation needed to establish a level playing field so that all Americans had an equal chance at success. 

In a speech to the graduating class at the University of Michigan, Johnson described the “Great Society” he wanted to see America transform into. The picture he painted was every bit as utopian as John Winthrop’s “city upon a hill.” Johnson said:

The Great Society rests on abundance and liberty for all. It demands an end to poverty and racial injustice, to which we are totally committed in our time. But that is just the beginning.

The Great Society is a place where every child can find knowledge to enrich his mind and to enlarge his talent. It is a place where leisure is a welcome chance to build and reflect, not a feared cause of boredom and restlessness. It is a place where the city of man serves not only the needs of the body and the demands of commerce but the desire for beauty and the hunger for community. It is a place where man can renew contact with nature. It is a place which honors creation for its own sake and for what it adds to the understanding of the race. It is a place where men are more concerned with the quality of their goals than the quantity of their goods.

As part of the Great Society initiative, Congress passed the Civil Rights Act, banning all discrimination based on race and gender in the workplace. The Act also banned segregation in any public facility. Congress also passed the ambitious Economic Opportunity Act of 1964. That law created vocational training and jobs programs aimed at getting more Americans good jobs. The Johnson administration also dedicated funds to improve schools and set up preschool programs, and made it easier for working and middle class Americans to attend college.

Historians continue to debate whether the “Great Society” ever achieved its goals. Johnson’s administration became bogged down in the Vietnam War, which drew funds and attention away from the president’s domestic agenda. 

One of the Great Society’s staunchest enemies was Ronald Reagan. In 1966, when Reagan was preparing to run for governor of California, he delivered a speech denouncing the Great Society and warning against “an unprecedented federalization of American life” and a “welfare society.” In 1983, after becoming president, Reagan called the Great Society “the central political error of our time,” charging that its architects had treated “government and bureaucracy” as “the primary vehicle for social change.”

great debates

The “great debates” were a series of public debates between Abraham Lincoln and Stephen Douglas. In 1858 Douglas, an Illinois Democrat, was running for re-election to the US Senate. Lincoln, a Republican, challenged him. The two held a series of seven debates which focused on the issue of slavery.

In the 1850s, Americans were at loggerheads over slavery and, especially, over the question of whether it would be permitted in the newly added territories. The recent Mexican War had added new territories to the country, bringing the issue to the forefront. Stephen Douglas came down on the side of letting slavery spread into the new territories. His Kansas-Nebraska Act, introduced in 1854, had ended the ban on slavery in the northern territories of the United States. Douglas championed local rule, or “popular sovereignty,” which would allow the settlers in each territory to decide whether they wanted to allow slavery.

Lincoln, for his part, was an increasingly outspoken opponent of slavery who was looking to burnish his anti-slavery credentials. His debates with Douglas were a chance to get nation-wide attention and position himself as a rising star in the Republican party. By all accounts, people at the time realized that the Lincoln-Douglas debates would be watched and remembered for a long time. As Lincoln said, the issues would be discussed long after “these poor tongues of Judge Douglas and myself shall be silent.” A newspaper at the time wrote, “The battle of the Union is to be fought in Illinois.”

During the fifth of the seven debates, Douglas defended the institution of slavery by arguing that the founding fathers had never intended the Declaration of Independence to apply to everyone. In Douglas’s view, the founding fathers had deliberately set up two tracks. When they talked about rights, Douglas argued,

“They referred to white men, to men of European birth and European descent, when they declared the equality of all men. I see a gentleman there in the crowd shaking his head. Let me remind him that when Thomas Jefferson wrote that document, he was the owner, and so continued until his death, of a large number of slaves.” Douglas went on to reason that by arguing for equal rights for slaves, the abolitionists were calling out the founding fathers as hypocrites: “It must be borne in mind that when that Declaration was put forth, every one of the thirteen Colonies were slaveholding Colonies, and every man who signed that instrument represented a slave-holding constituency. Recollect, also, that no one of them emancipated his slaves, much less put them on an equality with himself, after he signed the Declaration.”

Lincoln responded by framing slavery as a moral issue, rather than a legal one. He argued that once slavery was acknowledged as a moral wrong, it would be impossible to look the other way. He went on:

Now, I confess myself as belonging to that class in the country who contemplate slavery as a moral, social and political evil, having due regard for its actual existence amongst us and the difficulties of getting rid of it in any satisfactory way, and to all the Constitutional obligations which have been thrown about it; but, nevertheless, desire a policy that looks to the prevention of it as a wrong, and looks hopefully to the time when as a wrong it may come to an end.

grass will grow in the streets

“Grass will grow in the streets” is a gloom-and-doom phrase sometimes used by politicians to imply that the country will go to economic ruin if they don’t win election, or if their own plan doesn’t prevail.

The phrase is sometimes credited to Herbert Hoover. However, the populist William Jennings Bryan used the expression decades before Hoover. In his famous “cross of gold” speech, delivered at the 1896 Democratic convention, Bryan argued that America’s farms were more crucial to the country’s economy than the coastal cities were. He also warned that if America didn’t switch to a silver standard (making it easier for poor farmers to repay their debts), the country’s farms would be destroyed, leaving the whole country economically crippled. Bryan thundered,

“You come to us and tell us that the great cities are in favor of the gold standard; we reply that the great cities rest upon our broad and fertile prairies. Burn down your cities and leave our farms, and your cities will spring up again as if by magic; but destroy our farms, and the grass will grow in the streets of every city in the country.”

In 1932, Herbert Hoover was facing the Great Depression and a tough reelection challenge from Franklin Roosevelt. Hoover echoed Bryan’s words, warning that the country would be destroyed if his policies weren’t followed. (Ironically, Hoover was a staunch defender of the gold standard.) With the election coming up, Hoover took to the radio and declared that if Roosevelt were elected, then

“the grass will grow in the streets of a hundred cities, a thousand towns; the weeds will overrun the fields of a thousand farms….” 

Since then, politicians have used variations on the phrase to predict doom and gloom if they didn’t get their way. In 1992, an op-ed in the Washington Post compared then-incumbent President George HW Bush to Herbert Hoover. Like Hoover, Bush was facing a tough re-election fight amid a troubled economy. And, the Post argued, Bush was similarly lashing out at his opponent by warning that the country would be ruined if people voted for Clinton. Bush talked about “Main Street” suffering in much the same way as Hoover talked about grass growing in the streets.

“President Hoover didn’t know how to cope with the Depression so he attacked his opponent. “Grass would grow in the streets of 100 cities,” he said, if New York Gov. Roosevelt were elected. The other day President Bush was predicting “misery on Main Street” if Arkansas Gov. Bill Clinton were elected. The GOP then trotted out ex-President Reagan doing the same for Bush.”

In 2008, Barack Obama used similar language to contrast the needs of ordinary Americans with the interests of the financial elite. Like William Jennings Bryan, Obama warned that what hurts ordinary Americans (“Main Street”) will ultimately also hurt the financial industry (“Wall Street”):

Too often, over the last quarter century, we have lost this sense of shared prosperity. And this has not happened by accident. It’s because of decisions made in boardrooms, on trading floors and in Washington. We failed to guard against practices that all too often rewarded financial manipulation instead of productivity and sound business practices. We let the special interests put their thumbs on the economic scales. The result has been a distorted market that creates bubbles instead of steady, sustainable growth; a market that favors Wall Street over Main Street, but ends up hurting both.



grassroots

The grassroots are the ordinary people in a region, or in a political party. The “grassroots” level is the opposite of the leadership level. In politics, having grassroots support means having the backing of the people, rather than of party bigwigs. 

A grassroots movement, or campaign, is one which organizes people at the most local level to take political action. This could mean advocating for a cause, protesting a policy, or rallying around a particular candidate. Often, a grassroots effort mobilizes people to turn out and vote. Grassroots actions can also include contacting members of Congress or signing petitions for change.

Grassroots organizing usually bypasses traditional channels like television and radio. Instead, organizers rely on face-to-face meetings, telephone calls, and especially on social media and other internet-based outreach to mobilize people.

Barack Obama may have run the first modern grassroots presidential campaign. Of course, he was far from the first president to seek out the support of “ordinary Americans,” but Obama’s New Media team was innovative in its approach. One former Obama organizer described the energy of the first campaign:

Back then, we called ourselves the New Media team—and valued our artists, filmmakers, writers and online community builders as highly as our Google-trained data analysts. Our team culture was disciplined and yet explosively creative. Densely packed into our cubicles, we finished each other’s sentences and fed off the energy of our supporters as they built a movement that was going to bring change to Washington… We drove across the country, spending hours and sometimes days interviewing people about their lives. We often stayed in “supporter housing” instead of hotels—talking late into the night with our hosts in kitchens from Oregon to Mississippi to New Hampshire.

Years later, Vermont Senator Bernie Sanders ran a series of grassroots campaigns for the presidency. Massachusetts Senator Elizabeth Warren was also generally seen as a grassroots candidate.

What do you call a grassroots campaign which is not, in fact, driven by the grass roots? Pundits have a name for it – astroturf. Sometimes a political movement masquerades as a community-based initiative when, in fact, it is directed by a small group of people in power. The term astroturfing may have first been used in 1985, when then-Senator Lloyd Bentsen complained that he was getting piles of letters from constituents who seemed to have been mobilized by the insurance lobby. In the internet age, of course, it’s easier than ever to create the illusion of broad-based support for a cause, and astroturfing has flourished accordingly.

grand design

A “grand design” refers to any kind of deliberate plan of action. In politics, the term is usually used to mean an overarching strategy or a long-term plan.

A grand design implies long-term thinking.

The opposite of a grand design, of course, is a series of disconnected responses to events. That approach – “muddling through” – is not a favorite of political scientists.

“Grand design” can also have religious connotations. The term can refer to a “grand design” supposed to originate with God. A book by the Mormon author E. Douglas Clark, for example, is titled “The Grand Design: America from Columbus to Zion.” Clark argues that figures such as Christopher Columbus and the Founding Fathers were led by the hand of God to carry out key missions toward fulfilling America’s destiny.

In the past, Americans talked about “grand design” more often than they do today. The concept of “manifest destiny,” or the belief that Americans were intended to spread out across the continent of North America, can be described as a particularly fervent example of grand design.

Secular historians also like to talk about the founding fathers and grand design. In the secular sense, grand design means a plan which originated with the founding fathers themselves, rather than a plan handed down from on high. Usually, historians describe the writing of the Constitution and the formation of the US government as an example of grand design. 

Still, historians and think tanks have a tendency to see a grand design in whatever aspect of the government they approve of. Features which they disapprove of aren’t usually described in this way.

The libertarian Hoover Institution, for example, issued a paper titled “Federalism: the Grand Design.” The paper enthuses over states’ rights and the federalist system at large:

Federalism was part of the constitutional tapestry designed by our Constitution’s framers to create an effective national government while protecting liberty. First, they invested the national government with limited and specifically prescribed powers, only those powers essential for effective governance. They also established specific constraints on government power and recognized specific rights in the Bill of Rights.

Sometimes, the founding fathers had conflicting grand designs. The Federalists clashed with the Republicans over the reach of the national government and the economy. If the Federalists had one grand design, the Republicans had their own, competing grand design.

Jeffrey Estano delved into this in an essay titled “Friendship and Conflict: The Relationship of the U.S. Founding Fathers”:

Economic differences between Federalists and Republicans were a primary source of conflict. The most drastic point of contention centered on Secretary of the Treasury Alexander Hamilton’s plan for a national economy, and the opposition he faced from the Republicans. Hamilton’s grand design called for the assumption of state debts by the national government, the formation of a national bank, and the establishment of national credit. A true genius (and a favorite of President George Washington), Secretary Hamilton stood in position to permanently elevate the federal government’s power over that of the state governments.

Go fight City Hall

“Go fight City Hall” is a phrase expressing the futility of trying to battle government bureaucracy. The phrase sounds like a call to action but in fact, it is the opposite. An equivalent would be “you can’t fight City Hall.” 

In the past, “go fight City Hall” may have had a more optimistic ring, judging by at least one old newspaper article. In 1928, a short item appeared in the Brooklyn Citizen announcing an upcoming tax cut which, the newspaper said, was a direct result of fighting City Hall: 

“The “Go Fight City Hall” spirit may mean a savings of $8,000,000 to the taxpayers of Queens. With the Board of Assessors in executive session concerning its recommendations for a decrease in the assessment levied on Queens property owners for the construction of the $16,300,000 Jamaica “sewer scandal,” the only question seems to involve the amount of the relief.”

Countless books, movies, and television shows have dealt with the question of whether it’s possible to “go fight City Hall.” The phrase itself may have been popularized by a 1945 book, “Go Fight City Hall” by Ethel Rosenberg. (Note that the author is not the convicted spy of the same name.)

The 1939 film Mr. Smith Goes to Washington is probably one of the best-known takes on the issue of fighting city hall. Of course, in that case, Mr. Smith is not literally fighting “city hall” but is taking on entrenched interests in Congress. Still, the idea — an ordinary citizen trying to prevail against the establishment — is the same.

It’s also worth noting that the “fight against City Hall” isn’t always portrayed as a noble one. A 1962 episode of the old TV show “Naked City” was titled “Go Fight City Hall.” That episode wasn’t about a heroic struggle against a faceless bureaucracy; rather, it was about a couple of thieves who try to make fools out of good detectives.

Most of the time, though, the fight against City Hall is seen as a sort of David and Goliath struggle, with the hero being either an ordinary citizen or perhaps a stalwart activist. In New York City, for example, Jane Jacobs’ struggle against Robert Moses’ developments is often remembered as a heroic win against “City Hall.”

Among journalists, taking on City Hall is usually a badge of honor. In 1976 the National Freedom of Information Coalition, an organization dedicated to increasing transparency at all levels of government, published a study titled “Go Fight City Hall: Informal Methods of Combatting Secrecy in Local Government.” The study noted that fighting the government can be ruinously expensive and risky:

…court action is not an easy decision for an editor. Only the wealthier papers can sue without weeping at the cost, and cost is just one problem. A lawsuit can turn a difference of opinion between a newspaper and a public official into overt hostility, making a rational solution difficult. And the news media must pick suits with care. If they don’t win, they can be in worse trouble than before. Further, the law is slow. No suit can get a public meeting open in time for the last deadline.

Give ’em hell Harry

“Give ’em hell Harry” is a reference to President Harry Truman’s 1948 re-election campaign. It’s also the name of a very successful play and movie.

In 1948, President Harry Truman was running for re-election. During a campaign stop in Bremerton, Washington, Truman delivered a rousing speech attacking the Republicans. One of Truman’s supporters called out, “give ‘em hell Harry!” Truman replied, “I don’t give them hell. I just tell the truth about them, and they think it’s hell.”

The 1948 campaign was hard-fought – in fact, it was such a close contest that, at the last minute, some newspapers called the outcome the wrong way. A famous photo shows Truman, smiling triumphantly and holding up a two-day old copy of the Chicago Daily Tribune; the headline read, “Dewey Defeats Truman.” Truman, of course, had defeated Dewey.

On the campaign trail, Truman faced massive dissent in his own party, with a group of Southern Democrats breaking away to form the Dixiecrat party. On the left, Henry Wallace and his new Progressive Party also threatened to siphon off votes from Truman. 

Faced with all this, Truman decided to embark on a cross-country tour, aboard a train which he named the “Truman Special.” The whole tone of the trip was set as the train pulled out of the station in Washington. Senator Alben Barkley, of Kentucky, was on hand to see Truman off; Barkley called out to Truman, “Goodbye, good luck, and mow ‘em down.” Truman replied, “We’ll mow ‘em down, Alben, and we’re going to give ‘em hell.”

“Give ‘em hell Harry” is also the title of a play by Samuel Gallu, which tells the story of Truman’s life and presidency. The play, a one-man show originally starring James Whitmore, opened in 1975 at Ford’s Theater, in Washington DC. Truman’s daughter Margaret attended the opening, and so did then-president Gerald Ford. Years later, in 2017, Truman’s grandson, Clifton Truman Daniel, took over the role of the former president. The play was also made into a film, with Whitmore earning an Academy Award nomination for his performance.

Decades later, Roger Stone used the phrase “give ‘em hell” to shower Donald Trump with praise. According to Stone, there were some very clear parallels between Trump’s 2016 campaign and the campaign that Truman fought in 1948:

Borrowing a chapter from the “Give ’Em Hell Harry” book, Trump took on a work schedule that would kill younger men. He dazzled at five and six stops a day. He slept four hours a night. Truman used a train; Trump used “Hair Force One” — his private plane. I have never seen a better closer.

…Like Hillary Clinton, Dewey was stilted in public — detached, and not natural mingling with people. Truman and Trump thrived on the energy of their crowds as they hit each stop. Truman drove the engineers to break all speed laws to maximize time for speeches at each stop. Trump did the same to his pilots as he hopscotched through Michigan, Wisconsin and Pennsylvania in a frenzy of action. Both pulled massive crowds.

gag rule

A gag rule prevents members of a legislative body from raising a particular issue, usually because that issue is considered too controversial or divisive.

In the United States, the most famous example of a gag rule involved slavery. Members of the House of Representatives were barred from putting forward any petition that discussed slavery during the period from 1836 to 1844. The gag rule was imposed by a series of congressional resolutions; the first of those was agreed on in 1836. The last of those resolutions was finally repealed in 1844, as a concerted action taken by John Quincy Adams and a group of his supporters in the House.

The gag order was a clear response to the abolitionist movement, which was increasing in strength. In 1834, an organization called the American Anti-Slavery Society urged its members to sign petitions and send them to Congress; over the course of the next few years, this initiative grew exponentially. In the year after the first gag order, abolitionists sent at least 130,000 petitions to Congress. Pro-slavery members of Congress stepped up their defense of slavery as more and more petitions circulated in Congress.

The initiative behind the gag rule came from Southern politicians. Representative James Hammond, of South Carolina, proposed the first gag rule in December of 1835. James Polk, of Tennessee – the man who later became the 11th president of the United States – was Speaker of the House at that time. He referred the matter to a committee chaired by Henry Pinckney, of South Carolina. Pinckney’s committee decided that any mention at all of slavery should be blocked from the House floor. This meant that no petitions, resolutions, or memorials which discussed slavery were to be discussed.

John Quincy Adams immediately voiced his objections. During the roll call vote, Adams shouted, “I hold this resolution to be a direct violation of the Constitution of the United States.”

Nevertheless, the gag rule stayed in place until December 3, 1844, when Adams pulled together enough support to repeal it.

Meanwhile in the Senate, John Calhoun’s proposal to impose a gag order was voted down. In 1836, Calhoun – a senator from South Carolina – was a vocal opponent of the abolitionist movement. He argued that Congress had no place regulating slavery and the status quo should be preserved. “The relation which now exists between the two races,” he said, “has existed for two centuries. It has grown with our growth and strengthened with our strength. It has entered into and modified all our institutions, civil and political. We will not, cannot permit it to be destroyed.”

It’s worth noting that historians don’t believe most senators had noble motives for voting against Calhoun’s proposal. They weren’t ardent defenders of the people’s right to petition the government for redress of grievances. Rather, they were concerned that a gag order would elevate the abolitionist cause. They also felt confident that they could simply bury abolitionist petitions in committee for long enough to keep them from stirring up any trouble.

freedom riders

Freedom riders were northerners who took interstate buses down to the south in order to protest Jim Crow and segregation policies.

Most of the freedom riders were college students; about half of them were black and about half were white. Most of them (an estimated 75 percent) were men. The first freedom ride took place in May 1961; other rides followed.

Freedom riders held sit-ins at lunch counters, waiting rooms, and restrooms in interstate bus stations throughout the deep south. Their goal was to put to the test a recent Supreme Court ruling which had declared that segregation on interstate bus and rail stations was unconstitutional. 

The freedom rides were organized by the Congress of Racial Equality, or CORE. The first bus set off from Washington DC on May 4, 1961, carrying seven black and six white protesters to the deep south. The initiative was modeled after CORE’s “Journey of Reconciliation,” which took place in 1947. The Journey of Reconciliation had groups of black and white volunteers ride buses in the south to test a Supreme Court decision (Morgan v. Virginia, 1946) which declared that segregated bus seating was unconstitutional.

The Freedom Riders soon encountered violent opposition. The historian Raymond Arsenault recounted some of these attacks in his book, Freedom Riders: 1961 and the Struggle for Racial Justice. Arsenault described the experience of a group of freedom riders in the little town of Anniston, Alabama:

“As the crowd of about fifty surrounded the bus, an eighteen-year-old Klansman and ex-convict named Roger Couch stretched out on the pavement in front of the bus to block any attempt to leave, while the rest — carrying metal pipes, clubs, and chains — milled around menacingly, some screaming, “Dirty Communists” and “Sieg heil!” There was no sign of any police, even though Herman Glass, the manager of the Anniston Greyhound station, had warned local officials earlier in the day that a potentially violent mob had gathered around the station.”

That angry mob beat the northern protesters viciously; one man threw a firebomb through a bus window. The mob also slashed the bus tires, so the freedom riders had to abandon their burning bus and carry out the next stage of their trip by plane.

Many northerners were horrified by the response of southern police to the attacks on the freedom riders. Police largely stood by and did nothing to protect the freedom riders against violent mobs. In Jackson, Mississippi, police arrested hundreds of the freedom riders and charged them with breach of the peace. Convicted, they spent up to six weeks in what the New York Times called “sweltering, filthy and vermin infested cells.”

Accounts of the violence spread around the country and helped to raise awareness of the freedom riders and their goals; this also inspired other people to join the movement, and put pressure on President Kennedy to take action. Eventually, Robert Kennedy ordered federal marshals to protect the freedom riders, and the Interstate Commerce Commission banned segregation on interstate travel.

40 acres and a mule

“Forty acres and a mule” is a popular name for an order which promised freed slaves that every family would be given a plot of land, measuring up to 40 acres. The land was to be seized from southern plantation owners and divided up among the men and women who had formerly worked it as slaves.

On January 16, 1865, the Union general William Tecumseh Sherman issued an order – Special Field Order 15 – to seize 400,000 acres of land and redistribute them to the newly freed black families, after parceling the land out into 40-acre units. This order, which had been approved by President Lincoln, eventually came to be known as “40 acres and a mule,” although the idea of loaning out government mules to help work the land came later.

Sherman did not come up with the idea of redistributing land; neither did the Secretary of War, Edwin Stanton. In fact, the idea came out of a meeting which Sherman and Stanton held with a group of black ministers in the days following Sherman’s famous March to the Sea. The meeting took place in Savannah, Georgia; Stanton preserved a transcript of it and sent it to the abolitionist Henry Ward Beecher, boasting that “for the first time in the history of this nation, the representatives of the government had gone to these poor debased people to ask them what they wanted for themselves.” The transcript was later published in the New York Daily Tribune.

PBS notes that Stanton and Sherman met with 20 black ministers, mostly Baptist and Methodist. Eleven of the ministers had been born free; the other nine had lived as slaves. The ministers’ spokesman was a 67-year-old former slave named Garrison Frazier, who had purchased his own freedom and the freedom of his wife.

Stanton and Sherman asked the group of ministers what it was that they wanted for the black community. The men said, unanimously, that they wanted land of their own. Frazier said, “The way we can best take care of ourselves is to have land, and turn it and till it by our own labor … and we can soon maintain ourselves and have something to spare … We want to be placed on land until we are able to buy it and make it our own.” Sherman and Stanton then asked whether the freed slaves “would rather live — whether scattered among the whites or in colonies by themselves.” Frazier replied, “I would prefer to live by ourselves, for there is a prejudice against us in the South that will take years to get over … ” The others agreed, and just four days later, Sherman issued his order.

In the event, the order stood for only a few months. After Lincoln’s assassination, his Democratic vice president, Andrew Johnson, came into office and reversed Sherman’s order. The land was handed back to its former Confederate owners. W.E.B. Du Bois later said that “the vision of forty acres and a mule…was destined in most cases to bitter disappointment.”

forgotten man

In politics, the “forgotten man” is a phrase invoking the average American citizen. The implication is usually that the forgotten man has suffered some major economic hardship and has been neglected by the federal government.

The phrase was first popularized in 1932 by Franklin Roosevelt during his first presidential campaign. FDR delivered a radio address setting out his argument against the Hoover administration. FDR called for an end to “the illusions of economic magic” and urged, instead, policies that would rebuild the economy “from the bottom up” and would focus on the real needs of ordinary Americans – the “forgotten man at the bottom of the economic pyramid,” as he put it.

But FDR didn’t invent the phrase. At the end of the 19th century, the sociologist William Graham Sumner wrote about the plight of the “forgotten man.” Sumner’s forgotten man had a very different connotation from FDR’s. For Sumner, the forgotten man wasn’t someone who needed help from the government. Rather, he was the victim of an over-reaching government.

Sumner was looking out for hard-working Americans who, as he saw it, were being over-taxed so that idealistic social programs could be put into place. The forgotten man himself never saw any benefit from those programs. Sumner wrote,

It is when we come to the proposed measures of relief for the evils which have caught public attention that we reach the real subject which deserves our attention…Their [the reformers’] law always proposes to determine what C shall do for X or, in the better case, what A, B and C shall do for X. …what I want to do is to look up C. I want to show you what manner of man he is. I call him the Forgotten Man. Perhaps the appellation is not strictly correct. He is the man who never is thought of. He is the victim of the reformer, social speculator and philanthropist.

Decades later, Richard Nixon returned to the idea of the “forgotten man.” Nixon argued that American politics was being dominated by a few, outspoken voices on the extreme ends of the political spectrum. He said that instead of listening to the loud voices of activists and anti-war protesters, he wanted to listen to “another voice. It is the quiet voice in the tumult and the shouting. It is the voice of the great majority of Americans, the forgotten Americans — the non-shouters; the non-demonstrators. They are not racists or sick; they are not guilty of the crime that plagues the land. They are black and they are white — they’re native born and foreign born — they’re young and they’re old…They give drive to the spirit of America.”

In 2017, President Trump referred again to the forgotten man during his inaugural address. The president vowed that during his presidency, the “forgotten men and women of this country” would be treated fairly:

The forgotten men and women of our country will be forgotten no longer. Everyone is listening to you now. You came by the tens of millions to become part of a historic movement the likes of which the world has never seen before. At the center of this movement is a crucial conviction: that a nation exists to serve its citizens.

floor fight

A floor fight is an argument that threatens to derail either a convention or a congressional proceeding. Most of the time, floor fights are non-violent; the fighting is verbal. However, American history also includes some memorable incidents in which floor fights became physical. 

In the past, when brokered conventions were the norm, floor fights often broke out in the midst of a party convention; typically, they arose when delegates disagreed over the contents of the party’s platform. However, it has now been 70 years since the United States saw a brokered convention.

Today, “floor fight” usually refers to a fight on the Senate or House floor. In theory, of course, there should never be any fights at all in Congress, especially in the Senate. Thomas Jefferson wrote a “Manual of Parliamentary Practice” in which he explained the importance of decorum in the Senate:

“No one is to disturb another in his speech by hissing, coughing, spitting, speaking or whispering to another; nor to stand up or interrupt him; nor to pass between the Speaker and the speaking member; nor to go across the [Senate chamber], or to walk up and down it, or to take books or papers from the [clerk’s] table, or write there.”

In reality, of course, Congress members have not always lived up to Jefferson’s ideal. In 1902, a physical fight broke out between the junior and senior senators from South Carolina. The junior senator, John McLaurin, accused the senior senator, Ben Tillman, of “a willful, malicious, and deliberate lie,” which led Tillman to punch him in the jaw. Senators rushed to separate the men. The fight did not last long, but it remains part of Senate lore.

The House of Representatives has a rowdier history. On February 6, 1858, a huge brawl broke out during a debate of the Lecompton Constitution, a pro-slavery document which had been proposed as the new constitution for the Kansas Territory. The debate lasted late into the night; at about 2AM, Pennsylvania Republican Galusha Grow and South Carolina Democrat Laurence Keitt first flung words at each other, and then began to fight with their fists. Dozens of other representatives waded into the fray, with Republicans taking sides against Democrats. At one high point in the fight, two Wisconsin Republicans ripped off the wig of a Democrat from Mississippi.

Most of the time, of course, a “floor fight” amounts to little more than a series of dramatic speeches and a tightly contested vote. In 2019, for example, the House of Representatives voted to formalize its impeachment inquiry into President Donald Trump. The vote took place almost entirely on party lines and was accompanied by impassioned speeches from both sides. As NBC reported:

As a technical matter, the resolution was a dry set of rules for the public phase of an investigation into President Donald Trump that has been under way informally almost since Democrats took control of the House in January. But on a political level, the floor fight over it was nasty, brutish and relatively short — just over an hour — ending in a nearly perfect party-line vote.

fireside chat

A series of radio addresses given by President Franklin Roosevelt over the course of his presidency. Roosevelt delivered a total of 30 such addresses between 1933 and 1944. They were known as “fireside chats” because of their informal, relatively intimate style, as though the audience were sitting around the fireside chatting with the president.

FDR’s first fireside chat was delivered on March 12, 1933. The president used the address to explain the ongoing “bank holiday” and to ask Americans for their cooperation in the midst of America’s banking crisis. The country had recently experienced a month-long run on the banks, which prompted FDR to announce a bank holiday, shutting down the banking system for a week.

In his first fireside chat, FDR explained, in straightforward language, the way the banking system worked, and set out his reasons for shutting down the banks. He urged Americans to put their faith in the system instead of, as he put it, keeping their money under a mattress. The chat appears to have worked; within two weeks after the bank holiday ended, Americans returned more than half of their money to the banks.

Over the years, FDR delivered “chats” about his economic policies, unemployment figures, military initiatives, and a range of other topics. He used the chats to appeal directly to the American people, building up popular support for his policies and bypassing the media entirely. This also gave him the opportunity to address criticism against him.

In his fifth fireside chat, delivered on June 28, 1934, FDR acknowledged that there had been some problems with his New Deal, but insisted that those hurt by his programs were the greedy and the self-interested:

“In the working out of a great national program which seeks the primary good of the greater number, it is true that the toes of some people are being stepped on and are going to be stepped on. But these toes belong to the comparative few who seek to retain or to gain position or riches or both by some short cut which is harmful to the greater good.”

FDR also used that speech to take on his critics, suggesting that they were needlessly complicating matters while positioning himself as a plain-spoken American:

“A few timid people, who fear progress, will try to give you new and strange names for what we are doing. Sometimes they will call it ” Fascism”, sometimes “Communism”, sometimes “Regimentation”, sometimes “Socialism”. But, in so doing, they are trying to make very complex and theoretical something that is really very simple and very practical. I believe in practical explanations and in practical policies.”

Decades later, FDR continued to have his critics. In 1964, the New York Times published an editorial titled “The Case Against the ‘Fireside Chat.’” The piece urged President Lyndon Johnson to stop appealing directly to “the people” to support his civil rights programs. The Times warned that FDR shouldn’t be used as a role model, explaining:

There are dangers in passionate appeals by the President for popular support. Ours is a constitutional society. Hopefully, that means that we govern ourselves through representatives even while restraining ourselves — and our representatives — within the confines of an elaborate structure of laws, institutions, rules, procedures and customs that collectively comprise our constitutional order. Government by popular referendum is the antithesis of constitutional government.

finger on the button

The person who has his “finger on the button” has the power to launch a nuclear weapon. The expression is used to evoke the possibility of nuclear war and to imply that the president of the United States – or his counterpart in other nuclear-armed states – has the power to set off an atomic war at any moment.

There is, of course, no actual nuclear “button” which can be pressed to launch a nuclear missile. However, it is true that in the United States, the president has the sole authority to decide when to launch nuclear weapons. He is not required to consult with his advisors before making that decision, and nobody can legally prevent the use of nuclear weapons once the president has issued an order.

This unique power may be why the “finger on the button” phrase has been used again and again over the years by politicians, especially in the heat of a presidential race. 

President Lyndon Johnson, for example, told his Republican challenger, Barry Goldwater, that the president had to “do anything that is honorable to avoid pulling that trigger, mashing that button that will blow up the world.” For his part, President Richard Nixon talked about exploiting the threat of nuclear weapons. He told his staff that he wanted the North Vietnamese leadership to believe that he was a “madman” who could not be held back “when he’s angry, and he has his hand on the nuclear button.”

The phrase is most often thrown around ahead of a presidential election, especially when one politician wants to attack another. In 2008, US Representative Geoff Davis of Hebron, Kentucky called then-candidate Barack Obama a “snake oil salesman” and warned that he should not be trusted with the “button.” “I’m going to tell you something: That boy’s finger does not need to be on the button,” Davis told his audience. “He could not make a decision in that simulation that related to a nuclear threat to this country.”

A few years later, Hillary Clinton told her supporters that Donald Trump shouldn’t be trusted with his own finger on the button. Clinton went beyond concern about nuclear weapons to suggest that, more broadly, Trump should not be trusted at all. Clinton said, “The bottom line is that just like Trump shouldn’t have his finger on the button or his hands on our economy, he should not have anything to do with our children’s education and our public schools.”

Of course, activists and pundits also use the phrase. In 2016, the former editor of the Bulletin of the Atomic Scientists wrote an editorial for the Chicago Tribune titled “Nuclear weapons: Whose finger do you want on the button?” The piece said, in part:

Putin is something of a chest-thumper. The two leading GOP candidates, Donald Trump and Ted Cruz, are also chest-thumpers. Given that, it is not difficult to imagine a scenario in which the world’s two leading nuclear weapon states are led by presidents who lack the temperament to handle a rapidly deteriorating confrontation…That’s a reality that we need to consider when we finally enter the voting booth in November.

Final Solution

The Final Solution was a euphemistic name used by Nazi leaders for their plan to exterminate all of the Jews in Europe. The plan’s full name was the “final solution to the Jewish question.” The plan led to the murder of six million Jews during the period from 1941 until 1945, when Allied forces liberated Europe from the Nazis.

Anti-Semitism was a key part of Nazi policy from the party’s earliest days, but in the early years, the party did not explicitly talk about extermination. However, Hitler himself had talked about his genocidal plans well before he ever came into power. In 1922, he told the journalist Josef Hell:

If I am ever really in power, the destruction of the Jews will be my first and most important job. As soon as I have power, I shall have gallows after gallows erected, for example, in Munich on the Marienplatz – as many of them as traffic allows.

When the Nazi party did come into power, in 1933, the party’s anti-Semitism took the form of anti-Jewish legislation and the violent Kristallnacht pogroms. Later, after the onset of World War II, the Nazis began to set up ghettos to contain the Jewish populations in countries under Hitler’s control. Conditions in these ghettos were unsanitary and dangerous; residents faced overcrowding and severe food shortages. 

The Nazi leadership initially considered deporting Germany’s Jewish population rather than murdering them. In fact, until at least the end of the 1930s, Hitler thought that mass deportation was the best way to eliminate Germany’s Jewish population. The goal behind all of the Nazi party’s anti-Jewish violence and legislation was to pressure as many Jews as possible to emigrate. In 1939, Hitler delivered a speech to the German parliament in which he criticized western governments for failing to give asylum to Jewish refugees. Hitler warned that if there was a war, he would bring about the “annihilation” of European Jews.

However, the leadership’s thinking apparently changed after the invasion of Russia. During the invasion, which the Germans called “Operation Barbarossa,” members of Hitler’s special forces showed that they were willing to carry out mass murders, leading Hitler to believe that his forces would be willing to carry out a genocide. Hitler had picked 3,000 men to serve in the special force, known as the Einsatzgruppen. Their orders were to find and murder all Jews – men, women, and children. Heinrich Himmler later wrote in October of 1943:

We were faced with the question: what about the women and children? – I have decided on a solution to this problem. I did not consider myself justified to exterminate the men only – in other words, to kill them or have them killed while allowing the avengers, in the form of their children, to grow up in the midst of our sons and grandsons. The difficult decision had to be made to have this people disappear from the earth.

Meanwhile, anti-Semitism was also on the rise across the western world. In France, a group calling themselves the Cagoulards, or “hooded men,” espoused fascist and anti-Semitic views. Similar movements formed in the UK, where the British Union of Fascists gained a following, and in the United States, home of the German-American Bund.

fifth column

A “fifth column” is a group which operates in secret, usually within enemy lines, in order to help further a cause which they secretly support. 

The term originated with Emilio Mola Vidal, a Nationalist general who served under Franco during the Spanish Civil War. As Mola Vidal was marching on Madrid with four columns of his own army, he announced that he also had a “fifth column” of supporters who were working to help him from within the capital.

Classically, a fifth column works by infiltrating a nation, introducing its supporters into positions of trust, and gradually influencing public policy and military issues. Fifth column workers can also influence the people of a nation by spreading rumors and fear.

However, the notion of the “fifth column” can also be used to create fear and distrust among the people of a nation. In the aftermath of the September 11 attacks, for example, the New York Post warned darkly that there was a “fifth column” operating in the United States and working to bring down the country:

The FBI is looking for hundreds of men inside the United States suspected of playing a role in Osama bin Laden’s terror network. The support network that made last week’s attacks possible is right here, burrowed inside Arab and Muslim communities in American neighborhoods. For the first time in American history, we have irrefutable evidence that there is a dangerous and functional foreign-born “fifth column” at work on American soil.

The Post was hardly the first to issue warnings about a fifth column operating in the United States. Decades earlier, Franklin Delano Roosevelt warned that there was a “fifth column” of Nazi sympathizers working within the United States where, he said, they were plotting to carry out espionage and sabotage.

In modern times, some on the left have been warning that the Russian president, Vladimir Putin, is setting up a “fifth column” throughout Europe:

Putin…has formed an alliance with many European far-right political parties and their leaders, who have delivered consistent adherence to Russian interests even when it contradicts some of their past positions…These far-right parties are capitalizing on economic and security crises in Europe to build popular support and now operate as a fifth column that is undermining the Western liberal order from within. President Donald Trump’s unwavering support for Putin and his pursuit of policies that advance Russia’s goals show disturbing similarities to the European far right that are equally difficult to rationalize.

It’s worth noting that the Cato Institute has argued that the true danger of the fifth column isn’t the danger posed by the saboteurs and secret agents – it’s the fear which they cause among the general population. In Spain, for example,

“The city [Madrid] never fell to the nationalists, but fear of this “fifth column” caused the Republican government under Francisco Largo Caballero to abandon Madrid for Valencia and it led to a massacre of nationalist prisoners in Madrid during the ensuing battle. So a “fifth column” is not so much an insidious group of spies or traitors as it is the threat of such a group which causes the incumbent power to miscalculate and overreact.”

on the fence

To be “on the fence” is to be hesitant about taking a political stance. Someone who is “on the fence” resists joining one side or the other of an argument, especially when taking a side could be politically risky.

On a literal level, of course, fences define the boundaries between properties. Sitting astride a fence means that you have one foot in each of two properties. Metaphorically, sitting on the fence means that you have one foot in each of two opposing positions.

“Fence sitting” tends to be used as an insult, but the phrase can also be flipped on its head. A more positive term for a fence sitter is a moderate – and the world is full of praise for moderates. Writing in the Wall Street Journal, Daniel Akst looked at some recent definitions of the moderate:

A true moderate…rather than seeking “safe spaces,” welcomes opposing views. Moderates know that nobody has a monopoly on the truth and are willing to appear inconsistent in order to follow the facts, moving (deliberately) first to one side and then the other like human ballast in the interests of keeping the ship of state on an even keel.

Moderates are increasingly rare in the US today. Americans are more and more divided on political, economic, and social issues. The Pew Center calls political polarization a “defining feature of American politics today” and notes that there is a widening gap between conservatives and liberals on gender equality, the environment, and a host of other issues.

The gap between Democrats and Republicans has widened since the onset of COVID-19, the Pew Center has found. In a survey carried out in late June, the group found that 61 percent of Republicans believed that the US had “turned a corner” when it came to dealing with the coronavirus. In contrast, just 23 percent of Democrats said the same; 76 percent of Democrats surveyed agreed with the statement that “the worst is still to come” in terms of the coronavirus.

With the country increasingly polarized, some marketing experts say nobody can afford to sit on the fence any longer. Even corporations must now take a stand, some say, aligning themselves with one political position or another. A panel hosted by the Business Marketing Association at the Wall Street Journal found that

“In the Trump era, the longtime practice of sitting on the fence is over for brands who must not only know their political values, but openly share them. But that’s not to say brands must take strong stances on every issue. Instead, they should speak in broader terms about issues tied to their values and avoid calling out any specific political figures or voter blocks.”

For what it’s worth, Britain’s national health service has reported that there may be health benefits to political extremism. The NHS summarized what it called a “tongue in cheek” study reported in the Mail Online, which found that people who identify with extreme political positions tend to get more exercise than self-described moderates. In other words, the NHS wrote, “People who sit on the fence, it seems, spend too much time sitting on the couch as well.”

fat cat

In politics, a “fat cat” is a rich and influential person, usually one who donates generously to political campaigns. 

Typically, “fat cat” refers to an executive whose earnings vastly exceed those of the average American. The expression suggests that the person is bloated and slightly grotesque, like a cat who’s been over-eating for years and has become grossly overweight. 

The phrase “fat cat” was in use in America by the 1920s; Merriam-Webster dates the term’s first use to 1928. However, others point to an article in the Baltimore Sun that grumbled about “fat cats” as early as 1925. The article read, in part,

“It ought perhaps to be explained that Fat Cat is the significant and revealing name in political circles for the sleek, rich fellows who enter politics for one reason or another and depend for their standing and success upon the liberality with which they shell out the dollars.”

The term “fat cat” often gets thrown around by politicians and pundits who are looking for a way to rebuke their political enemies. In 2009, then-president Obama used the term to describe bankers who were opposed to his proposed financial regulations. 

“I did not run for office to be helping out a bunch of fat cat bankers on Wall Street,” Obama told 60 Minutes. He added, “the people on Wall Street still don’t get it. They’re still puzzled why it is that people are mad at the banks. Well, let’s see. You guys are drawing down $10 million, $20 million bonuses after America went through the worst economic year in decades and you guys caused the problem.”

Just a few years later, though, Obama himself was being described as a “fat cat.” Headlines pointed out that after leaving office, Obama had charged as much as $400,000 for a single speaking engagement, at a Wall Street conference organized by the investment firm Cantor Fitzgerald. Eventually, Obama seemed to apologize for calling bankers fat cats, telling the New York Times, “It hurt their feelings. I would have some of them say to me, ‘You know, my son came home and asked me, Am I a fat cat?’”

During the 2016 presidential campaign, Donald Trump repeatedly slammed Hillary Clinton as a “fat cat.” Trump told NBC’s Meet the Press that Clinton was going overboard in her fundraising effort and that she had sold out to Wall Street: “[Clinton] is selling herself to Wall Street, and the Wall Street fat cats are putting up a lot of money for her,” Trump said, pointing out that his campaign had no such need to fundraise.

A few years later, of course, critics of President Trump mocked him as a “fat cat.” This was a popular theme with political cartoonists and columnists. One cartoonist drew the president as a portly orange cat wearing a yellow hairpiece. Another created merchandise satirizing Dr. Seuss’s famous “Cat in the Hat”; the president was depicted as a fat cat in a blue suit and a MAGA hat.

Fair Deal

The Fair Deal was a package of economic and social reforms put forward by President Harry Truman, with the stated purpose of giving all Americans access to education, healthcare, and good jobs.

Truman began talking about reform almost as soon as he came into office. In 1945, he asked Congress to create legislation that would expand social security, create new public housing, and enact civil rights legislation, including a permanent Fair Employment Practices Act. Congress did pass the Employment Act of 1946, which made it the federal government’s responsibility to promote maximum employment. But Truman’s other reforms didn’t get any traction.

In 1949, fresh off winning re-election, Truman made the Fair Deal the focus of his State of the Union address. “Every segment of our population and every individual has a right to expect from our Government a fair deal,” Truman told Congress. He named his proposal the “Fair Deal” in a reference to Franklin Roosevelt’s New Deal; Truman’s program was intended to carry the New Deal forward into the postwar era. His goal was to increase the nation’s prosperity and to spread out the wealth, since, he said, we “cannot maintain prosperity unless we have a fair distribution of opportunity and a widespread consumption of the products of our factories and farms.”

Truman told Congress:

We must spare no effort to raise the general level of health in this country. In a nation as rich as ours, it is a shocking fact that tens of millions lack adequate medical care. We are short of doctors, hospitals, nurses. We must remedy these shortages. Moreover, we need–and we must have without further delay–a system of prepaid medical insurance which will enable every American to afford good medical care.

It is equally shocking that millions of our children are not receiving a good education. Millions of them are in overcrowded, obsolete buildings. We are short of teachers, because teachers’ salaries are too low to attract new teachers, or to hold the ones we have. All these school problems will become much more acute as a result of the tremendous increase in the enrollment in our elementary schools in the next few years. I cannot repeat too strongly my desire for prompt Federal financial aid to the States to help them operate and maintain their school systems.

However, the United States had moved to the right since the Great Depression, when FDR’s New Deal passed through Congress. Truman turned out to have a higher hurdle to clear than his predecessor had; the president had misjudged the direction the country was taking. His Fair Deal was popular with liberals in Congress, but it ran into stiff opposition from conservative Democrats and Republicans. Southern Democrats carried out a filibuster and blocked Truman’s civil rights legislation. An agricultural program geared toward family farmers also failed. Congress did, however, pass legislation to increase the minimum wage, and it established a Housing Act to create new housing for the poor. Congress also expanded social security benefits.

evil empire

“Evil empire” was President Ronald Reagan’s name for the USSR.

Reagan often portrayed the struggle between the US and the USSR as a moral war between good and evil. In some of his most famous speeches, he advocated a strong stance against the USSR, warning that the alternative was to abandon the struggle between right and wrong.

Reagan first referred to the USSR as the “evil empire” during a speech he delivered to the British House of Commons in 1982. The following year, he used the phrase again when he spoke to a convention of the National Association of Evangelicals.

In the 1983 speech, Reagan urged Evangelical leaders to do their part in what he described as a “spiritual” crisis, a “test of moral will and faith.” He urged his audience not to ignore “the aggressive impulses of an evil empire” in the debate over a nuclear freeze:

…in your discussions of the nuclear freeze proposals, I urge you to beware the temptation of pride–the temptation of blithely declaring yourselves above it all and label both sides equally at fault, to ignore the facts of history and the aggressive impulses of an evil empire, to simply call the arms race a giant misunderstanding and thereby remove yourself from the struggle between right and wrong and good and evil.

I ask you to resist the attempts of those who would have you withhold your support for our efforts, this administration’s efforts, to keep America strong and free, while we negotiate–real and verifiable reductions in the world’s nuclear arsenals and one day, with God’s help, their total elimination. [Applause]

While America’s military strength is important, let me add here that I’ve always maintained that the struggle now going on for the world will never be decided by bombs or rockets, by armies or military might. The real crisis we face today is a spiritual one; at root, it is a test of moral will and faith.

Reagan’s supporters have argued that his hard-line stance against the Soviet Union was crucial for bringing about the fall of the USSR and dismantling the Soviet bloc. Writing a quarter of a century after Reagan’s “evil empire” speeches, Newt Gingrich praised the “radicalism” behind those speeches. 

“By calling the Soviet Union an “evil empire,” Reagan sent a clear signal that America was going to challenge the Soviet Union morally, win the psychological information war, and de-legitimize it. If the government was evil, he argued, how could it have authority?” Gingrich reasoned.

Ironically, Reagan distanced himself from his “evil empire” rhetoric towards the end of his presidency. In 1988, the president visited Moscow, met with Mikhail Gorbachev, and toured the Kremlin and Red Square. A reporter asked him directly whether he still thought of the Soviet Union as an evil empire, and he said that he did not. “You are talking about another time, another era,” Reagan explained.

During the same visit, Reagan spoke at the House of Writers in Moscow. There, he said that it was vital not to caricature any nation or group of people. He explained,

“Pretty soon, at least for me, it becomes harder and harder to force any member of humanity into a straitjacket, into some rigid form in which you all expect to fit.”

every man a king

“Every Man a King” is the title of a speech delivered in 1934 by Senator Huey Long of Louisiana. The speech, which Long delivered on national radio, is one of Long’s most famous speeches, along with his “Share the Wealth” speech.

Long, a populist politician, used the speeches to rail against the concentration of wealth in a few hands and to highlight the problems of the many poor people in his own state. The “every man a king” speech said, in part:

Now, we have organized a society, and we call it “Share Our Wealth Society,” a society with the motto “every man a king.”

Every man a king, so there would be no such thing as a man or woman who did not have the necessities of life, who would not be dependent upon the whims and caprices and ipse dixit of the financial martyrs for a living. What do we propose by this society? We propose to limit the wealth of big men in the country. There is an average of $15,000 in wealth to every family in America. That is right here today.

We do not propose to divide it up equally. We do not propose a division of wealth, but we propose to limit poverty that we will allow to be inflicted upon any man’s family. We will not say we are going to try to guarantee any equality, or $15,000 to families. No; but we do say that one third of the average is low enough for any one family to hold, that there should be a guaranty of a family wealth of around $5,000; enough for a home, an automobile, a radio, and the ordinary conveniences, and the opportunity to educate their children; a fair share of the income of this land thereafter to that family so there will be no such thing as merely the select to have those things, and so there will be no such thing as a family living in poverty and distress.

Long’s radio speeches also represented his break with President Franklin D. Roosevelt. Long had introduced legislation in the U.S. Senate in an effort to limit incomes and redistribute wealth. However, his legislation never got off the ground; most of his fellow senators considered it too radical.

Long had actually helped FDR win the Democratic presidential nomination in 1932, but broke with the administration in 1934, after he gave up hope that the New Deal would make a meaningful difference in the lives of most Americans. That’s when Long decided to appeal directly to the American people, with his “every man a king” and his “share the wealth” speeches.

Long had also used “every man a king” as a campaign slogan, and he made it the title of his autobiography, published in 1933. The phrase became the title of his campaign song as well: Long co-wrote “Every Man A King” with Castro Carazo, the head of the Louisiana State University band.

The slogan may also be the origin of Long’s nickname, Kingfish.

Era of Good Feeling

The “Era of Good Feeling” refers to a period in U.S. history from about 1815 until about 1825, characterized by a sense of optimism and positivity. The era is closely associated with the presidency of James Monroe, who served two terms from 1817 to 1825.

Monroe easily won the presidential election of 1816, garnering 183 electoral votes while the opposing Federalist party won just 34. His victory signaled the effective end of the Federalist party and ushered in a period of total dominance by Monroe’s Democratic-Republican party.

After the election, Monroe went on a prolonged victory tour throughout New England. It was during this tour that one newspaper, the Columbian Centinel, published an article titled “The Era of Good Feeling.” The piece described a festive, upbeat mood which was shared by “eminent men of all political parties.”

The era followed America’s successful conclusion of the War of 1812. In Europe, the Napoleonic Wars were at an end, which also left Americans free to concentrate on their own affairs. The period was likewise characterized by a growing isolationism.

Historians say that the era of good feeling was also shored up by economic prosperity. During Monroe’s first term, America put in place its first protective tariffs and established the Second Bank of the United States. Congress, at Monroe’s request, also put an end to property taxes and other internal federal taxes. The federal government was able to pay off the nation’s extensive war debt using the revenue from tariffs.

At the same time, America continued to expand across the continent. In 1819, Andrew Jackson invaded Florida, which eventually led to a treaty with Spain that handed Florida over to the United States. During this period, America also stepped up its western expansion. In 1823, the president also articulated the Monroe Doctrine, which defined the Western Hemisphere as the United States’ sphere of influence and warned Europeans not to interfere in the region.

The era of good feeling was at an end by 1825. Even during Monroe’s second term, the sense of national goodwill was beginning to fade, and major conflicts over slavery and national expansion were making themselves felt. The period of one-party rule was also coming to an end. 

Since the Federalist party had collapsed, the presidential election of 1824 featured candidates who were all from the Democratic-Republican party. Four candidates vied for the presidency: Andrew Jackson, John Quincy Adams, Treasury Secretary William Crawford, and House Speaker Henry Clay. None of the candidates was able to win a majority in the electoral college, so the decision went to the House of Representatives. The choice came down to Adams and Jackson; neither Crawford nor Clay had enough votes to compete.

The House handed the presidency to Adams, although Andrew Jackson had won the most popular votes and the most electoral votes. The election marked a split in the party, leading Americans to re-organize into two new camps: the Democrats, loyal to Jackson, and the National Republicans – forerunners of the Whigs – who were allied with Adams. In 1828, Andrew Jackson ran again and, this time, defeated Adams, who was seeking re-election.

eunuch rule

The “eunuch rule” is a reference to the provisions in many state constitutions which prevented state governors from running for a second consecutive term in office. Those provisions have been amended in almost every state; as of 2020, Virginia is the only state which still prevents governors from holding two consecutive terms in office.

The rules on gubernatorial qualifications, succession, and term length are decided on a state-by-state basis. In most states, governors serve a four-year term and may serve back-to-back terms; however, some states cap the number of consecutive or lifetime terms a governor may serve.

The “eunuch rule” got its name because, in theory, it put the incumbent governor in a weak position (like that of a eunuch, with no real power). William Safire wrote:

In most states, particularly in the South, governors are rendered politically impotent – LAME DUCKS from the moment they enter the Statehouse – by the eunuch rule. This was designed to prevent four-year governors from building long-lasting machines…Whenever the eunuch rule applies, the governor starts thinking about (1) running for senator, (2) laying the groundwork for a career in private business, or (3) “modernizing the state constitution to permit reelection.”

Historically, politicians have gone to great lengths to get around the “eunuch rule.” In 1966 George Wallace, the governor of Alabama, was nearing the end of his term. Alabama’s state laws prevented him from seeking a second term in office, and so Wallace decided to put his wife, Lurleen, on the ballot instead. 

Lurleen Wallace beat out 10 opponents in the Democratic primary and also defeated her Republican opponent, James Douglas Martin. She became the first female governor of Alabama. Lurleen’s working class roots helped to make her very popular in the state. Like her husband, Lurleen actively opposed desegregation efforts. She is also remembered, though, for her efforts to improve mental health care and for her work to expand state parks and recreational facilities. 

Today, Virginia is the only state which prevents governors from succeeding themselves. There is an ongoing effort to amend the state’s constitution so that governors can serve consecutive terms. The change tends to be popular among Democrats, who have won recent gubernatorial elections; many Republicans oppose the proposed amendments.

In 2019, Dawn Adams, a Democrat from Richmond, sponsored House Joint Resolution 608, which would amend the state constitution to let governors elected after 2021 serve for two terms in a row. Adams described the current system as a “detriment to the commonwealth.” She told Delmarva Now, “Now is the time we should look to pass a constitutional amendment for consecutive but limited governor terms.” Adams also noted that the current system leads to “inefficiency, waste, duplication of services, low morale and low productivity.”

However, Republicans in Virginia have argued that the term limit is a much-needed check on the strong powers of the executive; Virginia’s governor can amend and veto bills, appoint officials, and order a special legislative session. Senate Majority Leader Thomas Norment, a Republican, added that a big argument against the proposed change was that he wouldn’t have wanted recent governors to stay in power for more than four years.

Said Norment: “I would very succinctly and ecumenically say two words: Gilmore and McAuliffe.”

Enemies List

An “enemies list” was a list of political opponents kept by the Nixon administration. The phrase is now used as shorthand to refer to suspected abuses of power in any administration.

In 1973, former White House aide John Dean III told the Senate that President Nixon kept a list of his political opponents. The so-called “opponents list” had been compiled for Nixon’s trusted aide, Charles Colson. It featured the names of public figures who were thought to pose a threat to the Nixon administration. The list included notes about any known weaknesses of the “enemies,” and also suggested finding a way to “use the available federal machinery to screw our political enemies.”

Dean revealed the list’s existence during a hearing of the Watergate Committee. On the same day, the CBS journalist Daniel Schorr managed to get a copy of the list, which he began reading, out loud and on air, to his audience. (A second, longer version of the list appeared later that year.) 

“I got to No. 17, and I said, ‘No. 17, Daniel Schorr, a real media enemy,’ ” Schorr told The Hill, decades later.

“I almost collapsed on the air. I had never read it before, never seen it before, never expected it. But I continued and said, ‘No. 18, Paul Newman. No. 19, Mary McGrory [the Pulitzer Prize-winning reporter for The Washington Post].’ It was such a distinguished list,” he said, joking that the notoriety of the list made him more popular. “My lecture fees went up.”

Schorr wasn’t the only one who was proud to be on Nixon’s list. The Village Voice ran a tongue in cheek piece that year titled “The Shame of Being Left Off Nixon’s Enemies List.” The article speculated about what might happen to leftist commentators, activists, and others who prided themselves on being “anti-establishment,” if those people turned out not to be on Nixon’s all-important list:

“What newspaper is going to shell out hard cash for a columnist whose opinions are so tame that even the White House doesn’t consider him dangerous? Poor Nick Von Hoffman. Darling of the New Left, intimate of numerous Democrats, defender of the Chicago Seven, how does he face his readers, knowing that Nixon, Haldeman, Ehrlichman, and Dean all considered Max Lerner a greater threat to their empire? And Jimmy Breslin. For years he and Pete Hamill have fought tooth and nail to see which one could say the most outrageous things about the Nixon regime. How can Breslin even manage to drag his poor broken body out of bed in the morning now that Hamill has administered the tour de force?”

A few decades later, some journalists and pundits began drawing comparisons between Nixon’s enemies list and the actions of the Trump administration. In 2018, the Trump White House announced that it had stripped the security clearance from former CIA director John Brennan and that it was considering doing the same to others. Mike Mullen, a former chair of the Joint Chiefs of Staff, told Fox News that this meant the president was “creating a list of political enemies.”

eight millionaires and a plumber

“Eight millionaires and a plumber” is a dismissive reference to President Dwight D. Eisenhower’s first cabinet.

Eisenhower’s critics complained that the president’s top advisers were all wealthy and therefore, by implication, out of touch with ordinary people. The only exception – the “plumber” in the phrase – was Martin Durkin, the new labor secretary. Durkin had previously headed up the plumbers’ union.

Many of Eisenhower’s cabinet members came from the private sector and lacked experience in government. The secretary of the treasury, George Humphrey, had a background in law and had risen to become the president and chairman of the board of M. A. Hanna and Company.

Eisenhower’s pick for Secretary of Defense, Charles Wilson, was an engineer who had risen to become the president of General Motors. Wilson was very open about his strong ties to the private sector; he considered them an asset, not a problem. He once famously told the Senate Armed Services Committee that “for years I thought what was good for our country was good for General Motors, and vice versa.”

Martin Durkin, the “plumber” in the cabinet, came from a much humbler background than his colleagues. Durkin grew up in Illinois and attended evening school, leaving at the age of 17 to become a steamfitter’s apprentice. From there, he joined the plumbers’ and pipe fitters’ union, eventually rising through the ranks to become its president. In 1933, he became the Director of Labor for the State of Illinois. Durkin was also the only Democrat in the cabinet. A former union man, Durkin pushed hard to revise the Taft-Hartley Act. He was unable to get the changes he wanted, and stepped down from his cabinet post after just eight months in office.

Decades later, pundits compared President Donald Trump’s cabinet to President Eisenhower’s. Screaming headlines criticized the president for appointing a team of wealthy individuals with little government experience. A piece in Politico was titled “Trump’s Team of Gazillionaires”; the article pointed out that the president’s cabinet picks seemed out of step with his campaign message, which had promised to fight for the forgotten working class. The Washington Post claimed that Trump had assembled “the richest administration in modern American history.”

However, an op-ed in the Washington Post also pointed out that most of America’s presidents and cabinet members have been wealthy. JFK, FDR, and the trust-busting Teddy Roosevelt all had personal fortunes which allowed them to pursue independent policies. (In his own time, FDR was labeled a “traitor to his own class” for his tax policies, among other things.) The Post op-ed closed by defending Eisenhower’s “eight millionaires and a plumber” cabinet:

Although Republican President Dwight Eisenhower was from a more modest background, his Cabinet picks were roundly mocked as “eight millionaires and a plumber.” Yet they managed to serve their country well and selflessly, acting against their own economic interests by maintaining a top marginal income tax rate of 91 percent throughout Eisenhower’s eight years in office. The revenue helped build our interstate highways and create NASA, among other achievements.



electioneering

To “electioneer” is to actively take part in an election by working for the election of a candidate or a party.

The word is almost always used in a pejorative sense. Most of the time “electioneering” is used to suggest something tawdry, or underhanded; the word implies discomfort with the way that campaigns are carried out.

Electioneering has had a bad rap since the time of the Founding Fathers. In 1796, James Madison wrote a letter to Thomas Jefferson, grumbling that a statement from the French foreign minister (Pierre Auguste Adet) was being dismissed as mere electioneering. Madison wrote:

Adêts Note which you will have seen, is working all the evil with which it is pregnant. Those who rejoice at its indiscretions and are taking advantage of them, have the impudence to pretend that it is an electioneering manoeuvre, and that the French Govt. have been led to it by the opponents of the British Treaty.

A few years later, John Adams wrote at length about the problem with electioneering. Adams was also writing to Thomas Jefferson. His 1814 letter complains that electioneering is taking over just about everything – and he expects it to get worse. Adams wrote:

I dare not look beyond my Nose, into futurity. Our Money, our Commerce, our Religion, our National and State Constitutions, even our Arts and Sciences, are So many Seed Plotts of Division, Faction, Sedition and Rebellion. Every thing is transmuted into an Instrument of Electioneering. Election is the grand Brama, the immortal Lama, I had almost Said, the Jaggernaught, for Wives are almost ready to burn upon the Pile and Children to be thrown under the Wheel.

In 1801, Thomas Jefferson issued an executive order barring all federal workers from doing anything which would “influence the votes of others, nor take part in the business of electioneering.” Jefferson’s executive order is often seen as the forerunner of the Hatch Act, which puts strict limits on the political activities of government employees at all levels.

The Bipartisan Campaign Reform Act of 2002 drew a distinction between “electioneering communications” and what it called issue-related speech, on the theory that speech promoting an individual candidate or party is entitled to less First Amendment protection than speech about ideas. That distinction has not been universally accepted, but it explains why ads legally defined as “electioneering” can be restricted.

In 2012, for example, a US District Judge found that a series of ads produced by the Hispanic Leadership Fund should be classified as “electioneering communication.” The judge found that the ads in question seemed to implicitly endorse President Obama and could, therefore, be restricted during the 60 days before the election. 

The judge also rejected the claim made by the Hispanic Leadership Fund that the electioneering communication disclosure provisions violated constitutional rights to political speech and to due process. In his decision, the judge wrote in part,

“Both the Supreme Court and the Fourth Circuit have made clear that [the Federal Election Campaign Act’s] disclosure requirements for electioneering communications are constitutional because they are justified by the public’s interest in knowing who is speaking about a candidate during the election period.”

effete snobs

“Effete snobs” was a phrase used by Vice President Spiro Agnew to denounce anti-war protesters, and young intellectuals in general, during the Vietnam era. The phrase quickly caught on and was adopted as a slogan by the anti-war movement.

Agnew had a reputation as a no-nonsense, law-and-order politician and a dramatic orator. That reputation was badly dented later, when he was forced to resign as vice president amid charges of tax evasion and bribery. But in 1969, Agnew was at the height of his power.

His famous “effete snobs” speech called out not only leftist protesters, but a whole group of pseudo-intellectuals who Agnew believed were brainwashing young students:

Education is being redefined at the demand of the uneducated to suit the ideas of the uneducated. The student now goes to college to proclaim, rather than to learn. The lessons of the past are ignored and obliterated in a contemporary antagonism known as “The Generation Gap.” A spirit of national masochism prevails, encouraged by an effete core of impudent snobs who characterize themselves as intellectuals.

Agnew was responding to the so-called Peace Moratorium, a national day of protest in which an estimated two million people across the United States took part in demonstrations against the war in Vietnam. A quarter of a million people gathered in Washington, DC, singing songs and holding a late-night vigil. The police clashed with activists outside the White House and elsewhere in the country. In DC, the child development expert Dr. Benjamin Spock told the crowd that the war was a “total abomination” that was crippling America and must be stopped.

Agnew delivered his “effete snobs” remarks about a week after the demonstration, at a fundraiser in New Orleans. The vice president asserted that “hardcore dissidents and professional anarchists” were inciting the protests, and that the demonstrators didn’t reflect the true views of most Americans. Nixon had already pledged that he would not be moved in any way by the protests.

Agnew made a series of speeches over the years, criticizing leftist student protesters; in 1970, the New York Times put together a collection of some of his most memorable quotes. The vice president said, for example:

Most of these young people who depend upon the ideology of ‘the movement’ for moral and mental sustenance will in time . . . return to the enduring values, just as every generation before them has done. But unfortunately, there is a much smaller group of students who are committed to radical change through violent means. . . . This is the criminal left that belongs not in a dormitory but in a penitentiary…

One of Agnew’s more memorable descriptions was of the media, whom he called the “nattering nabobs of negativism.”

Agnew’s “effete snobs” speech, though, quickly became notorious and was co-opted by the left for their own purposes. Political buttons soon appeared, reading, for example, “snob for peace” or “I’m an effete snob for peace.” 

The Baltimore Sun has pointed out that ever since Agnew’s speech, politicians have tried to make political capital by attacking out of touch “snobs” and “elitists” at America’s colleges. Ironically, the Sun notes, these anti-elitist politicians are all highly educated themselves.

dyed in the wool


“Dyed in the wool” is a phrase referring to people who hold very strong opinions and are unwilling to change them. Synonyms include “uncompromising” and “inveterate.” In politics, people might be referred to as “dyed in the wool Democrats” or “dyed in the wool Republicans.”

Merriam-Webster notes that the phrase was first used in its modern sense in 1580, when writers began to use the expression to discuss ways that “children could, if taught early, be influenced in ways that would adhere throughout their lives.” The phrase was used in its political sense as early as the beginning of the 19th century, when Daniel Webster complained about a certain type of Democrat whose views were “as unyielding as the dye in unspun wool.”

In its literal sense, “dyed in the wool” means that the wool has been dyed before it is spun into thread. This produces a strong and long-lasting color. Metaphorically, dyed in the wool means that someone’s opinions were formed and set at an early stage in their development and that they can’t be washed away.

In 1870 Frederick Douglass delivered a speech to new voters, urging them to vote their conscience; he described himself as a “Republican dyed in the wool” but told his audience that they had an absolute right to vote as they saw fit:

I hear some men say that if the black man, in this enlightened age, should vote the Democratic ticket let him be denounced. Gentlemen, I do not share that opinion at all. I am a Republican – a Black Republican, dyed in the wool – and I never intend to belong to any other than the party of freedom and progress. But if one of my colored fellow-citizens chooses to think that his interests and rights and the interests of the country can be better subserved by giving his vote against the Republican party, I, as an American citizen, and as one desirous to learn the first principles of free government, affirm his right – his undoubted right – to vote as he chooses.

Dyed in the wool can be used in either a positive or a negative sense, depending on who is speaking. For example, when President Obama nominated Sonia Sotomayor to the Supreme Court, some leftists complained that Sotomayor was not a reliable, or dyed in the wool, liberal herself. Sotomayor did not have a strong liberal record on issues like abortion, same-sex marriage, and the death penalty. “The fact that she hasn’t gone off on these sorts of questions I think shows that honestly she’s not a dyed in the wool liberal,” Thomas Goldstein, a leading appellate attorney, told Politico, adding, “There are places where Sotomayor will be more conservative than Souter.”

In contrast, in 2018 the chair of the Pennsylvania GOP asserted that Conor Lamb, who was then running for Congress, was a “dyed in the wool liberal” and a staunch supporter of Nancy Pelosi. The implication was that Lamb was trying to portray himself as more centrist than he actually was, and that his true colors would show through after the election.


don’t change horses

“Don’t change horses” is a phrase used to urge voters to stick with the incumbent president during times of turmoil and conflict. The full expression is “don’t change horses mid-stream” (or, sometimes, “don’t swap horses midstream”).

The expression is usually credited to Abraham Lincoln who, during the Civil War, said that voters should re-elect him because it would be foolish to change leaders in the middle of such a turbulent time. After his nomination to run for a second term, Lincoln told a group of his supporters,

“I do not allow myself to suppose that either the convention or the League have concluded to decide that I am either the greatest or best man in America, but rather they have concluded that it is not best to swap horses while crossing the river, and have further concluded that I am not so poor a horse that they might not make a botch of it in trying to swap.” 

Lincoln’s phrase spread quickly. Harper’s Weekly ran a political cartoon which showed “Old Abe” as a steady-looking, bearded horse with a voter sitting in the saddle. Lincoln’s opponent, George McClellan, was back in the bushes, surrounded with promises of peace and compromise.

In modern times, the message of sticking with your leader in times of trouble still resonates. Some pundits say that this is what allowed George W. Bush to beat out John Kerry and win a second term. Polls showed that more than anything, voters were concerned about global terrorism in the wake of the 9/11 attacks. That meant that Bush, the incumbent, was a natural choice for Americans who wanted strength and continuity.

The phrase doesn’t just apply to presidents. In 2009, in the middle of the global economic downturn, MarketWatch urged President Obama to leave Ben Bernanke in his post as chair of the Federal Reserve “so that he can ride out the financial crisis.” 

In March 2020, the New York Post suggested that Donald Trump was trying to portray himself as a “wartime president” in order to win the upcoming election. (The “war” he was engaged in was against the coronavirus pandemic.)

“Donald Trump, making his own case for re-election in the midst of the coronavirus pandemic, has been wielding martial rhetoric more and more frequently in his barrage of daily briefings on the unfolding calamity,” The Post wrote. The paper also cited White House economic adviser Peter Navarro, who had told Fox News, “We have essentially a wartime president now, and the war is against this coronavirus. And there can’t be any dissension in the ranks.” The Post reported that Trump’s poll numbers had been up ever since he shifted to the “war president” strategy.

Of course, the “don’t change horses” strategy is not foolproof. In 1932, Herbert Hoover was the incumbent president; his supporters tried to portray him as a sort of wartime leader who was battling the Depression. However, many voters blamed him for the Depression. When Hoover’s supporters urged voters not to change horses, voters chanted, “change horses or drown!”

Hoover was defeated by Franklin Roosevelt. Ironically, FDR went on to win re-election by urging voters, once again, not to change horses in midstream.

do-nothing Congress

In 1948, when President Truman was running for re-election, he frequently attacked the Republican-controlled Congress as the “do-nothing Congress.”

In fact, the 80th Congress passed 388 public laws, so the label was hardly literal. But the president charged that Republicans in Congress were blocking his “Fair Deal” legislation, which would have lowered housing and food costs. At a campaign event in New Jersey, in October of 1948, Truman said:

I have been trying to get the Republicans to do something about high prices and housing ever since they came to Washington.  They are responsible for that situation, because they killed price control, and they killed the housing bill.  That Republican, 80th “do-nothing” Congress absolutely refused to give any relief whatever in either one of those categories.

Some people say I ought not to talk so much about the Republican 80th “do-nothing” Congress in this campaign.  I will tell you why I will talk about it.  If two-thirds of the people stay at home again on election day as they did in 1946, and if we get another Republican Congress like the 80th Congress, it will be controlled by the same men who controlled that 80th Congress…

Since then, of course, the term “do-nothing Congress” has been applied again and again. In 2013, Politico ran a piece, titled “The (really) do-nothing Congress,” arguing that the 113th Congress was “on track to go down as the least productive in history.” The author, Manu Raju, noted that at the time of his writing, the 113th Congress had enacted only 49 laws, the lowest number since 1947.

A few years later, in 2016, the Washington Post’s Aaron Blake chided the 114th Congress for its inactivity. Blake saw Congress’s inaction as part of a larger trend. He wrote, “With the 114th session of Congress coming to an end, we can now take stock of just how much was accomplished over the past two years. And for this entire decade, the answer to that question has essentially been: not much.” Blake pointed out that Americans are more divided, politically, than ever before, making it harder for politicians to reach across the aisle and make deals.

That trend has only continued, and it’s been aggravated by a series of government shutdowns. By 2019, ABC News reported that members of Congress were themselves frustrated by their inability to pass laws. “What people want from Congress is plenty,” ABC News noted. “What they get can fall short.” Both sides were disappointed: House Minority Leader Kevin McCarthy (R-Calif.) said it had been “100 days of nothing,” while Senate Minority Leader Chuck Schumer (D-N.Y.) called that chamber a “legislative graveyard.”

Of course, as William Safire pointed out, politicians have been accusing each other of being “do nothings” for hundreds of years. In medieval France, the Merovingian dynasty became known as the “do-nothing kings”; in the 20th century, FDR mocked Herbert Hoover as a “do nothing” president. More recently, historians have tried to redeem Hoover’s legacy, but the image of him as a do-nothing president has stuck.


domino theory

The domino theory was critical in shaping US foreign policy during the Cold War. It held that if one nation became communist, its neighboring states would soon go the same way: each state that “fell” would set off a chain reaction, like a line of toppling dominoes.

The drive to contain communism was a major influence on the Truman administration, motivating it to begin providing aid to the French in Indochina. Truman also assisted Greece and Turkey in the late 1940s in an effort to contain the spread of communism.

However, it was President Eisenhower who really popularized the concept of domino theory. At a press conference in April, 1954, a reporter asked Eisenhower about his opinion on Indochina and its strategic importance. Eisenhower replied, in part:

You have, of course, both the specific and the general when you talk about such things. First of all, you have the specific value of a locality in its production of materials that the world needs. Then you have the possibility that many human beings pass under a dictatorship that is inimical to the free world.

Finally, you have broader considerations that might follow what you would call the ‘falling domino’ principle. You have a row of dominoes set up, you knock over the first one, and what will happen to the last one is the certainty that it will go over very quickly. So you could have a beginning of a disintegration that would have the most profound influences.

Eisenhower returned to this concept repeatedly, in speeches about the “aggressive” nature of communism and the need to contain its spread. The Eisenhower administration also used domino theory to explain why the US was intervening in Indochina but not, for example, in Franco’s Spain. The National Security Council agreed that it did not want to “police the governments of the entire world” – it did, however, want to fight communism and beat out Moscow.

Domino theory was closely linked to US presence in Vietnam. Both the Kennedy and the Johnson administrations used domino theory to justify escalating the American military presence in Vietnam. Henry Kissinger, the Secretary of State under Presidents Nixon and Ford, was a vocal defender of the theory. Speaking at a press conference in 1975, Kissinger said, “We must understand that peace is indivisible. The United States cannot pursue a policy of selective reliability. We cannot abandon friends in one part of the world without jeopardizing the security of friends everywhere.”

Decades later, President Reagan again referenced domino theory to explain US intervention in Latin America; Reagan argued that any communist presence in Latin America was a threat to the entire Western Hemisphere. “We believe that the government of El Salvador is on the front line of a battle that is really aimed at the very heart of the Western Hemisphere and eventually at us,” Reagan told reporters, adding that if El Salvador were to “fall” to communism, then “I think Costa Rica, Honduras, Panama and all of these would follow.”



Dixiecrat

The Dixiecrats were a group of Southern Democrats who broke away from their party in 1948 because they objected to the Democratic Party’s stance on desegregation. The Dixiecrats were also known as “States’ Rights Democrats.” They represent part of a massive shift in party allegiance that reshaped the politics of the South during the second half of the 20th century.

Up through the end of the Second World War, the Democratic Party dominated the US South; it was virtually impossible for Southern politicians to win office unless they were Democrats. There were rumbles of discontent in the 1930s, as many southern politicians objected to FDR’s social policies and to his support of the labor movement. Still, southern Democrats remained loyal to their party. 

That all changed in 1948, when President Harry Truman presented a pro-civil rights platform at the party’s political convention. A group of southern Democrats, led by Strom Thurmond, walked out of the convention in protest. Those men, who became known as the Dixiecrats, organized their own, separate presidential convention in Birmingham, Alabama; footage from the time shows participants waving Confederate flags as they strode into their convention hall. An estimated six thousand people from 13 southern states participated in the convention.

Their plan was to field their own presidential candidate in the upcoming election. They didn’t expect to win, but they hoped to earn all the southern states’ electoral votes, so that neither the Democrats nor the Republicans would be able to win the election.  This would have meant that the House of Representatives would decide the vote, and the Dixiecrats believed that southern states had enough power in the House to deadlock the outcome until Truman dropped his civil rights platform.

In the event, the Dixiecrats nominated Strom Thurmond as their candidate for president. Fielding L. Wright was nominated as vice president. Thurmond was a prominent opponent of desegregation efforts. A native of South Carolina, he went on to have a long career in the US Senate, where he eventually became the oldest serving senator (until he was overtaken by Robert Byrd of West Virginia). He made a name for himself as a staunch opponent of civil rights legislation and a proponent of military spending. In 1948, Thurmond received over one million votes. He carried four states (South Carolina, Mississippi, Louisiana, and Alabama) and won 39 electoral votes.

After 1948, the Dixiecrats never fielded another presidential candidate. However, they did meet later that year for a second convention, this time in Oklahoma City. There, they drew up and unanimously adopted a party platform. The platform called for an end to desegregation and for an increase in states’ rights. It read, in part,

“We stand for the segregation of the races and the racial integrity of each race; the constitutional right to choose one’s associates; to accept private employment without governmental interference, and to earn one’s living in any lawful way. We oppose the elimination of segregation, the repeal of miscegenation statutes, the control of private employment by Federal bureaucrats called for by the misnamed civil rights program. We favor home-rule, local self-government and a minimum interference with individual rights.” 

city on a hill

A “city on a hill” is a phrase used to refer to America’s supposed standing in the world, as a “beacon of hope” which other nations can look to for moral guidance.

The phrase can be traced back to the New Testament. In the Sermon on the Mount (as recounted in the book of Matthew), Jesus tells his followers:

You are the light of the world. A city set on a hill cannot be hidden. Nor do people light a lamp and put it under a basket, but on a stand, and it gives light to all in the house. In the same way, let your light shine before others, so that they may see your good works and give glory to your Father who is in heaven.

John Winthrop, who helped found the Massachusetts Bay colony, was the first person to apply the phrase to America. In 1630 Winthrop and a group of his fellow Puritans traveled from England to the New World to establish the new colony. While aboard the Arbella, Winthrop delivered a speech which has become known as the “city on a hill” sermon.

Winthrop told his fellow Puritans that they would have to work hard, sacrificing their own personal desires for the good of the community and for the sake of their religion: “for we must consider that we shall be as a City upon a Hill, the eyes of all people are upon us.”

In 1961, president-elect John F. Kennedy told the people of Massachusetts that, as he prepared to assume the presidency:

I have been guided by the standard John Winthrop set before his shipmates on the flagship Arbella three hundred and thirty-one years ago, as they, too, faced the task of building a new government on a perilous frontier.

“We must always consider that we shall be as a city upon a hill — the eyes of all people are upon us.”

Today the eyes of all people are truly upon us–and our governments, in every branch, at every level, national, state and local, must be as a city on a hill — constructed and inhabited by men aware of their great trust and their great responsibilities.

A few decades later, Ronald Reagan made frequent references to the “city on a hill.” Reagan made John Winthrop and the “shining city” the centerpiece of his farewell speech to the nation, at the end of his second term:

I’ve spoken of the shining city all my political life, but I don’t know if I ever quite communicated what I saw when I said it. But in my mind, it was a tall proud city built on rocks stronger than oceans, wind swept, God blessed, and teeming with people of all kinds living in harmony and peace – a city with free ports that hummed with commerce and creativity, and if there had to be city walls, the walls had doors, and the doors were open to anyone with the will and the heart to get here…

And she’s still a beacon, still a magnet for all who must have freedom, for all the Pilgrims from all the lost places who are hurtling through the darkness, toward home.

chilling effect

A “chilling effect” is a situation in which rights are restricted, often because of indirect political pressure or overbroad legislation. Chilling effect is usually used to refer to free speech restrictions.

The term, and in fact the doctrine, first became widespread in the middle of the 20th century. That’s when the courts were asked to respond to McCarthy era laws aimed at monitoring communist sympathizers. In a series of landmark cases in the 1960s, the Supreme Court ruled that even when they don’t explicitly infringe on speech, laws can effectively restrict speech through intimidation.

Today, we mainly use “chilling effect” to talk about the subtle ways that politics, money, and power can impact free speech. The phrase is in frequent use by people on all points of the political spectrum. It doesn’t always refer to free speech; a “chilling effect” can also deter people from taking unpopular political positions, or from carrying out certain actions.

In 2016, for example, a prominent critic of the Clintons argued that President Obama should not have endorsed Hillary Clinton for president. Peter Schweizer, the author of “Clinton Cash,” said that the endorsement was likely going to deter the FBI from investigating Hillary Clinton’s email setup.

“The timing is horrible,” he said of Obama’s endorsement. “The optics are horrible. And you’re not going to convince me, I don’t think anybody’s going to convince me, that this is not going to have some sort of chilling effect on the FBI.”

A few years later, Democratic lawmakers expressed concern that President Trump had allegedly silenced a whistleblower. The whistleblower in question claimed to have information about Trump’s conversation with the Ukrainian leader, in which Trump allegedly asked for an investigation of former vice president Joe Biden’s son.

However, as the Washington Post reported, Democrats weren’t just concerned about the whistleblower in that case. They were concerned, they said, about the knock-on effect this might have on future whistleblowers. “The President’s brazen effort to intimidate this whistleblower risks a chilling effect on future whistleblowers, with grave consequences for our democracy and national security,” said Adam Schiff, Elijah Cummings, Jerrold Nadler and Eliot L. Engel.

Around the same time, former FBI agents told CNN that they were concerned about a possible chilling effect within the FBI as a result of comments from President Trump and Attorney General William Barr. The former agents said that Barr’s “harsh” rhetoric was likely to stop current agents from “sticking their necks out” and undertaking politically risky investigations.

“These comments will have a chilling effect on the workforce,” said one recently retired agent who had handled surveillance warrants under the Foreign Intelligence Surveillance Act – the kind of warrant that, according to the inspector general’s report, had been abused.

Of course, the term “chilling effect” isn’t always about politicians and their actions. Sometimes, the phrase is used to describe a broader culture that discourages free speech. In 2016, for example, Conor Friedersdorf published an article in the Atlantic arguing that college campuses were becoming so obsessed with “political correctness” that they were dampening free speech.

chicken in every pot

“Chicken in every pot” was a Republican campaign slogan of the late 1920s. The slogan is often incorrectly attributed to Herbert Hoover, and it became a means for Democrats to attack Republicans as out of touch with economic reality.

The desire for there to be a “chicken in every pot” dates back at least to 16th century France. That’s when Henri IV supposedly wished that every peasant in his kingdom, no matter how poor, could have a chicken in his pot every Sunday. (It’s not clear whether Henri ever actually uttered these words, but the story persists.)

Centuries later, the phrase resurfaced in the United States. In 1928, a group of Republican businessmen created an ad touting the supposed gains the Republican Party had made for working Americans. The ad ran in the New York World and the headline read, “A Chicken in Every Pot.”

“The Republican Party isn’t a poor man’s party,” the ad began. It went on to say that “Republican efficiency has filled the workingman’s dinner pail – and his gasoline tank besides…Republican prosperity has reduced hours and increased earning capacity, silenced discontent, put the proverbial ‘chicken in every pot.’ And a car in every backyard, to boot.”

Later that year, Al Smith, the Democratic candidate for the White House, waved the ad around and quoted from it derisively. According to William Safire, Smith read out some of the ad to a waiting crowd and then asked his audience, “just draw on your imagination for a moment, and see if you can in your mind’s eye picture a man working at $17.50 a week going out to a chicken dinner in his own car with silk socks on.”

Hoover easily beat Smith in the 1928 election.  It’s worth noting that Hoover never actually promised Americans a chicken in every pot, as Smith suggested. But Hoover did run on a “prosperity” platform, promising ordinary Americans a better life. That may be why the “chicken in every pot” slogan stuck to him so well, and caused him so much trouble later on.

In 1932, Hoover was running for re-election and America was in the throes of the Great Depression. Many Americans blamed the president for the economic downturn, and the language of the time reflects it. Slums were known as “Hoovervilles,” and empty, turned-out pockets were known as “Hoover flags.” The promise of a car in every backyard and a chicken in every pot seemed laughable to many, which helps explain the record turnout to vote Hoover out of office.

Decades later, when John F. Kennedy was running for president, he dredged up the “chicken in every pot” slogan again. JFK attributed the quote to Hoover, expanding the original slogan to include twice as many chickens as before. In a speech in Blountville, Tennessee, JFK said:

“It is my understanding that the last candidate for the Presidency to visit this community in a Presidential year was Herbert Hoover in 1928. President Hoover initiated on the occasion of his visit the slogan ‘Two chickens for every pot,’ and it is no accident that no Presidential candidate has ever dared come back to this community since.”

cemetery vote

The “cemetery vote” refers to a form of voter fraud, in which votes are cast in the names of registered voters who have, in fact, passed away. The term is also sometimes used when a vote is improperly cast by someone who no longer lives in the electoral district.

It’s related to “ballot box stuffing.”

In 2016, a CBS investigation found that there had been “multiple cases” of votes being cast by dead men and women in Denver, Colorado. The votes were cast months, or years, after the actual voters passed away.

In one instance, CBS found that ballots had been cast in the name of a woman named Sara Sosa in 2010, 2011, 2012, and 2013. However, Sosa had died in October of 2009. Her husband, Miguel, died in 2008 but a vote was cast in his name in 2009.

Chicago has its own storied history of voter fraud, and the Chicago Tribune ran a tongue-in-cheek editorial after CBS unearthed the fraud in Colorado:

As Chicagoans, we have one thing to say: amateurs. The Denver investigation turned up only four confirmed cases of corpses actually casting ballots. And the whole scandal seems to be the work of a few hapless freelancers — presumably, someone getting hold of a ballot mailed to the home of the deceased and deciding not to waste it.

After all, the famous, cheeky order to “vote early and vote often” is closely linked to Chicago. Historians are not sure who, exactly, first uttered the phrase, but it’s always attributed to a Chicagoan: either the famous gangster Al Capone; Richard Daley, mayor from 1955 to 1976; or William Hale Thompson, who was mayor from 1915 to 1923 and again from 1931 to 1935.

In mid-century Chicago, Mayor Richard Daley was rumored to regularly engineer voter fraud so that his Democratic allies kept winning. Many people said that Daley was also behind John F. Kennedy’s victory in 1960; Daley supposedly “stole” the presidency for JFK by making sure that the Chicago vote went his way. That account has been disputed, but it remains a popular and widely believed story.

More recent reporting, though, indicates that cemetery votes are still being used in Chicago. A 2016 investigation by CBS2 found that over the course of the previous decade, 199 dead people had “voted” a total of 229 times.

A Board of Elections spokesman downplayed the investigation, dismissing the votes as accidents or as irrelevant mistakes. “This is not the bad old days,” Jim Allen told CBS. “There are just a few instances here where a father came in for a son, or a neighbor was given the wrong ballot application and signed it.” Allen argued that a number of the “fraudulent” votes were simple clerical errors.

The CBS investigation found that 60,000 voters had not been purged from the city’s voting rolls. In several cases, relatives of people who had passed away complained that they had repeatedly notified officials of their family members’ deaths – only to find out later that their family members had “voted” several times after death.

captive candidate

A “captive candidate” is one who is allegedly “owned” by special interests or political groups. Calling someone a “captive candidate” is similar to saying that they are the puppet or the pawn of an interest group.

As William Safire has pointed out, the phrase is often associated with Adlai Stevenson, the Democratic candidate for the presidency in 1952. Stevenson ran against the Republican Dwight D. Eisenhower. Eisenhower’s campaign accused Stevenson of being in the pocket of the Democratic political bosses and the labor unions. Republicans had been making similar charges against Democrats for decades, but by 1952 they were beginning to stick.

Stevenson, though, fought back by reclaiming the word. At first he was content to laugh off the Republican allegations. Soon, he turned the expression around and used it to attack Eisenhower. He argued that Republicans in general were beholden to big business and that their voting record proved it. In a Labor Day speech to his supporters, Stevenson depicted Eisenhower as the pleasant, slightly vacant face of the Republican party:

It’s a good thing the people have the Democratic Party to count on. For it’s a sure thing they cannot count on the Republican Party. The Republicans are still the party of the special interests, still the errand boys of the big lobbies, still the ones who want to exploit labor and the farmers and the consumers. The only thing different about them this year is that they are trying to hide behind a new face–their lonely, captive candidate.

They have tried disguises before. They always try to put a new face on the elephant at election time. But the disguise never works because the rest of the elephant is too big to hide–and the rest of the elephant has the record of Republican reaction written all over him.

We might not use the phrase “captive candidate” very often these days, but similar allegations get thrown around in every election cycle. One of the most common threads in political attack ads is the claim that one candidate or another is the tool of a special interest group.

In politics, the opposite of a “captive candidate” is probably a “grassroots-funded” candidate. During the 2020 primary season, both Elizabeth Warren and Bernie Sanders claimed that their campaigns were “100% grassroots funded.” The implication is that they weren’t beholden to special interests or lobbyists, since they were funding themselves through small donations from ordinary voters. As the Washington Post reported, this claim was a mixture of fact and omission.

Donald Trump put an interesting twist on all of this back in 2015, when he was running for the Republican presidential nomination. Trump repeatedly argued that his opponents were being bought by the special interests who contributed to their campaign funds. By contrast, Trump asserted, he was unbuyable – because he was so rich already that nobody could tempt him with cash.

As Politico reported, Trump imagined a scenario in which all of his rivals were beholden to their donors and would have to do the donors’ bidding down the line:

So their lobbyists, their special interests and their donors will start calling President Bush, President Clinton, President Walker. Pretty much whoever is president other than me. Other than me. And they’ll say: ‘You have to do it. They gave you a million dollars to your campaign.’


CREEP

The acronym CREEP is short for the Committee for the Re-election of the President, the organization that ran then-president Richard Nixon’s 1972 re-election campaign. The committee officially launched in 1971 and was originally abbreviated CRP; after the Watergate scandal, it retroactively became known as CREEP. Formed ostensibly to “do whatever it takes” to win Nixon a second term, the committee saw its members get caught up in the Watergate scandal, sending some of them to prison and all of them to infamy.

As described by Smithsonian: “The Committee to Reelect the President was organized to win a second term for Richard Nixon in 1972. Headed by former Atty. Gen. John Mitchell, CRP included many former Nixon White House staffers. As advertising and marketing plans for Nixon’s campaign moved forward in the spring of 1972, so did covert plans — wiretaps and other forms of harassment directed against the president’s opponents — that would eventually bring down the second Nixon administration.”

When Nixon set out to be re-elected, he faced fierce opposition and plenty of people he perceived to be “enemies.” As one account lays out, it was fertile ground for the formation of a committee like CREEP: “A forceful presidential campaign therefore seemed essential to the president and some of his key advisers. Their aggressive tactics included what turned out to be illegal espionage. In May 1972, as evidence would later show, members of Nixon’s Committee to Re-elect the President…broke into the Democratic National Committee’s Watergate headquarters, stole copies of top-secret documents and bugged the office’s phones.”

Among the more famous members of CREEP were campaign director John Mitchell and G. Gordon Liddy; both would ultimately be indicted.

In addition to its re-election duties, and its support of the burglars who broke into Watergate, CREEP was known to use money laundering and slush funds as part of its activities. As described by Vox, the committee also illegally attempted to interfere in the 1972 Democratic primaries by promoting the nomination of George McGovern, whom it considered easier to defeat. “CRP operative Donald Segretti was involved in many of the worst of these efforts, including fabricating multiple documents with stationery from Maine Sen. Edmund Muskie, the 1968 vice presidential nominee and a strong contender for the presidency that year.”

As part of one of the biggest scandals in political history, the legacy of CREEP is one of deception, burglary, illegal banking activity, forgery and perjury. As one summary puts it: “Besides bringing shame on the office of the President of the United States, the illegal acts of the CRP helped turn a burglary into a political scandal that would bring down an incumbent president and fuel a general mistrust of the federal government that had already begun festering as protests against continued U.S. involvement in the Vietnam War took place.”

concession speech

A “concession speech” is the speech a candidate delivers after the vote results are clear, publicly acknowledging defeat in an election. These speeches are typically delivered in front of supporters and, at their best, are well-choreographed political events.

Much has been written about the importance of a good concession speech. As noted in Newsweek: “One of the most sacred traditions in American politics is the loser of presidential elections conceding victory to the winner. The peaceful transition of power is one of the pillars on which the country’s democracy is built…”

A 2018 commentary in the San Diego Union-Tribune posits that “concession speeches are an important and necessary ritual.”

Additionally, a 2016 USA Today article points out: “How a candidate drops out can be as important as how he/she announces. A good model is Hillary Clinton, who, in conceding the Democratic nomination to Barack Obama in 2008, said that ‘although we weren’t able to shatter the highest, hardest glass ceiling this time … it’s got about 18 million cracks in it!’”

Political scientists and speechwriters study concession speeches. In a 2012 interview with NPR, former Reagan speechwriter Peter Robinson claimed that good concession speeches show “unity, gracefulness and also, frankly, a kind of fundamental humility,” using Al Gore’s 2000 concession speech as an example of one exhibiting all of those qualities.

A Time magazine article touches upon the history of the concession speech, tracing the first “congratulatory telegram” to the election of 1896, when William Jennings Bryan conceded to William McKinley. As noted in the article: “Al Smith gave the first radio-broadcast concession speech in 1928 and Adlai Stevenson first did so on television in 1952.”

The article goes on to point out the “formulaic” nature of concession speeches, adding “The basics of that formula are such: the speaker says that he or she has congratulated the winner—usually not that he or she has lost; the word ‘concede’ is rarely heard—to the opponent; the speaker calls for unity; the speaker summons supporters to both accept the result and to continue to fight for their cause in the future.”

While it’s clear that there is a certain formula to concession speeches, historians are quick to point to Hillary Clinton’s 2016 speech as veering from that formula, breaking from tradition by saying a word no other presidential loser has ever said: “sorry.”

While there is much debate about who delivered the best presidential concession speeches of all time, a 2016 Business Insider article put together a list of the Top 10.

compact of fifth avenue

In the summer of 1960, aspiring presidential candidate Richard Nixon met Nelson Rockefeller in Rockefeller’s New York City home to discuss Nixon’s campaign. What resulted from that meeting is known as the “Compact of Fifth Avenue.”

Also referred to as the Treaty of Fifth Avenue, the compact was a way for Nixon to receive the backing of Rockefeller, a powerful force in the Republican party. In fact, Rockefeller himself was considering seeking the nomination for the presidency that year. But when it was determined he couldn’t win, he decided to take on the role of kingmaker instead.

As one historical account lays out: “Rockefeller, though no longer seeking the nomination, was determined to influence the GOP platform. As critical as any Democrat of [Eisenhower] administration military policy, the New York governor strongly echoed the 1958 Rockefeller Brothers Fund report on national security, especially the recommendations for a mandatory national fallout shelter program, for accelerated ICBM development, and for bigger conventional forces.”

Rockefeller, long considered a more moderate voice than many Republican candidates, gave Nixon his support, but in return Nixon promised to incorporate Rockefeller’s agenda into his campaign and the ensuing presidency, should he win.

As described in the Denver Post: “This was no small tête-à-tête. Nixon succumbed to a GOP policy agenda executed by the Rockefeller, stamping this meeting as a key example of the post-New Deal moderation of the Republican Party.”

According to the description of the compact in a 1967 edition of Congressional Quarterly, its key points were: “It called for expansion and acceleration of the defense program; strong federal action to remove discrimination in voting, housing, education and employment; stimulation of the economy to achieve a minimum 5-percent growth rate; and a medical care plan for the aged.”

Many political scientists consider this compact to be a key component of the birth of modern conservatism. But the compact didn’t sit well with all Republicans: “The so-called Compact of Fifth Avenue created controversy at the convention; conservatives saw the treaty as a backdoor surrender of conservative principles to the moderate Rockefeller.”

Fifty-two years later, at the height of the 2012 Republican presidential primaries, the co-op where this so-called treaty was brokered made news again when it went on the market for $27.5 million. As one report put it: “Half-century later, that apartment is up for sale and the Republican Party continues to hash out divisions between conservatives like presidential hopefuls Gov. Rick Perry, Congresswoman Michele Bachmann and moderates like former Gov. Mitt Romney. Real estate prices have gone up, but in some ways, it’s just like it’s 1960 all over again.”

clothespin vote

A “clothespin vote” is a colorful term referring to a vote given to the “less objectionable” candidate despite a distaste for him or her. It’s commonly used during elections in which both choices are equally disliked. The concept is akin to “holding one’s nose and voting” and is closely related to the “the lesser of two evils” principle.

The term traces back to the tradition, particularly in cartoons, of depicting a person clamping a clothespin on his or her nose to ward off an unpleasant odor.

One notable example of a “clothespin vote” came in the French presidential election of 2002, which The Telegraph described as a choice between “cholera” Chirac and “plague” Le Pen, adding: “Many are promising to go to the polling booths tomorrow wearing rubber gloves, with clothes pegs on their noses as a symbol of their disgust.”

During the 2000 presidential election, William Safire lamented in the New York Times the choice between Al Gore and George W. Bush:

So I’m conflicted. But failing to vote is not an option; I know that even when one’s candidate does not win, choosers are never losers. Here’s my way out: our system offers us an opportunity to hedge our bets. Even when forced to cast a ”clothespin vote” (go explain that metaphor to a washer-dryer generation that has never seen a clothesline), we have a way to ease the pain of choice.

In the summer of 2016, disapproval of both Donald Trump and Hillary Clinton was so high that a website was launched to promote the idea. As it described the situation: “In excess of 25,000,000 Americans engaged in the primary process did not choose Clinton or Trump as a presidential candidate. The Washington Post cited in early June over 12,000,000 votes have been cast against Hillary and more than 15,000,000 votes against Trump. These voters have three options: don’t vote, write in, or cast a Clothespin Vote for one of the options on the ballot.”

There is much debate about the concept of the clothespin vote, and whether it’s worth voting at all if you strongly dislike both choices. As argued by the Foundation for Economic Education: “Every eligible voter will have to decide, based on his or her own conscience, whether the Common Good compels voting for the LOTE. Each will have to assess the relative moral harm of the candidates, based on their own values.”

A 2016 Psychology Today article suggests that the clothespin vote – or compromise – is a hallmark of democracy:

Some people do apply the ‘lesser of two evils’ logic in their everyday life. They bolt upright and say ‘I’m not going to take this anymore. No compromise!’ It rarely turns out well for them. So why do we apply that logic to a national democratic election? To live in a democracy means compromise. Why suddenly the proud switch to a no-compromise rule when most of us know better than to apply that rule in everyday life?

can't win technique

The “can’t win technique” is a campaign strategy used during the primary season. Typically, it means telling delegates and voters that your rival can’t possibly win the general election. The idea is to present one candidate as more electable, while diminishing the other, more exciting candidate.

In the mid-20th century, Robert A. Taft was an ambitious senator from Ohio; his father, of course, was the former president William Howard Taft. A fiscally conservative Republican, Taft led a coalition of Southern Democrats and Northern Republicans and was considered a formidable foe to President Truman. Widely known as “Mr. Republican,” Taft was seen as the natural opponent of the GOP’s moderate, East Coast wing.

Taft tried to win the Republican presidential nomination in 1940, 1948, and 1952. By the second time he ran, Taft’s rivals were declaring that “Taft can’t win.” And sure enough, Taft lost the Republican nomination in 1948. In 1952, Taft’s rivals again convinced the Republican party bosses that Taft didn’t have a chance. Eisenhower ran against him and won. The race is generally seen as a contest between a moderate (Eisenhower) and a hardliner (Taft), but it’s also an instance of the can’t win technique at work.

Further back in American history, Henry Clay fell victim to the can’t win technique. Clay, known as the “great compromiser,” was seeking the Whig nomination for the presidency in 1840. That’s when Thurlow Weed, a political boss who was convinced Clay could not defeat the incumbent, Martin Van Buren, started a campaign to persuade Whigs that Clay didn’t have a chance of winning; the nomination went to William Henry Harrison instead. Weed’s plan was effective. (Years later, a bitter-sounding Clay said, “I’d rather be right than be president.”)

In modern elections, voters often agonize over who the most “electable” candidate might be, and the more polarized the electorate, the more voters fret about electability. In the lead-up to the 2020 presidential election, many Democratic voters said that they wanted to choose whichever candidate had the best chance of defeating the incumbent president, Donald Trump. This led to what some journalists called a feedback loop, or a self-fulfilling prophecy.

As NBC News described the can’t win technique: “Some pundits and voters say one candidate — former Vice President Joe Biden — is the most electable, so they tell pollsters they want Biden, which produces media coverage that reinforces the idea that Biden is most electable, which then filters back down to voters concerned about ‘electability.’”

And in fact, some people – like the pollster Nate Silver – argue that it’s almost impossible to know who is truly electable:

Political scientists study electability, but electability ain’t no science. Instead, researchers say, it’s basically a layer of ex post facto rationalization that we slather over a stack of psychological biases, media influence and self-fulfilling poll prophecies. It’s not bullshit, exactly; some people really are more likely to be elected than others. But the reasons behind it, and the ability to make assumptions based on it, well …

clean sweep

In politics, a “clean sweep” occurs in an election when a candidate or party achieves an overwhelming or complete victory, winning in all or almost all districts or precincts. A related term is “landslide” or “wipeout” victory.

In open democracies that are deeply partisan, as in the United States, clean sweeps on a national level are uncommon. Even locally, true clean sweeps, in which one candidate or party receives the vast majority of the vote, are rare.

In the history of presidential elections, there has never been a true clean sweep, but in 1972 Richard Nixon won 49 of 50 states, and in 1984 Ronald Reagan lost only Minnesota and Washington, D.C.

Notably, clean sweeps are much more common in countries where corruption is built into the political system – in Belarus, for example, or in Saddam Hussein’s Iraq, where he commonly boasted of “clean sweeps” after winning 99% of the vote.

There are examples of legitimate clean sweeps in nationwide elections when voters overwhelmingly reject a country’s political class, as in the 2018 national elections in Barbados, in which the opposition party won every seat in the country’s parliament.

Sometimes, if an opposition party boycotts an election, a clean sweep can happen by default, as in the Jamaican elections of 1983, in which the conservative Jamaica Labour Party (JLP) “won all 60 House seats and formed a one-party legislature.”

One of the most storied true clean sweeps in electoral history took place in 1987 in New Brunswick, Canada, as described by the CBC: “Liberal Frank McKenna had expected to win, but he never expected this. His Liberal party has won every single seat in the New Brunswick legislature. A clean sweep like this has only ever happened once before, in P.E.I. in 1935. ‘I did not anticipate it, and I guess it really hasn’t sunk in yet, as to what it means,’ says a stunned McKenna in this CBC news clip. McKenna must now get to work to figure out how to run a government with no opposition.”

Not limited to politics, the term “clean sweep” is also used in sports, as in Major League Baseball, where there have been 21 World Series sweeps, in which a team has won the series without losing a game.

camelot

“Camelot” is a reference to President John F. Kennedy’s administration.

Kennedy’s brief, ill-fated presidency has been highly mythologized; some people point to it as a shining example of what the US government should look like. Calling that administration “Camelot” highlights its idealized qualities.

Camelot, of course, was the castle at the center of Arthurian Britain. In legend, King Arthur and his knights of the round table lived in Camelot, or at least they rested there in between adventures. (Camelot is an imaginary spot, but historians believe that it may have been based on a real location in Cornwall or Wales. In the same way, Arthur may have been based on a real Celtic leader.)

The word “Camelot” evokes utopian ideals and high hopes. King Arthur and his knights are supposed to be pure-hearted, chivalrous, and endlessly courageous. In the same way, the Kennedy administration is sometimes remembered as a period of optimism, expanding opportunities, and humanitarian goals. JFK has been lionized as a civil rights hero; he is also remembered for his dream of exploring outer space.

Jackie Kennedy, the widow of John F. Kennedy, was the first to refer to the JFK administration as Camelot. She gave an interview to Life magazine just days after JFK’s assassination. Jackie deliberately brought up Camelot during the interview, and even quoted from a popular musical of the day.

“Don’t let it be forgot, that for one brief, shining moment there was Camelot,” she said, borrowing a line from her husband’s favorite Broadway musical. Years later one of Jackie’s Secret Service agents, Clint Hill, wrote that Jackie had deliberately planted the reference. “She wanted to be sure he was remembered as a great president,” Hill said.

Of course, calling the JFK administration “Camelot” also implies a kind of monarchy. The Kennedy family is sometimes called “American royalty,” and pundits love to talk about how that family is the closest thing America has to a royal family. Linking the administration to one of the most famous kings in history just furthers that association.

It’s worth noting that JFK’s critics argued that he never managed to achieve most of his own high-flown goals. His plans to enact Medicare and to expand civil rights were postponed until the Johnson administration. His actions may have helped embroil the U.S. in Vietnam. And his Bay of Pigs invasion was a thorough failure. Even so, Kennedy’s idealism and charm have gone a long way to make him the most popular president in American history.

Decades after JFK’s death, Barack Obama tried to sum up the Kennedy legacy:

To those of us of a certain age, the Kennedys symbolized a set of values and attitudes about civic life that made it such an attractive calling. The idea that politics in fact could be a noble and worthwhile pursuit. The notion that our problems, while significant, are never insurmountable. The belief that America’s promise might embrace those who had once been locked out or left behind. The responsibility that each of us have to play a part in our nation’s destiny, and, by virtue of being Americans, play a part in the destiny of the world.

deep state

The “deep state” is a conspiracy theory which holds that there is collusion within the U.S. political system – a hidden government operating inside the legitimately elected government.

The term “deep state” was originally used to describe a shadow government in Turkey that disseminated propaganda and engaged in violence to undermine the governing party.

However, during President Donald Trump’s administration the term has come to refer to an organized resistance within the government, working to subvert Trump’s presidency. Trump allies blame career bureaucrats, many of whom they see as loyal to former President Barack Obama, for leaking damaging information to the news media.

A Washington Post article suggested that people close to Trump believe these longtime government workers amount to a “Deep State” that could undermine his presidency.

Rep. Steve King (R-IA) told the New York Times: “We are talking about the emergence of a Deep State led by Barack Obama, and that is something that we should prevent… I think it’s really the Deep State versus the president, the duly elected president.”

Former White House adviser David Gergen told Time magazine the term was appropriated by former Trump campaign manager Steve Bannon in order to delegitimize Trump’s critics.


cabal

A “cabal” is a group of people involved in a secret plot or conspiracy. Cabal can also refer to the plot itself, or to the secret organization of the plotters.

Cabal is derived from the Hebrew word Kabbalah, which refers to a mystical Jewish tradition centered on the “direct receipt” of scriptural knowledge. During the 16th and 17th centuries, Christians in Europe were becoming more aware of the Jewish tradition of Kabbalah; still, it was little understood and was associated with a sense of mystery and with secret, jealously guarded knowledge.

Cabals can operate on a small or a massive scale. There are, of course, true plots and cabals, some that might be hatched in “smoke-filled rooms.” But more often, cabals exist in the realm of paranoia. They’re often the subject of conspiracy theories, with people imagining that a secret cabal is plotting to take over the world. The BBC has noted, for example, that the Bilderberg Group, an organization of businessmen and politicians, has spawned any number of conspiracy theories. So has the Freemasons organization.

In modern times, conspiracy theories spread more quickly than ever. QAnon, for example, is a pro-Trump conspiracy theory which claims that a cabal of “Deep State” activists is plotting against the president. The left has its own theories about the Trump administration’s involvement in secret cabals.

In 2018 Ben Rhodes, who served as a national security adviser to President Obama, published an op-ed in the New York Times. It was titled “We Are Not a ‘Cabal,’ Just Critics of Trump.” Rhodes was responding to a National Security memo obtained by The New Yorker, in which members of the Trump administration claimed that a “network” of former Obama officials was working together to “undermine President Trump’s foreign policy.” Rhodes wrote that

“the memo is a glimpse into a world in which dissent is viewed as dangerous. A group of aides to a president from a different party must be part of some cabal, manipulating the media and working against America’s interests. This goes hand-in-glove with the “deep state” conspiracy, which has led President Trump to disparage the intelligence and law enforcement community, purge the State Department of expertise, urge investigations into political enemies and strip security clearances from former officials.”

Of course, the Trump administration is not the first administration to imagine that its enemies are conspiring against it. Back in the 1990s, the Clinton White House put together a 332-page report about an ongoing “conspiracy commerce” against the president. The memo claimed that there was a “vast right-wing conspiracy” against the Clinton White House and that the media was being used to spread false information about Whitewater and about the suicide of White House aide Vincent Foster.

The memo, obtained by the Wall Street Journal, confirmed what many reporters already believed about the state of mind at the White House.

The Washington Post wrote:

The conclusion has long been a favorite of Clinton loyalists: that a cabal of right-wing extremists had figured out how “fantasy can become fact” by advancing rumors about Whitewater and Clinton’s personal life through a “media food chain” that starts in ideological journals and ultimately finds its way onto the front pages of mainstream U.S. newspapers.

madman theory

The “madman theory” is a political theory commonly associated with President Richard Nixon’s foreign policy during the Cold War.

Nixon tried to make the leaders of hostile Soviet bloc nations think the American president was irrational and volatile. According to the theory, those leaders would then avoid provoking the United States, fearing an unpredictable American response.

The Atlantic notes that Nixon used the theory in April 1971, when he faced an impasse in negotiations with the North Vietnamese to end the Vietnam War. Nixon told national security adviser Henry Kissinger to convey that the United States might use nuclear weapons.

NIXON: You can say, “I cannot control him.” Put it that way.

KISSINGER: Yeah. And imply that you might use nuclear weapons.

NIXON: Yes, sir. “He will. I just want you to know he is not going to cave.”

Some believe that President Trump employed his own “madman theory” in dealing with several nations, though as Jim Sciutto argues in his book, The Madman Theory, it was probably sometimes intentional and sometimes not.

The concept of a madman theory dates back to at least 1517 when Niccolò Machiavelli wrote in The Prince that sometimes it is “a very wise thing to simulate madness.”



A boondoggle is a wasteful or extravagant project with no practical value. Usually, a boondoggle makes use of public funds and carries at least a whiff of corruption.

The word boondoggle dates back at least to the 1920s, when it was the name for a harmless boy scout craft. Scouts used their downtime to make lanyards and bracelets, and the craft was known as boondoggling.

Then, in 1935, the New York Times reported that the Works Progress Administration (WPA) had funded a $3 million program to teach white-collar workers shadow puppetry and boondoggling. In theory, the program was training people to teach underprivileged kids how to make arts and crafts out of reusable materials. However, readers were horrified at the sum being spent on lanyards. Ever since, boondoggling has been synonymous with wasteful government spending.

In 2012, The Atlantic wrote about what it called “the Federal government’s $10 billion plutonium boondoggle.” The magazine warned that, even as they wrangled over the details of a new student loan plan, both Democrats and Republicans were throwing away money in a “plutonium pit.”

From The Atlantic:

Some members of Congress are trying to restore billions in funding for a new factory at the Los Alamos National Laboratory to make plutonium cores for nuclear bombs that the military doesn’t need. Meanwhile, President Obama is plowing ahead with plans to make plutonium fuel rods for power reactors that no power company wants to buy. Together, construction costs for these two radioactive white elephants add up to over $10 billion, and rising.

In 2015, CNBC had a piece titled “The $20 Million Political Boondoggle That Just Won’t Die” about an elaborate project which involved shipping coal from the hills of eastern Pennsylvania all the way to the town of Kaiserslautern in Germany. The coal in question was used by a large US military installation, where people were under strict orders to burn only the anthracite coal shipped from the US. CNBC reported that the cost to taxpayers was $20 million per year – not even accounting for the cost of transport.

Boondoggle is often used in ways that are roughly synonymous with “slush fund” and “grifting.” In 2020, amid the economic downturn caused by the coronavirus pandemic, politicians started throwing around accusations about boondoggles. Lisa McCormick, a candidate for US Congress from New Jersey, criticized what she called the “COVID boondoggle.” McCormick said that relief funds appropriated by Congress to ease the economic pain of the global pandemic had been misapplied, and implied that they were disproportionately benefiting Trump donors.

Said McCormick: “The Congress enacted this massive appropriation without safeguards or oversight to ensure that taxpayers would be protected, and now the money is gone and only predators seem satisfied while 22 million Americans are filing for unemployment. In addition to the $2.2 trillion included in the bill is another trillion that the Federal Reserve will distribute as part of Trump’s slush fund.”



A “boodle” refers to a large sum of bribe money or graft money.

Boodle can also be used to mean a large collection of something. In fact, some linguists believe that the phrase “the whole kit and caboodle” is a corruption of the phrase “the whole kit and boodle.” However, “boodle” is rarely used in this sense today.

The word “boodle” originally comes from the Dutch “boedel,” meaning wealth and riches. Boodle was first used in its modern sense of dirty money in 1858.

Today, boodle is often used to refer to ill-gotten gains by grifters. In 2019, the New Republic wrote about the Ukrainian oligarch Kolomoisky, who has been accused of laundering millions of dollars through US real estate purchases and shell companies. The New Republic wrote that:

Instead of plopping his funds in Manhattan high-rises or Miami beachfronts, Kolomoisky’s network tried a different tack, opting to stuff his boodle in metallurgy plants across the Rust Belt and buildings in downtown Cleveland.

Boodle can also mean political spoils, or an undeserved windfall. In this sense, the boodle might not be illegally come by. This is sometimes called “honest graft.” But using the term “boodle” suggests that the recipient doesn’t truly deserve the money.

In 2019, for example, the Boston Globe wrote about what it dubbed the “tale of two welfare programs.” The newspaper criticized the bailouts that the Trump administration was handing to American farmers in the midst of cuts to the SNAP food stamp program. The Globe called the payouts to farmers inefficient and wasteful, writing,

“Some of the boodle is going to people who are barely farmers at all. (Hey, Senator Grassley!) Most of it is buoying not mom and pop farms, but the giant operations that gobble them up.”

The term boodle is often associated with the kind of corruption found in machine politics. In Chicago during the Gilded Age, the city’s government was rife with low-level graft. Many of the city’s politicians were Irish American and were referred to as “Irish boodle politicians.” During this time, boodling also referred to the practice of selling city franchises to private businessmen.

In 19th century America, sheriffs had their own kind of specialized boodle. According to most state and local laws, authorities were allowed to arrest vagrants and lock them up in jail. They were assigned funds to feed the prisoners and run the jails, but often pocketed most of that money. The jails which housed vagrants came to be known as “boodle jails.”

The word boodle is used in a few other countries, generally with a different meaning than in the United States. In South Africa, boodle means money but does not carry the negative connotation that it does in the US; the meaning seems closer to what Americans would call a “bundle.” Boodle Loans is one of the leading payday lenders in South Africa.

In the Philippines, a “boodle fight” is a meal that’s spread out on a table and eaten without utensils. “Boodle” refers to the plentiful food, and “fight” refers to the fact that, since everyone is sharing, they end up fighting to get the most food. The tradition started in the Filipino military, where it was supposed to instill a sense of brotherhood among soldiers.

bleeding heart

The term “bleeding hearts” refers to people who care deeply — so deeply that their hearts bleed — about the suffering of the needy. The term is almost always derogatory. It’s usually applied to those on the left, hence the phrase “bleeding heart liberal.”

“Bleeding hearts” has a long history in literature. Merriam Webster points out that the phrase dates back at least as far as Geoffrey Chaucer. Chaucer’s 14th century poem Troilus and Criseyde describes the experience of unrequited love as having a bleeding heart. “Bleeding heart” can also have religious connotations; in Christian religious writing, there are many references to Jesus’s feelings for the poor and the downtrodden.

Today, a “bleeding heart” is someone who empathizes very strongly with the oppressed and the poor.  A bleeding heart liberal is someone who wants social programs and government safety nets to care for the poor. Bleeding hearts are criticized by their enemies for being naïve at best, or hypocritical at worst; the term is often used as an insult. The natural opposite of a bleeding heart liberal is a “heartless conservative.”

A newspaper columnist named Westbrook Pegler first used “bleeding heart” in a political sense in the 1930s. Pegler was a frequent critic of Franklin Roosevelt’s programs, which he believed used a humanitarian façade to win votes and rouse people’s emotions. In 1938, Pegler denounced an anti-lynching bill which, he thought, didn’t address the nation’s real problems:

I question the humanitarianism of any professional or semi-pro bleeding heart who clamors that not a single person must be allowed to hunger but would stall the entire legislative program in a fight to ham through a law intended, at the most optimistic figure, to save fourteen lives a year.

American liberalism probably reached its peak expression during the administration of Lyndon Johnson, whose “Great Society” programs aimed to roll back poverty and expand educational opportunities for Americans from all walks of life.

By the late 1960s, though, some former bleeding hearts were experiencing a change of sentiment. As the Brookings Institution has noted, that’s when some “maverick liberals in government and academia” started to question their own long-held assumptions. They wondered why the kinds of liberal programs they’d pushed for hadn’t, in fact, made a major difference in society; they questioned whether social policy was beginning to have unintended and negative consequences. These doubters became known as neo-conservatives.

Not every bleeding heart is a liberal, of course. Jack Kemp was a Republican politician and a self-described “bleeding heart conservative.” Kemp talked often about his sympathies for the poor and for minorities; he believed that his particular brand of conservative economic policies would improve their lives. He famously helped architect the Reagan administration’s sweeping tax cuts, which he argued would benefit the poor just as much as the wealthy. Kemp later tried to reform the nation’s public housing system by giving residents the chance to become owners of their own apartments.

big tent

In politics, a big tent refers to an inclusive party which encourages a wide swathe of people to become members. The opposite of “big tent” would be a party which is narrowly focused on only a few issues, or which caters to a particular interest group.

Merriam Webster notes that the expression was first used in its current, political sense in 1975. Big-tent can also be used as an adjective.

The benefits of having a big tent are obvious. A big-tent party can amass support from a huge range of voters. It isn’t beholden to any one group, since it has a broad base of support. Losing one group of voters doesn’t spell its political end.

Having a big tent liberates a party from the need for a “litmus test” or an ideological “purity test,” as Barack Obama pointed out during the 2019 primary season. Speaking to a group of Democratic donors in California, Obama warned against limiting the reach of the party. “We will not win just by increasing the turnout of the people who already agree with us completely on everything,” Obama said. “Which is why I am always suspicious of purity tests during elections. Because, you know what, the country is complicated.”

On the other hand, some politicians argue that having a big tent means that a party can lack focus. Alexandria Ocasio-Cortez, a democratic socialist member of Congress from New York, has sometimes criticized the Democratic Party for being too inclusive.

In January of 2020, Ocasio-Cortez grumbled to New York Magazine that “Democrats can be too big of a tent.” She complained that even the term “progressive” had been watered down and had lost its meaning. The Congressional Progressive Caucus should impose some kind of rules about who could join, Ocasio-Cortez said, but instead “they let anybody who the cat dragged in call themselves a progressive. There’s no standard.”

It can be hard for pundits to agree on the practical definition of “big tent.” Is the Republican party a big tent party right now? Journalists periodically get excited about the Republican Party’s growing tent, especially when the party appears to veer away from social conservatism. In 2003, the New York Times wrote that the election of Arnold Schwarzenegger as governor of California was a sign that the party was, indeed, opening up. Citing the long-time Republican consultant Frank Luntz, the Times wrote that

A Schwarzenegger victory would send a strong message that the Republican Party is a tent big enough to include a pro-abortion, pro-gay rights Hollywood superstar who has acknowledged manhandling women and smoking marijuana.

In 2016 Mitchell Blatt, writing in The Federalist, argued that the GOP is indeed the “real” big tent party; Blatt pointed to the fact that the Republican presidential candidates talked about socialized healthcare and the decriminalization of drugs. He saw this as an indication that the party was expanding into new territory. However, NPR has argued that the Republicans’ big tent is “lily white,” which ultimately limits just how big-tent the party can truly be.

big government

Broadly speaking, “big government” is a political term that refers to how much influence the federal government has on the day-to-day lives of American citizens. More granularly, as defined by the Brookings Institution, it refers to how much government spends, how much it does, and how many people it employs.

Even more granularly, how “big” the government is can be determined by the number and scope of federal agencies, how much money goes into that bureaucracy, the amount of legislation and/or regulations that the government passes, and how much financial help the government provides to people in need.

Over the years, the concept of “big government” has become a highly partisan rhetorical issue, with most conservatives against it and liberals more in favor of it. In 2017, a Gallup poll revealed that 67% of Americans considered “big government” the biggest threat to the country, adding: “Republicans primarily drive this pattern, as they consistently show more concern than Democrats about big government — and even more so when Democrats occupy the White House.”

One of the early battlegrounds over the size of government was in 1933. With FDR poised to take over and vastly expand the government with his New Deal, Herbert Hoover warned of the consequences. From The Atlantic: “Throughout the campaign, Hoover had attacked what he considered a ‘social philosophy very different from the traditional philosophies of the American people,’ warning that these ‘so-called new deals’ would ‘destroy the very foundations’ of American society. As Hoover later put it, the promise of a ‘New Deal’ was both socialistic and fascistic; it would lead the country on a ‘march to Moscow.’”

The success of the New Deal notwithstanding, over the last 50 years, many Republicans have run campaigns with anti-big government messaging as one of their main tenets. In 1986, Ronald Reagan famously said the nine most terrifying words in the English language are: “I’m from the government and I’m here to help.”

While most Democrats embrace a larger role for government, many political scientists attribute the success of Bill Clinton in the 1990s to his ability to co-opt the right’s anti-big government messaging. Indeed, in his 1996 State of the Union address, Clinton famously proclaimed “the era of big government is over.” From The Hill: “Republicans were aghast. He was stealing their issues. He was talking like a Republican, but he was still a liberal at heart. Republicans knew that if Clinton governed like them, he would be hard to beat. They were right. He was hard to beat, and Bob Dole didn’t put up much of a fight.”

The role of government will always be debated, and it will continue to expand and contract with the times. As described in The New Republic, the age-old argument about how big a role the government should play took on new resonance during the coronavirus pandemic: “It didn’t occur to the right that a more terrifying series of words than ‘I’m from the government, and I’m here to help’ would turn out to be ‘I’m from the government, and I guess I anticipated that the private sector would have engaged.’”

bargaining chip

In politics, a “bargaining chip” refers to something that is used as leverage in a negotiation, an attempt to pass legislation, or an effort to get concessions from another party.

More often than not, the term is used cynically, or in a pejorative sense, since politicians often use “bargaining chips” to gain advantage without concern for how the parties being used as the “chips” are affected.

In 2018, when reporting on President Donald Trump’s push for immigration reform, The Washington Post accused the president of using dreamers as “leverage in a high-stakes game of political horse-trading.” The same year, the Post suggested Trump was also using federal employees as “bargaining chips” to try to pressure Democrats to fund his border wall.

U.S. Presidents and politicians have a long history of using bargaining chips in the process of pushing proposed legislation or political agendas. A 2008 U.S. News and World Report article describes the relationship between Bill Clinton and Newt Gingrich:

Although the abortion issue was important for the large contingent of social conservatives in his party, Gingrich viewed it as a bargaining chip that could be used to exact concessions from Democrats on issues that were more important to him, such as increased spending for defense and space exploration. Neither he nor Clinton wanted it to block their larger agenda.

Bargaining chips are not limited to domestic politics; they have long been a tool of international diplomacy, even between enemies. During World War II, FDR famously used financial aid to the USSR as diplomatic leverage: “Roosevelt, mindful of the inherent conflict between American democracy and Soviet communism, counted on using U.S. military aid to the Soviet Union as a bargaining chip in post-war diplomatic relations.”

A more recent example is outlined in a 2015 article in The Diplomat: “The history of the S-300 in Russo-Iranian relations shows that this particular weapons system, long sought by Tehran and dangled just out of reach by Moscow, serves primarily as a bargaining chip for the Kremlin in its relations with the West and is unlikely to actually be delivered to the Iranian military.”

One of the most controversial uses of bargaining chips throughout history is the taking of hostages, most famously by the Iranian regime and, more recently, by North Korea. Since the early 1970s, the United States has maintained a policy of not acknowledging hostages as bargaining chips and not engaging with foreign governments that use them as leverage.

However, in 2020, the New Yorker magazine warned of a more recent erosion of this long-standing policy.

balanced ticket

A balanced ticket is a pairing of political party candidates designed to appeal to a broad swathe of the electorate. A balanced ticket normally includes candidates likely to be approved of by different racial, regional, and religious groups.

The term was first used in 1937, according to Merriam Webster.

In modern presidential elections, the presidential nominees choose their running mates carefully, hoping to create a balanced ticket. This is going to look different for every candidate. The general rule is that the running mate should possess whatever important quality the presidential candidate is lacking. Of course, this will also depend on the electorate and their perceived requirements.

In 2008, Barack Obama announced that Delaware senator Joe Biden would be his running mate. Biden was seen as a foreign policy expert; he was also someone with decades of experience in Washington. In that way, he balanced out Obama’s relative lack of experience and lack of foreign policy credentials.

Obama’s opponent, John McCain, picked a relatively unknown running mate – Sarah Palin, the governor of Alaska. While Palin was criticized as a lightweight, some pundits argued that, in fact, picking her was a stroke of genius. Her freshness and energy could balance out McCain’s age and experience. The McCain-Palin ticket was a combination of political insider and brash outsider, in much the same way as the Obama-Biden ticket.

In 2016 the Democratic presidential nominee, Hillary Clinton, chose Senator Tim Kaine as her running mate. Analysts pointed out that Kaine represented Virginia, a “battleground” state; Kaine also spoke fluent Spanish and had “working class roots.” But Kaine was also expected to balance the ticket by the sheer fact that he was a white man, a demographic which the Clinton campaign was struggling to win over.

Clinton’s opponent, Donald Trump, chose Senator Mike Pence as his running mate. In an op-ed in the Washington Post, Andrew Downs wrote:

Donald Trump’s presidential campaign has been unconventional, but the naming of Indiana Gov. Mike Pence as Trump’s vice presidential choice is quite conventional. Pence balances the ticket in almost every way.

What many people may notice first is how Trump’s and Pence’s personalities balance each other. Trump is unpredictable, forceful and, at times, impolite. Pence is predictable, some might say to a fault. Pence does not shy from a fight, but “forceful” is not a word that is used often to describe him. Pence is Midwestern polite.

Some analysts argue that balancing the ticket isn’t always a great idea. Yes, a balanced ticket is likely going to appeal to a broader group of voters. But in some cases, it could turn off core voters who were energized by the nominee and might not necessarily want those qualities to be evened out.

In this vein, a 2020 op-ed in the New York Times argued that the practice of balancing the ticket is outdated and that it’s far more important for the candidate to pick someone they align with. The piece argued that “imposing a running mate for the purpose of pushing the nominee’s positions in a certain direction can lead to a tense and nearly dysfunctional White House.”



A “backgrounder” is a not-for-attribution briefing for members of the media. Reporters are free to report on what they learn at a background briefing but are normally restricted in how they cite their sources. Merriam Webster says that the term was first used in 1942.

Government officials give background briefings in order to announce, for example, new legislative proposals or new developments in foreign relations. International organizations (like the World Bank or the IMF) also give background briefings. So do advocacy groups and think tanks announcing new developments in their field, and businesses launching new products.

Like a pen-and-pad briefing, a background briefing is never filmed or broadcast; its purpose is to inform reporters only. Some backgrounders allow reporters to join by phone or video link, while others are held in person. Briefers are normally granted anonymity; in articles about the briefing, they’re cited as “administration officials” or similar. Depending on the briefers’ preferences, journalists may not be allowed to use direct quotes in their reporting.

The term backgrounder could also refer to the packet of research and/or photos sent to reporters to give them information on a given topic. That background information is sometimes referred to as a white paper. And the term backgrounder is sometimes used to mean an “explainer” piece run in a newspaper. In that case, the backgrounder is a long-form article meant to give readers context on an ongoing issue in the news.

Advocacy groups and think tanks often publish their own backgrounders, which are meant to provide foundational information on a little-known issue. However, these reports can also reflect the bias or narrow focus of the groups which produce them.

Political reporters are often critical of background briefings, which are perceived as a way for the government to “spin” the media. Writing in the Atlantic in 2015, Ron Fournier complained that reporters were too willing to let public relations teams guide their coverage of politics, business, and sports.

Fournier argued that background briefings allow the government to get their messages into the press without having to stand behind what they say. Because officials are speaking anonymously, they never have to take public responsibility for the things they’ve said. Fournier wrote,

“Of the many ways that modern journalists cede power to authority, none is easier to fix than the notion that government officials are allowed to gather several reporters in a room or on a conference call, spew their clever lines of lies and spin, and declare it all ‘on background’—shielded from accountability ‘on condition of anonymity.’”

Writing in the Washington Post, Paul Farhi also noted that background briefings are a way for the government to control the news cycle. Not only do officials impose rules about reporting, they also handpick the journalists allowed to attend their briefings, which creates an incentive for journalists to try and stay on officials’ good side.

Farhi wrote, “White House reporters tend to view the background briefings as a kind of mixed blessing. While they bristle at the rules, they say the briefings occasionally generate useful information they wouldn’t learn another way. Quoting a senior official without identifying him is imperfect, they acknowledge, but better than nothing.”

back channel

A “back channel” is an unofficial means of communication between two nations or two political entities. “Backchanneling” is also used as a verb, to refer to the act of holding behind-the-scenes talks.

Back channels are often used when two governments don’t have formal diplomatic relations with each other. The United States and North Korea, for example, often rely on back channel diplomacy when they want to exchange messages. They have a few long-established informal “channels” of communication. One is in New York, where North Korea’s ambassadors can meet with US officials at UN headquarters. (North Korea does not have a diplomatic presence in Washington DC, and the US doesn’t have a diplomat stationed in Pyongyang.)

Even when a high-profile meeting does take place between US officials and North Koreans, that meeting has normally been preceded by extensive backchannel communication. That’s because the public meetings are rare and are usually high-stakes. Both nations rely on backchanneling to help prepare ahead of time and lay ground rules.

Before the historic summit between President Trump and Kim Jong Un, for example, the two countries held private talks to plan out all the details. Preparatory back channel talks typically cover issues like location, timing, and topics to be included.

The US government uses back channels in its communication with Iran, Cuba, and other nations or groups which it doesn’t have formal relations with. In some cases, third parties – countries which have diplomatic relations with both of the countries – can facilitate discussions.

Further back, private channels have sometimes paved the way to establishing diplomatic relations between two states. Early in Franklin Roosevelt’s first term, his administration used back channels to make contact with the Soviet Union. The State Department was not involved in the private talks, which eventually led to formal diplomatic relations between the US and the USSR.

During the Kennedy administration, Robert Kennedy served as an informal go-between in negotiations between the US and the USSR. Those back channel talks helped defuse tensions over the Cuban missile crisis and eventually led to the Soviets pulling their missiles out of Cuba. And during the Nixon administration, the US and Soviets kept a shaky peace thanks, in part, to an ongoing back channel between Henry Kissinger and Soviet ambassador Anatoly Dobrynin.

Since 2016, there have been a number of concerns raised about the private channels of communication used between the Russian government and Donald Trump’s team. Before Trump’s election, members of his campaign reportedly had close contacts with Russian officials and with “operatives” linked to Russia. These contacts were not disclosed to those outside of the campaign, but at least nine other campaign staffers were aware of them.

After Trump was elected, Trump’s son-in-law, Jared Kushner, reportedly discussed the possibility of setting up a back channel between the Trump transition team and the Kremlin. Russia’s ambassador to the US, Sergey Kislyak, told Moscow officials that Kushner had made that suggestion to him personally during a meeting in Trump Tower in December 2016. Kushner later denied the report.

It is unusual for a president-elect to set up a back channel with another state. However, PBS notes that it’s not unprecedented; Richard Nixon, Ronald Reagan, and Barack Obama were all accused of doing just that.



In politics, a “bellwether” refers to a geographic area whose political beliefs and voting preferences reflect those of a wider area.

For example, a county might be said to be a “bellwether county” if it consistently votes the same way as the majority of the state. A state is considered a “bellwether state” if it usually votes the same way as a majority of the country.

According to Merriam-Webster, the term is derived from the Middle Ages:

Long ago, it was common practice for shepherds to hang a bell around the neck of one sheep in their flock, thereby designating it the lead sheep. This animal was called the bellwether, a word formed by a combination of the Middle English words belle (meaning “bell”) and wether (a noun that refers to a male sheep that has been castrated).

One famous example of a specific bellwether in politics is known as the “Missouri Bellwether”: the state of Missouri voted for the winning presidential candidate in every election from 1904 through 2004, with the sole exception of 1956. The phenomenon is examined in a 2016 article from The American Conservative.

Political scientists spend a great deal of time analyzing bellwethers to try to discern voting patterns and political affiliations. But as pointed out by NPR in 2008, true bellwethers are becoming harder and harder to come by as demographics change on both local and state levels: “Every election season, reporters fan out to states and counties that claim to be political bellwethers. After all, if the voters in these places have been right in the past, maybe they’ll be right again. But in presidential politics, there are actually few true bellwethers left.”

Another example is the state of Ohio, long considered a bellwether for national voting preferences, which began losing that status as the state trended more conservative, as noted by the New York Times in 2016.

A Michigan State University academic paper points out that the concept of a bellwether is no longer held in as much regard as it once was:

Bellwethers aren’t traditionally embraced by political science. But that is because the concept is traditionally measured in a very poor way. Counties are termed bellwethers based on a history of coincidence wherein a county has picked the winner of the state or country for the last X elections, and what defines a successful county pick is a very arbitrary 50% cutoff.

To make the concept of a bellwether more precise and accurate, political scientists have identified three types of bellwethers:

  • All-or-Nothing Bellwethers: these bellwethers are states or counties that have chosen the national presidential winners with a great deal of accuracy, such as Missouri, Ohio and New Mexico.
  • Barometric Bellwethers: these bellwethers are ones that best reflect the national vote percentage.
  • Swingometric Bellwethers: these bellwethers are counties or states that directly reflect the swing in voting on a national level, and tend to vote differently depending on the specific candidate or issue.

A related term in politics is “battleground.”



In politics, a Bircher is an adherent to the teachings and philosophies of the John Birch Society, an anti-communist organization founded in 1958. The heyday of the Bircher movement was in the 1960s and early 1970s, when the organization had 60 staffers and over 100,000 paying members, in addition to an estimated 4 to 6 million followers nationwide.

Most active in the aftermath of the McCarthy era and into the 1970s, Bircherism has mostly been defined by its support for limited government and an antipathy towards wealth redistribution, unionization, communism, workers’ rights, and socialism.

While active in mainstream politics, Birchers have long had a reputation for being mainly a fringe organization. From The Conversation:

Birchers expressed a belief in domestic communist conspiracies. They went so far as to accuse President Dwight Eisenhower and Chief Justice Earl Warren of being communist dupes and agents – building on the legacy of Sen. Joseph McCarthy whose movement of predominantly Midwestern Republicans found the society’s agenda appealing.

Positions over the years have been controversial and wide-ranging, including opposition to Civil Rights and the Equal Rights Amendment for women’s equality, and a hard-line stance on immigration.

The Birchers’ most notable presidential endorsement was in 1964, when they backed right-wing firebrand Barry Goldwater. A Goldwater spokesman was once quoted as saying about the Birchers: “All those little old ladies in tennis shoes that you called right-wing nuts and kooks…they’re the best volunteer political organization that’s ever been put together.”

Among the Birchers’ most complicated relationships was the one with Richard Nixon, whom the Birchers considered an enemy, in part because of the group’s fervent opposition to the Vietnam War. In turn, Nixon once said that the Birchers were a fringe group that would “pass.” Yet even with that history of antagonism, Society founder Robert Welch was quoted in a 1975 New York Times article calling Nixon’s ouster part of an “international communist conspiracy.”

While most of the tenets of Bircherism fell out of favor in recent decades with the rise of neoconservatism, the presidency of Barack Obama and the subsequent election of Donald Trump have led to a resurgence of Bircher philosophies – fueled mainly by conspiracy theories – particularly in Southern states such as Texas.

From a 2017 Politico article:

This is what the 21st-century John Birch Society looks like. Gone is the organization’s past obsession with ending the supposed communist plot to achieve mind-control through water fluoridation. What remains is a hodgepodge of isolationist, religious and right-wing goals that vary from concrete to abstract, from legitimate to conspiracy-minded – goals that don’t look so different from the ideology coming out of the White House.

Political scientists today draw a straight line from the Birchers of the 1960s and 1970s to political heavyweights such as the Koch Brothers and the emergence of the Tea Party Patriots in Republican politics.

Chatham House Rule

The Chatham House Rule is a system for holding discussions on potentially controversial topics, particularly in politics and public affairs.

At a meeting held under the Chatham House Rule, you are free to use information from the discussion, but you are not allowed to reveal who specifically provided it. The rule is intended to increase openness of debate. It also allows individuals to speak for themselves and not necessarily for affiliated organizations.

Specifically, the rule states:

When a meeting, or part thereof, is held under the Chatham House Rule, participants are free to use the information received, but neither the identity nor the affiliation of the speaker(s), nor that of any other participant, may be revealed.

The rule is invoked by the host of the meeting stating up front that the meeting is operating under the Chatham House Rule. Of course, the effectiveness of the rule relies on trust and sometimes requires disciplinary action, such as exclusion of the violating participant from future meetings.

The rule is essentially a compromise between private meetings, where revealing what was said is forbidden, and on-the-record events, where remarks are usually attributed to the speakers.

As a result, it is typically not used in an official setting where public meetings of lawmakers and government officials must be open to the public.

The rule is named after the headquarters of the U.K.’s Royal Institute of International Affairs, based at Chatham House, London, where the rule originated in June 1927. The rule was refined in 1992 and 2002. The Chatham House building was once home to three British Prime Ministers.

The rule has also been translated into several languages.

ballot box stuffing

In politics, “ballot box stuffing” refers to the practice of illegally submitting multiple votes in an election in which each voter is permitted only one. The goal of ballot box stuffing is to rig the outcome of an election in favor of one candidate over another.

The term is often synonymous with “electoral fraud” or “voting irregularities.” One form of the tactic is leveraging the “cemetery vote.”

While not widespread in the United States, the practice of stuffing the ballot box is considered common in corrupt countries such as Russia. As described by a 2018 AP article:

CCTV footage of a voting station in the Moscow suburb of Lyubertsy shows a woman taking a ballot from a table, looking around to see if anyone is watching, then putting it in the box. She repeats the action, again and again.

And from a 2012 Foreign Policy magazine article:

Sometimes election fraud can be laughingly obvious. When Vladimir Putin took 99.8 percent of the vote in Chechnya in this year’s Russian presidential election, it probably wasn’t because the republic where he had violently crushed an armed insurgency a little more than a decade ago had developed an overwhelming affection for him.

However, ballot box stuffing is not limited to Russia. A 2019 New York Times article reported on alleged ballot box stuffing in the election in Afghanistan:

Abdul Wahed Nasery, another elder from the district, said local strongmen had stuffed the boxes. ‘They sat together, and each filled for their guy. They were saying, ‘We can’t leave these boxes empty,’ Mr. Nasery said. ‘We said, ‘But what about the biometric verification?’ ‘They said, ‘Who is going to look?’

In the U.S., some of the most notable examples of ballot box stuffing come not from political elections but from sports. In 1957 and again in 1999, All-Star ballots for Major League Baseball were tainted by fans stuffing the ballot box for their teams’ players, and in 2015 Kansas City Royals fans were accused of trying to tip the votes in favor of their roster, forcing the league to throw out some 65 million votes.

During the coronavirus pandemic in 2020, there was a push to replace in-person voting with mail-in ballots. Many people, mostly on the conservative side of the aisle, including President Trump, suggested that mail-in ballots were more susceptible to “ballot box stuffing” than traditional voting methods, though no evidence exists to corroborate that claim.

amiable dunce

Ronald Reagan’s critics often referred to the president as an “amiable dunce.” The phrase was meant to suggest that Reagan was friendly and likeable, but fundamentally not very bright.

Clark Clifford, the former Defense Secretary and presidential adviser, coined the term. He made the remark at a private dinner party and probably never intended for it to be repeated. However, his hostess was secretly recording the dinner conversation. A copy of the recording was leaked to the Wall Street Journal, which published Clifford’s remarks.

Explaining himself afterwards, Clifford said:

In the fall of 1982, President Reagan said he would cut taxes by $750 billion, substantially increase defense expenditures and balance the budget in the 1984 fiscal year. Those were public promises. I made a comment that if he would accomplish that feat, he’d be a national hero. If, on the other hand, it did not work out after such a specific and encouraging promise and commitment, I thought the American people would regard him as an amiable dunce.

Clifford wasn’t the only one to disparage Reagan’s intelligence. Peggy Noonan, a former Reagan speechwriter, said that the president’s mind was a “barren terrain.” Noonan also implied that Reagan was easily influenced by his advisors. “The battle for the mind of Ronald Reagan was like the trench warfare of World War I: Never have so many fought so hard for such barren terrain,” Noonan declared.

Of course, Reagan’s friends and supporters rejected the “amiable dunce” label. The former Prime Minister of Canada, Brian Mulroney, told the press, “The Reagan supporters will tell you that Ronald Reagan never made a mistake in his life. And his denigrators would tell you he is an ‘amiable dunce,’ as I’ve said. Well, neither of course is in any way true.”

amen corner

In politics, the “amen corner” refers to the most fervent supporters of a politician or an ideology.

The term originally was used in a religious context. Inside a church, the “amen corner” referred to the section where the most devout (and vocal) worshippers sat. Over time, the phrase expanded to mean a group of people with strong, fixed political beliefs.

A politician’s “amen corner” supports him unquestioningly, in much the same way as a church’s amen corner supports the preacher.

William Safire said that amen corner was first used back in 1860, in a religious context. By 1884, the expression was used to refer to political support. It had a negative connotation right away. In 1894, for example, the Congressional Record sneered at “those saintly Republican monopolists who sit in the ‘amen corner’ of protected privilege.”

In 1990, Pat Buchanan used the term “amen corner” to criticize supporters of the first Gulf War. Buchanan, a staunch isolationist who would later run for president, said in a TV appearance, “There are only two groups that are beating the drums for war in the Middle East – the Israeli Defense Ministry and its amen corner in the United States.”

In 2009, Barack Obama gave a speech to the NAACP. He received a warm welcome; the crowd hung on his words, sometimes repeating them right back to him. The president got such a positive reaction that at one point he laughed and said, “I’ve got an amen corner back there.”

More recently, Bloomberg News revisited the term. In 2015, Israeli Prime Minister Benjamin Netanyahu visited the US and gave a speech to Congress. Bloomberg reported that a select group of Netanyahu supporters was thrilled with the speech (Pat Boone, Joe Lieberman, Newt Gingrich, and Sheldon Adelson). The article was titled “At Netanyahu’s Speech, Scenes from the Amen Corner.”

Pundits often use the term as a way to jab at politicians, implying a kind of guilt by association. In 2016, the progressive Right Wing Watch published an article titled “Donald Trump’s Amen Corner: Prosperity Preachers and Dominionists.” The article charged that Trump was supported by “preachers who tout wealth as a sign of God’s favor” and by “a leading advocate of Seven Mountains dominionism, which teaches that government and other spheres of influence…are meant to be run by the right kind of Christians.”

In Pittsburgh, there is a real club that calls itself the Amen Corner. That club – one of the most exclusive organizations in Pennsylvania – caters to politicians and lawyers. It’s often described as an “old boys’ club” and has been around since 1870. In 1965, Gerald Ford was invited to become a member. In a speech to his fellow members, Ford gushed with gratitude and said that becoming an “Amener” was a much bigger deal than being elected House minority leader earlier that year.

“There is absolutely no parallel between acceptance as a member of Amen Corner and an obscure political happening in Washington not so long ago,” Ford said.