The Washington Nationals Should Stop Letting Teddy Win – He Hated Baseball

The racing presidents

 

According to the most recent standings (kept in fastidious detail by LetTeddyWin.com), Theodore Roosevelt’s lead over George Washington is now eight.  Thomas Jefferson sits nine wins back.  Abraham Lincoln, plagued by a series of poor decisions and stumbles (literally), resides in the cellar.   

 

Confused?  

 

I am, of course, talking about Teddy, the big-headed baseball mascot, Roosevelt.  In the middle of the fourth inning at the Washington Nationals' home baseball games, presidential mascots race around the field to the cheers of the District's baseball patrons.  The tradition began in 2006.  Usually Roosevelt faces Washington, Jefferson and Lincoln in the no-holds-barred race. Occasionally there are guest competitors.  For a few years, the Nationals even introduced lesser presidents (Taft, then Coolidge, then Hoover) into the contest.  This trio of interlopers, however, has been retired to the team's spring training complex in Florida.  Now it's just the Mt. Rushmore quartet, battling it out at every home game—from the frigid first games of April, through the swampy heat of the summer, into the crisp evenings of September.

 

This presidential mascot race might have been just another stadium promotion except for the fact that from 2006 to 2012, TR never won.  Like never. He lost 525 consecutive races. Even when Jayson Werth tried to help, TR never crossed the line first.

 

Ah, what a glorious era of historical karma!    

 

All this losing made the race a thing in DC.  During the last few months of the 2012 season, pressure mounted to let Teddy win at least once.  Ken Burns, ESPN's E:60, and the late Senator John McCain all got involved. An amusing and witty seven-minute documentary outlining a “vast left wing conspiracy” meant to keep Teddy from ever winning debuted.  The Wall Street Journal put the story on the front page, with its signature hedcut portrait of Teddy the mascot.

 

While some poor saps may have actually felt sorry for Teddy, I hope that no historian in good standing with the AHA or OAH was among them.  After all, it was Theodore Roosevelt who shunned baseball first.    

 

Baseball’s Great Roosevelt Chase

 

Theodore Roosevelt romped to reelection (well, just election technically, but that's a different story) in 1904.  He won nearly 60 percent of the popular vote.  He had become the nation's first “celebrity president,” connecting with Americans in a new and powerful way.  Recognizing good press when they saw it, baseball's leaders tried to hitch their train to the popular President's steam engine.

 

They used simple juxtaposition first.  TR and baseball.  Baseball and TR.  The 1905 Spalding's Official Base Ball Guide, which featured a provocative essay on baseball's origins (Was it possible the game was not uniquely American?), trotted out the Rough Rider angle. “Wellington said that ‘the battle of Waterloo was won on the cricket fields of England,’” Spalding's explained.  “President Roosevelt is credited with a somewhat similar statement that ‘the battle of San Juan Hill was won on the base ball and foot ball fields of America.’”

 

The next year’s publication of the popular guide shifted tactics slightly.  Baseball was in fact an embodiment of Roosevelt’s “Square Deal.” “When two contesting nines enter upon a match game of Base Ball, they do so with the implied understanding that the struggle between them is to be one in which their respective degrees of skill in handling the bat and ball are alone to be brought into play.”  Roosevelt’s “Square Deal,” which had become a “new National Phrase,” was essentially the “Love of Fair Play” that had always been inherent in baseball.

 

Golden Tickets!

 

Roosevelt did not attend a single baseball game during his first term in office.  Nor in 1904 or 1905.  So, in 1906 the American League’s Ban Johnson tried a new approach to get Roosevelt out to the ballpark in the District.  “The management has issued a golden pass to President Roosevelt, who may desire to see what a real, strenuous, bold athlete looks like,” the Sporting Life reported in 1906.  “Mr. Roosevelt is the first man of the land,” the article continued, “if he sees fit, may adjourn the Senate and both houses and take the whole bunch to the game!” 

            

The golden ticket was just what it sounded like.  A ticket laced with gold that allowed the President free entry into any American League game held at the District’s ballpark.  And he could bring as many friends as he wanted.  

 

The 1906 season came and went; Roosevelt never used his golden pass.  

 

Undeterred, supporters of baseball tried again as the 1907 season dawned.  Although Roosevelt was not particularly susceptible to peer pressure, the Sporting Life and other baseball-friendly dailies mounted a campaign that portrayed Roosevelt as a politician, perhaps the only politician, out of step with overwhelming political support for the game of baseball.  

 

“Chief Justice Harlan, of the Nation’s Highest Court, Plays Base Ball and makes a Home Run in His 74th Year,” trumpeted one headline.  “Far from distracting from the dignity of the distinguished incumbent of the Supreme Court seat, the ability of Harlan as a hitter will add to it. That home run is a human touch, a specimen of Americanism that will go far toward popularizing the venerable judge.” Then, just so its readers would not miss the point, the writer posed a rhetorical, shaming question: “How Theodore Roosevelt, who instinctively seems to know how to do the thing that pleases the people, came to overlook the diamond and its opportunities is a mystery.”    

 

The pursuit was getting embarrassing for baseball.  

 

Maybe another, even more golden, ticket would do the trick.  The National Association of Professional Base Ball Leagues, which eventually became baseball's minor leagues, decided to step up the pressure on Roosevelt significantly.  Rather than just awarding the President a pass to one particular league for a given season, the NAPBBL invited the President to attend baseball games forever.

 

The pass presented to Roosevelt on May 16, 1907, at the White House transcended almost every conceivable baseball boundary.  The “President's Pass” covered thirty-six leagues and 256 cities; it gave Roosevelt “life membership in the National Association of Professional Base Ball Leagues, with the privilege of admission to all the games played by the clubs composing the association.”  The honorary pass was made of solid gold.

 

And it could do things. The ticket “doubles in two on gold hinges to fold, so that it may be carried in the vest pocket.”  The ticket had an engraved picture of the President and the date of presentation, May 16, 1907, on its front.  “The photograph of President Roosevelt is beautifully enameled on the fold.  The rim is intertwined with delicate chase work.  This remarkable card was engraved by Mr. Arthur L. Bradley… It is pronounced by all who have seen it to be a fine piece of artistic workmanship.”  

 

Roosevelt never used it. 

 

Why Teddy, Why?

 

As Roosevelt left the White House, the Washington Post finally gave up.  “With all of his love of outdoor life and sports,” the Post reported in 1909, “Mr. Roosevelt did not go to the ball grounds during his seven years in the White House.”

 

Why? 

 

“I don’t think that I should be afraid of anything except a baseball coming at me in the dark,” Theodore Roosevelt once said.  Readers with a psychological bent can dig deep here, in terms of what Roosevelt was trying to say.  But there is a simple explanation as well: Theodore Roosevelt had very poor eyesight. Without the aid of corrective spectacles until his teenage years, Roosevelt never had much of a chance as a young ballplayer.    

 

But why Roosevelt rejected baseball as an adult, as a fan, we don't really know.  Roosevelt's oldest daughter Alice once summed it up as a matter of toughness.

 

“Father and all of us regarded baseball as a mollycoddle game.  Tennis, football, lacrosse, boxing, polo, yes – they are violent, which appealed to us.  But baseball? Father wouldn't watch it, not even at Harvard.”

 

To lose out to tennis on the mollycoddle scale: that hurts.

 

And Now Teddy is Winning?!

 

The Nationals caved in 2012.  The club let TR break through, in a rather fraudulent manner, on the last day of the regular season.  Roosevelt won.  What a mistake.  

 

Now, fast forward seven years: Teddy the mascot is not only winning occasionally, he is leading the season-long tally at Nationals Park.  And the Nats, while surging after the All-Star break, are stuck in second place.

 

So what should the Nats do as they try to chase down the NL East-leading Atlanta Braves?  While Juan Soto is working hard to make Nats fans forget Bryce Harper and the team's bullpen might just be getting its act together, I'd suggest the Washington Nationals get their history back in order. Don't let TR, a noted baseball curmudgeon, win anymore.  No mas! Get right with baseball history and perhaps, just maybe, the Nationals will find themselves playing playoff baseball again this October.

 

 

For more on Teddy Roosevelt and sports, read Ryan Swanson's latest book.

Bearing Human Cargo: 400 Years Since the Arrival of the First African Slaves in the American Colonies

A plaque in Virginia commemorates the arrival of “twenty and odd” Africans in 1619

 

 

Scholars of the African diaspora estimate that between 1525 and 1866, more than 12.5 million people were stolen from the African continent and transported to the New World. Of those, only 10.7 million survived the treacherous and excruciating journey known as the Middle Passage. Spain and Portugal were the initial European powers to engage in the horrific yet profitable trade, but others soon followed suit, including Great Britain, whose colonists settled in large numbers on the North American mainland beginning in 1607. The arrival of the first Africans in Virginia in 1619 marked the beginning of slavery in England's mainland American colonies.

 

According to historian Tim Hashaw, the first Africans in the colonies came from Angola and were initially captured by Portuguese raiders in a series of skirmishes against the Kongo and Ndongo Kingdoms of West Central Africa. In the summer of 1619, Portuguese raiders marched the captives some 200 miles towards the coast, to the slave port of Luanda. Disease, thirst, and hunger killed many. The 350 remaining captives were put on board the Portuguese slave vessel San Juan Bautista, whose initial destination was Vera Cruz, on the coast of modern-day Mexico.

 

When the slaver reached the Gulf of Mexico, two British privateers, the White Lion and the Treasurer, attacked it. Hoping to find a merchant vessel fat with gold, they settled for the human cargo and took possession of approximately 60 Africans on board. The two ships, flying Dutch colors, set sail for the English colonies on the American mainland, and the White Lion arrived at Point Comfort in modern-day Hampton, Virginia, towards the end of August 1619. The majority of the men and women were immediately sold into bondage to wealthy planters and distributed across the colony. Among the buyers were Governor Sir George Yeardley and the powerful merchant Abraham Piersey. According to scholar Kwando Mbiassi Kinshasa, among the Africans were a man and a woman known as Antoney and Isabella, whose baby became the first documented African baptized in English North America - a child known as William Tucker.

 

A few days after the arrival of the White Lion, the Treasurer arrived at Point Comfort and possibly sold an additional 7 to 9 Africans. Among them was a young woman named Angela, who was purchased by Lieutenant William Pierce, a wealthy tobacco planter and burgess representing Jamestown. In an interview with the Washington Post, historian James Horn, president of the Jamestown Rediscovery Foundation, said, “That is a chilling aspect of the slave trade. People are being treated like livestock. The capability of women to have children was in slavers’ minds. To survive a journey like that, my own sense is she was young and possibly very young. Where there is no evidence, it is fair to speculate.”

 

When the ship arrived at the Jamestown colony, John Rolfe, the widower of Pocahontas, described it simply as “a ship brought nothing but 20. and odd Negroes.” A census conducted in 1625 revealed a total of 23 Africans living in Virginia, a number which increased to approximately 800 only 40 years later. The Africans who arrived in 1619 represented the first in a long succession of slaves transported to British North America. It wasn't until Abraham Lincoln's Emancipation Proclamation in 1863, 244 years later, that emancipation was officially proclaimed in the United States of America. By then, over 4 million African Americans lived in bondage.

 

Although 1619 remains a historic marker, overemphasizing the date or dismissing the long and complicated history of slavery in the Americas would be a mistake. The date may well mark the beginning of African slavery in England's mainland colonies, but Africans were present on the continent long before then. For example, in 1526, several enslaved Africans who were part of a Spanish expedition in modern-day South Carolina rebelled against their captors and helped prevent the founding of a colonial settlement in the region. According to some scholars, there is also evidence that numerous African slaves were present in the fleet of Sir Francis Drake when he arrived at Roanoke Island in 1586.

 

While 1619 marks the de facto beginning of African slavery in the British colonies in North America, it also obscures the fact that free Africans and African slaves were a vibrant and important part of the “New World” long before 1619. Historians such as John Thornton have shown that the Atlantic slave trade was a transnational endeavor that was already well established long before the arrival at Jamestown. Michael Guasco, writing in Smithsonian Magazine, noted, “As early as May 1616, blacks from the West Indies were already at work in Bermuda providing expert knowledge about the cultivation of tobacco. There is also suggestive evidence that scores of Africans plundered from the Spanish were aboard a fleet under the command of Sir Francis Drake when he arrived at Roanoke Island in 1586.” Guasco argues that highlighting the 1619 date effectively erases the memory of many more African peoples than it memorializes. Scholar Eric Hershthal argues that the date “suggests a certain timelessness to anti-black prejudice, when in fact racism developed over time, and was as much a consequence of slavery as it was a cause of it.”

 

Still, the year 1619 is significant. In 1619, the English settlers in Britain's new overseas colonies faced a test. Would they reject the European practice of African forced servitude, or incorporate it into their newfound society? They were relative newcomers to African slavery but embraced it wholeheartedly, setting off a historical chain of events that still reverberates in an American society where freedom and equality remain unattainable for many.

When Clowns Rule the World

Boris Johnson gets stuck on a zipline in 2012

 

But where are the clowns

Send in the clowns

Don’t bother, they’re here

-Stephen Sondheim, 1973

A LITTLE NIGHT MUSIC

 

“The problem with political jokes,” Groucho Marx once said, “is that they keep getting elected.” Never has that been more true than today.  We live in a world ruled by clowns.  I mean that both literally and figuratively.  Our century has ushered in the Age of the Clown Politician.

 

In Ukraine, Volodymyr Zelensky, who played a Ukrainian president on the popular television comedy SERVANT OF THE PEOPLE, was elected to be the real president with over 70 percent of the vote. Zelensky is literally a clown.  In Great Britain, Boris Johnson, who will replace Theresa May as prime minister, is a buffoon who elicits laughter – usually unintended – wherever he goes.  Mr. Johnson is figuratively a clown.

 

The founder of Italy's Five Star Movement, which is part of the governing coalition, is comedian Beppe Grillo, another clown, literally. Last year in Slovenia, satirist Marjan Sarec, a comedic actor, was elected prime minister. And in 2015, another comic actor, Jimmy Morales, was elected president of Guatemala.

 

History provides us with no shortage of clowns and buffoons who were in politics.  Some leaders may have appeared clownish yet did great damage. Adolf Hitler, the target of humorous jabs from Charlie Chaplin in THE GREAT DICTATOR (1940), would be a truly laughable figure had he not caused so much harm. Much the same might be said of Pol Pot of Cambodia, Idi Amin of Uganda, Muammar Qaddafi of Libya, Kim Jong Un of North Korea, and Mussolini of Italy.

 

We are concerned here, however, not with deranged and demented leaders such as Caligula of Rome, or with leaders who are considered to have been just plain wrong. No, we are looking for buffoonery, for cartoon characters who are in and of themselves clownish. People such as Rob Ford, who served as Mayor of Toronto, Canada (2010-2014) and provided us with laughs throughout his term of office; or Nero; or Justin II of Byzantium (who heard voices and would hide under his bed to find solace, commanded his servants to play organ music to drown out the voices, and bit servants' heads when they disobeyed). These are people who deserved to be laughed at, who earned not our respect but our mocking scorn.

 

It should also be noted that merely saying something stupid does not qualify. For example, former Australian prime minister Tony Abbott's explanation that “No one, however smart, however well-educated, however experienced, is the suppository of all wisdom” ranks among the funniest lines ever uttered by a politician, but one malapropism does not a clown make. The same goes for George W. Bush, who once said that “Our enemies are innovative and resourceful, and so are we. They never stop thinking about new ways to harm our country and our people, and neither do we.”

 

There is of course, a rich history of comedy used to skewer those in power, from the comic actors of ancient Rome to the classical Greek playwrights, to today’s late-night comedians. These comedic anthropologists dissect and ridicule power as a way to strike a blow against authority and bring the pretentious down a peg or two.

 

Most clown princes, however, are populists who bash the elites of society. These populist leaders tear down the haughty elites who –so it is believed – keep “us” down. Puncturing a hole in pretensions is a healthy thing; believing that the puncturing is also a way to rebuild society is a dangerous fiction.  The clown princes are good at tearing down, poor at building up.

 

Over the years, many world leaders have been funny – Abraham Lincoln, John Kennedy, Ronald Reagan, and Winston Churchill come to mind. Their wit, their one-liners were often used strategically to disarm critics or make themselves appear self-effacing and therefore more human and accessible.  Clowns, while they sometimes use humor strategically, are buoyed by the fact that their clown persona is the draw.  Their very existence thumbs a nose at the status quo. The clown is “one of us” against the people who keep us down.  And as Napoleon said, “In politics, absurdity is not a handicap.” In fact, it is precisely the absurdity of the clown prince that sets him (they have thus far all been men) apart.  

 

Virtually all the clown princes today are virulent nationalists and nativists: they oppose immigration, fear “the other” at our gates, are often openly racist, collect alleged resentments, blame the educated and the wealthy for their plight, and believe that they are ridiculed by those who see themselves as superior to the common folks.  The enemy is the status quo, and it is time to “get them.”  Their bite may be couched in buffoonery, but that bite can inflict great pain.

 

Humor directed at leaders can serve important functions.  In THE JOKE AND ITS RELATION TO THE UNCONSCIOUS, Freud argued that jokes can serve as a rebellion against established authority, and help us cope with power disparities in society.  Making fun of those who have power over us is a small blow against authority. 

 

But the clown princes go further. What could be more anti-elitist than to take politics to the polar opposite extreme? Elitists read books, use evidence to make arguments, rely on science, demand proof; the clown prince needs no such intellectual crutches; they rely on passion, emotion, feelings. Lashing out is their feel-good option. 

 

While otherwise quite humorless, Donald Trump seems to come alive in his mass rallies, where he is egged on by, and eggs on, his passionate base.  It is a mob that still, two and a half years after the last election, chants “Lock Her Up, Lock Her Up.”  Really?  Still?

 

America's clown-in-chief even went so low as to mock a handicapped reporter (to the great amusement of our Pied Piper's loyal followers).  Notorious for the rapier-like monikers he applies to put down his rivals, Trump, seemingly incapable of introspection or self-effacing humor, revels in these digs against his opponents.  Lyin' Ted Cruz or Crooked Hillary, Low Energy Jeb or Little Marco, Pocahontas or Leakin' James Comey – Trump dismissively labels his rivals to the delight of his base. Our clown prince tears down, but he is unable to build up.  Trump's rivals have tried to fight fire with fire, pinning Trump with their own sobriquets such as Con Don, Cheatin' Don, Traitor Trump, Agent Orange, Hair Hitler, King Leer, and Benedict Trump, but thus far, no single moniker has demonstrated staying power.

 

It is a sign of good mental health to occasionally laugh at ourselves, to see the humor in human folly, to take a joke. But Trump even went so far as to call Saturday Night Live “a Democratic spin machine” that “can't be legal,” adding that “this should be tested in court.”  Trump seems incapable of laughing at himself, as if one crack in the constructed veneer – manly, strong, a winner – and the entire Trump edifice will come tumbling down. So fragile is his ego that he can admit to no human weakness.

 

Jokes can be deadly serious, and even today’s clown princes have a point to make. But clowns best serve us while on the outside looking in. Once in office, these clown princes cease to be funny.  They have also been failures at governing. When the court jester becomes the king, when buffoons are taken seriously, when clowns rule, is the end far behind? Of course, we’ve weathered worse storms before. At least with this one, we may die laughing.

“This letter is the last that I write to you”: The Courage of Youth Resisters in World War II France

 

World War II in Europe was the cause of innumerable atrocities against civilians: for instance, the Shoah, Allied and Axis carpet bombing, the destruction of Warsaw, and the cruelties of enforced starvation. Another example is less known: the arrest, torture, and execution of many young people who harassed the Germans during their occupation of France (1940-1945). 

 

About 3000 resisters, many under the age of 25, were tried in German military courts in France, found guilty of clandestine “terrorist” activity, sentenced to death, then executed. The cut-off age for military execution was 16, and some of these boys had barely passed their 17th birthday. For the most part, the prisoners would be informed of their execution only a few hours in advance. Cruelly, under a thin veneer of generosity, the prison guards allowed the condemned to write final notes and letters to their families. The Germans forbade relatives to share them, but many did make copies; some were published in underground newspapers; others reached the BBC in London where they were read for broadcast in France. After the war, a good number of these heart-breaking missives were published.  

 

The authors of these letters reveal an assumed bravado mixed with regret and a moving sensitivity toward the loved ones who would read them. They often showed an admirable self-awareness, gave justification for their actions, and revealed how they hoped to be remembered. Recovered letters from girls and young women were rare, for these “terrorists” were sent to Germany for incarceration or execution by beheading.

 

In March 1942, 19-year-old Fernand Zalkinow wrote a final missive to his sister from Paris’s notorious La Santé prison. A few days earlier, he had been found guilty in a mock trial, and was executed along with six other young communists at Mont-Valérien, a military encampment outside of Paris where at least a thousand were shot over the four years of the Occupation. It is a long letter, as if he thought he might postpone the inevitable by keeping his pencil scratching across the cheap paper. An excerpt:

 

Since I’ve been here, I have looked deeply into myself. I have come to realize that, despite all of my faults—and I have more than a few—I wasn’t so bad as all that, and that I could’ve been a pretty good guy. … I’m a bit of a blowhard, I know. But to tell the truth, I still can’t explain why [now] I am so calm. Before my sentence, I often cried, but since, I haven’t shed a tear.  I have the sense of an inner tranquility, a deep quietness ... I have only one more test to pass, the last, and everything will be over, nothing more.

 

This letter, and dozens like it, serve to remind us of the courage of the young people who had decided, often against advice from their parents, relatives, and mentors, “to do something” (faire quelque chose) against the German occupiers and their French collaborators. Though some took up arms, most resisters at first were non-violent—printing and distributing tracts, hiding arms, protecting refugees and downed Allied pilots, passing information, and so forth-- but these actions could still carry harsh penalties.

 

Henri Fertet, age 17, wrote to his parents from a prison in Besançon. Imprisoned for three months, he learned that he was to die within a few hours. At the end of his last letter, he scribbled:

 

My letter is going to cause you great pain, but I have seen you so full of courage that I am certain you will remain strong, if only because you love me. … The soldiers are coming to get me. I’m hurrying. My writing is perhaps shaky, but that’s because I have only a bit of pencil. I’m not afraid of death, I have a tranquil conscience. … I die willingly for my country. ... Adieu; death calls me, I want neither blindfold nor to be tied. I love you all. It’s hard, still, to die. …

PS. Please excuse my spelling mistakes, no time to reread.

Sender: M. Henri Fertet, in Heaven, near God.

 

The German jailors themselves were not immune to the sight of so many French and refugee youth being shot. Frequently, the execution squads charged with shooting the boys were themselves composed of young soldiers. There were official reports of their tears and their depression following these executions. The courage of their condemned contemporaries had a marked effect on more than a few of them. We know too, from his diary, how Franz Stock, a markedly empathetic German chaplain and a devout Catholic, calmed the anxieties of those facing imminent death--communists, Jews, Protestants, and non-believers. Though tasked by the Wehrmacht to minister to the condemned, and to be present at their execution, he often passed notes from their parents to the boys and consoled families afterwards, reassuring them that their sons and brothers had died bravely, calmly. His ministry to hundreds of the condemned was vividly recalled after the war. In 1994, a square was named for him at the base of the Mont-Valérien killing field.

 

The young men—and women—who took up arms, metaphorically and literally, against their oppressors were a minority. In fact, the resistance networks and movements in general were composed of but a small percentage of French citizens, despite Charles de Gaulle's claim at the end of the war that a majority of the French people had “resisted.” Moreover, the Germans were so skillful at keeping resistance groups off guard that doubt remains as to how crucial they were in ridding France of their occupiers. Yet their persistent encounters reminded their fellow citizens that to act was to remain free. To this point, French writer Jean Paulhan has used a wistful metaphor: “You can squeeze a bee in your hand until it suffocates. But it won't die before stinging you. That's not so bad, you say, not bad at all. But if it didn't sting you, bees would've disappeared a long time ago.”

 

Today, televised scenes of youthful marchers and demonstrators in Algiers, Hong Kong, Istanbul, and in our own nation, should bring to mind their predecessors in occupied France. It is why we must revive the memory of their deeds and the price they paid. The young are often the first to challenge authority, to take to the streets, to put their lives and their careers on the line. They struggle to fashion a future that will happily replace the curdled present in which they live. Their impatience, their perceptive view of a better tomorrow, their faith in the power of solidarity, and their energetic fearlessness serve to remind us, adults, that hope is a thing, that change is possible, and that all good life begins with dreams. 

Many Historians Got It Wrong on Obama's Foreign Policy

 

Most historians tend to be liberal in their orientation and this perspective can sometimes cloud their judgment in assessing liberal presidents. 

 

For example, Woodrow Wilson and John F. Kennedy continue to receive high marks compared to their conservative counterparts, even though both presidents carried out ruinous policies. Wilson led the United States into World War I on dubious pretexts, resulting in the death of 100,000 Americans, while Kennedy escalated the American involvement in Vietnam, backed cutting tax rates for the wealthy, and expanded covert military interventions.

 

Today, while Donald Trump is reviled, many liberal historians have a soft spot for his predecessor Barack Obama, whose policies were just as bad in many areas. In a 2017 poll taken just after President Trump's inauguration, historians ranked Obama as the twelfth-best president of all time, praising him for his commanding moral authority.

 

Presidential historian Doris Kearns Goodwin and Laura Belmonte of Oklahoma State told Time Magazine that Obama deserved credit for ending combat in Iraq and Afghanistan, while Timothy Naftali of New York University said that Obama had redefined American engagement with the world. 

 

Obama, however, initiated a major troop surge in Afghanistan and redeployed troops to Iraq while starting a new war in Yemen using Saudi Arabia and the United Arab Emirates as proxies. His defense budgets outstripped those of Bush by an average of $18.7 billion per year and his base budgets exceeded those of Bush during his two terms by $816.7 billion.

 

In a volume edited by Julian Zelizer of Princeton University, University of Texas historian Jeremi Suri praises Obama for offering a liberal internationalist vision – emphasizing multilateralism, negotiation and disarmament – after eight years of neoconservative militarism under President George W. Bush.  Suri writes that “Obama's vision was progressive and pragmatic, focused on American leadership through democratic alliances and common law that would underpin legitimate force. He worked vigorously to build alliances and negotiations. He sought to tame war with law and where possible end American military conflicts. As a good lawyer, he sought to nurture careful procedures for assessing targets, collateral damage and regional reverberations.”

 

Such claims are undercut by the fact that Obama sowed regional instability by invading and bombing seven Muslim countries and ordering the assassination of terrorist suspects without due process. A CNN report found that less than two percent of drone fatalities were high-value insurgent leaders, while the Bureau of Investigative Reporting estimated that at least 1,100 civilians were killed.

 

My book, Obama's Unending Wars: Fronting for the Foreign Policy of the Permanent Warfare State (Atlanta: Clarity Press, 2019), provides the first comprehensive critical history of the Obama Administration focused specifically on its foreign policies. I argue that Obama was one of the most effective front-men for war in American history. He successfully curtailed the popular dissent that had existed under Bush and helped to institutionalize a permanent warfare state and a mass surveillance state.

 

The president to whom I compare Obama is Woodrow Wilson, whose reverential treatment by historians is epitomized by a recent biography entitled The Moralist.

 

Both Wilson and Obama came from academic backgrounds and were brilliant orators skilled in “camouflaging their settled opinion,” as Senator Richard Pettigrew (R-SD), a populist and a contemporary of Wilson, wrote in his memoirs. Both were capable of winning over liberals while functioning essentially as Tories. Pettigrew accurately predicted before Wilson's election that he would “undertake some reforms [and] rail about the bosses [and] talk about purity but he is absolutely owned by the great moneyed interests of the country who paid the expenses of his campaign.” So too, of course, was Obama.

 

Wilson's political career was bankrolled by his Princeton classmate Cleveland H. Dodge, scion of the Dodge copper and munitions fortune and a director of the National City Bank and Winchester Arms Company; Obama received major financial support from the Pritzker banking dynasty and the Crown family, which owned the largest percentage of shares in General Dynamics, one of the leading arms manufacturers in the world.

 

Both Wilson and Obama were especially gifted in framing military interventions as moral crusades. Wilson packaged intervention in the Great War as a matter of “vindicating principles of freedom and justice in the life of the world as against selfish and autocratic power [Germany].” Obama invoked the concept of “humanitarian intervention” and the Responsibility to Protect (R2P) doctrine to justify military aggression against Libya, which resulted in Africa's wealthiest country devolving into a failed state.

 

Rep. Ilhan Omar (D-MN) said it best when she told Politico: “We can't be only upset with Trump. … His policies are bad, but many of the people who came before him also had really bad policies. They just were more polished than he was… We don't want anybody to get away with murder because they are polished. We want to recognize the actual policies that are behind the pretty face and the smile.”

 

Many historians appear to have been so mesmerized by Obama's charisma and progressive domestic policies that they cannot properly criticize his foreign policy. It's worth remembering that the malaise of American political life far transcends Donald Trump.

Walkability Is at the Heart of Human Societies

 

 

Until the 1920s and 1930s, public roads were for everyone. Cars had no greater claim to the space than cyclists, horses, wagons, or pedestrians. The fact that we no longer see walking or walkability as integral to human life is a huge shift in human history. The most recent anthropological research into early human life has shown the importance of walking in the formation and strength of human communities. By designing walking out of our lives, humanity loses something essential.

 

Our bipedal walking, unique among mammals, took millions of years to evolve and refine, yet nobody from evolutionary biologists to paleoanthropologists knows exactly why we do it. It’s unsteady and unsafe, yet somehow it lies at the core of what it means to be human. “There’s nothing necessary about walking on two legs,” wrote anatomist and anthropologist Alice Roberts in her book The Incredible Unlikeliness of Being. “Not many animals do it. It’s a stupid thing to do; you’re much more stable on four.” From a biological and physiological perspective, this statement is true, but it seems clear from anthropological evidence that human societies developed in part through the unique vulnerability bipedal walking encourages. Walking on two legs allowed human beings to use tools with their hands and run across the savannah, but it also made them physically unstable and very possibly contributed to the formation of tribes through trust and cooperation—because surviving alone was impossible.

 

“Community” is often presented as an abstract concept or a handy political slogan like “Main Street businesses” or “real American,” but true community rich in social capital and interpersonal interactions has always been essential to human survival and the evolution of human society. John Cacioppo, the director of the University of Chicago’s Center for Cognitive and Social Neuroscience until his death in 2018, studied loneliness for over twenty years before publishing his book Loneliness. In a 2016 interview he said that “Loneliness is like an iceberg, we are conscious of the surface but there is a great deal more that is phylogenetically so deep that we cannot see it.” The pain of loneliness, or “perceived isolation,” is profoundly linked to the evolution of our species and the formation of our societies.

 

Cacioppo describes loneliness as an evolved pain, one that developed over tens of thousands of years of evolution to force humans to give time and attention to their social connections. It’s not being alone that harms us; it’s feeling lonely. The deeper that neuroscience research delves into this topic, the more evidence researchers find that human beings require regular social interaction. Chronic loneliness isn’t just a state of mind; it’s an illness that leads to identifiable and dire physical consequences including compromised immunity, increased cortisol levels (an indicator of stress), depression, anxiety, and heart disease. Social isolation, wrote Cacioppo in Loneliness, is comparable to high blood pressure, lack of exercise, obesity, or smoking in its effects on human health.

 

The loneliness epidemic reinforces the need for walkability in our communities. Donald Appleyard, a professor of urban design at the University of California, Berkeley, ran the seminal study on walkability and social alienation in the 1960s. Dr. Appleyard found that people who lived in places with fewer spaces to encounter neighbors and strangers had far fewer friends and acquaintances than people who lived in high-traffic areas. More recent research from the U.K. as well as the U.S. has shown that neighborhoods with low walkability have far less social capital than those that are designed with walkability in mind, meaning, in essence, that communities without walkability lack a sense of neighborliness and have higher levels of isolation.

 

In Los Angeles in 2016, actor Chuck McCarthy launched a “people-walking” business whose success bore out the craving for the human interaction that walkability can bring. He initially charged $7 per mile to walk with people along streets or through parks and the response was so positive that by 2019 he had a full-time business with employees and had developed an app. People feel isolated not necessarily because they don’t have friends, said McCarthy, but due to “fluid schedules in the gig economy,” meaning it’s difficult to coordinate spending time with friends and family. “We’re on phones and computers constantly communicating,” McCarthy said, “but we’re not connecting as much. We need that human interaction.”

 

American society has for decades promoted the idea of individual strength as the key to success—Laura Ingalls Wilder’s Little House on the Prairie books, for example, depict her pioneer family fending for themselves, surviving and carving out a completely self-sufficient life. At least, that’s the mythology. But the truth is that the Ingalls family depended completely on the support of the various communities they lived in for everything from financial loans to medical care to food to fend off starvation in winter. 

 

The myth of self-sufficiency and independence, however, remains powerful in American culture, and contributed to the eventual trend of single families living in communities that required transportation to get them from one place to another without having to interact with other people. In other words, far-flung suburbs, car ownership, and heavy investment in highways to serve both.

 

We severed ourselves from one another, thinking it would make us free. Instead, the outcome has been a gnawing, widespread loneliness that manifests itself as depression, anger, existential angst, and any number of personality quirks and mental illnesses. Loneliness has become an epidemic, a combination of isolation and lack of exercise depriving our psyches of purpose, stimulation, and the companionship of others. 

 

We’ve forgotten about shared space, about public roads, and the fact that tens of thousands of years of human history demonstrates a profound need for the daily, in-person interactions walkability provides. It is this forgetfulness that allows us to roll across crosswalks and through intersections with our eyes and attention on the smartphone. Forgetfulness also makes us think we don’t need walking. Critics might claim that restructuring towns and cities to be walkable is almost un-American, but in truth it’s the most practical and accessible way to strengthen our relationships, our communities, and ourselves.

The Civil War and the Black West

 

In 1861, as Confederate armies prepared to crush the Union, President Lincoln commanded only 13,000 men and officers. At this moment ragged immigrant invaders seeking asylum reached the southern border of Kansas.

 

Surrounded in Oklahoma on three sides by slaveholding states, they found themselves defenseless when the President withdrew Federal troops from the Indian Territory for service back east. Confederate diplomat Albert Pike and his armies arrived to take over. People of color and some poor white Oklahomans said they wanted no more of white men’s wars or slavery. Voting with their feet for neutrality and emancipation, a ragged band of 10,000 revolutionaries picked up and fled.   

 

Strongly supporting this revolution from within most Indigenous Nations was the Keetowah or PINS, a secret anti-slavery network of enslaved and free Africans and Indigenous Americans.

 

Opothla Yahola, a wealthy Creek chief who favored neutrality, agreed to open his 2,000-acre North Fork stock-raising and grain plantation to the refugees. Yahola had fought slaveholder armies in two Seminole wars and was a voice for his people's traditional values. As the exodus settled at North Fork, he freed his slaves. Soon one in seven residents of Oklahoma lived on his plantation. It began to look like a refugee camp and sound like a community meeting.

 

Confederate Colonel Douglas Cooper, an Indigenous leader forced to sign a Confederate treaty, was ordered to march 1,400 troops on North Fork. This triggered another exodus by Yahola's largely unarmed neutrals. They planned to circle northern Oklahoma recruiting more neutrals – but then Cooper's armies attacked.

 

At Round Mountain on November 19, 1861, neutrals using bows and arrows and a few rifles drove Cooper's army back to Fort Gibson. On December 7th, while their white officers were away, conscripted Indian Confederates visited Yahola's camp. What began with warm greetings ended the next morning with most of the Indigenous Confederate recruits leaving for home and some joining the marchers.

 

Winter's savage wind and snow, among the worst in years, began to sweep through Oklahoma. At Chustenahlah the day after Christmas, Confederate troops opened fire on the marchers. Hundreds were slain. Cattle, ponies and sheep died, and survivors fled, leaving their wounded, their food and supplies, and their dead in the snow. One Seminole chief recalled: “At that battle we lost everything we possessed: everything to take care of our women and children with, and all that we had.”

 

The neutrals now determined to reach Union lines in Kansas. They walked into a “march of blood on ice.” In early 1862 about 7,600 ragged, wounded and traumatized men, women and children straggled into southern Kansas. It had few relief facilities to offer, only cold, hard ground. Yahola had lost his daughter in the march, and he died soon after in a small tent.

 

Devastating attacks and staggering losses changed many survivors. Young male neutrals now wanted to fight planters and free slaves. Their commitment to anti-slavery came months before Lincoln thought of Emancipation, and long before emancipation opened recruiting in the deep South. News of this new liberation army drew runaways from Arkansas, Missouri, Kentucky and Tennessee to Kansas and enlistment. By 1865 the state's Black population had swelled to 12,527.

 

ABOLITIONISTS MEET VOLUNTEERS OF COLOR

 

The anti-slavery enlistees in Kansas luckily stumbled on a group of abolitionist officers who embraced their ideals and were eager to train them. Some officers had arrived with, or ridden alongside, John Brown in “Bleeding Kansas” in the 1850s. Now they had the chance to lead the multicultural Army Brown always wanted.

 

Another of history's wild straws appeared when President Lincoln asked General James Lane to command his Kansas recruiting. Lane had been a committed racist who believed “Africans were a connecting link between human and orangutan.” As an Iowa Senator and a general during the Mexican War, he helped the U.S. seize Mexican land from the Mississippi to the Pacific for U.S. slaveholders.

 

A man of recurring changes, in 1856 Lane rode into “Bleeding Kansas” to serve John Brown’s raiders. In January 1861 Lane was made president of the Topeka Free Soil Convention that wrote its anti-slavery Constitution. Kansas appointed Lane one of its first two U.S. Senators.

 

As President-elect Lincoln prepared for his inauguration amid rumors of assassination plots, Lane met him in Washington. Lane agreed to recruit 116 armed Kansans to guard the White House. For twenty days his men camped in the East Room and Lane slept outside the President's bedroom. Lane, anchored only by his ambitions, was Lincoln's choice for Commander of Kansas recruiting.

     

 

  “ . . . we need the services of such a man out there [in Kansas] at once --that we better appoint him a brigadier-general of volunteers today, and send him off with such authority to raise a force . . . [and] get him into actual work quickest.”

           President Abraham Lincoln, June 1861

 

  

Fierce, grim-faced, governed by a ruthless fighting code, and given to explosive announcements, Lane was both the President's Kansas Commander and a powerful military figure in the Trans-Mississippi West. He soon won the support of his officers and men, became a popular hero, and was a man women feared.

 

In the summer of 1861 Lane led the 1st Regiment Kansas Volunteer Cavalry, the first ex-slave soldiers of the war. With thirty men he invaded Missouri and defeated 130 mounted Confederate guerrillas. He then called his men “the finest specimens of manhood I have ever gazed upon.”

 

Lane authorized General Blunt, John Brown's highest-ranking officer, to recruit African Americans. By fall, Blunt and his officers led troops of color in liberating Missouri slaves. Lane offered certificates of freedom and cash to recruit runaways for his army --- while also threatening those reluctant to volunteer.

 

In July 1862 General Blunt was authorized by Lane to recruit any men in Kansas “willing to fight.” Under him, abolitionists Colonel William Phillips, one of Brown's two brigade commanders, and Captain Richard J. Hinton, Brown's admiring biographer, also led raids into slave territory. The President's Kansas operation was now directly defying the President's War Department policies. Meanwhile their soldiers of color rolled up one victory after another.

 

The invasions of Missouri continued to fill the ranks of Kansas soldiers of color, as did runaways fleeing slave states. The appearance of armed Black men stirred escape plots and disrupted plantations months before President Lincoln embraced Emancipation.

 

Between 1860 and 1863 Missouri's enslaved population fell from 114,000 to 78,000. Runaways increased in Iowa and Illinois as well as Kansas. Masters seeking to avoid advancing Black troops began to force-march enslaved families to Arkansas and Texas.

 

NEW UNITS AND BATTLES

     

In the “Indian Home Guards” and “Indian Brigades,” Lane found that the Black Indian members served as interpreters between their Indigenous brothers and the white officers.

 

On July 17, 1863, Blunt's multicultural army of 3,000 fought the largest battle of the Indian Territory at Honey Springs. White Confederate soldiers expected that the appearance of armed whites would frighten Blacks into surrendering, so they arrived with 300-400 pairs of handcuffs.

 

Instead, the Confederates found the revolutionaries routed their cavalrymen, guerrilla units and 6,000 Missouri State Guards. The New York Times described the “desperate bravery” of the outnumbered African American force. A white officer reported, “they fought like tigers.” Six Killer, a Black Cherokee, shot two Confederates, bayoneted another and clubbed a fourth with his rifle butt. Sgt. Edward Lowry, ordered to surrender by three mounted Confederates, knocked all three from their horses.

 

From trans-Mississippi battlefields Union Generals and northern journalists continued to report the “extraordinary courage under fire” of these units of color. Emancipation still lay in the future.

 

Clearly soldiers of color were fighting for something more fiercely personal than saving the Union. So were Lane and his white officers. They repeatedly petitioned the War Department to pay their men the same $13 a month granted white recruits. They also urged that their men be commissioned as officers. Col. James Montgomery's “tri-colored brigade” -- Indians, whites and ex-slaves -- fought for nine months without any wages as a protest. But wage equality had to wait until after the Emancipation Proclamation and slave recruitment.

 

On Emancipation Day, January 1, 1863, Lane's men and officers held their own liberation celebration. Lincoln's policy now marched in step with their fight against the rebels. In a richly symbolic ceremony, Lane's multicultural Army honored their victories and the people they liberated, and saluted their hero, John Brown. Wives and children cheered their courage and rescue of enslaved families. Blunt's men sang the John Brown song, and added a final line -- “John Brown sowed, and the harvesters are we.” Men and officers, Blunt wrote, then shared a barbecue and “strong drink.”

 

THE WAR CHANGES COURSE

 

President Lincoln initially called for 75,000 volunteers to serve for three months. Neither he nor 31 million other Americans expected a four-year carnage that would take 750,000 lives and leave no family in the North or the South, white or of color, untouched.

 

Lincoln's early policies had drawn fire from Frederick Douglass and fellow abolitionists. They claimed he invited defeat and encouraged treachery. Douglass called for a “people's revolution,” one that would arm the enslaved and overthrow the planters' system. The Kansas volunteers of color and their white officers were the first to recruit, arm, and train men who would turn that around on the battlefield.

 

Lincoln's mass recruitment of former slaves and free men of color was pivotal. Both sides were running out of reserves. Enslaved families had brought in the food for southern citizens and soldiers and picked the cotton to sell abroad. But now many were runaways. Confederate desertions rose and their cities faced food riots.

 

Emancipation also brought Frederick Douglass and abolitionists back as Lincoln's recruiters. 180,000 Black men served in 150 regiments before the war ended. Some served as spies. Men and women labored in Union Army camps. With less training, fewer medical officers and facilities, recruits of color left 37,000 comrades dead on the battlefield.

 

The new Union manpower tipped the balance. African American soldiers liberated Petersburg, Wilmington, Charleston, the heart of the Confederacy, and finally Richmond, its capital. Black Union cavalrymen escorted President Lincoln through the streets of Richmond, filled with cheering, liberated people.

 

In August 1864 Lincoln admitted that without his Black troops “we would be compelled to abandon the war in three weeks.” He had first doubted Black men would dare fight against their former masters. Confederate General Howell Cobb now admitted, “If slaves make good soldiers our whole theory of slavery is wrong.” In November 1864 the battlefield victories of African American troops gave President Lincoln a second term in the White House. 

                   

BLACK OFFICERS 

         

In 1862 Lane appointed African American William D. Matthews an artillery officer. Matthews's daring began in Maryland, where he ran a station of the Underground Railroad. In Kansas he served with John Brown and ran the busy “Waverly House” station. In January 1863 Captain Matthews was commissioned the Union's first official Black officer.

 

In July 1864 Commander Lane elevated Patrick Minor and H. Ford Douglass, two courageous African American soldiers, to artillery officers. With Matthews they became the Union's highest-ranking Black field officers.

 

These three men used their new authority to investigate the treatment of soldiers of color. Commander Lane, they discovered, had threatened men, or ordered them beaten and starved, to force enlistments. On August 3, 1862, Lane threatened a thousand Black men into joining the First Kansas Colored Infantry Regiment, shouting, “If you won't fight, we will make you.” His Black officers halted this illegal practice.

 

Next the three found that Lane had Black soldiers too traumatized to fight or fire guns arrested for “shirking official duty.” White soldiers suffering similar problems were discharged to their families. Matthews, Minor and Douglass eventually won Black men the right to return to their families.

 

THE END OF THE WAR

 

In 1865 Commander Lane, a Kansas hero, decided to rejoin the racist opposition. The Kansas Legislature still chose Lane as their Senator. The General had one more shot to fire – he took his own life. Kansans named Lane University in his honor.

 

Matthews, Minor and Douglass settled into Kansas's African American society and organized the first Kansas Colored Convention. The Convention promoted equal education and citizenship. Kansans later erected monuments honoring John Brown and the state's soldiers of color.

The Trans-Mississippi revolution in Oklahoma became a liberation army in Kansas. This kind of American army did not reappear until the Korean War in the 1950s.

The Ever-Present Billy the Kid: Protagonist of the Old West

 

More than 1,000 books and essays have been published about Billy the Kid. He competes with Gen. George Custer and Jesse James as the most-written-about character of the Old West.

 

Why the Kid's enduring popularity? What has so fascinated readers and viewers with this desperado of the Southwest to keep him endlessly alive in biographies and histories, novels, and movies? Why should a wealthy businessman pay more than $2 million for the only authenticated, 2-by-2 inch photograph of Henry McCarty, Billy Bonney, or, as better known, Billy the Kid?

 

Several reasons repeatedly surface as explanations of the Kid's continuing popularity. Billy's youthfulness and his never-ending adventures still stir readers and viewers. His portrayal as an energetic outlaw in the best-selling biographies by Englishman Frederick Nolan and Robert Utley, the fiction of Ron Hansen, and the Pat Garrett and Billy the Kid and Young Guns movies keeps audiences enthralled.

 

Those who tout Billy also hold him up as a champion of the underdog. Much of New Mexico in the 1870s was in the hands of the political-economic overlords known as the Santa Fe Ring. Lawyer, politician, and businessman Thomas Catron, via off-and-on underhanded agreements with regional entrepreneurs and legal officials, controlled much of the territory's politics.  Working closely with The House  (a group of men, led by L. G. Murphy and Jimmy Dolan, that opposed those who backed Billy the Kid and his supporters), Catron and the Ring determined much of New Mexico's Lincoln County history in the late 1870s.

 

Billy's advocates frequently portray him as the most important opponent of these Ring and House interests in Lincoln County after his arrival in southeastern New Mexico in the early fall of 1877. Such writers as the prolific Gale Cooper describe Billy as a  modern-day Robin Hood, opposing the rich and powerful, supporting and leading the unfortunate and underlings.  Governors, territorial legislatures, county politicians, business moguls, and even military leaders at nearby forts--Billy opposed all these, as he continued to be an agent of justice and equality.

 

Billy is also depicted as a close friend of women. Spending his formative years with his loving, supportive mother Catherine, without a father, and overlooked by his stepfather William Antrim, Billy naturally gravitated toward warm relationships with adult, mothering women.  In Silver City, Clara Louisa Truesdell (the mother of Billy's buddy Chauncey Truesdell), Sara Knight (the older sister of another Kid friend Anthony Conner, Jr.), and teacher Mary P. Richards looked after Billy, especially after his mother died from tuberculosis in 1874.  Later in Fort Sumner, Billy was close with Luz Maxwell, matriarch of the influential Maxwell family.  The Kid's attractions to younger women such as Luz's daughter Paulita, Celsa Gutiérrez (a married woman), and Abrana Garcia (a common-law wife?) revealed his romantic interests. Conversely, Billy was unable to win over the strong support of Sue McSween, the most important woman in the town of Lincoln.

 

Several Hispanics forged friendships with and became supporters of the Kid.  In Silver City, where school-age Hispanics outnumbered Anglos nearly 7 to 1, Billy learned fluent Spanish. That fluency helped him to connect with Hispanics in the communities of Silver City, Lincoln, and Fort Sumner. Several Hispanics rode with Billy in their opposition to The House interests.  These included Yginio Salazar, Martin Chávez, and José Chávez y Chávez. Although not as close to the Kid as others, Juan Patrón, the Spanish-speaking leader in Lincoln County, sided with McSween, rancher John Tunstall, and the Kid coteries against Murphy and Dolan.  Once Billy broke out of the Lincoln jail in spring 1881, Hispanic herders and townspeople hid him away, protecting him from Pat Garrett and other pursuers.

 

Still others salute Billy for his comradeship and leadership. He rode with the pack, first as a follower in 1877-1878 and then as a chief from 1879 to 1881. As George Coe, Billy's early acquaintance and chum in Lincoln County noted, everywhere Billy went he "was the center of interest." "Because of his humorous and pleasing personality," Coe added, the Kid "grew to be a community favorite."  George's cousin Frank Coe described the Kid as "naturally full of fun and jollity," as "gentlemanly as a college-bred youth." Even as a follower, Billy spoke out forcefully.  For example, he urged his companions to chase down and eradicate those who murdered Tunstall in February 1878. And during the horrendous burning of the McSween house on the Big Kill day of 19 July 1878, Billy took over the commander's role from the faltering Alex McSween and engineered an escape that saved his and others' lives. Soon thereafter, Billy took over direction of the Regulators, the Lincoln County riders who violently confronted The House and Santa Fe Ring dictators, and led that counter group for the next two to three years.

 

Even more important than Billy's youthful vibrancy and adventuresome spirit, his warm connections with women and Hispanics, and his comradeship and leadership in sustaining his ongoing popularity is another ingredient. Those who love the Kid have enormous difficulty embracing his darker sides. They want to whitewash his character.  But while still in his mid-teens, Billy turned to thievery, which expanded into his major occupation in the closing years of his life. Worse, the Kid was a murderer, killing at least four men on his own and taking part in group killings of as many as a half-dozen others. Nor was Billy a moral giant.  The best evidence suggests he became intimate with married women. Finally, when opposed, pushed into a corner, or bested in competitions, Billy frequently resorted, on the spur of the moment, to his guns and other violence. If these desperado sides of Billy's character are forgotten or erased, he remains an upright and admirable hero for his aficionados.

 

Interest in the Kid remains uniformly high, but the predominant images of him in the past half century in histories, biographies, novels, and films have notably shifted. If many of the first pictures of the Kid portrayed him as a villain and if later depictions were more upbeat and even heroic, beginning in the 1970s and 1980s, a new, more divided Billy appeared on the scene. Neither entirely a desperado nor a hero, he was part of both. He was becoming a Grey Hat of the American West, a Dr. Jekyll and Mr. Hyde embodying competing repugnant and admirable sides. It was this new, complex Billy who rode through the pages of Frederick Nolan and Robert Utley's recent histories and biographies, the fiction of Ron Hansen, and the Young Guns movies.

 

Roughly a generation into the twenty-first century, Billy the Kid still rides--at or near the front of all the popular Old West demigods.  But he now wears a new, more shadowy face, a blend of rascality and the honorable. Thankfully, this recent bifurcated Billy gallops closer to the truth.

 

© Richard W. Etulain 2019

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172772 https://historynewsnetwork.org/article/172772 0
Book Review: Jeremy Black's Imperial Legacies

 

Throngs of people lined the streets of London.  For hours, joyous crowds joined the revelry to mark the sixtieth year of the reign of Queen Victoria and celebrate the accomplishments of the British Empire – the largest empire in world history. After attending an event with Austrian Archduke Franz Ferdinand at Buckingham Palace the previous evening, the diminutive Queen began a six-mile journey in an ornate open carriage – drawn by eight off-white horses – to St. Paul’s Cathedral on Tuesday, 22 June 1897 for her Diamond Jubilee. Amid countless hanging Union Jacks, deafening waves of applause and outbursts of “God Save The Queen,” British citizens paused to express pride in their monarchy and expansive Empire as a vehicle of progress for peoples across the world.  

 

As the British Empire declined from the end of World War II (1945) through the latter half of the twentieth century, a near-universal consensus among international scholars rendered a far different verdict on it.  Rather than an engine of enlightenment through the promotion of “Christianity, commerce and civilization,” an army of new academics denounced British imperialism as a progenitor of racism, economic exploitation, cultural coercion and violence.  By the end of the twentieth century, few politicians defended British colonialism. When Oxford-trained, conservative historian Niall Ferguson published Empire: The Rise and Demise of the British World Order and Lessons for Global Power (2002), most of the tenured intelligentsia panned his attempt to quasi-rehabilitate “the empire on which the sun never sets” as a flawed yet progressive force in the making of the modern world.  

 

For more than a decade and a half, Ferguson's defense of British exceptionalism stood virtually alone until the recent publication of Imperial Legacies: The British Empire around the World (2019) by Jeremy Black. Although Black is an esteemed professor at the University of Exeter with dozens of influential publications on world, military and European history, his new survey of the long era of British hegemony only partly succeeds in redefining the British Empire as a relatively liberal and humane actor on a world stage replete with despotic states and autocratic empires.  While Black renders judgments on the contours of British imperialism on four continents, the crux of his revisionist semi-apologia targets the colonial histories of India and China. 

 

The British and India

In the process of furnishing a broad, composite sketch of a world strewn with competing and emerging empires in the eighteenth and nineteenth centuries and delving into the complex relations between Britain and the Indian subcontinent in the fourth chapter, Black challenges sweeping stereotypes of British plunder and conquest. Hence, the author begins by noting that the British East India Company, which maintained an army to protect its surging commercial interests in the region, actually recruited Dalits or “untouchables” from the lowest Indian caste to wage an armed struggle against the formidable Maratha Empire (1674-1818).  At the Battle of Koregaon near the Bhima River (south of present-day Mumbai) on 1 January 1818, the combined British East India Company-Dalit forces crushed the army of Peshwa (leader) Baji Rao II and hastened the collapse of Maratha rule. Far from subjugation, Black contends that the victory at Koregaon constituted a substantive opening salvo against the oppressive caste system and symbolized an oft-forgotten and underappreciated dynamic of British imperialism – the willing cooperation of indigenous peoples eager for liberation, trade and/or protection from other empires.  In further crediting the British administration for its valiant attempts to eradicate two ultra-patriarchal traditions, female infanticide and Sati (an ancient Hindu-Sikh custom whereby widows immolated themselves upon the funeral pyre of their deceased husbands as a final expression of love and grief) and ushering in lengthy periods of peace in parts of India, the author bolsters his portrayal of the British as a largely civilizing influence. Although Britain developed commerce, transferred medical and transportation technology to the subcontinent and combatted socially-destructive superstitions, Black vastly understates the baleful underside of British rule.

 

In fact, that underside appears in the foundation of the East India Company and its evolution from a trading presence of fewer than three hundred representatives to effective control of India by the mid-nineteenth century.  As commerce spiked through exports from Bombay (Mumbai), Calcutta (Kolkata) and Madras (Chennai), the East India Company and its army formed alliances with princes.  Six years after repelling an attack mounted by a Mughal viceroy (nawab) and allied soldiers from the French East India Company at Arcot (1751), Major General Robert Clive established British supremacy in Bengal by gaining the support of Hindu elites with shrewd diplomacy, achieving an armed victory at the Battle of Plassey (1757) and replacing the disesteemed nawab with a pliant, pro-Company governor.  Thereafter, the reorientation of the agricultural economy in Bengal to suit British ideology and interests plunged a significant percentage of Indian farmers into poverty, despair and death.  In some cases, the pursuit of well-intentioned, paternalistic policies disrupted socio-economic mores maintained by indigenous populations and resulted in deleterious outcomes.  Due to their unshakeable faith in largely unregulated free markets (laissez-faire) and the “laws” of supply and demand to produce prosperity, colonial officials, who adhered to the non-interventionist economic dogma of Adam Smith and his seminal work The Wealth of Nations (1776), initially refused to set up direct relief programs during the Agra Famine of 1837-38 – a catastrophe that claimed 800,000 lives.  Subsequently, the British directed additional time, effort and resources to preventing and ameliorating the effects of drought and disease in India.

 

Somewhat inexplicably, Black makes only a passing reference to the seismic event that defined British-Indian relations in the nineteenth century.  On 10 May 1857, years of simmering discontent over the erosion of social and economic sovereignty to the East India Company exploded into a subcontinent-wide revolt. Upon discovering that the paper ammunition cartridges for their Enfield P-53 rifles, which required users to open the packets of gunpowder with their teeth, contained grease made of pork and beef (sacrilegious to Muslims and Hindus respectively), many of the 300,000 sepoys (Indian soldiers serving in the East India Company army) turned on the regime.  News of the inadvertent British slur on the two major religions of India spread quickly and triggered scores of localized and regional rebellions. As the emerging nation of India divided between pro- and anti-British loyalties, a gruesome, atrocity-laden war raged until November 1858. Although victorious in thwarting nascent nationalist aspirations, the monarchy and the British government took direct control over India in order to reorganize British-Indian affairs on more equitable lines.  Strikingly, Black fails to appreciate the underlying cause of the Indian Rebellion of 1857 – the semi-tyrannical rule maintained by “the terror of [British] arms” – a strategy initially articulated by none other than Robert Clive. 

 

Under the Raj (1858-1947), Indians remained marginalized on their own soil.  In 1883, a legislative bill proposed to place British citizens under the jurisdiction of Indian courts.  Convinced of a racial hierarchy and the superiority of Anglo-Saxons over other races, outraged Britons vociferously protested and succeeded in gutting the act.  At the same time, the administration further consolidated control over the economy by seizing the salt trade and barring its subjects from storing or marketing the near-universally consumed commodity.  After decades of fear and passivity, tens of thousands of Indians – stirred by Mohandas Gandhi and his Satyagraha (non-violent civil disobedience) campaign – challenged British control.  In March-April 1930, Gandhi and his dedicated followers trekked more than two hundred miles to the Arabian Sea to declare its salt the property of India.  Despite arresting approximately 60,000 dissenters (including Gandhi) in retaliation for their defiant passive resistance, the British struggled to restore order.  Indeed, a majority of Indians from the fin de siècle to the eve of Indian independence (1947) shared Gandhi’s view of British imperialism in 1905-06:

And why do I regard the British rule as a curse?  It has impoverished the dumb millions by a system of progressive exploitation…It has reduced us politically to serfdom.  It has sapped the foundations of our culture…[and] it has degraded us spiritually.

 

While the British deserve credit for opening up segments of the Indian economy, introducing the concept of parliamentary democracy, combatting extreme forms of patriarchy and delivering humanitarian aid, the East India Company and the Raj must also be charged with harming Indian society through policies and laws designed to circumscribe individual rights through socio-economic exclusion. 

 

Britain & China: A Duality of Imperial Histories

In Chapter 5, Black provocatively re-interprets the historical development of China through a lens overly favorable to the British.  From the mid-nineteenth century to the triumph of Mao Zedong and totalitarian communism in 1949, the discursive underpinnings of Chinese nationalism relied on vitriolically denouncing the British Empire for waging two wars against Beijing (1839-1842, 1856-1860) to preserve the lucrative opium trade irrespective of its prohibition by the state.  To buttress the British case, the author declares that opium use not only failed to roil the sensibilities of many (if not most) peoples of East and South Asia but that its consumption garnered extensive support.  While on point in this assertion, Black distorts the social and political frame by neglecting key (and utterly imperative) details.  On the eve of the First Opium War (1839-1842), for example, nearly thirty percent of Chinese males had become addicted to the drug.  The consequent socio-health crisis in China, which led Beijing to issue decrees curtailing its use, began to alter perceptions of the narcotic, and restrictions on the trade proved increasingly popular among the Chinese.  In Britain, public opinion also bifurcated.  A substantial coalition in Parliament, including future Prime Minister William Gladstone, railed against the prospect of dispatching the Royal Navy to “open” China to free trade and protect the commercial status of opium.  Yet, the government deftly maneuvered to narrowly overcome spirited opposition in the House of Commons and prosecuted the war until the surrender of Beijing on 29 August 1842.  When delegates from the Qing Dynasty (1644-1912) arrived to sign the Treaty of Nanking under threat of British bombardment, a “Century of Humiliation” had commenced.  From the First Opium War until World War II, Imperial China would endure repeated invasions and ignominious defeats by Britain, France and Japan. 

 

If Black erroneously dilutes the hubristic motives and the sordid impact of British imperialism on China during the Opium Wars, his riposte to the narrative ascribing the mid-nineteenth-century decline of the Qing Dynasty to Western intervention aligns with the larger geopolitical realities of the period.  In 1850, the corrupt Manchu elite faced a formidable rebellion throughout southern China from discontented, impoverished farmers and their fanatical leader Hong Xiuquan – a syncretic Christian zealot with a warped messiah-complex and a plan to re-distribute land on an egalitarian basis.  To achieve their myopic utopia, however, the Taipings (followers of Hong Xiuquan) turned to forced conscription and committed countless atrocities against both non-partisan and resistant peasants.  To quash the revolt, Beijing welcomed the arrival of British Major General Charles Gordon to command the Manchu-allied Ever Victorious Army in a series of pivotal campaigns in the latter stages of the civil war.  Only months after battling Imperial Britain in the Second Opium War (1856-1860), the Qing Dynasty quickly reversed course and allied with its Western adversary to maintain power – a decision rooted in realpolitik. 

 

For Gordon's professionalism and vital role in ending the fourteen-year war that claimed an estimated 20-30 million lives, the Emperor and other Manchu leaders bestowed military titles and commendations upon him.  Hence, Imperial Britain both violated and rescued the sovereignty of China in accordance with mutual national interests.  As such, the tacit alliance between Beijing and London during the Taiping Rebellion partly undercuts the Chinese claim of “humiliation” and decline at the hands of the British and illuminates how the protean temper of realist politics structured state-to-state relations – as Black ultimately argues.  

 

Paradoxes without Conclusions: The Elaborate Legacy of the British Empire 

On 6 February 2012, the United Kingdom launched a week of Diamond Jubilee events to honor the sixty-year reign of Queen Elizabeth II.  At the time of her succession in 1952, Winston Churchill had recently returned to 10 Downing Street to serve his second and final stint as prime minister. Despite the loss of India to independence five years earlier, the British Empire remained intact and seemed poised to survive after effectively mobilizing its colonies to defeat Nazi Germany. Only a few decades later, however, Imperial Britain fell under the weight of inexorable nationalist tides in Egypt, Sudan, Kenya, Nigeria and elsewhere – prompting myriad scholarly assessments of its legacy – a legacy not easily ascertained due to the vicissitudes of its mission and rule.  

 

Over nearly three centuries, the British Empire 1) engaged in economic exploitation and increased trade and wealth, 2) aided and abetted the slave trade and led in the worldwide abolition of slavery, 3) rendered irreparable damage to indigenous cultures and communities and introduced (and sometimes imposed) the concepts of parliamentary democracy and individual liberty for the benefit of minority populations and 4) ruled with degrees of repression and protected peoples from the rule of other empires bent on abject subjugation (e.g. the Empire of Japan, Nazi Germany, the USSR).  While Black lapses into a biased apologia and generalizes at the expense of factual evidence in several instances, Imperial Legacies, on the whole, delivers a long overdue re-contextualization of the British Empire in an age of empires and nonetheless possesses a plethora of salient historical judgments worthy of scholarly consideration.

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172788 https://historynewsnetwork.org/article/172788 0
Can the NRA Survive Its Current Crisis?

The National Rifle Association faced the most serious crisis of its modern existence. Dissidents within the NRA directly challenged Executive Director Wayne LaPierre’s leadership. Not only had the nation’s oldest gun group’s net worth declined by more than $60 million, exacerbated by stagnant membership numbers, but LaPierre stood accused of wasteful spending, as did the NRA’s public relations firm, Ackerman-McQueen. One former NRA insider accused both of “financial bloodsucking.” Worse, the NRA suffered political reversals at the polls, and its reputation was further tarnished in the aftermath of a devastating massacre and the organization’s maladroit public comments. The crisis came to a head over a celebrity brought in to improve the organization’s image and save LaPierre’s job.

This crisis moment for the NRA came not last year, but over twenty years ago. LaPierre weathered that storm, including political losses in the 1992 elections and the fallout from the 1995 Oklahoma City bombing, thanks in part to his installation of actor Charlton Heston to beat back the insurgents and improve the NRA’s tarnished image. The ploy worked, and the NRA went on to help elect gun-friendly president George W. Bush, paving the way for the enactment of its top policy priorities.

As the NRA’s comeback 20 years ago suggests, only a fool would count the NRA out now as a key force in the gun issue. Early in the twentieth century, the fledgling organization nearly disappeared, rescued mostly thanks to government-lent support to buttress its shooting and training activities. Among the NRA’s rescuers was hunting enthusiast Theodore Roosevelt.

But its current woes—financial, political, legal—overshadow past crises.

The story begins, ironically, with its jackpot political bet on Donald Trump’s presidential bid. Pouring $31 million into his campaign—triple what it spent on Mitt Romney’s 2012 presidential campaign—and over $50 million in all into the 2016 elections, the NRA sat on top of the political world at the start of 2017. Yet in a shocking series of cascading crises, it all rapidly unraveled.

Media investigations reported disturbing and possibly illegal connections, perhaps including money laundering, between the NRA and Russian officials tied to the 2016 election. The NRA embarked on a series of cost-cutting measures, culminating in an angry and litigious split with the NRA’s long-time ad firm, Ackerman McQueen, which had received decades of lavish annual fees—$40 million in 2017 alone. Then the NRA pulled the plug on its expensive and little viewed online media outlet, NRATV.

Validating the NRA’s money woes, it was outspent in the 2018 midterm elections, for the first time ever, by gun safety groups, and a bevy of candidates around the country ran and won on a gun safety agenda. These reversals were spearheaded by a reinvigorated gun safety movement led by students from Marjory Stoneman Douglas High School in Parkland, Florida, the scene of the worst high school shooting in many years.

And then came the NRA’s annual spring convention. Usually a picture of gun rights unity and pride, this year’s event was a public relations debacle. The NRA’s president, Oliver North, issued a letter to the NRA’s board accusing LaPierre of profligate and improper personal spending, including $275,000 on clothing from a Beverly Hills boutique and travel to several posh resorts, and citing a charge of sexual harassment. North insisted that LaPierre step down, calling the situation an “existential crisis.” LaPierre fired back, accusing North of extortion and of having an improper relationship with Ackerman-McQueen (which also represented North and paid him $1 million a year; ironically, LaPierre and his wife both had long ties to the ad firm). LaPierre prevailed, and North learned that he was out while the convention was still underway.

Yet that didn’t stop the bad news revelations: since 2010, the NRA had drawn over $200 million in cash from its non-profit NRA Foundation to keep the doors open. As of the end of 2017, the NRA’s available assets were in the negative, to the tune of $31.8 million. In the last ten years, while its revenues grew only 0.7% per year, its expenses grew on average 6.4% per year. Recent annual deficits ran to $40 million. Some board members, including former congressman Allen West and former NRA president Marion Hammer, loudly condemned LaPierre’s profligate ways.

On the legal front, a New York state agency ruled that the NRA’s “Carry Guard” insurance program, designed and marketed by Ackerman-McQueen to generate revenue by providing policies to cover self-defense shootings, violated state insurance rules. Critics dubbed it “murder insurance” because it could provide coverage for a criminal act. The NRA sued; the expensive litigation continues. The state’s Attorney General launched an investigation into the NRA’s tax status (the NRA is chartered in NY). NRA officials have already been served with subpoenas. Congressional investigations are following similar leads, including possible campaign finance law violations involving Russia.

In the latest turn, the NRA’s chief political strategist and second-in-command, Chris Cox, was suddenly forced out by LaPierre. Cox was accused of conspiring with North, a claim Cox denied. The sudden ouster was an even greater shock than North’s, since Cox had been with the organization for 24 years and was viewed as LaPierre’s successor. Several wealthy donors to the NRA announced that they would stop all contributions until LaPierre was dismissed.

Each of these revelations is a body blow. Cumulatively, they could spell the end of the NRA as it has operated in modern times. Here are at least four likely developments:

--the NRA will survive, but not in its current incarnation. No stopgap cash infusion can cure its money problems and profligate spending habits.

--the NRA will ultimately need new leadership. No Charlton Heston can repair the damage.

--the NRA will not be a major factor in the 2020 elections. It will continue to speak out on gun issues, but it does not have and will not acquire the kind of cash needed to be a real player in time. NRA loyalists will continue to back Trump, but they would do so anyway.

--Despite other obstacles, gun safety groups will have an open field to advance their cause. On the other hand, the NRA’s political muscle lies less with money than with its loyal and devoted grassroots base, which remains intact. State gun rights groups will work to mobilize that base in the months to come to fill the void created by the NRA’s manifold problems.

The NRA has survived existential threats from within and without, and it’s nearly impossible to imagine gun politics today without the bulldog NRA chewing its way through the political landscape. Yet it is equally difficult to imagine how it can regain its bygone political mojo.

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172782 https://historynewsnetwork.org/article/172782 0
The History and Mythology of the Mayflower Arrival in 1620

The Mayflower and its ‘Pilgrims’ remind us of an event which has entered into the cultural DNA of the United States. This is so, despite the fact that those who sailed and settled did so as English citizens and subjects of the British crown. As with many such formational national epics, myths jostle with realities as events are remembered and celebrated. Three myths particularly stand out from this epic story.

The myth of Plymouth Rock and Mary Chilton’s bold leap ashore has reverberated through art and popular accounts of the Mayflower. It was a 17th-century event which might be summed up as: “One small step for a woman; one giant leap for the USA.”

The problem is that this story only emerged in 1741 when a project to build a new wharf at Plymouth, Massachusetts, prompted a ninety-five-year-old local resident - Elder Faunce - to claim that a particular rock (about to be buried in the construction process) had received the first step taken onto the shore. Elder Faunce had been born in about 1646, but he had heard of the rock’s ‘history’ from his father, who had arrived in Plymouth in 1623 (three years after the Mayflower). However, no account which is contemporary with the landing in 1620 substantiates this claim. Neither William Bradford nor Edward Winslow made any reference to the ‘rock’ in their records relating to the momentous arrival.

Despite this, the rest, as they say, is ‘history’. Or rather, ‘mythology’. But today that potent myth is enshrined (literally) on the sea shore under its classically-inspired canopy. The rock itself is now much reduced in size, having been broken by various attempts to move it and by the actions of souvenir hunters. Nevertheless, it is a reminder of the power of such symbols to engage with personal and community imaginations.

Then there is the myth of Thanksgiving. As with Plymouth Rock, the image of this event is vivid. And yet the reality is that nobody in the fall of 1621 would have described what occurred as  ‘Thanksgiving’. This was because ‘Thanksgivings’ were solemn observances, with long services, preaching, prayer and praise. They did not officially have one of these until July 1623.

What occurred in 1621 was a ‘Harvest Home’ celebration. We do not even know exactly when it happened; but it probably took place in late October or early November. Native Americans were definitely there. However, whether they were invited in gratitude for their assistance or simply arrived because food was available we cannot know. What is strange is that when William Bradford later compiled the record known as Of Plymouth Plantation he failed to mention the event at all. He just said that the Pilgrims enjoyed “good plenty” after the harvest of 1621. He had clearly forgotten the event!

If it were not for the 115 words preserved in another document, called Mourt’s Relation, we would know nothing about it whatsoever. This account, probably written by Edward Winslow, says that after the harvest was safely brought in, four men were sent off on a day of duck hunting to provision a special celebration. This celebration included marching and the firing off of muskets, viewed by both Pilgrims and Native Americans. This was then followed by a feast that lasted three days. To this feast the Native Americans added a contribution of five deer. No turkey or cranberry sauce was present.

The myth of virgin territory is rather less specific but it tends to color much of how we view the matter of the Pilgrims’ arrival and settlement. In this construct we imagine the arrival of the Mayflower as the first footfall of Europeans on a territory hitherto untouched by such arrivals. Nothing seems to convey the sense of their epic voyage as much as the impression of ‘first contact’ between the emigrants and a landscape and native community that had no previous connection with Europeans. From this perspective, the First Encounter with the Nauset people, on Cape Cod in December 1620, seems to reveal a native community whose first reaction to the newcomers was inexplicably hostile.

The reality was much more complex and reveals both the remarkably connected nature of the northern Atlantic communities by the 1620s and the reasons why the Nauset were so unwelcoming in their reaction to the exploratory party of Pilgrims. French and English fishermen and Basque whalers had been landing on the New England coast for over a generation. This helps explain why the Pilgrims were later assisted by Native Americans (Samoset and Tisquantum) who could speak English. It also means that alien diseases had cleared coastal communities before anyone’s foot was placed on the legendary ‘Plymouth Rock’.   

As early as 1616, perhaps as many as ninety per cent of the people living in the vicinity of what would become Plymouth had died in an epidemic. This was why the Pilgrims found cleared fields but no native inhabitants there. And the reason for the hostility shown by the Nauset was because they had lost members to English ‘traders’, who had kidnapped them as slaves. It was a slaving expedition that had taken Tisquantum to England, via Málaga in Spain, and then back to his (now devastated) North American home, with the ability to speak English.  

While writing Mayflower Lives (published by Pegasus Books, New York) and exploring the contrasting lives of 14 ‘Saints’ and ‘Strangers’, I found the interplay between myth and reality readily apparent. This is not to disparage the impact of the Mayflower voyage and settlement through a reductionist revisionism. It is simply to reaffirm the central nature of disentangling myth from reality in any historical exploration of this momentous time. That is the very nature of historical enquiry and we should not fear it, even when applying it to an iconic event.

Perhaps more importantly, though, it is a testimony to the remarkable potency of the Mayflower and its legacy, that it has become the stuff of mythology as well as of history. I think that the original Pilgrims would have understood, for they believed that what they had embarked on was not a run-of-the-mill activity. They believed that they walked hand-in-hand with providence and this was the foundation of their mindset and outlook. They always believed that what they were doing was of greater significance than it might outwardly appear.

In the 21st century we may agree or disagree with their perspective on life, but what is undeniable is the fact that what they achieved has inspired the imaginations of huge numbers of people and still challenges us, as we seek to understand it today. It is both history and myth intertwined to a remarkable extent.

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172776 https://historynewsnetwork.org/article/172776 0
We Need a Unifier in 2020

 

Two recent mass shootings in El Paso and Dayton remind us how hate-filled our country has become. In an online manifesto that appeared just before the first shooting, the killer of 22 people wrote: “This attack is a response to the Hispanic invasion of Texas. . . . I am simply defending my country from cultural and ethnic replacement brought on by an invasion.”  

Some Democratic 2020 presidential candidates faulted President Trump for stoking racism and targeting Hispanic Americans and immigrants. Bernie Sanders, among others, also criticized the National Rifle Association (NRA) and Republican lawmakers who supported its opposition to meaningful gun control. 

Writing of the 2016 presidential election, historian Jill Lepore observed, “The election had nearly rent the nation in two. It had stoked fears, incited hatreds.” She also believed that the Internet and social media had “exacerbated the political isolation of ordinary Americans while strengthening polarization on both the left and the right.” Now, three years later, the situation has worsened. Shortly before the El Paso shooting, the shooter’s manifesto appeared on 8chan, an online message board, where he encouraged his “brothers” to spread its contents widely. As a New York Times article noted, “In recent months, 8chan has become a go-to resource for violent extremists.” 

In contrast to such violent extremism, some of our finest Americans urged the opposite approach.  Dorothy Day, a woman identified by both Barack Obama and Pope Francis as one of our great Americans, stated, “We must always be seeking concordances, rather than differences.” In his Keynote Address to the 2004 Democratic Convention, Obama (then still only an Illinois state senator) voiced a similar appeal. He urged us to remember that “we’re all connected as one people. . . . It is that fundamental belief . . . that makes this country work.  It’s what allows us to pursue our individual dreams, and yet still come together as one American family ‘E pluribus unum’ [as is stamped on much of our currency]. Out of many, one. . . . There is not a black America and white America and Latino America and Asian America [nor he indicated a Red-State America and a Blue-State America]; there is the United States of America.” The young Obama then told his listeners, “That’s what this election is about.  Do we participate in a politics of cynicism or do we participate in a politics of hope?”  

In 2010, now as president, he reiterated his plea for unity and told University of Michigan graduates, “We can't expect to solve our problems if all we do is tear each other down. You can disagree with a certain policy without demonizing the person who espouses it.”

Although President Trump has often demonized those who oppose him, we should remember that we can vigorously oppose his policies without demonizing all his supporters. In 2020 we need a unifier, as Franklin Roosevelt was: a unifier who can help us narrow the yawning gap between Republicans and Democrats and between different segments of our multiethnic society.   

Cory Booker began his presidential campaign in early 2019 acknowledging our nation’s divisiveness and stressing the need for love and unity.  Another candidate, South Bend (IN) mayor Pete Buttigieg, has urged the need for political humility, for trying to understand the position of those with whom one disagrees, and for realizing that some Trump supporters can, in many ways, be good people. 

For various reasons, however, neither Booker nor Buttigieg has emerged, at least yet, as one of the top four Democratic candidates. But like Dorothy Day, Obama, and Lepore, the two candidates are correct to urge more mutual understanding. Political compromises may not always be possible. Nor are they always for the best. But humility, empathy, and open-mindedness regarding political opinions, our own and others’, are worthwhile goals that should not be minimized.

Seeking unity out of the diverse composition of our multifaceted nation has always been the aim of our most enlightened leaders, from the framers of our constitution to President Obama. In his acclaimed biography of George Washington, Ron Chernow writes of our first president as “the incarnation of national unity” and as “an apostle of unity.”  On the eve of the Civil War, Abraham Lincoln told our nation, “We [North and South] must not be enemies. Though passion may have strained, it must not break our bonds of affection.” We should listen to “the better angels of our nature.”

One of Lincoln’s contemporaries and a great admirer of him, the poet Walt Whitman, also stressed what unified us as a people. In a recent Atlantic essay on the poet, Mark Edmundson wrote:

At a time when Americans hate one another across partisan lines as intensely perhaps as they have since the Civil War, Whitman’s message is that hate is not compatible with true democracy, spiritual democracy. We may wrangle and fight and squabble and disagree. Up to a certain point, Whitman approved of conflict. But affection—friendliness—must always define the relations between us. When that affection dissolves, the first order of business is to restore it. 

Several years after Lincoln’s death the great abolitionist Frederick Douglass gave a major speech praising the composite nature of the United States. His message was that if “we seek the same national ends” our diversity—“Indian and Celt; negro and Saxon; Latin and Teuton; Mongolian and Caucasian; Jew and Gentile”—can be a great blessing. 

A great admirer of Whitman and Lincoln—his six-volume biography of Lincoln earned him a Pulitzer Prize in History—poet Carl Sandburg extolled the beauty of our composite nation in all its glorious variety. The son of Swedish immigrants himself, he appreciated our ethnic diversity. His Jewish friend Harry Golden wrote in 1961 that “the fight against anti-Semitism and Negrophobia had been a special project” for Sandburg. As a newspaperman in Chicago after World War I, he printed the platform of the National Association for the Advancement of Colored People (NAACP), and was later honored by being made a lifetime member of that organization. During World War II, he hired two Japanese-Americans to work for him during the same period that over 100,000 other such Americans were being uprooted and sent to internment camps. In addition, he wrote a column warning against such prejudice. 

Although a strong supporter of Franklin Roosevelt, he also understood and appreciated political differences. In his long, long poem The People, Yes (1936) he wrote: 

The people have the say-so. 

Let the argument go on

. . . . . . . . . . . . . . 

Who knows the answers, the cold inviolable truth?  

 

Yet, like Whitman and Lincoln he championed national, and even international, unity. In the Prologue for The Family of Man (1955), a book of photographs from around the world, he wrote of how alike we are “in the need of love, food, clothing, work, speech, worship, sleep, games, dancing, fun. From the tropics to arctics humanity lives with these needs so alike, so inexorably alike.” 

 

His friend Adlai Stevenson (the Democratic presidential candidate in 1952 and 1956) once said that Sandburg was “the one living man whose work and whose life epitomize the American dream.” In 1959, on the 150th anniversary of Lincoln's birth, he addressed a Joint Session of the U. S. Congress, becoming the first private citizen to do so in the twentieth century. After his death in 1967, he was honored by almost six thousand people at the Lincoln Memorial, including President Lyndon Johnson, Chief Justice Warren of the Supreme Court, Sandburg’s friend Justice Thurgood Marshall, and various poets and members of Congress. 

 

Another poet who died the same year as Sandburg, but who first rose to fame as part of the Harlem Renaissance of the 1920s, was strongly influenced by both him and Whitman. This was Langston Hughes, and his poem “Let America Be America Again” (1935) reflects their spirit. 

 

Let America be the dream the dreamers dreamed—

Let it be that great strong land of love. 

. . . . . . . . . . . . . . 

I am the poor white, fooled and pushed apart, 

I am the Negro bearing slavery's scars. 

I am the red man driven from the land, 

I am the immigrant clutching the hope I seek—

. . . . . . . . . . . . . . 

O, yes, I say it plain, 

America never was America to me, 

And yet I swear this oath—

America will be!

 

Decades later, Martin Luther King, Jr. (MLK) expressed a similar dream in his most famous speech: “I have a dream that one day down in Alabama. . . little black boys and black girls will be able to join hands with little white boys and white girls as sisters and brothers. . . . With this faith we will be able to transform the jangling discords of our nation into a beautiful symphony of brotherhood.” 

 

Five years later, immediately following news of MLK’s death, Senator Robert Kennedy (RFK) urged a similar unity: “What we need in the United States is not division; what we need in the United States is not hatred; what we need in the United States is not violence and lawlessness, but is love, and wisdom, and compassion toward one another, and a feeling of justice toward those who still suffer within our country, whether they be white or whether they be black.” (For more on King and Kennedy, see here.)

 

Now, a half century after hatred took the lives of MLK and RFK, rancor again runs rampant in our country. Although we need a unifier to emerge as president in the 2020 election, we perhaps first need the electorate to value and seek unity, to remember that the aim of politics should be to further the common good, not just our own narrower partisan interests. 

 

Recently some groups and individuals have moved in that direction. In December 2016 a group of Trump and Hillary Clinton supporters got together in Ohio and from that meeting emerged “Better Angels,” a “national citizens’ movement to reduce political polarization in the United States by bringing liberals and conservatives together to understand each other beyond stereotypes, forming red/blue community alliances, teaching practical skills for communicating across political differences, and making a strong public argument for depolarization.”

 

On an individual level, writers such as the recently deceased Tony Horwitz, George Saunders, and Pam Spritzer have, like Dorothy Day, MLK, RFK, Barack Obama, and Pope Francis, advised dialogue and working (in King’s words) “to transform the jangling discords of our nation.” In the months leading up to the 2020 elections we would do well to heed their advice. We can do this not only by working toward the defeat of the divider Donald Trump, but also by replacing his politics of divisiveness with one of inclusiveness and compassion.

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172765 https://historynewsnetwork.org/article/172765 0
The Cultural History Behind Once Upon a Time...in Hollywood

Note: the following article contains spoilers

 

 

Once Upon a Time…in Hollywood, a film filled with nostalgia for old movies, music, television programs, cars, and celebrities, is Quentin Tarantino’s love letter to Los Angeles.  In an interview with Entertainment Weekly, the writer/director declared:  “I grew up in Los Angeles….the only people who love it the right way, are the people who grew up here….The film became a big memory piece.”  Though he does not make much of an effort to dig deep into historical issues, he creates a cast of characters (some real, some fictional) and provides images that offer an interesting and sometimes provocative glimpse of life in L.A. in 1969. He also explores some of the tensions between old-time Hollywood and the 1960s counterculture that was becoming more prevalent and menacing by the end of the decade.  Tarantino leaves few doubts about where his sympathies lie.  

 

The plot, if we can call it that, is very simple, covering just three days in 1969, two in February and the August day of the Manson family massacre.  We follow the activities of two fictional aging Hollywood figures, a former big-time TV star, Rick Dalton (Leonardo DiCaprio), and his stunt double, Cliff Booth (Brad Pitt).  Dalton is past his prime as an actor and is struggling to remain relevant, though his roles are now limited to playing the “bad guy,” which talent agent Marvin Schwarzs (Al Pacino) tells him is the kiss of death.  The movie also follows Sharon Tate (Margot Robbie) as she shops for presents for her husband Roman Polanski, sees herself in a movie, and parties at the Playboy Mansion with numerous celebrities.  As Rick works on the set of the TV show Lancer, Cliff spends his days fighting Bruce Lee, fixing Rick’s TV antenna, and picking up a hitchhiker who happens to be a member of the Manson “family.”  After a six-month stay in Italy where Rick tries to revive his career by taking roles in spaghetti westerns, Rick and Cliff return to LA several hours before the time of the massacre. 

 

Los Angeles was (and still is) a city filled with cars. Many scenes in the movie are set in cars travelling through the remarkably recreated streets of 1960s L.A. The director emphasizes the personal nature of the film by using shots in moving cars that are pointed upwards, as if from the perspective of a six-year-old Tarantino sitting inside his stepfather’s Karmann-Ghia, which happens to be the type of car that Cliff drives when he’s not using Rick’s. Through the eyes of a child, we see the billboards for old products like RC and Diet Rite Cola, movie theater marquees announcing names of films being shown, and, of course, classic cars like Mustangs, Cadillac Coupe de Villes, and VW Beetles.  The producer acquired nearly 2,000 cars to use in the background to help set the tone. Avid car enthusiasts may be bothered by the inclusion of some cars that had not been produced before 1969, but the overall impact of the old cars in the film is terrific.  

 

Car radios supply much of the soundtrack throughout the film, which includes a number of hit songs by well-known artists who remain popular to this day, such as Neil Diamond, Deep Purple, and Simon and Garfunkel. Yet the soundtrack goes beyond a predictable 60s-greatest-hits collection and includes numerous songs by lesser known acts like the Buchanan Brothers, the Box Tops, Buffy Sainte-Marie, and Willie Mitchell, which puts the viewer in the back seat of a car listening to whatever happens to come on the radio, just as it would have been in 1969 before the days of personal playlists and specialized satellite radio channels.  We even have to hear the commercials. Tarantino also includes some news bulletins, including one on Sirhan Sirhan, Robert Kennedy’s assassin.  (More of these kinds of bulletins would have added to the richness of the historical context.)  The soundtrack also raises some interesting references in the film. For example, Cliff is listening to “Mrs. Robinson” as he eyes a flirtatious hippie teenager named Pussycat, who ends up being a member of Manson’s cult.  In an interesting twist on the song and the film with which it is inextricably linked, The Graduate, the older Cliff rebuffs Pussycat’s precocious sexual advances because she can’t provide proof of her age.  The scene also brings to mind a comparison with one of the historical figures in the film, Polanski, who in 1977 sexually assaulted a thirteen-year-old girl.  

 

More than anything else, Once Upon a Time is about the entertainment industry, and it is filled with dozens of references to movies and TV programs.  In one of the more memorable scenes, in the middle of the day Sharon Tate walks into an LA theater showing The Wrecking Crew, starring herself and Dean Martin, and asks the manager if she can go in for free because she is in the movie, just as Tarantino had once done while trying to impress a date by taking her to see True Romance, which he had written.  Instead of reshooting scenes from The Wrecking Crew with Margot Robbie playing Tate’s role, we see the actual fight scene between Tate and Nancy Kwan that was choreographed by Bruce Lee, as the fictional Tate (Robbie) soaks up the audience’s reaction.  In this writer’s favorite scene, as Rick reflects on how his career would have been so different had he been given the iconic role of Virgil Hilts in The Great Escape instead of Steve McQueen, Tarantino splices Rick into the actual movie.  We see Rick’s Hilts defiantly delivering the same lines (“I intend to see Berlin…before the war is over”) to Commandant von Luger as he is sent to “the cooler” after his first escape attempt is thwarted. We also see many classic TV shows like Mannix in the background of many scenes, a show Brad Pitt told Entertainment Weekly was his father’s favorite. The film includes many actors playing the stars of the era.  At a party at the Playboy Mansion, we see Michelle Phillips, Mama Cass, and Roman Polanski.  Damian Lewis, who bears a striking resemblance to Steve McQueen, makes the iconic star seem rather creepy and strange as he talks about Tate and Polanski’s relationship.  Mike Moh’s portrayal of Bruce Lee is also unflattering, so much so that the martial arts star’s family publicly objected to it.  

 

An overarching theme of the film is the clash of old Hollywood and the counterculture.  Early in the film, a group of teenage hippies—who end up being part of the Manson cult—is shown digging through a dumpster as they sing the lyrics to an actual Charles Manson song, “I’ll Never Say Never to Always”:  “Always is always forever/As long as one is one/Inside yourself for your father/All is more all is one.”  Rick and Cliff, the Hollywood heroes, are repulsed as they catch a glimpse of the hippies in the dumpster and they frequently show contempt for them throughout.  Though Manson (Damon Herriman) appears only once in the film driving his Twinkie truck outside the Tate/Polanski home months before the murders, his presence is felt throughout by the way his “family” members talk about him.  There is a very convincing portrayal of the cult at its home base at Spahn Ranch, an old site used in westerns like the ones Rick used to star in.  

 

Cliff, a decorated war veteran from either World War II or Korea, is the hero in this movie.  It’s not hard to imagine Cliff doing stunts for actors like Gary Cooper and John Wayne. Though Cliff was rumored to have killed his wife (in a flashback, we see Cliff on a boat pointing a spearfishing gun at his wife as she berates him for being a “loser,” but we don’t see him pull the trigger), he acts with a cool, detached dignity for most of the film. For example, after refusing Pussycat’s sexual advances in his car, he drops her at Spahn Ranch where she lives with dozens of members of the Manson “family.” He asks the teen and other members of the cult about George Spahn (Bruce Dern), whom he remembers from his work on Rick’s TV show at the ranch. In his attempt to get to Spahn, Cliff encounters Squeaky Fromme (the future would-be assassin of Gerald Ford played by Dakota Fanning), who refuses to allow the stuntman to see Spahn. (Tarantino accurately portrays the fact that Fromme had a transactional sexual relationship with Spahn that enabled the cult to live at his ranch.  Spahn also gave her the infamous nickname by which she is known.)  Cliff calmly yet firmly informs Squeaky that he’s coming in and that she can’t stop him. Once convinced that Spahn is not threatened by the hippies, Cliff leaves the ranch, but not before pummeling one of the male cultists for slashing his tire.  

 

As the time of the massacre approaches, the washed-up Hollywood duo are hardly in any condition to heroically prevent the horrific violence at Tate’s home next door to Rick’s.  As the Manson murderers walk up Cielo Drive, Rick is drunk, floating in the pool with headphones on, and Cliff is at the beginning of an LSD trip from an acid-laced cigarette. At this point, Tarantino abandons the original events and creates a fictional ending.  Instead of breaking into Tate’s home, the three cult members (four were actually present) go to Rick’s, only to be brutally beaten (in Tarantino-style violence) by Cliff and mauled by his pit bull.  Rick is oblivious to all of this until one of the screaming assailants jumps into his pool to escape.  As if to punctuate the point that these old-time heroes are still relevant, Rick retrieves a flame thrower used in one of his movies to incinerate a group of Nazi officers, and turns it on the girl flailing in the pool to eliminate the threat of the cult.  (We see the scene from Rick’s movie earlier, and it’s a clear reference to the climax of Tarantino’s Inglourious Basterds.)  The film concludes with a pregnant Sharon Tate and one of the other guests greeting Rick and finding out about all the commotion. Old Hollywood has saved the day.  

 

Make no mistake, this movie is a folktale, just as the title suggests.  Tarantino does not really attempt to explore in any great depth the many critical political, economic, and social developments at this critical juncture in history.   Viewers looking for signs of the environmental distress in the city brought on by the ubiquitous cars, racial tensions in the aftermath of the Watts riots, or other issues confronting the city will be sorely disappointed.  But like most folktales, Once Upon a Time…in Hollywood is filled with interesting characters, events, and messages from a bygone era.  

 

                  

 

                  

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172769
Russian History Gives the United States an Ominous Warning  

       

 

         Russia is often in the news these days – corrupt and repressive at home, aggressive and malevolent in relation to neighbors and rivals. Yet this Russia is heir to a country that shaped the twentieth century and had a formative impact on the cultural and political history of the modern world. It cannot be dismissed as a plaything of Vladimir Putin’s arrogant ambitions. Over the past hundred years, Russia has been a bellwether, not an exception. We should take heed.

 

         Russia has more than once demonstrated the ease with which complex societies can fall apart. It has shown how difficult it is to uphold the legitimacy of nations and to install and sustain democratic regimes. The country we know as the Russian Federation changed names, borders, and political systems twice in the course of the twentieth century. We remember the collapse of the Soviet Union in 1991. A megapower suddenly vanished – the ideology that sustained it deflated like a punctured balloon. The periphery defected – fourteen former Soviet republics emerged as independent nations. Nevertheless, Moscow remains the center of a multiethnic territory that continues to span the Eurasian continent. Democratic in form, authoritarian in practice, Russia is still a major player on the international stage.

 

         In 1991 the center not only held, but other structures also persisted. Positions of power within the Soviet hierarchy translated into opportunities to amass personal fortunes. Alumni, like Putin, of the Soviet political police, the former KGB, used their insider status and institutional leverage to shape a new form of authoritarian rule. A pseudo-capitalist oligarchy arose on the ashes of Soviet Communism, while the welfare of the majority of the population eroded. What we now deplore as the increasing disparity between rich and poor in the developed industrial nations was, in Russia, an instant product of Communism's collapse. Much has improved in the post-Soviet sphere: open borders, an uncensored press, freedom of speech and assembly (though increasingly imperiled), prosperity for the better off, not just the elite. But the rule of law is a mere fig leaf, and the hope that free markets would result in a free society has not been fulfilled.

 

         This recent transition – by now thirty years old – was not the first time the center held, against all odds, and the promise of liberation was disappointed. Seven decades earlier, between 1917 and 1921, an entire civilization collapsed and a new one was founded. In 1913 Tsar Nicholas II celebrated the three-hundredth anniversary of the Romanov dynasty; in August 1914 he took Russia into World War I on the side of the Allied powers. In March 1917 mutinies in the imperial armed forces, bread riots by working-class women, industrial strikes in the key cities, and peasant revolts in the countryside led to the defection of the military and civilian elites. For years, a burgeoning civil society and a disaffected radical fringe had been dreaming of change – the one of the rule of law, the other of socialist revolution. When Nicholas renounced the throne, a seven-month experiment in democratic politics ensued – at the grass roots in the form of elected soviets (councils), on the scale of empire in the form of elections to a Constituent Assembly. Millions voted at every level; democracy was in the air. Yet, the Provisional Government, which honored Russia’s commitment to the Allied cause, could not cope with the same war that had proved the monarchy’s undoing. In October 1917, the Bolshevik Party, under the leadership of Vladimir Lenin, arrested the liberal ministers, took control of the soviets, and heralded the installation of the world’s first socialist government.

 

         Few thought this handful of radical firebrands would stay in the saddle. The old elites launched a fierce opposition. As committed internationalists, anticipating the outbreak of world revolution, the Bolsheviks immediately sued for peace. In March 1918 they signed a separate treaty with the Central Powers but the fighting nevertheless continued. Though relatively bloodless, the October coup unleashed a plethora of brutal civil conflicts lasting another three years. The old regime, desperate to bolster its popularity in wartime, had mobilized the population against itself, demonizing the inner enemy, only to weaken itself from within. The fissures held together by autocratic rule and the imperial bureaucracy now broke open – class against class, region against region, community against community.

 

         Armies were not the only combatants in the struggle for independence and domination that followed 1917. All sides used the energy of popular anger to strengthen their own cause. As early as December 1917 the Bolsheviks had established a political police, ancestor of the KGB, to direct the furies of class conflict at officially stigmatized social categories and political rivals. Defenders of monarchy vilified the Jews; Polish and Ukrainian nationalists took aim at each other. In the context of fluid and endemic violence, vulnerable communities bore the brunt. Across the former Pale of Settlement (abolished after March 1917) tens of thousands of Jewish inhabitants were murdered; Muslims and Christians in the Caucasus settled old scores. Enemies and traitors, real and imagined, were everywhere targets of spontaneous and organized rage.

 

         The Civil War was a consequence of state collapse, but it generated the birth of nations. Poland, Finland, and the Baltic States, with outside backing and by force of arms, established their own borders. World revolution had not materialized; the movement for national self-determination, a Leninist slogan, took its place. In the end, the Bolsheviks maintained control of the heartland – moving the capital from Petrograd to Moscow in March 1918, conquering the breakaway Ukrainian provinces, defeating a range of military and ideological opponents on both the Right and the Left, and reconstituting a massive, multiethnic state on the footprint of the former empire. They created a new, “people’s army,” using the endemic violence of social breakdown to form a new type of regime, energized by continuous internal struggle. The self-proclaimed dictatorship of the proletariat promised a new and higher form of democracy and the inauguration of a new and brighter era for humankind. Instead, it resulted in a system that inflicted untold damage on its own population: forced collectivization, murderous famines, purges, and the Gulag.

        What eventually became the Soviet Union in 1924 nevertheless survived the Stalinist Terror and the onslaught of World War II, playing a decisive role in Allied victory. With Stalin’s death, the system began slowly to soften, but until the last moment the basic principles of class warfare and ideological dictatorship endured. Soviet Communism is now dead; people are beginning to forget why it invoked passions on both sides – either fiery commitment or moral outrage. The Western democracies cannot boast, however, of the triumph of capitalist markets and liberal constitutions. Civil societies are generating antidemocratic populist movements, and corrupt and self-serving politicians are brazenly flouting the law. Racial and cultural antagonism and nationalist fervor, encouraged from on high, bolster the power of corrupt elites. Trump and Putin more and more mirror each other. The democratic impulse that flourished in the Revolution and was defeated in the Civil War emerged again in Russia after 1991 but has once again been foiled. We can’t afford to look down our noses at Russia. Its history over the last century should give us pause. Great Powers die and democracy easily withers.    

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172774
America’s Self-Cultivation Crisis

 

No sooner did candidate and self-help guru Marianne Williamson engineer her breakout moment in the Democrats’ presidential debate on July 31 in Detroit than she found herself panned for half-baked views on depression and mental health. But Williamson’s quixotic campaign has highlighted one salutary theme: America had better learn to up its game in cultivating civic empathy lest the “dark psychic force of collectivized hatred” of which she spoke tear us apart.

 

Mass shootings in El Paso and Dayton over the Aug. 3 weekend, in which hate-filled gunmen killed 31 people and wounded dozens more, brutally underscore the point. White supremacists and weaponized haters represent the antithesis of civic empathy, and by now we know good intentions alone won’t fix the curse of gun violence in America; we need consensus and action on sane gun-control measures. We also need a more robust empathy offensive to reknit our fraying commonweal.

 

What’s stopping us? The list is as long as a Donald Trump necktie, but let’s start with the president. 

 

For someone as uncouth as our trash-talking commander in chief, personal cultivation can evoke images of raised teacups, curled pinky fingers and snoots in the air; high culture is hedge-fund moneybags snapping selfies with a Hollywood celebrity at a golf tournament. In the president’s incurious, ego-bound world, self-promotion trumps self-cultivation. But nurturing respect for the urge to improve ourselves for the common good is as American as Abe Lincoln’s bootstrapping fondness for book learning or the heroes championing literacy and reading programs today. 

 

Even so, rowdy disregard for soulful striving is as old as it is nonpartisan. “It’s a revolt of the mediocre many against the excellent few,” wrote New York Times columnist Bret Stephens, speaking specifically of today’s “campus radicals” on the activist left. “And it is being undertaken for the sake of radical egalitarianism in which all are included, all are equal, all are special.” But you could make a similar argument about radical populists on the right, given the way such populism, as Stephens says, “emboldens offense-takers, promotes doublethink, coddles ignorance … [and] gets in the way of the muscular exchange of honest views in the service of seeking the truth.”

 

Purposeful self-cultivation is the natural antidote to that kind of obdurate yahooism. No, it’s not likely to dilute the toxic delusions of hardcore white nationalists any time soon, if ever. But just as Trump’s hate speech has created a climate in which hate groups can flourish, it’s important in our competitive, free-agent nation that we work on a counter-climate—one that helps blunt our sharp elbows and creates space for sober reflection based on thought, study and regard for the importance of issues beyond the self. Civic-minded cultivation values individual well-rounding, dedication to craft and quiet competence. The rub, sad to say, is that ever-larger segments of American society appear to want none of it. 

 

Author Tom Nichols argues our “Google-fueled” culture has eroded respect for personal achievement in the public interest. Skepticism of the high and mighty is a time-honored and healthy feature of American democracy. Yet today, as Nichols says in his 2017 book “The Death of Expertise,” we’re no longer just properly skeptical about our experts. Rather, “we actively resent them,” he writes, “with many people assuming that experts are wrong simply by virtue of being experts. We hiss at ‘eggheads’—a pejorative coming back into vogue—while instructing our doctors about which medications we need or while insisting to teachers that our children’s answers on a test are right even if they’re wrong. Not only is everyone as smart as everyone else, but we all think we’re the smartest people ever …. And we couldn’t be more wrong.”

 

It’s hard for Americans to cultivate fruitful conversation when we’re shouting across a mountain range of misplaced ego, let alone a cavernous income divide. For a majority of American workers real wages haven’t budged for some 40 years; the ever-widening gap between the rich and the rest now means that America’s “1 percent” averages 39 times more income than the bottom 90 percent; women on the job make 79 cents to men’s dollar, and the income split between whites and minorities has deepened.

 

Yet it’s clear the great American income squeeze has hit more than the pocketbook. How many people have time or energy to read a book, enjoy a concert, enroll in tango lessons or get involved in community building activities on a sustained basis when they’re struggling to keep heads above water? How do you fructify life in a world of shifting job prospects, burdensome college debt and eclipsed expectations?

 

“If we pull back from a narrow focus on incomes and purchasing power …  we see something much more troubling than economic stagnation,” Brink Lindsey argued in The American Interest. “Outside a well-educated and comfortable elite comprising 20-25 percent of Americans, we see unmistakable signs of … social disintegration — the progressive unraveling of the human connections that give life structure and meaning: declining attachment to work; declining participation in community life; declining rates of marriage and two-parent childrearing.”

“This is a genuine crisis,” said Lindsey, “but its roots are spiritual, not material, deprivation.”

Little wonder cognoscenti have touted a link between self-cultivation and self-preservation since time out of mind. In the ideal state, Cicero said, the individual “is endowed with reason, by which he comprehends the chain of consequences, perceives the causes of things, understands the relation of cause to effect and of effect to cause, draws analogies, and connects and associates the present and the future” so he can assess “the course of his whole life and makes the necessary preparations for his conduct.”

 

My maternal grandmother, Alice Brasfield, didn’t know from Cicero, but she saw the linkage between self-cultivation and survival clear as day. Forced to quit school at 12, she outlasted a gothic girlhood in 1890s Canada by reading voraciously and committing the dictionary to memory. When I knew her in the 1950s, Alice had cultivated a light touch on the piano, wrote thoughtful letters in an elegant hand, and relished handing all comers their rear-ends in Scrabble. She preached old school: reading until eyeballs bled, knowing some poetry, a few songs and jokes by heart, and learning to offer others something in conversation beyond self-regard. She had nothing against baseball, but bristled at my decision to give up the music lessons she paid for to dawdle, inconclusively, on the diamond.

 

Of course, it was easier working toward such high-minded goals in the booming economy of 60 years ago when earning a living wasn’t as much of an uphill fight as it can be today, and time moved at its less frantic, pre-digital pace.  

 

Like Alice, millions of working-class Americans, who had scaled the rough side of the mountain, saw self-cultivation not only as a stepping stone to a more complete life but also as a boon to community. While looking forward to the Book-of-the-Month Club selection landing in their mailboxes, working-class folks read the news as a civic duty and treated the art of eyeball-to-eyeball conversation as a serious pastime. Even late-night TV tipped its hat to the higher culture, wedging in, among the stupid pet tricks and celebrity buzz, literary lions like Lillian Hellman, James Baldwin and William Saroyan. 

 

Empathy grew from the urge to experience a more expansive life. As Anton Chekhov put it in a letter to a troubled brother, the cultivated “have sympathy not for beggars and cats alone. Their heart aches for what the eye does not see.”

 

Philosopher John Dewey saw that impulse as vital in a democracy, the goal of which, according to Bill Kovach and Tom Rosenstiel in “The Elements of Journalism,” “was not to manage public affairs efficiently. It was to help people develop to their fullest potential.” As Dewey himself said, closing the loop between individual and community, “I believe that education is a regulation of the process of coming to share in the social consciousness; and that the adjustment of individual activity on the basis of this social consciousness is the only sure method of social reconstruction.”

 

Embracing culture allows individuals to see the kind of “dark psychic force” Marianne Williamson cited, as well. In his momentous novel “Invisible Man,” for example, Ralph Ellison, in advising us to remember that “the mind that has conceived a plan of living must never lose sight of the chaos against which that pattern was conceived,” suggests that power and entitlement, misused, are a force for social disintegration and blindness. Making the invisible visible, on the other hand, gives a society greater tensile strength.

 

Today, in a country riven by matters of race and gender, immigration and identity, and rural vs. urban rivalry, we’re at a historically delicate moment. “American confidence is in tatters …” New York Times columnist David Brooks wrote. “As a result, we’re suffering through a national identity crisis. Different groups see themselves living out different national stories and often feel they are living in different nations.” What’s needed, as Brooks suggests, is for Americans to create a new national story to help us explain to ourselves who we are and what we value to the point of action, and that’s not possible without the exercise of civic empathy.

 

As a college teacher, I’m hopeful we’ll get there. Young people I know, students and former students now in their 20s and 30s, are making headway against our material-driven culture by opting for downsized homes and more frugal lifestyles. Too often that’s out of necessity, but the shift also speaks to a focus on “genuine” rather than “plenty,” and a growing recognition that unchecked materialism not only plays havoc with the ozone layer, but punches holes in the soul in a way that only psychic income, not greenbacks, can fill.

 

Wild prediction: Marianne Williamson will not become our next president. Nonetheless, her call for spiritual renewal—you might call it a New Deal for Hearts and Minds—makes good practical sense. Sustained work at self-cultivation opens the eyes and feeds the spirit, defuses hair-trigger judgments, and generally makes for a more even-keeled society.

 

So, here’s a message for candidates who do have a shot at becoming president: Turn your telescopes around—see the state of the nation’s soul, not as new-age mumbo jumbo, but as an umbrella idea that houses important but necessarily wonky policy prescriptions for fixing immigration, healthcare, income inequality, access to education and student debt.

For most of us mortals, cultivating the self, and adding our “light to the sum of light,” as Tolstoy put it, is an elusive goal. But it’s worth aiming for. At a minimum, it’s the best revenge for living in a savage world. It’s also a prudent bet that integrating more rounded lives into our society will give us one that works better than it does now. Our survival may depend on it.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172767
We Must Stop Valuing Guns More Than People

A memorial for the victims of the El Paso mass shooting

 

There is a darkness within the soul of America. It shows itself with increasing frequency all across the country, from garlic festivals to shopping malls, from schools to churches and synagogues. But these mass shootings are only symptoms of the disease at the core of the nation. Some of the worst diseases have the fewest symptoms, and the very worst have only one symptom: death with no warning. 

 

The wave of mass shootings and the many thousands of others killed one by one are only symptoms of the American disease. It is a disease most people deny, but denial does not make it go away. The disease in the American soul is that we as a society value guns more than people. It is a choice this country has made, and one it has no real desire to cure. 

 

That this country values guns more than people doesn’t prevent the predictable reactions after the shootings: The cries of oh no, not again. The wringing of hands. The tears rolling down the cheek. The expected question of why doesn’t somebody do something to stop this, with the unspoken answer that nobody will do anything about it. We as a society are deeply hypocritical on this issue, publicly crying that something must be done without any real desire to change things. If we were ever to do something about this, which is improbable at best, we would have to first admit that we care more about guns and gun owners than about the murdered and maimed victims of guns. 

 

America’s deep devotion to guns expresses itself in many other ways, most strongly in the modern distortion of the plain meaning of the Second Amendment. When the Constitution was in the process of ratification by the original states, one concern raised about taking this major step into a new government was whether each state could still maintain its local militia. Would, they asked, the Massachusetts or Georgia or other state militias be abolished or absorbed into a national army? The framers of the Second Amendment answered by guaranteeing “well regulated” state militias, now more commonly known as state national guards, alongside the federal army and navy. It never precluded individuals from owning firearms, but it never granted civilians an unlimited right to own and use any gun they wanted. If “well regulated” applied to organized military units, it certainly more than applied to individuals. The firearms industry and its lackey the National Rifle Association are lying when they claim the Second Amendment is anything more. 

 

The cliché response to all mass shootings, including the latest in El Paso and Dayton, is to proclaim that our thoughts and prayers are with the victims. Thoughts and prayers are fine, but God can’t solve this problem. Nor is it enough to answer, as most people (including a majority of gun owners) do on polls, that they support background checks, assault weapon bans or other measures that would make at best a small dent in the problem. To think, in light of repeated shootings followed by nothing, that answering a poll will change anything is, in plain terms, stupid. 

 

If mass shootings are ever to end, which they almost certainly never will, it will require first admitting that we as a nation have valued guns over people. Then we would have to make the deepest change in our soul since the Civil War, when a majority stopped believing that black people were no more than livestock to be bought and sold. If we as a nation and as individuals could change to that degree, we might change things. 

 

Because most of us will not be killed by a gun we can continue as before, pretending to care about something we have no will or intention to change. If we are not willing to change let us at least treat the disease by being honest about it.    

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172768
Art and Defacement: Basquiat at the Guggenheim

 

Consider the following facts as you wend your way to the Guggenheim Museum and its uppermost gallery, where you will presently find The Death of Michael Stewart (1983), Basquiat’s gut-punching tribute to a slain artist and the centerpiece of an exhibition that could hardly be more timely. Black people are three times more likely to be killed by police than white people. According to mappingpoliceviolence.org, in 2014 fewer than one in three black people killed by police in the U.S. were suspected of a violent crime and allegedly armed. As the American pediatrician Dr. Benjamin Spock once observed, “Most middle-class whites have no idea what it feels like to be subjected to police who are routinely suspicious, rude, belligerent, and brutal.”

Such brutality is the focal point of Basquiat’s “Defacement”: The Untold Story, an exhibition that commences from a painting created by Jean-Michel Basquiat in honor of a young, black artist – Michael Stewart – who met his tragic end after he was supposedly caught by the New York City Transit Police tagging a wall in an East Village subway station during the early morning hours of September 15, 1983. What precisely transpired that night remains unsettled to this day, but what is sufficiently known is that the twenty-five-year-old Stewart was handcuffed, beaten, and strangled by a nightstick chokehold – likely causing a massive brain hemorrhage. He fell into a coma, never regained consciousness, and died two weeks later.

Other artists, among them Andy Warhol and Keith Haring, responded to Stewart’s death with commemorative works of their own, which are featured in the exhibition. Also included is a yellow flyer created by David Wojnarowicz – portraying the officers with vicious, skeletal faces – to announce a September 26, 1983, rally in Union Square in protest of Stewart’s “near-murder,” when the young man was still languishing in a coma, “suspended between life and death.” Basquiat must have seen Wojnarowicz’s poster (which was taped “all over” downtown, as another artist recalls), and it apparently served as a direct source for the composition of Basquiat’s painting.

The Death of Michael Stewart (informally known as Defacement) was originally painted directly onto the drywall of Haring’s Cable Building studio; it was later cut out of the wall and placed within an ornate gilded frame, which Haring hung immediately above his bed. There the painting remained until Haring’s death from AIDS-related illness in February 1990. Two positively ravenous officers – with pink flesh and blue uniforms – wield their nightsticks above a solitary, black and haloed figure fixed motionlessly between them. In the upper register of the painting, above the trio of figures, is the word ¿DEFACEMENTO? – a word that during the 1980s was often used as a term for graffiti. In the context of the painting, the artist draws our attention to the reality that what was truly being defaced was not a piece of property but a life: it is the police officers, teeth bared and thirsty for blood, who are committing the act of defacement, of disfigurement.

Basquiat’s art was constantly in dialogue with the history of Western painting, and in this case his work may be seen as revisiting and restaging a classic theme – namely, the flagellation of Christ. The exhibition includes several other works by Basquiat dealing with closely related subjects that occupied him throughout his relatively short but intense and extraordinarily prolific career. 
Irony of a Negro Policeman (1981), La Hara (1981), and Untitled (Sheriff) (1981) all take up the themes of white power, authority, and law enforcement – generally portraying the police as frightening and monstrous. La Hara is an especially mesmerizing work, the title of which – repeated four times in the upper left-hand portion – refers to a Nuyorican/Boricua slang term for a police officer, derived from O’Hara, since at one time a large contingent of New York City law enforcement was Irish. The officer in this work is downright scary: with a ghostly white complexion, bloodshot eyes and crooked, menacing teeth, set within a jaw open wide enough for the figure to be talking to us – all of which conveys a kind of seething rage, ready to explode in violence at the slightest provocation. As with many of his figures, Basquiat has painted this officer with his rib cage exposed, and in certain areas we can see right through him to the fire-engine red background. In other words, what we have is a skeletal figure whose bleached white bones invoke a kind of living dead: not simply a monster but an abomination.

Charles the First (1982) and CPRKR (1982), both references to jazz saxophonist Charlie Parker, are among the paintings in which Basquiat champions and glorifies the father of bebop – granting him, in fact, the stature of a king. These two works, different as they are from Defacement, nevertheless share certain themes with it. At a basic level, all three works are concerned with death, and precisely the death of the young, black, male artist. CPRKR is a kind of grave marker for Parker, who was dead at thirty-five: a minimalist work consisting almost entirely of the initials in the title, references to the place (“THE STANHOPE HOTEL”) and year of Charlie Parker’s death (“NINETEEN FIFTYFIVE”), and a cross. At the bottom of the work, Basquiat has printed the name “CHARLES THE FIRST.” Charles the First abounds with references to the life and work of the great musician, but two features are particularly notable in the present context. At the painting’s top left corner is the word “HALOES” – indicating that in Basquiat’s scheme of things Parker is also a kind of saint, one of a number of characteristics he shares with the Stewart of Defacement. At the bottom of the painting, Basquiat issues the warning “MOST YOUNG KINGS GET THEIR HEADS CUT OFF” – which at the very least reminds us that, for Basquiat, a premature death is the price the black artist pays for genius. Basquiat himself died in 1988 at the age of twenty-seven from a heroin overdose.

The Guggenheim’s glance back to 1983 and the death of Michael Stewart accomplishes what art exhibitions should, but all too rarely do – it grants us perspective on our present moment, a way of confronting the reality we are currently living and navigating. We all know the names of unarmed black men who recently had their lives cut short – Trayvon Martin (killed in 2012), Eric Garner (killed in 2014 by an illegal chokehold like the one that killed Stewart), twelve-year-old Tamir Rice (shot dead in 2014 by white police officer Timothy Loehmann), eighteen-year-old Michael Brown (also shot dead in 2014, by white Ferguson police officer Darren Wilson), Philando Castile (killed in 2016) – and the list goes on. The show does not allow us to forget that this violence has a long, painful history in America. Basquiat’s “Defacement”: The Untold Story does what exhibitions should do – it tells us a story we don’t want to hear but need to hear.  

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172771
Donald Trump is no James Monroe

 

 

When the vice-presidential candidates squared off against each other in the 1988 debate, Lloyd Bentsen delivered one of the sharpest political blows ever landed on an opponent. Republican Dan Quayle was proudly touting his years of experience and equating them with John F. Kennedy’s 14 years in Congress before his 1960 presidential campaign. That’s when Bentsen pounced on the unsuspecting Indiana senator in a memorable and flawlessly delivered take-down: “I served with Jack Kennedy. I knew Jack Kennedy. Jack Kennedy was a friend of mine. Senator, you’re no Jack Kennedy.” It was a defining moment, and Bentsen’s language has since been used as a formula for other political insults.

 

Whether he knows it or not, Trump casts himself as a latter-day James Monroe: he claims to have done what Monroe actually accomplished. Trump insists he is America’s favorite president, that he has made America great again, and that he is immune from impeachment because “you can’t impeach somebody that’s doing a great job.” But how does President Trump really stack up, especially when measured through the lens of history? How does Trump compare to the last of our founding-father presidents, who sought reelection in 1820? Are Trump’s claims to greatness justified, or is he embellishing his record, just as Dan Quayle boastfully compared himself to John F. Kennedy?

 

There are four key areas where Trump falls short of the reputation and statesmanship of his predecessor, James Monroe, from 200 years ago.

 

Elections

 

Trump will never be as popular as James Monroe was in the early 1800s. When Monroe was first elected in 1816, his electoral victory was impressive: he racked up a rousing 84% of the electoral votes, the 15th-highest percentage ever won. By contrast, in 2016, Trump won just 56% of the electoral votes. Despite the president’s claims that he won the Electoral College in a landslide, his Electoral College victory ranks very low – 46th out of 58 presidential elections.

 

In 1820, Monroe received all the electoral votes except one, coming the closest of any president to tying the unanimous Electoral College victories of George Washington.

 

Without ever cracking a history book, Trump has created his desired narrative that his 2016 election was a very substantial victory despite evidence to the contrary. He also predicts he will win an even more astounding victory in 2020. But in our divided nation, it is a pipe dream to suggest that Trump will come anywhere close in 2020 to Monroe’s stunning second term victory. In fact, there are scenarios in which Trump may lose in a monumental landslide.

 

When it comes to electoral victories, Donald Trump is certainly no James Monroe and never will be. He simply doesn’t measure up to Monroe’s popularity.

 

Leadership

 

Monroe’s impressive electoral victories reflected the hope and sense of national unity and optimism following the War of 1812. In his inaugural address on March 4, 1817, Monroe pledged that “harmony among Americans…will be the object of my constant and zealous attentions.” It was the beginning of the Era of Good Feelings, a catch phrase that came to be associated with Monroe’s presidency.

 

Monroe was a pragmatic president. He tried to govern in a non-partisan manner, noting that “the existence of parties is not necessary to free government.” Monroe recognized his obligations to all Americans and not just those of his Democratic-Republican Party. Biographer Harry Ammon observed that Monroe “viewed the party as embracing all elements of American society and therefore he accepted the fact that it must also adopt measures meeting the needs of the widest possible spectrum of American opinion.”

 

While the Era of Good Feelings characterized Monroe’s tenure as president, Trump’s presidency might best be described in the dark “American carnage” language of his inaugural address. Rather than unite the nation and promote harmony and non-partisan governing, Trump has consistently stirred up and created divisions among Americans and governed from a blatantly political posture that panders only to a dedicated base of supporters. He quickly abandoned his election day victory pledge to “be president for all of Americans” and to “seek common ground, not hostility; partnership, not conflict.” Trump’s leadership from the gutter is a far cry from the stable and enlightened leadership demonstrated by James Monroe.

 

Monroe was certainly the beneficiary of the generally optimistic mood of the nation when he assumed the presidency. He was the right man for the moment, with “a good heart and an amiable disposition,” in the words of one congressman. It’s not a phrase likely to be used to describe Trump, who has also been the beneficiary – and notably creator – of the nation’s mood. He has stoked and fanned the flames of fear, anger, and bigotry in a deeply divided nation.

 

Maturity

 

In 1793, Monroe, then a senator from Virginia, wrote a letter to his friend, Secretary of State Thomas Jefferson, about the futility of responding to foreign and personal insults. “The insults of Spain, Britain, or any other of the combined powers,” he wrote, “I deem no more worthy of our notice as a nation than those of a lunatic to a man in health, - for I consider them as desperate and raving mad.”

 

By personality, Monroe was not a saber-rattling pugilist. He was able to ignore such “desperate and raving mad” comments. In contrast, Trump is, by his own admission, a “counterpuncher.” He allows no insult to go unanswered, and his behavior is quite predictable: while he may have praised an individual in the past, once that person’s real feelings about him become public, he attacks.

 

For example, on April 11, 2018, Trump tweeted his warm regard for House Speaker Paul Ryan, praising him as “a truly good man, and while he will not be seeking re-election, he will leave a legacy of achievement that nobody can question. We are with you Paul!”

 

The feeling was apparently not mutual from Ryan’s perspective. Ryan’s real opinion about Trump leaked out recently from Tim Alberta’s new book “American Carnage.” According to Alberta, Ryan thought Trump was inept:  “I told myself I gotta have a relationship with this guy to help him get his mind right. Because I’m telling you, he didn’t know anything about government…I wanted to scold him all the time.” At another point, Ryan said he saw retirement as his “escape hatch” from having to work with Trump for two more years.

 

With remarkable predictability, Trump tweeted an angry tirade on the same day that Ryan’s comments became public. “Paul Ryan, the failed V.P. candidate & former Speaker of the House, whose record of achievement was atrocious (except during my first two years as President),” the petulant president typed, “ultimately became a long running lame duck failure, leaving his Party in the lurch both as a fundraiser & leader…” Trump continued his Twitter rant against Ryan in two additional derogatory tweets.

 

When it comes to seasoned maturity, Monroe was secure enough in his own skin to recognize the futility of punching back with insults of his own. Trump has never learned that skill but is forever stuck in juvenile behavior that unnecessarily escalates the vitriolic heat in the public square. While Trump envisions himself as the perfect leader without any flaws, his record speaks otherwise, and fails to demonstrate the calm maturity of Monroe who saw the futility of responding to insults.

 

Military

 

James Monroe was the second and last president to have served in the Revolutionary War. He volunteered to fight for independence. Serving under General George Washington during the Battle of Trenton – a surprise attack by American forces on Hessian mercenaries the morning after Christmas – 18-year-old Monroe was one of just a half-dozen American soldiers injured in the fighting. The bullet that pierced his shoulder remained there the rest of his life.

 

In contrast, Donald Trump sought and received a deferment from the draft for military service in Vietnam due to bone spurs in his heels. Serious questions have been raised about whether the doctor issuing the report was simply doing a favor for Trump’s father, or if Trump really did have a medical condition that would disqualify him from military service.

 

James Monroe honorably served the embryonic nation with heroism in the military, while Donald Trump found a way to avoid military service, and doesn’t come close to matching Monroe’s sacrificial service.

 

Conclusion

 

The character and characteristics of the 45th president are a far cry from the honorable service and integrity of our 5th president. Trump fantasizes about big election victories; Monroe actually had them two centuries ago. Trump wants to be viewed as a strong president, but his vision of strength is to divide America while Monroe put his leadership skills to work in uniting the nation with the Era of Good Feelings. Trump has a microscopically thin skin while Monroe exhibited a tough and seasoned maturity by ignoring insults. Trump loves the armed forces enough to throw himself a July 4th extravaganza, but not enough to have served in the military. James Monroe put his life on the line in helping purchase American independence as a soldier. And so to paraphrase the words of Lloyd Bentsen, “I knew James Monroe. James Monroe was a friend of mine. Mr. President, you’re no James Monroe.”

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172779
The Greatest Generals Across Generations

 

 

Reflecting on the 1989 invasion of Panama and subsequent hunt for strongman Manuel Noriega, Colin Powell lamented that, "A President has to rally the country behind his policies. And when that policy is war, it is tough to arouse public opinion against political abstractions. A flesh-and-blood villain serves better." This American tendency to personalize conflicts and world events is reminiscent of the “Great Man” theory of history, which posits that world events can largely be explained by the impact – positive or negative – of individual leaders.

A similar phenomenon occurs in military history, as the analysis of campaigns and wars is often reduced to an assessment of the opposing commanders’ performance. Although the quality of an army’s generals is a critical factor determining victory or defeat in battle, chance often plays as much of a role as skill or institutional variables in determining who commands at a given time or place. Nowhere is this more evident than with the U.S. Army during World War II, which produced the greatest generation of operational commanders in U.S. history. It is easy to view photographs of George Marshall beside his mentor General Pershing, or of George Patton standing next to a tank in 1918, know what they achieved in building and leading the Army in WWII, and therefore perceive their rise as inevitable. Yet as Edward Gibbon observed, “The fortune of nations has often depended on accidents.” Indeed, a string of accidents and coincidences were vital to shaping the roster of U.S. commanders in World War II – and hence the Allied victory – in three underappreciated ways.         

First, there was the simple timing of the war. As the legendary British general Sir Edmund Allenby once told Patton, for every Napoleon and Alexander that made history “there were several born. Only the lucky ones made it to the summit.” In other words, which commanders achieve greatness is partly determined by fate, specifically whether they are actually in command when a great conflict starts. For example, if WWII had broken out in 1934 rather than 1939, Marshall would still have been in the exile pettily imposed upon him by then-chief of staff Douglas MacArthur, serving with the Illinois National Guard instead of a position that allowed him to become what Winston Churchill called “the Organizer of Victory.” Conversely, if the war had broken out in 1944, Marshall would have already retired after serving four years as chief of staff from 1939-1943. Thus, the timing of Hitler’s invasion of Poland and France played a significant role in determining who led U.S. forces in WWII.

        

Second, in Strange Victory Ernest May suggests that the German invasion of France in 1940 was likely saved when a falling boar’s head in the Belgian hotel serving as Heinz Guderian’s command post narrowly missed the brilliant Panzer commander’s head. Similarly, a series of near-misses preserved the leaders who would command U.S. forces in that conflict. On the morning of November 11, 1918, hours before the Armistice, an errant bomb was dropped on the other side of a stone wall from where Marshall was eating breakfast in the 1st Army’s mess. Marshall escaped with just a nasty bump on his head, but as one historian observes, “Had the walls of the old house been less sturdy, a different chief of staff would have led the American armies against the Germans in the next war.” In 1920 Dwight Eisenhower and Patton came within “five or six inches” of being decapitated by a snapped steel cable while experimenting with tanks at Camp Meade, and in 1924 Omar Bradley was earning extra money working construction on the Bear Mountain Bridge when a cable snapped and cut the watch off his wrist. Eisenhower again narrowly escaped death in the 1930s when his plane nearly crashed upon takeoff in the Philippines. Although the pilot announced, “We ain’t going to make it,” the plane cleared the hill at the runway’s end by a few inches. Conversely, his friend James Ord – whom Ike called “the most brilliant officer” of his time – was killed in a plane crash in the Philippines in 1938. Other officers whom fate cruelly denied the opportunity to earn glory in WWII included Adna Chaffee, Jr., the father of the Armored Corps, who died on active duty of cancer in 1941; and Bradford Chynoweth, an innovative contemporary of Eisenhower and Patton's who took command of a Philippine division in November 1941 and hence was doomed to spend the war in a Japanese prison camp after the surrender on Bataan.

        

Finally, some tragedies inadvertently proved fortuitous to the American war effort. If Marshall’s wife Lily hadn’t suddenly died in 1927, he would have remained an instructor at the Army War College. Instead, unable to bear the constant reminders of her at Washington Barracks, his friends on the General Staff arranged for him to become assistant commandant of the Infantry School at Ft. Benning, where two hundred of the instructors and students who served under him from 1927-1932 – including Bradley, Matthew Ridgway, and “Lightning Joe” Collins, amongst others – rose to the rank of general during WWII. Similarly, Eisenhower was called to the War Department after Pearl Harbor because the head of the War Plans Division’s Asia department was killed in a plane crash on December 10, 1941. This accident forced Marshall to find a replacement, thereby setting in motion the partnership that was crucial to winning the war.

        

None of this is to say that individual commanders don’t matter, or that institutional factors such as professional military education or training exercises are unimportant in shaping those men eventually placed in command of great armies. Rather, it is important to recognize that whereas in retrospect history often appears to have unfolded in a straight line, reality is almost always more chaotic and uncertain. In the end, the Greatest Generation’s generals’ triumphs were anything but predetermined, and required a series of accidents and twists of fate to bring them to the point where their innate courage, intelligence, and determination could be decisive.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172778
Tiananmen Square-1989: Beijing’s Amnesia and Memory Hole  

 

 

 

Recently we commemorated the 30th anniversary of the massacre of Chinese students in Tiananmen Square in Beijing.  These students and other participants were peacefully protesting corruption in their government and calling for democratic political reforms in China. On June 4, 1989, their protest was brutally crushed by the People’s Liberation Army acting under the direct orders of Deng Xiaoping. Hundreds, maybe thousands, of students and their supporters were either killed or seriously wounded. Many more were arrested and imprisoned for trying to promote democracy in China.

 

Since that event, the Chinese government has launched a full campaign, largely successful, to expunge this episode from Chinese history and from the minds and memories of the Chinese people. It would also like the rest of the world to forget and disregard what happened on that day. Unfortunately for Beijing, this brutal suppression of political and human rights was broadcast globally on television. To this very day, the Chinese government has never apologized to the victims of this brutal political crackdown, or to their families. Instead, the leaders in Beijing have tried to spread two separate narratives to two separate audiences. To the Chinese people, they say this event never happened. They want the rest of the world to believe it was the fault of the students, who only got what they deserved. But the Chinese Communist Party (CCP) cannot have it both ways.

 

On the one hand, the leaders in Beijing would have the world believe that such a dastardly event never happened. Or, if it did – and this is quite a concession – it was a minor affair. But if it was such an inconsequential event, then why employ the full power of the People’s Liberation Army, which included both armed soldiers, who fired on the students, and tanks? Then, in an absolutely mind-boggling explanation, the CCP has turned itself into the victim and the unarmed students into the perpetrators.

 

As the journalist Louisa Lim argues in her well-researched book The People’s Republic of Amnesia: Tiananmen Revisited, Tiananmen seems to have gone down the Chinese version of George Orwell’s memory hole. In a recent article in the New York Times, she wrote that the CCP has waged a kind of war against any mention of Tiananmen. The episode is erased from official histories. Web sites that document it are blocked.

 

Any mention of this event calls forth Chinese defiance. Recently, Chinese Defense Minister Wei Fenghe told an international conference in Singapore that the student gathering was a kind of riot: “political turmoil that the central government needed to quell.”

 

Within China, the government has tried its best to inhibit or censor internal discussion of Tiananmen altogether. Late last year, the government began a crackdown on Twitter users who posted criticism of Communist rule. A former leader of the 1989 protest was barred from traveling to Hong Kong to commemorate the anniversary.

 

There are reports by students in China that their virtual-private-network services have recently been suspended. VPNs, as they are called, are used by citizens in authoritarian countries to bypass state censorship of the Internet. Some believe that this is directly connected to the Tiananmen “anniversary.” But the official story from Beijing is that it is retaliation for new American tariffs. Whatever the case, there is little doubt that every year the commemoration of Tiananmen is greeted with anxiety, even trepidation, in Beijing. It is simply amazing that this self-described “incident” in Chinese history could still stoke such fear among the Chinese leadership.

 

One wonders why, if for nothing else than its own peace of mind, the CCP does not just admit its culpability for what happened, apologize to the Chinese people, put this event in its rear-view mirror, turn the historical page, and move on? The unwillingness of the CCP to take this seemingly rational step might well speak to its political insecurity. Might this honesty not play well with, and gain the respect of, the Chinese people?

 

But sadly, this is not the case. The actions of the CCP are typical of dictatorships. The Soviet Union, from which China borrowed its Leninist political system, would erase the entries of unpopular officials from official encyclopedias. In recent years, Mao’s last wife and the leader of the notorious “Gang of Four,” Jiang Qing, was cropped out of the picture taken at the funeral of Mao Zedong. Simply put, she was historically erased from this event. Maybe the fact that she ended up in prison, where she committed suicide, had something to do with her trip down Beijing’s “memory hole.”

 

What makes the threat from China quite dangerous, however, is that the Chinese are accumulating the power to also mold the collective memory of people around the world. They are not there yet.

 

But China’s intent to own pieces of the world’s digital infrastructure, as well as social-media platforms, threatens to make the free world’s internet as limiting as the one China imposes on its own citizens.

 

It isn’t just China’s push for Huawei to help build 5G wireless networks around the world. China’s ByteDance owns TikTok, a social-media platform popular among young web users in the West.

 

A Chinese gaming company now owns the popular gay dating app Grindr, and China’s WeChat is rapidly expanding its market share in Europe and Asia with its easy-to-use program for paying for goods and services with a phone.

 

If the CCP had learned the lessons of Tiananmen and met the demands of its people for more personal freedoms, these developments would not be so worrisome. In open societies there is a division between private business and the government. But this is often not the case in China.

 

In 2017, the state enacted a new national intelligence law that compels private Chinese businesses to cooperate with the Chinese government. US companies, as a price of doing business in China, have sometimes also cooperated but at least they have a choice.

 

This means that the data gathered by Huawei, WeChat and TikTok can be collected, stored and searched by Chinese security agencies.

 

Minimally, this arrangement would give China the means to bully Western companies into complying with the type of web censorship it imposes domestically on its own citizens. It has already done this with American technology companies when it comes to their Chinese products. A sobering scenario would allow Chinese government officials to mine the personal data of non-Chinese citizens.

 

Fortunately, the United States has begun to meet this threat. It has launched a global campaign to convince allies to block Huawei from participating in the building of national 5G networks. Last month, President Trump signed an executive order prohibiting the purchase or use of communication technologies owned or controlled by a foreign adversary.

 

One can only hope that this is not too late to save the internet from a Chinese takeover. If it is, then the rest of the world could soon be subjected to the war on history China now wages on its own people.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172781
Can Muslims get a fair shake in India?

India is back to square one, thanks to Prime Minister Narendra Modi's move Monday to scrap special political rights for a Muslim-majority state in the Hindu-dominated country. The Muslim-rights issue, which led to the creation of Pakistan as a Muslim homeland in 1947, has now resurfaced: Can the Muslims get a fair shake in India?

 

By scrapping Kashmir's special autonomy status, Modi has taken a dangerous step toward implementing the vision of his ultra-nationalist party's spiritual guru, the late V.D. Savarkar, who proposed more than 90 years ago to keep minorities under control in an India ruled by the Hindu majority.

 

Sitting in a prison cell on the Andaman Islands in the Bay of Bengal, the convicted-armed-revolutionary-turned-Indian-nationalist drew up his solution to the vexing question of India's minorities. His idea: The Muslims and Christians can stay in India, but they will be subservient to the Hindus; they will be granted no rights that may infringe upon Hindu rights; and since they are minorities, they must obey the majority.

 

This was not his initial plan, however. He initially wanted to convert the Muslims and the Christians back into Hindus. But he faced a big obstacle. Savarkar could convert the Muslims or the Christians, but he could not arbitrarily decide their caste. A Hindu must belong to a hierarchical caste, which is acquired through birth only. The Hindu religion does not permit assigning a caste.

 

To get around this barrier, he revised his idea. He decided he was a Hindu, not an Indian. His motherland was Hindustan, which encompassed the land from the Himalayas to the Indus River. Hindustan boasted a rich, 5,000-year-old culture, which had influenced a vast number of people from Greece to Japan. India, by contrast, was a concept championed by the nationalists who wanted an independent, united country for all of its inhabitants, regardless of their religion.

 

Muslims and Christians Unwelcome

In Savarkar's Hindudom, the Muslims and the Christians were less than welcome. He disliked them because of their allegiance to Mecca and Rome; they worshiped foreign gods and had no cultural affinity toward Hindustan. Even though Buddhists and Sikhs were no longer pure, they were still acceptable because their religions originated in Hindustan.

 

Savarkar, an atheist who labeled his vision as non-religious and cultural, was unwilling to give the Muslims a separate homeland next to Hindustan. He feared that even though they were only 25 percent of the total population, they could still someday reconquer Hindustan if they were allowed to have their own country. The Muslims were a small band, too, when they conquered India in 712 AD and eventually built a vast empire. 

 

He figured that the next time around they would be in a much stronger position to repeat their past success, because they would receive support from other Muslim nations. To nip that possibility in the bud, he supported the creation of Israel. He saw the Jewish state as a barricade against the Muslim Arab world.

 

He feared a Muslim resurgence so much that he wanted British rule in India to continue. He sought only dominion status for Hindustan. Only Britain, he believed, was powerful enough to keep the Muslims at bay if they ever attempted to invade Hindustan again.

 

But to his chagrin the nationalist tide swept India, as independence stalwarts like M.K. Gandhi, Jawaharlal Nehru and Moulana Abul Kalam Azad pressed the colonial power to leave. Savarkar's idea took the back seat, but remained very much alive, even though malnourished.

 

After the assassination of Prime Minister Indira Gandhi in 1984, the Indian National Congress party, the champion of secular India, fell on hard times; it had no comparable charismatic leader to carry the torch forward. Savarkar's followers gradually gained ground and picked Modi, who was once condemned globally as the mastermind behind the massacre of Muslims in his home state of Gujarat, as the reincarnation of their guru.

 

Modi shows anti-Muslim bias

With a huge re-election victory two months ago, Modi embarked upon implementing Savarkar's dream to appease his hardcore anti-Muslim forces. First, he nullified a Muslim marriage law that had existed for centuries. India's constitution, however, protects the religious laws of other minority groups, and Modi did not touch them, showing his bias against Islam. Even the Mughals and the British did not touch India's religious laws.

 

On Monday, keeping Muslim leaders under house arrest and deploying tens of thousands of soldiers in Kashmir, the prime minister moved in a blitzkrieg exercise, completed in a matter of hours, to take away the special rights — its own flag, its own laws and property rights — granted to the state by India's constitution.

 

Imran Khan, prime minister of nuclear-armed Pakistan, arch-rival of nuclear-armed India, has threatened war. Pakistan considers Kashmir a disputed territory. China, which occupies parts of the state, denounced India's action as “unacceptable,” but is unlikely to take any military action. Pakistan can do very little on its own, unless it wants to risk a nuclear confrontation. Washington seems less than thrilled to stick out its neck. Nonetheless, the danger level remains high, and the fallout will be felt in India and plague its neighbors.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172770 https://historynewsnetwork.org/article/172770 0
Counterculture 1969: a Gateway to the Darkest and the Brightest

In late December 1966, Time released its first publication for the coming new year, an issue that has annually featured the magazine’s “Man of the Year” since 1927. The choice of Man of the Year goes, as Time reports, “to the person or idea or thing that, for better or worse, has done the most to influence the events of the year.”

 

For most of the years that the Man of the Year had been awarded, the recipients had been individuals – those towering figures of the 20th century from Lindbergh to Hitler to Churchill to Roosevelt.  But for 1966, the honor went, for the first time, to an entire generation: The Time Man of the Year was those Americans under 25 years old – the young people in the United States who, the magazine reported, “had already shown they would remake the world.” 

 

“In the closing third of the 20th century,” Time wrote, “that generation looms larger than all the exponential promises of science or technology: it will soon be the majority in charge.” “Never have the young been so assertive or so articulate, so well educated or so worldly,” Time reported. “Predictably, they are a highly independent breed, and – to adult eyes – their independence has made them highly unpredictable. This is not just a new generation, but a new kind of generation.” 

 

A new kind of generation to be sure. The real impact of young people in 1960s America was beginning to be heard, with groundbreaking influence from the counterculture that was developing across the country.

 

How counterculture arrives

 

Recognizing that a counterculture exists is simple; appreciating its starting points can be more challenging. 

 

How does a counterculture emerge? Books, college courses, and academic careers are built on exploring that question. But the most basic answer is that every culture produces countercultures – to challenge, to question, and to defy mainstream society.   The 1960s would produce its own unique brand of counterculture -- a product of events, timing, and the will of a new generation of Americans to change a host of social ills ranging from civil rights to saving the environment -- with effects that endure today.

 

“Some of the counterculture comes from the deep psychological needs of people to rebel, to create an identity,” said MIT sociologist Gary Marx. 

 

“An important factor had to do with the end of World War II – the triumphs, and the economic expansion,” Marx said. “Depression-era people had to struggle and were focused on obtaining some kind of economic security. Suddenly the next generation saw more affluence, and affluence meant that there was no need to struggle.

 

“Once you had that security,” said Marx, “you had the leisure to engage in alternative kinds of ideas.”

 

And in the 60s there were more young people than ever to consider these ideas – a direct result of the Baby Boom of 1945 to 1964.  By the middle of the 1960s, there were nearly as many Americans under 25 as over 25 (by the end of 1969, 100 million Americans in the Baby Boom Generation would be 25 or under).  

 

The moment was right for protest, change, and rebellion.

 

 

 

Baby boomers: the key to the 1960s

 

“The story of 1960s counterculture is the story of those baby boomers,” said Ken Paulson, dean of the College of Media and Entertainment at Middle Tennessee State and former editor of USA Today. “With baby boomers, you have a generation that had far more free time and far more disposable income. They had discovered that by sheer numbers they could drive demand for the things they cared about.”

 

The counterculture of the 60s represented not only activism about national issues, but it also inspired rejection of social norms.

 

“There were at least two very distinct countercultures, especially here in California,” said Stanford cultural historian Fred Turner.  “One centered in Berkeley – the new left, doing politics to change politics; and the other centered in San Francisco – wanting to avoid politics, celebrating consciousness, psychedelia, and transformation of consciousness.”   

 

Vietnam: the center of dissent

 

Overarching all other counterculture grievances -- and the focal point for many of them -- was the war in Vietnam, which, over the course of four presidential administrations, had become a seemingly unsolvable political and strategic catastrophe.  

 

For growing numbers of Americans in the late 1960s, Vietnam was a conflict with none of the clarity and noble mission of World War II, no end in sight, and progress measured only in terms of the growing death toll.  By 1969, almost 50,000 Americans had died in Vietnam, and troop strength was at its highest: some 520,000 men and women.  Everyone in America knew someone fighting in the war, or who had died in the war, or who might be drafted to fight in a country they did not know and for a cause they did not understand.  

 

Colleges as the catalysts

 

In the 1960s, college campuses simultaneously became a norm of middle-class culture and focal points for dissent and protest. 

 

In stark contrast to previous generations, in which education after high school was a rare privilege, college attendance in post-war America exploded; in 1940, about 500,000 Americans were in college; 20 years later, college enrollment had increased to 3.6 million.  College granted more time for reflection and discussion, chances to question the status quo, and opportunities to explore the unconventional to make a better world. 

 

In 1960s America, counterculture became a mix of social activism, environmental awareness, and a platform of expanding demands for civil rights, social equality, and the rights of women.

 

The Free Speech Movement – which had developed at UC Berkeley in 1964 as a response to, of all things, the university’s policies that restricted political activities on campus – took root as the first significant civil disobedience on college grounds; most other major universities would soon follow with their own protests, especially in support of civil rights and opposition to the war.

 

And then there was the most tangible denunciation of all: complete rejection of society, and dropping out entirely. The hippie living in a commune – although communes accounted for only a small share of counterculture lifestyles – became the high-profile symbol of counterculture values.

 

Exploring the divisions 

 

CBS News tried to make some sense out of the impact of counterculture with a project called “Generations Apart,” an ambitious exploration of the generation gap as seen through national surveys and interviews with both young and older Americans. Hosted by reporter John Laurence, the results of the project aired in three broadcasts in late May and June 1969. The programs became a vivid reminder of how countercultural values and generational differences had changed America. 

 

The episode of “Generations Apart” titled “A Profile in Dissent” was particularly hard-hitting, and described how young people (17-23) and parent-age Americans viewed social change and political controversy.

 

“It is a time of dissent for many of America’s young,” said Laurence. “The collision of events that they could not control has caused a challenge to values they cannot accept.”

 

“The majority are quiet as they always are, ready to conform as they always are,” admitted Laurence. “But a growing minority is shaking up society and raising their voice. There is a swelling tide of dissent among the young in America today. It is surging up against some of the most basic institutions of adult society.”

 

The survey showed a widening generation gap on a range of fundamental issues involving sex, religion, drugs, and money. The program’s most startling finding was the broad rejection of the most basic ideals of middle-class values: six out of 10 young people said they want “something different in life” from what their parents wanted. Among college students, only one-third said patriotism is important, while two-thirds of all young people said civil disobedience is sometimes justified.  

 

“Will they find the definitions for the ‘something different’ that they are searching for?” Laurence said of a new generation of Americans. “Only the young can tell us. And maybe not even the young can say for sure how they are going to shape this society, until they are older.”

 

The peak of counterculture

 

By some measures, counterculture in the 1960s would reach a pinnacle in August 1969, when two events in particular – incredibly, only a week apart – would symbolize both the worst and the best of the era: the first -- two nights of murder by the followers of Charles Manson -- would demonstrate the tragic vulnerability of some who sought alternatives to mainstream 1960s America.  And the second -- the gathering of 400,000 in rural New York for “three days of peace and music” that forever after would be known simply as Woodstock -- would showcase the new generation at its best.

 

The counterculture of the 1960s would continue to evolve into the early 1970s.  By the mid-70s, much of the energy of the 60s had dissipated, in part because some of the major goals of the decade, such as expanding national social programs, protecting the environment, and advancing civil rights, had been at least partially achieved – or, perhaps more important, had moved into the ongoing mainstream discussion of America’s political and social concerns. The counterculture of the 1960s, in its endlessly evolving forms, continues today as a broad, influential force in a spectrum of social movements and cultural expression.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172783 https://historynewsnetwork.org/article/172783 0
1919, the Year of Racial Violence: An interview with David Krugler

History is a record of the incessant struggle

of humanity against ignorance and oppression.

Helen Keller, 1918

In the wake of the Great War, Americans were hopeful for a new year of domestic tranquility and prosperity. Black troops came home from the battlefields of France to claim the same democracy for which they had fought. They were badly disappointed. Atrocities by whites against African Americans intensified. Lynching of black citizens continued with impunity and mob violence targeting black citizens exploded in cities across the country. African Americans suffered tremendous losses but also defended themselves against the onslaught of horrific violence launched to protect and enforce white supremacy.

But 1919 became the bloodiest year of racial violence in American history. In his new book 1919, The Year of Racial Violence: How African Americans Fought Back (Cambridge), history professor David F. Krugler vividly details the extent of the violence, from white mob attacks on black citizens in cities from Charleston and Washington, D.C., to Chicago and even Bisbee, Arizona, as well as dozens of lynchings. These ranged from the murders of individual blacks to the greatest mass lynching in our history, at Elaine, Arkansas, where a white-majority response to efforts by African American sharecroppers to organize a union resulted in as many as 230 deaths.

Professor Krugler goes beyond previous histories of this tumultuous period by putting African Americans at the center of the story. He stresses the character of the violence as antiblack collective violence by whites—rather than “race riots.” He recounts the efforts of African Americans to resist the violence through heroic self-defense in the streets, media campaigns to correct inaccuracies in the mainstream white press, and work by the NAACP and others to achieve justice and equal protection through the legal system.

The book also shows how black resistance to white mob violence laid the foundation for the Civil Rights Movement 40 years later that would lead to many of the policies for which the African Americans of 1919 struggled. It’s also the first work to document government efforts to disarm African Americans and to obstruct their legal right to obtain weapons and defend themselves at that time.

Critics have praised 1919, The Year of Racial Violence for its new perspective on history, its extensive and original research, and its lively prose. Adriane Lentz-Smith of Duke University wrote: “This powerful book captures the high cost and high stakes of the War for Democracy brought home. By turns devastating and inspiring, it sets the new standard for exploring African Americans' struggle for safety, truth, and justice in the aftermath of World War I." And Chad Williams of Brandeis University commented: "With meticulous research and narrative force, David Krugler has produced a brilliant account of one of the most turbulent and bloody years in American history. As he powerfully demonstrates, African Americans, in the face of horrific nationwide racial violence, used every tool at their disposal to fight back and preserve both their citizenship and humanity. 1919, The Year of Racial Violence is a landmark achievement." 

David F. Krugler, a professor of history at University of Wisconsin—Platteville, specializes in modern US history. His other books include The Voice of America and the Domestic Propaganda Battles, 1945-1953, and This Is Only a Test: How Washington, D.C., Prepared for Nuclear War.  

Professor Krugler generously talked about his new book on racial violence by telephone from his office in Wisconsin.

Robin Lindley: What inspired your sweeping study on the racial violence in America in 1919?

Professor David Krugler: I originally set out to do a project on all of the post-World War I upheaval in the United States. Because most of my research was in the post-World War II period, I was looking for a new era within modern U.S. history to study.

I began my research with an episodic, sweeping view in mind on 1919. A book came out in 2007 when I was in the early stages of research by Anne Hagedorn, a journalist and historian, called Savage Peace: Hope and Fear in America in 1919. That proved to be great timing on her part because it got me thinking about what I could do that was new. So I decided to narrow my focus, and I’m happy I made that decision to focus on racial conflict and black resistance.

Robin Lindley: I think many readers will be surprised by the extent of racial violence in 1919. You detail many of the conflicts and atrocities, but you also offer a fresh perspective on the violence.

Professor David Krugler: In the original draft, I felt I was dwelling too much on white-on-black violence. I got some good feedback when I was writing. The reader suggested that I foreground black resistance rather than the violence visited upon African Americans.

The extent and the frequency of the violence and the seemingly minor causes of violence are shocking to the modern reader. I wanted to lay the groundwork so that readers understand the ideology of the time and how entrenched white supremacy was.

Today we see white supremacy as extremism and hate groups on the fringe, so it was important to me to lay out how structural white supremacy was. For millions of white Americans there was no discrepancy between being enthusiastic supporters of the war to make the world “safe for democracy,” as President Woodrow Wilson called the Great War, and denying African Americans equal opportunity and constitutional rights because of the scientific racism of the time and because the ideology and practice of white supremacy was so entrenched.

I wanted people to understand why violence and mob violence were used so frequently. Then the narrative turned toward what African Americans did in response so that the book didn’t become an almost numbing narrative of one violent episode after another.

Robin Lindley: You’re very careful about language, and it was instructive that you set aside the casual term “race riot” that was used to describe the violent outbursts and substituted the more precise term “antiblack collective violence,” in recognition that the violence—devastating acts of assault, murder and arson—was initiated by whites against black citizens.

Professor David Krugler: The main problem with the term “race riot” is that it suggests that all the participants have an equal responsibility for causing violence and breaking the law. That just wasn’t the case in 1919 with almost every violent incident involving whites organizing extralegal outbursts to punish blacks for perceived affronts to white supremacy or to punish those who allegedly committed crimes, particularly against whites. To call this violence a race riot would be totally misleading.

I want people to understand that, although African Americans found themselves in the middle of riots at great peril to their own lives, their response was shaped by instinct as well as a thinking decision to resist, not to riot.

Robin Lindley: And the mass violence was so widespread, from Arizona to Chicago to Washington, D.C., and in the South.

Professor David Krugler: It’s important to know that this wasn’t just a southern phenomenon. Indeed, some of the worst riots occurred in the upper Midwest—Chicago, for example.

And Washington, D.C. Even though some classify it as a southern city, in many ways it’s not. It’s the nation’s capital and that made the racial conflict and black resistance all the more public. It was viewed at the time by observers and participants as a particularly revealing episode of racial violence in terms of the causes of why whites attacked blacks and what blacks did in response.

Robin Lindley: When we read about this distant violence, I think there’s a perception that blacks were victims or passive, but you stress that African Americans defended themselves against white attacks. They also were much more likely to be prosecuted for violence than whites, even though blacks were not the perpetrators. In fact, whites were seldom charged.

Professor David Krugler: In my research, I was struck by the press reports of racial violence in mainstream newspapers. Time and again, editors and journalists in mainstream newspapers blamed African Americans for the violence. Because that reflected the beliefs of law enforcement officers and even federal troops, that became a justification for arresting African Americans and charging them with serious felonies when whites who caused the violence were not charged. That came out time and time again in Chicago and Washington D.C. with so many black men charged with carrying deadly weapons or concealed weapons when the facts of these cases showed that these weapons were procured for self-defense. African Americans couldn’t rely on law enforcement to protect them or stop the violence.

Robin Lindley: You explore the historical context of the 1919 riots. Black troops were returning from World War I combat and expecting democracy at home after fighting for it in Europe. Were there more black combat troops in that war than in World War II?

Professor David Krugler: In terms of numbers, there were more black troops in combat in World War II than World War I.

The major black divisions were segregated and assigned to French command in World War I, and it’s revealing that they were the only American troops assigned to French command. All white units remained under the command of General John Pershing and the American Expeditionary Force. The majority of African Americans in uniform were channeled to service positions in labor or supply battalions. Many of the black soldiers then in France were stevedores or doing backbreaking labor. That was true stateside as well.

But the four divisions of black soldiers that were in combat distinguished themselves many times. That’s what got a lot of attention. The demobilization of U.S. forces was so swift and white and black soldiers were returning in large numbers in ship after ship in places like New York and Charleston. That enabled the black units that were acclaimed in combat--divisions such as the Harlem Hellfighters—the 369th, and the 370th, the Eighth Illinois National Guard--to come home to welcoming parades that attracted so much attention and celebration. The arrival inspired African Americans in those cities--those in uniform, those that had loved ones in uniform, and those that lived and worked in the United States— to say, “All right, the time is here for us to have the democracy for which we fought in France. We will do whatever it takes to get it.”

Robin Lindley: And at that time you also had the Great Migration, the New Negro Movement, and the Red Scare, so there was almost a perfect storm in terms of the cry for justice for black Americans and then the white fear of a black uprising.

Professor David Krugler: Yes, these things came together in a short time. The New Negro Movement was well under way before the war, and the war provided a way to gel the movement around at last obtaining constitutional rights and equal opportunity.

At the same time, five hundred thousand African Americans moved from the South to the Midwest and Northern industrial cities, creating strains in those cities. And they experienced racism and segregation in those cities as well but they were finding more opportunity.

The draft brought hundreds of thousands of young black men in uniform to different parts of the country. Many of these draftees and enlistees then encountered the New Negro Movement so that, in some ways, the Army became an incubator for the Movement. There was a lot of awareness of that among white officers even before troops were sent to France. They were raising cries of alarm that there were so many black men in uniform that it would be harder to enforce the status quo. They didn’t quite use that language, but they said, “the Negro question would remain unsettled.” Those were code words for saying that black servicemen were not going to submit docilely to inequality if these forces were unleashed.

Robin Lindley: You also discuss the 1919 violence in terms of the American history of mob violence and “rough justice,” including lynching. What was “rough justice”?

Professor David Krugler: The still prevailing understanding of historic lynching today is that it took place in areas where legal systems were not fully in place so people took the law into their own hands to fill a need.

As historian Michael Pfeifer has shown convincingly and beyond doubt, that wasn’t true. Time after time, lynch mobs carried out their murderous acts in communities that had fully established law enforcement agencies and court systems. The mobs wanted immediate gratification. They did not want to go through the court process, even when a sentence of death was likely for the alleged criminal. They wanted to take that person’s life right away, as Pfeifer and other historians point out.

Much of this violence was directed against African Americans for even minor offenses against white supremacy. This really comes out in 1919, when African Americans were being lynched by white mobs because they refused to yield their vehicle on a road or they didn’t use proper forms of address. So rough justice was used to maintain and protect white supremacy and to terrorize other African Americans to quash the New Negro Movement and to provide the larger white community with the sense that they were in control and had the means to deliver immediate “justice” to those who presented any affront to it.

Robin Lindley: You recount graphically the many instances of horrific violence and atrocities. I think many readers may not know of the violence in Elaine, Arkansas, that resulted in perhaps the largest lynching of blacks in American history with the murder by some counts of more than 200 African Americans.

Professor David Krugler: Elaine was one of the later episodes in 1919 and was the most murderous episode of antiblack collective violence that year. It occurred in late September and early October.

In Phillips County, Arkansas, where Elaine is located, there had been some union organizing by African American sharecroppers who had hired U.S. Bratton, a white lawyer from Little Rock, to represent them. He had sent his son to Phillips County to take testimony from the sharecroppers.

When the planters learned of the unionization effort and the hiring of a lawyer, they cracked down hard. They sent a special agent of the Missouri Pacific Railroad as well as a sheriff’s deputy to break up a union meeting late one night in September. That led to a shootout. The sharecroppers were prepared for violence against them to block their movement. In that exchange of gunfire, the white special agent for the railroad was killed.

This shooting led to the mobilization of a mob of thousands [of whites] who broke into smaller mobs. Many were deputized, so they had the authority of law behind them. The sheriff of Phillips County, Frank Kitchens, used parts of the mob as his posse. Whites from Mississippi joined in as well.

What followed was a massacre—and it is not an exaggeration to use that word. Or a pogrom.

The estimates of the number of dead range from 20 to more than 230. The Equal Justice Initiative of Alabama issued a report this year putting the number of murdered African Americans at 237. Previous accounts I note in the book put the number around 25. The Equal Justice Initiative study hadn’t come out before my book was published, so I used the figure of 25, which is still a large number and almost as many as died in Chicago that year. If the figures from the Equal Justice Initiative are correct, it was by far the bloodiest incident of antiblack collective violence in our history.

And it really began because a group of African Americans were organizing to protect their economic interests. For years, they had been ruthlessly cheated out of their earnings and mired in debt peonage by the planters. They were seeking to break out of those chains, this form of slavery by another name that kept them tied to the land, enriching the landlords while leaving them in poverty. When they made a move to break out of that, the mobs formed to break up the union. They defended themselves and that led to even greater retaliatory violence, which also led to African Americans doing what they could to defend themselves, but they were hopelessly outnumbered.

The sharecroppers were rounded up, and that led to another episode of resistance—trying to save the lives of the men accused of conspiracy to murder whites and sentenced to death.

Robin Lindley: And you detail this story of the struggle for justice for these accused black sharecroppers, the Elaine 12.

Professor David Krugler: A lot of scholars are familiar with the U.S. Supreme Court decision that grew out of this case, Moore v. Dempsey, 1923, which established the federal government’s obligation to ensure that state judicial proceedings protect the constitutional rights of the accused, particularly due process. To the modern person, it would seem self-evident that such protections were required in state proceedings, but that wasn’t the case until that decision. And that decision would not have reached the Court had it not been for the NAACP and a black lawyer in Arkansas, Scipio Jones, who undertook extensive efforts to defend the 12 sharecroppers who were sentenced to death after being tortured and convicted in hurried and grossly biased court proceedings.

Robin Lindley: Weren’t many innocent bystanders killed in the Elaine violence? I recall an incident of a woman and baby who were burned to death in their home.

Professor David Krugler: The mother and baby incident occurred in Florida in 1920.

In Elaine in 1919, there were many instances of people who were not part of the union effort who died. That’s not to say that the sharecroppers deserved what happened to them, but the efforts of the mobs and posses did not just punish the sharecroppers but also terrorized the majority black population so they would never again undertake anything that would question the status quo. With that blood thirst, you have instances like that of an elderly woman murdered and her body dragged out to the road and her dress pulled over her head. There was an elderly man killed in his bed. These were conscious, very disturbing decisions by the mob to set examples.

There are so many of these atrocities in 1919, to dwell on them could be very disturbing and numbing, but we cannot ignore them because they happened and we have to understand why that happened. As I said, my purpose in the book was to show what African Americans did in response when these atrocities were under way.

A photograph of the posse in Phillips County, Arkansas, that attacked black sharecroppers and their families. Source: AHC 1595.2, from the Collections of the Arkansas History Commission.

Robin Lindley: It seems that these acts of violence against blacks occurred for very trivial reasons, and the acts of self-defense by blacks usually were only after black citizens had asked for protection from law enforcement and local government.

Professor David Krugler: That’s an important point. In so many of the cities where the violence took place, the initial response of black organizations—particularly NAACP branches—was to go to the authorities and ask them to restore order.

A great example of this occurred in Washington, D.C., which had a very active and effective NAACP branch. Its officers met with the commissioner of the District of Columbia, the equivalent of a mayor, and the police chief, and they asked for protection. When they got a very qualified response from these two top officials, they were understandably indignant. In fact, the commissioner of the district, Louis Brownlow, was more interested in knowing what the NAACP leaders were going to do. They left the meeting saying it was [Brownlow’s] job to make sure that everyone is protected and order is restored. If you’re not going to do it, they said, the black men in Washington were not going to stand by and let themselves and their families be shot down like dogs.

This is a great example of that initial response seeking the protection of law and order and being rebuffed and not getting the obligatory response from authorities and then undertaking to do it themselves.

Robin Lindley: That failure of officials to protect African Americans was striking. And then, in terms of justice, whites who initiated violence were seldom charged with crimes yet blacks were frequently charged and prosecuted. The NAACP sought equal justice and compensation for damage to African Americans.

Professor David Krugler: That occurred in Charleston, South Carolina, which had the first major outbreak of antiblack collective violence in 1919. White sailors attacked black civilians and black-owned businesses and destroyed some of them. The NAACP branch in Charleston then sought compensation from the Navy, which did nothing. They also tried to get the Secretary of the Navy to do something and that didn’t work.

Seeking unbiased proceedings in courts was consistent in the cities where there was antiblack collective violence and unfair targeting of black self-defenders by authorities. There were all sorts of legal efforts to see that these people received an adequate defense, that they were able to present the facts of the case, that their efforts to defend themselves were presented as such, and that even their possession of a weapon was seen in this context.

This fight for justice as a whole saw success through legal victories, even though there were many setbacks and failures.

Robin Lindley: You note some bright spots in this grim history with some legal victories and the black press acting as a corrective to the biased mainstream white media that presented the perspective of the white majority.

Professor David Krugler: Yes. James Weldon Johnson, an official with the national office of the NAACP, was especially effective in his writing for the New York Age, a black weekly. He took on the misleading and often outright false accounts that were published in the major dailies in U.S. cities about the causes of the violence. Because he went to some of the places where the violence occurred, he was able to provide well-sourced rebuttals to the narratives and establish the real causes of the violence.

Walter White of the NAACP was another writer and he was even more intrepid. He went to Phillips County, Arkansas, and his life was at risk down there when it was discovered he was from the NAACP and was African American. He was very fair skinned so he used that to his advantage to pass as white to carry out investigative work.

White also went to Chicago for its riots. His evidence was not only useful for identifying what caused Chicago’s rioting and for rejecting the stories that were blaming blacks, it was also essential to the fight for justice because he got affidavits from many black Chicagoans on the actions of individual white rioters.

White even delivered much of the evidence to the grand jury that was convened in Chicago to bring riot charges against those who were arrested. As a result of White’s efforts, the grand jury went on strike because in the initial stages of the grand jury hearing, the state’s attorney was presenting only black defendants. After receiving evidence from the NAACP through Walter White that in fact many whites had been arrested and that they were responsible for so much of the violence, the all-white grand jury said they had to see these cases to have a fair judicial proceeding.

Here we see how these efforts tie together—trying to establish the truth about what happened and also secure a chance for fair court proceedings for those who had been detained and charged.

Portrait of Will Brown published in the book Omaha’s Riot in Story and Picture (1919). Source: RG2467-8, Nebraska State Historical Society

Robin Lindley: You also share some vivid photographic evidence from the time. I’ve been haunted by the photograph of the burnt body of Will Brown, a black man lynched in Omaha, since I first saw it as a child in a history book.

Professor David Krugler: That photograph comes from a book published shortly after Omaha’s racial conflict called Omaha’s Riot in Story and Picture. To the modern reader, it’s a disturbing publication because it has the feel of a children’s book. The text is very simple, but the photographs are unforgettable, particularly the one that shows a young Will Brown with a pensive, almost sad expression, and of course, the readers know what happened to him. It’s almost as if, in that expression, he’s showing an awareness of his fate, of the horrible death he suffered at the hands of Omaha’s courthouse lynch mob. There are other pictures that give a sense of how many people poured out in Omaha for the storming of the courthouse and the seizure of Will Brown, who had been falsely accused of the sexual assault of a young white woman and the assault of her boyfriend.

As I describe in the book, those assaults never took place. It was concocted by an Omaha political boss, Tom Dennison. He had been planning for months to discredit and oust from office progressive reformers who had displaced his party. A newspaper associated with Dennison and his machine had been publishing throughout 1919 lurid accounts alleging that black men had sexually assaulted white women and girls. It’s my belief that Dennison saw that those stories created a stir but they didn’t accomplish his basic purpose. I believe he had his young assistant and his girlfriend make up the attack. I think they found Will Brown ahead of time and had him arrested. I don’t think Dennison anticipated that the courthouse would be stormed and Will Brown seized, but that’s what happened.

Will Brown was hanged from a light pole in downtown Omaha and shot numerous times and his body was cut down and burned. There’s a horrific photo of people—men, women and even a little boy—gathered around the burned body of Will Brown. This is where we see rough justice because the mob members didn’t see what had happened as a shameful crime. They were proud of what happened and wanted to be photographed.

Robin Lindley: It looked almost like a party atmosphere as Mr. Brown burned.

Professor David Krugler: The historians I benefited from explain that this goes back to the notion of rough justice that people believe what they’re doing is right and that it’s not actually a crime to kill someone even though they have been arrested and will be going through the court system. By that logic, it shouldn’t surprise us that they would pose for pictures. This photographic evidence of lynchings is substantial. It should shock us and bother us, but when we understand the logic behind it, we have a better sense of why it occurred.

Robin Lindley: 1919 must have been the worst year of racial violence in our history, at least since Reconstruction.

Professor David Krugler: Absolutely. Reconstruction saw some violent years, but in terms of the frequency and the compressed amount of time, 1919 stands out as the most violent year for racial conflict in US history. This perfect storm of these forces coming together helps us understand why 1919 was so remarkable: the end of the war; the purposes for which the US fought; the mobilization for the war; the Great Migration; the New Negro identity; black military service; increasing organizational activities of the NAACP. These and other factors came together to create the conflicts.

Of course, the overriding cause was the determination of many white Americans to maintain the prewar system of racial segregation and discrimination. Because so many African Americans were determined not to return to that and to achieve full equality and opportunity, that produced friction and resistance.

The number of dead was well over one hundred. Depending on how we count the Phillips massacre, from 20 to 237 African Americans were killed there. In Chicago, the death toll was 38 and 23 of those were blacks. Those were the two bloodiest conflicts. When you add up all of the events and then add more than one hundred lynchings of African Americans in 1919, then you have triple digit death tolls. When you consider the loss of property and livelihoods and businesses—especially in Charleston and Longview, Texas—those who survived with their lives lost everything in this antiblack collective violence.

By those measures, there was a terrible cost. But perhaps the greatest cost was to US democracy itself. I wouldn’t want to impose modern sensibilities or expectations on a time one hundred years ago, but it seems that because it was possible for the United States to mobilize for the world’s most devastating war to that time to make the world safe for democracy, it doesn’t seem impossible to also have had democracy at home for all. That’s what African Americans were saying before, during and after the war. And there was agreement from some white Americans. The epigraph that begins the book is from a white officer who helped command the Harlem Hellfighters. He basically said that, having fought to make the world safe for democracy, it’s time for America to have democracy—meaning for African Americans. That awareness was around in one voice after another, but it’s unfortunate that it wasn’t possible at that time.

Robin Lindley: Your book tells a remarkable and largely unknown story. It has so much resonance with the events recently with police killings of black men in Ferguson, Missouri, and elsewhere, and the mass shooting of African American church people in Charleston by a deranged white supremacist.

Professor David Krugler: Thank you. I think there is a lot of contemporary relevance and 1919 has a lot to tell us about where we are as a nation today. I’ll offer one example.

The recently released Department of Justice report on policing practices in Ferguson reads—if you leave out the technology—as if it was from 1919 in terms of the deliberate targeting of African Americans and Ferguson civic leadership viewing black constituents as revenue generators and not even recognizing that they are the people they serve, that these are our citizens and are taxpayers. There was no compunction about deliberately targeting individuals because of the color of their skin. In other words, there was a double justice system in place, and that was true in so many places in 1919 when African Americans were not treated equally by the police and when rioting began and whites attacked African Americans, police went after African Americans. This helps us understand why the history of distrust and friction on the part of so many black communities and the police that are supposed to protect them is at such a low point now.

This isn’t just something that happened recently in Ferguson or was happening in the 1960s to the present. It has a long history. One of the cartoons in the book from a black Washington weekly called The Bee shows a well-dressed black woman approaching a police officer lounging against a street pole while the background shows a white mob attacking African Americans. The black woman asks, “Why don’t you stop them?” And the police officer responds, “Ha, Ha. That’s what I would like to know.” That exchange was actually reported in one of Washington’s daily newspapers. The police were saying they weren’t getting guidance and weren’t told what to do.

From 1919 to the outbreaks in Ferguson, we see this continuum, and perhaps the best lesson is to understand that it’s been a long time in the making and that we need to take steps to break that historical continuity of destructive and often biased policing.

Fortunately today, we don’t have mob attacks to maintain white supremacy, but does that mean all is well? I think the Justice Department report on Ferguson shows that’s not the case.

“This Nation’s Gratitude,” published in the Washington Bee, July 26, 1919, p.1.

Robin Lindley: Recently, on the first anniversary of the killing of Michael Brown in Ferguson, a white patriot group armed with assault weapons arrived on the scene supposedly to assist police. A black commentator doubted that groups of armed black men with assault weapons would have been welcome there.

Professor David Krugler: Yes. The Oath Keepers, I believe they’re called, tried to explain to those organizing protests in Ferguson that, “We’re on your side, to assure the police don’t do anything to you.” But I think it’s a legitimate question about what the response would be if large numbers of African Americans were openly bearing long rifles and assault weapons on the streets.

In 1919, the sight of African Americans bearing weapons or even the fear that they would have access to weapons preoccupied the Military Intelligence Division and the Bureau of Investigation, the forerunner to the FBI. The young J. Edgar Hoover headed one of its units.

In the book, I looked at the efforts to disarm blacks. The federal government, in collusion with local law enforcement and even gun dealers, denied African Americans their Second Amendment rights and denied them access to weapons they needed to defend themselves because the notion, the lie, was that they were organizing uprisings and conspiracies to murder whites en masse.

Robin Lindley: Your book is a vivid reminder of how ignorance and intolerance erode democracy. Thank you for your book and your thoughtful comments Professor Krugler.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/160430 https://historynewsnetwork.org/article/160430 0
An Open Letter to Senator Elizabeth Warren

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

Elizabeth Warren in the Democratic Debate on July 30, 2019

 

 

Dear Senator Warren,

 

Congratulations on your campaign thus far and on your performance during the two sets of televised debates. You have done a fine job of presenting yourself and your ideas for America.

 

But I don’t think the Democratic debates have been successful so far. The apparent need for each candidate to distinguish themselves from the rest, and the relentless efforts of the media, including the debate organizers and hosts, to dig out points of difference and conflict, have resulted in a set of seemingly contradictory messages about what Democrats stand for, in contrast with the much more unified message of Trump and the Republicans. 

 

I have believed for some time that the way for any Democrat to beat Trump is for all Democrats to emphasize what unifies you, or us. I wrote about that idea in May, but the message apparently did not have any effect. So I am trying a different tack: asking you to take the lead in helping the whole field of candidates be clearer about what all Democrats propose to the American people, each in their own way and with their own emphases.

 

I ask you to formulate a statement of Democratic principles and policies that every candidate could accept in a public way, preferably as part of the next debate. That statement should address the underlying agreement among all Democrats on issues that separate us from the Trump candidacy: the intention to address climate change rather than call it a hoax; to raise taxes on the rich, not the rest of us; to fund education, child care and infrastructure more vigorously; to move forward from Obamacare, not destroy it. Imagine the impact on the voting public if every candidate on the stage said the same few words about their support for what surveys show that the majority of the public wants.

 

Success in this effort does not depend on getting unanimous agreement. If someone wants to say, “I don’t support those ideas,” let them. That will further emphasize the unified stance of the rest. Success would be taking control of the debate and the whole effort to get rid of Trump.

 

I ask you to do this because of your status as a leading candidate, your willingness to take the lead in formulating what being a Democrat means, and your fearlessness in articulating your campaign message. Taking this lead may not help you personally to jump over the other Democrats, but it could help all Democrats gather the votes of the majority of Americans who disapprove of Trump. I don’t see how such an effort could harm your campaign among Democratic voters.

 

I understand that your own policy proposals would not get approval from all the candidates, or perhaps even from most. That is exactly what is confusing to the average American voter. You would have to formulate a statement that falls short of the plans that you have outlined. I support your candidacy because of those particular plans. But more important, I believe that the majority of Americans support the general foundation of Democratic ideas and ideals that inform you and the other candidates.

 

The media, from FOX to MSNBC, from the NY Times to the local papers that most Americans read, will not draw the obvious and important contrasts between what Republicans have done and tried to do and what Democrats would do if Trump is defeated. They will keep talking about a horse race, goading you all to attack each other, and emphasizing the smallest differences over the larger consensus.

 

Thank you for your service thus far to all Americans.

 

Steve Hochstadt

Springbrook WI

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/blog/154233 https://historynewsnetwork.org/blog/154233 0
Please stop saying the past is in the past. It isn’t.

Betsy Ross 1777, a ca. 1920 depiction by artist Jean Leon Gerome Ferris, Library of Congress

 

When controversy flared last month over a Revolutionary War-era American flag and Nike sneakers, a reporter asked me what might sound like simple questions: 

 

Should the past stay in the past? Should we be making a fuss over historical realities? 

 

“It’s the history of the United States,” another person told the reporter for the story. “Can’t change it. … What happened happened.” 

 

Here’s the problem with that common sentiment: What happened in the past often has a profound impact -- on real people in real life. Right now. 

 

If you missed it: Nike dropped its Air Max 1 USA shoe after former NFL quarterback Colin Kaepernick worried that the “Betsy Ross” flag design -- with 13 stars in a circle, on the shoe’s heel -- harkens back to an era of black slavery and has been used by white nationalist groups, according to published reports. 

 

That version of the flag dates to the late 1700s. A century before, in 1676, both black and white workers, led by Nathaniel Bacon, a white Virginian colonist, rose up against the ruling body of the Jamestown colony. The uprising led the Governor’s Council to find ways to separate African workers from their white peers. 

 

The event is often credited as the rationale for dividing, demeaning and diminishing the worth of black people in America -- division that we’re still grappling with. It happened. 

 

Next let’s consider the Three-Fifths Compromise, a legislative agreement in 1787 that determined slaves would be considered three fifths of a free person. It happened.  

 

Although the three-fifths designation was ostensibly for purposes of taxation and legislative representation, it helped set the stage for how black Americans are viewed and treated in this country. If you are three fifths of a person, you are much easier to abuse, ignore and oppress. 

 

The three-fifths notion represents the genesis of the present debate on whether a citizenship question should appear on the national census. It explains why the census counts people, not citizens.  

 

More recently, Plessy v. Ferguson, the landmark U.S. Supreme Court case of 1896, cemented the American concept of “separate but equal.” In practice, that has always meant separate and unequal. Homer Plessy, a person of color (one-eighth black), refused to move to a rail car designated for black people only. He was arrested and took his case to the Supreme Court, where he lost in a 7-to-1 decision. It happened.  

 

The aftermath of the ruling led to the rise of Jim Crow laws across the South, affecting such everyday services and accommodations as schools, theaters and travel as well as voting rights. Even today, people are often segregated based on race all over America, and the voting rights of people of color are being challenged in several states.  

 

After enslavement, lynching was a common means of racial intimidation and terrorism of men and women of color. There have been nearly 5,000 recorded lynchings since the late 19th century. But due to poor record-keeping and reporting, it’s likely there have been many, many more. It happened.  

 

As late as 1998, a black man in Texas was lynched, dragged three miles behind a pickup truck driven by three white supremacists. That happened, too. 

 

This “past” has never left. Reincarnations of nooses and their imagery are everywhere and still used to terrorize. Even in 2019, pictures of nooses were posted in a classroom in Roosevelt, Long Island.   

 

When it comes to race in America, the past is not the past. Shakespeare got it right: What’s past is prologue. According to a Pew Research Center survey in June, eight in 10 black adults say “the legacy of slavery still affects black people in the U.S. today.” 

 

Last year, the center reported that “black households have only 10 cents in wealth for every dollar held by white households.” Likewise, the Economic Policy Institute reminds us that in this period of economic boom, black workers had the highest unemployment rate in the country, 6.3 percent -- nearly twice that of whites. And the Centers for Disease Control and Prevention tells us that black Americans “are more likely to die at early ages from all causes.” 

 

I suggest that these indicators, and many more, are not the result of happenstance or coincidence today, but directly caused by things that happened in the past. 

 

The flag credited to Betsy Ross, as an artifact of American history, is innocent. Unfortunately, Kaepernick was correct: This flag has been adopted as a proxy for the Confederate flag and is flown by white supremacist groups such as the Ku Klux Klan, the Patriot Movement, neo-Nazi groups and the militia gang Three Percenters, a group formed after the election of Barack Obama. The throwback flag represents an era when slavery was legal and commonplace.  

 

So is the past really in the past? Or is it a profound part of our everyday lives? To me, the answer is indisputable. 

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172725 https://historynewsnetwork.org/article/172725 0
The Constant Threat of Mass Shootings Requires Increased Protection for Presidential Contenders

The Secret Service flag

 

After the two latest mass shootings last  weekend (August 3-4, 2019) in El Paso, Texas and Dayton, Ohio, Americans once again mourned the lives lost and worried for the future. The weekend’s shootings were just two of 250 such massacres in 216 days in 2019. As many of these tragedies were connected to an explicit political philosophy, Americans must face the threat of assassination attempts against the Democratic Presidential contenders. 

 

In the not-so-distant past, two presidential candidates running for office in the midst of American chaos and division were shot. Senator Robert F. Kennedy was assassinated while running for his party’s nomination in 1968. Governor George Wallace was shot and paralyzed in 1972. I discuss both of these incidents in my book Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (2015).

 

Today, mass shootings often target a specific group of people. For example, self-identified “incel” (involuntarily celibate) young men have targeted and attacked women; Dylann Roof targeted African Americans in the Charleston church shooting; the El Paso shooter intentionally killed Latinos because he worried they would “take over” Texas politics; the Pulse nightclub shooter attacked the LGBTQIA community; and the perpetrator of the Pittsburgh synagogue shooting targeted Jews.  Many of the presidential contenders are women, African American, Latino/a, gay, and/or Jewish. This makes these candidates potential targets of hate crimes. 

 

So it seems reasonable to demand that the US government, through the Secret Service, the FBI, and other security agencies, immediately provide Secret Service protection to the ten candidates who will appear in the Democratic Presidential debates in September and beyond. I believe the risk is high enough that candidates should receive protection for a few months after they drop out of the race.

 

If the government can support the $110 million bill, and counting, for Donald Trump's golf outings in his first two and a half years in office, then the American people must demand protection for those who might be our next President, so that we, hopefully, avoid future tragedies such as those that made Robert F. Kennedy and George Wallace victims a half century ago.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172727 https://historynewsnetwork.org/article/172727 0
Samantha Smith's Dream of Peace and Nuclear Disarmament

Samantha Smith and her letter to Andropov, Samantha Smith Foundation

 

Growing up in Massachusetts during the Carter and Reagan presidencies, I was one of many little kids worried about nuclear war. We knew about the horror of nuclear weapons from the atomic bombs dropped on Hiroshima and Nagasaki in August 1945, ending World War II. 

 

Both America and the Soviet Union were testing a lot of nukes. I told my family "there was no need to test nuclear weapons because if you fire a nuclear missile at a coffee cup, obviously the cup will break."  The Cold War arms race continued nonetheless. 

In Maine, 10-year-old Samantha Smith was also deeply concerned about nuclear war. Her mother encouraged her to write to the Soviet leader Yuri Andropov in 1982. She did, in a personal handwritten letter, telling Mr. Andropov, "I have been worrying about Russia and the United States getting into a nuclear war. Are you going to vote to have a war or not?.....God made the world for us to share and take care of. Not to fight over or have one group of people own it all. Please lets do what he wanted and have everybody be happy too." Off in the mail Samantha's letter went; she probably did not expect a reply. But then, months later, Samantha got the surprise of her life. Her letter was printed in the Soviet newspaper Pravda. Then she got a personal reply from Andropov himself, inviting her to the Soviet Union! The next thing you know, Samantha was on TV too, talking about what she wanted most of all: peace.

Samantha Smith toured the Soviet Union in July of 1983, meeting Russian kids, and became an ambassador for peace and nuclear disarmament. She believed people of rival nations could get along and did not want war. Samantha also visited Japan, reinforcing that nation's desire to eliminate the nuclear weapons whose devastation it had suffered in the atomic bombings of Hiroshima and Nagasaki.

Samantha tragically lost her life in a plane crash in 1985. I remember hearing the shocking news. Samantha's spirit has never been forgotten. This is so important because a complacency has set in when it comes to nuclear weapons. Right now leaders are dragging their feet in reducing the nuclear threat. There are still about 14,000 nuclear weapons in the world according to the Arms Control Association. The U.S. and Russia have about 90 percent of the nukes.

Shouldn't we all be worried about them today, too? There is still the risk of nuclear war, accidental launch or nuclear terrorism. How long do we want to live with this danger? Billions of dollars are poured into nuclear weapons each year. Wouldn't we rather spend this money on fighting hunger, poverty, disease and climate change? The Move the Nuclear Weapons Money Campaign wants to end nuke spending and use it toward the benefit of humanity.

The danger of nuclear weapons is shared by every person, every country. Everyone can take part in the goal of nuclear disarmament, much as Samantha encouraged. I see examples of this idealism today when working with the CTBTO youth group, whose passion is to achieve the Comprehensive Nuclear Test Ban Treaty. This treaty would end nuclear testing forever, helping pave the road for the elimination of all nukes. Samantha Smith taught us that your voice matters and you can make a difference in ridding the world of nuclear weapons and achieving peace for all. 

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172726 https://historynewsnetwork.org/article/172726 0
Woodstock, The Moon Landing and Sesame Street, Too: 50 Years of American Cult Art

The Lantern Bearers by Maxfield Parrish

 

The year 1969 was a seminal one in the life of the Norman Rockwell Museum in Stockbridge, Massachusetts. The hometown of the celebrated American artist, who made the past come lovingly to life on his canvases and magazine covers, opened the museum, mourned the closing of the Saturday Evening Post magazine, so often graced by Rockwell's illustrations, and stood eyewitness to some of the most remembered events in American history: the Apollo 11 moon landing, the Woodstock, N.Y. rock concert (mud and all), the Charles Manson murders, the student rights revolution, the sexual revolution and just about any revolution you could find.

 

The museum is celebrating the year 1969 not with an exhibit about politics or history, but with one about how artists showcased their work in that single year. The exhibit, Woodstock to the Moon: 1969 Illustrated, which opened recently, has an unusual star – the gang from the Sesame Street television show, which debuted that year. When you walk into the first of the two exhibit halls you are face to face, in all of his glory, with the Cookie Monster. He's busy, too, chomping down hard on the first of seven huge chocolate chip cookies. Right next to him is a series of photographs from the first year of the show, along with Sesame Street memorabilia. The lighthearted Sesame Street display sets the tone for the rest of the exhibit.

 

The exhibit contains the art – illustrations, photographs, posters – of 1969 events.

 

You find out things you did not know or simply (in my case) forgot, such as the fact that in 1969 the number one movie at the box office was not an acid rock movie, but a solid, good old western, Butch Cassidy and the Sundance Kid. The winner of the Oscar for Best Picture that year was not one of the tried and true Hollywood hits, but an X-rated movie – Midnight Cowboy ("I'm walkin' here, I'm walkin' here…" says Ratso Rizzo as he crosses a busy midtown New York street).

 

There are numerous colorful movie posters from 1969. Everybody remembers that the hit musical Hello Dolly!, starring Barbra Streisand, debuted that year, but what about the James Bond movie ("Bond, James Bond") On Her Majesty's Secret Service? The Wild Bunch? Once Upon a Time in the West?  And it was the year that one of the most memorable, and funny, movies of all time, Animal House, was released.

 

On the personal side, I forgot that 1969 was the year the satirical National Lampoon magazine was founded (yes, the forefather of all those movies). It was the year four people were killed at a Rolling Stones concert at Altamont Speedway in northern California. It was the year that Elvis Presley began his long tenure in Las Vegas, selling out 636 consecutive concerts. And oh, yes, the cartoon character Scooby Doo made his much-remembered debut. It was also the year that Kurt Vonnegut's wonderful offbeat novel, Slaughterhouse Five, hit the bookshelves.

 

There are posters, large and small, that celebrate many of these films with wild illustrations by the prominent artists of the day, including Rockwell. There are also illustrated posters for Janis Joplin (she of the gravelly voice) concerts, television shows, movies, political races and Presidents (yes, Nixon).

 

There is even a quite scary magazine illustration of the Frankenstein monster for one of his long-forgotten movies.    

 

Another part of the exhibit is a tribute to the moon landing, with lots of photographs and a Rockwell painting of Neil Armstrong and Buzz Aldrin on the surface of the moon.

 

In the middle of one of the halls is a 1969 model television set that continually plays trailers for movies that premiered that year. There is a large and pretty comfortable sofa across from it, so you can sit back and enjoy the history and the memories.

 

The museum exhibit has some problems, though. First, it only fills two halls. The movie posters from that year alone could fill the entire building. Why not make the exhibit as big as possible?  

 

Many parts of 1969 art history are ignored. There is practically nothing from the Vietnam War and even less on student campus riots. There was certainly plenty of artwork for those two epochal events. There is Sesame Street, but what about all the other shows that debuted that year? You want color and drama? What about some photos or artwork about the downtrodden underdogs, the New York Mets, who won the World Series that year? 

 

Even so, Woodstock to the Moon: 1969 Illustrated is worth the trip to the Rockwell Museum, happily celebrating its fiftieth anniversary amid the forested campus where it sits. A look back at the artwork from 1969 is a rare chance for many to gather memories and, for younger people, to learn how artists took on the army of events that happened in that one single year. Much of what we know is from the work of artists, whether filmmakers, illustrators, painters or writers.

 

Besides, how can you go wrong with an exhibit that gives you, in all their glory, John Wayne, James Bond and John Belushi?

 

The exhibit runs through October 27. The Rockwell Museum is just off route 183 in Stockbridge, Massachusetts. The museum is open daily from 10 a.m. to 5 p.m. 

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172724 https://historynewsnetwork.org/article/172724 0
Reflecting on Patriotism in the Trump Era

 

Sometimes these days I am asked what bothers me about the present leadership (or lack of it) that occupies the Oval Office, appoints subordinates, decides if our country will go to war, and proclaims the direction we are all going. If I hesitate, all I have to do is take a look at the columnists on the editorial page of the nearest newspaper. 

 

That President Donald J. Trump is the target of every arrow is nearly inevitable, as there are many different causes for open hostility. He is en route to permanently dividing our "united" country, one I thought was headed in that happy direction. In our Congress, Republicans and Democrats seem to have precious little in common.  

 

Men and women in uniform hope these days that the erratic Trump fellow represents only a minority of the “folks back home.”  Meanwhile, it does seem more than likely that almost any unit of professors and teachers will be deeply resentful of the Trumpian use of language on Twitter. 

 

There is bitterness among the competent over cabinet appointments and the handling of foreign relations. Many of us who spent so long fighting our wars do fear a Trump war stirred up almost casually—without thinking. As we try to follow what our president is up to, we discover he is likely to be vacationing. We observers feel that our hotel and golf course magnate has only one foot in "governing."

 

Let's admit it: the 4th of July that just passed became something of a strain midway when our erratic President did much as he pleased. Showmanship predominated. It seems true enough that children and the military-minded did seem to get a real "bang" out of watching those nine overflights in formation. 

 

As we celebrated our nation’s birthday, many of us reflected on our shared history and our future. All that time we existed as the United States of America; all those wars and battles we can’t forget.  The heroism that we acknowledge without reservation. Yes, a lot of us did want to commemorate the 4th of July. Did we want it to become a Trumpian holiday?

 

We found ourselves detouring as we tried "celebrating" as well as "commemorating" the dead and dying in the aftermath of war. Our White House occupant clearly got in the way! The holiday was imperfect for some of us. I was, at least, a bit sorry for Donald as the rain trapped him in his apparently bullet-proof square glass cage where the prepared text had to be given, no matter what.  

 

Patriotic holidays are decidedly a time for looking back with pride. Always, we center our attention on "our military," even though there is an endless number of administrative units that have served usefully and faithfully over the decades. 

 

That brings up several new subjects for worry and concern. Will our intelligence services do their highly essential job under that inexperienced president? That's not all. How in the world is William Pendley, long critical of the Bureau of Land Management, going to manage once in charge? Out here, the BLM controls vast square miles of land. It stands for "conservation." Pendley apparently favors wholesale drilling for oil on the BLM acres—we shudder. 

 

And yet, there is much to be proud of. We did win Independence, preserve our new capital after 1812, and expand to the Pacific after waging war briefly with Mexico to end up with West Texas, New Mexico, Arizona, and California! That bloody Civil War early in the 1860s proved so costly (there were over six hundred thousand casualties). In the years after Lee surrendered, many of those who fought and survived came to ponder whether the former slaves were as "free" as had been hoped. The memory at the time was of bravery on both sides, in any case! But as time passed, that Reconstruction Era proved a disappointment, for sure. A New South gradually arrived, but not in every county.

 

As the Indians were forced onto Reservations, they and sensitive observers hoped for a better outcome, somewhere down the line.  After all, the natives were being displaced (to put it gently), and results were by no means universally approved. Fighting Spain, we did manage to bring a form of freedom to Cuba and the Philippines, but taking the long view offers only diluted celebration.  

 

World War I brought celebration to our streets but also division as our homegrown Germans fumed. The era also brought women's right to vote and Prohibition. With massive World War II we came to appreciate fully what the American military is able to do under terrible pressure. Hitler, Mussolini, Tojo and the dedicated Japanese all surrendered as we and our sacrificing Allies in two wars brought the right side to its final triumph. The path to sudden victory at the very end (in August 1945) revealed a tool of war all hope will be avoided if there is a "next time."

 

We took on northern Korea; then, apparently with no plans to do so, we fought intervening China. It was cold and there was plenty of discouragement. South Korea today looks good to most observers. (Except, oddly, to our President, who clearly isn't sure! One could almost accuse him of partisanship toward that awful North Korea.) The Republic of China is another entity over which we can show pride—although we didn't quite fight a war (not yet, anyway) for that independent state.  

 

Any pride mainlanders have felt for interacting with Puerto Rico has been pretty much shelved in today’s time of post-storm neglect. There is turmoil, as one writes, and something drastic must be done. Lots of us hope our deteriorating national government will take on this neglected island as a project very soon now. 

 

Far on our periphery we are in an uproar over our Federal Government's conduct on the southern border. There is disrespect for the family as an entity and cruel treatment of children who deserve care.  

 

Meanwhile, we do realize that, partly due to our Vietnam effort, nearly all of Southeast Asia ended up safe from Communism as the last century ended. It isn't put that way very often—though President Reagan ventured to say "noble" when considering everything overall. (Maybe fighting for Saigon helped, a little, in the overall region.) But almost all admit these days that our massive South Vietnam effort didn't come to the kind of conclusion this sometimes war-making state likes to see. 

 

Our Nation cherishes victories and deplores defeats—not surprisingly. We prefer to feel something like content with the outcome when the flags are packed away!  That Vietnam War with its humiliating end in 1975 is hard to reconsider….

 

Grenada and Panama don't begin to make up for that ghastly exit from Southeast Asia in 1975; not close. A few decades later, Kuwait was a small war won, but Iraq has attracted few to cheer. Using our weapons in Afghanistan has not brought much cheer either (except maybe over at those National Guard buildings). Those casualties do not help in such places.

 

Thoughtful Americans are in an uproar as our Nation’s “leadership” seems willing to embark on an unsought, unthought, sudden New War.  Fight Iran? Why in the hell? Will they start it by attacking our carrier force? Or attack the British instead? Is that Oval Office seeking a war to divert us?  From what?  From all the administrators who quit or were jailed, or proved incompetent?

 

Wars worth "celebrating" are not easily come by, are they? Between memories of the dead and the wounded, sad analyses of how causes did and didn't work out, and frustrated hopes as expectations get compared with results, we do have to think, consider, reflect, yes, and get much better at avoiding war--don't we?  

 

At the outset of this essay I expressed giant unease about the state of our good Country as led by one who continues to act, repeatedly, as one out on the hustings, campaigning. He spends every weekend at purely political rallies. 

 

Nonetheless, looking back to the recent July 4th remembrance, we did commemorate another historic Independence Day, first class. It was easy, as always, wasn't it, to find a whole lot to remember and to celebrate en route? It seems clear that the problem we faced as we tried to be "patriotic" was that we do not respect or even trust our presidential leadership. We, most of us anyway, want new leadership. 

      

But:  let’s put our heads together before July 4th, 2020.  We do like tapping our feet cheerfully to that patriotic band music—yes, we do. But let’s make sure our Land stays at peace.  Most of all, we will not be bypassing this or any other patriotic holiday.  All of us with the label “American” want to participate with unmitigated, patriotic enthusiasm at times when loyalty to Country is appropriate.

 

If we must, we can commemorate July 4th this coming year in spite of who occupies that Pennsylvania Avenue address.  But it would be so nice if we could somehow be free of that odd figure by then.  We do want to be able to concentrate on a United States of America that is still proud to be the home of every one of us!

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172728 https://historynewsnetwork.org/article/172728 0
Remembering The Red Summer 100 Years Later

This summer marks the hundredth anniversary of 1919's Red Summer, when, from May to November, the nation experienced ten major "race riots" that took the lives of more than 350 people, almost all black. How should the challenging but essential task of remembering and commemorating this troubling history be confronted? 

 

What to call the racial violence is the first challenge. Race riot is, as Ursula Wolfe-Rocca of the Zinn Education Project pointed out, a problematic term, implying that everyone caught up in the violence was equally responsible for the disorder. Yet almost every instance of racial violence in 1919 began with white people organizing to attack African Americans for specific purposes: to drive them from jobs and homes, to punish or lynch them for alleged crimes or insults against whites, to block black advancement. In Chicago, for example, white gangs carried out home invasions to drive black residents from houses in previously all-white neighborhoods. To call such actions “riots” minimizes their overtly racist intent and overlooks the instigators. 

 

Although Red Summer captures the scope of the violence, it doesn’t convey the purpose or totality of white mob attacks directed at African Americans during 1919. In my work on 1919’s racial violence, I propose using antiblack collective violence as a replacement for race riot. In some instances, the terms massacre and pogrom are warranted: in Phillips County, Arkansas, in the fall of 1919, white mobs and posses killed more than 200 African Americans in a frenzy that grew out of a pre-meditated attack on a union meeting of black sharecroppers. The very words we use to describe 1919’s violence can be a step toward an unflinching, complete understanding of the event’s significance and legacy.

 

Another challenge: to acknowledge the victimization of African Americans while also recognizing their sustained resistance through armed self-defense. In Washington, D.C., hundreds of black residents formed a cordon to deter white mobs; in Chicago, black veterans recently returned from combat during World War I put on their uniforms and patrolled streets to stop mobs when the police couldn’t or wouldn’t. Resistance also took the form of legal campaigns to clear African Americans charged with crimes for defending themselves and to pressure prosecutors to bring charges against attacking whites. Black self-defense is yet another reason to jettison the term race riot: in fighting back, African Americans were resisting, not rioting. 

 

Typical narratives about 1919’s antiblack collective violence, especially in school textbooks, often conclude abruptly: the attacks ended, the affected community moved on. Such treatment isolates the violence, implying it was an aberration without lasting effects. Yet for African Americans, discrimination and the upholding of white supremacy took other forms. Many cities are examining this problem through programming related to remembrance of 1919. In Chicago, the Newberry Library, along with numerous local partners, is sponsoring a series entitled Chicago 1919: Confronting the Race Riots. A basic purpose is to show Chicagoans “how our current racial divisions evolved from the race riots, as the marginalization of African Americans in Chicago became institutionalized through increasingly sophisticated forms of discrimination” such as red lining and racist housing policies. The Kingfisher Institute of Creighton University in Omaha, where a white mob lynched a black laborer falsely accused of assaulting a white woman in 1919, recently sponsored a lecture by Richard Rothstein on how federal housing policies built a residential racial divide across the country. In June, the Black History Commission of Arkansas hosted a symposium which in part focused on how survivors and the black community recovered from the Phillips County, Arkansas, race massacre. 

 

As these commemorations demonstrate, 1919’s racial violence—which, as the Newberry Library observes about Chicago, “barely register[s] in the city’s current consciousness”—is receiving in-depth attention to ensure that present and future Americans understand how a century-old event shaped the cities, and nation, where they live now.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172710 https://historynewsnetwork.org/article/172710 0
Trump’s Tariff War Resembles the Confederacy’s Failed Trade Policies

 

Current efforts by the United States to put tariff pressures on China resemble the Confederacy’s efforts to pressure Great Britain during the American Civil War. In the early 1860s the Confederate leaders’ strategy backfired, damaging the southern economy and weakening the South’s military. Recent developments in the tariff fight with China suggest that President Trump’s strategy could backfire as well. America’s tariff negotiators should consider lessons from the record of Confederate missteps.

 

In the Confederates’ dealings with Britain and the Trump administration’s tensions with China, booming economies gave advocates of a tough negotiating stance exaggerated feelings of diplomatic influence. Southerners thought the robust cotton trade provided a powerful weapon in their efforts to secure official recognition from the British. President Trump expresses confidence that America’s flourishing economy can withstand a few temporary setbacks in order to win major trade concessions from the Chinese. In both cases, leaders failed to recognize that their gamble had considerable potential for failure.

 

During the 1850s, southern cotton planters enjoyed flush times. Sales of cotton to English textile mills brought huge profits to the owners of cotton plantations. “Our cotton is . . . the tremendous lever by which we can work out destiny,” predicted the Confederate Vice President, Alexander Stephens. Southerners thought English textile factories would have to shut down if they lost access to cotton produced in the United States and that closures would leave thousands of workers unemployed. To the Confederates, it seemed that the English would have no choice but to negotiate with them. Britain needed “King Cotton.” The Confederate government did not officially sponsor a cotton embargo, but the southern public backed it enthusiastically.

 

Presently, a booming economy has put the Trump administration in a strongly confident mood, too. President Trump’s advisers and negotiators on tariff issues, Peter Navarro and Robert Lighthizer, hope China will buckle under American pressure. They expect tariffs on China’s exports will force a good deal for the United States. President Trump encouraged the tariff fight, asserting trade wars are “easy to win.”

 

Economic pressures hurt the British in the 1860s and the Chinese recently, but in both situations coercive measures led to unanticipated consequences. During the American Civil War, some textile factories closed in Britain or cut production, yet British textile manufacturers eventually found new sources of cotton in India, Egypt, and Brazil. Now the Chinese are tapping new sources to replace lost trade with the United States. China is buying soybeans from Brazil and Argentina, purchasing beef from Australia and New Zealand, and expanding commercial relationships with Canada, Japan, and Europe.

 

The failed strategy of embargoing cotton represents one of the great miscalculations of the South’s wartime government. If the Confederacy had continued selling cotton to the English during the early part of the Civil War – before the Union navy had enough warships to blockade southern ports – it could have acquired precious revenue to purchase weapons of war. The absence of that revenue contributed to a wartime financial crisis. Inflation spiked. Economic hardship damaged morale on the home front. Many Confederate soldiers deserted after receiving desperate letters from their wives. Fortunately for African Americans, ill-conceived Confederate diplomacy speeded the demise of slavery.

 

Many economists now blame President Trump's trade fights with China and several other nations for volatility in stock markets. They attribute a recent global slowdown in commerce largely to President Trump's protectionist policies. More troublesome, though, may be the long-term consequences of the administration's policy. Much like the South's foolish cotton embargo, America's tariff war is forcing the Chinese to seek commercial ties with other countries. China appears to be moving away from close relationships with American business.

 

That shift could prove costly. American prosperity in recent decades owes much to commerce with China and the eagerness of Chinese investors to buy American stocks and bonds, including U.S. government debt. If the present conflict over tariffs leads to reduced Chinese involvement in American trade, the Trump administration's risky strategy may be a reiteration of the Confederates' foolish gamble on the diplomatic clout of King Cotton.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172711 https://historynewsnetwork.org/article/172711 0
When Republicans Encouraged Immigration

 

Last month, President Donald Trump celebrated his narcissistic Fourth of July at the Lincoln Memorial. It is not surprising that  Trump chose this location as he makes it a point to invoke Abraham Lincoln’s name whenever it suits his purposes or his distorted view of history.  Apparently, however, he is totally unfamiliar with Lincoln’s legacy on immigration.    

 

Whereas Trump espouses a “go back where you came from” ideology and wants to slam the door shut on immigrants, Abraham Lincoln consistently articulated an economic philosophy that relied heavily upon immigrant labor. 

 

In his earliest speeches, Lincoln saw immigrants as farmers, merchants, and builders who would contribute mightily to the nation's economic future. As president, he urged Congress to act: 

"I again submit to your consideration the expediency of establishing a system for the encouragement of immigration. . . .While the demand for labor is thus increased here, tens of thousands of persons, destitute of remunerative occupation, are thronging our foreign consulates and offering to immigrate to the United States if essential, but very cheap assistance, can be afforded them. It is very easy to see that under the sharp discipline of the Civil War, the nation is beginning a new life. This noble effort demands the aid and ought to receive the attention of the Government."

This is a far cry from Trump's characterization of immigrants as criminals, rapists, and drug dealers.  

 

Whereas Trump has fought the Courts, his own Justice Department,  and just about everyone else to keep immigrants out, Lincoln’s endorsement resulted in the first, last, and only bill in American history to actually encourage immigration.  

 

Symbolically and appropriately, Lincoln's Act to Encourage Immigration became law with his signature on July 4, 1864, exactly 155 years before Trump's narcissistic celebration of the holiday this past year. President Lincoln's message and legislation on this subject seemingly began a wave of support for federal and other action to encourage immigration which lasted for decades. 

 

So strong was feeling on this matter that the 1864 platform of the Republican Party (running as the Union Party) noted, "That foreign immigration, which in the past has added so much to the wealth, development of resources, and increase of nations, should be fostered and encouraged by a liberal and just policy." Compare that to the Republican Party of today!  

 

The Bill was amended and strengthened after Lincoln stated in his Annual Message to Congress on December 6, 1864, "The act  . . . seems to need Amendment, which will enable the officers of the government to prevent the practice of frauds against the immigrants while on their way and on their arrival in the port, so as to secure them here a free choice of vocations and places of settlement."   Now envision the detention camps of the Trump era.  

 

Lincoln's death and the ultimate repeal of the Act to Encourage Immigration could not remove the effect it had upon immigration. Its important influence foretold the massive flow of immigration to the U.S. in the following decades. The secondary effects of the act, such as the popularization abroad of another of Lincoln's landmark laws, the Homestead Act, encouraged thousands of immigrants to settle as farmers in the Midwest and West. 

 

Though he did not live to see the completion of his dream, Lincoln deserves credit for initiating a plan that personified Emma Lazarus's words long before they were memorialized on the Statue of Liberty.  Trump, meanwhile, represents the repudiation of those words.

 

Lincoln's humble origins were never far from his thoughts, and his conviction that everyone deserved the opportunity America affords permeated virtually all of his beliefs. On the other hand, Trump, born with a silver spoon in his mouth, stands in direct contrast to Lincoln's background and compassion. Donald Trump seemingly scorns the common man and the middle class. He reviles the Constitution and the Declaration of Independence, whereas Lincoln stated quite explicitly, if not prophetically: 

“Wise statesmen as they were, they knew the tendency of prosperity to breed tyrants, and so they established these great self-evident truths, that when in the distant future some man, some faction, some interest, should set up the doctrine that none but rich men, or none but white men, were entitled to life, liberty, and pursuit of happiness, their posterity might look up again to the Declaration of Independence and take courage to renew the battle which their fathers began – so that truth, justice, mercy, and all the humane and Christian virtues might not be extinguished from the land; so that no man would hereafter dare to limit and circumscribe the great principles on which the temple of liberty was being built.”

Trump's America would send Lincoln, a man already prone to depression, into the "hypo" (for hypochondriasis), as he called it. He would be appalled that the Republican Party stands idly by and refuses to hold Trump accountable for repudiating virtually all that Lincoln stood for.

 

In short, the Lincoln majestically sitting behind Trump during the July Fourth event last month weeps for what is now lost.  

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172709 https://historynewsnetwork.org/article/172709 0
The Professor Who Was Ostracized for Claiming the Civil War Was About Slavery – In 1911

The Battle of Williamsburg, Kurz and Allison

 

Sometimes when we’re poking around in an archive, we come across century-old documents that are strangely relevant. That’s what the story of Enoch Marvin Banks became to me. An aging letter from 1911 that I found in the Columbia University archive revealed a story that could be in today’s headlines: people in the Jim Crow South tried to capture the memory of the Civil War for political gain.

 

My main research involves Progressive-Era economic thought, and John Bates Clark was one of America’s foremost economists. Sifting through his papers, I came across the usual letters of economic theories and perspectives, but then something unexpected: A long letter from Enoch Marvin Banks dated April 2, 1911 (the quotes below come from this letter). Banks was a professor of history and political economy at the University of Florida, and he seemed distressed. He was “being violently assailed”, evidently over an article he’d written. I didn’t have the article at the time, but I could understand its context from the hints Banks gave. Basically, Banks had committed the crime of blaming the Civil War on slavery. Southern leaders, he stated, had made “a grievous mistake in failing to formulate plans for the gradual elimination of slavery from our States.” In his view, wise leadership would have ended slavery slowly, kept the union intact, and avoided the catastrophe of civil war.

 

With a Google search, I later found the article in question, "A Semi-Centennial View of the Civil War" in The Independent (Feb. 1911). Upon reading it, I discovered that Banks was even more explicit in print: "The fundamental cause of secession and the Civil War, acting as it did through a long series of years, was the institution of Negro slavery" (p. 300). Banks didn't stop there. He attacked the South's leadership as well, praising Abraham Lincoln and criticizing Jefferson Davis as a statesman of "distinctly inferior order" (303). Such views were incendiary in the Jim Crow South, and the cause of Banks' distress.

 

Banks' views touched off a firestorm in his native South (he was born in Georgia and spent most of his life in the South). Confederate veterans' groups responded with widespread criticism. Banks included a clipping from the United Confederate Veterans Robert E. Lee Camp No. 58 in his letter to Clark. The clipping attacked the University of Florida for having a staff member who sought to "discredit the South's intelligence and to elevate the North and to falsify history." "Shall such a man continue in a position as teacher where he will sow the seeds of untruth and make true history a falsifier?," they asked. The veterans demanded Banks be removed from the university and replaced with "a man who will teach history as it is and not mislead and poison the minds of the rising generation."

 

As Banks told Clark, he simply couldn't stand the controversy and pressure. He obliged these demands by resigning from the university and retreating back to Georgia. He died only a few months later. Some suspected that the strain of the ordeal diminished his already weak health and led to his eventual death.

 

This moment reflected the ongoing battle over the legacy of the Civil War and the ideology of the Jim Crow South. As Banks wrote his article, the South was building and codifying its system of racial segregation. Part of this project involved capturing the war’s historical memory. Confederate leaders had to be presented as noble warriors fighting for a lost cause. Jefferson Davis, who was attacked then and now for incompetence, was “one of the noblest men the South ever produced,” according to the Confederate veterans’ group. That’s why they blamed Banks for distorting history, as he challenged the history that was being constructed. As Fred Arthur Bailey wrote in one of the few articles dedicated to this affair: “This tragic incident was but a small part of a large, successful campaign for mind control. Self-serving, pro-Confederate historical interpretations accomplished their purposes” (17). I can’t help agreeing with Bailey’s conclusion.

 

This ordeal seems to me a perfect example of how history becomes a battlefield. It’s no secret that the historical memory of the Civil War became contentious almost as soon as the war ended. In a world where debates about Confederate statues and flags frequently make headlines, I can only conclude that the battle is very far from over.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172695 https://historynewsnetwork.org/article/172695 0
JSTOR Interview Archive Helps Preserve History

History gets lost every day. Every time someone who witnessed -- or made -- history passes away, we lose their perspective unless it has been captured in memoirs or recordings.  Documentarians like Peter Kunhardt of the Kunhardt Film Foundation play an important role in retaining this witnessed history: by recording interviews with historical figures and then crafting these into a narrative, they amplify these voices as they share their perspectives with a broader audience. 

But even documentaries still miss something important. When, for example, Kunhardt interviewed Jesse Jackson for King in the Wilderness, his HBO documentary about Martin Luther King, Jr.’s final three years, he recorded over 90 minutes of Jackson’s recollections and perspectives, but only used a small portion for the final film.  Kunhardt himself says, “…no matter how fascinating the interview, important information is edited out of the final project.” In all, for his 2018 film, Kunhardt recorded more than 31 hours of interviews with 19 people who witnessed and made history with King. Each uncut interview is a treasure trove of important, witnessed history, much of which ends up on the cutting room floor. What if we could preserve these full-length interviews for future generations and use technology to make them even more useful for education?

Introducing the Interview Archive, a Prototype by JSTOR Labs

Now we can. I am pleased to announce the release of the Interview Archive. This prototype includes all 19 uncut interviews filmed for King in the Wilderness – interviews with civil rights leaders like Marian Wright Edelman, John Lewis and Harry Belafonte, who made history alongside King. We didn't stop there, though: the site also helps researchers and students explore the rich material. Each minute of each interview is tagged with the people, places, organizations and topics being discussed. Users can follow these tags to explore the different perspectives on over a thousand topics. They can also click on the tags while watching the interviews for background information from Wikipedia, to find and jump to other mentions of the topic in the Interview Archive, or to find scholarly literature and historical images related to the topic in JSTOR and Artstor.  

 

 

The site is a fully-functioning prototype built by JSTOR Labs, a team at the digital library JSTOR that builds experimental tools for research and teaching. At this point, it contains the source interviews from a single documentary; enough, we think, to convey the concept and useful if you happen to be teaching or researching this specific topic. Our aim in releasing this prototype is to gauge interest in the idea.  We hope that historians and teachers will reach out to us at labs@ithaka.org with their thoughts on this concept as well as what material they would most like to see in an expanded site.

Most importantly, we are thrilled to be able to share and preserve the full-length interviews from King in the Wilderness. These interviews belong in the hands of educators, students and researchers, helping to keep this history from being lost.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172696 https://historynewsnetwork.org/article/172696 0
PubMed Central Offers a Historical Treasure Trove  

Where can you freely read, download, text mine, and use for your research and teaching the full text of millions of historically significant biomedical journal articles spanning three centuries, alongside millions more current biomedical journal articles? Look no further than PubMed Central (PMC) of the National Library of Medicine (NLM), National Institutes of Health. The NLM is the world's largest biomedical library and one of the twenty-seven institutes and centers which constitute NIH, whose main campus is located in Bethesda, Maryland.

 

The NLM launched PMC in early 2000 as a digital counterpart to the library's extensive print journal collection. In 2004, the NLM joined with Wellcome (a London-based biomedical charity, and one of the world's largest providers of non-governmental funding for scientific research), Jisc (a UK-based non-profit membership organization, providing digital solutions for UK education and research), and a number of medical journal publishers to agree that medical journals contain valuable information for research, education, and learning. Thus, journal archives should be digitized and freely available to all who would wish to consult them. Two years later, that agreement yielded public access to the full text of 160 journals spanning nearly two centuries. More recently, the NLM completed a multi-year partnership with Wellcome to expand the historical corpus of PMC with dozens more titles encompassing three centuries and hundreds of thousands of pages. You will find a hyperlinked list of these titles at the end of this article; clicking on each title will take you to its associated, digitized run in PMC.

 

While medical journals have always been invaluable resources, their digitization increases their accessibility and creates new opportunities to realize their research value. PMC makes available the machine-readable full text and metadata of the digitized journal articles, including titles, authors (and their affiliations where present), volume, issue, publication date, pagination, and license information. Such article-level digitization also enables us to link data, that is, to connect individual and associated articles with corresponding catalog records, and sometimes even Digital Object Identifiers (DOIs), to improve discovery and use of the articles by interested researchers.

 

In writing about one of these newly available titles—namely The Hospital, a journal published in London from 1886-1921—on the popular NLM History of Medicine Division blog Circulating Now, Dr. Ashley Bowen observed that "For researchers interested in the administration of British hospitals in the late 19th and early 20th century, [this journal] is a vital resource." The Hospital "carried the tag-line 'the modern newspaper of administrative medicine and institutional life,' [and] published an enormous variety of items of interest to physicians, nurses, hospital administrators, and public health professionals—everything from medical research to notes on fire prevention and institutional kitchen management, reflections on 'the dignity of medicine,' opinions about housing policy, and much more." Inspired by Dallas Liddle's recent research which "used [digitized] file size as a way to identify the rate of change in Victorian newspapers," Bowen downloaded and analyzed every article in the entire run of The Hospital—including all the file names and file sizes—to study the changing content, trends, and sheer volume of this important journal over time, to appreciate its metadata created in the process of digitization, and to evaluate this metadata "in addition to…traditional content analysis." 

 

Bowen has also used PMC's historical corpus to research Alfred Binet's early 20th-century intelligence tests using The Psychological Clinic and has utilized the Bristol Medico-Chirurgical Journal and its semi-regular series of articles about "Health Resorts in the West of England and South Wales." 

 

Understandably, given the sheer size and scope of the overall PMC corpus, Bowen’s studies only scratch the surface of the archive which currently encompasses nearly 5.5 million full-text articles. Nearly 500,000 of those articles were published in 1950 or earlier and over 1 million articles date from 1951-1999. 

 

Among the millions of articles you will find alongside those surfaced by Bowen are:

  • Sir Alexander Fleming's discovery of the use of penicillin to fight bacterial infections, which appeared in the British Journal of Experimental Pathology, 1929;
  • Sir Richard Doll's groundbreaking study that confirmed that smoking was a "major cause" of lung cancer, which appeared in the British Medical Journal, 1954;
  • Walter Reed's paper proving that mosquitoes transmit yellow fever, which appeared in the Journal of Hygiene, 1902;
  • reports of centralized health and relief agencies in Massachusetts during the 1918 influenza pandemic; 
  • an appeal for justice by Arthur Conan Doyle, related to the infamous case of the Parsi English solicitor George Edalji, which reflected contemporary racial prejudice;
  • a medical case report on America’s 20th president James A. Garfield, following his assassination in 1881; 
  • post-World War II thoughts about the future of the Army Medical Library by its director Frank Rogers; and 
  • a paper by the bacteriologist Ida A. Bengtson, the first woman to work in the Hygienic Laboratory of the U.S. Public Health Service, the forerunner of the National Institutes of Health. 

 

So, if we haven’t already tempted you to explore PMC for your own research and teaching—and explore its Open Access Subset and Historical OCR Collection, both ideal for text mining—what are you waiting for? Dive in! Encourage your colleagues and students to explore it. Be in touch and let us know what you discover in PMC. We would love to hear from you!
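For readers curious what that kind of text mining might look like in practice, here is a minimal, illustrative sketch (my own, not drawn from this article). It assumes the publicly documented NCBI E-utilities endpoints (esearch and efetch against the pmc database); the journal-field query syntax shown is an assumption rather than an official recommendation, and in the spirit of Bowen's file-size analysis it simply compares the size of a few retrieved full-text records.

    # A rough sketch (illustrative only) of pulling a few PMC records for text
    # mining via the public NCBI E-utilities. The journal-field query below is
    # an assumption; consult the E-utilities documentation for exact syntax.
    import requests

    EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

    def search_pmc(term, retmax=5):
        """Return a list of PMC record IDs matching a search term."""
        resp = requests.get(
            f"{EUTILS}/esearch.fcgi",
            params={"db": "pmc", "term": term, "retmax": retmax, "retmode": "json"},
        )
        resp.raise_for_status()
        return resp.json()["esearchresult"]["idlist"]

    def fetch_article_xml(pmc_id):
        """Fetch the full-text XML of a single PMC record."""
        resp = requests.get(
            f"{EUTILS}/efetch.fcgi",
            params={"db": "pmc", "id": pmc_id, "retmode": "xml"},
        )
        resp.raise_for_status()
        return resp.text

    if __name__ == "__main__":
        # Hypothetical query for articles from the digitized run of The Hospital.
        for pmc_id in search_pmc('"The Hospital"[Journal]'):
            xml = fetch_article_xml(pmc_id)
            # Comparing record sizes echoes Bowen's file-size approach.
            print(pmc_id, len(xml), "characters")

Nothing like this is required to enjoy the collection, of course; the point is simply that a machine-readable corpus invites exactly this sort of exploration.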

 

List of the historical journal titles made available freely in PMC  through the multi-year partnership between Wellcome and the NLM/NIH. 

Clicking on each title will take you to its associated, digitized run in PMC.  

 

Learn more about PMC and the partnerships dedicated to growing its freely-available historical content:

“Public-Private Partnerships: Joining Together for a Win-Win,” Jeffrey S. Reznick and Simon Chaplin, The Public Manager, December 9, 2016.

“PubMed Central: Visualizing a Historical Treasure Trove,” Tyler Nix, Kathryn Funk, and Erin Zellers, Circulating Now, the blog of the National Library of Medicine’s History of Medicine Division, February 23, 2016.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172694 https://historynewsnetwork.org/article/172694 0
“Mr. Straight Arrow,” John Hersey, and the decision to drop the atomic bomb

John Hersey

 

Roy Scranton's "How John Hersey Bore Witness" (The New Republic, July-August 2019) is an insightful look at a new book on one of my favorite authors. It touched all the right notes and has prompted me to add Mr. Straight Arrow to my "Christmas list." Sadly, in the midst of this otherwise fine review, author Scranton repeats the discredited old chestnut that President Harry S. Truman dropped atomic bombs on Hiroshima and Nagasaki even though he knew Japan was trying to surrender. Truman's real reason for using the weapons, according to Scranton, was to employ them as a diplomatic club against the Soviet Union. This allegation was popular in some quarters during the 1960s and 70s, but it was only sustained by a systematic falsifying of the historical record, and it continues to pop up even today.

Underscoring this sad fact is the link Scranton provides, which takes readers to a 31-year-old letter to the New York Times from Truman critic Gar Alperovitz purporting that "dropping the atomic bomb was seen by analysts at the time as militarily unnecessary." Presented in the letter is an interesting collection of cherry-picked quotes from a variety of diary entries and memos by contemporaries of Truman, such as Dwight D. Eisenhower. All are outtakes and have long been rebutted or presented in their actual contexts. Even key figures are misidentified. For example, FDR's White House chief of staff Admiral William D. Leahy, who chaired the meetings of the Joint Chiefs of Staff, is elevated in the letter to the position of Chairman of the Joint Chiefs.

As for the notion that Japan was trying to surrender, this is not what was beheld by America's leaders, who were reading the secretly decrypted internal communications of their counterparts in Japan. In the summer of 1945, Emperor Hirohito requested that the Soviets accept a special envoy to discuss ways in which the war might be "quickly terminated." But far from a coherent plea to the Soviets to help negotiate a surrender, the proposals were hopelessly vague and recognized by both Washington and Moscow as no more than a stalling tactic ahead of the Potsdam Conference to prevent Soviet military intervention --- an intervention that Japanese leaders had known was inevitable ever since the Soviets' recent cancellation of their Neutrality Pact with Japan.

Japan was not trying to surrender. Even after the obliteration of two cities by nuclear weapons and the Soviet declaration of war, the militarists in firm control of the Imperial government refused to admit defeat until their emperor finally forced the issue. They had argued that the United States would still have to launch a ground invasion and that the subsequent carnage would force the Americans to sue for peace, leaving much of Asia firmly under Japanese control. The war had started long before Pearl Harbor with the Japanese invasion of China, and millions had already perished. That Asians in the giant arc from Indonesia through China --- far from Western eyes --- were dying by the hundreds of thousands each and every month that the war continued has been of zero interest to Eurocentric writers and historians, be they critics or supporters of Truman's decision. As for the president himself, Truman rightfully hoped, after the bloodbaths on Okinawa and Iwo Jima, that atomic bombs might force Japan's surrender and forestall the projected two-phase invasion which would result in "an Okinawa from one end of Japan to the other."

Hersey understood this well.
Fluent in Chinese (he was born and raised in China), Hersey was painfully aware of the almost unimaginable cost of the war long before the United States became involved and, after it did, observed the savagery of battle on Guadalcanal firsthand. Yes, he understood it quite well, and it will come as a surprise, even a shock, to many that neither Hersey nor Truman saw Hiroshima as an indictment of the decision to use the bomb. Those were very different times, and the prevailing attitude, according to George M. Elsey, was "look what Japanese militarism and aggression hath wrought!" (Truman made similar observations when touring the rubble of Berlin during the Potsdam Conference.) The president considered Hiroshima an "important" work and, far from being persona non grata, Hersey would sometimes spend days at a time in Truman's company when preparing articles for The New Yorker. This level of access was not accorded to other journalists, and circumstances resulted in Hersey sitting in on key events, such as the moment Truman learned that the Chinese had just entered the Korean War and a secret meeting with Senate leaders over the depredations of Joe McCarthy.

Although exceptions can be found in the literature, Hersey's Hiroshima was simply not viewed in the postwar period as an anti-nuclear polemic, and Elsey, who served in both the Roosevelt and Truman administrations before going on to head the American Red Cross for more than a decade, remarked to David McCullough that "It's all well and good to come along later and say the bomb was a horrible thing. The whole goddamn war was a horrible thing." Scranton himself gives a brief nod to this fact, admitting that the midnight conventional firebombing of Tokyo earlier that year killed even more people, approximately 100,000, yet one shudders to think what he teaches to his unsuspecting students.

The "revisionist" Japanese-were-trying-to-surrender hoax prominently recited in his review of Mr. Straight Arrow has long been consigned to the garbage heap of history by a host of scholarly books and articles,* including, ironically, a brilliant work by one of Scranton's own colleagues at Notre Dame. Father Wilson D. Miscamble's The Most Controversial Decision: Truman, the Atomic Bombs, and the Defeat of Japan (Cambridge University Press, 2011) is a hard-hitting, well-researched effort that is especially notable for its thoughtful exploration of the moral issues involved. Though Scranton and Miscamble share the same campus, a colleague of mine maintains that the two scholars have never met. Perhaps they should get together for coffee some morning.

--------------------------

* Six particularly useful works are: Sadao Asada's award-winning "The Shock of the Atomic Bomb and Japan's Decision to Surrender -- A Reconsideration," Pacific Historical Review, 67 (November 1998); Michael Kort, The Columbia Guide to Hiroshima and the Bomb (New York: Columbia University Press, 2007) and "The Historiography of Hiroshima: The Rise and Fall of Revisionism," The New England Journal of History, 64 (Fall 2007); Wilson D. Miscamble, C.S.C., The Most Controversial Decision: Truman, the Atomic Bombs, and the Defeat of Japan (New York: Cambridge University Press, 2011); Robert James Maddox, ed., Hiroshima in History: The Myths of Revisionism (Columbia, Missouri: University of Missouri Press, 2007); Robert P. Newman, Enola Gay and the Court of History (New York: Peter Lang Publishing, 2004).

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172708 https://historynewsnetwork.org/article/172708 0
How Two Regicides Escaped to America and Became Folk Heroes

 

 

If you went to school in the USA in the late nineteenth or early twentieth centuries, chances are you would have been taught about Edward Whalley and William Goffe. Ask American – or indeed British – schoolchildren today about Whalley and Goffe and you will almost certainly be met with blank stares. This journey from widespread recognition to relative historical obscurity is perhaps surprising when we learn that Whalley and Goffe were significant colonial American figures who had been directly involved in one of the most seismic events in British history: they were the fourth and fourteenth signatories of the death warrant of King Charles I, the only king in British history to be lawfully tried and put to death.

 

But what also makes them stand out from the other fifty-seven signatories is that they were eventually lauded as American folk heroes, dominant figures in early American literature, and American proto-revolutionaries – men some would consider Founding Grandfathers. They earned this reputation because, when the British monarchy was restored in 1660, Whalley and Goffe fled to America where they continued to uphold the principle of revolutionary republicanism against tyrannical monarchy.

 

To achieve a peaceful and successful Restoration, Charles II and his courtiers understood the need for widespread forgiveness for many of those who had been involved in the events that led to Charles I’s death – increasing intransigence between king and parliament in the 1630s and civil wars in the 1640s – and those who had been involved in the constitutional experiments of the English Republic in the 1650s. But some key figures – most notably the surviving signatories of the king’s death warrant – had simply been too closely involved in the regicide for their actions to be forgiven and forgotten.

 

Some of these regicides remained in England, confident that their executions would secure their martyrdom, provide a chance to gain sympathisers while re-vocalising their commitment to Oliver Cromwell’s Good Old Cause, and to ensure eternity sitting at the right hand of God. But others preferred to keep promoting the cause by staying alive and taking their chances by fleeing: some to France, Germany, the Netherlands or Switzerland, or – like Whalley and Goffe – to America.

 

The American colonies were predisposed to be sympathetic to these two devoutly Puritan regicides who had devoted their energies to working against the Stuart dynasty – the religious policies of which had famously spurred early generations of colonists to cross the Atlantic. Indeed, Whalley and Goffe were openly welcomed in Boston and Cambridge, Massachusetts, and even when warrants arrived from England making it clear that the regicides were wanted men, the sincerity of the colonial authorities’ attempts to capture Whalley and Goffe was open to question. While individuals like Governor Endecott of Massachusetts Bay made overtures to suggest that they were being earnest in their pursuit of the king killers, their delayed and ineffectual actions suggested quite the opposite.

 

The closest that Whalley and Goffe came to being discovered, for example, was in 1661 when Endecott appointed two hapless bounty hunters, a merchant and a ship’s captain: Thomas Kellond and Thomas Kirke. They spent a couple of weeks being outwitted by the colonial authorities before being given generous grants of land, perhaps to buy them off and discourage any further pursuit. Whalley and Goffe spent their remaining fifteen to twenty years travelling around the New England colonies, residing in New Haven, Hadley and Hartford, hiding from a threat that, it could be argued, never really existed.

 

After a flurry of ineffectual activity against the regicides in the year or two following  the Restoration, it became clear that Charles II’s imperial pragmatism was taking precedence over an ideological witch-hunt. Aside from the potential repeated embarrassment of sending over agents to capture Whalley and Goffe, haplessly chasing them through unknown territory that the pair and their protectors inevitably knew more intimately, Charles II and his courtiers had to tread carefully with their newly regained American colonies.

 

It was economically and politically unwise to alienate potentially lucrative trading partners in the Atlantic basin by encroaching on the colonists’ liberties for the sake of capturing two anxious and ageing Puritans. Furthermore, Charles II was inevitably distracted by developments back home, from the incendiary religious politics of the post-civil war era to the natural disasters of the mid-1660s. While Whalley and Goffe were three thousand miles away cowering in basements, hiding in caves, and eking out an existence of a little trade and a lot of prayer, the urgency to capture and execute them gradually receded.

 

This did not stop the author of the first full-length history of Whalley and Goffe, Ezra Stiles, from reinterpreting their story to portray them as brave, thrusting revolutionaries acting as heralds of revolution from an earlier age. Stiles was researching and writing in the 1790s, in the aftermath of the American and French Revolutions, and he enthusiastically wanted to see the regicides as men sowing the seeds of such visionary ideas over a hundred years earlier.

 

To do so, he had to rely on oral histories. And since fugitives don’t tend to leave a detailed trail of documentary evidence, this inevitably led to distortion and excited exaggeration. Individuals retold debatable stories from earlier generations who were determined to associate themselves, their families, and their locality with the story of these heroic men on the run. It also helped Stiles’s cause that the most dramatic story involving the regicides – the tale of Goffe, the ‘Angel of Hadley’, appearing from nowhere to protect the colonists from the indigenous population in King Philip’s War – was actually probably true.

 

It was on this basis that the majority of novels, plays, poems and paintings that featured the regicides in the nineteenth century portrayed them as heroic champions of liberty defeating the pantomime-villain efforts of the tyrant Charles II and his sneering courtiers. Such a caricature was naturally attractive to schoolchildren growing up with the tale of two obscure Englishmen whose breathlessly heroic actions could be seen as joining the teleological dots between the English and American Revolutions. Looking beneath the veneer of this simplistic image might undermine that narrative, but it restores the more fascinating truth of two men whose actions represented something far, far greater.

FDR's Token Jewish Refugee Shelter

Cartoon by Stan MacGovern in the New York Post, June 1, 1944

 

Seventy-five years ago today, a ship sailed into the New York harbor, carrying more than 900 European Jewish refugees. Unlike similar ships that had approached America’s shores in the 1930s, the S.S. Henry Gibbins was not turned back. Instead, the passengers were taken to an abandoned army camp in upstate New York, where they spent the rest of the war in safety, far from Hitler’s clutches.

 

Why did President Franklin D. Roosevelt permit this group of Jewish refugees to enter the United States? What had changed since the years when other ships were turned away?

 

By the autumn of 1943, news of the mass murder of Europe’s Jews had been verified by the Allies and widely publicized in the United States, although major newspapers often buried it in the back pages. There could be no doubt that at least several million Jews had been slaughtered by the Germans and their collaborators, and many more were still in danger.

 

President Roosevelt and his administration insisted that nothing could be done to help the Jews except to win the war. Others disagreed. In October 1943, U.S. Senator Warren Barbour, Republican of New Jersey, introduced a resolution calling on the president to admit 100,000 Jewish refugees “for the duration of the war and six months thereafter.” The resolution was endorsed by both the National Democratic Club and the National Republican Club.

 

Granting temporary refuge was a way of addressing the life-and-death crisis that Europe’s Jews faced, without incurring the wrath of those who opposed permanent immigration. The Jews who were saved would go back to Europe, or elsewhere, when the war ended.

 

Sen. Barbour tragically passed away just a few weeks later, but the idea of temporary refuge gained traction. In early 1944, a proposal for temporary havens was presented to President Roosevelt by the U.S. government’s War Refugee Board (a small, underfunded agency recently created by FDR under strong pressure from Jewish groups and the Treasury Department).

 

“It is essential that we and our allies convince the world of our sincerity and our willingness to bear our share of the burden,” wrote Josiah E. DuBois, Jr., a senior official of the War Refugee Board, in a memo to Roosevelt. The United States could not reasonably ask countries bordering Nazi-occupied territory to take in refugees if America itself would not take any, DuBois argued.

 

The president was reluctant to embrace the plan; he had previously confided to his aides that he preferred “spreading the Jews thin” around the world, rather than admitting them to the United States. 

 

Secretary of War Henry Stimson, for his part, vigorously opposed the temporary havens proposal. In his view, Jewish refugees were “unassimilable” and would undermine the purity of America’s “racial stock.”

 

Public pressure pushed the plan forward. Syndicated columnist Samuel Grafton played a key role in this effort, by authoring three widely-published columns advocating what he called “Free Ports for Refugees.”

 

“A ‘free port’ is a small bit of land… into which foreign goods may be brought without paying customs duties… for temporary storage,” Grafton explained. “Why couldn’t we have a system of free ports for refugees fleeing the Hitler terror?… We do it for cases of beans… it should not be impossible to do it for people.”

 

The activists known as the Bergson Group took out full-page advertisements in the Washington Post and other newspapers to promote the plan. Jewish organizations helped secure endorsements of “free ports” from religious, civic, and labor organizations, including the Federal Council of Churches and the American Federation of Labor. U.S. Senator Guy Gillette (D-Iowa) introduced a resolution calling for free ports; eight similar resolutions were introduced in the House of Representatives.

 

Support for the havens plan could be found across the political spectrum. The liberal New York Times endorsed it; so did the conservative Hearst chain of newspapers. Temporary refuge was fast becoming a consensus issue.

 

With public pressure mounting, the White House commissioned a private Gallup poll to measure public sentiment. It found that 70 percent of the public supported giving “temporary protection and refuge” in the United States to “those people in Europe who have been persecuted by the Nazis.”

 

That was quite a change from the anti-immigration sentiment of earlier years. But circumstances had changed, and public opinion did, too. By 1944, the Great Depression was over and the tide of the war had turned. The public’s fear of refugees had diminished significantly, and its willingness to make humanitarian gestures increased. 

 

Despite this overwhelming support for temporary refuge, President Roosevelt agreed to admit just one token group of 982 refugees. And he did not want them to be all Jews; FDR instructed the officials making the selection to “include a reasonable proportion of the [various] categories of persecuted minorities.” (In the end, 89% were Jewish.)

 

Ironically, the group was so small that they all could have been admitted within the existing immigration quotas. There was no need for a special presidential order to admit them, since the regular quotas for citizens of Germany and German-occupied countries were far from filled in 1944. In fact, they were unfilled in eleven of FDR’s twelve years in the White House, because his administration piled on extra requirements and bureaucratic obstacles to discourage and disqualify refugee applicants.

 

Of the 982 refugees whom the president admitted in August 1944, 215 were from Germany or Austria. Yet the German-Austrian quota was less than 5% filled that year. The second largest nationality group was from Yugoslavia; there were 151 Yugoslavs in the group. That quota was less than 3% filled in 1944. There were also 77 Polish citizens and 56 Czechs; those quotas were only 20% and 11% filled, respectively. Put another way, a combined total of 39,400 quota places from those particular countries sat unused in 1944, because of the Roosevelt administration’s policy of suppressing refugee immigration far below the limits that the law allowed.

 

The S.S. Henry Gibbins arrived in the New York harbor on August 4, 1944. Ivo Lederer, one of the passengers, recalled how they cheered when the ship approached the Statue of Liberty. "If you're coming from war-time, war-damaged Europe to see this enormous sight, lower Manhattan and the Statue of Liberty--I don't think there was a dry eye on deck."

 

The refugees were taken to Fort Ontario, an abandoned army camp in the upstate New York town of Oswego. It would be the only "free port" in America. By contrast, Sweden, which was one-twentieth the size of the United States, took in 8,000 Jews fleeing from Nazi-occupied Denmark. 

 

According to conventional wisdom, most Americans in the 1940s were against admitting Jewish refugees. It is also widely assumed that members of Congress—especially the Republicans—were overwhelmingly anti-refugee, too. America’s immigration system supposedly made it impossible for President Roosevelt to allow any more Jewish refugees to enter. And American Jews allegedly were too weak to do anything about it. 

 

Yet 75 years ago this summer, those myths were shattered when a coalition of Jewish activists, rescue advocates, and congressmen from both parties, backed by a large majority of public opinion, successfully pressured FDR to admit a group of European Jewish refugees outside the quota system. 

 

Refugee advocates had hoped the United States would take in hundreds of thousands of Jews. Sadly, President Roosevelt was interested in nothing more than an election-year gesture that would deflect potential criticism. Famed investigative journalist I.F. Stone was not off the mark when he called the admission of the Oswego group “a token payment to decency, a bargain-counter flourish in humanitarianism.” 

 

Woodstock at 50: A Conversation with Award Winning Filmmaker Barak Goodman

On Tuesday, August 6th, PBS is set to release the newest documentary in its series American Experience. “Woodstock: Three Days That Defined a Generation” explores the legendary music festival by turning the cameras toward the crowd. Emmy Award-winning filmmaker Barak Goodman and PBS tell the story of those who attended the concert and how they endured a three-day festival with deficient infrastructure.

 

Woodstock is represented as the embodiment of the 1960s counter-culture. The legendary festival remains prominent in the lore of hippie culture and the mantra of sex, drugs, and rock ’n’ roll. Filmmaker Barak Goodman explores the culture of the late 1960s and tells the story of who made Woodstock a historic experience: the audience.

 

Prior to the film’s release, HNN was able to connect with Goodman to discuss what making the film taught him about the importance of Woodstock. Goodman highlights the distinct culture that produced the festival and the importance of communitarianism in the crowd, and examines the lessons we should carry into the future. 

 

Jonathan Montano: Before we start, I wanted to say that I learned a lot from the documentary. It was very interesting and educational. To start, let’s just talk about what you think Woodstock says about the counter culture era in general.

 

Barak Goodman: Sure, you know I think a couple of things, it’s a big question. I mean by 1969 there had been a lot of talk, a lot of sort of expression of what the counter culture was about. There were slogans – like peace and love and so forth – and I think in some ways what Woodstock showed us was that those concepts – those slogans – had a basis in reality. That this generation and these kids had taken on board these concepts and really tried to make something real of them. 

 

What makes this festival the sort of window – the lens – into the counter culture is that they had to execute these big concepts in a real way and under trying circumstances. They had to express peace and love, they had to build a new city, as they say, in order to avoid a disaster. So, I think it really made concrete, and brought into focus, what these concepts really meant in real life.

 

That’s the greatness and magic of what happened there and really what’s so inspiring about it. When the chips were down, they acted and put their money where their mouth was. That was the saving grace of the festival.

 

Montano: Right, especially considering everything they had to go through, it really seemed as though it might’ve gone terribly.

 

Goodman: In maybe 99 of 100 cases like this it would’ve gone terribly. These people were hungry, tired, living poorly, had little help from the outside except some medical assistance from the state of New York, and of course heroic help from the surrounding communities. But essentially, they were on their own. They had only each other. They had only what was in their hearts and souls at that point to get them through. I don’t want to exaggerate – this isn’t the Donner Party – but it wasn’t a pleasant experience. Many other festivals had devolved into violence, especially with drugs and all around. But this called on their better natures, and that’s what is inspiring.

 

Montano: Yeah, I agree. I know today there are countless festivals, from Coachella and Lollapalooza to even what could’ve been Fyre Festival. But was Woodstock the first of its kind? Did it pave the way for festival culture? Or is it entirely distinct?

 

Goodman: I think both. There were other festivals that preceded Woodstock but nothing on that scale. It’s become a myth, an icon, an inspiration for festivals that have come since. I think we all have this idea of Woodstock in the back of our heads when we go to a festival, but I don’t think it has or ever will be repeated. I mean you see all the anniversary Woodstock concerts try to be mounted and they all fail in comparison.

 

I think ultimately you can’t recreate the particular set of circumstances that made Woodstock unique. You’re never going to have that many people show up unexpectedly with no infrastructure for them. You’ll never have the isolation, especially now with cell phones. There would be constant communication with the outside world. There were just so many unique circumstances that make Woodstock unrepeatable. But I think it is a beacon and an inspiration for every future festival that has happened. We all want to go to Woodstock when we go to Coachella.

 

Montano: Absolutely. Would you say that the culture in general was a unique circumstance that made Woodstock happen?

 

Goodman: Yes, I do, but I want to hedge it a bit. I do think that the late ’60s were a special moment. We had a galvanizing issue in Vietnam that brought people together in a way that was almost unique. We had a whole generation ready to turn the page on their parents’ generation, wanting to be different.

 

But I would also say that we are seeing a bit of a repeat. My kids are that age, and they feel somewhat the same as my generation did about Vietnam, but for them it’s global warming. In other words, we let them down and left them an unsustainable world and we’re doing nothing about it. It’s going to have to be them that does something. While I don’t think it’s exactly the same, and you wouldn’t have Woodstock, you are seeing a level of activism and a level of communitarianism, and Us vs. Them, that for the first time I think does feel like the 1960s. That’s my hope.

 

Montano: Certainly. The comparison between Vietnam and global warming is really interesting.

 

Goodman: Yeah. I mean, these are existential threats. Back then it was “yeah, that could be me going over there and dying.” It’s a bit more diluted now, but young people do feel as though there might not be a habitable world to live in if we don’t take the issue on ourselves; if we don’t change things. The great thing about Woodstock is that it showed, on a microcosmic level, that you can change things by banding together and being a community if you’re willing to pull in the same direction. I’m hoping this film gets seen by young people because it shows a way to move forward.

 

Montano: The sixties are often represented as the era of sex, drugs, and rock n roll, especially in pop culture. How does Woodstock add a nuance to that depiction?

 

Goodman: Right. So, like a lot of stereotypes it’s rooted in something but it’s also a stereotype, a caricature really. I certainly had that caricature going into making the film. I thought they were just hippies doing drugs, but that just trivializes what they were.

 

I think that what Woodstock did, and what I hope the film does, is in some ways rehabilitate the hippie culture. It wasn’t a caricature. Don’t make fun of stoned-out hippies wandering naked in a field. This was about something real. It was really beautiful. Really beautiful. It was inspiring. It’s something to aspire to, not look down on. That was a real revelation for me in making the film and made it a joyous experience. The feeling that these kids were on to something, that they had something to teach us. I love the moment in the film when Max Yasgur, someone who represents a different generation and a different point of view, gets up there and makes a very appreciative speech to the audience saying you showed us, you taught us something.

 

Montano: Which leads me to my next question: Is there anything we should take from Woodstock and apply to contemporary times? In other words, is there something we should learn not just about Woodstock, but from it?

 

Goodman: Absolutely. And I think that’s the last part of the show. It’s basically the better angels of our nature. In this dark time, we tend to give up on people, at least I do. I begin to question if people are drawn to good or not good. Light or dark. What Woodstock shows is that we have within us an enormous capacity for sharing, generosity, communitarianism, all those things. And boy is that sorely needed right now. It is nothing if not a reminder of how much can get done by following that path rather than divisiveness or violence.

 

I think it’s that simple. It was a beautiful moment of collective goodness prevailing over what could’ve been a very dark and destructive experience. That’s what Woodstock has to teach us.

 

Montano: The documentary emphasizes a self-policing system, mainly through the Hog Farm, and even a self-help system for drug users having bad trips. A user would be taken care of, then they’d take care of the next bad trip. Do you think anything similar is possible today, especially considering the era of social media, hyper-security, and even the helicopter-mom mentality now in place?

 

Goodman: You did a much better job pointing to the things that Woodstock can teach us than I did. Absolutely. How brilliant was that? What a stroke of genius to understand the crowd well enough to know that a bunch of ragtag hippies from New Mexico would be better cops than armed New York policemen. That to me shows a deep, deep understanding of who these kids were and what they were all about.

 

I do think, and I’m no expert on security, but I do think that hyper militarized, Us vs. Them attitude of policing right now draws out the worst and leads to more conflict than it needs to. And I would love to see an attempt to do something much more like Woodstock, with a please force not a police force. And just to deescalate – and we all see it – particularly in confrontations between cops, and usually people of color, that get escalated so quickly, and guns get pulled out and bullets fly. But isn’t it easier to take a deep breath, realize we’re all human beings and just talk to each other? It just feels like that is more productive. That’s what that festival did and thank god they did it.

 

Montano: In a really brilliant way

 

Goodman: In such a brilliant way! God, I mean, not only did the Hog Farm supply security, but they ended up feeding everybody and taking care of overdoses. And it was because they had already figured out how to take care of each other in a communitarian way. And they knew how. It wasn’t a set of skills that many had, but they did. Stanley Goldstein was like, ok, that’s who we need here. And yes, I would love to see that attempted at events today.

 

Montano: To begin to wrap up, what did you think of Woodstock before the documentary and did your experience change that thought at all?

 

Goodman: Totally. I think I felt as most people did. That Woodstock was a great rock 'n roll concert. The original movie showed that. And it was that! But I didn’t understand the real story, which is what happens to the crowd. That was the goal of the documentary. We wanted to turn the cameras around and show the crowd. Whatever made Woodstock Woodstock was not up on the stage, it was down with the people. That was the revelation, the gee whiz moment. That’s why our film can stand next to the other brilliant one from 1971, because it’s about something totally different.

 

Montano: Exactly. When I first watched it, I expected to see things like Hendrix and the Star Spangled Banner, you know – the rock 'n roll side - the brilliance of the music. But that’s not what you showed, and I was enamored by that. And obviously I’m a lot younger than Woodstock, but everybody kind of knows it. But I only knew it as a rock concert. I had no idea, for example, that there was free admission, and that really blew my mind.

 

Goodman: Right, right. We weren’t about to try and re-do the concert movie. That original movie is so great, who would want a new one? We wanted a different one. We got the material to make it and now the two films are companion pieces.

 

Montano: Just one final question: do you think it’s possible to have a festival today where there’s free admission? Do you think anyone would concede the way Woodstock did? To me it’s just impossible to imagine.

 

Goodman: No, I don’t. Not in our current climate. First of all, you can’t have the same thing because of cell phones. There will never be that isolation again. They were on their own and they had to make it through together.

 

And the money side, I just don’t see it. It’s so improbable that the quartet of people who put this on would all be so in over their heads. They were so naïve in ways and this just hadn’t been done. And back then if you just had money you could do it. But nowadays you get disasters like Fyre Festival. But here it was also partly the human beings. Joel and Jon were – Joel remains, Jon passed away – wonderful human beings who just didn’t put money ahead of other people. And that’s what happened.

 

But it was also the times. Not everyone was counting money all the time and figuring out profit margins. It was a capitalist venture, but it was so loose. They were just writing checks – they weren’t even keeping count. They didn’t know how in debt they were. It was just a different moment in our history and one I’m nostalgic about. Today, all the accountants would be there with their lawyers, and lawsuits would be flying long before a single chord of music was played. And actually, that is happening with the 50th anniversary concert. It’s probably going to fall apart because it’s a different moment in history. Woodstock was unique and entirely special.

 

Watch “Woodstock: Three Days That Defined a Generation” on PBS Tuesday, August 6th to enjoy an incredible documentary highlighting the 50th anniversary of the historic music festival.

All the world’s a stage when you’re Boris Johnson basking in the no-deal-Brexit spotlight

 

In politics, it is said that you campaign in poetry and govern in prose.  Alexander Boris de Pfeffel Johnson, the boy who once said he wanted to be ‘world king,’ has finally realized his life-long ambition of becoming British Prime Minister. The real task of uniting his fractured party over Brexit – and most importantly, uniting a divided country – now lies ahead.

 

Johnson defeated Jeremy Hunt, the mild-mannered former foreign secretary, by a convincing majority – that much was never in doubt. Some 160,000 Conservative or Tory Party members – or about 0.24% of the British population – chose the next prime minister of the United Kingdom. The opposition Labour Party has already criticized Johnson for lacking legitimacy.

 

For Americans wondering how it is that such a small minority could determine the leader of the country, in the UK as opposed to the U.S., people vote in a general election for a party – not an individual – and the leader of the party that has a majority in the House of Commons or Parliament becomes prime minister and forms the next government.

 

This time around, there wasn’t a general election.  The Tory Party remained in office despite the resignation of its leader, Theresa May.  The task then was to choose a new party leader.  Of the Conservative Party faithful, the turnout for the leadership election was 87.4%.  Jeremy Hunt managed 46,656 of the votes against 92,153 for Boris Johnson.  Although Johnson’s majority was thumping, it was in fact less than David Cameron’s (the former prime minister) when he became Tory Party leader in 2005. 

 

It is still a decisive result and the scale of the victory is important.  It gives Boris Johnson tremendous lift-off as he begins his premiership.  He campaigned on a pledge to take Britain out of the European Union by Halloween, ‘no ifs ands or buts,’ come what may, ‘do or die.’  In a way, his election is an endorsement by Tories for a so-called ‘hard Brexit’ – leaving the European Union without a trade deal.

 

His choice of cabinet ministers clearly shows that Boris Johnson is intent on delivering Brexit.  It is also a sure sign to the EU based in Brussels that there’s a new sheriff in London.  Whether Brussels is going to be moved by a cabinet of die-hard ‘Brexiteers’ in Britain is another matter altogether.

 

The contrast between Boris Johnson and his predecessor could not be sharper.  Theresa May, the daughter of a vicar, was distinctly shy and notoriously non-sociable with a deep sense of duty.  She was happy to bury her head in work behind the scenes without any fanfare.  Boris Johnson on the other hand is clearly one of the more colorful prime ministers in recent times with his fair share of controversies.  He is known for his jokes and irreverence.  His personal life has raised a lot of eyebrows.  A twenty-five-year marriage has ended amid allegations of several extra-marital affairs.  The police were recently called to an apartment he shared with his girlfriend after a neighbor claimed to hear yelling, plate-smashing and a woman shouting ‘get off me’ and ‘get out.’  Johnson has refused to answer any questions about the alleged incident.

 

Boris Johnson and Donald Trump have a few things in common.  They are both from New York and are disruptors who believe in throwing out the rule book of politics and pursuing a more unorthodox style of leadership.  Both men defy political gravity and have survived controversies that would otherwise have scuppered the leadership ambitions of any candidate.  Both Trump and Johnson have also been plagued with alleged infidelity, gaffes, and remarks deemed as racist.

 

One thing is sure, Donald Trump has an ally in Downing Street, tweeting that Boris Johnson would make a great prime minister.  But there are some fundamental differences.  In the past, Boris Johnson has described Trump as ‘unfit’ to hold the office of President of the United States, and while he was foreign secretary (secretary of state) Boris Johnson openly opposed Donald Trump’s Muslim ban.  It would be interesting to watch how the two men with very similar personalities interact.  I am sure all sins have been forgiven.  The question now is, will the so-called ‘special relationship’ between Britain and America be special only for the British?  Time will tell.

 

As mayor of London, Boris Johnson presided over a 27% reduction in crime. He also won two consecutive mayoral elections in Labour-dominated London. He helped bring the Olympics to the capital in 2012, and his supporters say that he will be equally successful as prime minister. His sense of optimism and can-do spirit for Britain is infectious and powerful in a party trounced into fourth place in the recent European elections.

 

Johnson is also well known for his rhetorical tours de force and an ability to use oratory to amuse, entertain and wow his audience. Contrary to the caricature of him as a gaffe-prone bombast, Boris Johnson has a formidable intellect and is the ultimate insider-outsider – he read classics at Oxford and is the 20th prime minister to be educated at the highly selective Eton College, where Britain’s elite are born and bred. He went on to become a journalist but was fired for making up quotes, before entering Parliament in 2001.

 

Johnson is a Marmite figure – you either like him or loathe him. His critics say that being mayor of London is not the same thing as being prime minister, and that the job at Number Ten entails uniting an entire country behind a vision. His leadership style of surrounding himself with people who can deliver, they argue, won’t be enough. Johnson is notorious for his lack of attention to detail. 

 

Banking on his gold-dust star quality and campaigning with soundbites and rhetoric alone may not be enough to succeed in the top job. Boris Johnson will have to focus on the minutiae of policy across the whole spectrum of government. As mayor, he could avoid the agony of policy detail, allowing his officials to do the heavy lifting instead. In Downing Street and as prime minister, there is no hiding place. The buck starts and stops with him. His party’s razor-thin majority of two in Parliament – likely to dwindle to just one after a by-election – will require all the diplomatic and persuasive skills he can muster.

 

The leadership at Downing Street might have changed, but the mathematics of the House of Commons remain exactly as they were when Theresa May encountered her difficulties, ultimately leading to her resignation.  There are mutterings among quite senior Tory pro-European members of parliament that they are prepared to bring down the new government if that is what it takes to stop a no-deal Brexit.  More importantly, the House of Commons has voted resoundingly to stop a no-deal Brexit from ever happening.  

 

That makes it difficult to see how Boris Johnson can persuade the European Union to renegotiate a new Brexit deal. When he comes to Brussels, his EU counterparts will already know that he has a wafer-thin majority in Parliament and are therefore unlikely to yield an inch. 

 

The EU has made it clear that it will not go back on the Withdrawal Agreement it negotiated with Theresa May. The so-called ‘backstop’ – the guarantee of a frictionless border between Northern Ireland (part of the UK) and the Republic of Ireland (part of the EU) in the event of a no-deal Brexit – remains the major sticking point between London and Brussels. How Boris Johnson can thread that needle where Theresa May failed remains to be seen.

 

I predict Boris Johnson will call a general election before October 31, 2019 in a bid to win a fresh mandate, this time from the British people. The smart money is on him winning, given that the opposition Labour Party under Jeremy Corbyn is flat on its back with its own internal problems over the lack of a coherent direction on Brexit. The Tories, in a bid to recover votes from Nigel Farage’s Brexit Party, will tout a clear message as the party of ‘Leave.’ The Cabinet picked by Boris Johnson does not strike me as one designed to manage the day-to-day affairs of the country. It looks rather like a campaign team, ready to sell the message of a no-deal Brexit on the eve of a general election.

How African American Land Was Stolen in the 20th Century

 

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

I recently read an article in the New Yorker that so shocked me that I knew I had to tell you, my small audience, all about it. Vast tracts of land owned by African Americans were taken from them in the 20th century. At the heart of the story is racism in many forms: how the promise of emancipation after the Civil War was broken; how whites used bureaucracy and twisted legalisms to take black land from owners too poor to defend themselves; how the teaching of American history was whitewashed to bury this story. I was shocked because, after decades of studying history, I had no idea about this fundamental cause of economic inequality in America. Writing this article pushed me into investigating the even larger story of how black Americans were prevented from owning real estate, one of the fundamental sources of wealth.

 

Here’s a short version of the history. At the time of Emancipation, Union General William Tecumseh Sherman declared that 400,000 acres formerly held by Confederates be given to African Americans. His order came to be known as the promise of “40 acres and a mule”. But the newly established Freedmen’s Bureau was never able to control enough land to fulfill this promise. In 1866, Congress passed the Southern Homestead Act, opening up 46 million acres of public land in southern states for Union supporters and freed slaves. The land was uncultivated forest and swamp, difficult for penniless former slaves to acquire or use. Southern bureaucrats made it difficult for blacks to access any land and southern whites used violence to prevent blacks from occupying land. Within 6 months, the land was opened to former rebels. In 1876, the law was repealed.

 

The much more extensive Homestead Act of 1862 granted 160 acres of government land in the West to any American who applied and worked the land for 5 years. Over the course of the next 60 years, 246 million acres of western land, the area of California plus Texas, was given to individuals for free. About 1.5 million families were given a crucial economic foundation. Only about 5000 African Americans benefitted.

 

Despite obstacles, many black families had acquired farmland by World War I. There were nearly 1 million black farms in 1920, about one-seventh of all American farms, mostly in the South. During the 20th century, nearly all of this land was taken or destroyed by whites. Sometimes this happened by violent mob action, as in Tulsa, Oklahoma, in 1921, or the lesser known pogrom in Pierce City, Missouri, in 1901, when the entire black community of 300 was driven from town. A map shows many of the hundreds of these incidents of white collective violence, concentrated in the South. Many of the thousands of lynchings were directed at black farmers in order to terrorize all blacks and make them leave.

 

Other methods had a more legal appearance. Over 75 years, the black community of Harris Neck, Georgia, developed a thriving economy from fishing, hunting and gathering oysters, on land deeded to a former slave by a plantation owner in 1865. In 1942, the federal government took the land, giving residents two weeks’ notice to leave; their houses were destroyed, and an Air Force base was created. That site was chosen by the local white politicians. Black families were paid two-thirds of what white families got per acre. Now the former African American community is the Harris Neck National Wildlife Refuge.

 

Vast amounts of black property were taken by unscrupulous individuals using legal trickery, because African Americans did not typically use the white-dominated legal system to pass property to their heirs. White developers and speculators took advantage of poorly documented ownership through so-called partition sales to acquire land that had been in black families for generations. One family’s story is highlighted in the New Yorker article, co-published with ProPublica. The 2001 Agricultural Census estimated that about 80% of black-owned farmland had disappeared in the South since 1969, about half lost through partition sales.

 

Decades of discrimination by the federal government made it especially difficult for black farmers to retain their land as farming modernized. The Department of Agriculture denied loans, information, and access to the programs essential to survival in a capital-intensive farm structure, and hundreds of thousands of black farmers lost their land. Even under President Obama, discrimination against black farmers by the USDA continued.

 

Because land was taken by so many different methods across the US, and the takers were not interested in recording their thefts clearly, it is impossible to know how much black land was taken. The authors of the New Yorker article say bluntly, “Between 1910 and 1997, African Americans lost about 90% of their farmland.” That loss cost black families hundreds of billions of dollars. In 2012, less than 2 percent of farmers were black, according to the most recent Agricultural Census.

 

While rural blacks lost land, real estate holdings of urban blacks were wiped out by a combination of government discrimination and private exploitation. Because black families could not get regular mortgages due to redlining by banks, if they wanted to buy a house they had to resort to private land sale contracts, in which the price was inflated and no equity was earned until the entire contract was paid off. If the family moved or missed one payment, they lost everything. A recent study of this practice in Chicago in the 1950s and 1960s showed that black families lost up to $4 billion in today’s dollars.

 

For the first time in decades, reparations for African Americans who were victimized by the white federal and state governments are being discussed seriously. This story about whites taking black property shows how superficial, disingenuous and ahistorical are the arguments made by conservatives against reparations. When Sen. Mitch McConnell delivered his simplistic judgment last month, he was continuing the cover-up of modern white real estate theft: “I don't think reparations for something that happened 150 years ago for whom none of us currently living are responsible is a good idea.”

 

Surveys showing that the majority of white Americans are against reparations only demonstrate how ignorance of America’s modern history informs both public opinion and survey questions. Gallup asked, “Do you think the government should – or should not – make cash payments to black Americans who are descendants of slaves?” While blacks were in favor 73% to 25%, whites were opposed 81% to 16%. A different question might elicit a more useful response: Do you think the government should make cash payments to millions of black Americans whose property was stolen by whites and who were financially discriminated against by the American government since World War II?

 

Today’s economic gap between black and white began with slavery. Emancipation freed slaves, but left them with nothing. Hundreds of millions of acres of land were given away to white families. When blacks gradually managed to get some land, it was taken by violence and legal trickery during the 20th century. After World War II, blacks were denied access to another giant government economic program, the GI Bill, which helped millions of white veterans acquire houses. The collusion of federal, state and local governments, banks, and real estate professionals bilked African Americans of billions of dollars in real estate, with the subprime mortgage crisis only a decade ago as the latest chapter. What I have written here is only an outline of the racist narrative.

 

Despite the ravages of slavery, the American story would have been very different if the ideas and practices behind Lincoln’s Emancipation had been put into effect. Instead, white supremacy reemerged in the South and throughout the US. The power that white supremacists exerted in 20th-century America is symbolized by James F. Byrnes, a South Carolina politician, who served in the House of Representatives 1911-1925, was one of the most influential Senators 1931-1941, was appointed to the Supreme Court by FDR, then left the Court to lead the Office of Economic Stabilization and the Office of War Mobilization during World War II, became Secretary of State 1945-1947, and was Governor of South Carolina 1951-1955. In 1919, he offered his theory of American race relations: “This is a white man’s country, and will always remain a white man’s country.” He followed that motto throughout his career.

 

Our nation is still paying the price.

Roundup Top 10!  

 

President Trump’s Baltimore tweets were racist — but he also fails to grasp what ails our cities

by Sara Patenaude

Decades of racist policies — not absentee congressmen — explain cities’ biggest struggles.

 

The Real Problem With Trump’s Rallies

by Kevin Kruse

There are a lot of similarities between the president and George Wallace of Alabama. But there’s also one big difference.

 

 

Donald Trump and Boris Johnson Rode the Same Wave Into Power. History Suggests the Parallels Won’t Stop There

by David Kaiser

The history of Anglo-American relations and how Trump and Johnson will utilize them.

 

 

When we hear populism, we think Donald Trump. But we should be thinking Elizabeth Warren.

by Gregg Cantrell

How the left can reclaim the mantle of populism that is rightly theirs.

 

 

How music took down Puerto Rico’s governor

by Verónica Dávila and Marisol LeBrón

Underground music overcame censors to gain popularity and political power.

 

 

What 'Infests' Baltimore? The Segregation History Buried in Trump's Tweets

by Paige Glotzer

In slamming Maryland Rep. Elijah Cummings’s 7th District as “rodent infested,” Trump borrows from the rhetoric that first segregated the city.

 

 

Trump’s Venom Against the Media, Immigrants, “Traitors,” and More Is Nothing New

by Adam Hochschild

The parallels between 1919 and 2019.

 

 

Winston Churchill Would Despise Boris Johnson

by Ian Buruma

Britain’s new leader has a sadly exaggerated sense of the importance his country will have after Brexit.

 

 

Don't just revile Amy Wax--rebut her

by Jonathan Zimmerman

Outrage does not and cannot refute what she said -- only facts can do that.

 

 

Expecting Ireland to be servile is part of a long British tradition

by Richard McMahon

The Brexit crisis is another example of how the UK so often ignores the consequences of its conduct on its neighbour.

 

Leonardo da Vinci: Still a Genius 500 Years Later

This summer marks the 500th anniversary of the death of Italian painter, scientist, inventor and architect Leonardo da Vinci. If you mention his name to most people they will say, rather quickly, “the Mona Lisa” or “the Last Supper.” They are two of the most impressive works in world art history, but da Vinci was far more of an artistic force than just two paintings. He was the man who invented the primitive bicycle, the tank, the machine gun, the airplane and all sorts of machines that made life easier for people. He also put together 16,000 pages of notes and illustrations in large, thick notebooks. You needed an invention? He had it, or could quickly produce it.

 

Many of the world’s most prominent museums are showing exhibits of his work this summer, whether the mammoth Metropolitan Museum in New York or the smaller Berkshire Museum, in Pittsfield, Massachusetts. There are many exhibits of da Vinci’s work in museums in London, England, Milan, Florence and Turin in Italy, Poland, Germany, Scotland and other nations.

 

The exhibits about da Vinci, who was born in 1452 and died in 1519, offer a rare and breathtaking look at the artistry of the great man, cover his life and times, and reveal some intriguing information about him. As an example, he suffered from strabismus, a permanent misalignment of the eyes that threw his vision slightly out of line. The benefit, though, was that it enabled him to “see” three-dimensional foundations for his work and permitted him to produce three-dimensional drawings and designs that no one else could.

 

He was an ethereal artist, to be sure, but he was also a shrewd businessman. As a young man he realized that the Italian states of Venice, Tuscany, Milan and others were frequently attacked by other nations. He plunged into efforts to design weapons and transports for the military. He designed a tank with four wheels propelled by men inside the tank who also used levers and triggers to fire weapons at the enemy (the tank was not actually used until World War I, 400 years later). He designed the mobile machine gun, a small, three-foot-high machine on wheels that could be moved about quickly in a battle. Eight or nine “guns,” or barrels, were mounted on it for rapid firing. He was told that the biggest problem armies had was crossing rivers and so he designed temporary bridges made out of fallen trees set up like the vaulted ceilings of domes in which the trees, leaning on top of each other, offered the support that pillars normally would provide. The bridge could be constructed quickly and taken down just as quickly.

 

The Berkshire Museum, in Pittsfield, has a large and impressive exhibition called “Leonardo da Vinci: Machines in Motion,” produced by the Da Vinci Museum, in Florence, Italy, and loaned to it.

 

Da Vinci always believed that what we thought was true was not necessarily true. “All our knowledge had its origins in our perceptions,” he often said.

 

He proved that with his machines, as shown in the Berkshire exhibit. He used the “worm screw,” a long wooden screw that could be turned easily by levers and wheels. His screws were made to lift huge weights easily. He designed rotary screws that could be manipulated by hand and mesh with other screws or gears to move weights. In the Berkshire exhibit is the “double crane” that da Vinci invented. People used one side of the crane to lift up things and the other to drop them down to street level. Both could be used at the same time and operated by just one or two people. He even invented a machine for blacksmiths so they could use it quite easily to pound down iron with a hammer without using their arms or hands at all. He invented a printing press just 40 years after Gutenberg’s; his could be run by just one man. 

 

What were some of his most popular inventions? Well, first and foremost, the bicycle. Da Vinci built a wooden bike that operated just like those of today. The day I was at the exhibit, kids flocked to it.

 

“Hey dad, I didn’t know they had bikes back then!” screamed one child.

 

The Berkshire Museum has weekly days on which kids can participate in hands-on da Vinci play sessions with their own drawings.

 

The Berkshire Museum exhibit sprawls over the entire second floor of the building and a gallery on the first. It is spacious. There are two large video screens on which you see examples of his art and his life story. They add a nice touch to the exhibit and carry da Vinci from the 15th century to the 21st.

 

Da Vinci worked during the Italian Renaissance, called the age of discovery, and people were eager for his inventions. City and state governments supported his work and he became friends with wealthy and powerful people.

 

Some museum exhibits are more compressed than the Berkshire Museum’s but elegant, such as the one at the Metropolitan Museum of Art in New York. The curators there decided not to stage a large exhibit of the artist/inventor’s work, but instead to showcase one famous painting. They chose Saint Jerome Praying in the Wilderness, an unfinished masterpiece started in 1485 and about 85% complete. The painting hangs in its own gallery with religious music playing all day. That gallery is set inside a larger gallery of religious paintings, sculpture and artifacts to give the exhibit a very religious feel.

 

The Met exhibit, which drew quite a crowd when I was there, sets off Saint Jerome by himself, surrounded by black walls for effect. It is impressive. The curators urge you to study the painting to see how da Vinci worked. As an example, there is the outline of a lion at the bottom right of the painting that needs to be completed. There are also fingerprints on the top of the painting where Leonardo tried to smooth out paint with his gnarly fingers.

 

Max Hollein, director of the Met, said that the work “provides an intimate glimpse into the mind of a towering figure of western art.”

 

The New York exhibit shows that da Vinci did not paint in a very careful way, working on some parts for weeks and then jumping to other parts. He sketched out few final portrait drafts and tended to approach his work in an uninhibited way.

 

So, the next time you sit back in an airplane, you can thank Leonardo.

 

The Berkshire Museum exhibit is on display until September 8. The Met exhibit continues until October 6.

Dear Moderators of the Presidential Debates: How About Raising the Issue of How to Avert Nuclear War?

 

You mass media folks lead busy lives, I’m sure.  But you must have heard something about nuclear weapons―those supremely destructive devices that, along with climate change, threaten the continued existence of the human race.  

 

Yes, thanks to popular protest and carefully-crafted arms control and disarmament agreements, there has been some progress in limiting the number of these weapons and averting a nuclear holocaust.  Even so, that progress has been rapidly unraveling in recent months, leading to a new nuclear arms race and revived talk of nuclear war.

 

Do I exaggerate? Consider the following.  

 

In May 2018, the Trump administration unilaterally withdrew from the laboriously-constructed Iran nuclear agreement that had closed off the possibility of that nation developing nuclear weapons.  This U.S. treaty pullout was followed by the imposition of heavy U.S. economic sanctions on Iran, as well as by thinly-veiled threats by Trump to use nuclear weapons to destroy that country.  Irate at these moves, the Iranian government recently retaliated by exceeding the limits set by the shattered agreement on its uranium stockpile and uranium enrichment.

 

At the beginning of February 2019, the Trump administration announced that, in August, the U.S. government would withdraw from the Reagan-era Intermediate-Range Nuclear Forces (INF) Treaty―the historic agreement that had banned U.S. and Russian ground-launched cruise missiles―and would proceed to develop such weapons. On the following day, Russian President Vladimir Putin declared that, in response, his government was suspending its observance of the treaty and would build the kinds of nuclear missiles that the INF Treaty had outlawed.

 

The next nuclear disarmament agreement on the chopping block appears to be the 2010 New START Treaty, which reduces U.S. and Russian deployed strategic nuclear warheads to 1,550 each, limits U.S. and Russian nuclear delivery vehicles, and provides for extensive inspection.  According to John Bolton, Trump’s national security advisor, this fundamentally flawed treaty, scheduled to expire in February 2021, is “unlikely” to be extended.  To preserve such an agreement, he argued, would amount to “malpractice.”  If the treaty is allowed to expire, it would be the first time since 1972 that there would be no nuclear arms control agreement between Russia and the United States.

 

One other key international agreement, which President Clinton signed―but, thanks to Republican opposition, the U.S. Senate has never ratified―is the Comprehensive Test Ban Treaty (CTBT).  Adopted with great fanfare in 1996 and backed by nearly all the world’s nations, the CTBT bans nuclear weapons testing, a practice which has long served as a prerequisite for developing or upgrading nuclear arsenals.  Today, Bolton is reportedly pressing for the treaty to be removed from Senate consideration and “unsigned,” as a possible prelude to U.S. resumption of nuclear testing.

 

Nor, dear moderators, does it seem likely that any new agreements will replace the old ones. The U.S. State Department’s Office of Strategic Stability and Deterrence Affairs, which handles U.S. arms control ventures, has been whittled down during the Trump years from 14 staff members to four.  As a result, a former staffer reported, the State Department is no longer “equipped” to pursue arms control negotiations.  Coincidentally, the U.S. and Russian governments, which possess approximately 93 percent of the world’s nearly 14,000 nuclear warheads, have abandoned negotiations over controlling or eliminating them for the first time since the 1950s.

 

Instead of honoring the commitment, under Article VI of the 1968 nuclear Nonproliferation Treaty, to pursue negotiations for “cessation of the nuclear arms race” and for “nuclear disarmament,” all nine nuclear powers are today modernizing their nuclear weapons production facilities and adding new, improved types of nuclear weapons to their arsenals.  Over the next 30 years, this nuclear buildup will cost the United States alone an estimated $1,700,000,000,000―at least if it is not obliterated first in a nuclear holocaust.

 

Will the United States and other nations survive these escalating preparations for nuclear war? That question might seem overwrought, dear moderators, but, in fact, the U.S. government and others are increasing the role that nuclear weapons play in their “national security” policies.  Trump’s glib threats of nuclear war against North Korea and Iran are paralleled by new administration plans to develop a low-yield ballistic missile, which arms control advocates fear will lower the threshold for nuclear war.

 

Confirming the new interest in nuclear warfare, the U.S. Joint Chiefs of Staff, in June 2019, posted a planning document on the Pentagon’s website with a more upbeat appraisal of nuclear war-fighting than seen for many years.  Declaring that “using nuclear weapons could create conditions for decisive results and the restoration of strategic stability,” the document approvingly quoted Herman Kahn, the Cold War nuclear theorist who had argued for “winnable” nuclear wars and had provided an inspiration for Stanley Kubrick’s satirical film, Dr. Strangelove. 

 

Of course, most Americans are not pining for this kind of approach to nuclear weapons. Indeed, a May 2019 opinion poll by the Center for International and Security Studies at the University of Maryland found that two-thirds of U.S. respondents favored remaining within the INF Treaty, 80 percent wanted to extend the New START Treaty, about 60 percent supported “phasing out” U.S. intercontinental ballistic missiles, and 75 percent backed legislation requiring congressional approval before the president could order a nuclear attack.

 

Therefore, when it comes to presidential debates, dear moderators, don’t you―as stand-ins for the American people―think it might be worthwhile to ask the candidates some questions about U.S. preparations for nuclear war and how best to avert a global catastrophe of unprecedented magnitude?

 

I think these issues are important.  Don’t you?

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172611 https://historynewsnetwork.org/article/172611 0
White Nationalists and the Legacy of the Waffen-SS from Postwar Europe to Today

 

The military collapse of Germany in 1945 was so total that any fears held by the Allies of a Nazi revival soon dissipated. The emergence of West Germany as a successful, democratic state confirmed the view that Germany’s militaristic history really was a thing of the past. And yet the specter of Nazi Germany has never completely disappeared: the exploits of the Waffen-SS continue to provide inspiration for extreme right-wing nationalists. More recently there has been a trend to rehabilitate Waffen-SS veterans from Eastern European countries and to recast them as patriots rather than agents of Nazi tyranny.

 

The Waffen-SS was just one of many strange organizations to emerge from Nazi Germany. Originally intended as an elite palace guard to protect Adolf Hitler, it grew massively in size and scale to become a multi-national army, the military wing of Heinrich Himmler’s dreaded SS. Of the 900,000 men who passed through its ranks, approximately half were drawn from countries outside the German Reich, some even recruited from ethnic groups normally considered beyond the Nazi racial pale.

 

During the course of World War II, the armored formations of the Waffen-SS gained a reputation for excellence on the battlefield, a reputation that had to be set against its involvement in the mass killings of civilians and numerous battlefield massacres. In the aftermath of defeat in 1945, the Waffen-SS was not only judged to be a criminal entity by the Allies but was also condemned by many old comrades in the Wehrmacht, the German armed forces, eager to pass on blame for the many atrocities committed by Germany during the war. 

 

In response, former Waffen-SS soldiers established HIAG, a self-help group to lobby the new West German government for improved material conditions for its members and to promote the idea that the Waffen-SS was an honorable body of soldiers, similar to those of the German army. The advent of the Cold War reinforced and expanded this narrative, the SS veterans arguing that the multi-national Waffen-SS was a forerunner of NATO and a saviour of Western civilization from Soviet barbarism.

 

The concept of the honorable Waffen-SS achieved some traction, not least in forming its own sub-genre of military publishing. The memoirs and divisional histories written by former SS soldiers were taken up and amplified by a new postwar generation of enthusiasts in the United States and Western Europe. They also provided a convenient ideological template for the various neo-Nazi and extreme right-wing groups that emerged in Western Europe during the 1950s, who drew heavily on the distinctive iconography of the Waffen-SS. In the long run, however, these groups achieved little: they were rent by internal division, reviled or simply ignored by the general public and subject to hostile scrutiny by national security services.

 

Behind the Iron Curtain, any form of celebration of the Waffen-SS would have been unthinkable – treason against the memory of the Red Army’s victory over Nazi Germany. But this would all change with the fall of the Berlin Wall in 1989 and the collapse of communism in Eastern Europe.

 

In many of these countries, the long years of Soviet domination were conflated with those of Nazi Germany. Resistance to Soviet rule was organized along nationalist lines, and any organization that even appeared to have fought for national sovereignty became an object of veneration; in the 1990s this came to include the Waffen-SS. The governments of Hungary, Latvia and Estonia were at the forefront of this new welcoming attitude, even sending ministers to preside over SS commemorations. In actuality, Hitler and Himmler – totally opposed to any concept of national self-determination – had cynically employed these soldiers for anti-partisan duties or as cannon fodder on the Eastern Front.

 

Regardless of the historical truth, extreme nationalists were quick to exploit the new opportunities in the East. Waffen-SS veterans from Germany and the rest of Western Europe were invited to Eastern Europe to take part in celebrations otherwise banned in their home countries.

 

Hungary publicly acknowledged the Waffen-SS in its annual ‘Day of Honor’ celebrations, first held in 1997, which commemorated the siege of Budapest in 1944-45. In something of a festival atmosphere – complete with flying flags, martial music and the laying of wreaths – veterans from the Waffen-SS marched alongside those of the German Wehrmacht and the Hungarian Army, to the applause of an appreciative audience of nationalist and neo-Nazi groups. 

 

Latvia and Estonia were also prominent in welcoming Waffen-SS veterans from across Europe, who in turn donated relief supplies and money to their hosts. Support for the Waffen-SS was somewhat more controversial in the Baltic states, however, with their large minority populations of Russian-speaking citizens opposing the erection of memorials glorifying SS soldiers as freedom fighters. Despite this, Narva in Estonia became a key site of commemoration, the former battleground where Waffen-SS units from the Baltic states, Germany and Western Europe had fought together in defense of the city in 1944.

 

The break-up of the Soviet Union in 1991 led to the formation of Ukraine as an independent state. During World War II the Waffen-SS had raised a Ukrainian division to fight on the Eastern Front. After independence its veterans found themselves transformed from fascist collaborators into heroes in the struggle for nationhood. The once neglected gravestones of former soldiers were tended by volunteers and the division’s distinctive lion insignia was publicly displayed by young Ukrainians. In the ensuing conflict between Russia and Ukraine, the link between the Waffen-SS and present-day Ukrainian paramilitary forces became apparent. The infamous Azov Battalion openly espoused anti-semitic, white supremacist attitudes and adopted the Wolfsangel insignia worn by several Waffen-SS divisions. 

 

In 21st century Europe, pressures from mass migration, the negative aspects of globalization and the rise of populist political parties emboldened extreme nationalists. The influx of migrants from Africa and the Middle East encouraged them to take their lead from Waffen-SS mythology and define themselves as defenders of Europe from outside threat. At a Waffen-SS commemoration in Estonia in 2005, a Swedish neo-Nazi described his meeting with a tall Belgian veteran: ‘I am so eager standing over here with this two-meter man. He asks me, for the sake of their honor, to free Sweden from the foreign occupiers and explains that we Aryans will die if nothing happens.’ The fawning encounter, as described here, chillingly suggested that the work of the Waffen-SS was not yet complete and needed others to finish the task. Racially motivated attacks on migrants have become increasingly commonplace. 

 

More mainstream nationalist political parties such as Jobbik in Hungary and AfD in Germany have enjoyed some success in recent elections, and while they publicly disown association with neo-Nazi groups a close relationship exists between them. Thus, the neo-Nazi Hungarian Guard – modelled on the Nazi-sponsored Arrow Cross Party – acts as a shadowy para-military force on behalf of Jobbik. This resurgence of extreme nationalism demonstrates the enduring influence of National Socialism, including that of the Waffen-SS. It is a legacy Europe could well do without.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172610 https://historynewsnetwork.org/article/172610 0
Racism in America: What We Can Learn from Germany’s Struggle with Its Own Evil History

Memorial to the Murdered Jews of Europe in Berlin

 

President Trump’s racist tweet storm and the reactions of members of Congress have generated substantial media coverage that often criticizes the ugly remarks. Yet these discussions generally lack any historical context and are a poor substitute for a meaningful national conversation on the central role of slavery in America’s economic and political development.  

 

What would such a “national reckoning” over slavery look like? 

 

Many countries have historically oppressed racial minorities or committed terrible ethnic cleansing, but few have adequately grappled with their past or paid substantial sums in compensation to victims. 

 

One nation that has done both is modern Germany, which has publicly acknowledged responsibility for the Holocaust. The Germans have even created a special noun, Vergangenheitsbewältigung, to describe the process. According to Wikipedia, the word has two closely related meanings: first, “a struggle to overcome the negatives of the past,” and second, “public debate within a country on a problematic period of its recent history.” 

 

In Germany, this national debate has produced dozens of major monuments honoring the victims of the Holocaust and the creation of a K-12 educational curriculum which explains the Nazi government’s role in war crimes and condemns the perpetrators.  

 

Since 1952, the German government has paid more than $75 billion in reparations to the state of Israel, Jewish relief organizations and individual victims of the Holocaust. 

 

It is beyond the scope of this article to compare the crimes committed during America’s two hundred years of African-American slavery with the horrible tragedy of the Holocaust. We can, however, look at the process by which the German people have, in the last thirty years, slowly come to accept responsibility for the Nazi regime’s crimes.

 

As the late historian Tony Judt wrote in his classic Postwar: A History of Europe Since 1945, “a nation has first to have remembered something before it can begin to forget it.” 

 

In his book (published 2005), Judt notes that during the first five years after the end of World War II, the new, reconstituted German government tried to avoid any punishment or moral responsibility for crimes of the Third Reich. Their position, reflecting the attitude of most Germans, was “It was all Hitler’s fault.” 

 

In schools, in the news media, and in many government statements, most German adults avoided any mention of the crimes against the Jews. The people experienced a “national amnesia” regarding the years 1933-45. 

 

The first breakthrough came when Chancellor Konrad Adenauer in 1952 negotiated a treaty with the new nation of Israel. Known as the Luxembourg Agreement, it initiated a large-scale series of payments that continue to this day. 

 

While the initial decision to “write a check” (i.e. pay compensations) came relatively soon after the war, when many of the victims were still alive and in desperate need, it would require a new generation, one with no direct ties to the Nazi regime, to publicly acknowledge the German people’s responsibility for the crimes committed against the Jews and other minorities.

 

During the massive rebuilding effort of the 1950s and 1960s, many attempts were made to remove any traces of the Nazi regime. For example, in Munich local authorities wanted to tear down and pave over the Dachau concentration camp. The American military, which had captured the site intact, insisted it be preserved as a testament to the crimes committed there.

 

As Judt noted in Postwar, a national discussion was spurred in 1979 when a four-part American TV series on the Holocaust was shown on German TV. The series included portrayals of the round-ups of Jewish citizens and depictions of the gas chambers. Many younger Germans had never been exposed to these images before.  

 

On January 27, 1995, for the 50th anniversary of the liberation of Auschwitz, thousands of Germans voluntarily participated in ceremonies remembering the Holocaust. In 1999, the German parliament commissioned a new Memorial to the Murdered Jews of Europe, which opened in 2005. This stark display of 2,711 bare concrete slabs stands in the middle of the new, reunified Berlin. Today dozens of other major monuments and thousands of small plaques acknowledging the Holocaust are on display across Germany. 

 

Can we Americans begin our own Vergangenheitsbewältigung?

 

At best, we have taken a few tentative steps. We have had a few movies and TV series, notably Twelve Years a Slave, which depicted the horrors of slavery in antebellum America. We also have the new National Museum of African American History and Culture, which has a deeply moving exhibit on slavery. 

 

But there is also widespread ignorance or denial about slavery. Confederate statues still adorn many Southern cities and the halls of the U.S. Capitol. The Tennessee governor recently proclaimed July 13 Nathan Bedford Forrest Day in his state, honoring the former Confederate general, slave trader and early KKK leader.

 

A 2018 report by the Southern Poverty Law Center found that less than 8% of students knew why Southern states seceded from the union; only 12% knew about the economic importance of slavery to the North. A key problem, the SPLC noted, is that while a dozen states require teaching about some aspects of slavery, there is no nationally accepted, systematic approach. 

 

For example, few American history textbooks point out that protections for slavery were embedded in the Constitution or that slave owners dominated the federal government from 1787 through 1860. 

 

Sven Beckert, an acclaimed Harvard history professor, noted in his 2018 book Slavery’s Capitalism that during most of the 20th century, slavery was treated as just a Southern problem. However, a “new consensus” is emerging that all the American states benefited significantly from plantation slavery. Rather than being a sidetrack or minor element in our history, Beckert asserts it is in fact “the interstate highway system of the American past.”

 

If so, it has yet to appear on the cultural road map of most Americans. 

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172650 https://historynewsnetwork.org/article/172650 0
Is Stonehenge a Tourist Rip-Off?

 

Recently, Trip Advisor called Stonehenge the eleventh worst tourist rip-off in the UK. One reviewer called it a pile of bricks in a field. The prohibition on entering or touching the stones has not helped but the invisibility of the culture that built Stonehenge also detracts from the site’s significance. Interpreting the enigmatic stones is rather like visiting a cathedral without any knowledge of Christianity. 

 

Stonehenge sits, isolated, on a chalk down, the last stone circle to be built in Britain. Subsequent cultures littered tombs around the site. Farmers ploughed and destroyed all but a remnant of the avenue that linked the circle to the River Avon. The tribal lands were split between three counties; it is no longer the sum of its parts. Modern roads have dissected the site and compromised the archaeology. Vehicle noise overwhelms the wind soughing through the stones. In the past decade, Stonehenge has attracted far more visitors. The opening of the visitor centre by English Heritage in 2013 has stimulated much of this interest.

 

Prehistory, the unrecorded period before the Romans, is a challenge to us. As Christianity developed, many pagan sites were covered by churches in order to suppress pagan culture.  Archaeologists argue with each other about how and why Stonehenge was built and rarely agree. Horticulturalists can provide crucial information to help unravel Stonehenge’s mystery by examining its soil and topography. 

 

After the Ice Age, nomadic hunter-gatherers walked into Britain from Europe and roamed the country for 6000 years. By 4000 BC, rising sea levels had made Britain an island, and horticulture allowed people to settle. The adoption of horticulture is too often confused with farming. The latter requires metal implements and beasts of burden that would only later emerge. Farming utilized the fertile soils that were too heavy for manual tilling. Horticulture, however, is growing mixed crops on a small scale, specifically using manual labour. This horticultural phase demanded light soil, and the silt along the River Avon was ideal. I call this area Avonlands, and a culture developed there. Stonehenge is the proof of their success.

 

Merely a hunting zone for 6000 years, Avonlands offered advantages unparalleled in Britain as horticulture began. The river, 60 miles long, linked the tribal area to the sea and provided a highway when land was difficult to cross. At the sea end, coastal trading routes were extended east to continental Europe and west to Eire and the Scottish Isles. The river is slow-moving and without rapids, ideal for log boats, and has the highest number of species of any UK river. It had annual salmon runs that continued until the 1950s, when the number and weight of fish severely declined. Similarly, sturgeon runs were once strong but have since disappeared. As a chalk river, the Avon contained very high levels of calcium. Two litres of water each day from the river provide 50% of human calcium needs. The skeletons found locally have large, strong bones. Some display severe bone breaks which healed. The river’s calcium content may be why Stonehenge was identified as a healing centre.

 

The Avon headwaters rise on chalk downs to the north of Stonehenge. The river meanders past the circle and down to the sea at Christchurch. It is a slow river, which means it rarely floods. When it does, usually in winter, floods last just two to four days. That inundation deposits fresh silt and a host of vegetative matter on the land alongside the river, maintaining the soil’s fertility. We call this water-meadow and it rarely experiences drought. Reeds grow in the wetter areas. 

 

The horticulturalists’ tools were basic, including antler picks and flint axes. Flint is found in chalk. In flint mines to the east, it was found as dinner-plate-sized nodules that were extensively traded. Wood and bone were also used to create tools.

 

The people grew grain. As a stored food it removed the jeopardy of winter. They increased their stock of cattle and pigs, the latter being the most prominent meat in feasts. Because pigs eat human faeces, this reduced the incidence of disease and gut worms. Unlike cattle, pigs feed in woodlands, especially on autumn crops of acorns.

 

These horticulturalists still foraged, hunted and fished. They used the river, water-meadows, sea, marshes and wildwood. They cropped bespoke wood for their huts through coppicing and pollarding. Cutting thin wood was relatively easy using a flint axe whilst felling a tree was onerous. They thatched their huts using local reed, still in use today. A dry hut was a health advantage when most people were restricted to leaking grass and heather roofs. Significantly, they operated an economy based on the production and use of these materials. The true measure of their success was a food surplus. Only with this could they spend months each year, for 500 years, building an increasingly complex temple to their horticultural gods.

 

Stonehenge is a barometer for this culture, the first in Britain. They began with a single circle of unworked bluestones in 3000 BC. These stones were floated or dragged from South Wales, over 140 miles away. Subsequent remodelling reflects greater resources in food and people. They hauled 80 sarsen stones 20 miles to Stonehenge, each weighing up to 50 tonnes. The final building phase ended in 2500 BC with the hand-shaped stones and lintels we see today. This remodelling suggests a changing relationship with the gods that the people believed gave them their horticultural success.

 

Avonlands had too few water meadows to maintain a growing population. With the introduction of metal, horticulture was replaced with the expansion of farming across Britain. An outmoded Stonehenge fascinated the Romans only to be damned by its pagan ancestry for the next 2000 years. If Stonehenge is to be restored to its rightful heritage then it must be reengaged with the River Avon and its tribal lands. Only then can we interpret the astounding achievement of these prehistoric people.    

 


Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172608 https://historynewsnetwork.org/article/172608 0
Invented in the Fifties, Adolescence Swallowed the Nation—And Now the White House, Too

 

Donald Trump is the great mono-story of our time. Unless you’ve been away from the planet for a while, you know cable news is all Trump, all the time; Trumpworld is social media’s mesmerizingly dystopian subdivision. Yet the fixation with our president’s juvenile leadership style overlooks a deeper one: America’s obsession with eternal adolescence.

 

Hanging onto youth is an id-driven urge, right up there with sex and counting “likes” on Twitter and Instagram. But for all its pluses—inspiring seniors to trade the recliner for kettlebell training, for instance—it can lead to behavior that disses public norms, evades the bald-faced truth, and swaps fevered fantasies for common sense. We get uber-brattishness not only in the Oval Office, but also across the political spectrum.

 

I didn’t really tumble to the mythic power of adolescence until the 1990s, when my friend Alex Gibney and I created a television history of the 1950s, and I was reminded that the decade of Hula Hoops, Frisbees and Silly Putty had invented teenagers, too. Born in 1950, I remember fretting that the rock ‘n’ roll boom Big Mama Thornton and Elvis helped ignite would fizzle before I hit the magic No. 13. Little did I realize I was a foot soldier in a cultural revolution.

 

Teenagers had always been with us, of course. It wasn’t until after World War II, however, that rising prosperity merged with booming birthrates to make the young an irresistible force. In his eponymous bestseller on which we based “The Fifties” TV series, David Halberstam explained: “In the past when American teenagers had made money, their earnings, more often than not, had gone to help support their parents or had been saved for one … long-desired purchase, like a baseball glove or a bike.” In the new affluence, however, teenagers “were becoming a separate, defined part of the culture: As they had money, they were a market, and … they were listened to and catered to.”

 

All that being catered to led to baby boomers acquiring a bit of a reputation. Compared to our frugal, self-sacrificing Depression-era parents, it was said (mainly by our frugal, self-sacrificing parents) that we could be a spoiled, pouty and rebellious lot.

 

But boomers dug the attention—so much so we weren’t about to let adolescence go. According to author Kurt Andersen, “the commitment of Americans, beginning with the baby boom generation, to a fantasy of remaining forever young” took hold “during the 1980s and ’90s, when American adults, like no adults before them — but like all who followed — began playing video games and fantasy sports, dressing like kids, and grooming themselves and even getting surgery to look more like kids.” Andersen called it the “Kids ‘R’ Us Syndrome.”

 

Is it any wonder, then, we eventually got a “Kids ‘R’ Us” president? Trump’s daily tweet storms, with his “I know you are, but what am I?” tone, his dysfunctional relationship with the truth, and his ducking of personal responsibility, signal off-the-hook traits that once sank public careers, pronto. But let’s be honest: Isn’t the Donald an example, albeit extreme, of the degree to which our society has normalized such behavior?

 

When Trump was born in 1946, America was revving up twin revolutions in communication technology and transportation that would help fuel the drive to extended adolescence. The first big impact, dramatically shrinking time and distance, manifested in how ordinary Americans were growing hungry for flashier, sped-up lifestyles.

 

In an interview for “The Fifties,” McDonald’s founder Dick McDonald told me that after the war, he and his brother Mac McDonald witnessed the new appetites at their San Bernardino restaurant. “[A]ttitudes were completely different …,” said Dick. “We were beginning to get many complaints [about] how slow the service was. We decided, if the customer wants fast service, we’re going to give them fast service”; Dick and Mac started rolling out orders in 20 seconds, instead of 20 minutes.

 

Meanwhile, new technologies accelerated everything from book publishing and telecommunications to automobile and air travel. As the world got smaller and information gushed in, sociologist C. Wright Mills argued that faster times required new mental habits to let people make sense of the larger forces influencing their lives. “What they need, and what they feel they need,” wrote Mills, “is a quality of mind that will help them to use information and to develop reason in order to achieve lucid summations of what is going on in the world and of what may be happening within themselves.”

 

Fat chance of that. By the late 1950s, TV had swept the nation and, for broadcast impresarios, introspection didn’t monetize. A “great media-metaphor shift” was underway, as cultural critic Neil Postman put it in his landmark 1985 book “Amusing Ourselves to Death.” Americans were briskly moving from a print-based culture to an electronic-centered one. “[U]nder the governance of the printing press …,” Postman wrote, “discourse in America was … generally coherent, serious and rational.” Tailoring messages to fit the new TV medium meant fewer words and more pictures, less reflection and more emotion, and ended up rendering our national conversation “shriveled and absurd.”

 

That may be overstating the case, but it’s true that more and more Americans were traveling faster and lighter. Upwardly mobile, they more easily distanced themselves from many of the stark facts of life of only a few years before—backbreaking physical labor; lack of proper health care leading to untreated illness, poor teeth, and premature death; and limited access to higher education and comfortable retirements.

 

Fast-forward to the early 1990s, when personal computers and the rise of the internet induced the next great media shift. Suddenly, our wildest teenage dreams were now only a click or two away, stoking on-demand fantasies of leisure-time exoticism, sex and ever-unfolding material wonders. Ensorcelled, adult-adolescents suspended disbelief to a degree that, in the past, had been the preserve of dreaming, questing teenagers. 

 

Today, in our lucid moments, we know that social media moguls have actively encouraged some of our worst behavior. Pushing angry, hate-filled content that swamps reason, they lock in eyeballs and profits. As internet pioneer Jaron Lanier, put it: “So if they can find a group in which they can elicit those emotions … they’re going to keep on reinforcing that because they can get more engagement, action, and more money out of that group ... .” And, of course, today’s ease of access to our primitive passions allows the world’s real fake-news artists to inflame the public mind and mess with our elections.

 

Too often, internet-abetted outrage spills into violence; yet more often, in an adolescent society, it shows up with its passive-aggressive twin, complacency. The upshot is we have a hard time knowing our own minds or maintaining our focus on serious matters. “It’s very easy to ignore the world when the internet is fun and, at the margin, it’s cheap,” said economist Tyler Cowen. “You can protest politically on your Facebook page or write a tweet and just put it aside, get to the next thing” without breaking a sweat.

 

Cowen argues that our complacency derives from a long-term decline in “American dynamism.” “We now have a society where we’re afraid to let our children play outside,” he said. “We medicate ourselves at much higher rates. We hold jobs for longer periods of time. We move across state lines at much lower rates than before …. But once you’re on a track to make everything safer, it’s very hard to stop or restrain yourself along appropriate margins.”

 

What surely is true is that, try as we might to minimize risk, the door can never be shut tight. Instead, the failure to act on underlying problems typically opens the window to compounded trouble.

 

Which may explain what’s eating at our teenagers. Not only are kids taking longer to mature today, they’re coping with a hollowness at the heart of their digital lives that concerns psychologist Jean Twenge. “Parenting styles continue to change, as do school curricula and culture, and these things matter,” she wrote in The Atlantic. “But the twin rise of the smartphone and social media has caused an earthquake of a magnitude we’ve not seen in a very long time, if ever.” Pointing to a spike in teen suicide rates, Twenge said, “There is compelling evidence that the devices we’ve placed in young people’s hands are having profound effects on their lives—and making them seriously unhappy.”

 

You have to think that even C. Wright Mills would be shocked at the speed with which information technology has separated us from reality and amped up self-inflation. Who among us hasn’t posted Facebook items that elevate our achievements, and tout fancy friends or vacations? In such a world, maxims that once guided life, like “To thine own self be true,” can sound unbelievably trite. Meanwhile, our digital imps, like teen hoods haunting a fifties’ street corner, whisper that only suckers resist the urge to self-inflate.

 

It isn’t exactly Holmesian to deduce that trouble with setting sensible limits on our actions and aspirations has made it harder to talk sensibly with one another. President Trump’s spokespersons have talked unabashedly about “alternative facts” or insist that “truth isn’t truth”—prime examples of what Neil Postman called “dangerous nonsense.” Meanwhile, bloviators, left and right, insist on minor points of political dogma, and go haywire at any whiff of apostasy.

 

Such juvenile behavior is hard on a democracy. As Tom Nichols wrote in his 2017 book “The Death of Expertise,” when the concept “is understood as an unending demand for unearned respect for unfounded opinions, anything and everything becomes possible, including the end of democracy and republican government itself.”

 

To be sure, reasonable folks can understand how tempting it is for fellow citizens suffering the hard knocks of low pay or no pay, and reverse social mobility, to entertain a lazy slogan like “Make America Great Again.” But yearning for halcyon days when the economy was headed for the stars, sock hops were the groove, and a white, male-dominated supermajority told everybody, at home and abroad, how things were going to go down is, at best, a pipedream.

 

In our TV series, David Halberstam, that clear-eyed observer of American life, cautioned that hindsight is invariably 20-20. “There’s a tendency to romanticize the fifties and to forget the narrowness, the prejudice …” he said. “But I think there’s a generally far greater freedom and a sense of possibility, economic and other, today than in the past.”

 

That’s still true enough. Had David lived to see the changes churned up by our present decade (he died in 2007), however, I’m willing to bet he’d remind us that the greatest of cultures can develop bad habits of mind, lose their edge to adolescent whimsy or outright chicanery—and nations in that fix get the leadership they deserve.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172652 https://historynewsnetwork.org/article/172652 0
The Long History of Unjust and Lawless Attorneys General

 

As Robert Mueller testifies this week before Congress, the United States Department of Justice is once again in the spotlight. Earlier this summer, the House of Representatives held Attorney General William Barr in contempt for his refusal to comply with a subpoena on the 2020 census. Barr is hardly the first Attorney General to use the office to promote lawlessness and injustice.

 

In fact, in the past 100 years, Attorneys General have violated the Bill of Rights; engaged in political corruption and lawless acts while advocating “law and order”; endorsed abuses of power in the name of “national security”; and refused to cooperate with Congressional investigations of wrongdoing. The list of controversial Attorneys General who have undermined their oath to uphold the Constitution of the United States is long and includes eight individuals who served from the presidency of Woodrow Wilson to that of Donald Trump.

 

The first is A. Mitchell Palmer, who served as Attorney General under Woodrow Wilson from March 1919 to March 1921. For much of the period from October 1919 to March 1921, Wilson was incapacitated by a stroke, giving Palmer license to abuse his position. Palmer initiated the “Palmer Raids,” also known as the “Red Scare,” in which thousands of people suspected of being Socialists or Communists were rounded up and jailed. The prisoners were often denied their basic civil rights and the writ of habeas corpus and detained for months before they were finally released. A small percentage who were not American citizens were deported. Assisting Palmer in his quest to “save the nation” from the Soviet Union’s new leader Vladimir Lenin were future FBI Director J. Edgar Hoover and other zealots who had no issue with violations of the Bill of Rights. Palmer undermined respect for the rule of law as he denied basic civil liberties and civil rights to thousands of victims of his “Red Scare.”

 

Palmer’s successor, Harry M. Daugherty, was Attorney General under President Warren G. Harding and briefly under President Calvin Coolidge from March 1921 to April 1924. Prior to his appointment, Daugherty was Harding’s campaign manager and part of the infamously corrupt “Ohio Gang.” Two members of the cabinet under Harding and Coolidge, Secretary of State Charles Evans Hughes and Secretary of Commerce Herbert Hoover, were wary of Daugherty, and eventually Coolidge asked for his resignation after evidence emerged that Daugherty had knowledge of the infamous Teapot Dome scandal (oil lands in Wyoming were given to the Sinclair Oil Company illegally by Secretary of the Interior Albert Fall). Hints of the Teapot Dome and other scandals began to emerge in the last days of Harding’s presidency before his sudden death in August 1923. Daugherty was indicted in 1926 and tried twice, but deadlocked juries led to the dismissal of the charges. Nevertheless, Daugherty was left under a cloud of corruption that undermined the historical reputation of President Harding.

 

Four decades later, President Richard Nixon appointed his campaign manager, John N. Mitchell, as Attorney General, a position he held from January 1969 to March 1972. Mitchell was regarded as one of the closest advisers to Nixon and was infamous, like his president, for his support of “law and order.” Ironically, Mitchell didn’t always follow the letter of the law. He was not vetted by FBI Director J. Edgar Hoover (President Nixon requested he not be), advocated the use of wiretaps in national security cases without obtaining a court order, promoted preventive detention of criminal suspects although it potentially violated the Constitution, and did not properly enforce court-ordered mandates for desegregation. Most famously, the Watergate tapes proved he helped plan the break-in at the Democratic National Committee headquarters and was deeply involved in the cover-up that followed. Even after he left the Justice Department and became the head of the Committee to Reelect the President, he threatened Watergate journalist Carl Bernstein and Washington Post publisher Katharine Graham. Mitchell was indicted and convicted of conspiracy, obstruction of justice, and perjury. He spent 19 months in prison and lost his law license for his illegal and unethical actions.

 

Nearly two decades later, President George H. W. Bush appointed William Barr as Attorney General and Barr served from November 1991 to January 1993. In his first round as head of the Justice Department, Barr faced criticism after he encouraged the President to pardon former Secretary of Defense Caspar Weinberger, who served under President Ronald Reagan from January 1981 to November 1987. In the aftermath of the Iran Contra Affair, Weinberger faced indictment and trial on charges of perjury and obstruction of justice. 

 

After the Presidential Election of 2000, President George W. Bush selected former Senator John Ashcroft of Missouri as his first Attorney General, serving from February 2001 to February 2005. Ashcroft endorsed the use of torture, including in the Abu Ghraib abuse scandal in Iraq in 2004. Further, he endorsed unregulated surveillance by the Foreign Intelligence Surveillance Court as well as FBI surveillance of libraries and retail sales to track suspects’ reading habits. Critics of the Patriot Act and the post-September 11th policies of the Bush Administration argue this was a massive privacy violation. With his reputation undermined, Ashcroft decided to leave his position after Bush won a second term in 2004.

 

Ashcroft was replaced by Alberto Gonzales, who served from February 2005 to September 2007. Previously, Gonzales served as White House Counsel from January 2001 to February 2005 and was Bush’s General Counsel during his Texas governorship from 1995 to 2001. Gonzales’s tenure as Attorney General was highly controversial as he endorsed warrantless surveillance of US citizens and gave legal authorization for “enhanced interrogation techniques,” later generally acknowledged as torture. He also presided over the firing of nine US Attorneys who refused to adhere to back-channel White House directives to prosecute political enemies. Further, he authorized the use of military tribunals and the denial of the writ of habeas corpus to detainees at the Guantanamo Bay Naval Base in Cuba. Eventually, he resigned while under fire for abusing and politicizing his office.

 

Corruption and abuse by the Attorney General have continued under President Donald Trump: first with Jeff Sessions, then with Matthew Whitaker as his replacement as Acting Attorney General, and finally with the recent return of William Barr to the office.

 

Sessions, who had been an Alabama Republican Senator since 1997, almost immediately sparked controversy after news broke that he misled the Senate on his contacts with Russian officials during the 2016 Presidential campaign. Sessions therefore recused himself from the investigation into Russian collusion during the campaign. Trump was immediately displeased and pressure slowly mounted from within the administration for Sessions to resign.

 

But when Sessions left the administration in fall 2018, his replacement, Acting Attorney General Matthew Whitaker, who served from November 2018 to February 2019, was installed in a way that circumvented the normal Senate confirmation process and caused an uproar. This led to numerous legal challenges to his claim that he could supervise the Mueller investigation.

 

The brief and controversial tenure of Whitaker ended in February 2019 with the appointment of William Barr, who became Attorney General for the second time, a quarter century after serving in the post under George H.W. Bush. Today, Barr is even more controversial as he has enunciated his vision of unitary executive authority, adding to Donald Trump’s belief that his powers as President are unlimited. Barr has refused to hand over the entire Mueller Report to committees in the House of Representatives, has refused to testify before the House Judiciary Committee, and was recently held in criminal contempt for refusing to share information about Trump Administration attempts to add a citizenship question to the upcoming 2020 Census.

 

So over the past century, the Attorneys General who served under Republicans Warren G. Harding, Richard Nixon, George H. W. Bush, George W. Bush and Donald Trump have undermined public faith in the Justice Department and its reputation as a fair-minded cabinet office intent on enforcing fair, just policies. The century began, however, with a horrible tenure under Democrat Woodrow Wilson during his incapacity, when A. Mitchell Palmer set a terrible standard followed by seven of his successors in the Justice Department. Regaining confidence in the agency and the holder of the office of Attorney General will require a change in the Presidency, clear and simple.

Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172654 https://historynewsnetwork.org/article/172654 0
Would Slavery Have Ended Sooner if the British Won the American Revolutionary War?

John Singleton Copley's painting The Death of Major Peirson depicts Black Loyalist soldiers fighting alongside British regulars

 

 

"I would never have drawn my sword in the cause of America, if I could have  conceived that thereby I was founding a land of slavery."

-Marquis de Lafayette, French military leader who was instrumental in enlisting French support for the colonists  in the American War of Independence 

 

 

Historians and the American public have long grappled with the contradiction that the Revolutionary War was waged under the banner "all men are created equal" yet was largely led by slave owners. 

 

The July 4th, 1776 Declaration of Independence (DI) was in itself a revolutionary document. Never before in history had people asserted the right of revolution not just to overthrow a specific government that no longer met the needs of the people, but as a general principle for the relationship between the rulers and the ruled: 

 

"We hold  these truths to be self-evident, that all men are created equal, that they are endowed by  their creator with certain unalienable rights, that among these are Life, Liberty, and the pursuit of happiness. That to secure these rights, governments are instituted among  Men, deriving their just powers from the consent of the governed.--That whenever any Form  of Government becomes destructive of these ends, it is the Right of the People to alter or abolish it, and to institute new Government..."   

 

And yes, "all men are created equal" excluded women, black people and the indigenous populations of the continent. Yes, it was written by slave-owner Thomas Jefferson with all his personal hypocrisies. Yes, once free of England, the U.S. grew over the next 89 years to be the largest slave-owning republic in  history. 

 

Americans are taught to see the birth of our country as a gift to the world, even when its original defects are acknowledged. The DI along with the Constitution are pillars of American exceptionalism--the belief that the U.S. is superior and unique from all others,  holding the promise of an "Asylum for the persecuted lovers of civil and religious  liberty" in the words of Thomas Paine in Common Sense. 

 

Indeed, the powerful words of the Declaration of Independence have been used many times since the Revolutionary War to challenge racism and other forms of domination and inequality. Both the 1789 French Revolution and the 1804 Haitian revolution--the only successful slave revolt in  human history--drew inspiration from this clarion call. In 1829 black abolitionist David  Walker threw the words of the DI back in the face of the slave republic: "See your declarations Americans!!! Do you understand your own language?" The 1848 Seneca Falls women's rights convention issued a Declaration of Sentiments proclaiming that  "We hold these truths to be self evident that all men and women are created  equal." Vietnam used these very words in declaring independence from France in 1946. And as Martin Luther King, Jr. stated in his 1963 “I Have a Dream” Speech, the Declaration was "A promise that all men, yes, black men as well as white men, would be guaranteed the unalienable rights of life, liberty, and the pursuit of happiness."  

 

Historian Gary Nash, among others, has strongly argued against viewing history as inevitable. He argues this short-circuits any consideration of the fact that every historical moment could have happened differently. For instance, in his book “The Forgotten Fifth,” Nash argues that if Washington and Jefferson had been faithful to their anti-slavery rhetoric and chosen to lead a fight against slavery during the American Revolution, there was a good chance they could have succeeded.

 

Perhaps a different question might be asked: what if the British had won, had defeated the colonists' bid to break from the mother country? Is it possible that the cause of freedom  and the ideals of the DI would have been paradoxically better served by that outcome?  

 

England's Victory Over France Leads to the American War For Independence  

 

It was, ironically, England's victory over France for control of the North American continent in the Seven Years' War (1756-1763) that laid the basis for its North American colonies to revolt just 13 years later. As the war with France ended, the British Proclamation of 1763 prohibited white settlement west of the Appalachian mountains in an attempt at detente with Native Americans -- bringing England into conflict with colonists wanting to expand westward. More serious still was the series of taxes England imposed on the colonies to pay off its large war debt: the 1765 Stamp Act, the 1767-1770 Townshend Acts, and the 1773 Tea Act, among others. As colonial leaders mounted increasingly militant resistance to these measures, so too did British repression ramp up.

 

While "No taxation without representation" and opposition to British  tyranny are the two most commonly cited causes propelling the colonists' drive for independence, recent scholarship (Slave Nation by Ruth and Alfred Blumrosen, Gerald  Horne's The Counter-Revolution of 1776, and Alan Gilbert's Black Patriots and Loyalists in particular) has revealed a heretofore unacknowledged third major motivating force: the preservation and protection of slavery itself. In 1772, the highest British court ruled in the Somerset decision that slave owners had no legal claims to ownership of  other humans in England itself, declaring slavery to be "odious". Somerset  eliminated any possibility of a de jure defense of slavery in England, further reinforced  at the time by Parliament refusing a request by British slave owners to pass such a law. While Somerset did not apply to England's colonies, it was taken by southern colonists as  a potential threatto their ability to own slaves. Their fear was further reinforced by the 1766  Declaratory Act, which made explicit England's final say over any laws made in the  colonies, and the "Repugnancy" clause in each colony's charter. Somerset added fuel to the growing fires uniting the colonies against England in a fight for  independence.  

 

"Seeing the Revolutionary War through the eyes of enslaved blacks turns its meaning  upside down"  Simon Schama, Rough Crossings   

 

Among the list of grievances in the DI is a rarely scrutinized statement: "He [referring to the king] has excited domestic insurrections amongst us." This grievance was motivated by Virginia Royal Governor Lord Dunmore's November 1775 proclamation stating that any person held as a slave by a colonist in rebellion against England would become free by joining the British forces in subduing the revolt. While 5000 black Americans, mostly free, from northern colonies joined with the colonists' fight for independence, few of our school books teach that tens of thousands more enslaved black people joined with the British, with an even greater number taking advantage of the war to escape the colonies altogether by running to Canada or Florida. They saw they had a better shot at "Life, liberty and the pursuit of happiness" with the British than with their colonial slave masters. To further put these numbers in perspective, the total population of the 13 colonies at the time was 2.5 million, of whom 500,000 were slaves and indentured servants. While there is some debate about the exact numbers, Peter Kolchin in American Slavery points to the "Sharp decline between 1770 and 1790 in the proportion of the population made up of blacks (almost all of whom were slaves) from 60.5% to 43.8% in South Carolina and from 45.2% to 36.1% in Georgia" (73). Other commonly cited figures from historians estimate 25,000 slaves escaped from South Carolina, 30,000 from Virginia, and 5,000 from Georgia. Gilbert in Black Patriots and Loyalists says "Estimates range between twenty thousand and one hundred thousand... if one adds in the thousands of not yet organized blacks who trailed... the major British forces... the number takes on dimensions accurately called 'gigantic'" (xii). Among them were 30 of Thomas Jefferson's slaves and 20 of George Washington's; good ole "Give me liberty or give me death" Patrick Henry also lost his slave Ralph Henry to the Brits. It was the first mass emancipation in American history. Evidently "domestic insurrection" was legitimate when led by slave owners against England but not when enslaved people rose up for their freedom--against the rebelling slave owners!

 

Before There Was Harriet Tubman There was Colonel Tye  

 

Crispus Attucks is often hailed as the first martyr of the American revolution, a free  black man killed defying British authority in the 1770 Boston Massacre. But few have  heard of Titus, who just 5 years later was among those thousands of slaves who escaped to the British lines. He became known as Colonel Tye for his military prowess in leading black and white guerrilla fighters in numerous raids throughout Monmouth County, New Jersey, taking reprisals against slave owners, freeing their slaves, destroying their weaponry and creating an atmosphere of fear among the rebel colonists--and hope among  their slaves. Other black regiments under the British fought with ribbons emblazoned  across their chests saying "Liberty to Slaves".  One might compare Col. Tye to Attucks but if Attucks is a hero, what does that make Tye,  who freed hundreds of slaves? Perhaps a more apt comparison is with Harriet Tubman, who escaped slavery in 1849 and returned to the south numerous times to also free hundreds of her brothers and sisters held in bondage.  

 

So what if the British had won?  

 

At no point, however, did the British declare the end of slavery as a goal of the war; it was always just a military tactic. But if the Brits had won, as they came close to doing, it might have set off a series of events that went well beyond their control. Would England have been able to restore slavery in the 13 colonies in the face of certain anti-slavery resistance by the tens of thousands of now free ex-slaves, joined by growing anti-slavery forces in the northern colonies? As Gilbert puts it, "Class and race forged ties of solidarity in opposition to both the slave holders and the colonial elites." (10) Another sure ally would have been the abolitionist movement in England, which had been further emboldened by the 1772 Somerset decision. And if England had to abolish slavery in the 13 colonies, would that not have led to a wave of emancipations throughout the Caribbean and Latin America? And just what was the cost of the victorious independence struggle to the black population? To the indigenous populations who were described in that same DI grievance as "The merciless Indian Savages"? Might it have been better for the cause of freedom if the colonists had lost? And if they had, wouldn't the ideals of the DI have carried just as much if not more weight?

 

"The price of freedom from England was bondage for African slaves in America.  America would be a slave nation." Eleanor Holmes Norton, introduction to Slave Nation  

 

We do know, however, the cost of the colonists' victory: once independence was won, while the northern states gradually abolished slavery, slavery BOOMED in the south. The first federal census in 1790 counted 700,000 slaves. By 1810, 2 years after the end of the slave trade, there were 1.2 million enslaved people, a 70% increase. England ended slavery in all its colonies in 1833, when there were 2 million enslaved people in the U.S. Slavery in the U.S. continued for another 33 years, during which time the slave population doubled to 4 million human beings. The U.S. abolished slavery in 1865; only Cuba and Brazil ended slavery at a later date. The foregoing is not meant to romanticize and project England as some kind of abolitionist savior had it kept control of the colonies. Dunmore himself was a slave owner. England was the center of the international slave trade. Despite losing the 13 colonies, England maintained its position as the most powerful and rapacious empire in the world till the mid-20th century. As England did away with chattel slavery, it replaced it with the capitalist wage slavery of the industrial revolution. It used food as a weapon to starve the Irish, and conquered and colonized large swaths of Asia, Africa and the Pacific.

 

Historian Gerald Horne wrote that "Simply because Euro-American colonists prevailed in their establishing of the U.S., it should not be assumed that this result was inevitable. History points to other possibilities...I do not view the creation of the republic as a great leap forward for humanity" (Counter-Revolution of 1776, ix). The American revolution was not just a war for independence from England. It was also a battle for freedom against the very leaders of that rebellion by hundreds of thousands of enslaved black people, a class struggle of poor white tenant farmers in many cases also against that same white colonial elite, and a fight for survival of the indigenous populations. But the colonists' unlikely victory led to the creation of the largest slave nation in history, the near genocide of the indigenous populations, and a continent-wide expansion gained by invading and taking over half of Mexico. The U.S. went on to become an empire unparalleled in history, the origins of its wealth rooted largely in slave labor.

 

The struggles for equality and justice for all that the Declaration of Independence promised continue, of course, but Martin Luther King's promissory note remains unfulfilled. The late Chinese Premier Chou En-lai was once asked for his assessment of whether the French Revolution was a step forward in history. His response was, "It's too soon to tell." Was the founding of the United States a step forward in history? Or is it still too soon to tell?

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172653 https://historynewsnetwork.org/article/172653 0
Reconsidering Journalist and Gay Activist Randy Shilts

 

Twenty-five years ago, Randy Shilts went to his death a conflicted man. He talked openly about the frustration of his life being finished without being complete. Beyond the obvious pain, panic or dread he must have felt as he succumbed to AIDS-related health complications, he knew there was so much more to do, but he was not going to be around to do any of it. 

 

In many senses, he had never personally resolved the dissonance that seemed ever beneath the surface about just what his role was – or who he was supposed to be. He publicly clung to claims of being an objective journalist – “What I have to offer the world is facts and information,” he told Steve Kroft and the nation during one of his last interviews (aired on the CBS franchise 60 Minutes the week he died in February 1994). A closer examination of his life and work reveals, however, that he sought to impart more than facts and information as an objective journalist; he sought to fully inhabit the role that Walter Lippmann long ago described as the “place” for many journalists. He fully embraced the idea that our society and culture were in need of the interpretation a trained journalist (and critical thinker) could provide – and, moreover, that the light of the media needed to be shone in the right places, setting the agenda of what was important and what was not. 

 

Shilts’s light-shining is what won him fierce critics, particularly as he sought to transform himself from just a “gay journalist” into a “journalist who happens to be gay.” As one of the nation’s very first openly gay journalists at a major daily newspaper, The San Francisco Chronicle, he sought to distinguish himself not only as a legitimate journalist, but also as a representative of the welcomed homosexual in a vehemently heterocentric world. Doing so meant he quickly personified the clichéd role of the journalist more skilled at making enemies than friends, more concerned with making a point. For Shilts, that is what journalism was for: to move society and the people in it to a new place, to a new understanding of and relationship with one another. Facts and information could do that, but Shilts understood how to focus those facts, right down to when a story appeared; he openly acknowledged that he timed stories raising the frightening prospect of HIV and AIDS for Friday editions of The Chronicle to put some fear into his fellow gay tribe members as a weekend free for partying approached.

Writing about Shilts, someone I have grown to admire and love during more than nine years of research, required, however, more than just a tribute or formal canonization. After only a brief period of probing, one quickly finds that the legacy of Shilts remains a mixed one, depending upon whom one encounters. Many gay rights advocates laud his promotion of gay martyr Harvey Milk via Shilts’s first book, “The Mayor of Castro Street,” or his focus on the battle for lesbian and gay U.S. service members to serve openly in his last tome, “Conduct Unbecoming.” Others who fought the battles of the deadly HIV-AIDS pandemic that darkened America and the rest of the world reject praise and offer vocal and pointed criticism of Shilts as a heartless “Uncle Tom” of the gay liberation movement. These latter feelings flow directly from the praised – and scorned – work that won Shilts his most fame, “And the Band Played On,” and the portions of it resting on the idea of “Patient Zero” responsible for “bringing AIDS to America,” as the editors of The New York Post opined in 1987.  

My research attempts to bridge these two shores, unearthing the incredible drive and determination that carried this outspoken gay man from small-town Illinois to his place as an early and important barrier breaker, an openly gay reporter at a major daily newspaper. Shilts died at the age of 42, a full 22 years before scientists would establish that his “Patient Zero” was clearly not the man who brought AIDS to America; any posthumous review of Shilts seems incomplete and unfair if it does not take in all aspects of the journalist, author and man. There is value in taking up the lingering issue of how to place Shilts as a journalist and early gay liberation leader, and in bringing at least some resolution to the remaining conflict. We can do so without relieving Shilts of having to own his own words and actions, all the while placing them in the context of his entire life. 

The story of Shilts and the mixed legacy that remains a quarter century later has connecting points to our contemporary lives – where we seek a fuller understanding of historical people and times, but seek to gain it in the correct context. Shilts’s story remains incomplete if we do not take fuller consideration of his successes alongside the problems centered primarily on his construction of the “Patient Zero” myth. Robbed of the living that could have resolved the dangling irritants in his story, we’re left with the same conflict Shilts felt as his life wound down. Similarly, we must await the end of many stories playing out around us and perhaps find some resolution in the context time affords. 

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172616 https://historynewsnetwork.org/article/172616 0
Is Trump the Worst President in History?

 

As the chance of getting rid of Donald Trump — through impeachment or by voting him out — continues to dominate the headlines, the historical challenge is compelling.  No president has been a greater threat to the qualities that make the United States of America worthy (at its best) of our allegiance.

 

The rise of Trump and his movement was so freakish that historians will analyze its nature for a long time.  From his origins as a real estate hustler, this exhibitionist sought attention as a TV vulgarian.  Susceptible television viewers found his coarse behavior amusing. Then he announced that he was running for the presidency and it looked for a while like just another cheap publicity stunt.

 

But his name-calling tactics struck a chord with a certain group of voters.   Our American scene began to darken.  Before long, he was hurling such vicious abuse that it ushered in a politics of rage. As his egomania developed into full megalomania, the “alt-right” gravitated toward him.

 

The “movement” had started.

 

More and more, to the horror of everyone with the power to see and understand, he showed a proto-fascist mentality. So alarms began to spread: mental health professionals warned that he exemplifies “malignant narcissism.”

 

Never before in American history has the presidential office passed into the hands of a seditionist. And the use of this term is appropriate. With no conception of principles or limits — “I want” is his political creed — he mocks the rule of law at every turn.

 

At a police convention in 2017, he urged the officers in attendance to ignore their own regulations and brutalize the people they arrest.  He pardoned ex-Arizona sheriff Joe Arpaio, who was convicted of criminal contempt of court.  He appointed Scott Pruitt to head the EPA so he could wreck the agency and let polluters have the spree of their lives.

 

Trump is fascinated by powerful dictators with little regard for human rights or democracy. He compliments Vladimir Putin and hopes to invite that murderer to stay in the White House. He likes Rodrigo Duterte of the Philippines, a tyrant who subverts that nation’s democracy.

 

So, Trump certainly has the personality of a fascist.  But he is not quite as dangerous as other authoritarians in history.

 

In the first place, he lacks the fanatical vision that drove the great tyrants like Hitler and Stalin to pursue their sick versions of utopia.  He is nothing but a grubby opportunist.  He has no ideas, only appetites.   The themes that pass for ideas in the mind of Donald Trump begin as prompts that are fed to him by others — Stephen Miller, Sean Hannity, and (once upon a time) Steve Bannon. To be sure, he would fit right in among the despots who tyrannize banana-republics.  But that sort of a political outcome in America is hard to envision at the moment. 

 

Second, American traditions — though our current crisis shows some very deep flaws in our constitutional system — are strong enough to place a limit on the damage Trump can do.  If he ordered troops to occupy the Capitol, disperse the members of Congress, and impose martial law, the chance that commanders or troops would carry out such orders is nil.

 

Third, Americans have faced challenges before. Many say he is our very worst president — bar none.  And how tempting it is to agree.  But a short while ago, people said the same thing about George W. Bush, who of course looks exemplary now when compared to our presidential incumbent.

 

The “worst president.”

 

“Worst,” of course, is a value judgment that is totally dependent on our standards for determining “badness.”  And any number of our presidents were very bad indeed — or so it could be argued.

 

Take Andrew Jackson, with his belligerence, his simple-mindedness, his racism as reflected in the Indian Removal Act of 1830.  Take all the pro-slavery presidents before the Civil War who tried to make the enslavement of American blacks perpetual:  John Tyler, Franklin Pierce, James Buchanan. Take James K. Polk and his squalid war of aggression against Mexico.  Take Andrew Johnson, who did everything he could to ruin the lives of the newly-freed blacks after Lincoln’s murder.

 

The list could go on indefinitely, depending on our individual standards for identifying “badness.”  Shall we continue?  Consider Ulysses S. Grant and Warren G. Harding, so clueless in regard to the comparatively easy challenge of preventing corruption among their associates.  Or consider Grover Cleveland and Herbert Hoover, who blinded themselves to the desperation of millions in economic depressions.  And Richard Nixon, the only president to date who has resigned the office in disgrace.

 

Which brings us to Trump.

 

However incompetent or even malevolent some previous American presidents were, this one is unique. The Trump presidency is a singular aberration, a defacement of norms and ideals without precedent. However bad some other presidents were, all of them felt a certain basic obligation to maintain at least a semblance of dignity and propriety in their actions.

 

Not Trump.

 

Foul beyond words, he lurches from one brutal whim to another, seeking gratification in his never-ending quest to humiliate others. He spews insults in every direction all day.  He makes fun of the handicapped.  He discredits journalists in order to boost the credibility of crackpots and psychopathic bigots.  He accuses reporters of creating “fake news” so he can generate fake news himself: spew a daily torrent of hallucinatory lies to his gullible followers.

 

He amuses himself — with the help of his money and the shyster lawyers that it pays for — in getting away with a lifetime’s worth of compulsive frauds that might very well lead to prosecutions (later) if the evidence has not been destroyed and if the statute of limitations has not expired.

 

So far, however, he is always too brazen to get what he deserves, too slippery for anyone to foil.  

 

Anyone with half an ounce of decency can see this wretched man for what he is. They know what’s going on, and yet there’s nothing they can do to make it stop. And that adds to Trump’s dirty satisfaction. Any chance to out-maneuver the decent — to infuriate them — quickens his glee. It makes his victory all the more rotten, incites him to keep on taunting his victims.  

 

It’s all a big joke to Donald Trump, and he can never, ever, get enough of it. 

 

The question must be asked:  when in our lifetimes — when in all the years that our once-inspiring Republic has existed — have American institutions been subjected to such treatment?  How long can American morale and cohesion survive this?

 

Nancy Pelosi has said that in preference to seeing Trump impeached, she would like to see him in jail.  Current Justice Department policy — which forbids the indictment of presidents — makes it possible for Trump to break our nation’s laws with impunity.  Impeachment is useless if the Senate’s Republicans, united in their ruthlessness and denial, take the coward’s way out.

 

So the prospect of locking him up may have to wait. But the day of reckoning for this fake — this imposter who will never have a glimmer of a clue as to how to measure up to his office — may come in due time. Then the presidential fake who accuses his victims of fakery will live with some things that are real: stone walls, iron bars, a nice prison haircut, and the consequences of his actions.

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172612 https://historynewsnetwork.org/article/172612 0
(Re)-Claiming a Radical Evangelical Heritage at Christian Colleges

Wheaton College

 

The recent controversies at my alma mater, Taylor University, over the invitation to Vice President Mike Pence to deliver its Commencement Address have illuminated the fissures that have continued to open over the last decade at evangelical Christian colleges like Wheaton College (issues of race and gender), Gordon College (issues of sexual identity and gender discrimination), and Azusa Pacific (same-sex dating). Some Christian colleges, like Eastern Mennonite and Goshen College, have chosen to voluntarily disassociate themselves from the Council of Christian Colleges and Universities (CCCU)—the touchstone organization for evangelical colleges and universities in North America—because the CCCU opposed their internal decisions to include homosexuals in undergraduate admissions and faculty and administrative hires. 

 

At Taylor, the decision of university president Lowell Haines (ironically, a well-known long-haired folk and rock singer at TU in the Seventies) to invite the Vice President to graduation exposed the nascent political and social divisions among faculty, students and alumni that all previous TU Presidents had managed to avoid by their collective decisions not to bring divisive political figures to campus. As a direct (or indirect) result of crossing these long-held political boundaries, the relatively new President decided to resign just weeks after Commencement, while the University was in the midst of a new strategic plan with an ensuing Capital Campaign. It was clear that the President was "caught off guard" by the vast range of evangelical thought among his constituencies.

 

It would be easy to assume that the recent strains among evangelical higher education institutions are just the result of Protestant divisions that are ubiquitous throughout history. However, the history of American evangelicalism suggests something completely different: evangelical colleges are returning to a uniquely radical American nineteenth century evangelicalism. This was especially the case with Christian colleges with no denominational control such as Wheaton, Taylor, Gordon, etc. In the past century, Wesleyan scholar Don Dayton wrote Discovering an Evangelical Heritage, a small book about the conflicted identities of American evangelicalism that had a profound impact on Christian college campuses. Dayton chronicled the lives of nineteenth century evangelicals like Theodore Weld, Jonathan Blanchard, Charles Finney, the Tappan Brothers and other abolitionists of the antebellum period who founded Oberlin, Knox, Grinnell, Berea, Wheaton and a host of other Midwestern colleges with egalitarian visions of race and gender accompanied by a heavy critique of our political establishments. These colleges were a true reflection of a radical evangelical re-visioning of the United States through liberatory practices towards women and African Americans in their teachings, theologies and political and social activism. Understanding the history of these colleges led to a more complex understanding of American evangelicalism that includes a totally different brand of Christianity than twentieth century fundamentalism with its rigid social rules and literal interpretations of Scripture.

 

The most radical and renowned evangelical leader of that period was John Brown who was “…The only white man worthy of a membership in the NAACP” according to W.E.B. DuBois. As most students of American history know, Brown and his family considered themselves part of a “martyr-age” of evangelicals who would live and die for the emancipation of enslaved African Americans. Brown’s motivation was based on Scripture, especially the Old Testament that proclaimed: “God hath made one blood of all nations”.

 

The antebellum evangelical movement and Fundamentalists disagreed over issues like the role of women in the church, racial segregation, creationism and science, the definitive role of Scripture in faith and practice, and other theological issues that were informed by a premillennial view of the End Times. Thus, fundamentalist and evangelical churches both quietly (and sometimes openly) argued with each other until the issues were settled in civil courts, legislation, and within other societal structures.

 

After the Civil War and moving toward the twentieth century, this post-millennial evangelicalism faded in the aftermath of World War I. After witnessing the devastation of the war, few theologians believed that the world would continue to improve and thus usher in the Second Coming of Christ. The Fundamentalist movement was birthed from the Progressive Era as a counter to the secular teachings and influence of Marx, Freud, Darwin and the higher criticism of Scripture taught on college campuses and infiltrating main-line Protestant denominations.

 

Thus, many former evangelical colleges like Wheaton (and others) pulled inward and did not engage a wider collegiate academic culture that was perceived to be hopelessly secularized. Many Anabaptist institutions were the exception, which makes it easier to understand why Eastern Mennonite and Goshen, given the nature of their oppositional cultures, have departed from fundamentalist teachings on race, class and gender.

 

Other-worldly fundamentalism held its sway in interdenominational collegiate and church institutions until a small group of postwar evangelicals (such as John Hoekenga) began a new movement in 1947. Inspired by Carl Henry's National Evangelical Association, this group of influential and intellectual evangelicals attempted to consolidate and organize the more moderate wing of the evangelical movement.

 

After the relatively calm period of the post-war Eisenhower administration (and the election of our first Catholic President, John Kennedy), the turbulent Sixties, with the Vietnam War, Vatican II, the women's movement and the Civil Rights Movement, spotlighted social issues that the Church had not effectively dealt with since the nineteenth century. Taken as a whole, these issues animated evangelicals to splinter into smaller political and social interest groups represented by journals such as Sojourners, Christianity Today, Christian Herald and a slew of non-denominational publications from more fundamentalist institutions, like the Sword of the Lord out of fundamentalist Tennessee Temple University. Right-wing preacher and broadcaster Carl McIntire from New England, University President Bob Jones in the Deep South, popular Midwestern broadcaster Dr. M.R. De Haan, and other popular radio broadcasters attempted to counter the more inclusive messages from the moderate to liberal voices of the new evangelicalism represented by Billy Graham.

 

Many historians agree that the defeat of Jimmy Carter in 1980 by Ronald Reagan signaled a significant divide between the progressive evangelicalism represented by Carter and the conservative policies and personal testimony of Reagan's born-again experience. Even though Reagan rarely attended church (and was a divorced man—anathema to conservative evangelical groups), evangelicals believed, as they would later with Donald Trump, that Reagan's policies such as "trickle-down" economics fit their newly found upper-middle-class lifestyles, which birthed both materialism and Mammon—the love of money that Charles Finney had warned evangelicals against in the previous century, predicting it would end the evangelical movement in America.

 

The Reagan administration also made evangelicals feel less guilty about their clear responsibilities to the poor (for which there are over two thousand admonitions in Scripture). Also tied to issues of class were the ubiquitous and conflicted issues of race and gender bound up with women's ordination and the racial integration of churches and colleges. These issues were resurrected by the Civil Rights Movement (which few white evangelical leaders joined); other main-line Protestant denominations, however, felt that these social issues must also be resolved within their churches. As a result, the ordination of women and the wider acceptance of African Americans in positions of authority and empowerment moved slowly along, leaving evangelical Christian colleges in a modern-day quandary and debate.

 

Issues of race, class and gender, joined by the growing acceptance of the LGBT community within society and main-line churches, (like the Episcopal Church of the United States and other main-line Protestant denominations) also influenced a new generation of young evangelicals. With the election of our first African American President in 2008, Barack Obama, a self-professed Christian and member of a controversial minister’s (Dr. Jeremiah Wright) church (Trinity United Church of Christ—Chicago) and a recent “convert” to endorsing gay rights, the debates on campus and in most churches began in earnest and furthered the generational divide among evangelicals.

 

Thus, the current evangelical divide has long-standing historical roots. In addition, I identify three systemic causes behind this wide chasm among evangelicals that threaten the future existence of their institutions, above and beyond growing costs and competition with state institutions.

 

1. Students at evangelical colleges no longer look solely to Scripture in order to make decisions on issues of human sexuality. While a literal interpretation of Scripture is a main tenet in fundamentalist churches (and some conservative evangelical institutions), it is no longer the only touchstone for students' moral decision making. Also, the complex world of biblical hermeneutics (and who speaks for God in a post-modern era) has made it difficult for younger evangelicals to make universal moral proclamations. In Wesleyan-centered churches (like Anglican and Methodist ones), the theological quadrilateral of Scripture, Tradition, Reason and Experience is being more widely applied to decisions on human sexuality and other moral issues. Thus, for some evangelicals, Experience (or the subjective role of the Holy Spirit) has superseded literal interpretations of Scripture in support of same-sex rights and other controversial moral issues. Further, sexual norms in general are not as salient or relevant to younger evangelicals. The former taboos of living together and pre-marital sex are just about non-existent among the young. The former teachings and admonitions of complete abstinence before marriage, or against living together without a license, never took hold with this current generation. And, for some evangelicals, same-sex relationships were part and parcel of these former prohibitions: they did not hold up to the standard of reason or to the experience of people whose lives they respected.

 

2. The Reagan Revolution generation of evangelicals is perceived by younger evangelicals to be preoccupied with health, wealth, literal interpretations of Scripture, the coming Apocalypse and individual rights. There is also a deep division over the face and nature of secular Presidential leadership. In older evangelicals' attitudes there is a deliberate divorce between Donald Trump's personal life and the policies he favors that directly benefit them. Gone are the days when "worldly" was heard as a pejorative admonition among fundagelicals; witness the obvious conspicuous consumption among mega-churches.

 

3. A current generation of evangelicals (such as Sojourners, led by Jim Wallis in D.C.) and other popular authors and speakers (Tony Campolo, Rob Bell, Matthew Vines) are harkening back to the radical evangelical heritage of the nineteenth century. They regard the current face and nature of evangelicalism, as personified by media-driven leaders like Franklin Graham, Jerry Falwell, Jr. and Jim Dobson, as culturally bound, both worldly and other-worldly, and compromised: these leaders are perceived to trade their professed values for their current material comforts, as imaged in this current Presidential administration. This distaste for America's secular leaders (along with keeping a considered distance from the current crop of evangelical leaders that they believe compromise their values for the "approval of men") echoes John Brown's contempt for compromising Christians and slaveholders during the Antebellum period of U.S. history.

 

If these trends continue, I believe that evangelical colleges will collapse from within. They are already facing a steep enrollment decline, largely because their alumni are not sending their children to their alma maters. The alumni see neither the spiritual need nor the monetary value, since there are so many "good" secular universities with active Christian groups on campus that cost a lot less. Driven by the current vocationalism and student debt, thin financial reserves are also a major threat to these institutions.

 

The ramifications of these social and cultural upheavals for their Churches are also profound. As denominational churches and the evangelical movement (as represented by mega-churches) continue to differ in their stances toward the acceptance of LGBT church members and priests, pre-marital sexual relationships, multicultural church bodies and the face and nature of American hyper-capitalism, both church and college structures will continue to divide their institutions, and the degree to which Christians can and will compromise their spiritual values against secular realities will continue to be debated.

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172651 https://historynewsnetwork.org/article/172651 0
History is a Verb, Something You Do: An Interview with Mark Doyle

 

Mark Doyle is a historian of Ireland and the British Empire and Professor in the Department of History at Middle Tennessee State University.

 

 

What books are you reading now?

 

Because it’s summer, I’m being a bit self-indulgent in my reading. I just finished Ulysses, which, to my shame as an Irish historian, I had never read before. It was a harrowing but ultimately very rewarding experience - knowing the historical background was a help, but I wouldn’t have survived without online guides like Shmoop and joyceproject.com to guide me through the murky bits. I feel nothing but admiration for people who read and appreciated it upon its initial publication in the early 1920s without the assistance of our modern Joyce Industrial Complex.

 

I always try to keep at least one novel going – not just because it makes a nice break from my academic reading, but also because it helps me be a better writer. After Ulysses I started The Dispossessed by Ursula K. Le Guin, a science-fiction novel about two worlds with contrasting social and political systems, one capitalist/individualist and the other anarchist/collectivist. I don’t read much sci-fi, but I am drawn to books that take an abstract idea or ideology and follow it to its logical conclusions. George Orwell, HG Wells, Jose Saramago, Margaret Atwood, and George Saunders are other writers in this vein – in addition, I’m sure, to many sci-fi writers whose work I’ve yet to read. I’m quite enjoying The Dispossessed and rather wish I’d read it twenty years ago, when I first began thinking about the theory and practice of these different social systems.

 

I’m also reading At Freedom’s Door by Malcolm Lyall Darling. Darling was a member of the Indian Civil Service from 1904 to 1940, mostly in Punjab, and in 1946-7 he took a tour around northern India to gauge the state of the country on the eve of independence. While it is somewhat colored by Darling’s cultural assumptions and blind spots, it’s an invaluable source about the social and economic conditions in (primarily) rural India just before Partition. I’m particularly interested in the communal relations that Darling describes and am thinking about how his experiences in northwest and northern India might compare with conditions further east. I’m supervising a PhD student who is working on partition in Bengal (northeastern India), and the nature of Hindu-Muslim relations on the ground will be a crucial component of his research.

 

Finally, I’ve just started Fossil Capital by Andreas Malm. I’ve been thinking (and fretting) quite a lot about the role of the historian in the face of catastrophic climate change, and I’m hoping this book, about the roots of the fossil fuel economy, can suggest one way forward. Sometimes our work seems so trivial in comparison with the existential threats we’re faced with – more than once I’ve considered chucking it all in and just chaining myself to a tree in the Amazon – but I also know that our work is necessary for helping humanity survive whatever lies ahead. Malm’s book seems like one promising way forward, an effort to understand how we got into our current predicament and a suggestion about how to find our way out, but there are other approaches that may be just as important. I’ll say more about this below.

 

What is your favorite history book?

 

It’s impossible to choose just one, but I think the one that has had the biggest impact on me as a scholar (and a human) is Ordinary Men by Christopher Browning. It’s a study of a Nazi police battalion who rounded up and killed Jews during the Holocaust, but it’s also much more than that, a multilayered accounting of the social, political, and cultural forces that can lead people to commit extraordinary violence against other people. Browning’s conclusion, that any group of people placed in similar circumstances would act as these German policemen did, is a masterpiece of complex historical argument that made me feel personally implicated. It forced me to ask myself whether I would have the courage to stand up for what I knew was right even when everyone else was doing wrong, whether I would be the one in a hundred who refused to shoot or purposely misfired or actually tried to intervene to save lives, and it’s a question that I still ask myself all the time. When I read the book in graduate school I was still figuring out what the purpose of history was, what role it could play in contemporary society, and here was a powerful answer that continues to resonate in much of my work. History can help us understand the structural forces that foster suspicion, prejudice, resentment, and violence, and once we understand those forces we can begin to make better choices not just about how we live our own lives, but how we order our societies.

 

Why did you choose history as your career?

 

I never really chose to become a historian. It was more an accumulation of smaller decisions that led me in this direction: the decision to add a history major to my philosophy major as an undergrad, the decision to study abroad in Dublin my junior year, the decision to apply directly to PhD programs rather than getting a Master’s first, and so on. At a certain point I was so far along the road that I was incapable of imagining what else I would do with my life, and I also found that I was pretty good at it and mostly enjoyed it, and so here I am. It sounds trite to say “I didn’t choose history. History chose me,” but I suppose it’s sort of true. At a more fundamental level, though, I suppose I gravitated toward history because I liked hearing stories about people and places beyond my own experience, and that remains my primary motivation today. 

 

What qualities do you need to be a historian?

 

Curiosity, empathy, and a commitment to evidence-based, rational argument. It helps to have a bit of imagination, too. In a way, being a historian is like being a novelist: you have to imagine your way into lives that are very different than yours in order to come up with plausible explanations for why things happen the way they do. Unlike novelists, however, we’re required to root our imaginings in the available evidence: the art of history is essentially trying to get that equation right.

 

Who was your favorite history teacher?

 

It’s difficult to name a single teacher. The best teachers I’ve had, whether in history or something else, have all been good storytellers. I’m a frequent practitioner of abstract thought and advocate for big ideas, but the most effective entry point into any topic – before you get to the abstractions and ideas – is a good story that’s capable of eliciting an emotional response. As an undergrad at Tulane University I had a professor, Sam Ramer, who would tell the most outlandish (but true!) stories about Russian history, and I can remember laughing and shaking my head in wonder that such things ever really happened. It was almost enough to get me to adopt a Russian studies major, until Dr. Ramer persuaded me that my ignorance of the Russian language and impending departure for a year in Dublin might make it hard to fulfill the requirements. Fortunately, as it turned out, outlandish true stories aren’t confined to Russian history: Dr. Ramer was just unusually good at telling them.

 

What is your most memorable or rewarding teaching experience?

 

Many of my students, particularly at the survey level, come to history with a negative preconception about the discipline. In their high school experience history was mostly about memorizing dry facts (names, dates, etc.) in preparation for a standardized test, and they often don’t think of it as a subject devoted to argumentation, one in which the questions are as important as the answers. My most rewarding teaching moments come when a student tells me – or, better, inadvertently shows me – that my class has changed the way they think about history. I don’t need to convert all of my students into history majors, but I do want all of my students to develop certain habits of mind that they can apply to all realms of their lives: critically assessing information, considering multiple points of view, grasping the provisional nature of historical (and many other kinds of) knowledge, finding ways to articulate their ideas with clarity and precision. These habits might show up in their coursework, but I also see it when a student has a revelation in class (one student realized, in the midst of a discussion about 20th-century fundamentalism, that she had been raised by fundamentalists), when students cluster in the hallway to further debate something we were talking about in class, or when a previously reticent student begins to find her voice. These are the moments that make the job worthwhile.

 

What are your hopes for history as a discipline?

 

As I hinted earlier, I think historians have an important role to play in confronting the various crises we face at the moment. The chief crisis is climate change, and so we obviously need to be doing lots of environmental and climate-related history, but this is a problem whose impacts go well beyond weather, ecology, or the natural world. Migration (and attendant racism and xenophobia), resource scarcity, traumatic economic restructuring, public health crises, civil wars, interstate violence – the knock-on social effects of climate change will be massive, and many are already getting underway. All of these processes have their own history, and my hope is that historians will use their expertise in these matters to help our societies respond in humane, nuanced, and evidence-based ways to the crises that are coming.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I don’t deliberately collect old books or objects, but in this job it’s hard to avoid accumulating large quantities of both. Most of my recent acquisitions relate to a book I’ve just written about the English rock band the Kinks. Without really intending to, I’ve ended up with quite a few magazine clippings, records, and ephemera related to the band. My favorite is a small doll of Ray Davies, the band’s lead singer and songwriter, that my wife gave me a couple of years ago. He’s smirking at me from a bookcase as I type this, in fact, probably wondering why I haven’t corrected the page proofs yet.

 

What have you found most rewarding and most frustrating about your career? 

 

The most rewarding aspects are related to teaching, although the opportunities for travel have also been invaluable. The most frustrating aspects are the things that most humanities academics complain about, I suppose: a devaluing of our work in the public discourse, lack of government support for our work, the casualization of the professoriate and disappearance of good tenure-track jobs, expanding administrative duties that keep us from performing our core functions. On the whole, however, I feel tremendously fortunate to be in a profession that allows me to indulge my curiosity and share my enthusiasms with captive audiences.

 

How has the study of history changed in the course of your career?

 

I think we’ve become much more aware of how our work impacts the public. For all sorts of reasons – economic, political, demographic – we’re under growing pressure to justify our existence, and this has given rise to more public history programs, more outreach via websites and social media, more interventions in political debates, more efforts to communicate our research to people beyond the academy. On the whole I think this is a good thing, although too much emphasis on the impact or utility of historical scholarship can leave little space for the exploration of esoteric topics for their own sake, and I would like there to be continued space for that. I don’t want historians to be judged simply by their “outputs” or history departments to be valued simply for their ability to get their students jobs, and this tends to be the default position of university administrators and legislatures. The challenge is to define for ourselves the value of our discipline and then communicate that to the wider public, and on the whole I think we’re better at that now than when I started down this path twenty years ago.

 

What is your favorite history-related saying? Have you come up with your own?

 

I’ll go with Marx: “Men make their own history, but they do not make it as they please; they do not make it under self-selected circumstances, but under circumstances existing already, given and transmitted from the past.”

 

In my research methods class I tell my students to think of history not as something that you learn but as something that you do. That section of the class is called “History is a verb,” so I’ll claim that as my own history aphorism.

 

What are you doing next?

 

This summer I’m writing an article about several tours of Ireland by the African-American choral group the Fisk Jubilee Singers in the 1870s. This is an offshoot of a larger project, which may become a book or may become something else, about African and Asian migrants/immigrants to Ireland in the nineteenth century. They were there, but very few historians have thought to look for them. As Ireland becomes ever more multicultural, it’s important to know more about the history of migration into the country, particularly by people of color, and of the ways mainstream Irish society regarded outsiders in their midst.

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172609 https://historynewsnetwork.org/article/172609 0
Joe Biden is a Product of His Time

 

Democratic frontrunner Joe Biden recently apologized for his characterization of the segregationists he worked with as a U.S. senator: “At least there was some civility. We got things done. We didn’t agree on much of anything,” he said. This episode illustrates a troubling phenomenon of our current political culture and to a certain extent historical discourse. As American attitudes toward gender, race, and class have evolved, scholars and the public have a tendency to criticize the practices and beliefs of their elders. 

 

For example, some contemporary historians have been critical of the Founding Fathers, men who either owned slaves or agreed to include slavery as part of the new American nation. Particular opprobrium has been aimed at Thomas Jefferson, a man who condemned slavery, yet owned slaves. Furthermore, Jefferson claimed that the black race was intellectually inferior, yet penned the words “All men are created equal.”

 

The Founders are not the only dead white males whose behavior does not pass muster with today’s historians. Abraham Lincoln also has come under fire from a group of historians who challenge the conventional view of him as “the Great Emancipator.” The most outspoken of these voices was the late historian Lerone Bennett Jr., who wrote, “The real Lincoln... was a conservative politician who said repeatedly that he believed in white supremacy.” In his book Forced into Glory: Abraham Lincoln’s White Dream, Bennett writes that the entire concept of emancipation was antithetical to Lincoln, who reluctantly issued the Emancipation Proclamation only as a military tactic. This reading conveniently ignores Lincoln’s public statements disavowing slavery and his efforts to pass the Thirteenth Amendment. 

 

How should we assess this phenomenon of a later generation’s unfavorable view of their political forebears’ attitudes? In the case of the Founding Fathers, clearly they either participated in or tolerated slavery. However, they were ahead of their time in creating a government based on the sovereignty of the people. That was their goal, and they would not let differences over slavery prevent them from achieving it. 

 

Lincoln, too, was a product of his time. In his day, all but a tiny minority of abolitionists believed in racial inequality. Most whites in the North, like Lincoln, opposed slavery but had serious misgivings about racial equality and the mixing of the races. They were not willing to fight to end slavery in the South or the border states. But when Lincoln was faced with a choice between the destruction of the Union and the abolition of slavery, he chose the latter. His desire to save America was greater than the racial prejudice of his day. The abolitionists William Lloyd Garrison and Frederick Douglass both believed Lincoln had evolved.

 

Biden also is a product of his time. An incident in Biden’s first year in the Senate is instructive on this point. In 1973, Senator Jesse Helms (R-NC), also a freshman, was speaking on the Senate floor. Biden, who disagreed with Helms’ position on civil rights and his racial attitudes, went to Senate Majority Leader Mike Mansfield to express his disgust. Mansfield advised the young Biden to find something good in every senator so that he could work with all of them to accomplish things. Biden took that advice. When he came to the Senate, many of the most powerful figures were Southern committee chairmen with segregationist pasts. But Biden learned to work with them, as did another young man who had come to the Senate ten years earlier, Ted Kennedy. As for Biden’s opposition to court-ordered busing, a poll taken in 1973 indicated that only nine percent of black parents wanted their children bused away from their neighborhood schools. By the end of the 1970s busing had disappeared as a divisive issue.

 

When assessing past attitudes, it is important to recognize that most people are influenced by the thinking of the day. Even those figures who were ahead of their time, as the Founding Fathers and Lincoln were, are products of their environments. Moreover, politics in any era is complicated, and compromises are sometimes necessary. The Founding Fathers had a country to create. Abraham Lincoln had a country to preserve. Joe Biden, who believes in racial equality, wanted to create equality of opportunity and promote the progressive agenda. To achieve those ends, all these figures had to work with people whose views were antithetical to theirs. Their willingness to compromise when necessary in the service of a higher aim was not weakness. It was practical statesmanship.

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172655 https://historynewsnetwork.org/article/172655 0
Another Kind of Patriotism Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

I went to a patriotic rally on Sunday. There was a lot of talk about flags, which were shown with great reverence. Military veterans were honored as heroes, due great respect. It was colorful and loud.

 

The rally had nothing to do with Trump. The event was a traditional Ojibwe, or Anishinaabe or Chippewa, Pow Wow, celebrated every year at the Lac Courte Oreilles reservation in northwestern Wisconsin. The Honor the Earth Homecoming Pow Wow is the opposite of the “patriotic” rallies that Trump is holding as the beginning of his re-election campaign.

 

On the way to the site, signs were posted along the road urging everyone to think of themselves as unique and worthy persons. Inside, the focus was entirely on the celebration of Native American traditions, wisdom, and culture, without any hint of comparison to other cultures. Members of the local tribe were joined by tribes from across the region, each of whom could sing and drum their own songs. There were no enemies, just friends.

 

Ojibwe veterans from all service branches were named and honored for their service to the American nation and to the Ojibwe nation. But no weapons were displayed, except ceremonial versions of traditional hunting weapons carried by brightly costumed dancers.

 

Politics was conspicuously absent, as was any complaint about how the Ojibwe and all other Native Americans have been treated by white settlers who invaded the lands they lived in and took them for their own. The only historical hint I heard from the announcer, who was also broadcasting over the reservation’s radio station WOJB, was his brief mention that the Anishinaabe had been defending their land for hundreds of years, long before the appearance of whites.

 

The messages of the Pow Wow were clear: “We are patriots. We love our land and our unique culture. We love America and have defended it in every war. We welcome and respect all Americans.”

 

Donald Trump’s rally in North Carolina, and his whole constant campaign about himself, send the opposite messages. “We are patriots, better patriots than you. We love America and therefore we hate you. Hating you is true patriotism.”

 

I find the implicit violence of the crowd in North Carolina to be just a few steps away from the real violence of the white supremacists in Charlottesville. What if a woman in a hijab had walked in front of that crowd as they chanted “Send her back”? That is the new Republican model of patriotism.

 

What could love of America mean? It could be love of the land, the amazing lands of our 50 states, encompassing beautiful vistas of mountains and lakes and prairies and desert that might be unmatched anywhere else. The Ojibwe love their land as a sacred trust from previous generations, the little bit that has been left to them after centuries of white encroachment. They wish to preserve it forever.

 

Love of America could be allegiance to the principles at the foundation of our political system. Those principles have not been consistently followed, and a truly democratic and egalitarian nation is still a dream to be realized, rather than a reality to be defended.

 

It could be reverence for American history, our unique national story of the creation of a new democracy by European immigrants and the evolution of the United States toward a more perfect union by embracing the lofty principles set forth in our founding documents. That story has many dark chapters, but we could say that American history is a narrative of overcoming – the struggle to overcome regional division, racism, sexism, homophobia, poverty, a struggle that may continue long into the future.

 

Love of America could be affection for Americans. I think of my own tendency to root for American athletes when they compete against athletes from other nations at the Olympics, the World Cup, or in tennis Grand Slams. Americans are incredibly diverse, and it is not easy to put into practice a love for all Americans, no matter ethnic, economic, educational, regional and personality differences. At the least, it should mean that one practices good will toward another American until proven wrong by inhumane behavior.

 

I don’t see any of these forms of love for America in contemporary conservative politics. Conservatives support digging up American land rather than preserving it and fight against every attempt to preserve clean water and air. They taunt conservation organizations who worry about global warming, deny the science of climate change, and oppose all efforts to prevent our own land and the whole globe from becoming less friendly to human habitation. The Trump campaign now sells Trump-branded plastic straws as a deliberate sneer at attempts to save ocean life from being overwhelmed by plastic. For today’s conservatives, American land is a source of financial exploitation: don’t love the land, love the money you can make from it.

 

Today’s conservatives, preceding and following Trump, don’t respect the democratic principles that America has at least tried to embody. From blatant gerrymandering to vote suppression to attacks on the free press to praise for dictators and criticism of foreign democracies, principles have been entirely replaced by temporary political advantage as the source of conservative action.

 

Conservatives hate American history, instead trying repeatedly to substitute myths for facts. They deny the historical realities of racism, the “patriotic” excesses of McCarthyism, the expropriation of Native American lands. They attack historians who simply do their job of uncovering evidence about how Americans behaved in the past, good and bad. And they celebrate some of the worst Americans: the Republican state government in Tennessee has now named July 13 as “Nathan Bedford Forrest Day”, honoring the Confederate general who became the first Grand Wizard of the Ku Klux Klan.

 

Conservatives don’t like most Americans. Again led by Trump, and operating as his megaphone, Republican politicians attack Democrats as enemies of America, despite the fact that Democrats represent the majority of American voters.

 

I didn’t see any Trump hats at the Ojibwe Pow Wow, and I doubt that any Native Americans cheered for Trump in North Carolina. These very different rallies represent opposing ideas about patriotism and America. In my opinion, one expresses a beautiful vision of land and people that has stood for America for hundreds of years. The other is an incoherent reverence for a cult figure of dubious value.

 

I never liked cults.

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/blog/154231 https://historynewsnetwork.org/blog/154231 0
Can You Ever Tame the Shrew? Is She Really the Shrew?

 

In Padua, Italy, in the 1500s, the populace had its rules. Rule #1 – marry for money. Rule #2 – marry for money. Rule #3 – marry for money.

 

Kate, the bombastic, headstrong, in-your-face daughter of a wealthy merchant there, Baptista, does not believe in those rules and is intent on shattering them. When she is of an age to be married, young Petruchio arrives in town, as handsome as anyone can be and full of good humor, charm and plenty of spunk. He’s the rich heir to his father’s massive fortune. 

 

Kate falls for him, but keeps putting him off to upset the rule makers. There’s no rush, either, because her father has declared that she cannot marry until her younger sister Bianca walks down the aisle, flowers in hand.

 

Sweet, innocent and very quiet Bianca is pursued by an army of feisty young suitors and meets numerous men and women who are not who they seem to be. Who will marry first? Can Kate stay the shrew? Will old Padua ever be the same again?

 

All of those questions are answered amid the lovely forest and meadows of The Mount, writer Edith Wharton’s sprawling, beautiful estate in Lenox, Massachusetts, where Shakespeare and Co. stages The Taming of the Shrew in an outdoor dell just about every night of the week through August 17.

 

This new production of William Shakespeare’s classic is gorgeous, as gorgeous as the forest in which it is staged, as gorgeous as a moonlit night in New England.

 

Director Kelly Galvin has done wonders with the setting. Actors race on to and off the stage from the woods and disappear back into them. Gangs of characters meander under the trees. The forest is large and deep and the meadow wide so there is plenty of room for Galvin to work some magic and she does.

 

Galvin has given the play a splashy new look. It appears to be set in 2019, and the characters all seem like extras from the Godspell show or a college beer blast – all tie-dyed shirts, sneakers and shorts. Baptista, the dad, is his own fashion statement with a light green 1601-meets-2019 suit and a lovely pea-green hat. He looks like Leonardo DiCaprio, too.

  

The stage in the Dell is a small one, set on boards. It is framed by huge billboards with words like POW!, BUZZ! and O! painted on them in color. It looks like a glossy scene from the old Batman television show (“To the Batcave, Robin!”). There is much audience participation in the comedy, too, with large, colorful applause signs raised from time to time that draw boisterous roars from the audience. The audience is frequently encouraged to groan AWWWW at some points in the play and OOOOO at others, and it does.

 

Director Galvin has also livened up the play with several contemporary rock and roll songs, such as Billy Idol’s White Wedding (staged during a wedding in the play).

 

The director works with a fine cast, too. Particularly good are Nick Nudler as Petruchio, Matthew Macca as dad Baptista, Jordan Marin and Caitlyn Kraft (don’t ask) as Gromio, Devante Owens as Lucentio, Bella Pelz as Bianca, Daniel Light as Hortensio, and Dara Brown as Tranio.

 

Kirsten Peacock, as the tempestuous Kate, steals the show. She plays her role to the hilt, appearing at times as a brutish woman wrestler ready to body-slam someone rather than the comely milady.

 

In short, this Taming of the Shrew  is a madcap romp through history, literature and the woods. It is a lot of fun. The theater really encourages people to attend the play, too, heavily advertising it in the Berkshires and slashing ticket prices to just $10.

 

NOTE – the recurring controversy. The play may have been a frolic in the 1600s, but its message, that it is better to be a docile, subservient wife than an aggressive shrew who must get her way, has reverberated over the centuries. You would think that the #MeToo movement would be out in full force against this one. The end of the play is particularly sexually savage when the newlywed husbands stage a contest to see whose wife is the most subservient.

 

You could re-write the whole play and amend the “shrewishness” and sexism of it, but then it would not be Shakespeare and it would not be literary history. You don’t like the sexism? Frown, grimace and roll your eyes all at the same time.

 

It’s just a play, and as Shakespeare often said, the play’s the thing.

 

See this play. It is a first-rate production of a first-rate comedy, full of mirth, tricks, ooos and aaahs, wild clothing, joyous weddings, plenty of rockin’ good music and a bunch of young people doing acrobatics you did not think possible.

 

Hooray for sneakers, and Shakespeare too.

 

PRODUCTION: The play is produced by Shakespeare and Co. Sets: Devon Drohan, Costumes: Amie Jay. The play is directed by Kelly Galvin. It runs through August 17.

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172626 https://historynewsnetwork.org/article/172626 0
Roundup Top 10!  

Why disabled Americans remain second-class citizens

by David Pettinicchio

The big hole in our civil rights laws.

 

How the failure of popular politics triggered the rise of Boris Johnson

by Jesse Tumblin

Instead of solving intractable problems, public referendums simply exacerbate them.

 

 

Ellis Island's history casts today's border cruelty in an even harsher light

by Megan J. Wolff

Conditions right now are dirtier, more dangerous, and significantly crueler than they ever were at Ellis Island -- most pointedly so where children are concerned.

 

 

When the American right loved Mexico

by Mario Del Pero and Vanni Pettinà

Back when conservatives exalted free markets, our neighbor to the south was a vital ally.

 

 

2020 election is a test America can't afford to fail

by Nicole Hemmer

If the American system re-elects Trump, then something is deeply wrong with either our system or ourselves.

 

 

Trump's Supreme Court Challenge Has a Historical Precedent

by Bethany Berger

As Trump agrees reluctantly to respect the court—at least in the case of the census—he follows, in part, that long-ago legal victory of the Cherokee Nation.

 

 

Chicago’s resistance to ICE raids recalls Northern states’ response to the Fugitive Slave Act

by Kate Masur

Almost 170 years later, the Fugitive Slave Act is viewed as one of the most repressive federal laws in all of American history.

 

 

The long, ugly history of insisting minority groups can’t criticize America

by Tyler Anbinder

Trump’s attack against four Democratic members of Congress fits a pattern in U.S. politics.

 

 

Fifty Years After the Moon Landing, Recalling One Small Misstep

by Tad Daley & Jane Shevtsov

Why did the first humans to set foot off Planet Earth plant the flag of only part of Planet Earth?

 

 

Trump revives the idea of a ‘white man’s country’, America’s original sin

by Nell Painter

It can’t be left to black Americans alone to resist the president’s racism. Citizens of all colours need to resist, and embrace activism.

 

 

Lincoln Would Not Recognize His Own Party

by David W. Blight

He would see the Republicans as the antithesis of everything he fought for.

 

 

All the Presidents’ Librarians

by Michael Koncewicz

Despite being spied on and intimidated during my time in Yorba Linda, I still think presidential libraries are too important for historians to wash their hands of them.


 

The Vicious Fun of America’s Most Famous Literary Circle

by Jennifer Ratner-Rosenhagen

The Algonquin Round Table trained a generation of socially conscious writers.

]]>
Sun, 18 Aug 2019 23:42:00 +0000 https://historynewsnetwork.org/article/172649 https://historynewsnetwork.org/article/172649 0