The Blue Riband | Crossing the Atlantic

It’s hard to believe that we’ve reached the end of our second season here at Fifteen-Minute History. Before we begin, I want to thank you all for joining us each week since January as we have explored the past fifteen minutes at a time. I hope you have enjoyed listening to these podcasts as much as I have enjoyed writing and recording them! As we close this season out, we thought it would be fun to break from the broad themes of American history and instead share three stories with our audience which follow a common theme—crossing the Atlantic Ocean and traveling to Europe.

John Adams, Winter Crossing, 1778

Two years into the American War of Independence, the Continental Congress dispatched John Adams to France to help secure a treaty with King Louis XVI’s government for military assistance against the British. Adams was hesitant to accept the post; in his diary he wrote, “It was my intention to decline the next election, and return to my practice at the bar…My family was living on my past acquisitions which were very moderate…My children were growing up without my care in their education, and all my emoluments as a member of Congress for four years had not been sufficient to pay a laboring man on my farm.” Of course, as he had done so often in the past, he set aside his personal considerations and fulfilled his duty to the country.

His wife Abigail would remain in Massachusetts with their younger children, but John Quincy would accompany his father aboard the frigate Boston, which departed for France on February 17, 1778. A voyage across the Atlantic was never easy, but a winter crossing was quite perilous because of the strong winds and violent storms (not to mention the threat of British warships). Captain Samuel Tucker explained the many risks to Adams, who insisted on knowing everything he could about the art of sailing, and as they departed American waters the risks quickly became reality. The Adams men were given a small cabin below decks, and John spent much of the voyage suffering from seasickness. His son read to him as he swung in his hammock, practicing his Latin and teaching his father French. When John felt able to go above deck and meet with Captain Tucker, the officer informed him of the ship’s status and sought his advice on any important matters concerning the crew or the voyage. Adams was never shy of offering his opinion on any matter, and both his diary and the surviving account of the voyage written by Captain Tucker are filled with Adams’ comments on the state of the ship (“a beautiful vessel”), the crew (“a detestable use of profanity plagued them”), the food (“wretched and served at the cook’s pleasure only”), and the living conditions (“the reek of burning sea coal and stench of stagnant water below decks were dreadful, contributing to general misery”).

In fair weather, a ship of the late 18th century could cross the Atlantic in three weeks, but in winter the passage could stretch to as long as ten weeks. The Boston made it across in six weeks and four days, and her passengers and crew faced a multitude of dangers along the way. On the second day out from Boston, the ship was pursued by three British frigates, and for two days and nights the crew stood ready for battle before they finally escaped the enemy. On the night of their escape, the Boston’s main mast was struck by lightning, injuring twenty seamen and killing one. Once the storm had passed, Tucker recorded in his log, Adams “resumed lecturing me on every part of my duty to him and to the country.” The most common instruction given by Adams was to improve the general mood aboard ship by regularly cleaning the decks and keeping the men at work and exercised; these suggestions were followed, and within the first week conditions aboard had improved considerably. The weather cleared once the Boston reached the midpoint of the journey, and Adams commented in his diary, “We see nothing but sky, clouds, and sea and then sea, clouds, and sky.” The ship spotted a British merchantman about a week before they reached France.

Captain Tucker sought Adams’ permission to attack and, when granted, the Boston engaged the enemy. The fighting was fierce, and both John and John Quincy got a firsthand taste of the terrors of naval warfare. John fought alongside the Marines when the enemy ship was boarded, while John Quincy, whose cabin was near the ship’s surgery, witnessed the gruesome nature of medicine aboard ship. When the Boston finally arrived at Bordeaux on March 30th, John Adams got his first glimpse of the Pyrenees Mountains through the ship’s telescope. He was awestruck as he saw the green sloping hills of southern France with the Spanish mountains to the south. “Europe, thou great theater of arts, science, commerce, war, am I at last permitted to visit thy territories,” he wrote in his diary. Adams and Captain Tucker were entertained aboard a French frigate in the harbor of Bordeaux, and two days later Adams strode ashore and learned that a treaty had been signed between France and America before he had even arrived. Nevertheless, he did his duty and proceeded on to Paris to join Benjamin Franklin at the American consulate, where he would spend the rest of the war. Decades later, Adams would write to his friend and political rival Thomas Jefferson that the voyage from Boston to Bordeaux was a picture of his entire life.

According to Adams’ biographer David McCullough, “The raging seas he had passed through, he seemed to be saying, were like the times they lived in, and he was at the mercy of the times no less than the seas.”

Barbara Tyson Arnt & Margaret Dardis, 1961

The late nineteenth and early twentieth centuries were the time of the great ocean liners which crisscrossed the Atlantic and brought immigrants to America and tourists to Europe. Famous ships like the Titanic, the Queen Elizabeth and Queen Elizabeth 2, and the Queen Mary and Queen Mary 2 were home to rich and poor alike, offering luxurious staterooms on deck that matched the finest London or New York hotels, or small, cramped steerage cabins in the bowels of the ships. Transatlantic voyages no longer took weeks, and the liners of White Star, Cunard, and other companies competed for the Blue Riband, an unofficial award for the fastest crossing of the Atlantic. Most American audiences know the story of the Titanic and thus the broad outlines of how people made the crossing at the turn of the century, but this practice continued long after the “unsinkable” White Star liner struck an iceberg and vanished beneath the freezing waters off Newfoundland. During the two world wars, these ocean liners were converted to troop ships, but after 1945 they returned to their passenger roots and brought thousands of refugees from Europe to Ellis Island and a new life in the New World. Their decks were then filled with American tourists eager for adventure in Europe.

One such tourist was Barbara Arnt (née Tyson), a 23-year-old woman from Sarasota, FL. She departed New York in September 1961 aboard the SS America, a steamer constructed just before the United States entered the Second World War. Barbara later described her experiences aboard the America in an interview with the United States Lines website: “I sailed on her in September 1961 from New York to Bremerhaven. It was wonderful, she was so beautiful. I couldn’t get enough of being out on the fantail watching the wake behind the ship. We were in Hurricane Esther and it was pretty scary. I couldn’t believe how far she would list, it was impossible to walk straight. We had to use the ropes to move around with. My doors kept opening and slamming shut when we would change the way we leaned. They had sides they pulled up at the tables and put wet fabric of some type that kept the dishes and glasses from sliding off. I remember some type of belt looking thing that clipped our chairs to the tables so we wouldn’t slide across the floor.

“There were some injuries as well. A lot of people were very seasick but I never missed a meal. It was great. I was 21 and thought it was exciting even when we lost some deck furniture over the side. What great memories I have [of] this beautiful lady.”

Ocean liners were renowned for their all-inclusive facilities, which included dining halls, bars and saloons, exercise rooms, libraries, and even (in the 1960s, as the liners approached the end of their service) cinemas. Families often traveled across the Atlantic together, and liners had to have facilities for even the youngest of passengers. Few parents expected to spend their three or four days aboard ship entirely with their children, and many ships, including the SS America, had purpose-built playrooms. In an interview with United States Lines, Playroom Associate Margaret Dardis recalled:

“My recollections of the children are many and clear. Sea-sick parents shoved their bright-eyed kids, who could be anywhere from age 2 to 12, in the door, clapped hand to mouth, and fled. One trip, we put on a play for the parents—the children’s own dictated script for ‘The Emperor Has No Clothes’, with the lead deciding to wrap himself in one of the ship’s large bath towels to indicate the lack of clothing. On another trip, I did not put in a moment in the playroom from New York to Southampton but spent every waking moment on the forward crew deck—because we were transporting the US Olympic team. When the children became obstreperous, I used the technique of telling them to make as much noise as they could for one minute by the clock—and then discovered that their voices carried to, and alarmed, people on the tourist deck, just the other side of the portholes. Perhaps the warmest memory is of a five-year-old boy from the Bronx, named Leon, whose mother feared that he would misbehave toward the other children and who did, indeed, jump onto and kick another child’s building made from the Erector set—but who, I discovered, was a brilliant future engineer; he not only made the most complex construction in the booklet that came with the set, but went on to create several new ones of his own, and became the politest, best behaved child imaginable by the time of arrival. If, by some unimaginable coincidence, Leon, you happen to be one of those who set eyes on this, please e-mail me! What I learned from you that trip was the foundation on which I built to become a (now-Emeritus) Professor with forty years of teaching.”

Of course, as air travel became more popular and less expensive, ocean liners lost their market share in the travel industry, and by the late 1960s most had been sold off or scrapped. Today, very few of the pre-World War Two liners remain in existence, and all are either museums or floating hotels. SS America was sold several times before she was wrecked off the Canary Islands in 1994. The two most famous postwar liners, the Queen Elizabeth 2 and the Queen Mary 2, are both still afloat: the Elizabeth serves as a hotel in Dubai, while the Mary still crosses the Atlantic each year, carrying passengers between the Old World and the New.

Jon Streeter, 2018

For most of the last century, tourists heading to Europe needed to go through medical examinations before they departed the United States and on arriving in their destination country. Paperwork giving permission to travel was necessary as well, and travelers often spent days or even weeks in quarantine at their ports of entry waiting for bureaucracies to catch up with travel plans—and this after spending so much time at sea! Fortunately, today one needs only an American passport to travel to any nation within the European Union, and air travel has made the process far quicker (though less comfortable, as even steerage passengers on an ocean liner could stretch their legs). In my many travels to Europe, I have always been thankful to be alive at a time when I can depart Indianapolis in the evening and arrive in London, Paris or Munich the next morning. I thought I would close this podcast with a final story—one of my own.

In the summer of 2018, AET organized a two-week tour of Germany focusing on “The Cost of Freedom” and the Second World War. We were based first in Munich and then in Berlin, and our travels took us to many historic and beautiful sites, as well as some sobering reminders of man’s cruelty to man. I have traveled to Germany more times than I can count, and as we made our preparations through April and May, I was confident that I had everything ready for the trip. Hotels were booked, tours were in place, paperwork was filed; we were all set! At our tour group meetings I stressed the importance of good shoes, proper etiquette when traveling abroad, updating your passport, exchanging dollars for euros, and getting to the airport on time. I went to bed the night of June 13th nervous (as I always am before a trip) but excited to get going!

Everyone arrived at the Indianapolis airport on time, and we got ourselves organized and into line to check in and get our plane tickets. I stepped to the counter with the first group of students and one of our adult chaperones, presented my passport, and waited for my ticket. Then I heard words I never expected: “Sir, your passport is expired.” I had been so focused on preparing others for the trip that I never considered looking at my own travel documents! I will confess publicly that I became physically ill, thinking that the trip was ruined. Fortunately, my adult chaperones felt confident enough to proceed without me until I could obtain a new passport (which required me to drive to Chicago the following day), and I was able to secure a flight and join my group 72 hours after they left the United States. Those three days were a low point in my career as a travel guide, but they taught me the importance of preparation, calm in the face of uncertainty, and, above all, prayer support from friends and family.

Had I been John Adams, the ship would have waited for me, as I was the most important member of the crew. If I’d been about to board SS America, I would have had to wait until the ship returned, and our trip would have been over before it began. Thanks to the wonders of international airlines, not to mention cars and modern printing, the trip was salvaged and we had an amazing time in Germany. I tell this story not to frighten prospective travel companions who might wish to join us on a future AET trip but to show how far we have progressed as a society in the area of international travel (and so you may chuckle at my misfortune). For those of you who would like to join AET on a tour some day, you may be assured that my passport and other travel documents are in good order, and I have set myself daily reminders beginning on January 1, 2028, to renew my passport long before our tour group departs that summer!

Available Wherever You Listen to Podcasts

Winning but Losing | The Vietnam War

“No event in history is more misunderstood than the Vietnam War. It was misreported then, and is misremembered now.”

— Richard Nixon —

In 1961, President John F. Kennedy dispatched American military advisors to South Vietnam. Their mission was to advise the poorly equipped and poorly trained South Vietnamese Army on how to combat the communist regime of Ho Chi Minh in North Vietnam. Already victorious over the French years before, communist forces were planning incursions into the south. Kennedy dispatched the advisors to shore up the free world’s defenses in the wake of the failed Bay of Pigs invasion and the building of the Berlin Wall, both of which had emboldened communist regimes around the world. The feared spread of communism from country to country was called the “domino effect,” and it was backed by a wide array of actions that were successfully discrediting American influence on the world stage while promoting the principles of communism as an alternative. A line in the sand had to be drawn. That line was Vietnam.

A Step Back – The First Indochina War

Starting in the late 1800s, French forces occupied and colonized the whole of Indochina, comprising modern-day Cambodia, Laos, and Vietnam. The Japanese seized Indochina during World War Two, and their occupation fueled an independence movement led by the communist Ho Chi Minh; after the war, the French began to lose their grip on the colony. Following some early losses and a failed negotiation with the French government, Ho and his forces fled to the hills until they were finally recognized in 1950 by the Soviet Union and Red China, both of which flooded his forces with weapons, supplies, and troops. As a result, French forces began to lose battles, beginning at Route Coloniale 4 that same year and ending with the disastrous defeat at Dien Bien Phu in 1954. The French then negotiated a cease-fire and peace agreement, granting independence to Cambodia, Laos, and Vietnam, the last of which was split at the 17th parallel between the communists in the north and the Republic of Vietnam in the south.

From 1954 to 1960, communist forces conducted minor incursions in the south in an effort to destabilize the region, all of which were unsuccessful. Western interests continued to align with South Vietnam, and more support was given not only to quell communist incursions but to elevate the country onto the world stage as a model for the Third World, despite widespread corruption, incompetence, and mismanagement in the south. During this time, a resistance group known as the National Liberation Front unified and began to intensify conflicts along and beyond the 17th parallel. Southern forces did not take these attacks seriously at first, deeming them a nuisance more than a threat, but as time passed, the communists became more efficient and deadly, gaining support throughout the south. This group had another name, one that the American GI would come to know very well: the Viet Cong.

1961–1963: American Advisors

At first, American advisors were just that: advisors. They were tasked with training and tactics, providing the armies of the south with important information and ideas for conducting widespread war and suppressing communist activity along the 17th parallel and the Ho Chi Minh Trail, which at the time was a six-month trek through the mountains undertaken by communist forces to resupply their comrades in the south. As time went on, however, the advisory role of American forces shifted toward a combat role as violence between the north and south escalated. The CIA embedded special forces units to conduct operations against communist forces in the north and against insurgents in the south.

Chaos ensued on November 2, 1963, when South Vietnam’s president, Ngo Dinh Diem, was overthrown and executed, some say as a result of statements from the US State Department to the generals in the South that the US would neither oppose nor hinder such an operation. By this time, American advisors were embedded throughout the South Vietnamese government and were able to influence battle tactics at almost every level, all while doing their best to steer clear of the political upheavals going on around them.

It was during this period that the mission of American involvement became blurry, not because of poor planning or incompetence, but because “winning the hearts and minds” of a population was a new kind of fight for American armed forces. As a result, the chaos in the South further emboldened the organized and centrally commanded communist forces in the north. Adding to the turmoil, John F. Kennedy was assassinated on November 22nd; Vice President Lyndon Johnson became president, changing the landscape—and mission—of what was then becoming a full-scale war.

1963–1969: Search and Destroy

In August 1964, the destroyers USS Maddox and USS Turner Joy were reported to have been attacked in the Gulf of Tonkin, off the coast of North Vietnam. These reports were answered with retaliatory air strikes. Declassified documents have since indicated that the second of the two reported attacks never took place. As a result of these “attacks,” saturation bombing of the North commenced in 1965 with Operation Rolling Thunder. The operation lasted until 1968, by which time the United States had dropped more tonnage of bombs than it had dropped on the Axis powers in the entirety of World War Two.

The American ground war began in March 1965. United States Marines arrived and began fortifying vulnerable military positions. General William Westmoreland, commander of American forces in Vietnam, outlined his three-step plan for winning the war:

  • Phase 1. Commitment of U.S. (and other free world) forces necessary to halt the losing trend by the end of 1965.

  • Phase 2. U.S. and allied forces mount major offensive actions to seize the initiative to destroy guerrilla and organized enemy forces. This phase would end when the enemy had been worn down, thrown on the defensive, and driven back from major populated areas.

  • Phase 3. If the enemy persisted, a period of twelve to eighteen months following Phase 2 would be required for the final destruction of enemy forces remaining in remote base areas.

Despite early statements from the Johnson administration that South Vietnamese forces needed to win the war themselves, the administration supported Westmoreland’s strategy and increased troop levels from 2,000 to over 17,000. The Viet Cong and North Vietnamese Army (NVA) anticipated this buildup and were actively recruiting during this time, gaining between eight hundred thousand and one million fighters from Vietnam, Cambodia, and China. Further support was given to guerrilla forces from these countries and from the Soviet Union.

The belief behind Westmoreland’s strategy was that with enough pressure, the resistance would crumble. Unfortunately, with guns, ammunition, and personnel continually resupplying enemy combatants via the Ho Chi Minh Trail, this proved incorrect. As the war dragged on, more and more American troops were sent to Vietnam, most of whom participated in “Search and Destroy” missions. This strategy entailed American troops moving from one location to another in an effort to find the enemy; once found, they would engage and clear the area. This method of fighting was alien to American commanders, who were used to clear lines of engagement, fronts, and other geographic measures by which to calculate the progress and success of a war. With exceptions like the battles of the Ia Drang Valley and Hue City, most engagements were conducted without the benefit of such lines, and as a result, progress was difficult to measure.

Additionally, most engagements took place over ground that American forces would abandon once the fighting ended. To make matters more confusing, officials at the White House and the Pentagon often overrode the orders of commanders in the field when choosing where to deploy American forces and which VC or NVA targets to attack. This ongoing lack of progress, combined with Viet Cong and NVA tactics such as mines, snipers, and tunnels, measurably demoralized American forces as the war dragged on.

This culminated in the Tet Offensive of 1968, one of the largest coordinated attacks in the history of war. American bases and South Vietnamese cities across the country were struck within hours of one another. Though the offensive was an utter military failure and American forces withstood it, Tet caused significant damage to the perception of the war at home, with Walter Cronkite of CBS—despite having no military experience, information, battlefield intelligence, or any expertise on war whatsoever—declaring the war all but lost. Statements like this, and many others in the American press, emboldened the defeated Viet Cong and NVA soldiers, who, despite losing continuously against American forces, became more aggressive and more precise in killing American GIs.

1969–1975: Winning but Losing

As US forces suffered morale problems and the North Vietnamese grew bolder, support for the war at home dissipated almost completely. The perceived lack of progress, together with politicians’ inability to effectively wage a guerrilla war, contributed to Americans’ disillusionment. This, combined with atrocities like the My Lai massacre, pushed protesters into the streets to try to end the war. These protests intensified after 1966, lasting right up to the end of the war nine years later. Many of the protests centered on “bringing the troops home” and ending the conflict overall, but some were infiltrated by hate groups in an effort to protest the existence of America itself. Whatever their motives, this was a turbulent time for American society, as civilians questioned the reasons behind the conflict en masse.

In response to the lack of progress, American forces began to withdraw in 1971, two years into Richard Nixon’s presidency. In 1972, the NVA launched a full-scale invasion of the south, the Easter Offensive, which quickly overran South Vietnamese positions that American military professionals had believed impenetrable. US air power responded to these attacks but was not able to stop them. After the Easter Offensive, negotiators returned to the table in 1973 for the Paris Peace Accords, which contained stipulations binding the communist North, the democratic South, and their American allies, who promised a complete withdrawal from Indochina. In a not-so-surprising move, the American withdrawal was the only provision ever honored in the agreement, with the communists ignoring their promises and continuing to advance. By the end of 1973, all American troops had been pulled off the front lines, leaving only advisors and those serving in the South Vietnamese government behind.

With the withdrawal of American personnel, South Vietnamese forces could not operate the mountains of equipment given to them or left behind by US forces. This put them at a severe disadvantage as the North marched south. In December 1974, the communists attacked and overran the city of Phuoc Long. President Gerald Ford begged Congress to resupply the South and help them hold out. Congress refused. The abandonment of the South Vietnamese by the American government and the continuous losses on the battlefield drove the once-proud Army of the Republic of Vietnam to despair, and many units abandoned their positions or surrendered outright. Others, however, made last stands against the communists as they watched their country slowly die.

The fall of Saigon came in April 1975, when over one hundred thousand NVA regulars besieged the city and its thirty thousand defenders. The American embassy was abandoned, with the last US Marines leaving in the early hours as civilians broke down the gates in an effort to get on the last few choppers. Many of these people had worked for the Americans and watched as they were left to their fate while their homes burned around them. The last defenders were overrun within days, and NVA soldiers drove through the gates of the Presidential Palace at 11:30 AM on April 30th to raise their flag and declare victory.

It is important to note that despite the tactics, the lack of perceived progress, and the mismanagement of the war itself, American forces never lost a single battle. This may surprise you, our audience, given the eventual outcome of the conflict. In addition, countless South Vietnamese who opposed communist rule fought to the death for the freedom they sought. Many of them had fled from the North after witnessing the communist definition of “freedom,” which translated into rape and mass execution. In light of this, it’s important to remember that the American soldier did not lose the war in Vietnam; the American politician did. Corruption, lies, micromanagement of battle tactics and plans, and overall incompetence were responsible for the eventual loss.

Aftermath

More than fifty-eight thousand American soldiers died in the Vietnam War, and a far larger number were maimed or psychologically damaged. The loss in Vietnam demoralized the American public. Protesters who had professed concern for American troops spat and urinated on them as they came home. There were no parades. There were no “thank yous.” Many veterans stopped acknowledging their service after being denied employment for it. Additionally, many of the troops who came home did so straight out of combat zones: they would be in the jungle, in combat, then ordered to fall back to base, where they were disarmed, sent to an airport, and home within 72 hours of leaving the bush. The nature of the war and the hatred veterans experienced on returning home produced the first formally recognized cases of post-traumatic stress disorder (PTSD).

The war in Vietnam marked a new chapter in American combat and foreign policy. The failure of the American government to manage the conflict, coupled with the lack of measurable objectives, has influenced the conflicts that came after. As we look back on this war, it is important to understand its context, history, and motivations in order to gain the right insight, to understand and honor the sacrifices of the American soldier, and to hold government officials accountable for their actions.

Available Wherever You Listen to Podcasts

One Small Step | The Space Race

There is no strife, no prejudice, no national conflict in outer space as yet. Its hazards are hostile to us all. Its conquest deserves the best of all mankind, and its opportunity for peaceful cooperation may never come again. But why, some say, the Moon? Why choose this as our goal? And they may well ask, why climb the highest mountain? Why, 35 years ago, fly the Atlantic? We choose to go to the Moon! We choose to go to the Moon in this decade and do the other things, not because they are easy, but because they are hard; because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one we intend to win.

— John F. Kennedy, speech at Rice Stadium in Houston, TX, September 12, 1962 —

Since the dawn of time, mankind has looked into the heavens and wondered what was there. The ancients believed the stars controlled man’s destiny; medieval scholars thought the heavenly spheres represented perfection and that God was to be found among them; and astronomers of the Renaissance saw that they were as imperfect as our own planet. Space captured the attention of the American people beginning in the 1930s as radio dramas like Buck Rogers in the 25th Century told stories of adventure on other worlds. The invention of motion pictures and television brought classics of science fiction like Star Trek, Star Wars, and 2001: A Space Odyssey into the homes of millions of Americans.

During the 1950s and 1960s, the exploration of space became what astronaut Frank Borman called “another battle in the Cold War.” But unlike the other contests of those years between the United States and the Soviet Union, it involved science and technology rather than bullets or bombs. In World War Two, the Germans had developed the first long-range guided rockets, the V-2s, which they put to devastating use against the cities of Great Britain. After the war, the Soviets began to build rockets of their own. In an effort to “catch up,” the United States government enlisted the aid of German-born rocket experts like Wernher von Braun and Günter Wendt to bring America into the heavens. Yet the Soviets held the advantage in the early years of the “Space Race.”

“A Red Moon”

On October 4, 1957, the Soviet Union launched Sputnik I, the first artificial satellite in human history. Sputnik had two purposes: it provided the Soviets with valuable data on the composition of Earth’s upper atmosphere, and it became a major propaganda victory over the West. In a time of heightening tensions between the two superpowers, some Americans feared that the heavens would belong solely to the communists. Soviet successes multiplied through the late 1950s: they launched the first animal into orbit later in 1957 (a dog named Laika), returned two more dogs safely to Earth three years later, and in 1959 sent the first unmanned probe to the moon. Their greatest victory in the Space Race came in April 1961, when cosmonaut Yuri Gagarin became the first human in space. His flight, a single orbit of the Earth, lasted 108 minutes before he returned safely home.

The American space agency, the National Aeronautics and Space Administration (NASA), had already embarked on a program to match the Soviet exploration of space. Twenty-three days after Gagarin’s flight, astronaut Alan Shepard became the first American in space aboard the Mercury capsule Freedom 7. His flight lasted only fifteen minutes but nonetheless showed the world that America was catching up. In February 1962, the first American victory came with John Glenn’s historic flight around the Earth aboard Friendship 7. Glenn orbited three times and returned a national hero; he was later elected to the United States Senate, and in 1998 he became the oldest person ever to fly in space at the age of 77.

“We Choose to Go to the Moon”

On May 25, 1961, President John F. Kennedy addressed a joint session of Congress about the future of America’s space program. In his speech, the president announced a bold vision to the country: “I believe this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the earth.” NASA officials were stunned—at that point they had not yet sent an astronaut on an orbital flight around Earth, and not even the best scientists in Houston had a clue how to get humans to the moon. Yet Kennedy’s announcement electrified the nation and galvanized both the public and private sectors into the greatest scientific and technological effort in history.

The Soviet Union accepted Kennedy’s challenge of a race from the earth to the moon in 1964. The architect of their space program, Sergey Korolyov, began work on a larger spacecraft that could, in theory, take cosmonauts to the moon, and Yuri Gagarin was to have led the team on their voyage. (The Soviet moon program was critically damaged two years later when Korolyov died unexpectedly following surgery, but the Soviets continued their orbital flights throughout the mid-1960s.)

The testbed for the American moonshot was NASA’s Gemini capsule. Many of the technologies and procedures used in the later Apollo program were first developed on Gemini, and the rookie astronauts of Gemini became the veterans of Apollo. In December 1965, Gemini 7 astronauts Frank Borman and Jim Lovell spent fourteen days in space, long enough for a round trip to the moon. Neil Armstrong, flying on Gemini 8, became the first pilot to dock his ship with another spacecraft; both he and his copilot Dave Scott nearly died when a thruster malfunctioned and sent the ship careening off course, and only Armstrong’s cool head brought it back under control. On the last flight, Gemini 12, Buzz Aldrin spent five hours in a spacesuit outside his ship performing various tasks, proving that a human could work in a weightless vacuum—provided that the spacecraft was supplied with plenty of a product called Velcro.

Even as NASA was firing astronauts into space aboard Gemini capsules, an equally important effort was underway on the ground. To land astronauts on the moon, NASA first had to design a landing vehicle. The Grumman Aircraft Engineering Corporation spearheaded the design, delivering the first Lunar Module in 1967. The first unmanned test of the spider-like spacecraft came in January 1968, and Grumman then awaited word on the first manned flight of its creation.

The Apollo program that would eventually take Americans to the moon suffered a tragic setback in January 1967. While performing a routine battery of tests on the Apollo 1 capsule, astronauts Gus Grissom, Ed White, and Roger Chaffee were killed in a fire that consumed the Command Module’s cockpit. The disaster nearly derailed the Apollo program, and only Frank Borman’s heartfelt appeal to the Senate’s investigation committee saved it. NASA spent the rest of 1967 fixing the problems revealed by the Apollo 1 fire in preparation for its first flight.

“One Small Step”

In October 1968, the Apollo program finally got off the ground with the flight of Apollo 7, commanded by Wally Schirra. Despite this success, Apollo was still behind schedule for an end-of-the-decade lunar landing, and NASA decided to send the next flight around the moon. In December, Apollo 8 blasted off from the Kennedy Space Center at Cape Canaveral, FL, with astronauts Frank Borman, Jim Lovell, and Bill Anders aboard. Apollo 8 successfully completed ten orbits around the moon on Christmas Eve, 1968, and the astronauts marked the occasion by taking the first pictures of an “earthrise” and broadcasting a reading of Genesis 1:1-10 back to Earth. This heartfelt gesture at the close of a very difficult year for the United States and the world thrilled the hearts of millions—and it also sparked a lawsuit from atheist groups offended by the broadcast.

The Apollo 9 and 10 missions focused on testing the Lunar Module in space, and on July 16, 1969, the world watched eagerly as Neil Armstrong, Buzz Aldrin, and Michael Collins hurtled into space aboard Apollo 11. Four days later, Armstrong and Aldrin entered the Lunar Module “Eagle” while Collins remained aboard the Command Module “Columbia.” The “Eagle” descended toward the lunar surface, landing safely on the moon at 3:17pm Houston time. Buzz Aldrin then spoke to Mission Control and those listening to the NASA broadcast, asking them “to pause for a moment and contemplate the events of the past few hours and to give thanks in his or her own way.” He then took communion privately, off the radio, as Armstrong looked on. At 9:56pm Houston time, Neil Armstrong opened the “Eagle’s” hatch, climbed down the ladder, and became the first human to walk on the moon; as he stepped off the lander he spoke the immortal words: “That’s one small step for a man, one giant leap for mankind.” The television audience in the United States and approximately 450 million radio listeners heard these words and the subsequent broadcasts from the Sea of Tranquility. After a moonwalk of more than two hours, during which the astronauts placed an American flag and a commemorative plaque on the lunar surface and spoke to President Richard Nixon via radio, the “Eagle’s” ascent stage lifted off and docked with Collins aboard “Columbia.” Four days later, “Columbia” splashed down in the Pacific Ocean, and Neil Armstrong and Buzz Aldrin became the most famous Americans alive. (Sadly, Michael Collins was largely ignored by both the public and the press.)

“It’s Been a Long Way, but We’re Here”

After the historic flight of Apollo 11, the American people largely lost interest in manned spaceflight; Kennedy’s vision had been fulfilled, and there were other developments on Earth that demanded attention—especially the war in Vietnam. The Apollo 12 flight sent two more astronauts to the moon, as did four later flights. A brief resurgence of public interest in NASA occurred in April 1970, when an oxygen tank explosion crippled the Apollo 13 spacecraft. Astronauts Jim Lovell, Fred Haise, and Jack Swigert took refuge in the Lunar Module “Aquarius,” which they used as a lifeboat for four days until splashing down safely in the Command Module “Odyssey.” (Apollo 13 proved that Grumman’s design of the Lunar Module was sound, and Grumman issued a gag “towing bill” for $312,421 to North American Rockwell, designer of the Command Module.) Yet NASA was soon hampered by budget cuts from Congress, and the final three planned Apollo missions were canceled. The last Apollo capsule to fly was part of the Apollo-Soyuz Test Project, a joint mission between the Americans and the Soviets; it docked with a Soviet Soyuz capsule in July 1975 in what became a symbolic end to the Space Race.

Both the United States and the Soviet Union shifted their focus toward space stations after Apollo 11. The Soviets launched the first orbital station, Salyut 1, in April 1971, and the United States launched its Skylab station in 1973, hosting three crews between 1973 and 1974. Two other famous stations followed: the Russian “Mir” station, which orbited from 1986 until 2001, and the International Space Station, whose first modules were launched in 1998 and which can be seen with the naked eye from Earth on a clear night.

The United States also developed a reusable orbital vehicle, the space shuttle, in the 1970s. The first, a test vehicle not meant for spaceflight, was named the Enterprise after thousands of Star Trek fans petitioned NASA to pay tribute to the fictional starship. Five flight-worthy shuttles were built by NASA: Columbia, Challenger, Discovery, Atlantis, and Endeavour. Two were tragically lost with all hands: the Challenger exploded soon after launch in 1986, and the Columbia broke up while reentering the atmosphere in 2003. The shuttles served as America’s only crewed space vehicles for three decades, and the final flight took place in July 2011. Of course, mankind continues to look to the stars, and NASA plans to move forward with new vehicles and missions to return us to the moon and, perhaps, reach deeper into the void of space toward Mars.

The legacy of the Space Race reaches far beyond the political turmoil of the Cold War. The new technologies developed for the moonshot fueled discoveries in aerospace engineering, electronics, and telecommunications; many devices present in American homes today find their origins in the Space Race. It also spurred a renewed interest in astronomy, mathematics, and engineering that continues today. The thousands of satellites orbiting today were made possible only by the Space Race. In fact, whatever device you’re using to listen to this podcast probably traces its origins to the Space Race. It is important to remember the purpose of NASA and America’s exploration of space, a message encapsulated in the words on the plaque left on the moon by Neil Armstrong and Buzz Aldrin on July 20, 1969: “We came in peace for all mankind.”

AVAILABLE WHEREVER YOU LISTEN TO PODCASTS

A More Perfect Union | Civil Rights & the Supreme Court

No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.

— The Fourteenth Amendment, Section I —

The year was 1865. The Union and Confederate armies were locked in mortal combat around the city of Petersburg in Virginia and in the humid forests of the Carolinas. The end of the Civil War was near, and the Lincoln Administration was facing the question of how to restore the Union and preserve it against another rebellion. With the passage of the three Reconstruction amendments to the Constitution came the legal framework for equality between the races, and each of the Confederate states would be required to ratify the amendments before it could rejoin the Union. President Lincoln’s message to the American people in his second inaugural address was clear: “With malice toward none, with charity for all, with firmness in the right as God gives us to see the right, let us strive on to finish the work we are in, to bind up the nation’s wounds, to care for him who shall have borne the battle and for his widow and his orphan, to do all which may achieve and cherish a just and lasting peace among ourselves and with all nations.”

Tragically, Abraham Lincoln’s vision of a peaceful and merciful reconstruction of the Southern states died with him on April 15, 1865. Flushed with anger and eager for revenge, the Republican-controlled Congress imposed a military occupation on the South to crush all lingering anti-Unionist sentiments. The twelve years of Reconstruction saw the rise of the Ku Klux Klan, the flight of many freed slaves to the North and their subsequent return as sharecroppers, and the stigma of carpet-bagging politicians and businessmen hoping to make a quick buck on the backs of the rebel states. When Reconstruction ended in 1877 amidst the turmoil of the previous year’s disputed presidential election, large sections of the South remained unreconstructed.

As federal troops pulled out of the cities and towns of the Old Confederacy as part of the compromise that brought President Hayes to the White House, racist Democrats regained majorities in many Southern legislatures and began to institute policies in their states that would have lasting consequences for racial equality and justice for African-Americans.

“Jim Crow” and the Supreme Court

Southern Democrats wished to return to power by using racial discrimination to mobilize white voters who hated the Freedmen. They approached this goal in two ways: first, the Ku Klux Klan (led in its early years by the former Confederate general Nathan Bedford Forrest) terrorized the Republican minority and the population of freed slaves, preventing them from casting votes even after the Fifteenth Amendment was passed; and second, Southern legislatures enacted laws which separated the races in all public spaces. These were later called “Jim Crow” laws, named for a racist blackface caricature of African-Americans performed by a white actor named Thomas D. Rice in the years before the Civil War. By 1892, the phrase “Jim Crow” referred to any law which separated whites from blacks in any public area.

The federal government tried on several occasions to break the Jim Crow laws, but with limited success. The Civil Rights Act of 1875, as well as the Fourteenth Amendment, guaranteed legal equality for all Americans regardless of race, but the Southern states repeatedly cited the Tenth Amendment and state sovereignty as justification for their racist laws. (Before the Civil War, the South had used this same argument to protect slavery within its borders.) In 1890, Louisiana passed a law segregating railroad passengers by race, and both black and white activists of the Citizens’ Committee of New Orleans soon mobilized to challenge it in the courts. One of their members, Homer Plessy, bought a ticket on a train departing from New Orleans and sat in a whites-only carriage. He was informed that, due to his racial heritage, he would have to move to a “colored-only” car; he refused and was arrested.

Plessy’s case was heard in a Louisiana court, and his lawyer cited the Thirteenth and Fourteenth Amendments as evidence that he had every right to sit in whatever rail car he wished. The court disagreed, and Plessy was convicted of breaking the state’s Separate Car Act. The Louisiana Supreme Court upheld the ruling, and so Plessy’s attorneys appealed to the United States Supreme Court. Four years later, the case of Plessy v. Ferguson was brought before the highest court in the United States. The defendant was John Howard Ferguson, the judge in Plessy’s original case.

The Supreme Court had a difficult history with regard to racial issues in America. The court holds the power of judicial review, which permits it to rule on the constitutionality of federal, state, and local laws. Until the 1850s, the court had used this power sparingly, preferring to allow Congress to decide the important social issues facing the country. In 1857, Chief Justice Roger Taney broke with this tradition and authored the most shameful decision in the court’s history: Dred Scott v. Sandford. In this ruling, the Supreme Court upheld a circuit court decision that African-Americans were not citizens of the United States, and that slaveowners had the right to bring their “property” into a free state (in effect making slavery legal across the United States). Thus, much was at stake for the Supreme Court’s reputation as the case of Plessy v. Ferguson came before the bar.

The court heard oral arguments on April 13, 1896, and issued its ruling on May 18th. Justice Henry Brown wrote the majority opinion, joined by six other justices. The Court ruled that states could segregate public facilities as long as they provided equal facilities for both races. The lone dissenter in the case was Justice John Harlan (later known as the “Great Dissenter”), whose fiery declaration that the Constitution is “color-blind,” and his placing of the ruling on the same level as the hated Dred Scott decision, would eventually lay the groundwork for Plessy’s ultimate reversal.

Now armed with the Supreme Court’s blessing, Southern states increased their legal oppression of black Americans, passing new laws each year which further restricted their rights and freedoms. Anti-miscegenation laws were passed in many states, forbidding whites and nonwhites from marrying one another. Even some federal government agencies adopted segregationist policies: the Federal Housing Administration’s lending rules confined African-Americans to specific neighborhoods in American cities.

President Woodrow Wilson, a hero to modern progressives, permitted the Civil Service to segregate its workers, and the US Armed Forces remained rigidly segregated throughout his administration. The 350,000 African-Americans who served in the military during the Great War were commanded by white officers, and black soldiers were prohibited from commanding white troops under any circumstances. This policy continued into the Second World War, at whose outset the Regular Army counted only five African-American officers; the wartime army would grow to over sixteen million men. Despite several acts of unimaginable heroism, no black soldiers from that war were awarded the Medal of Honor until 1997, 52 years after the war had ended.

The Civil Rights Era

As the world saw the horrors of the Nazi Holocaust in films shown at Nuremberg and heard the stories of returning American soldiers who had liberated the camps, many Americans began to question their country’s racist policies toward its own minority citizens. In 1948, amidst a storm of criticism, President Harry Truman signed Executive Order 9981 and desegregated the Armed Forces. The president, a Southerner, believed that merit should determine command, not skin color.

The movement for civil rights grew in strength during the late 1940s, and as America entered the relatively peaceful and prosperous 1950s, it became clear that the time for change had come. In 1951, a class action lawsuit in Topeka, Kansas, evolved into a suit between an African-American welder named Oliver Brown, whose daughter Linda had been denied admittance to an elementary school because she was black, and the Topeka Board of Education. The lawsuit was filed in federal court, as were others in South Carolina, Virginia, and Delaware, and in the District of Columbia. In each case, district and circuit courts refused to permit integration, and the Supreme Court chose to hear all five cases under the umbrella of Brown v. Board of Education.

Racial tensions were running high in the country as the Supreme Court met to hear the case. Brown’s side was argued by Thurgood Marshall, chief counsel for the National Association for the Advancement of Colored People, who was later appointed to the Supreme Court by President Lyndon Johnson. When the court first heard the case in December 1952, the justices were unable to reach a decision, and Justice Felix Frankfurter engineered a delay, scheduling the case for re-argument the following term in hopes of avoiding a divided ruling. When Chief Justice Fred Vinson died in September 1953, President Eisenhower used his recess appointment powers to install Earl Warren as the new Chief Justice. (Frankfurter commented that Vinson’s death was conclusive proof of the existence of God.)

Warren, a supporter of desegregation, was able to bring the other justices to his side, and on May 17, 1954, the Supreme Court unanimously ruled in favor of Oliver Brown in what many observers—and members of the court—have called the greatest moment in the court’s history. Brown v. Board of Education required the states to desegregate their public schools. It was a massive victory for the cause of civil rights and legal equality between the races. However, the court did not provide a mechanism for desegregation, and in a second decision a year later, it ordered schools to comply with Brown “with all deliberate speed.” Brown’s impact was immediate: the civil rights movement was galvanized to push for greater equality. When Rosa Parks defied a segregationist law by refusing to surrender her bus seat to a white passenger and was arrested in Montgomery, Alabama, the demand for implementing Brown’s provisions grew even louder. In November 1960, in the midst of a national election, the Eisenhower administration sent US marshals into the South to enforce the desegregation of public schools. Federal marshals escorted Ruby Bridges, then only six years old, into her new elementary school in New Orleans as angry mobs hurled abuse at the little girl. At last, the Court’s ruling was being enforced.

The Supreme Court’s Legacy

The Civil Rights Era culminated with Dr. Martin Luther King’s speech in August 1963 in Washington proclaiming his dream of a color-blind society (echoing Justice Harlan’s dissent in Plessy) and the passage of the 1964 Civil Rights Act. The dream of legal equality for African-Americans had become a reality at last. Of course, work remains to be done in the area of social equality and putting an end to racism on both sides of this controversial issue.

The American people affirmed the Supreme Court’s ruling in Brown as the Civil Rights movement marched onward, and the court continued to issue decisions breaking down barriers to legal equality. As the 1960s dawned and waves of progressive change swept across the United States in areas beyond race relations, the court looked back to Brown and began to see itself as an agent of social change. In 1965, the Supreme Court struck down a Connecticut law banning the use of contraceptives in Griswold v. Connecticut, affirming a right to marital privacy as belonging to all Americans. This trend continued and is still seen in major court rulings today. In case after case, the court has looked back to Brown v. Board of Education in impact litigation decisions and has used its power of judicial review to bring about social change. While some may question the court’s use of its power for progressive ends, it is inarguable that the cause of equality and civil rights for all Americans transformed both the Supreme Court and the country as a whole.

AVAILABLE WHEREVER YOU LISTEN TO PODCASTS

The Grey Ghost | The USS Enterprise

“Fate: Protects fools, little children, and ships named Enterprise.”

— William Riker, Star Trek: The Next Generation —

No, this podcast is not about Star Trek, so Star Wars fans and science fiction skeptics need not reach for the stop button. The name “Enterprise” is not exclusive to fictional starships or the space shuttle; in fact, nineteen ships of the British Royal Navy and nine of the United States Navy have borne the name (spelled either with an S or a Z). Undoubtedly, the most famous USS Enterprise is the World War Two-era aircraft carrier, which fought in more battles of the Pacific War than any other vessel, earned twenty battle stars, and remains the most decorated ship in American naval history. “The Big E” (the first of her many nicknames) was commissioned in May 1938 and attached to the Atlantic Fleet for her first year of service. As tensions rose with Japan and the Navy Department realized the importance of aircraft carriers in the Pacific, the Enterprise was transferred to the Pacific Fleet and based first at San Diego and then at Pearl Harbor.

From Pearl Harbor to Midway

The Enterprise was the flagship of Admiral William F. Halsey’s Carrier Division Two in the Pacific. On November 28, 1941, the Navy Department ordered Halsey to deliver a Marine fighter squadron to Wake Island in preparation for a possible Japanese attack on US bases in the Pacific, and the Enterprise departed Hawaiian waters that evening. Halsey’s other carrier, USS Lexington, was moving a bomber squadron to Midway, and the Saratoga was at San Diego for repairs. All three carriers were thus absent when the Japanese attacked Pearl Harbor on December 7th, which is one of the reasons the United States recovered and went on the offensive against Japan so quickly. Halsey learned of the attack while the Enterprise was returning to Hawaii from Wake, and she reached Pearl Harbor on the evening of December 8th. As the crew watched the still-burning wrecks of American warships from the flight deck, Halsey met with Admiral Husband Kimmel and then ordered every able-bodied man aboard his flagship to prepare the Enterprise for departure and battle.

Halsey was the most aggressive fleet commander in the Pacific War, and the Enterprise was his primary weapon. While he lacked the ground strength to invade enemy-occupied islands, he was determined to move and strike as quickly and as often as he could, and the Enterprise and her escorts sailed from one Japanese target to another throughout the early months of 1942, bombing enemy islands and sinking enemy supply ships. She also protected troop and supply convoys headed for Samoa and, on February 1st, raided the Marshall Islands in the largest American attack on the Japanese thus far. Enterprise pilots sank three ships, damaged eight more, and destroyed at least sixteen aircraft and numerous ground installations.

By April 1942, both the public and the Roosevelt administration were eager to hit the Japanese Home Islands, and a plan had been drafted by Lieutenant Colonel James Doolittle of the US Army Air Forces to fly sixteen B-25 bombers off a carrier to attack Tokyo. Halsey’s carrier division, now centered around Enterprise and USS Hornet, would be the main naval strike group for the “Doolittle Raid.” The Hornet would carry the bombers (and thus be unable to launch fighters), while the Enterprise would protect the attack group by flying combat air patrol. The Enterprise left Pearl Harbor on April 8th, met the Hornet coming west from San Diego, and crossed the Pacific heading for Japan. The fleet was sighted by enemy patrol vessels six hundred miles from their targets, and the Hornet launched the bombers early while the Enterprise’s fighters attacked the enemy ships. Their job complete, both carriers and their escorts returned to Pearl Harbor on April 25th. The Doolittle Raid was a stunning success, both in the military and propaganda spheres. While Tokyo suffered little damage, the psychological effect of an attack on their capital led the Japanese to pull some of their air defense strength back from the front to defend the city and their emperor.

Only days after their arrival at Pearl Harbor, the Enterprise and the Hornet were steaming south to join the carriers Lexington and Yorktown in the Coral Sea, but the Japanese attacked before the task force could arrive. The Lexington was sunk in the Battle of the Coral Sea, and the Enterprise returned to Pearl Harbor in late May. Admiral Halsey was beached because of a skin condition, and command of his task force passed to Admiral Raymond Spruance, who sailed with the Enterprise and the Hornet to defend Midway Island in early June, joined by the hastily repaired Yorktown. The initial Japanese attack on the island and the American carriers did little damage, while the first American strike group got lost searching for the Japanese fleet. They found the enemy as the Japanese were rearming their planes and attacked immediately. Three Japanese fleet carriers were sunk, two by dive bombers from the Enterprise, and a fourth had to be abandoned. The Yorktown was badly damaged and later lost to a Japanese submarine. Midway was the turning point in the Pacific War, and the Enterprise had played a key role in stopping the Japanese onslaught once and for all.

The Solomons and Santa Cruz

By mid-1942, the United States had mobilized its economy for wartime production and was putting new warships to sea every week. The Pacific Fleet began to receive small escort carriers, but its three remaining fleet carriers (Enterprise, Saratoga, and Hornet) were still the core of its striking power and thus were deployed in every major engagement going forward. The Enterprise spent a month at Pearl Harbor for crew rest and refitting before joining the fight at Guadalcanal in the Solomon Islands northeast of Australia. The fight for Guadalcanal, waged on land and at sea, was one of the fiercest campaigns of the Pacific War.

The Enterprise’s fighter and bomber groups fought countless actions against the enemy and helped sink the light carrier Ryujo in August 1942. Squadron VF-10 was so effective in aerial combat that it earned the sobriquet “The Grim Reapers.” Enterprise itself took heavy damage that August, but her damage control parties managed to patch her up, and she was able to return once again to Pearl Harbor. Back in the fight in October 1942, the Enterprise fought in the Battle of the Santa Cruz Islands, where the Japanese sank the carrier Hornet and severely damaged “Big E.” With the Saratoga out of action temporarily from an enemy torpedo hit, Enterprise was now the only American fleet carrier in the Pacific. She fought through and pushed the Japanese back at Santa Cruz, and her Seabees (Construction Battalion workers) pulled double duty and worked around the clock to repair as much of the ship’s battle damage as possible before the ship reached an Allied drydock at New Caledonia. The carrier pulled into Nouméa on October 30, 1942, and French yard workers and civilians saw a massive banner fluttering over the flight deck: “Enterprise vs. Japan.”

The Seabees and their French counterparts repaired the Enterprise in record time, earning them the praise of Admiral Halsey, long recovered from his skin condition and now commander of all American naval forces in the South Pacific: “our commander wishes to express to you and the men of the Construction Battalion serving under you his appreciation for the services rendered by you in effecting emergency repairs during action against the enemy. The repairs were completed by these men with speed and efficiency. I hereby commend them for their willingness, zeal, and capability.” These words of high praise were rare from the famously gruff Halsey, and they inspired the ship’s Seabees to redouble their efforts in the coming battles. For the next six months, the Enterprise was the tip of the spear in America’s fight against Japan. She appeared, struck, and disappeared so often that the Japanese called the ship the “Grey Ghost” and speculated that there had to be at least three identical ships bearing the name Enterprise. As the Guadalcanal campaign wound down, Enterprise fliers helped destroy the Japanese battleship Hiei, covered American landings on small islands in the Solomons, and engaged enemy surface ships near the Rennell Islands. She was then ordered back to Pearl Harbor in May 1943, where she was presented with the first Presidential Unit Citation ever given to an aircraft carrier. She then steamed for Puget Sound for major repairs and upgrades.

From Washington to the Philippines, and Back Again

The 1943 refit of Enterprise included new anti-aircraft guns, upgraded plane elevators, new workshops for the Seabees, and an “anti-torpedo blister” to protect her hull. She returned to Pearl Harbor in November 1943 and joined Admiral Spruance’s Fifth Fleet. The Enterprise was smaller than the new Essex-class carriers and Iowa-class battleships, but her famous name and fearsome reputation inspired sailors aboard any ship which could see her in the distance. Now serving in the Central Pacific Theater under the overall command of Admiral Chester Nimitz, the Enterprise spearheaded the assault on the Gilbert and Marshall Islands, striking enemy shipping and shore installations before the Marines went in to capture the islands. She then joined the attack on the Mariana Islands, hitting Saipan and supporting troops liberating the island of Guam. As the summer of 1944 approached, the crew of the Enterprise knew that their next target would be the Philippines, America’s largest prewar possession and the final goal of the central Pacific offensive.

Admiral Spruance deployed the Fifth Fleet west of the Mariana Islands in the Philippine Sea to screen for enemy defenses as Army and Marine units boarded their transports for the assault on the Philippines. The Enterprise and three other carriers were in the van when the Japanese attacked on June 19, 1944. The Battle of the Philippine Sea was the largest carrier battle in history. Enterprise bombers struck Japanese battleships and cruisers, which burst into flames as bombs and torpedoes found their mark, and her fighters blasted one enemy Zero after another from the skies. The carrier took moderate damage during the battle and returned to Pearl Harbor (as it turned out, for the last time during the war). She then returned to the Fifth Fleet in time for the invasion of the Philippines.

During the preliminary moves toward the Philippines, the Enterprise made minor attacks on small Japanese islands and one large raid on Formosa, but her crew husbanded their resources for the battle to come. Before the Americans could land on the many islands of the Philippines, the Japanese Navy would have to be brought to battle one last time and its strength destroyed by the overwhelming might of the Fifth Fleet. The two sides engaged each other in the last major naval battle of the war at Leyte Gulf in late October 1944. This battle produced heroes on both sides and saw some stunning acts of courage on the part of American destroyer captains (most famously Ernest Evans aboard USS Johnston), and it was the Enterprise’s greatest test of the war. The carrier’s pilots sank one Japanese ship after another in the largest naval battle in history, and she suffered two direct hits from enemy kamikaze planes. Fortunately, the damage was minor, and she emerged from Leyte Gulf largely unscathed. Her record for the battle stood at three enemy ships sunk and 52 planes shot down, the largest count of any ship in the battle.

With the Japanese now confined largely to land and unable to project power at sea, the Enterprise returned to supporting landing forces and conducting small air strikes on enemy-held islands. She participated in the Battle of Iwo Jima, at one point maintaining a continuous combat air patrol over the island for seven days and eight hours. Whenever she was damaged, the carrier sailed to Ulithi Atoll in the Caroline Islands north of Indonesia and then returned to the fight, usually within three or four days. As the Battle of Okinawa heated up, the Enterprise was repeatedly attacked by suicidal kamikaze planes. On May 14, 1945, six days after the war in Europe had ended, a kamikaze Zero fighter crashed into her forward elevator, destroying the mechanism and killing or wounding almost fifty sailors.

Now unable to launch aircraft at full capacity, the Enterprise set sail for Puget Sound, where she was repaired and upgraded for a second time. However, two days before she was scheduled to return to action, the United States dropped an atomic bomb on Nagasaki, and the war ended only a few days later. Of the prewar American aircraft carriers that had fought in the Pacific, only two survived the war (Enterprise and Saratoga). Both were heroes of the Pacific war, but the Enterprise stood tall above her sister and all other warships of the United States Navy.

The End of “Big E”

For the next several months, the Enterprise became what one sailor aboard called “a glorified ocean liner.” She ferried returning veterans home from Pearl Harbor in the weeks after the war’s end and then sailed for the East Coast, where her hangars were filled with bunks. The carrier then crossed the Atlantic three times to retrieve veterans of the European war. She was honored by the British Admiralty in a ceremony at Portsmouth in November 1945, and by early 1946 her labors had ended.

Many World War Two-era ships have been turned into floating museums, but this honor was sadly denied to both veterans of the early Pacific war. Saratoga was sunk in an atomic bomb test at Bikini Atoll in 1946. She survived the first explosion, but the second sent her to the bottom. Enterprise was spared this ignominious end, but her fate was hardly more glorious. The “Big E,” pride of the US Navy and symbol of American strength in the darkest hours of the war with Japan, was decommissioned in 1947 and sold for scrap. By 1960, only the ship’s bell (which is now at the US Naval Academy in Annapolis, MD), the stern name plate (now in River Vale, NJ), and an anchor at the Washington Navy Yard remained of the great ship.

Of course, the Department of Defense was determined not to allow the name Enterprise to pass into history, and in 1958 the Navy laid the keel of the world’s first nuclear-powered aircraft carrier, USS Enterprise. After 51 years of service, that Enterprise was inactivated in 2012, and Secretary of the Navy Ray Mabus announced at her inactivation ceremony that the next Gerald R. Ford-class aircraft carrier would also carry the name Enterprise. When it comes to the United States Navy, names carry with them the legends of those ships which came before, and history will surely not forget the name Enterprise.

AVAILABLE WHEREVER YOU LISTEN TO PODCASTS

Standing Athwart History | William F. Buckley, Jr.

“I am obliged to confess I should sooner live in a society governed by the first two thousand names in the Boston telephone directory than in a society governed by the two thousand faculty members of Harvard University.”

— William F. Buckley, Jr. —

As we discussed in last week’s episode, the Great Depression led to a fundamental transformation of one of America’s two political parties and a revolution in the American political order. The Democratic Party under Franklin Roosevelt embraced the concept of a “welfare state” that sought to protect Americans from the ups and downs of market capitalism using the power of government. As has been said before, Fifteen-Minute History takes no position when it comes to political questions, and this must be restated here. Today, in the second episode of our series on American political philosophy, we will discuss the man who led the charge to define modern American conservatism, the political opposite of progressive liberalism.

William F. Buckley is not a well-known figure to most Americans today, but his impact is felt everywhere, from the halls of Congress and the White House to talk radio and the Fox News Channel. It was Buckley who saw the need to unite various factions within the conservative movement into a coherent social and political force to, as he put it, “stand athwart history yelling ‘Stop.’” Buckley was born in New York City in 1925 and educated in Paris and London.

He came late to the English language, first learning both French and Spanish, and this contributed to both his idiosyncratic accent and vast vocabulary. He attended Yale University, where he learned the art of debate and became a master of argument (a skill he put to great use in his many public and television appearances). In 1951 he joined the Central Intelligence Agency, and he began his writing career that same year. His first book quickly defined his image with the American public.

God and Man at Yale

As a student at Yale, William F. Buckley was concerned that the school was imposing what he called a “collectivist, Keynesian, and secularist ideology” upon its students. Rather than embracing the traditional role of the university and encouraging open dialogue and free thought, Buckley asserted that modern American education was an exercise in forcing students to adopt progressive beliefs regardless of how they had been raised. Having grown up in a conservative Catholic family, Buckley resented how his professors had tried to break down his religious faith and to question the existence of God rather than encouraging individual intellectual growth through questioning—as had been common in Western education since the time of Socrates.

God and Man at Yale landed in the American academic world like a bombshell. When it was first published, most intellectuals believed its initial popularity would fade, but it touched a nerve within middle America, especially among parents who listened to their children’s talk around the Thanksgiving and Christmas dinner tables when they were home from university. They saw how their offspring had drifted away from the traditions of their youth, and God and Man at Yale helped these parents understand why.

Buckley continued to provoke strong reactions in his writings. His second book, McCarthy and His Enemies, defended the controversial Wisconsin senator as he pursued communist infiltrators within the American government during the so-called “Red Scare.” Throughout the 1960s, his books attacked the liberal order and the welfare state, and while they seldom earned favorable reviews from his East Coast peers or the academic world, they sold hundreds of thousands of copies and demonstrated that Buckley’s views were shared by more than just a handful of archaic conservatives in the segregated South and rural West.

National Review

Buckley was not the only conservative intellectual writing in the 1950s. A professor at Michigan State, Russell Kirk, published The Conservative Mind in 1953, in which he outlined the history of American conservatism and traced its modern principles to what he believed were their roots in the American founding. The Conservative Mind provided a detailed, academic description of conservatism, but amidst the storm of criticism it sparked from academia, the message was lost on average Americans, who found Kirk’s emphasis on wordy quotations from long-dead statesmen like Edmund Burke and John Adams difficult to apply to the modern world. There remained a vacuum in American society for conservative opinions, and William F. Buckley was determined to fill it.

National Review “stands athwart history, yelling Stop, at a time when no one is inclined to do so, or to have much patience with those who so urge it.” These words, written by Buckley in the Mission Statement for National Review magazine in 1955, were the opening volley in the literary movement to define conservatism for average Americans. As founder and editor of the magazine, Buckley brought together a group of contributors to write for him, many of whom disagreed with each other (and with their employer). Buckley looked for men and women who could express the principles of conservatism in clear, unambiguous terms and translate them into applicable precepts for their readers. Some writers, like Russell Kirk and the Catholic intellectual Brent Bozell (Buckley’s brother-in-law), pushed the traditional conservative message of faith and family; libertarians such as Frank Meyer argued for a limited government which acted only under the Constitution; and the anti-Communist Whittaker Chambers translated his experiences with American communism into a warning that, in his opinion, liberals were drifting toward socialism with their policies.

Buckley used his magazine to explain how conservative principles could be put into action in the United States. He also set limits on how one would define an American conservative: “It is the job of centralized government (in peacetime) to protect its citizens' lives, liberty and property. All other activities of government tend to diminish freedom and hamper progress.” His call back to the principles in the Declaration of Independence meant that, in Buckley’s view, certain people and groups who called themselves “conservative” actually were not. For example, Buckley explicitly denounced anti-Semitism and racism, as well as white supremacists like George Wallace, the segregationist Alabama governor who ran for president four times during Buckley’s lifetime. He also opposed the John Birch Society, a collection of authoritarian right-wingers who supported fascism in their efforts to slow the spread of communism around the world, and he rejected the ultra-libertarian philosophy of Objectivism and its patron saint, Ayn Rand (author of, among other works, Atlas Shrugged).

However, Buckley’s strong stance on the Constitution and its endorsement of states’ rights led to a great deal of controversy during the Civil Rights Era. Buckley and National Review supported segregationists and defended their views as consistent with the Constitution—though the magazine did urge southern states to permit African-Americans to vote without paying poll taxes or taking literacy tests. In 1957, Buckley wrote that whites in the South “had the right to impose superior mores for whatever period it takes to effect a genuine cultural equality between the races.” In effect, he was saying that temporary segregation was beneficial because black Americans lacked the cultural and educational sophistication of whites. Buckley’s brother-in-law Brent Bozell broke with National Review on this issue, and during the 1960s the magazine softened its tone on civil rights as white supremacists brutalized African-Americans who were seeking equality. Buckley admitted later in life that he wished he had been more sympathetic to the civil rights movement, and he encouraged his readers to write to Congress in support of the creation of Martin Luther King, Jr. Day as a national holiday. Nevertheless, he remains a controversial figure when it comes to questions of race in America.

National Review has endorsed many presidential candidates since its founding, always the “most rightward viable candidate” (what is now known as the “Buckley Rule”). Most famously, National Review supported Senator Barry Goldwater of Arizona in his challenge to President Lyndon Johnson in 1964, which Goldwater lost in a landslide. Many at the magazine were drawn to the actor Ronald Reagan, who gave a televised speech in support of Goldwater during the campaign, and Reagan (a National Review subscriber) soon came to embody the conservative philosophy it espoused. When Reagan challenged Gerald Ford in 1976, Buckley and National Review supported his insurgency, and they were overjoyed four years later when Reagan was elected president. The Reagan years saw National Review reach its peak in subscribers and influence. It supported much of Reagan’s agenda, was regularly cited in the president’s speeches, and its contributors often came to the White House for both policy briefings and public events. In the years since 1988, National Review has continued to promote traditional conservative positions, criticizing Bill Clinton’s welfare programs, supporting George W. Bush’s War on Terror and tax cuts, and opposing Barack Obama’s national healthcare plans. The magazine opposed Donald Trump in 2016, endorsing Senator Ted Cruz in the Republican primary, and it continues to hold President Trump’s feet to the fire whenever his actions stray from traditional conservative ideology.

Firing Line

William F. Buckley’s conservative voice earned him occasional spots on television throughout the 1950s and early 1960s as a commentator on world events. His relaxed posture, elegant accent and overpowering vocabulary were very popular with news consumers, and by 1966 he was a regular on CBS and NBC’s nightly news programs. In 1968, ABC hired Buckley to offer commentary on that year’s national conventions for the two political parties. As Buckley’s foil, ABC chose Gore Vidal, the controversial author and liberal intellectual. Buckley had once commented that he would never share a stage with Vidal, whose open homosexuality and liberal politics offended him, but the two met and discussed the conventions in a (mostly) civilized manner. However, during an exchange on the violence of the Chicago police during the Democratic convention on August 28, 1968, Vidal called Buckley a “crypto-Nazi.” Visibly angered, Buckley lost his usual calm demeanor. He rose from his chair several inches and retorted, “Now listen, you (beep), stop calling me a crypto-Nazi or I’ll sock you in your (beep) face, and you’ll stay plastered.” Vidal had told his friends he hoped to anger Buckley on national television and thus disgrace him before his conservative fans, and he had gotten his wish. Historians point to this moment, which saw a massive audience reaction both for and against Buckley, as the beginning of modern political debate shows on television. Buckley was ashamed of his actions, but his feud with Vidal continued, and the two men traded barbs in print and interviews for the rest of their lives.

In 1966, Buckley began to host his own TV talk show called Firing Line. Broadcast first on a local New York television station and then nationally on PBS, Firing Line ran for 34 seasons with more than fifteen hundred episodes in all. The show typically brought liberal academics or politicians on to debate Buckley, who always remained calm—he had learned his lesson with Vidal. When he did jab his opponents, he was always polite, for example when he asked his liberal friend Mark Green during their 100th appearance together on the show, “Tell me, Mark, have you learned anything yet?” Firing Line occasionally had non-political figures on to discuss American culture or current events. Two of the most memorable shows featured the boxing champion Muhammad Ali discussing black nationalism and the poet Allen Ginsberg giving his views on hippie and drug culture. Firing Line also hosted formal debates between presidential candidates moderated by Buckley, as well as political or cultural debates in which Buckley always led the affirmative team. Firing Line showed America that its political and intellectual leaders could engage in civil debate with the other side rather than shouting talking points at each other.

As cable television grew in popularity in the 1980s and Firing Line began to compete with CNN’s Crossfire, its ratings began to decline. When the Fox News Channel debuted in 1996, America found newer, louder voices for conservative talk on television from the likes of Bill O’Reilly and Sean Hannity. Buckley and his producer Warren Steibel ultimately canceled Firing Line in December 1999, ending the longest-running television series with a single host in history. Nineteen years later, as the Trump era brought new rancor to political debate in America, PBS revived Firing Line with a new host, the Republican activist Margaret Hoover (great-granddaughter of President Herbert Hoover), and the show has maintained its founder’s format and characteristic civility.

Miles Gone By

In addition to his political works, William F. Buckley also published a series of spy novels featuring the fictional CIA agent Blackford Oakes. Drawing on his experiences with the CIA in the 1950s, he wrote eleven novels and a companion reader from 1976 to 2005. Buckley also wrote other fictional works as well as an autobiography, Miles Gone By, published in 2004. Buckley grew wary of the conservative movement’s embrace of nation-building and domestic spying in the wake of the 9/11 attacks, and The American Conservative magazine wrote that “at the end of his life, Buckley believed that the movement he had made had destroyed itself by supporting the war in Iraq.” (Of course, Buckley’s criticism of Republican orthodoxy was nothing new—he had broken with the party in the 1990s by writing a book advocating for an end to the drug war and the legalization of marijuana.)

In March 2000, as the year’s presidential campaign was heating up, Buckley published an article in Cigar Aficionado titled “Politics—The Demagogues are Running.” In it he criticized several candidates for appealing not to political ideology to earn votes but rather for promising the people whatever they wanted, regardless of the benefit to the country. He blasted Bill Bradley for his borderline-socialist policies (the New Jersey senator was running to the left of Al Gore in the Democratic primary) and Republican Steve Forbes for trying to buy the nomination from George W. Bush. Interestingly, he also shared his thoughts on a man who was considering running on the Reform Party ticket: “What about the aspirant who has a private vision to offer to the public and has the means, personal or contrived, to finance a campaign? In some cases, the vision isn't merely a program to be adopted. It is a program that includes the visionary's serving as President. Look for the narcissist. The most obvious target in today's lineup is, of course, Donald Trump. When he looks at a glass, he is mesmerized by its reflection. If Donald Trump were shaped a little differently, he would compete for Miss America. But whatever the depths of self-enchantment, the demagogue has to say something. So what does Trump say? That he is a successful businessman and that that is what America needs in the Oval Office. There is some plausibility in this, though not much. The greatest deeds of American Presidents—midwifing the new republic; freeing the slaves; harnessing the energies and vision needed to win the Cold War—had little to do with a bottom line. So what else can Trump offer us?”

On February 27, 2008, William F. Buckley was found dead in his study. He had died of a heart attack while suffering from emphysema and diabetes. His wife Patricia had predeceased him, and he was survived by his son Christopher. Tributes to his leadership of the conservative movement poured out across the airwaves, and a man who had shaped his country’s intellectual climate for half a century was laid to rest in a simple plot of earth in Sharon, CT, next to his wife.

AVAILABLE WHEREVER YOU LISTEN TO PODCASTS

A “New Deal” | The Great Depression and the Transformation of American Politics

“All that progressives ask or desire is permission…to interpret the Constitution according to the Darwinian principle; all they ask is recognition of the fact that a nation is a living thing and not a machine.”

— Woodrow Wilson, The New Freedom, “What is Progress?” —

The United States of America has, in many ways, always been a progressive nation. We progressed beyond the limits of colonial existence under British rule, beyond the shackles that held our fellow citizens of African descent in bondage, beyond the geographical limits of the Thirteen Colonies. And yet, America has also been a nation of conservatives, clinging to the principles of representative government promised in the English Bill of Rights, to the Constitution amidst the turmoil of civil war, to the way things have always been. For much of its history, progress was driven by individuals who invented new labor-saving devices; only once, during the Civil War, did the federal government impose what we would call “progress” on the nation when it abolished slavery in the Thirteenth Amendment. That began to change, however, during the late 19th century when the progressive movement began to urge social advancement using the power of government. The arch-progressive of the early 20th century, Woodrow Wilson, urged the American people to support his programs in his 1913 book The New Freedom, and his presidency saw waves of change sweep across the country in finance, political reform, and, most importantly, women’s suffrage with the passage of the Nineteenth Amendment to the Constitution. However, when the First World War ended, Americans elected Warren Harding and Calvin Coolidge, who returned to the conservative roots of American politics. Coolidge was famous for his desire to remain out of economic and social affairs. Progressives tried to regain their momentum, but only when the Great Depression struck did their cause find a receptive audience.

Here on Fifteen-Minute History, we do our best to remain nonpartisan in how we teach American history, but political affairs have obviously shaped the course of America’s growth and development. In the next two podcasts, we will trace the rise of the modern conservative and liberal political ideologies back to their roots in American history. Our hope is that we can give our audience a clear and unbiased view of where Republicans and Democrats got their ideas and how the two parties came to believe what they now proclaim each night on cable news and in their election campaigns.

The United States had faced economic downturns since at least 1807 when President Thomas Jefferson imposed the Embargo Act on the country and collapsed the national economy. What happened in 1929 was, at least on paper, little different from the panics of 1819, 1837, 1857, 1873, 1884, and 1907. Of course, its impact was felt far more dramatically by the American people because, during the 1920s, more Americans than ever had begun to invest their hard-earned money in the stock market and various industrial ventures. As New Yorkers saw investors hurling themselves from buildings as their fortunes vanished, the American people realized that their way of life was on the brink of collapse. Soon, millions of Americans were out of work and unable to feed their families. The crisis deepened when Dust Bowl storms devastated the Great Plains and wiped out farms and livestock. In America and across the world, the people turned to their leaders and cried out for relief.

President Herbert Hoover’s efforts to stem the tide of economic distress had utterly failed as his tariff proposals drove prices up and destroyed more jobs than they created. By 1932, his presidency was in ruins, and while he still won the Republican nomination his defeat was almost certain. The Democratic Party, which had been out of power since the Civil War except during the administrations of Grover Cleveland and Woodrow Wilson, saw an opportunity to regain the White House. This was not merely political opportunism; the party had long wished to present itself to the people as an agent of positive, progressive change. It just took an economic crisis of global proportions to bring the American people around to their side.

Over the course of sixteen primary contests in 1932, the Democratic Party had winnowed its field of eight candidates down to three. At their convention in Chicago at the end of June, two former governors of New York, Franklin Roosevelt and Al Smith, and Speaker of the House John Nance Garner of Texas vied for the nomination. Roosevelt’s supporters represented Southern segregationists and western farmers (both traditional Democratic strongholds), as well as ethnic minorities and urban elites. Smith’s support came from the political machines of New York City and Chicago, but his base did not extend beyond these two cities. Garner had little backing from the political establishment but did earn favor with the powerful California newspaper publisher William Randolph Hearst. In the end, the party chose Roosevelt, and in his acceptance speech on July 2nd he promised “a new deal for the American people.”

Roosevelt’s nomination represented a change in direction for the Democratic Party. Since its inception under Andrew Jackson in the late 1820s, the Democrats had always been a party of Southerners, Westerners, and Northern bankers. This last group had shifted to the Republicans during the Civil War and became that party’s strongest backers—one reason the Republicans had dominated the political landscape since the 1860s was that they had the money. By bringing ethnic minorities and urban elites into the Democratic fold, Roosevelt was able to recast the party as the one which spoke for the American underclass. His message of a “new deal” resonated far more effectively than even he had hoped, and his victory over Herbert Hoover was among the most lopsided in American history.

A “New Deal” for the American People

Franklin Roosevelt’s plan for reviving the American economy centered on a single economic principle, first articulated by the British economist John Maynard Keynes. In his Treatise on Money, Keynes wrote that governments needed to increase their spending (if necessary by borrowing money and running deficits) during an economic downturn to maintain total national spending. Once the economy recovered, government spending would decrease as the nation’s private sector began to grow. To put this plan into action, Roosevelt began to create new programs, collectively known as the New Deal, to stabilize the economy and put the American people back to work. During his first one hundred days in office, the administration declared a “bank holiday” and reformed the banking system, printed vast sums of money and took the country off the gold standard, backed the repeal of the Eighteenth Amendment to raise revenue through the sale of alcohol, and, most critically, created massive public works programs to create jobs.

The Public Works Administration built airports, hospitals, roads, and dams; the Civilian Conservation Corps planted forests, drained swamps, and built national parks; and the Works Progress Administration constructed public buildings like docks, theaters, and observatories across the country. These one hundred days set a standard of federal government action on behalf of struggling Americans that continues to this day.

The Roosevelt administration also raised taxes on wealthy Americans to help pay for the recovery and limit deficit spending. The 1935 Revenue Act imposed a top marginal rate of 79% (the highest since the end of the Great War) and redistributed the wealth of rich Americans to the poor. Many of Roosevelt’s supporters felt betrayed by this measure, as wealthy elites had been among his strongest backers during the 1932 campaign. A year later, the government raised taxes on companies which held their profits in reserve rather than spending them on new equipment or raising employees’ salaries. This angered many American business owners, and the act was repealed after only two years. However, it put the Democratic Party firmly in the progressive taxation camp, where it has remained ever since.

Most of the New Deal programs created during the Depression were ended either during or after the Second World War. One which has stood the test of time is Social Security, which guarantees a pension to older Americans once they retire. Social Security was probably the most controversial action of the Roosevelt administration during the Depression—one Republican opponent called it “the lash of the dictator”—but it has become one of the most popular entitlement programs in American history. The program has been reformed several times but never departed from its basic structure: Americans pay a portion of their income out of each paycheck to fund current retirees with the promise that future workers will pay for their retirement in return. Again, the Democratic Party took a position in favor of a welfare state and ensuring the well-being of the American people.

Unfortunately, while the New Deal helped some Americans, it did not end the Great Depression. A recovery of sorts began in 1936, but a year later a second recession struck the country and drove unemployment numbers back up. President Roosevelt’s efforts at new programs were opposed by a conservative backlash, especially on the Supreme Court, and he sought to add justices to the court in order to get his programs through. This “court-packing” plan backfired, but as justices retired, Roosevelt was able to seat New Dealers on the Supreme Court who upheld his programs as constitutional. What ultimately ended the Great Depression was the outbreak of the Second World War. By 1941, as the United States began to ship arms to Great Britain and the Soviet Union, unemployed Americans found work in factories, and when Japan attacked at Pearl Harbor and America entered the war, the country reached full employment in the first weeks of 1942. Fears of a return to depression peaked as the war drew to a close, but the skies soon cleared as America celebrated its victory. There have been economic downturns since the Great Depression (most recently in 2008), but America has not again suffered the woes of a national depression.

The Political Legacy of the New Deal

The 1936 presidential election would be the first test of Roosevelt’s “New Deal coalition.” The great question was whether or not it would hold together and, more broadly, if the American people would support their president’s radical transformation of the American economy. The Republicans chose Governor Alf Landon of Kansas to run against Roosevelt, and most observers predicted the campaign would be close. However, Landon was an ineffective candidate who agreed with Roosevelt on most issues; he supported the entirety of the New Deal but then criticized it as “anti-business” and insisted he could do a better job of managing the economy. On election day, the American people returned Roosevelt to the White House with the largest popular vote margin of any campaign since 1820. Roosevelt won every state in the Union except Maine and Vermont, taking eleven million more votes than Governor Landon.

Roosevelt would go on to win reelection twice more (breaking the precedent set by George Washington of serving only two terms) and would die less than three months into his fourth term in 1945. Alf Landon’s campaign of supporting the New Deal’s new welfare state and interventionist attitude in the economy would become a model for the Republican Party for more than three decades. Except in 1948, in every presidential election from 1936 to 1972 the Republican candidate would express support for welfare to help poor Americans but insist that he was better able to run these programs. The shift in the Democratic Party during the New Deal was thus mirrored in the Republican Party in the decades after the Depression—rather than critique the idea of progressive intervention in the economy, the Republicans accepted it as necessary but promised a “me-too-only-better” vision for the country. For Dwight Eisenhower and Richard Nixon, this campaign model proved successful (though in each case there were likely other contributing factors). By 1972, the Republicans had wholeheartedly embraced both the New Deal and its successor, Lyndon Johnson’s Great Society, and President Nixon went even further than either Roosevelt or Johnson could have dreamed when it came to government action in the economy. He imposed wage and price controls, created the Occupational Safety and Health Administration and the Environmental Protection Agency, and became the first president since Roosevelt to support a universal healthcare system. All this came out of his campaign promise to clean up the “welfare mess.”

So what became of conservatism within the American political system? For much of the middle portion of the 20th century, critics of conservatism linked the ideology to the economic policies which had created the Great Depression (though this critique is debatable). When conservatives tried to stand up against the ever-expanding reach of government regulation, the cry went up that these politicians wished to return America to the “Roaring Twenties”—implying that a second Depression would be just around the corner if their ideas were implemented. The years from 1936 to 1972 marked the summit of progressivism, not just in the United States but around the world. The federal government protected workers, paid benefits to the unemployed and the elderly, provided medical care to the disabled, and regulated the business cycle to prevent a depression. Only time would tell if this progressive wave could be sustained.

AVAILABLE WHEREVER YOU LISTEN TO PODCASTS

Legends of the Old West | The Life and Times of Wyatt Earp

"Fast is fine, but accuracy is everything. In a gun fight... You need to take your time in a hurry."

— Wyatt Earp —

It’s Wednesday, October 26, 1881. The sun has begun its descent into the western sky, its light reflecting off the heat waves coming from the hot desert soil. Four men walk through the entrance to a corral. They have come to disarm five gunslingers who have openly broken the law and made threats against them. When they finally stop walking, the four men stand six to ten feet away from the men they are there to apprehend. Few words are exchanged. The leader, Virgil Earp, gives the command to the group of criminals to throw down their arms. Of the five, Ike Clanton and Billy Claiborne flee the scene. The remaining three draw their weapons. The four marshals draw theirs. Within thirty seconds, it’s over. As the smell of burnt powder and dust clears in the arid air, three outlaws are dead, three lawmen are wounded, and one stands coolly in the wake of the violent exchange. A close friend would later describe him as “a person whom I regarded as absolutely destitute of physical fear. His daring and apparent recklessness in time of danger is wholly characteristic.” This man was Wyatt Earp.

Life in the Old West

Life in the Old West, from 1865 to 1895, was unlike anything you, our audience, can imagine; even films and television fail to capture its true nature. Although the West already belonged to the United States, expansion into these territories was limited because the land was still untamed and lacked order. However, after the Civil War, veterans seeking adventure and landowners who had lost everything began to migrate west at great peril to themselves and their families.

Many Americans believed they had a “manifest destiny” to claim the untouched lands of the West for themselves and their nation. As more and more people settled across the Mississippi and beyond the Rocky Mountains, some brought law and civilization while others brought crime and suffering. Within the untamed lands of the west, those that were disposed to lawlessness became more so, and makeshift courts and law enforcement agencies were created to bring order to this chaos. Judges, US marshals, sheriffs and their deputies would rein in the wild side of each town and territory, often with little or no training. Judge Roy Bean knew so little about the law that he once threatened a lawyer with hanging for using profane words like “habeas corpus.” Weaponry, guile, personal experience, and both good and bad intentions were their only tools.

Travel was dangerous in those days. The average wagon traveled at a speed of two miles an hour, covering between ten and fifteen miles per day. For families going from Missouri to Oregon or California (a journey of roughly two thousand miles), this meant a five-month journey. A single person on horseback was a different matter: mounted companies could cover upwards of fifty miles per day, and lone riders often went even further. Even so, lone travel was discouraged. A single rider in an empty landscape was a tempting target, while a large group would deter both man and beast.

As the migrations continued, boom towns emerged around silver and gold deposits throughout the west. As word spread back east, those looking for adventure and riches soon found themselves on the same trails as families and others seeking to start anew. As the boom towns grew, so did every possible establishment bent on making a quick buck. Generally, this revolved around whiskey, gambling, and brothels, some of which took in more than $4 million in today’s money. Lawlessness thrived in these environments, as it sometimes does today. Lawmen and the courts did their best to maintain the “thin blue line” between civilization and barbarism.

Tragedy and Purpose

Wyatt Earp was born on March 19, 1848. The fourth of seven children, he spent his early life in Illinois. When Wyatt was a year old, his father organized a group of a hundred settlers to travel to San Bernardino, CA, where he planned to buy some land. Unfortunately, Wyatt’s sister became ill just 150 miles into the journey west, and the family settled in Iowa. When Wyatt was thirteen, several of his brothers joined the Union Army while his father worked with local companies. This left Wyatt and his remaining brothers to care for the eighty-acre farm alone.

In May 1864, Wyatt’s father once again organized a group to head to San Bernardino, arriving in December of that year. Wyatt got his first job at the age of sixteen with his brother Virgil hauling cargo for two companies to Las Vegas, the Utah Territory, and Nevada. During this time, he learned how to referee boxing matches and to gamble, both of which he found very lucrative.

At the age of twenty his family moved back east to Lamar, MO, where he got his first law enforcement job as a constable. It was there that he met, courted, and married his first wife in 1870. He began to build a house while running a hotel with his in-laws. During this time Wyatt was said to have begun the process of settling down, working nights and weekends on the new home while spending his days at the hotel. Unfortunately, tragedy soon struck. Shortly before his wife was due to give birth, she and their unborn child died of typhoid fever, sending Wyatt into a downward spiral for the next four years. He was arrested several times: for frequenting “houses of ill repute,” for running several brothels and saloons, and for being intoxicated (which was illegal at the time).

By 1875, Dodge City, KS, had become a main thoroughfare for cattle drives due to its position along the cattle trails from Texas. Wyatt briefly served as an assistant marshal, and after a brief but failed attempt to make money in the Dakota Territory mines, he rejoined the police force in Dodge City in 1877. Wyatt was involved with several disputes in Dodge. The town was a rest stop for cowboys (a derogatory name at the time) exhausted from the cattle drives and ready to sow their wild oats. The normal process was for herds to come through and be put to pasture, while a selection of the team responsible for herding them would descend on the town to drink, smoke, gamble, and populate the “houses of ill repute.” This provided ample opportunities for the law to be enforced.

One such occasion occurred after an outlaw robbed a railroad construction camp and fled the city. Wyatt was made a US marshal and sent in pursuit; he followed the outlaw until he lost the trail in Texas. At his last stop in Texas, he was trying to get additional information about the outlaw when another patron informed him that his target had gone back to Kansas. Very little is recorded of this and other conversations between the two men, but whatever was said sparked something in the patron that would save Wyatt’s life a year later.

That incident started when some cowboys ransacked one of the many saloons in Dodge City and harassed or assaulted customers while firing their guns wildly. Wyatt confronted the men, bursting through the saloon door to put a stop to the madness. He soon found himself confronted by anywhere from three to nine guns (depending on the account), all pointed at him.

It was at this moment that the patron from the Texas bar rose from a back table and put a pistol to the cowboy leader’s head, ordering him to stand down. He did, and the men were taken into custody. At that moment, Wyatt Earp and John Henry “Doc” Holliday became close friends. During his entire time in Dodge City, Wyatt Earp was involved in only one major gunfight. A man did die as a result, though differing reports make it unclear whether he died from the gunshot that night or from gangrene a few weeks later. Regardless, the incident was something Wyatt would never forget.

Tombstone and the OK Corral

In 1879 Virgil Earp, who was a lawman in Prescott, AZ, wrote to Wyatt about a growing mining town called Tombstone. It was good timing, as Dodge City had begun to settle down, or as Wyatt described it, "Dodge was beginning to lose much of the snap which had given it a charm to men of reckless blood, and I decided to move to Tombstone, which was just building up a reputation." The Earps (Wyatt, his second wife Mattie, his brother Jim and Jim’s wife), Doc Holliday and his common-law wife “Big Nose” Kate left for Prescott not long after to meet up with Virgil. The group then departed for Tombstone and were joined by another Earp brother, Morgan, who had left his wife in California to strike it rich in the new silver town.

Virgil had already been hired as a deputy US marshal for the Tombstone area, and as he settled into his new post Wyatt and his party began the process of getting acclimated to the new town. Wyatt got a job as a “shotgun” messenger for Wells Fargo while Jim became a bartender. Doc Holliday immersed himself, quite successfully, in the gambling dens of the boom town and began to accumulate a small fortune.

During this time, a group of outlaws known as the Cowboys were present in Tombstone. Wyatt and his brothers had several early run-ins with this gang. One such incident was the death of the town marshal, Fred White, at the hands of Curly Bill, one of the Cowboys. Wyatt was one of the first on the scene as White dropped to the ground after an apparent discharge from Curly Bill’s gun into his groin. Earp pistol-whipped Bill to the ground and held him there until backup arrived a few minutes later. During those minutes, other Cowboys took shots at Earp from the darkness. When help arrived, one of Wyatt’s friends, Fred Dodge, reported his demeanor as rounds sailed past them: “Wyatt's coolness and nerve never showed to better advantage than they did that night. When Morg and I reached him, Wyatt was squatted on his heels beside Curly Bill and Fred White. Curly Bill's friends were pot-shooting at him in the dark. The shooting was lively and slugs were hitting the chimney and cabin…. in all of that racket, Wyatt's voice was even and quiet as usual.”

Wyatt and his brothers had continual run-ins with the Cowboys until Morgan was finally threatened with death if he or his brothers arrested any of them again. The threats continued for several weeks until October 26, 1881, when the four men—Virgil, Wyatt, Morgan and Doc—walked through the entrance of the OK Corral in an attempt to disarm the outlaws.

Here, we find Ike Clanton and Billy Claiborne fleeing the scene. Tom and Frank McLowery and Billy Clanton are holding their ground as the two parties draw their weapons and open fire. Two of the Cowboys are hit at once, with Virgil and Morgan being shot not long after. Doc Holliday lays down continuous fire at all three, first with a coach gun and then with his nickel-plated revolvers. Only Wyatt remains in place, methodically firing at the targets that still pose a threat. As the thirty seconds pass and the shooting stops, two men are left standing and two are on the ground, wounded. Witnesses to the fight cited both Wyatt and Doc as the pivotal gunmen while the shooting lasted. Once it was over, Wyatt would credit Doc again with saving his life, adding that his friend was the deadliest gunman he had ever seen.

The gunfight is the quintessential scene of the American West, immortalized as the way we see this time period. This is true for several reasons. First, such gunfights were uncommon. Despite popular belief, they were not a staple of life in the Old West, and when they did happen, there were rarely witnesses. Second, and this shouldn’t surprise you, there were a lot of people watching the fight at the OK Corral. Tall tales were plentiful in the West, and had it not been for the many eyewitnesses to the events of October 26, 1881, historians might not have known of the cool behavior of Wyatt Earp and the deadly accuracy of Doc Holliday. The showdown was chronicled in newspapers that circulated throughout the boom towns and even reached readers back east, adding to the legends of what lay beyond the Mississippi River.

During his waning years Wyatt was interviewed by a biographer who wrote a very flattering version of the incident in a book which was published a few years after Wyatt’s death. The book hit the shelves in time for the cowboy craze of Hollywood that spanned several decades, and the gunfight at the OK Corral became part of the iconic American view of the cowboy, the outlaw, the lawman, and the West as a whole.

After the OK Corral

Wyatt, his brothers, and Doc were put on trial for the murder of the Cowboys at the corral and were acquitted. The Cowboys swore revenge on the Earps; they ambushed and maimed Virgil later that year and murdered Morgan a few months later while he was playing pool at a saloon in Tombstone. Devastated, Wyatt, Doc, and several of their close friends killed all of the Cowboys responsible for the attacks, as well as those who helped the murderers or even knew of their plans. During this time, reports of some of the gunfights between the Cowboys and Wyatt’s posse grew from facts into legends. After his quest for vengeance ended, Wyatt abandoned his wife and pursued Josephine Marcus in San Francisco, with whom he stayed for the next forty years until his death. During this time, they traveled from one boom town to another, with Josephine developing a gambling habit that plagued Earp for the rest of his life. Their travels included–but were not limited to–Alaska, Nevada, Texas, Arizona, the Utah Territory, and many other locations that showed signs of silver, gold, or any other way to make a profit. No matter what he did, law enforcement always seemed to follow him, even to the age of sixty when he was hired by the Los Angeles Police Department to track down fugitives who fled to Mexico.

Wyatt Earp died in 1929. He had no children. Two years before his death, he was asked again about the events of the OK Corral. He said: “For my handling of the situation at Tombstone, I have no regrets. Were it to be done over again, I would do exactly as I did at that time. If the outlaws and their friends and allies imagined that they could intimidate or exterminate the Earps by a process of murder, and then hide behind alibis and the technicalities of the law, they simply missed their guess. I want to call your particular attention again to one fact, which writers of Tombstone incidents and history apparently have overlooked: with the deaths of the McLowerys, the Clantons, Stillwell, Florentino Cruz, Curly Bill, and the rest, organized, politically protected crime and depredations in Cochise County ceased completely.”

The mythical figure of Wyatt Earp has been portrayed in books, magazines, movies, and radio across a variety of dramas and action stories. In the modern American mind, he represents the stoic in an otherwise lawless land, the pillar of absolute order that seems to stand against the chaos, despite the many disreputable actions he took throughout his life. The life and times of Wyatt Earp are a testament to the power of story and legend. Whatever view you may have of him, no one can argue his place in American history and his life during an age of legends.

AVAILABLE WHEREVER YOU LISTEN TO PODCASTS

What to Watch | The Best and Worst in Historical Movies

More Americans get their history from movies than from any other source, academic or popular (to the consternation of many history teachers). Speaking personally, I have on occasion been forced to correct inaccurate pictures of history with my students, and I am often met with looks of shock that Hollywood would present the past in anything but a factually correct manner.

Movie studios change history for a variety of reasons: dramatic storytelling, the need to condense events, or even corporate or personal agendas. This week, Joe and I hope to inspire you to find some solid historical films in a variety of categories (and also to warn you about some films that are less than accurate, or even downright lies). Each of us chose a movie in each category. I took the good ones, and Joe got stuck with the bad ones. So enjoy!

Military History

War is the most interesting and catastrophic of human endeavors, and it is often the center of popular films. Until the emergence of superhero films a decade ago, seven of the top ten worldwide box office spots were held by war movies (and even Marvel and DC superhero flicks showcase war, albeit of a different kind). War brings out the best and the worst in mankind, and these films showcase the best and the worst in movie making.

Vietnam War Movies

Pretty much every movie about Vietnam belongs on the bad list. Here are two examples.

Apocalypse Now (1979): When Francis Ford Coppola created this film, he did it to give a modern-day interpretation of Joseph Conrad’s Heart of Darkness and to show how Americans perceived the war. Note the word perceived. In no way was it intended to be factual, to represent American soldiers and their motivations, or to illustrate how the war was conducted.

Unfortunately, I’ve had people cite this film as a source for American atrocities and ineptitude: from the Air Cav attack on a village to the scene at Do Lung Bridge, where there was no CO, only total chaos and endless hell. Folks, this stuff just didn’t happen. Sorry, but no one destroyed villages in order to surf, and there was no Do Lung Bridge. The bridge scene was meant to illustrate the perceived futility of the war, and the air attack was meant to show the Americanization of it. Again, both false. Good story, bad history.

Platoon (1986): No platoon experienced all of these things. Yes, I know it was Oliver Stone’s way of including every atrocity that occurred in the war and showcasing them while withholding all the good things done by American soldiers, but unfortunately, like all films about Vietnam, its fake scenes with fake battles, fake soldiers, and fake guns are cited as real and factual. Again, my apologies, but things just didn’t happen this way. While there are a few exceptions, movies about Vietnam are made by protesters who care less about historical accuracy and more about their political motivations. Steer clear, or proceed with caution.

Gettysburg (1993)

It’s difficult to make an exciting movie about soldiers lining up and firing at each other with single-shot weapons, and Gettysburg is not a particularly exciting movie. If you want exciting entertainment, I would recommend the latest Marvel film (I hear it’s gonna be the best one yet!). Gettysburg portrays real history, not fake drama on display for the masses thirsty for gore. Based on Michael Shaara’s incredible book The Killer Angels, Gettysburg is not a war movie in the traditional sense. Rather than invent characters and place them in historic events, Gettysburg recreates the largest battle of the American Civil War in near-perfect detail. You won’t find yourself thrilled by incredible CGI displays of bloodshed, but you might just walk away from Gettysburg with a better understanding and appreciation of the most devastating conflict our nation has ever witnessed.

Political History

According to the great German theorist Carl von Clausewitz, “War is a continuation of politics by other means.” The same is true of political films—they are similar to war movies in that they are often adapted and changed to suit the times and to tell exciting stories. While political films do not appeal to everyone, and they often anger one side of the political aisle or the other, they are usually entertaining—sometimes in a tragic sort of way.

JFK (1991)

Many of you will disagree with this selection, given that the cinematography in this film was superb, but we are going for historical accuracy, right? Oliver Stone isn’t known for his ability to portray American events accurately, whether in politics or war, and this film is no different. The wild theories paired with the long runtime make for a snooze-fest for us unbelievers. Part of the frustration with the film is that, since the entire premise rests on varying degrees of speculation, even those inclined to believe are left with more questions than answers. I don’t know about you, but I like movies that have endings, good or bad. So, what really happened that day? Let’s just speculate for a moment, shall we? Just don’t look to this movie for help. Thumbs down.

Lincoln (2012)

Abraham Lincoln is a larger-than-life figure in American history, but this film (the second, and I promise the last, about the Civil War era) humanizes the 16th president in a way not seen before on screen. Daniel Day-Lewis’ portrayal of President Lincoln captures his popular image while also revealing his character in very personal ways that were once confined to long-winded biographies. From his unique sense of humor to his use of stories to explain his thinking, Lincoln comes alive on the screen. The film is set amidst the controversy of passing the Thirteenth Amendment, and viewers are treated not just to high-and-mighty statesmen like Lincoln and Seward but to the unscrupulous tricksters who bribed and threatened members of Congress to do the president’s bidding. Lincoln shows the audience that the abolition of slavery was not as easy as history class usually says. It was a difficult, and often shady, business of backroom dealings with questionable characters, and the film demonstrates the unshakeable strength of President Lincoln as he sought to end the saddest chapter in American history.

Social History

Americans are fascinated by how people lived in ages past, perhaps more than people in any other country. We imagine ourselves living in a world without the technology and convenience of modern life, and we wonder if we’d make it in those days. Social history is not technically a genre of film; rather, the term describes a movie set in the past which reveals the best and the worst of those who came before us. Some of these films use the details of everyday life in the past to tell amazing stories, while others seek to shine a light on historic injustices as a way to inspire us to continue to progress as a society.

Australia (2008)

The name alone sends chills down the adventurer’s spine. The cast, score, and setting make you think that nothing could go wrong, but in the end, this film has one good scene. The rest seem to be a hodgepodge of different plots mixed together with a love story that, by the end, is a bit watered down. The one good scene is the halting of the cattle stampede, which, because of past personal experience, I was happy to see. The plot is so muddled that viewers can barely grasp what life was like in the former penal colony of Australia, despite the film’s overly long runtime. Hugh Jackman is awesome. Nicole Kidman is awesome. The continent and history of Australia are awesome. With these facts, you’d think this would be a winner. You’d be wrong.

Apocalypto (2006)

Like many of his other films, Mel Gibson’s Apocalypto sparked some controversy when it was released. The movie presents the audience with a picture of the Mayan civilization of Central America just before its first contact with Europeans. While wildly inaccurate in some places—the Mayans were never so barbaric as to conduct widespread human sacrifice like their Aztec neighbors to the north—the film presents the most accurate picture of pre-Columbian Native American civilization ever seen on screen. Gibson’s message of a civilization’s downfall resonated with discerning audiences and caused them to examine their own cultures and look for the evidence of rot and ruin that might lead to an eventual decline and fall.

Cultural History

Like social history, cultural history examines how a group of people in the past lived their lives. The distinction is that while social history gives a broad picture of an entire society, cultural history focuses on a specific group of people in a distinct time period. Once again, this excites the imagination of those who cannot imagine life outside their own time, bringing new understanding to the past and reflection on the present.

The Scarlet Letter (1995)

No one is king of the world in this movie. This loose interpretation of Hawthorne’s historically inaccurate tale represents the worst of revisionist history. Puritans are portrayed on film in one of two ways: as legalists or as Satanists (one could argue that these are one and the same). This is a lie, and I’m tired of it. Most first-, second-, and third-generation Puritans made their way in the wilderness with nothing but guile and a deep-rooted faith in the sovereign will of their Creator.

Instead of providing a view into what this life was like, this film joins the countless false portrayals of Puritan life while adding a forbidden love story that urges everyone watching to let go. Missing the mark on all counts, The Scarlet Letter illustrates that a woman’s heart may indeed be a deep ocean of secrets, but based on this movie, no one should care.

Titanic (1997)

I can already hear you shouting at me. “OK, Jon, I’ve let you wonder about a Chinese America, question America’s foreign policy and wars for oil, and yammer on about the madman Sherman. But Titanic?! That’s a bridge too far.” (Which, incidentally, is another great movie.) Please hear me out. Titanic was the top-grossing film in history for twelve years, and it appealed to every demographic in society: rich and poor, young and old, male and female. Of course, watching Leonardo DiCaprio and Kate Winslet fumble around together inside an old car is almost as painful as hearing Leo bloviate about global warming from his private jet, but Titanic is more than just a love story between two adults trying to pretend they’re teenagers. For students of history, Titanic showcases the twilight of the Victorian Age, of class distinctions, of the fantastically rich looking down upon the desperately poor. Characters in First Class ignore or disparage those in steerage even as the ship slips beneath the ocean. The world presented in Titanic is one which now offends all supporters of social progress. Sadly, it took the chaos and slaughter of the Great War, which began only two years after the sinking, to put an end to the Victorian Age and bring about a more egalitarian and democratic society in Europe.

Films that Shaped Filmmaking

Our final category covers films that shifted the course of movie history, in both positive and negative ways. Like every other human activity, moviemaking follows trends which historians can observe and critique. By examining the world of movies before and after the films listed below, we can discern which movies merely followed these trends and which blazed new trails for other directors to follow.

Blade Runner (1982)

Yeah, so there are so many bad films out there. Where would I even start? Also, since I’ve been responsible for naming a slew of them for this episode, I decided to rebel a bit and cover a good one. Jon can curse me later with the One Ring. Blade Runner opened to divided audiences who saw it either as a flop or as a cinematic masterpiece. Thankfully, history sided with the latter, and the movie is widely regarded as one of the best sci-fi films ever made. Ridley Scott takes you to a world of shadow and light. In this film, you never quite see anything, and that’s the point. A 1940s detective story made in the 1980s and set in the distant future of next year (2019), Blade Runner follows a cop named Deckard who is responsible for “retiring” replicants, humanoid creations akin to androids. The film follows Deckard through the moral implications of his job, questions the definition of life, and ends with one of the greatest ad-libbed speeches of all time. Lines like “It’s too bad she won’t live. But then again, who does?” are paired with settings that are somehow vast and claustrophobic at the same time. Many filmmakers cite this movie as their inspiration, especially in the way it uses light. I agree with them. And while we’re on the topic, Blade Runner 2049 is amazing as well.

The Lord of the Rings trilogy (2001-2003)

My personal favorite film—I regard all three as a single, ridiculously long movie—is Peter Jackson’s The Lord of the Rings trilogy. It is a masterpiece on screen, filled with all the violence and bloodshed needed to appeal to the teenager who first watched it in the theater in December 2001, as well as the deep and meaningful themes which still resonate with me each Christmas when I sit down and watch the films again. On a technical level, The Lord of the Rings revolutionized the movie industry even more than my runner-up, Star Wars. CGI battle scenes, motion-capture performances, and broad, sweeping cinematography existed before these three films, but it was The Lord of the Rings which brought these techniques into the Hollywood mainstream. If you have never seen this trilogy, go and stream it right now—that’s your homework assignment for this week.

Conclusion

A picture is indeed worth a thousand words. Modern cinema has given viewers a literal front-row seat from which to watch history unfold. Through movies we can experience the drama of events alongside the historical figures who witnessed them first-hand. The list presented here is but a preview of some movies that we recommend and others we encourage you to ignore.

Unfortunately, modern movies have a tendency to fall prey to the cancer that is revisionist history, with directors, actors, and writers injecting their political, social, and economic beliefs into historical storylines. Regardless of which side of the aisle is responsible, the effect is devastating to viewers, who often take what’s given to them as fact. So, if you feel so inclined, please heed our words of warning when it comes to historical movies: don’t take them at face value. If you’ve learned anything from us, it’s that in this, the information age, you have the ability to research topics for yourself. And as we continue to teach you about history, we hope to help.

Honorable Mention

Here, Joe and I each picked a personal favorite film which did not necessarily fit into the categories we had assigned ourselves. While these may not be the most accurate depictions of history, they use events of the past to teach lessons which may be applied to the present.

Braveheart (1995)

Mel Gibson’s portrayal of the Scotsman William Wallace showed audiences the horrors of medieval combat for the first time. The breathtaking terrain of the Highlands, mixed with the scale and score of the film, brought that world to life. Its message of conviction, leadership, and freedom resonated with me and with audiences all over the world.

Star Trek: First Contact (1996)

Really, Jon? A Star Trek movie in a history podcast? First of all, Joe talked about Blade Runner, so I don’t want to hear anything about science fiction not being history. Star Trek: First Contact is not a historical movie (at least not yet), but it has an interesting sub-plot about a famous character revered in the 24th century. When the crew of the Enterprise travel back in time to a point about fifty years from our present day and meet him, they learn he is not the hero seen in the pages of history—he is a drunk, foul-mouthed buffoon. The film reminds us that historical figures are not larger than life; they are men and women as flawed as each of us, and it tells us that when called to high purposes, even the least of us can rise to the occasion.

AVAILABLE WHEREVER YOU LISTEN TO PODCASTS

War is Hell: William Tecumseh Sherman

The North can make a steam engine, locomotive, or railway car; hardly a yard of cloth or pair of shoes can you make. You are rushing into war with one of the most powerful, ingeniously mechanical, and determined people on Earth—right at your doors. You are bound to fail. Only in your spirit and determination are you prepared for war. In all else you are totally unprepared, with a bad cause to start with. At first you will make headway, but as your limited resources begin to fail, shut out from the markets of Europe as you will be, your cause will begin to wane. If your people will but stop and think, they must see in the end that you will surely fail.

— William T. Sherman to Professor David Boyd, December 24, 1860 —

William Tecumseh Sherman predicted the course of the American Civil War in a conversation with a colleague in December 1860, days after South Carolina had seceded from the Union. His words showed a keen understanding of the relative strengths of North and South, and he forecast with eerie accuracy the course the war would take. Sherman had always been an ardent Unionist, believing that the United States should remain one nation, and this motivated his service to the North during the Civil War. His views on slavery were typical of a Northerner for most of his life—that slavery was economically necessary and that freedmen should not be permitted to settle near whites. He did, however, oppose breaking up slave families, and he urged Southerners within his social circle to teach their slaves to read and write. Only when the war broke out and Sherman saw with his own eyes what was happening to enslaved Africans did his heart and mind turn to abolition.

Sherman was born in 1820 in central Ohio, and after his father’s death nine years later he was raised by a family friend, as William’s mother lacked the resources to care for all eleven of her children. His foster father, Senator Thomas Ewing, recommended him for admission to West Point in 1836, and William completed his military education with excellent grades but a dreadful disciplinary record. (In his memoirs, written after the Civil War, Sherman wrote that he averaged one hundred fifty demerits each year because he refused to conform to the “neatness in dress and form” required of the cadets.) After graduating, Sherman served in the Second Seminole War in Florida and then in California during the Mexican War. He saw no combat in California but was recognized for his “meritorious service” as an administrator. Two years after the war ended, Sherman returned to Washington, DC, and married his foster sister, Ellen Boyle Ewing; Ellen’s father Thomas, now Secretary of the Interior, and President Zachary Taylor, the hero of the Mexican War, were in attendance with much of official Washington.

In 1853, Sherman resigned his Army commission and turned his attention to business. He opened a bank in San Francisco which earned him a good living, though he suffered from stress-related asthma brought on by the work. Ultimately, Sherman’s business venture in California failed, and he moved to New York to open a new branch, which also closed its doors after only a few months. By 1858 he was living in Kansas and practicing law, but his success was minimal. The next year, he secured a post at the Louisiana State Seminary of Learning & Military Academy, where he finally found his niche. His time at the school earned him a solid reputation within the military establishment, but the tumult of the sectional crisis would soon sweep him up.

Sherman’s Personality

Sherman has always been an enigma to historians, whether they approve of his actions during the Civil War or not. For much of his life he struggled with depression, and he considered taking his own life in early 1861. His fiery temper and erratic behavior worried his wife Ellen, who wrote to her brother-in-law John, complaining that her husband suffered from “that melancholy insanity to which your family is subject.” Reports of Sherman’s instability reached a Cincinnati newspaper, which labeled him “insane” in an article published in December 1861.

While serving in Kentucky early in the Civil War, Sherman began to experience bouts of paranoia. He imagined spies lay in wait behind every tree ready to sabotage the Army’s efforts (which later turned out to be true, as Kentucky was riddled with secessionist supporters of the Confederacy), and his colleagues began to whisper behind his back that he was losing his mind. In public, he remained a model of order and duty, but in private he was a lonely figure within the Army, content only in the company of a few close friends. Among these was Ulysses S. Grant, about whom he later wrote, “He stood by me when I was crazy.” His letters to his wife reveal a deep love for her and for their eight children but also a terrible sense of inadequacy and rejection by the world.

Historians and psychologists have diagnosed Sherman with a number of maladies. Writing in a Northern newspaper shortly after the war ended, one doctor commented, “Sherman’s abilities in command do not fully mask his inadequacies in matters of human interaction. He is cold, withdrawn, and even hostile toward those whom he does not know well.” The article went on to claim he possessed two distinct personalities, exhibiting one in public and another in private—a condition popularly (though mistakenly) called “schizophrenia” and known today as dissociative identity disorder. More recently, scholars have begun to reexamine Sherman in the light of modern medicine. In 2001, amidst renewed interest in Sherman’s exploits after the publication of The Soul of Battle by Dr. Victor Davis Hanson, psychologists at the Uniformed Services University of the Health Sciences read all of Sherman’s surviving letters and published works in an effort to determine whether he truly suffered from mental illness or was simply what we would now call an introvert. Their findings were not conclusive, but in their report they stated, “General Sherman may have suffered from a form of autism, perhaps Asperger syndrome, that was undiagnosed during his lifetime.” Whatever his mental state, General Sherman was immensely popular with his soldiers, who referred to him fondly as “Uncle Billy,” and he was one of the most effective military commanders in American history.

Sherman’s Passion

As the sectional crisis smoldered through the “Secession Winter” of 1860-61, Sherman readied himself for war. He returned to Washington and met with President Abraham Lincoln shortly after he was inaugurated. Sherman hoped to regain his commission, but Lincoln, perhaps wary of the soldier’s reputation, was not interested. He then moved to St. Louis to run a streetcar company (which failed). After the attack on Fort Sumter in April, Sherman again contacted the War Department and offered his services, and he was summoned to Washington once again in June 1861, where he was commissioned colonel of the 13th US Infantry Regiment. He saw action at the First Battle of Bull Run in July, where he was wounded, and then transferred to the Western Theater, where he would remain for the rest of the war.

Sherman believed firmly that war was hell, but that it was necessary when it prevented worse evils, like the breakup of the American Union. His passionate belief in the Union cause led to the aforementioned mental breakdown and contemplation of suicide as he watched his country’s armies weather personnel and supply shortages while the Confederacy seemed invulnerable to attack—at least on paper. His commander, General Henry Halleck, placed him on leave so he could recover mentally and physically (he had refused to eat and had lost nearly forty pounds), and only when Halleck was promoted and command of the Department of the Missouri passed to his friend General Grant did Sherman return to action. On March 1, 1862, Grant gave Sherman command of the Army of the Tennessee’s 5th Division, and the army moved south from Kentucky into Tennessee.

Like Sherman’s, General Grant’s reputation within the Army was mixed at best. Grant’s business ventures in Illinois between the Mexican and Civil wars had been a catalogue of failures, and he was reputed to be an alcoholic, unfit to lead even a company of soldiers. (This may have been a slur spread by his career and political opponents.) And yet, because both men were proven leaders and effective strategists, President Lincoln gave them his support no matter the charge against them. Later in the war, when a delegation of politicians arrived at the White House demanding Grant’s removal because they thought he was hesitating at Vicksburg, the president refused their request. A New York Times article reported that, “When one charged General Grant, in the President’s hearing, with drinking too much liquor, Mr. Lincoln, recalling General Grant’s successes, said that if he could find out what brand of whiskey Grant drank, he would send a barrel of it to all the other commanders.”

The Confederates attacked Grant’s army at Shiloh Church on the morning of April 6, 1862. The furious assault drove the Union troops back, but Sherman rallied his division and was able to hold it together as it retreated toward the Tennessee River. Grant had been away from the army during the attack but returned that night; when the two men met under a tree (Grant was smoking one of his customary cigars), they planned a counterattack for the following day. On April 7th, Sherman was in the front lines of his division as the Union army advanced. His division turned the enemy flank and drove them back, and he was wounded in the hand and shoulder and had three horses shot out from under him. The victory at Shiloh, followed up by those at Corinth, Vicksburg, and Chattanooga, won Sherman great fame with the American people and, for the most part, restored his reputation among his colleagues.

By March 1864, Grant’s army had captured Chattanooga and was poised to invade Georgia. The Confederacy had been cut in half with the fall of Vicksburg eight months earlier, and the Union now had to destroy the enemy’s remaining sources of food. Grant was summoned to Washington and given command of all Union armies, and he promoted his friend Sherman to command the Western Theater. Sherman had one order: march on Atlanta, the last remaining east-west railroad junction, then press on to the Atlantic Coast and split the Confederacy again. He regularly outmaneuvered his opponents as he approached Atlanta, fighting only one pitched battle at Kennesaw Mountain, and the city fell on September 2nd. (The fall of Atlanta was more than a military victory—it secured Lincoln’s reelection in the 1864 campaign, and historians have commented that this may have been Sherman’s greatest contribution to the Union cause.)

With the armies in the Eastern Theater locked in mortal combat in Virginia around Richmond and Petersburg, Sherman turned his thoughts to how he could finally break the Confederates’ will to fight. Through three long years of bloodshed, the rebels’ spirits had never wavered, and Sherman believed it was because the people of the South had not felt the true horrors of war. In a telegram to General Grant on October 9th, Sherman laid out his plan to march “to the sea” from Atlanta to Savannah and destroy or capture anything that sustained the rebels’ war effort. In characteristic fashion, Sherman ended his telegram with the words, “I can make this march, and I will make Georgia howl.”

As Washington, DC, prepared for the Christmas holiday, President Lincoln received a telegram from General Sherman on December 22, 1864: “I beg to present you as a Christmas gift the city of Savannah with 150 heavy guns and plenty of ammunition and also about 25,000 bales of cotton.” Sherman’s “Army of the West” had burned its way across Georgia over the preceding two months, costing the South nearly $100 million in property damage and freeing nearly ten thousand slaves along the way. Roads, telegraph poles, and railroad lines were ripped up; stores of wheat, hay, and cotton were burned; and the homes of any Confederates who resisted were destroyed. (Contrary to popular belief in the South, Sherman’s army did not murder rebel civilians in cold blood and fired only upon those who had first fired on them.) Sherman’s “March to the Sea” inaugurated a new era in warfare—he had waged “total war” by targeting not merely the soldiers of enemy armies but anything which sustained their war effort. Sherman understood that wars are waged by nations whose people believe in the cause, at least to some extent; only by breaking that will to fight could an enemy be truly beaten. This lesson, taught by General Sherman, would be learned well by future generations of American military leaders.

Sherman was not a cruel man. Temperamental, yes, but never cruel. His diaries from the March to the Sea are filled with sadness as he watched homeless children reap the consequences of their parents’ defiance of the Union. He did what he could to ease their suffering, but it did not prevent him from fulfilling his duty. As he saw the horrors of slavery firsthand in Georgia, he grew more abolitionist by the day, and his Special Field Order No. 15 appropriated land in Georgia for forty thousand freed slaves from their former masters (an order later revoked by President Andrew Johnson). Many slaves saw Sherman as a man of God, a “Moses” come to free them from bondage. The March to the Sea burned the heart out of the Confederacy, and when the rebels surrendered in April 1865, it was due in equal measure to Grant’s ruthlessness in battle and Sherman’s willingness to inflict cruelty upon the Southern people. It also demonstrated Sherman’s opinion on the nature of war: “War is cruelty. There is no use trying to reform it; the crueler it is, the sooner it will be over.”

Sherman’s Politics

After parading his victorious Army of the West through Washington in the Grand Review of May 24, 1865, Sherman was given command of the Military Division of the Missouri, encompassing all US territory from the Mississippi River to the Rocky Mountains. Four years later, when Ulysses S. Grant was elected president, Sherman was appointed Commanding General of the United States Army, a post in which he waged political battles with Washington bureaucrats while his troops fought actual battles against Native Americans. He organized new training schools for Army officers and did his best to help his friend stem the growing violence of the Democrat-backed Ku Klux Klan in the South during Reconstruction. By 1883, Sherman had grown tired of politics; he resigned his command of the Army and then left the military entirely on February 8, 1884.

Sherman’s final years were spent in New York City, where he pursued his interests in art and the theater—he was a devoted fan of Shakespeare. In 1884, the Republican Party approached him to run for president as General Grant had done. Sherman had watched politics destroy his friend’s reputation and health, and before the party could even offer him guidance on a platform or campaign strategy, he issued a public statement in the newspapers that has become famous: “I will not accept if nominated and will not serve if elected.”

Ellen Sherman died in 1888, which devastated the old general, and in his grief he turned his attention to conservation efforts with fellow Republican Theodore Roosevelt (who had also used the outdoors to comfort himself after the death of his wife Alice four years earlier). The two men worked together to form the Boone and Crockett Club, a wildlife conservation organization named for two famous American outdoorsmen. In January 1891, General Sherman fell ill with pneumonia, and he died on February 14th. President Benjamin Harrison issued a statement to Congress on Sherman’s death, saying, “He was an ideal soldier, and shared to the fullest the esprit de corps of the army, but he cherished the civil institutions organized under the Constitution, and was only a soldier that these might be perpetuated in undiminished usefulness and honor.”

AVAILABLE WHEREVER YOU LISTEN TO PODCASTS