Before Bruce Lee sprang into martial arts movies in the early 1970s, the average actor in a kung fu film may have been better prepared to deliver a Shakespearean soliloquy than a roundhouse kick.

“In our early action films, we used actors who knew little about fighting,” Raymond Chow, one of the producers behind “Enter the Dragon” and other movies that starred Lee, said in an interview in 1973. “We had to use various camera tricks. But the audiences can tell the difference. It knows a real fighter when it sees one. That’s why Bruce Lee is such a hit.”

Born in San Francisco and raised in Hong Kong, Lee was a fighter’s fighter. He began studying martial arts in earnest as a teenager, augmenting his fighting with strength training and dancing. In time he developed his own style, Jeet Kune Do.

Acting was in Lee’s blood. His father, Lee Hoi-Chuen, appeared in Cantonese opera and films, and Lee started acting as a boy. He appeared in Chinese films and the short-lived 1960s American television series “The Green Hornet,” playing the title character’s assistant, Kato, in their crime-fighting exploits. Kato was a valet and martial arts master, a supporting character who became popular in Hong Kong, where the show was known as “The Kato Show.” Kato’s popularity helped Lee land a movie deal with Chow’s Golden Harvest studio.

Lee’s precise, powerful yet seemingly effortless grace and presence before the camera made him an international star.

“Enter the Dragon,” one of the first martial arts movies produced by a Hollywood studio, was Lee’s best-known film. Lee did his own stunts, helped write the script and choreographed the fight scenes. In the film he played a martial arts master who infiltrates a criminal’s island fortress by agreeing to fight in a tournament. The film transfixed audiences around the world and cleaned up at the box office.

Lee did not live to see the film’s success. He died at 32 on July 20, 1973, after being found unconscious on the floor of his Hong Kong apartment, just days before “Enter the Dragon” had its premiere.

“Enter the Dragon” and the rest of Lee’s career left an indelible mark on popular culture. His other films, which include “The Big Boss” and “The Chinese Connection,” have become part of the kung fu canon. They inspired the next generation of martial arts movie stars, like Jackie Chan and Jet Li, and helped open Hollywood to Asian actors (although the extent to which that has happened is debatable).

Films, documentaries and books have been made about Lee’s life, and cultural references to him abound. He has inspired video game characters, even entire games. Yellow outfits, like the jumpsuit he wore in “Game of Death,” a film that was released posthumously, were also worn by the lead character in Berry Gordy’s “The Last Dragon” and Uma Thurman in the climactic scenes of the first part of Quentin Tarantino’s martial arts epic, “Kill Bill Vol. 1.”

Lee earned a star on the Hollywood Walk of Fame and was named one of Time magazine’s 100 people of the century. “Enter the Dragon” was added to the Library of Congress’s National Film Registry and labeled an American classic. A statue of Lee, poised to strike, on the Hong Kong waterfront still attracts throngs of fans.

“On an adventure level, the performances are quite good. The one by Mr. Lee, not only the picture’s supermaster killer but a fine actor as well, is downright fascinating. Mr. Lee, who also staged the combats, died very recently. Here he could not be more alive.”


—Shreeya Sinha


Photo: Princess Diana and Prince Charles on their wedding day in London in 1981. Credit: Press Association, via Associated Press

She died young. She died violently. She was a global celebrity in the broadest sense, a woman of startling charisma who became famous when she married the heir to the English throne and even more famous when she divorced him and embarked on a life of her own.

But the sudden death of Diana, the Princess of Wales, alongside her lover in a fiery car crash in a Paris tunnel on Aug. 31, 1997, elevated her into something else entirely: a symbol of a nation’s emotional and generational conflicts, a blank slate on which an entire people — and to some extent, the world at large — could project their own fears, prejudices and passions. Britain went a little crazy. For a few disorienting weeks, everything seemed up for grabs, including the monarchy itself.

Photo: Mourners gathered at a memorial for Princess Diana outside Kensington Palace after her death in 1997. Credit: Santiago Lyon/Associated Press

She was born Lady Diana Spencer, the daughter of an earl, in 1961. Althorp, her childhood home, was a stately, drafty pile, crammed with priceless works of art. Her childhood was privileged but lonely — her parents had a terrible divorce — and her education indifferent.

In fact, nothing remarkable at all happened to Diana until, at age 19, she married Charles, the Prince of Wales, in view of thousands of strangers (millions, if you count the television audience), wearing a voluminous puffball of a dress that drowned her slender frame.

If the wedding was a gossamer fairy tale, the marriage was a real-life nightmare. Diana was emotional, fragile, needy, anorexic, bulimic; Charles came from the stiff-upper-lip school of interpersonal relations and had a longtime (married) girlfriend, Camilla Parker-Bowles.

Charles and Diana had two sons. She eventually found various lovers, too. Their divorce was shocking and unprecedented, but it freed Diana to look elsewhere for love, and she soon took up with Dodi al-Fayed, a rich playboy whose father owned the Harrods department store. They died together in a high-speed chase in Paris, fleeing paparazzi who pursued them in cars and on motorcycles after a date.

Britain went into deep shock, wondering aloud whether it had helped cause Diana’s death by not appreciating her enough in life. The power of the emotion — and the frenzy whipped up by the tabloid newspapers — all but forced Queen Elizabeth to break with centuries of tradition and protocol and make a public address to the nation. Elton John sang at the funeral. Men, women and children lined the streets and wept as Diana’s coffin went by.

Diana is nearly as vivid a figure in death as in life. She lives on in her sons, William and Harry, who have talked in recent years about her effect on them. William’s wife, Kate, a future queen of England — this would take some time, because both Elizabeth and Charles, the current heir, would have to die before William inherits the throne — wears the massive sapphire and diamond engagement ring that Charles gave to Diana, and that William in turn gave to her.

The young man found dead in the Alaska backcountry was Christopher McCandless. His identity was not confirmed for weeks, but in time he would become internationally famous as a bold, or very imprudent, figure.

Mr. McCandless died alone in an abandoned bus on the Stampede Trail, a desolate stretch of backcountry near Denali, in August 1992. He was surrounded by his meager provisions: a .22-caliber rifle; some well-worn and annotated paperbacks; a camera and five rolls of exposed film; and a diary of 113 cryptic entries kept on the back pages of a book that identified edible plants.

Before Mr. McCandless died, from starvation aggravated by accidental poisoning, he had survived for more than 110 days on nothing but a 10-pound sack of rice and what he could hunt and forage in the unforgiving taiga.

Jon Krakauer, at the time a freelance writer, heard about Mr. McCandless’s story from an editor at Outside magazine who had read an Associated Press article about the death. The editor wanted Mr. Krakauer to write a long article about Mr. McCandless on a tight deadline, and he delivered.

But after the story ran, Mr. Krakauer needed to learn more.

“I decided I wanted to write this book because I felt like there was a lot more to tell; there was a lot I hadn’t discovered,” Mr. Krakauer said in a telephone interview.

Over the next few years he dug into Mr. McCandless’s life and discovered a complicated, compelling story. He chronicled Mr. McCandless’s travels and lonely death in “Into the Wild” (1996), a national best-seller that has since sold millions of copies in the United States. A film based on the book, starring Emile Hirsch as Mr. McCandless and directed by Sean Penn, was released in 2007.

Mr. McCandless’s story continues to fascinate, confound and infuriate readers two decades after “Into the Wild” was first published. Mr. Krakauer said it was by far his best-selling work, adding, “I get more hate mail from this book than probably from anything else.”

“He’s this Rorschach test: People read into him what they see,” he said of Mr. McCandless. “Some people see an idiot, and some people see themselves. I’m the latter, for sure.”

Mr. McCandless came from a well-off family on the East Coast. He graduated from Emory University with honors, then disappeared in 1990. He donated virtually all the money in his bank account to Oxfam, a charity dedicated to fighting poverty, then drove west before abandoning his car and burning the cash he had left. He deserted his family and a privileged life without looking back.

Mr. McCandless canoed into Mexico, hitchhiked north and worked odd jobs along the way. He often roamed alone, but left an impression on many of the people he met. An older man named Ron Franz even offered to adopt him; Mr. McCandless gently turned him down.

He never contacted his parents, Walt and Billie McCandless, or his sister, Carine. His parents were worried, but knew that long, improvised jaunts were nothing new for their son.

“He was always an adventuresome, pretty self-contained individual,” Walt McCandless said in an interview. “And it’s important to realize that the trip he didn’t come back from wasn’t his first adventure.”

Some readers see Mr. McCandless’s rejection of materialism and his embrace of the natural world as romantic, taking him for a contemporary Thoreau. Many others, especially Alaskans, have argued that he must have been mentally ill, suicidal or hubristic, and that it was irresponsible for Mr. Krakauer to glorify his story.

Walt McCandless and Mr. Krakauer both disagreed with that assessment.

In 2014 Mr. McCandless’s sister Carine published “The Wild Truth,” a memoir that depicted a physically abusive, chaotic childhood that both siblings were forced to conceal.

“Chris made his choices, and he accepted accountability,” Ms. McCandless said in an interview. But she said she felt her parents should accept some blame.

“I do hold them accountable for his disappearance,” she said. “I think for him to leave in that extreme way, to go without telling anyone where he was — I do hold them accountable for his disappearance, but not for his death.”

Walt and Billie McCandless said they did not want to comment on the memoir.

“He was a tortured soul; he did what he had to do,” said Mr. Krakauer, who wrote the foreword to “The Wild Truth,” adding: “He suffered as a young man, and he did what he had to do to escape it.”

By the time Mr. McCandless died, he seemed to have found a measure of peace, according to one of his last notes, scrawled inside a paperback copy of “Education of a Wandering Man,” a memoir by the novelist Louis L’Amour. It said:

“I HAVE HAD A HAPPY LIFE AND THANK THE LORD. GOODBYE AND MAY GOD BLESS YOU ALL.”

An earlier version of this article, using information from Mr. Krakauer’s publisher, misstated the number of copies of “Into the Wild” that have been sold. It is several million, not “nearly two million.”

—Daniel E. Slotnik


Photo: Michael Jackson performing during the halftime show of Super Bowl XXVII in 1993. Credit: George Rose/Getty Images

When Michael Joseph Jackson was born into a large family in a small house in Gary, Ind., on Aug. 29, 1958, no one could have imagined that he would become perhaps the most recognizable entertainer on the planet. On the king of pop’s birthday, Not Forgotten takes you back through his life and music.

Jackson’s rise was swift. By the time he was 10, he and his brothers were pop sensations performing as the Jackson 5. The group had four No. 1 Motown hits in a little more than a year, including “I Want You Back,” all of which featured Michael’s ebullient high-pitched voice.

By 20, Jackson wanted to break away from his overbearing father, his demanding siblings and the Jackson 5 sound. “Off the Wall” (1979), his first solo album as an adult, may be the quintessential recording of the disco era. It featured “Don’t Stop ’Til You Get Enough,” which Jackson sang in a flirtatious falsetto.

Jackson’s next album was “Thriller,” which was released in 1982 and became the best-selling album of all time. It won eight Grammy Awards, spent two years on the Billboard album chart and sold more than 100 million copies around the world. Jackson’s dancing and innovative music videos, especially the one for the title track “Thriller,” helped redefine the medium and open MTV to black musicians.

Five years later, “Bad” was released. It was also hugely successful, with five No. 1 singles and a video for the title track that was directed by Martin Scorsese.

After “Bad” the bizarre details of Jackson’s personal life often overshadowed his abilities as a musician and entertainer. His other albums include “Dangerous” (1991) and “HIStory,” and although they all did well commercially they never approached the world-beating success of “Thriller.”

The unsurpassed entertainer, the gifted and driven song-and-dance man who wielded rhythm, melody, texture and image to create and promote the best-selling album of all time, “Thriller”? Or the bizarre figure he became after he failed in his stated ambition to outsell “Thriller,” and after the gleaming fantasy gave way to tabloid revelations, bitter rejoinders and the long public silence he was scheduled to break next month?

Emmett Louis Till was born on July 25, 1941, on Chicago’s South Side and was nicknamed Bobo because of the fun-loving, cheerful disposition he showed while growing up in a segregated middle-class neighborhood. When he was 14 he went to Mississippi to spend the summer with his cousins, and his mother gave him his father’s signet ring as a gift.

On Aug. 24, 1955, after an exhausting day of picking cotton in the scorching Delta sun, Till and his cousins went to a local store run by a poor white couple in their 20s, Roy and Carolyn Bryant. Ms. Bryant was working alone in the store when Till went in to buy bubblegum. It is not clear what happened inside, but soon afterward Ms. Bryant stormed out, presumably to get a pistol from her car parked outside. Till, unaware of the danger, whistled, and his cousins, now panicked, quickly drove him away.

Ms. Bryant later claimed that Till had flirted with her on a dare. The details would later change depending on when she told the story.

Four days later, around 2:30 a.m., Ms. Bryant’s husband, Roy, and his half brother J. W. Milam, armed with a pistol, pounded on the door of the Wright family home where Till was staying. Bryant announced that they were “looking for the boy that did the talking.” Forcing their way in, according to a PBS documentary about Till, they roused him from sleep, marched him to their car and sped away.

Till’s disfigured body was found three days later, and “the most celebrated race-sex case since Scottsboro” was born, the journalist William Bradford Huie wrote in Look magazine. The body was so mutilated that it could be identified only by the silver signet ring, still on his finger.

“Someone is going to pay for this,” Till’s mother wailed, according to an American National Biography web page about her. She demanded that her son’s body be returned to Chicago for an open-coffin funeral. “I wanted the world to see,” she said.

Till’s body, unembalmed, was displayed publicly for four days. People left in tears. Some fainted.

The murder became a rallying point for the nascent civil rights movement. The Rev. Jesse Jackson called it the movement’s “Big Bang.”

Bryant and Milam were found not guilty. After the acquittal, they kissed their wives, lit cigars and posed for pictures. And later, protected by the bar on double jeopardy, they boasted about how they had murdered Till.

Till’s mother, Mamie Till Mobley, turned to the federal government to no avail. She tried to meet with President Dwight D. Eisenhower, but he refused. J. Edgar Hoover, the director of the F.B.I. at the time, declined to make the killing a federal case.

“There has been no allegation made,” he said, “that the victim Emmett Till has been subjected to the deprivation of any right or privilege which is secured and protected by the Constitution and the laws of the United States.”

The Till case became emblematic of a history of violence toward African-Americans and of the country’s legacy of white supremacy. It provoked international outrage and pressure on political leaders in the United States. Young black Americans grasped the precariousness of their own lives, and figures like the Rev. Dr. Martin Luther King Jr., Medgar Evers and many others, black and white, were galvanized to press the fight on the front lines. Ms. Till Mobley herself became a teacher and civil rights activist.

Friedrich Nietzsche, the rebel of 19th-century philosophy who died 116 years ago on Aug. 25, would probably recognize some of his ideas in modern society.

Nietzsche wrote with the confidence and vehemence of any pundit. He posited extreme precursors to moral relativism and self-actualization, two ideas that have become prevalent during the last few decades. His often-aphoristic writing style would be perfect for Twitter, where there are many accounts in his name.

Whether he would be pleased about how his ideas have influenced our culture is another matter, but it would be very difficult to argue that they have not. Perhaps the most well-known example is the frequently made accusation that his writings fostered a sense of Teutonic racial superiority that Germany and then Hitler would use to justify embarking on two world wars, even though Nietzsche himself had repudiated his nationality and claimed to be descended from Polish nobles.

His ideas might seem more familiar to us now, but at his death they were controversial, even shocking.

“Nietzsche was largely influenced by the pessimism of Schopenhauer, and his writings, full of revolutionary opinions, were fired with a fearless iconoclasm which surpassed the wildest dreams of contemporary free thought,” The New York Times wrote after he died on Aug. 25, 1900. “His doctrines, however, were inspired by lofty aspirations, while the brilliancy of his thought and diction and the epigrammatic force of his writings commanded even the admiration of his most pronounced enemies, of which he had many.”

Those enemies included organized religion, especially Christianity; democracy; mediocrity; nationalism; and women. Nietzsche railed against these and other adversaries on pages often densely packed with allusions, symbolism and language closer to romantic poetry than fusty metaphysics. Here is a sampling of his best-known writings:

Out of life’s school of war: What does not kill me, makes me stronger. — “Twilight of the Idols”

Whoever fights monsters should see to it that in the process he does not become a monster. And when you look long into an abyss, the abyss also looks into you. — “Beyond Good and Evil”

God is dead! God remains dead! And we have killed him! How shall we console ourselves, the most murderous of all murderers? — “The Gay Science”

Unlike many of his philosophical predecessors, Nietzsche did not argue for a specific weltanschauung, or worldview, even though his writings may suggest one. He distrusted any thinker who proposed a comprehensive system for interpreting the world, and he often wrote in a manner that allowed for multiple interpretations.

Nietzsche is not a philosopher in the strict and technical sense of the word. He has no system or consistent body of thought professing to explain all aspects of the universe. He does not expressly deal with epistemology, ontology or, indeed, with metaphysics in general. He concentrates himself on the moral and aesthetic aspects of things, on their “values,” as is now the custom to say, owing to Nietzsche himself, who introduced the term; and he does so with a literary force and artistic power of presentation which makes his writings specially stimulating and is really the cause of his comparative popularity.

Nietzsche’s originality may have stemmed from consideration, then renunciation. He was born on Oct. 15, 1844, the son of a Lutheran minister. His father died when he was young, and his mother hoped he would join the church, but by the time he went to the University of Bonn (he later moved to the University of Leipzig) he had decided to study the classics and pursue a career in philology. He earned a professorship in Greek at the University of Basel in Switzerland when he was just 24 and became inspired by Richard Wagner and Arthur Schopenhauer.

By the late 1870s Nietzsche had retired from his professorship, broken off his relationship with Wagner and tried to wrest his philosophy from Schopenhauer’s shadow.

He worked tirelessly throughout the 1880s, producing what became “The Gay Science,” “Beyond Good and Evil” and “Thus Spoke Zarathustra,” but his physical and mental health declined. His Times obituary said that when he died he had “been hopelessly insane” since 1889.

She was the princess of R&B, a Grammy-nominated singer and actress whose glassy vocals against synthetic soundscapes pioneered a new genre. But she was also a girl next door, a teenager with her own street style who rose above the vulgarity of other stars.

Aaliyah Dana Haughton died 15 years ago along with eight other passengers of a small airplane that crashed in the Bahamas. She was 22, but she had already reached a level of fame few could achieve in a lifetime.

Born in Brooklyn and raised in Detroit, Aaliyah was groomed for stardom. At 11, she sang on stage with Gladys Knight. True to its title, her debut album, “Age Ain’t Nothing but a Number,” was released when she was 15. It was produced by the R&B giant R. Kelly and included chart-toppers like “Back and Forth” and “At Your Best (You Are Love).” It went platinum, selling more than a million copies. In one of her more gossip-provoking moments, it was widely reported that she had secretly married R. Kelly, who was in his late 20s. Their marriage was annulled.

At the beginning of her senior year of high school in 1996, she released her second album, “One in a Million,” with help from the star producer-songwriter duo Timbaland and Missy Elliott. Timbaland’s trademark fusion of hip-hop and electronic music featured twitchy, complex syncopated beats and start-stop rhythms that complemented Aaliyah’s precocious, sultry voice. That album sold two million copies.

The collaboration with Timbaland took her to new heights in 1998 with “Are You That Somebody?,” recorded for the “Dr. Dolittle” soundtrack. The song, which the critic Simon Reynolds called “the most radical pop single” of the year, earned Aaliyah the first of her five Grammy Award nominations.

In The Times, Kelefa Sanneh wrote, “Where most divas insist on being the center of the song, she knew how to disappear into the music, how to match her voice to the bass line — it was sometimes difficult to tell one from the other.”

Aaliyah’s acting career took off in 2000 with a lead part in “Romeo Must Die.” Her hit single on the soundtrack, “Try Again,” earned her another Grammy nomination. She also had a title role in the film “Queen of the Damned,” which was released after her death.

Aaliyah died on her way back to Miami from Abaco Island, where she had finished working on the video for her latest album’s third single, “Rock the Boat,” directed by Hype Williams.

In 1959, Truman Capote stumbled on a short article in The New York Times about a gruesome quadruple murder at a Kansas farm. He soon realized that it was the story he had been waiting to write for 20 years.

When he began writing professionally, Capote, who died 32 years ago today, theorized that journalism and creative writing could come together in the form of what he called the “nonfiction novel.” The subject had to be right, however; with journalism underpinning such a novel, the pitfall was that it could quickly date itself. Crime, he decided, could be the perfect vehicle.

The first people he shared his nonfiction novel idea with, he said, thought of it as merely a remedy for writer’s block. Capote disagreed.

“Reporting can be made as interesting as fiction, and done as artistically,” he told the writer George Plimpton in a 1966 interview.

Accompanied by his childhood friend Harper Lee, the author of “To Kill a Mockingbird,” Capote made his way to Kansas to investigate the murders of the Clutter family. Their trip resulted in “In Cold Blood,” which made his name synonymous with the true crime genre.

By then he was 35 and had already achieved fame and fortune with his fiction, which included “Other Voices, Other Rooms” and “Breakfast at Tiffany’s.” But “In Cold Blood,” which reconstructed in stark detail the murders at the Clutter farm, was a radical departure for him.

The killers, Perry Smith and Dick Hickock, both of them ex-convicts, had intended to rob the family, which they knew to be well-off. But they were surprised to find almost no money in the house; everyone but the robbers, it seemed, knew that the farm owner, Herbert Clutter, paid only with checks.

Before arriving at the farm, Smith and Hickock had agreed that no witnesses could be left behind, whether or not the robbery was successful. The Clutters were tied up in separate rooms and killed at close range by shotgun blasts. Herbert Clutter’s throat was also slit.

“In Cold Blood” started as a series of articles for The New Yorker, based on six years of research and interviews that, Capote said, were transcribed from memory without the use of tape recorders or notes. Made into a book, it became a national best seller, despite assertions that it is not entirely factual. And it brought Capote even more financial and social success.

The book, disturbing and gory, took its toll on him, though. He told Plimpton that if he had known what was waiting for him in Kansas, he would have “driven straight on. Like a bat out of hell.”

Capote formed a bond with Perry Smith; though strikingly different, they both had endured turbulent childhoods. “Each looked at the other and saw, or thought he saw, the man he might have been,” Gerald Clarke wrote in “Capote,” his biography of the writer published in 1988. (Philip Seymour Hoffman won a best-actor Oscar for his performance as the title character in the 2005 film “Capote.”)

Capote knew that before he could finish his book, the ending — the executions of the two convicted murderers — had to happen. In 1965, when the killers were hanged, the conflict he felt “tore him apart,” Mr. Clarke said in an email.

Capote told Plimpton: “I’m still very much haunted by the whole thing. I have finished the book, but in a sense I haven’t finished it.”

Capote lived in a “heavy-drinking generation,” as Mr. Clarke described it, but after the publication of “In Cold Blood,” his drinking got worse, and he started using drugs. Once slender, he deteriorated into a “paunchy” man, as his Times obituary noted in 1984.

In July 1978 Capote was interviewed on Stanley Siegel’s live television talk show. Siegel, who died last January, asked the obviously inebriated Capote what would happen to him if he did not give up alcohol and drugs.

As many Americans protested the police shooting of Michael Brown in Ferguson, Mo., two years ago, members of the Huey P. Newton Gun Club carried their rifles on a march in Dallas. And last month, in response to more police shootings, members brought their rifles to another rally in Dallas, where five officers were fatally shot by an Army Reserve veteran who was not a club member.

The Dallas club began in 2014 after an officer there killed an unarmed black man and wounded a child with a stray bullet but was not disciplined. The club’s members made it their mission to patrol their neighborhoods, keeping an eye on the police and others.

The name Huey P. Newton can elicit cries of “hero” or “criminal,” and the space in between reflects the distance in racial perspectives that the United States has failed to bridge since Newton helped found the Black Panthers 50 years ago, when the civil rights struggle was moving beyond the South to black neighborhoods in the North and West.

Newton advocated armed self-defense in black communities, where the organization also provided social services. The Panthers patrolled the streets, guns drawn, turning them on drug dealers and police officers alike.

“We’ve never advocated violence, violence is inflicted upon us,” Newton told The Times in 1970, one month after a California court overturned his conviction for killing a police officer in Oakland, Calif., where the Panthers originated. “But we do believe in self-defense for ourselves and for black people.”

Expressing a willingness to defend oneself with weapons was hardly revolutionary. When Frederick Douglass was asked in 1850 what he believed to be the best response to the Fugitive Slave Act, he replied, “A good revolver.” And Malcolm X advocated the same.

The Black Panther Party, which never grew beyond a few thousand members, tried to combine socialism and black nationalism. Its charter called for full employment, decent housing and an end to police brutality.

Unlike black separatists, the Panthers welcomed all races and found wealthy liberals willing to give them money. But the group’s social programs — like a breakfast program for schoolchildren and clothing and food drives — came undone partly by the corruption of the leadership.

Historians have detailed its mistreatment of female members, extortion, drug dealing, embezzlement and murder. At least 19 Panthers were killed in shootouts with one another, the authorities or other black revolutionaries.

While “by any means necessary” became a mantra of the group, J. Edgar Hoover’s F.B.I. also did whatever possible to target the Panthers. As many members went off to prison and the group dwindled, Newton became a despotic and paranoid drug addict, wielding dictatorial powers with a small coterie, and knocking off anyone in his way.

During the Olympic Games in Rio de Janeiro, Not Forgotten is resurfacing obituaries about some of the greatest Olympic athletes of all time.

“Go anyplace and people will tell you Wilma Rudolph was the first black woman to win a medal — it’s not true,” Alice Coachman said in 1997.

Coachman was in a position to know. A very good position: 5 feet 6⅛ inches off the ground, the height she cleared on her first attempt in the high jump at the 1948 Olympic Games in London.

That set an Olympic record and — because Coachman had achieved it on the first try — earned her the gold medal. Dorothy Tyler of Britain, who cleared 5-6⅛ on her second try, had to settle for silver.

When Coachman died in 2014, at 90, the fact that she was the first black woman to win an Olympic gold medal was the salient point of her obituary in The New York Times.

Photo: Alice Coachman in 2012. Credit: Damon Winter/The New York Times

Sixty-six years earlier, however, The Times had not even mentioned the fact in its dispatch from London. The correspondent, Allison Danzig, barely noted that Coachman had set a record. In fact, he cast her victory not as a triumph for American women but as a “disappointment” to Tyler’s British fans.

Coachman attributed Rudolph’s pre-eminence in the public mind to the fact that the 1960 Olympics in Rome, where Rudolph won three gold medals, were televised. Viewers could see with their own eyes what newspaper reporters and radio commentators of earlier eras did not necessarily emphasize.

Coachman was treated almost as a nonperson on her homecoming to Albany, Ga., forced to use a side door of the auditorium where she was being honored. The mayor refused to shake her hand.

Racism is not the only explanation for Coachman’s relative invisibility until recent years, however. Some of it had to do with one of her gifts.

“I had accomplished what I wanted to do,” Coachman said in explaining why she retired as an athlete after the London Olympics. “It was time for me to start looking for a husband. That was the climax. I won the gold medal. I proved to my mother, my father, my coach and everybody else that I had gone to the end of my rope.”

At the Olympics, maybe. The truth is that her career as an exemplar was just beginning.

If you could have dinner with one person who is no longer with us, and whose obituary was published in The New York Times, who would it be, and why that person? Not Forgotten is asking that question of a variety of influential people this summer in a series of posts called Breaking Bread.

Today we have Dominique Dawes, the first African-American female gymnast to win an individual medal. Nicknamed “Awesome Dawesome,” she went on to compete in three Olympics.

If I could choose to have dinner with somebody who has passed away, I would choose to dine with Mother Angelica.

Mother Angelica was the nun who founded the Eternal Word Television Network (EWTN), the largest religious media network in the world, starting with only $200. She is the only woman to have founded and led a cable network for over 20 years.

I’d invite Mother Angelica to my home and have her sit at the head of our table, alongside my husband and two baby girls. The meal I’d cook would be soul food (I grew up on it), consisting of chitlins, collard greens, cheese grits and candied yams. Mother Angelica would understand this meal: She was raised around blacks and poor Italians in a tough Canton, Ohio, neighborhood. She knew people, she understood their plights, she was one of them!

And she knew resilience most of all, raised by a single mother from an early age after her father had abandoned them.

I’d ask her to say the blessing, then proceed to ask her a few things about her life and about fortitude. A priest once told me that it’s very difficult to have a relationship with your Heavenly Father after your earthly father has abandoned you. I often wondered: How did she overcome this abandonment, learn to forgive her father and ultimately trust in God?

She was a cloistered nun, in a convent, yet she was seen by hundreds of millions of people worldwide as the host of a series on EWTN. How was she able to embrace both of these so very opposite vocations? (I am an introvert by nature, and performing in front of millions during the Olympic Games gave me anxiety, as does speaking at events in front of thousands now.)

Over dinner, I’d be fascinated to hear from Mother Angelica about how she channeled her own pain into a larger purpose. And I would ask her how I might help others, whether they suffer from anxiety, depression, addiction, physical ailments or the pain of abandonment or divorce. Her whole life, after all, was dedicated to helping others, especially the disenfranchised.

Mother Angelica, I would ask, how can we here on earth emulate what you did, even in a smaller way, offering help to others in a world that so desperately needs it?

The Sultan of Swat. The Caliph of Clout. The Great Bambino. When baseball fans hear these monikers, nearly 70 years after Babe Ruth died on Aug. 16, 1948, they’re taken back to the golden age of baseball, when one charismatic player ruled the sport by smacking more home runs than entire teams, changing the game in the process.

But before Ruth tantalized fans with his prodigious power, he was practically helpless. From the time he was 7 years old, Ruth grew up in St. Mary’s Industrial School for Boys, a reformatory and orphanage for children in Baltimore. He might have amounted to nothing without the help of one dedicated mentor.

George Herman Ruth Jr. was born in Baltimore on Feb. 6, 1895. His mother was the former Katherine Schamberger. He was a rambunctious child who routinely skipped school, drank and taunted local police officers around his home. He became so unruly that his parents sent him to St. Mary’s, a notoriously strict institution, although he pleaded with them not to.

At St. Mary’s, Ruth had to adhere to a grinding schedule of school, prayer and work, which left no time for carousing. His parents had signed over custodial rights to the school and essentially washed their hands of him, leaving Ruth alone and desperately in need of a father figure.

Then he met Brother Matthias, a brawny, 6-foot-6 disciplinarian and assistant athletic director at St. Mary’s, who took to Ruth immediately. Matthias was widely credited with introducing Ruth to baseball. They spent hours together honing Ruth’s skills, both as a hitter and a left-handed pitcher.

“It was at St. Mary’s that I met and learned to love the greatest man I’ve ever known,” Ruth wrote about Matthias in his 1948 autobiography, “The Babe Ruth Story.”

Ruth learned to play during the dead-ball era of the early 20th century, when hitters swung down on the ball, kept it inside the park and relied on speed as their greatest asset. Baseball was strategic, built on grounders, bunts and stolen bases instead of power.

Matthias had a different approach. He belted majestic fly balls deep into the St. Mary’s outfield. The impressionable Ruth copied Matthias’s approach, which led to his unprecedented gift for hitting bombs.

Word of Ruth’s talents spread, and Jack Dunn, owner of the minor league Baltimore Orioles, came to watch him play. Dunn was so impressed that he became Ruth’s legal guardian in order to sign the 19-year-old. On his arrival in the clubhouse, Orioles players referred to the burly Ruth as “Jack’s newest babe,” coining one of the great nicknames in American sports history.

Ruth’s career with the Orioles was short. That summer he was acquired by the Boston Red Sox, for whom he would win his first three championships as a pitcher and an outfielder.

But the Red Sox made a grave mistake when they sold Ruth to the rival New York Yankees in 1920. Many bleacher historians blame this error for the Red Sox’ 86-year championship drought — the so-called Curse of the Bambino.

Ruth played 15 seasons with the Bombers, amassing four more championships. His career marks include a .690 slugging percentage and 714 home runs, a total that stood as the record until Henry Aaron broke it on April 8, 1974.

Babe Ruth signed his new contract with Jacob Ruppert in 1934, the last of the 15 seasons he played with the Yankees. Credit: The New York Times

Ruth’s place in baseball’s pantheon was apparent to anyone who saw him play. He was part of baseball’s first Hall of Fame class, in 1936, the year after he retired. An inveterate cigar smoker, he learned he had throat cancer a decade later and died from the disease on this day in 1948.

His Yankees teammate Joe Dugan probably summed up Ruth’s larger-than-life stature best, elevating it to myth: “To understand him you had to understand this: He wasn’t human.”

Before tonight’s heavyweight gold medal match between Russia’s Evgeny Tishchenko and Kazakhstan’s Vassiliy Levit, we revisit one of Olympic boxing’s most talented pugilists.

Most boxers battle for the title, money and acclaim. Teófilo Stevenson rejected all of that for his country.

Stevenson, who stood 6 feet 5 inches, weighed 220 pounds and battered opponents with a deft left jab and a sledgehammer straight right, won three consecutive Olympic heavyweight gold medals for Cuba, in 1972 in Munich, 1976 in Montreal and 1980 in Moscow.

His 1980 victory made him the first Olympic boxer to earn three consecutive gold medals in the same division. But he might have had a chance for another: Stevenson was still a tremendous fighter when Cuba boycotted the 1984 Olympics in Los Angeles. He won the last of his three amateur boxing world titles two years later at the age of 34.

After his first two medals, boxing promoters were practically slavering at the potential ticket sales of a Cold War-era match between Stevenson, a product of Communist Cuba, and Muhammad Ali, who died in June at 74.

But athletes in Fidel Castro’s Cuba were not permitted to compete professionally, so Stevenson would have had to defect in order to fight Ali. He was not willing to do so, even though promoters promised him $1 million or more.

“I will not leave my country for one million dollars or for much more than that,” Stevenson said in an article, headlined “He’d Rather Be Red Than Rich,” in Sports Illustrated in 1974. “What is a million dollars against eight million Cubans who love me?”

Ali told The New York Times in 1976 that he thought Stevenson was a promising amateur fighter but that he was probably not ready for the pros.

“I saw him get a little tired in round three against the last guy he fought,” Ali said. “Imagine if he had to fight 15 hard rounds against bad people like me or George Foreman or Joe Frazier or Ken Norton, somebody who would put pressure on him.”

Ali said that Stevenson’s passing up such a lucrative payday was a big mistake.

(The next boxer to win three Olympic gold medals in the same weight class was Stevenson’s countryman Felix Savon, who took heavyweight gold in 1992, 1996 and 2000.)

“I didn’t need the money because it was going to mess up my life,” Stevenson told The Chicago Tribune in 2003. “For professional boxers, the money is a trap. You make a lot of money, but how many boxers in history do we know that died poor? The money always goes into other people’s hands.”

While the world was engulfed in war in the first half of the 1940s, three men were consumed by growing unrest across India, with the fates of tens of millions of their compatriots in their hands.

This day — a moment, really — in history belongs to these leaders: Mohandas Gandhi, Jawaharlal Nehru and Muhammad Ali Jinnah.

At the stroke of midnight on Aug. 15, 1947, power over one-fifth of humanity was transferred from Britain to the newly independent countries of India and Pakistan. But there was a fatal flaw: There were no borders.

Indians had struggled for decades to rid themselves of British rule, galvanized by the nonviolent movement led by Gandhi. Their efforts were kept in check by ruthless military force, but by the end of World War II, Britain lacked the will and the means to defeat the campaign. It reluctantly relinquished India after 200 years, leaving the country on the brink of implosion.

Gandhi, Nehru and Jinnah were divided on what should happen once the British left. Gandhi, more an idealist than a realist, wanted an undivided nation; he chose to remain out of government.

The British negotiated with two parties: the Muslim League, led by Jinnah, who believed that a separate state was the only way to protect the rights of Muslims, a minority; and the mostly Hindu Indian National Congress, led by Nehru, who grudgingly went along with the British decision to divide India along religious lines.

Cyril Radcliffe, who had never been to Asia, arrived in India 36 days before the date of the partition to draw the lines that would split one of the world’s largest and most ethnically diverse countries. On Aug. 9, he finished drawing the map, but the British viceroy, his superior, kept it a secret; he didn’t want the British to be blamed for any ensuing violence. The secrecy prolonged the uncertainty for millions and very likely increased the loss of life to come.

Shortly before the clock struck midnight on Aug. 15, 1947, Nehru, Gandhi’s successor at the helm of the independence movement and India’s first prime minister, was inside Parliament in New Delhi delivering an address recognized as one of the greatest of the 20th century.

“Long years ago, we made a tryst with destiny, and now the time comes when we shall redeem our pledge, not wholly or in full measure, but very substantially,” he began.

Nearing the conclusion, he said, “There is no resting for any one of us till we redeem our pledge in full, till we make all the people of India what destiny intended them to be.”

Those stirring words met the occasion, but had no effect on the swirling chaos on the ground as mobs sought on their own to determine the religious makeup of towns and villages. Communities that had lived together for centuries viciously turned on each other. The borders were announced two days after independence: Hindu-majority India flanked by Muslim-majority West Pakistan and East Pakistan.

Up to 15 million people moved across the two borders in less than a year, one of the fastest mass migrations in history. Millions of Muslims fled India, most heading west. About the same number of Hindus and Sikhs went mostly east into the new India. About one million people were killed.

On Jan. 30, 1948, Gandhi, who remained the strongest advocate for peace, was assassinated by a Hindu extremist who opposed his ideology.

Gandhi’s death “left all India stunned and bewildered as to the direction that this newly independent nation would take without its ‘Mahatma’ (Great Teacher),” wrote The New York Times. Jinnah, who “brought about, almost single-handed, one of the most sweeping political transformations of the century in Asia,” The Times wrote, died the same year, on Sept. 11, 1948. Nehru ruled for 17 years and died on May 27, 1964.

Those borders, hastily drawn by the British, became the focus of four wars and seven decades of animosity between India and Pakistan. For many millions on the subcontinent today, the promise that came with independence remains unfulfilled.

Were you, a family member or your community personally affected by the partition of India? If so, we’d like to hear from you.

What was your reaction when you first heard about the partition? How old were you?

Please describe your journey. For instance, where did you go after the partition? Did you help others? And did you face any violence?

How old are you now?


Robin Williams was one of the most explosively, exhaustingly, prodigiously verbal comedians who ever lived, says the film critic A. O. Scott. And the only thing faster than Williams’s mouth was his mind. Video by Adam Freelander

Robin Williams, an indefatigable, improvisational genius, arrived on screens as an alien and left as an Academy Award-winning actor.

After his death, two years ago today, The New York Times described him like this:

Onstage he was known for ricochet riffs on politics, social issues and cultural matters both high and low; tales of drug and alcohol abuse; lewd commentaries on relations between the sexes; and lightning-like improvisations on anything an audience member might toss at him. His gigs were always rife with frenetic, spot-on impersonations that included Hollywood stars, presidents, princes, prime ministers, popes and anonymous citizens of the world. His irreverence was legendary and uncurtailable.

We remember Williams with some of our favorite scenes and lines (some of which contain strong language), and encourage readers to do the same on Twitter using #tellnyt.

“Good Morning, Vietnam” — 1987

Williams earned an Academy Award nomination for playing Adrian Cronauer, a chatty Armed Forces Radio host in Saigon in the 1960s. “This is the first role that calls upon me to do what I do best — me,” he said.

“Aladdin” — 1992

He voiced an unforgettably zany blue genie in the 1992 Walt Disney feature.

“Mrs. Doubtfire” — 1993
“Off your Mercedes, dear, you own that big expensive car out there? Oh, dear. Well, they say a man who has to buy a big car like that is trying to compensate for smaller genitals.”

Williams played an actor who cross-dressed as a British housekeeper to spend more time with his children in this 1993 family comedy.

“Good Will Hunting” — 1997
“You’re an orphan, right? Do you think I know the first thing about how hard your life has been, how you feel, who you are, because I read ‘Oliver Twist?’ Does that encapsulate you?”

Williams’s Oscar-winning turn as a therapist working with a troubled prodigy, played by Matt Damon, offered him a rare serious role that took advantage of his wide-ranging talents.

“Death to Smoochy” — 2002
“Even when you’re squeaky clean, you can still fall in the mud.”

Williams starred in this black comedy as Rainbow Randolph, a children’s TV show host who is fired for taking bribes and replaced by an upstanding performer played by Edward Norton.

Didrikson backed up her swagger: there was seemingly no sport she could not master. After her world-record-breaking performance at the women’s track and field national championships in 1931, an article in The New York Times described her as a “feminine athletic marvel” who was as adept at “swimming, boxing, tennis, baseball and basketball as she is in track.”

She was born Mildred Ella Didriksen (she later changed the “e” to an “o”) on June 26, 1911, in Port Arthur, Tex., but went by Babe because, even as a youth, she could supposedly hit a baseball like Ruth.

In 1932, she competed at the Amateur Athletic Union’s national track and field championships, which, at the time, served as Olympic qualifiers. Some teams had as many as 22 athletes, but Didrikson performed solo in all of the events as a publicity stunt for her sponsor. She won five individual events, tied in a sixth and won the championships single-handed.

At the 1932 Games, Didrikson won gold medals in both the javelin throw and the high hurdles. In the high jump, she cleared 5 feet 5 inches, the same height as the gold medalist, Jean Shiley. But her final jump was disallowed and she was awarded the silver medal after a judge ruled that her technique had violated Olympic rules, even though the issue had not been raised in earlier rounds.

The fact that Didrikson won only three medals also deserves an asterisk. Women were limited to three Olympic track and field events in 1932, so she might well have won more had she been allowed to enter additional events.

Didrikson’s success at the Olympics had made her internationally famous, but by the time she died, on Sept. 27, 1956, she was also known as a champion golfer. She had only taken up the sport in 1935, but had tackled it with the same drive she brought to all of her athletic endeavors. She met her future husband, the professional wrestler George Zaharias, when they were paired to play golf together at a tournament. She took his surname when they married in 1938.

“At least part of Mrs. Zaharias’ success could be attributed to her powers of concentration and diligence,” her obituary in The Times said. “When she decided to center her attention on golf, she tightened up her game by driving as many as 1,000 balls a day and playing until her hands were so sore, they had to be taped. She developed an aggressive, dramatic style, hitting down sharply and crisply on her iron shots like a man and averaging 240 yards off the tee with her woods.”

As an amateur golfer, Zaharias once won 14 tournaments in a row. She helped found the Ladies’ Professional Golf Association and won 31 tournaments on tour. She also won 10 majors, including victories at the women’s United States Open in 1948, 1950 and 1954.

Zaharias beat Betty Hicks by 12 strokes in the 1954 United States Open, an astonishing margin considering that Zaharias had been treated for colon cancer in 1953 and had undergone a colostomy.

Zaharias became a spokeswoman for cancer awareness and toured for as long as she could, but the disease returned. She died from it in September 1956.

“I think that every one of us feels sad that finally she had to lose this last one of all her battles,” President Dwight D. Eisenhower said at the time.

The Associated Press named Zaharias the Woman Athlete of the Year six times and the World’s Greatest Woman Athlete of the First Half of the 20th Century. Sports Illustrated lauded her as the Woman Athlete of the 20th Century in individual sports. These accolades came decades after the sportswriter Grantland Rice first called her “Wonder Girl.”

But the comedian Bob Hope may have expressed Zaharias’s talents best with a self-deprecating comment he made when they played in a charity tournament together.

“I hit the ball like a girl,” Hope said, “and she hits it like a man.”


A few seconds, perhaps a fraction of a second, can mean the difference between victory and defeat, between becoming a legend and leaving as a footnote.

“People come out to see you perform, and you’ve got to give them the best you have within you,” the great track and field star Jesse Owens once said. “A lifetime of training, for just 10 seconds.”

Yet that lifetime of training, which propelled Owens into the history books with his performance in the 1936 Games in Berlin, seemed for a time as if it might be of little use. With the rise of Nazi Germany roiling Europe, the Amateur Athletic Union remained divided in 1935 over whether to allow American athletes to compete in Berlin; it ultimately approved their participation, but only by a narrow vote.

“I wanted no part of politics,” Owens said. “And I wasn’t in Berlin to compete against any one athlete. The purpose of the Olympics, anyway, was to do your best.”

The A.A.U. wasn’t the only organization involved in a moral tug of war over the Olympics. Owens, who was black, was encouraged by some civil rights groups to boycott the games. After deciding to go, he found a chilly reception in Germany, where claims of Aryan supremacy were central to Nazi ideology. He was called racial epithets and subjected to other mistreatment.

To the dismay of Hitler and the Nazis, Owens went on to win four gold medals — in the long jump, the 100-meter dash, the 200-meter dash and the 4x100-meter relay — more than any other American track and field athlete in a single Olympic Games. His long jump record of 8.13 meters would not be surpassed for 25 years.

“I had jumped into another rare kind of stratosphere — one that only a handful of people in every generation are lucky enough to know,” Owens said of his accomplishments.

New Yorkers lined the streets to welcome Owens back from the Olympics in September 1936. Credit: Associated Press

The son of a sharecropper and grandson of a slave, James Cleveland Owens was born on Sept. 12, 1913, in Alabama and moved with his family as a child to Cleveland. Sickly in his youth, he went by the nickname J.C., and a teacher’s misunderstanding during roll call led to his being called Jesse for the rest of his life.

Owens broke records at the junior high school, high school and collegiate levels in the 1920s and ’30s. But it was his time at Ohio State University that proved crucial to his development.

For all his record-breaking Olympic success overseas, his return home was sobering. President Franklin D. Roosevelt didn’t acknowledge his achievements, a snub that stung Owens. Unlike modern-day athletes who can be paid handsomely through endorsements and other commercial deals, Owens had to take myriad jobs to support his family. He later became a motivational speaker and public relations representative.

In 1976, President Gerald R. Ford awarded him the Presidential Medal of Freedom, the highest honor given to civilians in the United States. Owens died from complications related to lung cancer on March 31, 1980.

Several movies have been made about his life, including this year’s “Race,” starring Stephan James. In a review of the film, which he called “studiously uplifting,” Stephen Holden wrote in The Times, “Long before television elevated black sports heroes into gods, there were athletes like Jesse Owens who paved the way.”

In Rio, the heirs of Owens, like Usain Bolt of Jamaica and Allyson Felix of the United States, are looking to carve their own names in Olympic history, propelled by the chance for glory, pride in country and perhaps, as Owens expressed, a simple love of the sport.

“I always loved running — it was something you could do by yourself and under your own power,” he said. “You could go in any direction, fast or slow as you wanted, fighting the wind if you felt like it, seeking out new sights just on the strength of your feet and the courage of your lungs.”

For the nearly 30 years that Adam Yauch’s scratchy voice blared through boomboxes, and later earbuds, he and his hip-hop group, the Beastie Boys, changed the face of music. And four years after Yauch’s death at age 47, they can’t, they won’t and they don’t stop having an influence on their beloved city: New York.

Yauch, known as MCA, was born 52 years ago on this day in Brooklyn. He was a “New York kid” with “just enough crazy,” according to his longtime bandmate Adam Horovitz (Ad-Rock), and hung out at Palmetto Playground (renamed Adam Yauch Park) in Brooklyn Heights. He attended Edward R. Murrow High School in the borough’s Midwood neighborhood and spent two years at Bard College in the early 1980s.

“Man, living at home is such a drag/Now your mom threw away your best porno mag (Busted!)/You gotta fight for your right to party”

In 1983, Yauch’s hard-core punk band, which also included Michael Diamond (Mike D), morphed into an unlikely hip-hop trio: white Brooklynites rapping about girls, vandalism and, of course, their right to party. The Beastie Boys’ slapstick was “greeted by some hip-hop purists as a novelty act,” Jon Pareles, the New York Times music critic, wrote after Yauch’s death. “They were Jewish bohemians, not ghetto survivors; they were jokers, not battlers.”

Their album “Licensed to Ill” (1986) exposed suburban fans of rock radio to hip-hop. It became the first hip-hop album to reach No. 1 on the Billboard chart, and its hit song “No Sleep Till Brooklyn” became a popular slogan that echoed from radios and appeared on T-shirts across New York.

“Born and bred in Brooklyn the U.S.A./They call me Adam Yauch but I’m M.C.A.”

Three years later, the album “Paul’s Boutique” became a hip-hop staple: a “seamless set of provocative samples and rhymes — a rap opera, if you will,” Rolling Stone magazine said at the time.

The Beastie Boys’ rhymes never became too serious, but they did mature. Yauch became a supporter of feminism and a practicing Buddhist, creating the Milarepa Fund to support Tibetan independence from China. A series of Tibetan Freedom Concerts raised awareness for his cause.

“I want to say a little something that’s long overdue/The disrespect to women has got to be through”

Yauch (pronounced “yowk”) also spoke out against any backlash against Islam in 1998, long before talk of a Muslim immigration ban. “I think that another thing America needs to think about is our racism, racism that comes from the United States towards Muslim people and towards Arabic people,” he said, adding, “The United States has to start respecting people in the Middle East.”

In 2004, the Beastie Boys offered a post-Sept. 11 tribute to their city with the album “To the Five Boroughs.”

Offstage, Yauch, Horovitz and Diamond were businessmen, too. In 1992, they started Grand Royal, their label and magazine. Yauch also directed many of the Beastie Boys’ music videos and started Oscilloscope Laboratories in New York to produce and distribute independent films.

“I burn the competition like a flame thrower/My rhymes they age like wine as I get older”

In his 40s, Yauch missed the Beastie Boys’ 2012 induction into the Rock and Roll Hall of Fame: He had received a diagnosis of salivary gland cancer three years earlier and was too ill to attend. Before his death on May 4, 2012, he was still able to record “Hot Sauce Committee Part Two,” a 2011 album that fittingly references the 1980s New York that gave rise to Yauch and his band of pranksters-turned-legends.

Hans Christian Andersen, whose fairy tales endure more than a century after his death on this day in 1875, had a childhood as difficult as those of his plucky protagonists.

Born on April 2, 1805, in Odense, Denmark, Andersen grew up in stark poverty, but his father, a shoemaker, cultivated his imagination.

“On Sundays he made me panoramas, theatres, and transformation pictures, and he would read me pieces out of Holberg’s plays and stories from the ‘Thousand and One Nights,’” Andersen was quoted as saying in his obituary in The New York Times. “And those were the only moments in which I remember him as looking really cheerful, for in his position as an artisan he did not feel happy.”

Andersen found beauty in his humble surroundings.

“A single little room, its floor space almost completely taken up by the shoemaker’s workbench, the bed, and the turn-up bench on which I slept, comprised my childhood home,” he wrote in his autobiography, translated as “The Fairy Tale of My Life.” “But the walls were covered with pictures, on the chest of drawers there stood beautiful cups, glasses, and knickknacks, and above the workbench, by the window, there was a shelf with books and songs.”

Andersen was a solitary child who spent most of his time making costumes for puppets and enacting plays on a model stage his father had built for him. He headed for Copenhagen when he was just a teenager.

His first play was soon produced by a theater there, and he went on to write poems, novels and, of course, children’s stories.

The hovels of Andersen’s childhood were far behind him, but he retained his gift for spinning magic from the mundane. Many of his stories featured children who persevered in the face of ridicule, ignorance and evil.

Versions of his tales, which include “The Ugly Duckling,” “The Emperor’s New Clothes” and “The Princess and the Pea,” remain childhood favorites. Other yarns inspired films like “The Little Mermaid,” “Thumbelina” and “Frozen,” which was originally and very loosely based on the stories he collectively titled “The Snow Queen.”

In time, Andersen became famous and traveled around Europe, meeting celebrities like Charles Dickens. So the opening line of his autobiography is hardly hyperbolic.

When Henri Cartier-Bresson first picked up a tiny Leica 35mm film camera in 1931, he began a visual journey that would revolutionize 20th-century photography.

His camera could be wielded so discreetly that it enabled him to photograph while being virtually unseen by others — a near invisibility that turned photojournalism into a primary source of information and photography into a recognized art form.

Cartier-Bresson’s concept of the “decisive moment” — a split second that reveals the larger truth of a situation — shaped modern street photography and set the stage for hundreds of photojournalists to bring the world into living rooms through magazines such as Life and Look. In 1947, he and Robert Capa helped create the photographer-owned cooperative photo agency Magnum.

Photo: A street scene in the southern French town of Hyères in 1932. Credit: Henri Cartier-Bresson/Magnum Photos, courtesy Fondation Henri Cartier-Bresson

Though he often focused on the human condition in his photographs, Cartier-Bresson would look at his contact sheets or prints upside down to judge the images apart from any social content. They stood as rigorous compositions on their own.

His signature shooting technique was to find a visually arresting setting for a photograph and then patiently wait for that decisive moment to unfold. In his obituary in The New York Times in 2004 — Cartier-Bresson had died on Aug. 3 — the critic Michael Kimmelman noted that Cartier-Bresson was equally adept at responding instantly to changing circumstances.

“Photographers and others who saw him work talked about his swift and nimble ability to snap a picture undetected,” he wrote. “(Sometimes he even masked the shiny metal parts of his camera with black tape.) They also admired his coolness under pressure. The director Louis Malle remembered that, despite all the turmoil at the peak of the student protests in Paris in May 1968, Mr. Cartier-Bresson took photographs at the rate of only about four an hour.”

Photo: Sunday on the banks of the river Marne, taken in 1938. Credit: Henri Cartier-Bresson/Magnum Photos, courtesy Fondation Henri Cartier-Bresson

With the primacy of digital photography and social media in the 21st century, slow, painstaking image-making is becoming a relic. Photographers and their images now move at a pace as fast as the events swirling around them. Technological advances in cameras and methods of distribution have ushered in a new visual era, not unlike what Cartier-Bresson’s Leica did almost a century ago.

Photographs are no longer rare artifacts, nor primarily a means of learning about the exotic or unknown. They arrive instantaneously on our phones every day from every corner of the world and from all kinds of people. With a smartphone, everyone is a photographer, and images compete for crowd approval on social media channels like Instagram, Snapchat and Facebook.

Which raises questions on this anniversary of Cartier-Bresson’s death: Do these changes make a master’s carefully constructed images irrelevant? Or are they even more instructive today? Respond on Twitter using the hashtag #tellnyt.

James Baldwin, whose cutting, unequivocal writing about race relations helped make America more equal than it was before, was born on this day in 1924, according to many accounts. The Times wrote in his obituary on Dec. 1, 1987:

Mr. Baldwin’s prose, with its apocalyptic tone — a legacy of his early exposure to religious fundamentalism — and its passionate yet distanced sense of advocacy, seemed perfect for a period in which blacks in the South lived under continual threat of racial violence and in which civil-rights workers faced brutal beatings and even death.

Here are some of his most prescient lines:

I imagine one of the reasons people cling to their hates so stubbornly is because they sense, once hate is gone, they will be forced to deal with pain.

What is ghastly and really almost hopeless in our racial situation now is that the crimes we have committed are so great and so unspeakable that the acceptance of this knowledge would lead, literally, to madness. The human being, then, in order to protect himself, closes his eyes, compulsively repeats his crimes, and enters a spiritual darkness which no one can describe.

Only white Americans can consider themselves to be expatriates. Once I found myself on the other side of the ocean, I could see where I came from very clearly, and I could see that I carried myself, which is my home, with me. You can never escape that. I am the grandson of a slave, and I am a writer. I must deal with both.

I was a maverick, a maverick in the sense that I depended on neither the white world nor the black world. That was the only way I could’ve played it. I would’ve been broken otherwise. I had to say, ‘A curse on both your houses.’ The fact that I went to Europe so early is probably what saved me. It gave me another touchstone — myself.

On March 10, 1876, Professor Alexander Graham Bell stood in a Boston boarding house holding a receiving device connected to a series of wires that ran into an adjacent room. There, his assistant, Thomas A. Watson, waited patiently, clutching another receiver to his ear.

Bell spoke into his end of the contraption, and Watson heard his voice in the receiver: “Mr. Watson! Come here! I want—!”

From that experiment using just a few feet of wire would grow an industry that would transform the world. Through the likes of the American Bell Telephone Company and its successor, AT&T (known colloquially as Ma Bell), what was once Bell’s “toy” became a communications goliath made up of billions of dollars’ worth of infrastructure carrying tens of millions of calls every day.

Alexander Graham Bell — who died at 75 on this day in 1922 at his estate in Nova Scotia in Canada — was fascinated by speech, sound and communication from a very young age. He was born on March 3, 1847, in Edinburgh, Scotland, to Alexander Melville Bell and Eliza Grace Symonds Bell, who was deaf. He was homeschooled by his father, a phoneticist and the developer of Visible Speech, a series of symbols designed to aid the deaf in oration.

Bell moved to Boston in the early 1870s and there used methods that he had learned from his father to teach deaf students. His techniques proved so useful that he eventually taught them to others as a professor at the Boston University School of Oratory.

During these years he continued his research into sound at the university, experimenting with electricity. He hired Watson, an electrical designer and mechanic, for his electrical expertise. Soon they were collaborating on acoustic telegraphy, hoping to transmit a human voice by means of pulses along a telegraph wire.

Bell was granted a patent for the telephone — No. 174,465 — on March 7, 1876. An “Improvement in Telegraphy,” the documents stated. The patent, however, proved controversial from the start.

Even though Bell is known as the father of telephony, his claim as its inventor has been challenged repeatedly in hundreds of legal cases, some of which have appeared before the United States Supreme Court. Throughout, though, Bell’s patent was upheld.

“I may perhaps take credit for having blazed a trail for others who came after me,” Bell, who was humble by nature, once said, “but when I look at the phenomenal developments of the telephone and at the great system that bears my name, I feel that the credit for these developments is due to others rather than myself.”

He would go on to undertake important work in fields such as hydrofoils and aeronautics; make early advances in the creation of the metal detector; and develop a wireless telephone, called the photophone.

Summarizing the part Bell had played in the building of an increasingly interconnected society, The New York Times in 1947, to mark the 100th anniversary of Bell’s birth, asked readers to put themselves in the shoes of a mid-1800s parent encouraging a child to imagine the possibilities that might lie ahead.

“If in the middle of the last century,” The Times said, “our fathers had read to their children a tale about a charming princess who summoned an equally charming prince to her rescue over a copper wire with the aid of some wonderful lamps in which magical filaments glowed, there would have been cries of admiration. Well, fairy tales have a way of coming true in science and invention. And Bell’s telephone is one of them.”

I often wonder what Yves Saint Laurent, who was born on this day in 1936, would think of the modern fashion world.

This is in part because his name has been in the news recently, given the upheaval at the brand he built, where yet another creative director will debut a newish vision for the label next month.

But it’s also because many of the issues currently front and center — not just in fashion but also in the wider conversation about the social contract — were causes that he championed. In fact, he never saw them as causes per se, but rather as simply part of the definition of what it meant to be modern.

Diversity? Saint Laurent was among the first designers to embrace black models on the runway, claiming such women as Iman, Katoucha Niane and Dalma Callado as his muses. Naomi Campbell credits him with getting her her first French Vogue cover.

Yet every season, we still seem to have the same discussion about the color myopia of the industry.

The power of pantsuits? He understood what they could mean for women back in 1966, when he unveiled his first Le Smoking: a tuxedo for women worn with a ruffled white shirt and a satin cummerbund. The idea shocked the world then. The New York socialite Nan Kempner was turned away from La Côte Basque for wearing hers, only to return having divested herself of the trousers and wearing the jacket as a mini-dress. (That was, somehow, more acceptable to the management.)

But trouser suits were the dress of choice at the just-concluded Democratic National Convention in Philadelphia, and in them Hillary Clinton has built the second stage of her public life, taking Saint Laurent’s vision beyond fashion history to history, full stop.

The democratization of fashion? Saint Laurent popularized the idea of high-fashion ready-to-wear, introducing Rive Gauche, his Left Bank boutique and off-the-rack collection, in 1966. He was the first couturier to make his clothes available to consumers beyond the gilded doors of the haute salons. Now e-commerce has moved the dial even further, and for the first time this season three designers (Tom Ford, Tommy Hilfiger and Burberry) will be showing clothes that can be bought the next day, instead of six months down the line.

So maybe Mr. Saint Laurent, who died on June 1, 2008, would be rolling his eyes. Maybe he would be laughing. Probably he would be both frustrated and proud: at how far we’ve come, and how much further we have to go.

“A new phenomenon in physics” that could generate 200 million volts of energy was reported in a wire service dispatch on Page 2 of The New York Times on Jan. 29, 1939. Otto Hahn, the paper reported, had discovered that the uranium atom could be split, a conclusion, he acknowledged, that “violated all previous experience in the field of nuclear physics.”

But the breathtaking disclosure was delivered with a major caveat: The practical application of the discovery, if any, would take 25 years.

That prediction, as it turned out, was off by a long shot. Less than seven years later, as a direct result of Hahn’s discovery, the United States dropped two atomic bombs on Japan to end World War II.

When Hahn died in Germany on July 28, 1968, his Times obituary was featured on Page 1. By then, the significance of the Nobel Prize-winning chemist’s original finding was well established: He had, the article said, “discovered that the atom could be split, paving the way for the nuclear bomb” in a “discovery that changed the course of 20th-century history.”

Photo: The equipment used by Otto Hahn and Fritz Strassmann. Credit: Hulton Archive/Getty Images

Hahn made his discovery in his laboratory at the Kaiser Wilhelm Institute in Berlin, working with his assistant, Fritz Strassmann. (Strassmann’s predecessor, Lise Meitner, had been fired by the institute because she was Jewish. Hahn said after the war that he had opposed Nazism.)

But the process of splitting the uranium-235 atom would not be labeled nuclear fission until later, and Hahn himself, as a chemist rather than a physicist, initially described his discovery in the most equivocal terms.

By May 1940, it had become clear why scientists were reluctant to discuss the atom-splitting and the energy it released — a development they “regarded as ushering in the long dreamed of age of atomic power, and, therefore, as one of the greatest if not the greatest discovery in modern science,” the reporter William Laurence wrote in The Times.

The main reason they were silent, Laurence explained, was “the tremendous implications this discovery bears on the possible outcome of the European war,” which by then had already begun.

Hahn later said that he had never believed that his discovery would have military implications. “I am a scientist,” he said, “and like all scientists am interested only in discovery and not application.”

He later became an antiwar activist who opposed nuclear proliferation and expressed his fears in this rhyme:

American elections — and the American electorate — grow more complex and confounding every campaign cycle. This year’s presidential contest, featuring one of the most experienced politicians ever to seek the White House and a showman who has never served a day in the military or elected office, has befuddled even the most experienced observers.

George H. Gallup, one of the founders of public opinion research, would have reveled in the challenges presented by the personalities — the two most unpopular major party candidates to win their parties’ nominations — and by the seemingly contradictory views of the public about the state of the nation.

Mr. Gallup, who died 32 years ago this week at age 82, could not, and probably would not, tell you who he thought would win in November. But he could tell you what forces were driving public opinion, from fear of crime and terrorism to a widespread unease about rapid cultural and demographic changes.

And he most certainly would have pointed out the flaws in a presidential primary system that produced two candidates with such high negative ratings and so many voters in despair.

“Dr. Gallup had a major conviction that the whole election process in the nation was way off on a wrong track, and he argued that the people wanted major reforms — including abolishing the Electoral College, a single national primary, confining campaigning to a month or two in the fall, and national funding of the campaign,” Frank Newport, editor in chief of the Gallup Organization, wrote in an email. “He no doubt would be feeling ever more strongly about these convictions in today’s environment.”

Photo: George Gallup at the NBC studios in New York in 1968. Credit: Associated Press

Mr. Gallup, an Iowan with a commanding presence and a bone-crushing grip, would also undoubtedly have strong feelings about the profound changes roiling the polling industry. His organization pioneered many of the advances in measuring public opinion, including use of the telephone rather than mail or face-to-face interviews. That technology is now under scrutiny, as more and more pollsters are turning to the internet and mobile devices to conduct surveys. (Gallup and The New York Times rely almost exclusively on telephone polling, but are experimenting with reaching the public in other ways.)

Last year Gallup announced that it would not conduct so-called horse race polls (“If the election were held today, … ?”) this election cycle. A Gallup poll famously predicted that Thomas E. Dewey would defeat Harry S. Truman in 1948, and the firm’s final poll in 2012 badly overestimated Mitt Romney’s share of the vote. The company instead is now focusing on the mood of the public, taking, as Mr. Gallup called it in one of his book titles, “The Pulse of Democracy.”

When Hillary Clinton formally clinches the Democratic presidential nomination this week in front of television cameras and a crowd of thousands, one vital influence will be conspicuously absent: her mother, Dorothy Rodham, whose quintessentially American story of resilience is woven into the fabric of her candidacy.

“No one had a bigger influence on my life or did more to shape the person I became,” Mrs. Clinton wrote in her 2014 memoir, “Hard Choices.”

When she claimed victory over Senator Bernie Sanders in the race for the nomination, she invoked her mother again, saying, “I wish she could see her daughter become the Democratic Party’s nominee.”

Mrs. Clinton’s mother, the former Dorothy Emma Howell, was born on June 4, 1919, into poverty and neglect on the same day that Congress passed the 19th Amendment, which gave women the right to vote. (It was sent to states for ratification and took effect 14 months later.)

Dorothy’s parents fought violently in the house in Chicago that they shared with four other families. Her father, a firefighter, was granted a divorce and custody of the two children after Dorothy’s mother never showed up in court to fight accusations of abandonment and abuse. Dorothy and her little sister were sent on a cross-country train to live with their grandparents in California. Dorothy was 8, her sister was 3.

Their grandmother was old-fashioned and strict. She preferred black Victorian dress and tolerated no disobedience — Dorothy was not allowed to attend parties or have visitors. After she went trick-or-treating one Halloween, she was confined to her bedroom for a year, let out only to go to school.

At 14, Dorothy escaped her grandmother’s strict domain for the unforgiving America of the Great Depression. She cooked, cleaned and nannied for a family in San Gabriel, Calif., in exchange for room, board and $3 a week. She lived in near abject poverty, but in that household Dorothy learned what family was.

With her adopted family’s support and help from influential teachers, she eventually graduated from high school. Mrs. Clinton would later reflect on the small acts of kindness that helped her mother survive, like the teacher in elementary school who noticed that Dorothy couldn’t afford milk. The teacher brought an extra carton of milk every day, then asked: “Dorothy, I can’t drink this other carton of milk. Would you like it?”

After graduating in 1937, Dorothy was lured back to Chicago by her mother, who told her that her new husband would pay for Dorothy’s tuition at Northwestern University. But her mother lied: She brought Dorothy back to work as a housekeeper. Heartbroken, Dorothy eventually found secretarial work.

“I’d hoped so hard that my mother would love me that I had to take the chance and find out,” Dorothy told her daughter when asked why she returned home. “When she didn’t, I had nowhere else to go.”

In 1942, Dorothy married Hugh Ellsworth Rodham, a conservative Republican who operated a small drapery business. They raised three children — Hillary Diane, Hugh Jr. and Tony — in the leafy suburb of Park Ridge, Ill. Mr. Rodham exerted a tough influence on his daughter while passing on to her his distinctively boisterous laugh — one that could “turn heads in a restaurant and send cats running from the room,” she wrote.

Dorothy Rodham raised her daughter to stand her ground and hit back if necessary, Mrs. Clinton wrote. In 1965, after Hillary Rodham had entered Wellesley College as a civic-minded Republican and had become plagued by doubts about remaining there, her mother bucked her up. “You can’t quit,” she quoted her mother as saying. “You’ve got to see through what you’ve started.”

The war in Vietnam and the turmoil of the civil rights movement led Mrs. Clinton to undergo a political transformation. She graduated as an antiwar Democrat.

During her unsuccessful 2008 campaign for the presidential nomination, Mrs. Clinton was fiercely protective of her mother’s privacy. In the 2016 race, she has made her mother’s struggle part of the emotional core of her campaign.

Later in life, Dorothy Rodham resumed her education by taking college courses. She died on Nov. 1, 2011, while her daughter was secretary of state. Mrs. Clinton wrote:

Mom measured her own life by how much she was able to help us and serve others. I knew if she was still with us, she would be urging us to do the same. Never rest on your laurels. Never quit. Never stop working to make the world a better place. That’s our unfinished business.

This is the story of Cassius Marcellus Clay — not that Cassius Clay, the heavyweight fighter and luminous worldwide presence best known as Muhammad Ali.

This story is about the original Cassius Clay: the 19th-century scion of a slaveholding family who became a belligerent emancipationist, globe-trotting statesman, unsparing duelist, early Republican and larger-than-life American eccentric.

It was for that Cassius Clay, who died on July 22, 1903, at the Kentucky plantation house where he had been born 92 years earlier, that Ali’s father and, by extension, Ali himself were named.

A firebrand publisher, Yale-educated lawyer, Kentucky state legislator, major general in the Union Army, survivor of multiple assassination attempts and the United States minister to Russia under Presidents Lincoln and Johnson, General Clay was as well known for his private activities as for his public ones.

His obituary in The New York Times, published on July 23, 1903, is remarkable for a level of catty candor rarely seen in American news obituaries of the era — traditionally staid, reverential documents — and, very likely, of any era.

“He was found desperately ill, and has had every care,” the opening paragraph reads. “His children, long estranged by reason of his eccentricities, were again able to be with him, and were at the bedside when death ensued.”

Things get more delicious from there.

There was General Clay’s prolific dueling, which left him with a tangle of scars on his face and body but left his opponents far worse off: He was said to have slain more men in duels than anyone else in the country.

On one occasion, caught without his pistol, General Clay was shot above the heart by a would-be assassin. He forestalled further ado by slicing off the assailant’s nose and ears with a Bowie knife.

Then there was General Clay’s precipitate divorce from his first wife of 45 years, Mary Jane Warfield, and his equally precipitate second marriage — made, he insisted, on populist political grounds — to a 15-year-old servant girl. He was 84 at the time.

“In 1837 he had married his first wife, Miss Warfield, a member of an aristocratic family of slave holders,” the Times obituary said. “Years afterward, when he had become an ardent disciple of Tolstoï, he came to the conclusion that he ought to wed a ‘daughter of the people.’ ”

And so he did, taking Dora Richardson as his bride in 1894. “Gen. Clay Weds Pretty Dora,” a headline in The Times proclaimed. “His Children Were Unable to Prevent Their Aged Parent’s Marriage.”

Young Dora, who evidently had little say in the matter of her betrothal, did not take kindly to being yoked to a man more than five times her age. She ran away repeatedly from home and from the boarding school to which her husband sent her.

“The fact that he supplied her with the most beautiful French gowns and lavished money upon her, she did not consider compensation for the teasing she got at the hands of her fellow-pupils,” The Times said. “In two months he had to take her back home, still uneducated.”

After four years of Dora’s comings and goings, which were avidly covered in the newspapers, General Clay divorced her.

She remarried “a worthless young mountaineer,” The Times reported, but after he was killed in a railway accident, the general tried vigorously to win back “his peasant wife,” as he fondly called her.

In this endeavor, unlike most others, he did not succeed.

The youngest son of Gen. Green Clay and the former Sally Lewis, Cassius Marcellus Clay was born on Oct. 19, 1810, at White Hall, his family’s mansion near Richmond, Ky.

His father (1757-1828) had been a hero of the Revolutionary War and was a general in the War of 1812; Henry Clay, the United States senator and statesman, was a cousin. Both of Cassius’ parents were from the Southern landed gentry, and the family was among the wealthiest landowners in the state.

At Yale, Cassius Clay heard a speech by the famed abolitionist William Lloyd Garrison and was converted to the cause. Returning home after earning a law degree in 1832, he established a practice in Lexington, served three terms in the Kentucky General Assembly and was a captain in the 1st Kentucky Cavalry in the Mexican War.

In 1844, he freed his own slaves and the next year started The True American, an emancipationist newspaper published in Lexington.

His proposals for gradually ending slavery, which he also promulgated in public lectures, did not go over well in Kentucky. He kept a cannon on hand to protect the newspaper office from looming mobs and weathered several more attempts on his life.

General Clay, who in the 1850s helped establish the Republican Party, was a friend and staunch supporter of Abraham Lincoln. After the outbreak of the Civil War, he organized the Cassius M. Clay Battalion, a corps of several hundred volunteers charged with protecting the White House.

In 1861, Lincoln appointed him minister to Russia, a post he held through the following year and again from 1863 to 1869. Dispatched to St. Petersburg, General Clay was instrumental in brokering the deal that in 1867 let the United States purchase Alaska.

The general’s later life was a sorry state of affairs. Barricaded in White Hall with a veritable arsenal beside him, he pined for the faithless Dora and worried obsessively that enemies, real and imagined, were coming to kill him.

Photo: In 1903, The New York Times ran two articles pondering General Clay’s mental state.

“Though his sight became so much impaired that he could not shoot any longer,” The Times reported in his obituary, “he kept plenty of firearms at his elbow, and kept trained from a porthole in the wall the same brass cannon he had caused to be built to protect his printing office.”

But the vital legacy of General Clay’s early life has endured down the years. He fathered a string of children — as many as 10 in some estimates — most with his first wife, although at least one with a St. Petersburg mistress. Two daughters, Mary Barr Clay (1839-1924) and Laura Clay (1849-1941), became leaders of the women’s suffrage movement.

In 1853, he donated the land for what became Berea College in Berea, Ky. Established two years later, it was the first interracial and coeducational college in the South, open to blacks and to women from its inception.

General Clay was buried in Richmond Cemetery, in Richmond, Ky., and his funeral was newsworthy for the racially mixed crowd in attendance.

“Never was a more striking scene witnessed on the way to Richmond, where the funeral services were to be held,” a contemporary newspaper account read. “From every humble Negro cottage along the roadside and at every cross roads, the mothers and large children carrying those who were too little to walk, the Negroes were lined up to pay their last respects to the man whom they honored as the Abraham Lincoln of Kentucky.”

In the end, then, its garrulous chronicle of its subject’s peccadilloes notwithstanding, the obituary of Cassius Marcellus Clay is every inch a requiem for a heavyweight.

July 20, 1969 — a date that lives in my memory as the great divide, the B.C. to A.D., in my journalism career. It was the day of the first walk on the moon by humans, Neil Armstrong and Buzz Aldrin, and I covered the event for The Times from mission control in Houston.

Neil A. Armstrong, the 38-year-old civilian commander, radioed to earth and the mission control room here:

“Houston, Tranquility Base here. The Eagle has landed.”

Just think, the 50th anniversary of the first moon walk is only three years away. Although I am now 82, my doctors seem to think I have a good chance of still being around for it. I doubt I will be up to the dawn-to-dawn workdays and multiple deadlines of yore, but a bit of the remembered excitement should be a tonic.

The Armstrong obituary I wrote ran above the fold on the front page on Sunday, Aug. 26, 2012. As I wrote it, I felt the old surge of Apollo emotion returning. Ever so briefly, I was young again, responding to a deadline and waiting presses.

In the obituary, I continued the exchange between Armstrong and mission control:

“Roger, Tranquility,” mission control replied. “We copy you on the ground. You’ve got a bunch of guys about to turn blue. We’re breathing again. Thanks a lot.”

The same could have been said for hundreds of millions of people around the world watching on television.

One reader that Sunday was a woman I had known and been fond of more than 50 years ago. She was still a space buff and in an email praised the obit. One thing led to another and in our rediscovery we dispelled creeping loneliness in favor of love. Today we are together.

Before Bruce Lee sprang into martial arts movies in the early 1970s, the average actor in a kung fu film may have been better prepared to deliver a Shakespearean soliloquy than a roundhouse kick.

“In our early action films, we used actors who knew little about fighting,” Raymond Chow, one of the producers behind “Enter the Dragon” and other movies that starred Lee, said in an interview in 1973. “We had to use various camera tricks. But the audiences can tell the difference. It knows a real fighter when it sees one. That’s why Bruce Lee is such a hit.”

Born in San Francisco and raised in Hong Kong, Lee was a fighter’s fighter. He began studying martial arts in earnest as a teenager, augmenting his fighting with strength training and dancing. In time he developed his own style, Jeet Kune Do.

Acting was in Lee’s blood. His father, Lee Hoi-Chuen, appeared in Cantonese opera and films, and Lee started acting as a boy. He appeared in Chinese films and the short-lived 1960s American television series “The Green Hornet,” playing the title character’s assistant, Kato, in their crime-fighting exploits. Kato was a valet and martial arts master, a supporting character who became popular in Hong Kong, where the show was known as “The Kato Show.” Kato’s popularity helped Lee land a movie deal with Chow’s Golden Harvest studio.

Lee’s precise, powerful yet seemingly effortless grace and presence before the camera made him an international star.

“Enter the Dragon,” one of the first martial arts movies produced by a Hollywood studio, was Lee’s best-known film. Lee did his own stunts, helped write the script and choreographed the fight scenes. In the film he played a martial arts master who infiltrates a criminal’s island fortress by agreeing to fight in a tournament. The film transfixed audiences around the world and cleaned up at the box office. Here’s one of the movie’s many memorable fight scenes.

Lee did not live to see the film’s success. He died at 32 on July 20, 1973, after being found unconscious on the floor of his Hong Kong apartment, just days before “Enter the Dragon” had its premiere.

“Enter the Dragon” and the rest of Lee’s career have made an indelible mark on popular culture. His other films, which include “The Big Boss” and “The Chinese Connection,” have become part of the kung fu canon. They inspired the next generations of martial arts movie stars, like Jackie Chan and Jet Li, and helped open up Hollywood to Asian actors (although the extent to which that has happened is questionable).

Films, documentaries and books have been made about Lee’s life, and cultural references to him abound. He has inspired video game characters, even entire games. Yellow outfits, like the jumpsuit he wore in “Game of Death,” a film that was released posthumously, were also worn by the lead character in Berry Gordy’s “The Last Dragon” and Uma Thurman in the climactic scenes of the first part of Quentin Tarantino’s martial arts epic, “Kill Bill Vol. 1.”

Lee earned a star on the Hollywood Walk of Fame and was named one of Time magazine’s 100 people of the century. “Enter the Dragon” was added to the Library of Congress’s National Film Registry and labeled an American classic. A statue of Lee, poised to strike, on the Hong Kong waterfront still attracts throngs of fans.

“On an adventure level, the performances are quite good. The one by Mr. Lee, not only the picture’s supermaster killer but a fine actor as well, is downright fascinating. Mr. Lee, who also staged the combats, died very recently. Here he could not be more alive.”

The news media massed to chronicle his first birthday; his second; and, in untold, unforeseen numbers, his third, at which, clad in a tiny blue coat, he saluted his father’s passing coffin in one of the most enduring images in history.

Years later that photograph — taken on Nov. 25, 1963 — would stand as a dark augury of the son’s own life. For if John Fitzgerald Kennedy Jr. was the charmed star of a late-20th-century American fairy tale, he also turned out to be the protagonist of a story that ended, as it had for so many members of his family, swiftly, publicly and well before its time.

John Jr. — known to legions of Americans by the tender twinned epithet John-John — died at 38, even younger than his father had, on July 16, 1999, when the small plane he was flying plunged into the sea off Martha’s Vineyard. His wife of barely a thousand days, Carolyn Bessette Kennedy, and her sister Lauren Bessette also died in the crash.

Photo: John-John with his father, President John F. Kennedy. Credit: Associated Press

Like his father and uncles before him, the young John Kennedy (he eschewed the “F.” and the “Jr.”) could not have embodied the collective fantasy of the hero more thoroughly had he been assembled by consensus: He possessed wealth, charm, athleticism, prowess and dark good looks in no small measure — as close to a prince du sang as the American democracy would bear.

His adult exploits were chronicled no less voraciously than his childhood ones had been: his graduations from college and law school; his admission, after well-documented struggle, to the bar; his founding, in 1995, of George, a glossy magazine of politics and popular culture.

The public hung avidly on the sparkling bits: the parties; the celebrity girlfriends, among them Madonna, Sarah Jessica Parker and Daryl Hannah; the 1988 People magazine cover (“The Sexiest Man Alive”); and, in particular, his clandestine wedding to Ms. Bessette, a fashion publicist, in 1996, in a humble wood-frame chapel on a secluded island off the Georgia coast.

But a darker thread ran through it all. By the time they died, Mr. Kennedy and his wife were reported to have been living apart. Ms. Bessette Kennedy — a golden-haired beauty fit for a prince — was said to be hotheaded and volatile. He wanted children; she did not. He embraced the limelight; she abhorred it.

The magazine, too, was in trouble, condemned by some media watchers as little more than bombast and already embarked on an economic decline. It ceased publication in 2001.

The couple’s last voyage appeared to have been a cautious stab at reconciliation, as they journeyed together to a Kennedy family wedding. They took off at dusk, amid hazy, erratic weather and limited visibility, with Mr. Kennedy — a relatively untried pilot who had been told by doctors not to fly because of a recent broken ankle — at the controls.

Hubris? Perhaps — ending, as such stories can, with the hero’s fall from the sky.

“Those whom the gods love die young,” the ancient Greek dramatist Menander wrote. In a 1962 speech he gave by the sea in Newport, R.I., Mr. Kennedy’s father — prophetically for the son — sounded a related theme:

“I really don’t know why it is that all of us are so committed to the sea, except I think it is because in addition to the fact that the sea changes and the light changes, and ships change, it is because we all came from the sea. And it is an interesting biological fact that all of us have, in our veins, the exact same percentage of salt in our blood that exists in the ocean, and, therefore, we have salt in our blood, in our sweat, in our tears. We are tied to the ocean. And when we go back to the sea, whether it is to sail or to watch it, we are going back from whence we came.”

William Henry McCarty Jr. was said to have been born in Manhattan in 1859, before birth certificates were routinely issued. He died in 1881 in New Mexico, which was still only a territory and did not yet furnish official death certificates. And by the time he was dubbed Billy the Kid, just a few months before his death, he had already reached his majority and barely qualified for the moniker anymore.

But the nickname stuck.

The Kid, a son of Irish immigrants who had fled the potato famine and then took Horace Greeley’s advice and went west, entered the pantheon of frontier folklore.

The first mention of the slim, beardless, blue-eyed desperado’s death in The Times was a one-paragraph article on July 19, 1881, under the headline “A Notorious Outlaw Killed”: A fugitive “terror of New Mexico cattlemen,” identified only by his nickname, had been shot dead by Sheriff Pat Garrett of Lincoln County in a cabin at Fort Sumner five days earlier.

Also known as William H. Bonney and Henry Antrim, he had escaped from the county jail on April 28 while awaiting his hanging for murdering Garrett’s predecessor.

According to one version, his mother had moved with her two sons to the Midwest, then to New Mexico to recover from tuberculosis. A Times article on July 31 said The Kid had been abused by his stepfather, Bill Antrim, and left home in Silver City at 15.

He became a hotel waiter, then a helper to a blacksmith, who “undertook to impose upon Billy,” and finally insinuated himself into the violent rivalry over beef contracts between Lincoln County cattlemen.

He killed at least a half-dozen people, but claimed to have murdered 21 during what The Times described as “his worse than worthless life.”

Still, as recently as six years ago, Gov. Bill Richardson of New Mexico considered a posthumous pardon — to redeem a promise of amnesty made by Lew Wallace, the 19th-century territorial governor (and later the author of “Ben-Hur”), if The Kid would testify about a murder he had witnessed. He testified, but Wallace reneged, and Governor Richardson ultimately decided against a pardon.

“Best to leave history alone,” said Susannah Garrett, a granddaughter of the sheriff.

Frida Kahlo, one of Mexico’s most important artists, understood the power of a selfie well before it became a pervasive part of popular culture. Kahlo’s paintings often shifted the viewer’s perspective beyond her self-portraits to offer personal and societal commentary, both subtle and overt.

“I am happy to be alive as long as I can paint,” Kahlo said.

Some of her artistic themes were highlighted in “The Two Fridas,” a 1939 oil painting that shows two seated Kahlos holding hands. Near-mirror images, they reflect love and loss and ideas surrounding beauty. The two are connected by shared veins that flow to their exposed hearts. One heart appears to be broken, with blood splattered on Kahlo’s lap from a cut vein. The other is intact, with blood pumped to a framed photo of Diego Rivera, the celebrated muralist with whom Kahlo had a tumultuous marriage and from whom she had been divorced that year. (The couple remarried the following year.) Together, the two Fridas suggest the physical and emotional toll of the divorce.

Kahlo expressed herself in dress as well, using her raiment as both adornment and armor. She embraced traditional Tehuana clothing, which in her paintings was often interpreted as a symbol of female authority. The choice to wear it in self-portraiture was a nod to her own fortitude. The style’s floor-length skirts also allowed Kahlo to conceal her damaged leg, a result of childhood polio. It was amputated later in life.

If her clothing was an embrace of cultural identity, her signature unibrow and her wispy mustache were in some ways a rebuke to conventional standards of beauty.

A native of Coyoacán, Mexico, Kahlo began painting in 1926 while bedridden after sustaining life-altering injuries, including a broken spinal column, in a bus accident.

At her death on this day 62 years ago, she was well-known as an artist but nevertheless remained overshadowed by Rivera. The headline for her obituary in The New York Times said, “Frida Kahlo, Artist, Diego Rivera’s Wife.” Her work as a painter was mentioned almost as an aside.

By then her paintings had been exhibited and well received in major cities like Mexico City, Paris and New York. A Times review of a 2007 retrospective in Mexico City noted that it was through her lesser-known works that Kahlo “emerges as an artist who gathered multiple influences into her own language.”

After her death, as the feminist movement gathered steam, her work would often be seen as eclipsing Rivera’s thanks to a renewed interest in her unflinching portrayals of a woman’s mental state through the lens of her own life. (It was dramatized in the 2002 film “Frida,” with Salma Hayek in the title role.)

Her work today sells for millions of dollars, and her likeness has appeared on everything from T-shirts to beer bottles. As noted by Graham W. J. Beal, the former director of the Detroit Institute of Arts, in a Times article last year, “Fridamania shows no signs of relenting.”

The query begins one of the movie’s most discomforting scenes. In it, a white-haired gent, moving with unhurried and ominous purpose, unpacks a set of dentistry implements and sets to work on a young man who is bound to a chair.

“Is it safe?” he asks. “Is it safe?”

To anyone who has ever visited a dentist, the episode that follows — the torment the older man visits on his baffled and terrified patient/prisoner, who is played by Dustin Hoffman — is almost unbearable to watch, in large part owing to the preternatural sang-froid of the tormenter. The film, from 1976, was “Marathon Man,” a thriller involving diamonds taken from Jews during World War II, a history student whose brother is a government agent and a fugitive Nazi war criminal — our brutal antagonist, Szell, a former concentration camp dentist played by Laurence Olivier, who was nominated for an Academy Award for his performance and won a Golden Globe.

Knighted in 1947 and raised to a life peerage in 1970, Lord Olivier was, of course, one of the great theatrical performers — some say the greatest of all — of the 20th century, equally adept at comedy and tragedy, especially revered as a Shakespearean of charismatic intensity and daring physicality. But illness and age led him to retire from the stage in 1974; few, if any, people under 50 today saw him perform live. And though his indelible film roles go back decades — to Heathcliff in “Wuthering Heights” (1939), Maxim de Winter in “Rebecca” (1940), Mr. Darcy in “Pride and Prejudice” (1940) and Hamlet (1948) — “Marathon Man” might well be the way younger filmgoers were introduced to Lord Olivier and the hyperbolic realism that characterized his work. His Szell was too cruel, too evil to be believed and yet memorably credible — frightfully, shudder-inducingly persuasive.

When he died at 82, 27 years ago today on July 11, 1989, Mel Gussow’s authoritative obituary in The New York Times gave ample testimony to his achievements both as an actor and director and as the founder and first artistic director of Britain’s National Theater. But perhaps inevitably, such a portrait feels a little musty, as though the man himself were a figure most alive in the distant past, a sepia-colored character to be revered — Lord Olivier, not Larry, as he was known to friends and colleagues — who could not be the technicolor movie villain whose villainy he so clearly relished embodying and enhancing.

He enjoyed playing good guys, too, of course, and did so, even in his dotage, with similar verve. In “The Boys From Brazil” (1978), he played a reverse role, a wise and wizened Nazi hunter modeled after Simon Wiesenthal who ends up face to face — and in mano a mano combat — with the Nazi doctor Josef Mengele, played by another actor beyond his matinee idol days, Gregory Peck. The scene in which they grapple bloodily over a gun as they await the arrival of a clone of Adolf Hitler (with a pack of trained dogs threatening to attack from behind a closed door) is one of Hollywood’s most outlandish climaxes, horrifically, blackly comic and, typical of any Olivier performance, not safe at all.

Sir Arthur Conan Doyle’s most enduring creation was Sherlock Holmes, the logical detective who appeared in dozens of stories and four novels by Conan Doyle and who has more recently been portrayed in movies by Benedict Cumberbatch and Robert Downey Jr. Many would suspect that Conan Doyle, a trained physician who was often beseeched by the public to apply his skills to real-life cases, might have been as inflexibly rational as Holmes.

But by the end of his life, on July 7, 1930, Conan Doyle was a fervent believer in spiritualism, having spent decades researching ghosts, fairies and the paranormal. His fascination with the supernatural grew after his son Kingsley and his younger brother, Innes, battle-weary from service in World War I, died amid the worldwide influenza pandemic shortly after returning home.

Conan Doyle attended seances and wrote and lectured on spiritualism. He befriended Harry Houdini, the escape artist and magician, maintaining that Houdini had psychic powers even though Houdini himself denied it.

Their friendship ended soon after Houdini attended a seance at which Conan Doyle’s second wife, the former Jean Leckie, claimed to be channeling the spirit of Houdini’s beloved mother. Leckie produced several pages of automatic writing, in fluent English and signed with a cross. Houdini was highly skeptical: His mother, a Jew, had been a rabbi’s wife and, as an immigrant from Hungary, had a limited grasp of English.

Was Louis Armstrong the world’s most beloved entertainer, or was he the single most important musician in the history of jazz?

The answer is yes.

To millions, Armstrong, who died 45 years ago today, was the human ray of sunshine with the mile-wide smile, the gravelly voice and the peerless way with a song — the man whose joyous rendition of “Hello, Dolly!,” recorded when he was in his 60s, momentarily ended the Beatles’ three-month reign at the top of the singles chart.

To jazz aficionados, he was also something more: the trumpet virtuoso with the boundless musical imagination who almost singlehandedly shifted the focus of jazz from collective improvisation to individual expression — the man whose playing on the remarkable Hot Five and Hot Seven sessions, recorded when he was in his 20s, virtually defined the art of the jazz solo.

Miles Davis said it was impossible to play anything on a horn that Armstrong hadn’t already played.

Dizzy Gillespie put it this way: “His style was equally copied by saxophonists, trumpet players, pianists and all of the instrumentalists who make up the jazz picture.”

The New York Times jazz critic John S. Wilson wrote in 1971 that Armstrong was “the root source that moved jazz onto the path along which it has developed for more than 45 years.”

There was nothing in Armstrong’s early years to suggest that he was destined for greatness. Born into poverty in New Orleans, he sang on street corners as a child and studied music while confined to the Colored Waifs’ Home for Boys.

He learned fast. Before he was out of his teens, he was a fixture on the New Orleans music scene; a few years later he moved to Chicago, where he made the records that changed jazz history. In due time he became the first jazz superstar, embraced by the world for his bravura playing, his ebullient singing and his larger-than-life personality.

Louis Armstrong died at his home in Queens on July 6, 1971. His front-page Times obituary noted that “he had observed his 71st birthday on Sunday,” just two days earlier. That this quintessential American success story was born on July 4, 1900, always seemed too perfect to be true. And, as it turned out, it wasn’t true: According to later research, he had actually been born on Aug. 4, 1901.

Call it poetic license. The date he (and everyone else) celebrated was, as the old saying goes, close enough for jazz.

Being born on Feb. 29 can prove to be confusing. Celebrating your birthday every Dec. 25 may leave you feeling shortchanged in the presents department. But if you’re an American born on the Fourth of July, you typically get the day off and are treated to a fireworks display, too. We culled our obituary files for people born that day to explore what, if anything, they had in common.

For all the celebrities who were born on the Fourth of July, the holiday may be more famous for two adversaries who died on that date. In 1826, former President John Adams succumbed at 90 after supposedly uttering the last words, “Jefferson still survives.” In fact, Jefferson, 82, had died five hours earlier.

At 17, like many other Americans, Medgar Evers enlisted in the Army during World War II. A star athlete in high school, he participated in the Allied invasion of Europe, rising to the rank of sergeant before his honorable discharge in 1946.

But for Evers, who was born on this day in 1925 to an African-American farming family in Decatur, Miss., even the segregated Army was more welcoming than the Jim Crow South to which he returned after the war.

The racial injustice there rankled so much that he resolved to fight it, becoming the first field officer for the National Association for the Advancement of Colored People in Mississippi. He recruited new members, championed school integration, encouraged blacks to vote and staged daring protests against racial inequality in the South. He also called for a new investigation of the murder of Emmett Till, a 14-year-old African-American who was lynched in Mississippi in 1955, supposedly for flirting with a white woman.

Not surprisingly, intimidation and attempts on Evers’s life followed. People called his home threatening to shoot his family, and his house was firebombed. He did not back down.

“If I die, it will be in a good cause,” he told The Times in 1963. “I’ve been fighting for America just as much as the soldiers in Vietnam.”

The battlefields of Europe did not stop Evers; those of Mississippi did. Early in the morning of June 12, 1963, a bullet from a rifle ripped through his back, the gunfire awakening his neighborhood and reverberating through the civil rights movement for decades. He was shot returning home from an N.A.A.C.P. meeting, only hours after President John F. Kennedy delivered a televised address calling for equal rights for all American citizens, regardless of race.

Evers managed to drag himself to his doorstep, where his wife, Myrlie, an activist who later became chairman of the N.A.A.C.P.’s board, and their children found him. At the emergency room he was initially refused admittance because he was black, until his family explained who he was. He was 37 when he died less than an hour later.

Claude Sitton, the Times reporter who covered much of the civil rights movement in the South, reported that dozens were arrested in protests that erupted after Evers’s death.

As Evers once said, “You can kill a man, but you can’t kill an idea.”

His murderer was Byron De La Beckwith, an avowed white supremacist. He was tried twice in 1964, but both trials, before all-white, all-male juries, ended in deadlock, and he spent most of his days as a free man.

In 1989 documents surfaced that indicated that jurors had been illegally screened, and Beckwith was brought to trial and convicted in 1994. He died in prison in 2001.

The outrage at Beckwith’s freedom led to more demonstrations nationwide and inspired numerous works of art, film and music, including Nina Simone’s protest song “Mississippi Goddam.”

Two months later, in August 1963, the protests culminated with the March on Washington for Jobs and Freedom, a pivotal, galvanizing moment for the civil rights movement.

As a war veteran Evers was buried in Arlington National Cemetery, with full military honors, achieving in death what he had been denied in life — equality with his brothers-in-arms and his fellow citizens.

President Theodore Roosevelt signed into law two historic bills aimed at regulating the food and drug industries on June 30, 1906. With decisive strokes of his pen on that oppressively hot day, Roosevelt also provided Upton Sinclair with the greatest validation for which any muckraker could hope. It was Sinclair’s novel “The Jungle” that helped spur the public outrage that led to the legislation.

“The Jungle,” a harrowing account of a Lithuanian immigrant’s experience laboring in Chicago’s meatpacking industry, was serialized in the Socialist magazine Appeal to Reason in 1905 before the installments were collected and published as a book in 1906. It came on the heels of exposés by the press and after months of reporting in Chicago’s Packingtown, as the neighborhood around the stockyards was known, by Sinclair himself.

“The Jungle’s” grotesque descriptions of conditions endured by workers and livestock, and the contaminated food that came of them, made it a runaway hit and catalyzed the public’s fear and fury.

There were cattle which had been fed on “whiskey-malt,” the refuse of the breweries, and had become what the men called “steerly” — which means covered with boils. It was a nasty job killing these, for when you plunged your knife into them they would burst and splash foul-smelling stuff into your face; and when a man’s sleeves were smeared with blood, and his hands steeped in it, how was he ever to wipe his face, or to clear his eyes so that he could see?

The book eventually sold millions of copies, was translated into dozens of languages and cemented Sinclair’s reputation as a crusader for social justice. It remains an inspiration to journalists investigating the food industry and food health scares, workplace conditions and the environmental impact of industry.

Sinclair later said that his readers had missed the point by focusing on the health risks created by unsanitary stockyards and meatpacking facilities rather than on the dehumanization of workers and the brutal treatment of animals.

“I aimed at the public’s heart,” he said, “and by accident I hit it in the stomach.”

Still, Sinclair was quick to harness the reaction. About a month after “The Jungle” was published, the White House started receiving “100 letters a day demanding a Federal cleanup of the meat industry,” Alden Whitman wrote in Sinclair’s obituary. (He died on Nov. 25, 1968.)

Roosevelt invited Sinclair to the White House, then ordered a federal investigation. Sinclair took every opportunity to harangue the Beef Trust, as the meatpacking industry was known, and sent a stream of telegrams to the White House demanding reform. Roosevelt soon tired of Sinclair’s outspokenness. In a note to the author’s publisher, the president wrote, “Tell Sinclair to go home and let me run the country for a while.”

Sinclair did no such thing. In some 90 books and innumerable articles over the next six decades, he pushed for progressive causes, like “strong trade unions, abolition of child labor, birth control, Prohibition, utopian Socialism, an honest press, morality in business and industry, vegetarianism, mental telepathy and spiritualism, educational reform and civil liberties,” his obituary said. He won the 1943 Pulitzer Prize for his novel “Dragon’s Teeth,” which dramatized the Nazi takeover of Germany during the 1930s.

Fame from “The Jungle” lasted until the end of Sinclair’s life. He was invited to the White House again in 1967, the year before his death, to witness the signing of a new food safety law by President Lyndon B. Johnson.

On June 28, 1914, an 18-year-old student named Gavrilo Princip fired a pistol in Sarajevo, Bosnia, and changed the world.

Princip, a Serbian nationalist enraged by the annexation of Bosnia and Herzegovina by the Austro-Hungarian empire, had assassinated Archduke Franz Ferdinand, presumptive heir to that empire’s throne, and his wife, the duchess of Hohenberg, as they rode in a motorcade. Ferdinand was aware of the danger — earlier that day he had deflected a bomb hurled at him by another would-be assassin, The Times reported. (Many contemporary accounts say the bomb actually bounced off the car.) He was traveling to visit people injured in that blast when he was killed.

Such courage, or perhaps obstinacy, was typical for Ferdinand. He gave up his future children’s claim to the Hapsburg throne to marry the countess, who was of lower station and of whom his uncle, the emperor, Franz Joseph, disapproved.

After the assassination Austria-Hungary declared war on Serbia. Soon Europe, and much of the world, spiraled into war as one country after another, enmeshed in a web of previously established alliances, took sides — either with the Central Powers (Germany, Austria-Hungary and their allies) or the Allies (France, Britain, Russia and others, including, eventually, the United States).

What became known as the Great War, or later World War I, would prove to be more devastating than any that had come before. The Times described the war’s impact in an article one year after Ferdinand was murdered.

Those two shots brought the world to arms, and the war that followed has brought devastation upon three continents and profoundly affected two others, and the tocsin has sounded in the remotest islands of the sea. Towns have been bombarded in the Society Islands and battles have been fought in all the oceans, from the extremity of South America to the Malay Peninsula, from the heart of Africa to the coast of China. Nation after nation has been drawn into the whirlpool, and more are drawing toward it, and the end is far off. What face the world will wear when it is all over no man can predict, but it will be greatly changed, and not geographically alone.

During the four years that followed, millions of young men died as they scrambled between trenches or were killed by disease and chemical weapons like mustard gas. More than 30 million servicemen were killed or wounded. By the time an armistice was declared in 1918, a generation had lost its innocence, and writers like Hemingway and Fitzgerald were inspired by the malaise of their contemporaries.

The war formally ended when the Germans signed the Treaty of Versailles, agreeing reluctantly to terms dictated by the Allied forces. The date was June 28, 1919, exactly five years after Ferdinand was killed. In 20 years the world would be at war again, the wounds of World War I never having fully healed.

By the time the sun came up, however, Mr. Pine, a deputy police inspector, and Ms. DeLarverie, a cross-dressing lesbian singer, were standing together at an intersection of history — even if they were on opposite sides of what appeared at first to be an old-fashioned donnybrook outside a mobbed-up bar.

“Nobody knows who threw the first punch, but it’s rumored that she did, and she said she did,” said Lisa Cannistraci, one of Ms. DeLarverie’s guardians before her death on May 24, 2014, at 93.

Her fierce reaction at Stonewall was in keeping with a lifetime spent shielding lesbians — her “baby girls” — from intolerance, bullying or abuse. No one dared cross her, Ms. DeLarverie said. “They’ll just walk away, and that’s a good thing to do because I’ll either pick up the phone or I’ll nail you.”

For the police, a raid on a joint like the Stonewall had been, until June 1969, a no-brainer. Gay bars were often controlled by organized crime. Corralling homosexuals was a good way for officers to boost their arrest records. “They were easy arrests,” Mr. Pine said when discussing the Stonewall uprising at the New-York Historical Society on the occasion of the uprising’s 25th anniversary. “They never gave you any trouble.”

Until they did. That’s when a Manhattan bar became synonymous with gay liberation.

Mr. Pine apologized for the raid in 2004, six years before his death on Sept. 2, 2010, at 91.

If you could have dinner with one person who is no longer with us, and whose obituary was published in The New York Times, who would it be, and why that person? Not Forgotten is asking that of influential people this summer in a series of posts called Breaking Bread.

Today we have Tom Brokaw, the longtime anchor of NBC’s “Nightly News.” “Winston Churchill has always been my fantasy dining companion,” he wrote to us.

That’s not a great surprise: Mr. Brokaw has written frequently about World War II, most notably in his 1998 best seller, “The Greatest Generation.” Churchill, then, the intrepid prime minister who rallied a beleaguered Britain to victory in the war, is a natural. (A raconteur who loved good food, a fine cigar and a stiff drink, he would also be a convivial table guest.)

“Lunch at the Ritz would be an appropriate setting,” Mr. Brokaw wrote. And in his imagination he put himself there, with some specific questions in mind:

Sir Winston, I am limited to three questions, which is the interview equivalent of a teaspoon of domestic champagne.

Nonetheless, I am beyond grateful to be in your presence.

World War II: John F. Kennedy famously said that you mobilized the English language and sent it into battle. Were there any moments after one of your famous speeches when you privately thought Great Britain was in greater peril than you let on? When?

At the Big Three meeting in Tehran in 1943, Franklin Roosevelt and Josef Stalin excluded you from a meeting of just the two of them. Was that a humbling sign that the best days of the British Empire were in the past?

You had a lifetime of cigars, brandy, wine and very little exercise. You were a prisoner of war and escaped. Your political career seemed to be over in the 1930s, but your glory days were yet to come. You lived to 90. Was it your indomitable will, or was it a higher being looking out for you?

Sir, your country has been an empire, a leading member of a Western alliance and now has voted to go it alone. Is this wise?

Scientists racing to develop a vaccine against Zika virus disease this summer may be hoping for results like those of Dr. Jonas Salk, creator of the first successful vaccine against poliomyelitis. Dr. Salk died on this day in 1995 at the age of 80, decades after the polio vaccine he developed helped vanquish the deadly, paralyzing disease throughout much of the world.

News that the polio vaccine worked in a field trial involving 440,000 American children, announced at a University of Michigan news conference on April 12, 1955, “caused a public sensation probably unequaled by any health development in modern times,” Harold M. Schmeck Jr. wrote in his New York Times obituary of Dr. Salk.

The discovery made Dr. Salk a hero. “An opinion poll ranked him roughly between Churchill and Gandhi as a revered figure of modern history,” Mr. Schmeck wrote. “It was a turning point in the fight against a disease that condemned some victims to live the rest of their lives in tanklike breathing machines called iron lungs and placed sunny swimming holes off limits to children because of parents’ fears of contagion.”

In recent years, however, fears of rare, vaccine-preventable diseases have subsided. In the United States alone each year, tens of thousands of “philosophical exemptions” to required childhood vaccinations are granted.

Dr. Salk’s great competitor was Dr. Albert B. Sabin, who developed a live polio virus vaccine that ultimately replaced the use of Dr. Salk’s killed virus version in many countries. The live vaccine, given orally, is easier and cheaper to administer, and is particularly useful during epidemics because a vaccinated person temporarily sheds the vaccine virus and can passively immunize others.

“But Dr. Salk never lost faith in killed virus polio vaccine and continued to champion its cause all his life,” Mr. Schmeck wrote. “On several occasions he pointed out that the live virus vaccine did, on rare occasions, produce the disease as well as immunity, while the killed virus vaccine, properly made, carried no such risk.”

It was precisely because of this risk that, five years after Dr. Salk’s death, the United States discontinued use of the live virus vaccine. Children in America now exclusively receive the inactivated poliovirus vaccine, known as IPV, that resulted from Dr. Salk’s research.

Worldwide eradication of the disease has remained an elusive goal. This year and last, polio cases unrelated to the vaccine have occurred in Pakistan and Afghanistan. Earlier in the decade, children in Somalia, Nigeria, Syria and more than a dozen other countries were infected by wild polio virus. Vaccination campaigns have sometimes been thwarted by war and distrust of medical teams.

Dr. Salk established the Salk Institute for Biological Studies in San Diego, where he continued to research infectious diseases, including AIDS, into the 1990s.

When Judy Garland sang one of the best-known songs in movie history in “The Wizard of Oz” in 1939, “Over the Rainbow,” one line summed up her brilliant career as an actress and singer:

“Somewhere over the rainbow skies are blue, and the dreams that you dare to dream really do come true.”

Garland’s dream of fame did come true, but she never found peace of mind.

Even after she ascended to worldwide stardom, she constantly sought the love, adulation and acceptance that she felt had eluded her since childhood.

The seeds of her discontent were sown when she was very young. Garland, who died at 47 on June 22, 1969, was born Frances Ethel Gumm on June 10, 1922, in Grand Rapids, Minn. She was the daughter of vaudeville professionals, who encouraged their three daughters to go into show business, and she grew up with the pressure of her parents’ expectations. She had a strained relationship with her mother, a fierce stage parent, and was devastated when her beloved father died of meningitis in 1935.

The pressure continued in Hollywood: Studio heads told her she wasn’t pretty enough, deepening insecurities that dogged her throughout her career.

Her most celebrated role came in 1939, when at 16 she portrayed an orphaned farm girl from Kansas named Dorothy Gale in “The Wizard of Oz.” Young Dorothy longed to leave her tedious life on the farm and travel somewhere over the rainbow, where skies were always blue, as she sang in the video clip below.

“I’ve always taken ‘The Wizard of Oz’ very seriously, you know,” she once said. “I believe in the idea of the rainbow. And I’ve spent my entire life trying to get over it.”

After “The Wizard of Oz” Garland appeared in films like “Meet Me in St. Louis” (1944), “Easter Parade” (1948), and “A Star Is Born” (1954), for which she was nominated for an Academy Award. She had already received one Oscar, a special juvenile award for her turn in “The Wizard of Oz.”

Garland said she was on a lifelong quest for love. She was married five times and was quoted as saying she longed for the sincere love of one man, rather than the applause of thousands of fans.

“If I am a legend, then why am I so lonely?” she once asked.

Garland turned to drugs and alcohol to fill the void, and she died from an apparently accidental barbiturate overdose.

Mickey Rooney, her childhood co-star in films like “Babes in Arms” and “Babes on Broadway,” echoed what many close to her had hoped.

“She was — I’m sure — at peace, and has found that rainbow. At least I hope she has.”

Her rosy complexion as a toddler gave her the nickname Pinky. That’s what she was called in convent schools and later in the halls of Oxford and Harvard, where as a student she was a campus tour guide, listened to Carly Simon and looked like Joan Baez.

After she graduated from Harvard, the lyrics from Peter, Paul and Mary’s version of the 1960s song — “I’m leavin’ on a jet plane/Don’t know when I’ll be back again” — were stuck in her head as she boarded a plane for home. She returned to the United States 16 years later, in 1989, not as Pinky but as Benazir Bhutto, the new prime minister of Pakistan — the first woman elected to lead an Islamic country.

Her time in office would be as tumultuous as her childhood had been idyllic, ending in her assassination by the Pakistani Taliban on Dec. 27, 2007, just days before general elections, which her populist party was expected to win.

“I didn’t choose this life,” Bhutto said. “It chose me.”

Ms. Bhutto was born on this day in 1953 to a wealthy family whose lands were once so extensive it took days to appraise them. In a country where families dominated business and politics in an almost feudal manner, the Bhuttos seemed destined to rule. As Ms. Bhutto grew up, her father, Zulfikar Ali Bhutto, rose in power, from a post in Pakistan’s United Nations delegation to prime minister. He imparted lessons to her along the way.

But her political education went into overdrive when a top army general, Muhammad Zia ul-Haq, overthrew her father and imprisoned him. She was 24. Ms. Bhutto visited him often, absorbing one-on-one political seminars in the grimmest of settings. Her father encouraged her to study other female leaders, including Indira Gandhi and Joan of Arc.

Mr. Bhutto was hanged in 1979, charged with orchestrating the murder of a political rival. Ms. Bhutto was forbidden to attend his funeral.

She and her mother were soon given leadership of her father’s People’s Party. But as the opposition to a military regime, Ms. Bhutto spent half her time in prison or under house arrest, sometimes in solitary confinement.

When the ruling general’s plane mysteriously fell from the sky in 1988, much of the nation rejoiced, and elections were set. Ms. Bhutto seized her moment, campaigned as the “daughter of Pakistan” and, at 35, reclaimed the office of prime minister for her family.

She was elected twice, serving from December 1988 to August 1990 and again from October 1993 to November 1996.

“Charismatic, striking and a canny political operator,” The Times said in an appraisal after her death. “She ruled the party with an iron hand, jealously guarding her position, even while leading the party in absentia for nearly a decade.”

Ms. Bhutto could be imperial in bearing, charming and also ruthless. At one point she ousted her mother from the party’s leadership, provoking the elder Ms. Bhutto to remark, “She talks a lot about democracy, but she’s become a little dictator.”

After accusing her government of corruption, her younger brother Murtaza, a member of the provincial legislature, was gunned down outside his home in a police ambush. Her husband, Asif Ali Zardari, whom she had named minister of investment, was indicted in the murder but exonerated. Witnesses were either arrested, intimidated or killed.

Each of her terms as prime minister ended when she was dismissed by the president on graft charges. When she and her husband left office in 1996, they were worth hundreds of millions of dollars, though the source of their wealth was unclear. Pakistan was named one of the world’s three most corrupt countries.

“In her mind, she was Pakistan, so she could do as she pleased,” her former adviser, Husain Haqqani, said.

Ms. Bhutto spent most of the last nine years of her life in self-imposed exile, much of it in a palatial estate in Dubai. After receiving amnesty on the pending charges, she returned in late 2007 to seek a third term.

A close ally of the Afghan Taliban — which her government supported in its infancy in 1996 — killed her at a rally outside the capital. It happened in a park where Pakistan’s first prime minister was also assassinated, in 1951.

Pakistan still waits today for a real democracy to emerge, and an elected leader from outside the few feudal families that have ruled the country, alternating with the military, since its birth.

Benjamin “Bugsy” Siegel is mainly remembered as a bloodthirsty but dapper gangster.

In New York City, Siegel (1906-1947) was a core member of the infamous hit squad Murder Incorporated and implicated in many high-profile killings. His hair-trigger temper and penchant for brutality earned him the nickname “Bugsy,” in his day a slang term for crazy.

But Siegel, who died in a hail of bullets 69 years ago today, was also something of a visionary. He eventually moved west and pioneered the development of Las Vegas as a casino capital, investing in it when it was little more than a sleepy desert town with a pliant City Council and lax gambling regulations.

In New York, Siegel, a product of the tough streets of Williamsburg in Brooklyn, was, like his associate Meyer Lansky, a kingpin in what was known as the Jewish mob. He was one of the “Big Six,” a dominant group of bootleggers in the Northeast that included Lansky and Charles “Lucky” Luciano.

Seeking to expand his empire, he left New York City in the 1930s to set up bootlegging and gambling operations on the West Coast.

He moved to Beverly Hills and threw all-night parties with stars like Cary Grant, Clark Gable and Gary Cooper. But Siegel wanted more. So he sought a fresh start in Las Vegas, pouring $600,000 of his own funds into a new hotel and casino, which began Las Vegas’s transformation into a city of high-stakes mob glamour.

His girlfriend at the time was the actress Virginia Hill, whose long legs had earned her the nickname “The Flamingo.” Siegel named his new venture, the Flamingo Hotel and Casino, after her. (Their relationship was dramatized in the 1991 movie “Bugsy,” starring Warren Beatty and Annette Bening, who married the next year.)

When the casino struggled at first, Siegel used millions of dollars from mob investors to prop it up. Without him, the Flamingo would have folded. And when it began to catch on, other investors, some with equally shady reputations, headed for “Vegas” to start their own gambling palaces.

But Siegel’s attempt at legitimacy came to naught. On June 20, 1947, he was shot through the living room window of Ms. Hill’s house in Beverly Hills, Calif., the victim of a hit like those he had reputedly organized.

The casino he built in her name endured until 1993, when the last of the original buildings were razed and replaced by Hilton.

If you could have dinner with one person who is no longer with us, and whose obituary was published in The New York Times, who would it be, and why that person? Not Forgotten is asking that of influential people this summer in a series of posts called Breaking Bread.

Today we have Anderson Cooper, CNN anchor who wrote “The Rainbow Comes and Goes: A Mother and Son on Life, Love, and Loss” (2016), with his mother, the heiress and fashion magnate Gloria Vanderbilt. He wrote about his father, Wyatt Cooper, a screenwriter and actor from Mississippi.

The paper was lying on the kitchen counter, and I was startled to see his face staring up at me as I passed by. It was two days after his death.

The article was short. It gave a few details of his life, but it didn’t tell me what I really wanted to know: How could he have died? What would happen to my family and me now?

As a teenager I used to imagine that he had written me a letter, and every birthday I secretly hoped it would arrive. The letter would be full of fatherly advice, and tell me all the things I didn’t know about him. Even now, when I see a stack of mail at home, I can’t help but hope the letter has finally come.

After a while, no matter how much you love someone, no matter how hard you try to remember, you start to forget little details — the sound of their voice, the way they smell, the look in their eyes when they smile and laugh. If I could see my father just once more, sit down and talk with him, look into his crystal blue eyes, feel the safety of his arms around me, I would give anything for that.

I’d ask him what he thinks of me. Is he proud of me? Does he approve of the man that I’ve become? I’d tell him about the choices I’ve had to make, the fears and difficulties I’ve struggled to overcome. What would he have done if he were me?

My dad was 50 when he died, and I’ve always believed that I would die at that age as well. I just turned 49, and my doctor assures me I have many years yet to live.

So if I could see my father one last time, I’d be sure to ask him the most important question of all: What should I do next? What path forward should I take? How should I live out these years I never expected to have, these years he never lived to see?

For his confirmation gift, his parents gave him a telescope. His imagination was piqued as a student in Berlin when he read about a phantasmagorical journey to the moon.

“I knew how Columbus had felt,” he recalled.

When he died on June 16, 1977, Wernher von Braun, the son of East Prussian aristocrats, had left an indelible, if ambiguous, legacy as a visionary space-travel pioneer.

His boyhood obsession with rocketry elevated him to the position of Nazi Germany’s leading missile scientist and the brains behind the V-2 — Vergeltungswaffe Zwei (Revenge Weapon Two) — perfected in the village of Peenemünde, on the Baltic, where his grandfather had hunted ducks, and then aimed at Britain.

With Soviet forces advancing at the end of World War II, von Braun and more than a hundred of his fellow scientists surrendered to the United States Army. They were scooped up in Operation Paperclip and transplanted in Alabama, where they formed the vanguard of an American space program that built the Saturn V rocket, which sent nine crews toward the moon.

In addition to Columbus, von Braun liked to invoke the Wright Brothers and Charles Lindbergh. But he was also often mentioned in the same breath as Faust, for his wartime Devil’s bargain. He would say later that his chief goal was always space travel — eventually a permanent moon base and a mission to Mars — and that his V-2 rockets had worked perfectly, except that they landed on the wrong planet.

“We wouldn’t have treated your atomic scientists as war criminals,” he once said, “and I didn’t expect to be treated as one.”

But history is written by the victors, and for all the warranted gratification his later scientific accomplishments gave him, some critics never forgave his contributions to America’s wartime enemies. As the satirist Tom Lehrer sang:

“Once the rockets are up, who cares where they come down? That’s not my department,” says Wernher von Braun.

Though a listener would not have realized it hearing her crooning, belting or scatting, Ella Fitzgerald, the “first lady of song,” was a quiet person in private.

Unlike many of her jazz world contemporaries — the list is practically endless — she was abstemious. When she was not onstage or on tour, where she spent most of her life, she preferred tranquil days at her Beverly Hills home and a placid social life with friends like Carmen McRae, Sarah Vaughan and Peggy Lee.

“It’s not easy for me to get up in front of a crowd of people,” Fitzgerald once said. “It used to bother me a lot, but now I’ve got it figured out that God gave me this talent to use, so I just stand there and sing.”

Many who followed Fitzgerald’s career — from her singing debut at an amateur contest at the Apollo Theater in Harlem in 1934 until her death on June 15, 1996 — might have doubted that she had ever known anything like stage fright. She was just too powerful a presence, as in this clip of her singular version of “Mack the Knife.”

Yet her quiet, abstemious side probably contributed to her longevity; her career lasted six decades.

Fitzgerald had a protean voice. She sang show tunes, swing, bebop, novelties, bossa nova and opera. She recorded with Ellington, Basie and Armstrong, and made albums of songs by Porter, Rodgers and Hart, and George and Ira Gershwin.

The world recognized her talent when she first sang with the drummer and bandleader Chick Webb, with whom she recorded her first hit, “A-Tisket, A-Tasket.”

“As for Ella Fitzgerald, the gauche young woman who first sold the country on ‘A-Tisket, A-Tasket,’ it is a little late to remind you that her simply rendered Negro lyrics are already a part of swingdom’s folklore,” Theodore Strauss wrote in a nightclub review in The Times in 1939, a year after the song was released.

Her commendations included honorary doctorates at Yale and Dartmouth, the National Medal of Arts and 13 Grammy Awards.

The decades did little to diminish Fitzgerald’s voice. When she gave her final concert, at Carnegie Hall in 1991, Jon Pareles wrote in The Times that even though she had to be helped on and off the stage, “she can still be an exuberant human trumpet.”

An inscrutable point in space, which contains all other points simultaneously, inspires a poet, and revenge. Despairing librarians wander a labyrinthine library stocked with innumerable, unintelligible books. A mild-mannered reader dreams of gauchos, knife fights and death.

These and all other manner of the mystical, enigmatic and paradoxical imbued the writing of Jorge Luis Borges, an Argentine author whose concise, intricate work overflowed with wonder. He penned densely philosophical short stories and poems, as well as literary hoaxes that intentionally blurred the line between reality and fiction.

“His fables are written from a height of intelligence less rare in philosophy and physics than in fiction,” John Updike said of Mr. Borges’s writing. “Furthermore, he is, at least for anyone whose taste runs to puzzles or pure speculation, delightfully entertaining.”

Mr. Borges was widely considered a candidate for the Nobel Prize for literature, but he never received it. “I still don’t understand why they haven’t given it to him,” Gabriel García Márquez said when he won the prize in 1982.

Some speculated that the Nobel committee overlooked Mr. Borges because of his reluctance to engage with the political violence that engulfed Argentina in the 20th century. But Mr. Borges, an otherworldly figure himself, preferred the printed page to our unruly and unwelcoming reality. That reality grew more distant when he went blind in the 1950s and was forced to rely on others to transcribe his words and read to him. He departed it for good when he died of liver cancer on June 14, 1986.

Toward the end of his life, however, Mr. Borges said he recognized himself in his most fantastical writing.

“Through the years, a man peoples a space with images of provinces, kingdoms, mountains, bays, ships, islands, fishes, rooms, tools, stars, horses, and people,” Mr. Borges said. “Shortly before his death, he discovers that the patient labyrinth of lines traces the image of his own face.”

“North Korea cannot be allowed to develop a nuclear bomb,” Mr. Clinton replied. “We have to be very firm about it.”

In 2004 he asked President George W. Bush, “In light of not finding the weapons of mass destruction, do you believe the war in Iraq is a war of choice or a war of necessity?”

“I think that’s an interesting question,” Mr. Bush replied. “Please elaborate on that a little bit. A war of choice or a war of necessity? It’s a war of necessity.”

At a time when partisan divides seem insurmountable, Democrats and Republicans could agree on Tim Russert, the longest-serving host of NBC’s “Meet the Press.” By turns wary and fond of him, they were always on their toes under his tough questioning on Sunday mornings.

Senator John McCain of Arizona probably expressed the sentiment best when, after Mr. Russert thanked him for appearing on the show in 2006, he replied, “I haven’t had so much fun since my last interrogation.”

For almost 17 years as moderator Mr. Russert built on the show’s long tradition of being required viewing for politicians, pundits, journalists and anyone else with a passion for public affairs. The show regularly reached an audience of almost four million people.

And he was working until the end. Mr. Russert died of a heart attack after collapsing at NBC News’s Washington bureau on June 13, 2008. He had been recording voiceovers for the show, amid Senator Barack Obama’s historic presidential campaign. Below is a tribute episode that aired after his death.

From his table on the Washington set of “Meet the Press,” Mr. Russert covered elections through the 1990s and early 2000s. In one memorable instance he brought comprehensible analysis to the confusing ballot tumult in Florida in the 2000 presidential election that ended with a Supreme Court decision and victory for Mr. Bush.

Mr. Russert was an unlikely candidate for broadcast stardom. The son of a garbage collector from Buffalo, N.Y., he came to journalism from government, having worked for Senator Daniel Patrick Moynihan and Gov. Mario M. Cuomo of New York. He was no one’s idea of a polished, lantern-jawed news anchor out of central casting. He was meaty and sometimes cross-looking with his dramatically knitted eyebrows; he could be prosecutorial one moment and jovial the next.

He joined NBC in 1984 as an executive. Impressing management with his political acumen, he was promoted to Washington bureau chief in the late 1980s and to moderator on “Meet the Press” in 1991, despite having scant on-camera experience.

Eight years later, The Times wrote that he had remade the show, “changing it from a sleepy encounter between reporters and Washington newsmakers into an issue-dense program, with Mr. Russert taking on the week’s newsmaker.”

The show still draws a comparable number of viewers with Chuck Todd occupying Mr. Russert’s former seat.

If you could have dinner with one person who is no longer with us, and whose obituary was published in The New York Times, who would it be, and why that person? Not Forgotten is asking that question of a variety of influential people this summer in a series of posts called Breaking Bread.

Today we have David H. Petraeus, a former C.I.A. director and the highest-profile general from the wars in Iraq and Afghanistan.

I would like to host General Grant for dinner at the Lotos Club, one of the oldest literary clubs in the United States (founded in 1870, early in President Grant’s administration). Besides celebrating writers and those in the arts, the club, in Midtown Manhattan, has also recognized military and government leaders (including the former Defense Secretary Robert M. Gates and me) at its annual state dinners.

Hosting Grant — a great writer as well as a great leader — at the Lotos Club would thus be very fitting. He would feel welcome there.

Coincidentally, the lovely old townhouse that houses the club, on East 66th Street just off Fifth Avenue, is next door to the address at which Grant lived the final years of his life.

I have long admired Grant and felt that some historians were unduly critical of him at various points in the last century (although more recent biographies have once again recognized his extraordinary qualities and how fortunate we were to have him in uniform during the Civil War, in particular).

In my view, Grant stands alone among American military leaders as hugely impressive at all three levels of war: tactically (as shown in his capture of Forts Henry and Donelson in Tennessee early in the war); operationally (the Vicksburg victory in 1863, one of the greatest operational-level campaigns of all time); and strategically (devising and overseeing the first truly comprehensive strategy for the Union forces to defeat Robert E. Lee’s Confederate Army).

In January 2007, a historian at the Command and General Staff College gave me a copy of “Grant Takes Command,” a 1968 history by Bruce Catton, as I was preparing to head to Iraq to command the troop surge. I read it during the tough early months of that endeavor, and I found Grant’s example inspirational.

Especially impressive was his sheer fortitude in the face of congressional sniping, press criticism, political pressures, battlefield setbacks and terrible casualties.

Most important, as the first Union commander to come up with a comprehensive strategy to defeat the Confederate forces, he was the first to give battle to Lee and not retreat back to Washington immediately afterward. Rather, he wrote to President Abraham Lincoln in 1864 that he intended “to fight it out on this line if it takes all summer.”

General Grant and his staff at Massaponax Church, Va., shortly after the bloody Battle of Spotsylvania Court House, on May 13, 1864. Credit: Mathew B. Brady, via Associated Press

Of course, it took all summer, all fall, all winter and part of the spring until the surrender of Lee’s army at Appomattox in April 1865. Again, through it all, Grant displayed extraordinary fortitude and quiet determination in dealing with the enormous pressures on him — and then displayed remarkable magnanimity and respect in agreeing to the terms of Lee’s surrender.

After the war Grant dealt calmly but firmly with the erratic behavior of President Andrew Johnson in the wake of Lincoln’s assassination. And although as president (1869-77) he was tarnished by financial scandal after placing too much trust in some members of his cabinet, he sought to be compassionate during the Indian Wars and in the conduct of Reconstruction, and demonstrated integrity in guiding the nation through a host of financial crises.

And he was modest and unassuming in all that he did. Toward the end of his life, when facing financial ruin — a result of misplaced faith in his investment advisers — he turned down charity from admirers and sought to secure his family’s future by writing his memoirs. They are still regarded as the most literate, forthright memoirs of any major American military figure.

With the help of Mark Twain, the memoirs were an enormous commercial success when published after Grant died, on July 23, 1885, at an Adirondacks retreat. Twain, by the way, was among the earliest members of the Lotos Club.

For me, Grant was always captured best in the pithy response he offered to Gen. William Tecumseh Sherman, his most trusted commander, after the nearly disastrous first day of the Battle of Shiloh in 1862, when Grant’s army was almost pushed back into the Tennessee River. Sherman had emerged from the darkness to encounter Grant sitting under a tree with the rain dripping off his slouch hat.

“Well, Grant,” Sherman said, “we’ve had the devil’s own day today, haven’t we.”

“Yes,” Grant replied. “Lick ’em tomorrow, though.”

A life of crime is usually lived in the shadows. But John Gotti, the longtime boss of the Gambino crime family, preferred the spotlight. He was a publicity hound long before social media and smartphones made oversharing ubiquitous.

“He was the first media don,” said J. Bruce Mouw, a former F.B.I. agent who helped find the evidence that led to Gotti’s conviction. “He never tried to hide the fact that he was a superboss.”

Mr. Mouw was quoted in Gotti’s obituary, written by Selwyn Raab, who covered organized crime for The Times for years. Gotti died on June 10, 2002, in a federal prison in Springfield, Mo., his home for the previous decade.

Gotti took control of the Gambino family after engineering the assassination of his predecessor, Paul Castellano, in 1985. He went on to make flagrant power moves, courting the press all the while. He cut a dashing figure, draped in expensive double-breasted suits that might as well have been suits of armor, as far as prosecutors were concerned.

“In tabloid argot, he was the Teflon Don, evading successful prosecution, or the Dapper Don, for his smart appearance,” Mr. Raab wrote.

Gotti relished the attention. He bragged that he was inspired by Albert Anastasia, the founder of the syndicate Murder Incorporated, and that his management style was derived from Niccolò Machiavelli’s “The Prince,” which he said he read as a child. He knew his every move was being scrutinized but never let his observers feel that they had the upper hand.

“Mr. Gotti became organized crime’s most significant symbol of resistance to law enforcement since Al Capone in Chicago 60 years earlier,” the obituary said. “If he spotted detectives on stakeouts, he was known to taunt them by rubbing one index finger against another and mouthing the words: ‘Naughty, naughty.’”

Charges finally stuck after Salvatore Gravano, better known as Sammy the Bull, turned on Gotti in court, detailing Gotti’s involvement in mob hits and other criminal enterprises. On April 2, 1992, Gotti was convicted on 13 counts, among them a racketeering charge that cited him for five murders, as well as charges of murder, conspiracy, gambling, obstruction of justice and tax fraud.

Yearning for the spotlight ran in the family. His children starred on the reality TV series “Growing Up Gotti,” which appeared on A&E in 2004-5.


On Kate Winslet’s head piece: “The nice thing about this hat is that it covers up the head wound that made her think it was a good idea to wear it in the first place.”

On Queen Elizabeth II: “Gowns by Helen Keller. Nice looking. Not at all like her stamp. Wears her watch over the glove, though — tacky.”

On Donatella Versace: “That skin! She looks like something you’d hang off your door in Africa.”

On herself: “My love life is like a piece of Swiss cheese; most of it’s missing, and what’s there stinks.”

Joan Rivers, the irrepressible and sharply acerbic comedian, would have been 83 today. Since her death almost two years ago, she has left a celebrity-skewering void that can still be felt during every major red carpet event, from the Oscars to the Grammys, where the glitterati were sitting ducks for her as she hosted the E! show “Fashion Police.”

(Ms. Rivers died undergoing a routine procedure in New York City. A settlement in a malpractice lawsuit filed by Ms. Rivers’s daughter, Melissa, was reached in May.)

In honor of her birthday, we’re asking readers to share their favorite zingers and one-liners by Ms. Rivers on Twitter using #TellNYT.

But beyond the red carpet, we remember Ms. Rivers today as one of America’s first successful female stand-up comics in a landscape dominated by men. She paved the way for generations of comedians, distinguishing herself with slashing style and biting self-deprecation, even about her death.

“I’ve had so much plastic surgery,” she once said. “When I die they will donate my body to Tupperware.”

Ms. Rivers broke through in the 1960s as a guest on “The Tonight Show” with Johnny Carson. A review in The New York Times in 1965 called her “an unusually bright girl who is overcoming the handicap of a woman comic, looks pretty and blonde and bright and yet manages to make people laugh.”

But she was fired after she got her own show. Then her husband committed suicide. Driven by despair and desperation, she reinvented herself as a writer, producer and entrepreneur. In the 1990s she began “poking a microphone into freshly Botoxed faces on red carpets,” and in 2010 she became a star of “Fashion Police.” Nothing was sacred. No one was spared.

The Times obituary said:

“She would take the stage in a demure black sheath and ladylike pearls, a tiny bouffant blonde with a genteel air of sorority decorum. Then she’d stick her finger down her throat and regurgitate the dirt on the rich and famous, the stream-of-consciousness take on national heroes and sacrosanct cultural idols.”

Dorothy Parker never met a contemporary she couldn’t skewer. A contributor and critic for Vanity Fair and The New Yorker and a founding member of the informal gathering of literati known as the Algonquin Round Table, she delivered withering, seemingly effortless bons mots.

“Informed that Clare Boothe Luce was invariably kind to her inferiors, Miss Parker remarked, ‘And where does she find them?’ ”

The quotation is from Parker’s obituary in The New York Times. She died of a heart attack on June 7, 1967. She was 73.

Ms. Parker dispensed caustic humor in prose and verse as well as over drinks. Her observations and remarks were very much of their time, but they still induce winces in an era when cutting snark has become practically de rigueur. Over the years many couplets and witticisms have been attributed to Parker, some apocryphally. Here are just a few:

On men:

“Men seldom make passes at girls who wear glasses.”

“Take me or leave me; or, as is the usual order of things, both.”

On money:

“I don’t know much about being a millionaire, but I’ll bet I’d be darling at it.”

On alcohol:

“Three be the things I shall never attain: Envy, content and sufficient champagne.”

On writing:

“I hate writing; I love having written.”

“I can’t write five words but that I change seven.”

“If you have any young friends who aspire to become writers, the second greatest favor you can do them is to present them with copies of ‘The Elements of Style.’ The first greatest, of course, is to shoot them now, while they’re happy.”

Her suggested epitaph:

“Excuse my dust.”

The suggestion was taken. Those words are on a memorial plaque where Parker’s ashes are interred.

“In many ways, the personal characteristics of Robert Kennedy are very much like the dominant characteristics of the American people,” the New York Times columnist James Reston wrote on June 6, 1968, the day Kennedy was murdered. Perhaps that was why Kennedy connected so viscerally with his impassioned constituency.

“We are an ambitious, strenuous, combative, youthful, inconsistent, abrupt, moralistic, sports-loving, non-intellectual breed,” Mr. Reston wrote, and Kennedy “was a passionate and pugnacious man who confronted the inevitable and sometimes the avoidable contradictions of life, and inspired a great loyalty and great fear in the process.”

Kennedy, at the time New York’s junior senator and a former attorney general in the cabinet of his brother John F. Kennedy, had just claimed victory in the California presidential primary at a rally at the Ambassador Hotel in Los Angeles when he, like his brother four and a half years earlier, was felled by an assassin. He died 20 hours later, the first American presidential candidate to be assassinated.

His death, just two months after the Rev. Dr. Martin Luther King Jr. was assassinated in Memphis, was another shock that only deepened Americans’ soul-searching as they grappled with the legacies of racial injustice and divisions over the nation’s involvement in the Vietnam War. In a searing Op-Ed critique, the playwright Arthur Miller demanded that Americans “face the fact that the violence in our streets is the violence in our hearts.”

Before a funeral train carried Kennedy’s body to Washington from New York for burial, diverse thousands paid their respects. “World statesmen in formal dark suits stood next to Harlem school boys in torn Levis and sneakers,” the reporter J. Anthony Lukas wrote in The Times.

Photo caption: A crowd watching the funeral train in New Jersey in 1968. Credit: William Sauro/The New York Times

Kennedy had been revered by many as a political savior in a turbulent time and despised by others as ruthless and opportunistic. “Many men succeed in politics by using their worst qualities, and this applied to Robert Kennedy at the beginning of his legislative career,” Mr. Reston wrote, “but in the end he failed while using his best qualities.”

In his eulogy, Senator Edward M. Kennedy urged that his brother be judged at face value. “My brother need not be idealized or enlarged in death beyond what he was in life,” he said. “He should be remembered simply as a good and decent man, who saw wrong and tried to right it, saw suffering and tried to heal it, saw war and tried to stop it.”

Since then, the quadrennial California primary has shouldered the added distinction of marking the anniversary of Kennedy’s death. Californians, 48 years later, go to the polls Tuesday.

When Ruhollah Khomeini, the founder of the Islamic republic in Iran, was buried in 1989, three days after his death on June 3, all international phone lines in the country were cut and international flights halted. In Iran, mourning the leader of the last great revolution of the 20th century required everybody’s complete attention.

Millions took to the streets, and at Behesht-e Zahra, or Zahra’s Paradise, where Khomeini was buried, the lamentation was so overwhelming that his coffin, carried by the crowds, had trouble reaching its grave site.

His obituary in The New York Times ran almost 3,500 words but was quick to encapsulate the man, a Shiite Muslim cleric, and his importance to Iran and the world:

“The life of Ayatollah Ruhollah Khomeini was so shadowy, with such an overlay of myth and rumor, that there was lingering disagreement or uncertainty about his ancestry, his true name and his date of birth,” Raymond H. Anderson began.

“But when he returned in triumph to Teheran on Feb. 1, 1979 — after almost 15 years in exile — the imposing man in a black robe with a white beard and intense dark eyes left little doubt about who he was, or what he wanted for his ancient land.

“Ayatollah Khomeini felt a holy mission to rid Iran of what he saw as Western corruption and degeneracy and to return the country, under an Islamic theocracy, to religious purity.”

Today he is remembered as the Shiite Muslim cleric who, on Feb. 11, 1979, drove Shah Mohammad Reza Pahlavi from the Peacock Throne and founded the world’s first and only Islamic republic governed by a religious political ideology.

In Iran he was better known as the Imam Khomeini, an honorific denoting the near-holy status that he continues to have in many parts of Iranian society. And such a man, his former revolutionary compatriots thought, deserved a pilgrimage site all his own.

Today, 27 years after his death, the sprawling, golden-domed Imam Khomeini shrine is one of the largest religious complexes in the world. At its center stands a gold-plated cage that holds Khomeini’s remains.

Photo caption: The shrine of Imam Khomeini. Credit: Morteza Nikoubazl/Reuters

The shrine is one of the first imposing structures people see when they drive to Tehran after arriving at the Imam Khomeini International Airport, or IKIA. Every year, when Iranians are given a four-day holiday to commemorate him, the vast, marbled halls of the shrine are filled, and Khomeini’s successor, Ayatollah Ali Khamenei, gives a speech.

The crowds thin out the rest of the year, but Iranians, mostly families on vacation, continue to flock to the shrine. Children run around in their socks — shoes are forbidden there — while mothers and fathers sit hunched over a carpet, picnicking, close to their beloved imam.

If you could have dinner with one person who is no longer with us, and whose obituary was published in The New York Times, who would it be, and why that person? Not Forgotten is asking that question of a variety of influential people this summer in a series of posts called Breaking Bread.

First up is Cory Booker, New Jersey’s junior senator, a former mayor of Newark and a Democrat frequently mentioned as a potential vice-presidential running mate in the 2016 election.

Asked in a telephone interview on Wednesday to choose a dinner companion, Mr. Booker said he had gone back and forth between two of his heroes, both of whom rose from slavery: Harriet Tubman, the abolitionist who is most remembered for ushering dozens of slaves to freedom along the Underground Railroad, and Frederick Douglass, the abolitionist author and orator and later prominent humanitarian and campaigner for women’s rights.

Mr. Booker said he keeps a statue of Tubman and a picture of Douglass in his office.

The Douglass picture, he said, is “a constant reminder to me of the idea of lifelong service, of tireless, unyielding dedication to the ideals of our nation, and to his understanding that our nation’s ideals of freedom and liberty can’t just be fought for once in a while.”

In the end he chose Douglass.

“He’s just somebody who had a profound regality about him and a pervasive humility,” Mr. Booker said. “And the way he engaged himself, and one of the last things he did before he died was going to a suffrage movement meeting.”

The meeting was of the Women’s National Council at Metzerott Hall in Washington. An honored guest, Douglass was escorted to the platform by the suffragettes Susan B. Anthony and Anna H. Shaw, The Times reported. Shortly after the meeting, on Feb. 20, 1895, he collapsed and died at his home in the capital while he and his wife waited for a carriage to take him to Hillsdale African Church, where he was to give a lecture. He was 78.

“It is a singular fact, in connection with the death of Mr. Douglass,” The Times wrote in its obituary, “that the very last hours of his life were given in attention to one of the principles to which he has devoted his energies since his escape from slavery.”

Douglass, who had escaped from slavery and taught himself to read, became a leading abolitionist in the North and an adviser to President Abraham Lincoln. “Mr. Douglass, perhaps more than any other man of his race, was instrumental in advancing the work of banishing the color line,” the obituary said.

Mr. Booker said that Douglass had been a hero of his since childhood, when his parents taught him about great Americans. He described Douglass as “one who bent the arc of our history more toward justice,” adding that he admired Douglass all the more for embracing manifold causes.

“He’s a person who shows that all of our fights for justice are interwoven,” Mr. Booker said.

Mr. Booker said he strove to live up to Douglass’s example.

“I’m hopeful that at the end of my life someone like Frederick Douglass would look at my life and say, ‘Well done, you’ve proven yourself to be worthy of the legacy we left you,’ ” he said.

As a vegan, Mr. Booker acknowledged that choosing the dinner menu might be problematic. But he said he thought Douglass might enjoy the cuisine found at some of his favorite vegan restaurants in New Jersey.

“I would let him know that he could enjoy a healthy lifestyle, and really delicious food.”

Calling Henry Louis Gehrig a steadfast first baseman for the Yankees is like calling the Pacific Ocean a pond. Better known as Lou, he was nicknamed the Iron Horse for his streak of 2,130 consecutive games played, a record that stood until Cal Ripken broke it in 1995.

Gehrig did not make it into the Baseball Hall of Fame for reliability alone. He played in six consecutive All-Star Games, was twice named the American League’s most valuable player, and was the first baseball player to have his number, 4, retired.

He set a plethora of records, some of which have never been broken. His record of 23 career grand slams lasted 75 years before it was broken in 2013 by another Yankee, Alex Rodriguez.

He was a consummate first baseman and hitter who stood out on Yankees dynasty teams with Joe DiMaggio, Bill Dickey and Babe Ruth, who preceded him in the batting order. He batted at least .300 for 12 consecutive seasons, achieving a career average of .340, and was no stranger to the long ball: he hit 493 home runs, twice hit 49 in a single season and hit four in one game in 1932.

But Gehrig was not just synonymous with baseball prowess. He retired from the Yankees at 36 because of amyotrophic lateral sclerosis, the neurodegenerative disease that led to his death on June 2, 1941, and that came to be known informally as Lou Gehrig’s disease.

Gehrig delivered a farewell speech to a crowd of 61,808 at Yankee Stadium on July 4, 1939, during a sweltering break between games in a doubleheader with the Washington Senators. The Yankees had lost the first game, 3-2, and Gehrig took the field with his aging teammates from the 1927 World Series champion Yankees, a fearsome lineup known as Murderers’ Row. He was overcome with emotion but still delivered what some have called baseball’s Gettysburg Address.

“For the past two weeks you have been reading about the bad break I got,” Gehrig said, according to the National Baseball Hall of Fame and Museum’s website. “Yet today I consider myself the luckiest man on the face of the earth.”

Gehrig wept as the crowd chanted, “We love you, Lou!” The Times reported that “it was without doubt one of the most touching scenes ever witnessed on a ball field and one that made even case-hardened ball players and chroniclers of the game swallow hard.”

Then Gehrig, who was still a team captain, returned to the dugout to watch the final game of the day. The Yankees won, 11-1.

Helen Keller lost her sight and hearing 18 months after she was born. She had not learned how to speak and was plunged into what she called “the unconsciousness of a newborn baby,” an isolated and seemingly inescapable prison.

But by the time Ms. Keller died at 87, on June 1, 1968, she had persevered. Her front-page obituary said that “she could ‘see’ and ‘hear’ with exceptional acuity; she even learned to talk passably and to dance in time to a fox trot or a waltz.”

Ms. Keller learned to communicate with the help of Anne Sullivan Macy, a teacher who virtually transformed her from a near-feral child into a Radcliffe graduate. Ms. Keller became a captivating writer, chronicling her life in memoirs, and a kind of motivational speaker, aided by Ms. Sullivan onstage. She also was an advocate for the blind and toured the country with Ms. Sullivan as part of a vaudeville act, befriending celebrities like Charlie Chaplin, Enrico Caruso and Harpo Marx.

“I seldom think about my limitations, and they never make me sad. Perhaps there is a touch of yearning at times, but it is vague, like a breeze among flowers. The wind passes, and the flowers are content.”

Her accomplishments were dramatized in William Gibson’s “The Miracle Worker,” which opened on Broadway in 1959, directed by Arthur Penn, with Anne Bancroft as Ms. Sullivan and Patty Duke as Ms. Keller. (Ms. Duke died in March.) The show ran for more than 700 performances, and the actresses reprised their roles in the 1962 film version (below), for which they both won Academy Awards.

Ms. Keller’s achievements and indomitable spirit helped alter public perceptions of disabled people, portraying them as capable of overcoming obstacles rather than as deserving only pity. A foundation named after her in 1915 continues to provide treatment for diseases that cause blindness and visual impairment.

In 1964, President Lyndon B. Johnson presented Ms. Keller with the Presidential Medal of Freedom, honoring her as a model of courage and determination.

This summer, we invite you to join us as we exhume obituaries from our archives, some dating to the 19th century. You’ll meet leaders, inventors, entertainers, artists, novelists and the infamous — each linked in some way to the obituary. On some days we’ll ask influential people a simple question: If you could have dinner with one person who is no longer with us, who would it be, and why?

We welcome your feedback and ideas. You can send us an email using this form or tweet the editor of this project, Shreeya Sinha, at @shreeyasinha, or the main writer, at @DSlotnik. Follow us on Twitter.