Monthly Archives: October 2014

In 1952, weather conditions led to a massive smog descending on and gathering over London. Visibility was less than 30 centimeters (12 in), the air was black with coal and pollution, and the usually bustling city ground to a standstill. By the time the smog had cleared, 4,000 people had died from exposure to the pollution, and another 8,000 would die in the following weeks from complications. The smog would lead to increased awareness of the problems of pollution over city centers.

London has long had something of a romantic relationship with its fog. Our image of the Industrial Revolution often involves people wrapped in long trench coats, making their way through the pea soup that's settled over the streets, barely pushed back by the gas lamps that line the sidewalk. That's not an old image, either; in 1952, a smog settled over London that not only stayed for four days, but led to the deaths of more than 12,000 people.

On December 5, 1952, several factors came together in a deadly mix. A prolonged cold spell meant that people were firing up their home heating units, so smoke poured from every residential chimney in earnest; for days, that smoke added to the already heavy output gushing from factories across the city. A relatively new phenomenon was also adding to the problem: cars. An anticyclone hovering over the area kept the smog from rising off the city, and the wind that normally would have helped disperse the smoke had died, so the smoke kept building until the city turned black.

At the height of the event now dubbed The Great Smog, visibility was so bad that it was impossible for people to see their own feet. Cars were abandoned in the street as people sought shelter indoors, although they were no better off. People were so distracted by the fog that many lost track of friends and family members, whom they would later find had died in what they thought was the safety of their own homes.

Some did make it to hospitals—on foot, as even the ambulance services had stopped running. Nurses recounted admitting patients whose lips had turned blue, patients struggling to breathe against the suffocating smoke. In the four days the smog sat over the city, about 4,000 people suffocated. For many fairly healthy individuals, the smog was survivable—but among the dead were children, the elderly, and chronic smokers whose lungs and respiratory systems were already compromised.

Schools closed, and so did airports and train stations. Buses stopped running, and among the first to die were the cattle that were on sale at the Smithfield Market. When they were butchered, it was found that their insides had turned black from the smoke and their meat was unusable. People’s clothes were permeated; even underwear turned black.

Winds came on the fourth day, and cleared out much of the smog. For many, the damage was already done, though, and another 8,000 people would die in the following weeks from smog-related illnesses.

The Great Smog of 1952 wasn't the first time the city had been halted by smog and coal tar in the air. In December of 1873, the death toll was about 40 percent higher than usual because of the smog that settled over the city. Other smogs followed in 1880, 1882, 1891, and 1892, with the effects most severe in areas with heavy concentrations of factories and, consequently, workers.

After The Great Smog of 1952, legislation was passed to begin eliminating coal use in factory and residential fires. Originally, paraffin was used in place of coal, but the deadly event led to a long-lasting awareness in the city of the potential impacts of pollution.

Milky Way
When this bar debuted in 1923, it was the first to take inspiration from a real dessert: the malted milkshake. (That's right—it was named after the drink, not the galaxy.)

Baby Ruth
When this bar launched in 1921, its makers claimed it was named after President Grover Cleveland’s daughter Ruth (who had died 17 years earlier at the age of 12). But at that time, the more obvious association was with new Yankees star Babe Ruth, making this the first candy bar to profit from the success of a public figure—even though he wasn’t being compensated.

Nestlé Crunch
Prior to this bar's introduction in 1937, candy-bar fillings were somewhat rich: nuts, caramel, etc. By using dirt-cheap puffed rice, however, Nestlé helped mainstream the notion that candy could be almost anything you put into chocolate—an idea that brought candy-bar prices down and spawned treats like Krackel.

Cadbury Milk Chocolate
The Cadbury family’s idyllic factory village in Birmingham, England—where these bars were created in 1897—helped inspire Milton S. Hershey’s own facility in Pennsylvania. “It was a sort of social utopia,” explains Deborah Cadbury, a family descendant and author of Chocolate Wars. “The Cadbury brothers as Quakers were the first to really look after their employees and provide pensions and security of employment and a living wage.”

Chicken Dinner
This bar, which debuted in 1923, was the first chocolate bar to be marketed as nutritious; advertisements touted the nut-packed treats as “candy made good.” Though Sperry’s Chicken Dinner was discontinued in the 1960s, its success helped spawn the power bar industry, paving the way for brands like Clif and Luna, whose bars offer vitamins alongside hearty doses of chocolate, caramel and more.

Snickers
More than 80 years after its launch in 1930, this Mars bar is the world's best-selling international confection. And although it may not have revolutionized candy-bar taste or distribution, it's unparalleled at selling itself: its star-studded ad campaign "You're Not You When You're Hungry" helped sales hit around $3.5 billion in 2012, outpacing M&M's, Reese's and Kit Kat. Also, says Kimmerle, it helps that Snickers offers the holy trinity of confection: nougat, caramel and peanuts—coated in chocolate.

Nestlé Milk Chocolate
Prior to this bar’s introduction in 1875, bar-form cocoa was bitter, chewy and dark. And chocolatiers couldn’t sweeten it with regular milk, as the liquid invited mildew growth. By adding the condensed milk pioneered by Henri Nestlé for infant formula, however, Swiss chocolatier Daniel Peter solved that problem—his product was smoother, sweeter and had a longer shelf life. That breakthrough paved the way for almost every modern-day chocolate bar, including Hershey’s, Lindt and Godiva.

Hershey’s Milk Chocolate
Nestlé may have invented milk chocolate, but Hershey’s made it mainstream. By building his factory right in the middle of dairy land—and using local milk to amp up production volume—Milton Hershey powered an unparalleled distribution network, says Sweet Tooth author Kate Hopkins, turning chocolate into an American obsession. Since its first bar debuted in 1900, Hershey’s has become one of the world’s most recognizable brands: its treats fed soldiers during World War II; its ad campaigns were revered; and now, there’s a $23.5 million museum dedicated to its legacy.

Kit Kat
Beyond being the first candy bar to be marketed around sharing, which helped turn chocolate into a social snack, Kit Kat was also the first to gain a global following. Whereas Hershey’s and Cadbury cornered different markets with similar products, the wafer-filled Kit Kats launched in both Europe and the U.S. before entering Australia, Asia and Africa—paving the way for other blockbuster bars like Snickers and Butterfinger.

Imagine a world where fast food workers can pay their rent and utility bills, plus buy their children food and clothes. Well, you don't have to imagine it, because such a place exists. It's called Denmark.

A New York Times article on Tuesday chronicled the life of a Danish fast food worker named Hampus Elofsson, who works 40 hours a week at a Burger King in Copenhagen and makes enough not only to pay his bills, but to save some money and enjoy a night out with friends. His wage: $20 per hour. Yep, you read that right. The base wage in Denmark is close to two and a half times what American fast food workers make.

Elofsson's pay is the kind of wage that Anthony Moore, a shift manager in Tampa, Florida, can only dream about. He earns $9 an hour for his low-level management job, or about $300 per week, and like half of America's fast food workers, he relies on some form of public assistance to make up the gap between that wage and a bare living.

"It's very inadequate," Moore, a single father of two young daughters, told the Times. He gets $164 in food stamps for his daughters. "Sometimes I ask, 'Do I buy food or do I buy them clothes?' . . . If I made $20 an hour, I could actually live, instead of dreaming about living."

Of course, in America, fast food workers and their advocates aren't even dreaming about $20 per hour. They are asking for $15 per hour, and the fast food industry, along with conservative economists and politicians, is scoffing at that and fighting any pay increase tooth and nail.

What Danish fast food workers have that their American counterparts do not is a powerful union, and fast food franchise owners who are willing to make a little less of a profit, though they still do make a profit. Denmark is also a much smaller country, with a higher cost of living and a huge social safety net. And yes, a fast food burger is a little more expensive in Denmark than here in America.

Martin Drescher, the general manager of HMSHost Denmark, the airport restaurants operator, told the Times: “We have to acknowledge it’s more expensive to operate. But we can still make money out of it — and McDonald’s does, too. Otherwise, it wouldn’t be in Denmark.”

He also said: “The company doesn’t get as much profit, but the profit is shared a little differently. We don’t want there to be a big difference between the richest and poorest, because poor people would just get really poor. We don’t want people living on the streets. If that happens, we consider that we as a society have failed.”

Watermelons originated in southern Africa, from where they spread thousands of years ago to the Nile Valley. Seeds of the fruit have been discovered in the tomb of Tutankhamen, and the Bible mentions watermelon as a food consumed by the ancient Israelites.

Watermelon spread north to the Mediterranean and Europe with the Moors, and was carried as far east as China, which today is the world's largest producer of watermelon. Some have posited that watermelons were introduced to the Southern U.S. by African slaves, but Native Americans were found cultivating them across what are now the northern and southern states as early as the 1500s. There is some debate today as to whether watermelons are a fruit or a vegetable.

The consumption of lycopene, contained most plentifully in watermelon at its peak ripeness, has been shown to improve bone health. The amino acid citrulline in watermelon converts to another amino acid, arginine, in our kidneys and other organs, and in preliminary laboratory studies has been shown to assist the cardiovascular system and possibly help lower body fat.

Surprisingly, watermelons are not just a refreshingly sweet, thirst-quenching fruit, but are actually abundant in nutritional benefits, from vitamins and minerals to phytonutrients. They're rich in vitamins A and C and B-complex vitamins such as thiamin, riboflavin and pyridoxine. Watermelons afford a wealth of beta-carotenes, as well as folate, pantothenic acid, the amino acid citrulline and the antioxidants lycopene, cryptoxanthin, lutein and zeaxanthin. They also contain anti-inflammatory phenolic compounds, including lycopene and triterpenoids. They're plentiful in the minerals potassium, manganese, magnesium, phosphorus and copper, with trace amounts of selenium, iron and even calcium. The seeds of the fruit offer a wealth of nutrients as well, and should not be overlooked! Because watermelon is so nutrient-rich, it's a great juice for fasting, cleansing and weight loss.

Watermelon provides a rich source of electrolytes and contains no cholesterol and almost no fat, while offering a modest amount of fiber and protein and as little as 48 calories per cup.

Daniel Webster had two chances to become president via the vice presidency. In 1840 the Whig party nominated William Henry Harrison for president, and Harrison offered the vice presidency to Webster. Webster turned it down; Harrison died after a single month in office, a death that would have made Webster president had he accepted.

Eight years later Webster competed with Zachary Taylor for the Whig party’s nomination. Taylor won and invited him to be his running mate, and Webster again shunned the office, saying, “I do not propose to be buried until I am really dead and in my coffin.” Taylor won the White House and died 16 months afterward, which again would have made Webster president if he’d accepted.

Related: In the election of 1880 James Garfield simultaneously won the presidency, retained his seat in the House, and won a Senate seat — he’d been elected to all three offices at once.