How Medicare Came Into Existence

TIME said the bill—signed on July 30, 1965—created a "welfare state beyond Roosevelt's wildest dreams"

It was 50 years ago Thursday, on July 30, 1965, that President Lyndon Johnson signed the Medicare bill, turning the national social security healthcare program for older Americans into law. But, despite Johnson’s legendary powers of legislative persuasion, the celebratory signing event—complete with the enrollment of the first Medicare beneficiary, former President Harry S. Truman—could have looked very different.

After all, the idea of helping American seniors afford health care took time to gain traction: It came up not long after Franklin Roosevelt initiated the modern social-security system in the 1930s. When the coinage “Medicare” first came on the American scene, the program it described was not the one we think of today. In 1960, the term referred to an opposing program proposed by the Eisenhower administration. The big fear at the time was that tying any kind of health aid to social security would quickly deplete the funds available for that then-25-year-old system; Eisenhower’s version, overseen by then-Vice President Richard Nixon, would have been both voluntary and state-funded.

In that year’s Presidential campaign, however, Nixon lost to challenger John F. Kennedy—who, as TIME put it a few years later, “vowed without qualification that his Administration would persuade a Democratic Congress to pass a medicare bill, to be financed under the social security system.” Kennedy died, however, before he could make good on that promise—which is where Johnson comes in. Benefiting from his 1964 election victory, Johnson made it happen. But what exactly it would look like remained to be settled.

By April of 1965, as TIME reported, there were three options in the running: Johnson’s social-security-linked compulsory program; an Eisenhower-esque voluntary program with no link to social security; or an American Medical Association-backed plan called “eldercare,” which prioritized patient choice and was need-based. The solution came, surprisingly, in the form of House Ways and Means Committee chair Wilbur Mills, who had been a staunch opponent of Medicare. He combined elements of the three plans into one that would succeed. The basics of the plan were compulsory and funded by increasing social-security taxes, while extras were voluntary. The need-based program we now know as Medicaid would also be expanded.

“The medicare bill will not solve all the problems of growing old—but it will certainly make the process much less costly to the elderly,” TIME noted. And that wasn’t all it did, the magazine continued. The medicare bill represented a fundamental change to American political norms:

Almost 30 years ago, Franklin Delano Roosevelt signed into law the Social Security Act. At the moment of signing, he issued a statement that, in retrospect, sounds almost apologetic: “We have tried to frame a law which will give some measure of protection to the average citizen and his family against the loss of a job and against poverty-ridden old age. This law, too, represents a cornerstone in a structure which is being built but is by no means complete. It is a structure intended to lessen the force of possible future depressions.”

Social security was mostly an emergency act in a nation still struggling out of the depths of a depression in which, in F.D.R.’s famed phrase, more than one-third of the nation was “ill-housed, ill-clad, ill-nourished.” The change since then in American life has never been more apparent than last week, when Congress acted on two bills that projected a new sort of welfare state beyond Roosevelt’s wildest dreams. First, the House of Representatives passed and sent to the Senate, where it faces certain swift approval, the Johnson Administration’s $6 billion-a-year medicare bill…

Action on both bills came not in time of depression but in the midst of the most prosperous year that the affluent society has ever known. There were a few squawks about presidential pressure, but it was widely accepted that both measures would achieve great good in making the U.S. even more affluent without turning it into a socialistic society. It was generally conceded that both bills, despite the vastness of their scope, were aimed not at increasing the power of the Federal Government, but at eradicating some remaining blemishes in the Great Society.

A California legislator who faces a recall campaign for his support of a law mandating vaccinations is just one of the heroes in the history of vaccines. Alas, there are villains too

Edward Jenner

Popperfoto/Getty Images: Edward Jenner

No one knows the name of the dairy maid 13-year-old Edward Jenner overheard speaking in Sodbury, England, in 1762, but everyone knows what she said: “I shall never have smallpox for I have had cowpox. I shall never have an ugly pockmarked face.” Jenner was already a student of medicine at the time, apprenticed to a country surgeon, and the remark stayed with him. But it was not until 34 years later, in 1796, that he first tried to act on the dairy maid’s wisdom, vaccinating an 8-year-old boy with a small sample from another dairy maid’s cowpox lesion, and two months later exposing the same boy to smallpox. The experiment was unethical by almost any standard—except perhaps the standards of its time—but it worked. Jenner became the creator of the world’s first vaccine, and 184 years later, in 1980, smallpox became the first—and so far only—disease to have been vaccinated out of existence.

Jonas Salk and Albert Sabin

Jonas Salk and Albert Sabin didn’t much care for each other. The older, arid Sabin and the younger, eager Salk would never have been a good match no matter what, but their differences in temperament were nothing compared to a disagreement they had over science. Both researchers were part of the National Foundation for Infantile Paralysis—later dubbed the March of Dimes—and both were trying to develop a polio vaccine. Sabin was convinced that only a live, weakened virus could do the trick; Salk was convinced a newer approach—using the remains of a killed virus—would be better and safer. Both men turned out to be right. Salk’s vaccine was proven successful in 1955; Sabin’s—which was easier to administer, especially in the developing world, but can cause rare cases of vaccine-induced polio due to viral mutations—followed in 1962. Both vaccines have pushed polio to the brink of eradication. It is now endemic in only three countries—Afghanistan, Pakistan and Nigeria—and appears, at last, destined to follow smallpox over the extinction cliff.

Dr. Maurice Hilleman

Ed Clark—Time & Life Pictures/Getty Images: Dr. Maurice Hilleman (center) talks with his research team as they study the flu virus in a lab at Walter Reed Army Institute of Research, Silver Spring, Md., in 1957.

Around the world, untold numbers of children owe their health to a single girl who woke up sick with mumps in the early morning hours of March 21, 1963. The girl was Jeryl Lynn Hilleman, who was then only 5; her father was a Merck pharmaceuticals scientist with an enduring interest in vaccines. Dr. Maurice Hilleman did what he could to comfort his daughter, knowing the disease would run its course; but he also bristled at the fact that a virus could have its way with his child. So he collected a saliva sample from the back of her throat, stored it in his office, and used it to begin his work on a mumps vaccine. He succeeded at that—and a whole lot more. Over the course of the next 15 years, Hilleman worked not only on protecting children against mumps, but also on refining existing measles and rubella vaccines and combining them into the three-in-one MMR shot that now routinely immunizes children against a trio of illnesses in one go. In the 21st century alone, the MMR has been administered to 1 billion children worldwide—not a bad outcome from one little girl’s case of the mumps.

Pearl Kendrick and Grace Eldering

University of Michigan School of Public Health: Pearl Kendrick

It was not easy to be a woman in the sciences in the 1930s, something that Pearl Kendrick and Grace Eldering knew well. Specialists in public health—one of the few scientific fields open to women at the time—they were employed by the Michigan Department of Health, working on the routine business of sampling milk and water supplies for safety. But in their free time they worried about pertussis—or whooping cough. The disease was, at the time, killing 6,000 children per year and sickening many, many more. The poor were the most susceptible—and in 1932, the third year of the Great Depression, there were plenty of poor people to go around. A pertussis vaccine did exist, but it was not a terribly effective one. Kendrick and Eldering set out to develop a better one, collecting pertussis samples from patients on “cough plates,” and researching how to incorporate the bacteria into a vaccine that would provide more robust immunity. They tested their vaccine first on mice, then on themselves and finally, in 1934, on 734 children. Of those, only four contracted whooping cough that year. Of the 880 unvaccinated children in a control group, 45 got sick. Within 15 years of the development of Kendrick and Eldering’s vaccine, the pertussis rate in the U.S. dropped by 75%. By 1960 the drop was 95%—and rates have continued to fall.
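Those trial numbers imply just how big an improvement the new vaccine was. As a rough, back-of-the-envelope illustration (the roughly 89% efficacy figure below is derived from the numbers above, not a figure reported in the original study):

```python
# Back-of-the-envelope vaccine efficacy from the 1934 trial figures cited above.
# Efficacy = 1 - (attack rate among vaccinated / attack rate among unvaccinated).
vaccinated_sick, vaccinated_total = 4, 734
unvaccinated_sick, unvaccinated_total = 45, 880

attack_rate_vaccinated = vaccinated_sick / vaccinated_total        # about 0.5%
attack_rate_unvaccinated = unvaccinated_sick / unvaccinated_total  # about 5.1%

efficacy = 1 - attack_rate_vaccinated / attack_rate_unvaccinated
print(f"Estimated vaccine efficacy: {efficacy:.0%}")  # about 89%
```

In other words, a vaccinated child in the trial was roughly ten times less likely to contract whooping cough that year than an unvaccinated one.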

The work that’s done at the lab bench is not the only thing that makes vaccines possible; the work that’s done by policymakers matters a lot too. That is especially true in the case of California State Senator Richard Pan, a pediatrician by training who represents Sacramento and the surrounding communities. Pan was the lead sponsor of the recently enacted Senate Bill 277, designed to raise California’s falling vaccine rate by eliminating the religious and personal belief exemptions that many parents use to sidestep the responsibility for vaccinating their children. For Pan’s troubles, he now faces a possible recall election, with anti-vaccine activists trying to collect a needed 35,926 signatures by Dec. 31 to put the matter before the district’s voters. Pan is taking the danger of losing his Senate seat with equanimity—and counting on the people who elected him in the first place to keep him on the job. “I ran to be sure we keep our communities safe and healthy,” he told the Sacramento Bee. That is, at once, both a very simple and very ambitious goal, made all the harder by parents who ought to know better.

Dr. Andrew Wakefield

Shaun Curry—AFP/Getty Images: From right, Dr. Andrew Wakefield and his wife, Carmel, arrive at the General Medical Council (GMC) in central London on Jan. 28, 2010.

Not every conspiracy theory has a bad guy. No one knows the names of the founding kooks who got the rumors started that the moon landings were faked or that President Obama was born on a distant planet. But when it comes to the know-nothing tales that vaccines are dangerous, there’s one big bad guy—Andrew Wakefield, the U.K. doctor who in 1998 published a fraudulent study in The Lancet alleging that the MMR vaccine causes autism. The reaction from frightened parents was predictable, and vaccination rates began to fall, even as scientific authorities insisted that Wakefield was just plain wrong. In 2010, The Lancet retracted the study and Wakefield was stripped of his privilege to practice medicine in the U.K. But the damage was done and the rumors go on—and Wakefield, alas, remains unapologetic.

Jenny McCarthy and Jim Carrey

Brendan Hoffman—Getty Images: Jim Carrey (center) carries Evan McCarthy, son of actress Jenny McCarthy (left), during a march calling for healthier vaccines on June 4, 2008, in Washington.

If you’re looking for solid medical advice, you probably want to avoid getting it from a former Playboy model and talk show host, and a man who, in 1994’s Ace Ventura: Pet Detective, introduced the world to the comic stylings of his talking buttocks. But all the same, Jenny McCarthy and Jim Carrey are best known these days as the anti-vaccine community’s most high-profile scaremongers, doing even the disgraced Andrew Wakefield one better by alleging that vaccines cause a whole range of other ills beyond just autism. None of this is true, all of it is shameful, and unlike Wakefield, who was stripped of his medical privileges, Carrey and McCarthy can’t have their megaphones revoked.

Rob Schneider

Richard Shotwell—Invision/AP: Rob Schneider in 2014.

What’s that you say? Need one more expert beyond Jenny McCarthy and Jim Carrey to weigh in on vaccines? How about Rob Schneider, the Saturday Night Live alum and star of the Deuce Bigalow, Male Gigolo films? Schneider has claimed that the effectiveness of vaccines has “not been proven,” that “We’re having more and more autism” as a result of vaccinations, and that mandating vaccines for kids attending public schools is “against the Nuremberg laws.” So, um, that’s all wrong. A vocal opponent of the new California law eliminating the religious and personal belief exemptions that allowed parents to opt out of vaccinating their kids, Schneider called the office of state legislator Lorena Gonzalez and left what Gonzalez described as a “disturbing message” with her staff, threatening to raise money against her in the coming election because of her support of the law. Gonzalez called him back and conceded that he was much more polite in person. Still, she wrote on her Facebook page, “that is 20 mins of my life I’ll never get back arguing that vaccines don’t cause autism with Deuce Bigalow, male gigolo. #vaccinateyourkids”

Robert F. Kennedy, Jr.

Rich Pedroncelli—AP: Robert F. Kennedy, Jr. speaks against a measure requiring California schoolchildren to get vaccinated during a rally at the Capitol in Sacramento, Calif., on April 8, 2015.

If you’re looking for proof that smarts can skip a generation, look no further than Robert F. Kennedy, Jr., son of the late Bobby Kennedy. RFK Jr. has made something of a cottage industry out of warning people of the imagined dangers of thimerosal in vaccines. An organomercury compound, thimerosal is used as a preservative, and has been removed from all but the flu vaccine—principally because of the entirely untrue rumors that it causes brain damage. But facts haven’t silenced Kennedy who, as a child of the 1950s and ’60s, surely got all of the vaccines his family doctor recommended. Children of parents who listen to what he has to say now will not be so fortunate.

13 Fun Ways to Work Out With Your Dog

The versatile furry friends can do anything from running to yoga to boot camp with you

Dogs make the best workout buddies. They never complain about hills or cancel on you last-minute. And they’re always stoked to follow you out the door. That energy can be contagious: research from Michigan State University found that canine owners were 34% more likely to get the recommended 150 minutes of exercise a week than folks who didn’t have a dog. Even if you’re just taking your pup for a walk, that counts. (Move at a brisk clip and you can burn as many as 170 calories in half an hour.) But there are lots of other activities you and Fido can do together—all while strengthening your bond.

Check out these fun ways to get fit with your furry pal.

Running

Because dogs are creatures of habit, they can help you keep up your weekly mileage: Once your pup gets into the routine of a morning run, she won’t let you wimp out if it’s drizzling, or you’re just feeling bleh, explains J.T. Clough, author of 5K Training Guide: Running with Dogs ($8; amazon.com). “She’ll wait by your sneakers, tongue out, tail wagging,” says Clough, who runs a dog-training business on Maui. “Her excitement can be enough to change your attitude.”

Concerned your little pooch won’t keep up? No need to worry, says Clough: “The truth is most small dogs have more energy than the big breeds.” Just be careful in the heat and humidity, since dogs don’t sweat like we do. And if you have a flat-faced breed (think pugs and Boston terriers), keep your runs under five miles, Clough suggests, since these dogs have a harder time taking in air.

Stand-up paddleboarding

It’s almost as if stand-up paddleboards were designed for canine co-pilots: Dogs of all sizes can ride on the nose (while you get a killer ab workout). Pick an ultra-calm day on a lake or bay for your first excursion together, so your pup can develop his sea legs. If you’re struggling to balance the board, try paddling on your knees, which lowers your center of gravity, until your dog is comfortable. Still, odds are you’ll both take a dip, which is why Clough recommends outfitting your dog with a life preserver. It’ll make it easier for you to lift him back onto the board, too: Most doggie vests have an easy-to-grab handle, like the NRS CFD (from $35; amazon.com).

Is your dog a born swimmer? Bring a stick or throw toy and play fetch once you’ve paddled out.

Kayaking

You can also take your dog out for a spin in a sit-on-top kayak. Smaller breeds may perch up front, while larger dogs might feel safer closer to your feet. Teach your buddy to get in and out of the kayak on land first; then practice in the shallow water close to shore. (If he seems nervous about sliding around, you could lay down a small mat or piece of carpet so his paws can get some traction.) The trick is to keep the first few outings relaxed and fun (read: bring treats!). Stick to inlets and slow-moving rivers without too much boat traffic. You can let your dog paddle alongside you if he wants to swim. If not, that’s okay too: “He’s getting lots of stimulation just by riding in the boat,” says Clough—all while you tone your arms and core and burn hundreds of calories.

Cycling

Is your dog so exuberant on walks you worry she might one day pull your arm off? If so, try letting her keep up with you as you pedal: “Biking is perfect for dogs with tons of energy,” says Clough. “They are totally psyched to flat-out run.” Meanwhile, you’re getting a great workout (cycling can torch 500-plus calories per hour) and building your leg muscles.

If your girl likes chasing squirrels and skateboards, consider using a device called the Springer. It attaches the leash to your bike’s frame or seat stem and absorbs much of the force of sudden tugs ($130; amazon.com).

Biking with your dog may actually help with any behavioral issues she has, Clough adds. “The biggest problem I see with dogs is that they’re not getting enough exercise.” Indeed, veterinarians at Tufts University’s Animal Behavior Clinic say aerobic exercise stimulates the brain to make serotonin, a neurotransmitter that helps dogs, especially those who are anxious or aggressive, to relax.

Rollerblading

This is another great way to burn off a dog’s excess energy—as long as you’re an expert inline skater, that is. If not, “it can be disastrous,” warns Clough. “Your dog will be like ‘Woohoo!’ and you’ll be like, ‘Where’s the brake?!’” But even if you’re super confident on wheels, she suggests rollerblading in an area free of traffic, like a park or boardwalk, so you can enjoy the excursion as much as your pal. Chances are, you’ll have so much fun you’ll forget you’re seriously working your core.

Dog-friendly boot camp

Fitness classes designed for people and pups—like Leash Your Fitness in San Diego and K9 Fit Club in Chicago—are becoming more and more popular. In a typical class, you’ll run through high-intensity moves for strength, balance and cardio while your four-legged companion practices obedience drills. “I recommend that people at least try out a class,” says Clough, who helped launch Leash Your Fitness. “The focus is more on the person’s workout than the dog’s,” she explains, but your dog is learning to feel comfortable in a distracting environment—and that will make it easier to take him along on other fitness adventures.

Dog yoga

Yep, “doga” is a thing, and it turns out pooches are naturals at this ancient practice. Can’t picture it? Think about your girl’s morning stretches: She probably does a perfect cobra, right? In a doga class, you’ll help her try more poses—and she’ll (hopefully) act as a prop for your own poses. But really doga is all about the pet-human bond. There’s often some doggy massage and acupressure involved. And while you’re in such close contact, you’ll have the opportunity to do a regular health check, feeling for any lumps beneath her fur.

Active fetch

You throw the ball and your pup goes bounding after it. But who says you have to just stand there? While he’s retrieving, bust out some muscle-building moves like crunches, lunges, squats, and more—until you’re both panting and worn out. Better yet, race him for the ball and squeeze in some sprints. Fetch can be a game you play, too.

Soccer

Believe it or not, some dogs love soccer—especially herding breeds like Border Collies and Australian Shepherds. Pet brands sell soccer-style balls (resistant to sharp teeth) in different sizes, like the 5-inch Orbee-Tuff ball from Planet Dog ($20; amazon.com). Once your boy learns to “kick” or “dribble” with his nose or paws, get your heart rates up with keep-away, or by punting the ball and racing for it.

Not a soccer fan? Try engaging him with other toys (like rope tugs) and activities (such as hide-and-seek). “Put yourself into kid mode, come up with a game, and show him,” Clough suggests. “He’ll most likely play it with you.”

Snowshoeing and cross-country skiing

Cold weather doesn’t mean you have to leave your dog cooped up. Some breeds—like Huskies and St. Bernards—have snow in their DNA, but many dogs enjoy a good romp in the white stuff. And whether you’re on snowshoes or skis, you’ll get in a low-impact, total-body workout. But the best part comes later, when you both curl up for a snooze by the fire.

If your dog gets chronic snow build-up between the pads on her paws, you can outfit her with booties. Brands like Ultra Paws (from $32; amazon.com) and Ruffwear ($90; amazon.com) make rugged footwear for winter walks.

You have the perfect training buddy. Why not work toward the goal of finishing a dog-friendly race? Events for four-pawed runners and their owners—such as the Fast and the Furry 8K in St. Paul, Minn., and the Rescue Me 5K9 in Irvine, Calif.—are held all over the country.

Don’t have a dog?

You can still work out with one. Call a local animal shelter and volunteer to take dogs out for walks or runs. Pound puppies are often desperate for exercise and attention, and your commitment to your new furry pal is great motivation to stick with a fitness routine. Best of all, as an anxious or unruly dog learns to walk on a leash and behave in public, you’ll be improving his chances of finding a forever home.

Millennials Now Have Jobs But Still Live With Their Parents

A Pew study finds the perplexing pattern has affected the housing industry

Halfway through this decade and nearly seven years after the Great Recession, Millennials are bouncing back—sort of.

In a new study released by Pew, researchers find that while Millennials—people who were born after 1981—are back to the pre-recession era unemployment levels of 7.7%, they haven’t been able to establish themselves as adults in other ways, like owning a home or getting married.

Richard Fry, an economist and lead author of the study, describes the situation as Millennials’ “failure to launch.” “I think the core is a bit of a puzzle with one clear consequence,” Fry told TIME. “There’s good news: the group that was hit the hardest—young adults—are now getting full-time jobs and earnings are tracking upwards. But the surprise is that with the recovery in the labor market, there are fewer young adults living independently.” (Living independently here is defined as heading one’s own household; in other words, having a place of your own, whether rented or owned.)

When the recession hit, young people moved back into their parents’ houses in droves, unemployed and without much hope for any future work. The thought process was that once the economy improved and Millennials returned to work, they’d scoot out of their parents’ lair.

But that hasn’t been the case, and economists aren’t sure why.

“Is it a good thing or a bad thing? I don’t know,” Fry said. He was also the author of a study three years ago that explored Millennials’ living and work situations using 2012 data, and he thought then that the explanation was clear. “My thought was, ‘Yeah, that’s true, the job market is crummy,'” he said. “My expectation was that as the labor market improves, more young people will strike out on their own, but that’s not the case.”

About 42.2 million 18-to-34-year-olds are living away from home this year; 2007 numbers were just above 2015’s independent young adult population, at 42.7 million. There are a few common characteristics of these Millennial householders: they are more likely to be women (72% compared to their male counterparts) and college-educated (86% of those with bachelor’s degrees were living independently, compared to 75% of the same peer group holding only a high school education). Fry attributes this gender difference to women entering permanent romantic relationships, whether marriage or cohabitation, earlier than men.

The consequences of Millennials still living at home go far beyond the household dynamics of adult children being at home with parents. Consider the housing sector, which has not recovered from the 2008 economic tumble. If more young adults had decided to take on home ownership, the economy may have improved more.

So how are Millennials most likely living if they’re not living at home? Probably with a roommate, or doubled up with a fellow adult who is not their spouse or partner, data suggests.

But having a roommate or living at home has real demographic effects for the future, Fry says. He goes back to two key facts: that people living independently tend to be better educated and that college-educated people tend to delay marriage or not marry at all (though even Millennials with a high school education are not getting married as much as they used to). That means that less educated Millennials are facing consequences in not just the job market, but beyond.

“There’s less sorting—that when the less educated do marry, they marry others who are also less educated,” he said. “That’s going to impact household income and economic wellbeing. That’s going to affect economic outcomes.”

See The Racy Stroller Ad Causing An Outrage

Photograph by Duy Vo for Vogue Netherlands and Bugaboo: This photo of model Ymre Stiekema for a Bugaboo shoot has caused an uproar among mothers after it was posted on social media.

Women say the image doesn't represent the reality of motherhood.

All it took was one photo posted on Facebook and Instagram, and stroller company Bugaboo is learning that hell hath no fury like busy mothers scorned.

The Dutch maker of high-end baby pushchairs posted a picture last week of 23-year-old Dutch model Ymre Stiekema running with her 2-year-old daughter. Stiekema was jogging with her $800 Bugaboo stroller, but was also dressed in a black-and-white bikini in a brand photoshoot with Vogue Netherlands. The kind of attire, you know, one normally sees on mothers who are taking a run in the park. “See how model and mum Ymre Stiekema stays fit and healthy with the Bugaboo Runner,” the caption reads.

That post has attracted over 590 comments (and counting) on the brand’s Facebook page, mostly from mothers who feel the image is a tad unrealistic. “I’m not gonna cus [sic] her out for what she’s wearing but I have 2 children and if was running in this on the school run it would be because I’d forgotten to put my clothes on because we’re late for school, my 5 year old had fallen off his scooter, my 2 year old refuses to put his shoes on and there’s a good chance I haven’t had the time to do my bikini line. There’s an image to leave you with,” summarized one commenter.

Some have used sarcasm to get their point across on the realities of juggling motherhood and staying fit. “So thaaaat’s what I have been doing wrong….. I need to jog with my baby in a Bugaboo to get my beach body back. 😉 ,” said one commenter. “Do you get a personal trainer after birth when you by [sic] this?” said another.

To be fair, some commenters also recognize the nature of the marketing campaign by Bugaboo and their aims of publicizing their Bugaboo Runner, a stroller designed to be used while running. The company also released a statement to NBC Today defending the depiction of Stiekema. “We want to inspire moms and dads everywhere to explore the world with their families, while keeping up with an active and healthy lifestyle,” they said. “We believe that all parents should run free no matter where they are on their fitness journeys and what they choose to wear on their runs.”

The Science of Why You Crave Comfort Food

It's not just because these foods are tasty. It's because they make us feel less alone

In mid-July, I was visiting my hometown in Minnesota when I happened upon the unmistakable scent of something deep-fried. I was at a concert, and no matter how off-brand a dietary choice of corn dogs and cheese curds may be for a health writer, I went for it. How could I not? I spent two thoroughly enjoyable summers during college working at the Minnesota State Fair, and that experience continues to make corn-and-grease-dipped hot dogs not only appetizing but somehow irresistible, too.

Summer is the season for nostalgic eating: Hot days in the park call for a trip to the ice cream truck, concerts call for corn dogs, baseball games call for hotdogs and beer, ice-cold movie theaters call for popcorn. And it’s not just me. Researchers suggest that when we associate foods with happy memories, the effects are profound, impacting how good we think foods taste as well as how good those foods make us feel.

It makes intuitive sense that positive experiences with a given food could influence our craving for it later on, but recent research suggests something else is at play, too: comfort foods remind us of our social ties, which means they may help us feel less lonesome when we feel isolated. In a July 2015 study, Jordan Troisi, an assistant professor of psychology at Sewanee: The University of the South, and his colleagues found that people with strong relationships preferred the taste of comfort food when they experienced feelings of social isolation.

“Comfort food seems to be something people associate very significantly with close relationships,” says Troisi. “This probably comes about by individuals coming to associate a particular food item with members of their family, social gatherings, and people taking care of them, which is why we see a lot of comfort foods [that are] traditional meals or things had at a party.”

Of course, what counts as comfort food differs from person to person. When Troisi has asked people to write about an experience they’ve had with a comfort food, essays have ranged from soup to kimchi. “It’s not just that ice cream, for instance, is really tasty. It’s that someone has developed a really significant meaning behind the idea of ice cream due to their relationships with others, and that’s what is triggering this effect,” he says.

Even the smell of a meaningful dish can elicit feelings of belonging, some research suggests. In a February 2015 study, Virginia Commonwealth University researcher Chelsea Reid and her colleagues had 160 people smell 12 different scents, including apple pie, cotton candy and baby powder, and rate the extent to which each scent was familiar, arousing and autobiographically relevant, and the extent to which it elicited nostalgia. “Nostalgia can be evoked in different ways, but scents may be particularly likely to evoke nostalgia due to the strong link between scents and memory. The smell of pumpkin pie might bring all those holidays with family flooding back, or the smell of a familiar perfume might arouse memories with your partner,” says Reid.

Biologically speaking, scent and memory are closely tied. “Psychological research has demonstrated that smells are powerfully linked to memory, and to autobiographical memory in particular,” says Reid. “The olfactory bulb, which is involved in the sense of smell, is linked to areas in the brain associated with memory and emotional experiences.”

Humans have a fundamental need to belong, says Reid, and because nostalgia often centers around personal events involving people they care about, she sees the evocation of nostalgia as one way people can obtain a sense of belonging even when the people they are close to are not close by.

So while corn dogs in the summer may not be fine dining by any standard, for me, they trigger happy memories of summers long ago—and that’s a good thing. In moderation, of course.

With all the news reports—about everything from shark activity to “flesh-eating” bacteria—the ocean is getting a lot of nasty press this summer. But actually, it may be the sand that’s the ickier part of the beach, according to a new study in the journal Environmental Science & Technology. It harbors far more fecal bacteria (yes, that means poop) than the water.

“Beach goers should be aware of the health implications of contaminated beach sand, and should not assume that sand is always safe,” lead researcher Tao Yan, PhD, explained to Health.

Previous studies have shown that sand is actually grosser than the water; it often harbors 10 to 100 times more fecal bacteria than the water does, the study authors note.

For the most recent investigation, however, Yan and his team wanted to understand why.

In a lab, the researchers created a setup of three “microcosms” using samples from three Hawaii beaches, including sand and seawater normally found in those places, and then contaminated them with fecal bacteria commonly found on beaches. They then watched the samples to see how the bacteria populations changed over time. Ultimately they found that the decay process of harmful bacteria was much slower in beach sand than in the water, which might explain why sand seems to be more of a hotbed.

But can this bacteria really hurt you? A 2012 study in the journal Epidemiology suggests that yep, it’s possible dirty sand can make you sick. The researchers analyzed sand samples from two beaches (one in Alaska and one in Rhode Island) within two miles of waste-water treatment facilities. Then, they surveyed nearly 5,000 visitors to those beaches, and found that those who played in the sand or got buried in the sand were more likely to develop diarrhea, nausea, vomiting, or other GI upset in the weeks after their visit.

Still, researchers insist their findings are no reason to quit the beach altogether—just take the obvious precautions.

“The symptoms we observed are usually mild and should not deter people from enjoying the beach, but they should consider washing their hands or using a hand sanitizer after playing in the sand or water,” senior author Timothy Wade, PhD, said in a U.S. Environmental Protection Agency press release.

Also, rinsing off in the public access showers ASAP, and following that with a good shower at home after a day at the beach might not be a bad idea either. Just sayin’.

You Asked: Is It Bad to Hold in a Sneeze?

Pulled muscles and perforated eardrums are a couple of the calamities that could befall a sneeze suppressor.

Spend some time reading medical case studies—a great way to ruin a pleasant morning, by the way—and you’ll be shocked at the unlikely ways people manage to hurt themselves. Focus on sneeze-related accidents, and you’ll notice a trend: Bad things happen when people hold in their sneezes. A fractured larynx, acute cervical pain and facial nerve injuries are just a few of the documented mishaps caused by a stifled achoo.

“I’ve seen patients with a ruptured eardrum or pulled back muscles, and you hear about cracked ribs,” says Dr. Michael Benninger, an otolaryngologist—that’s an ear, nose and throat doctor—and chairman of the Head and Neck Institute at Cleveland Clinic.

While sneezes (and the schnozes that expel them) come in many sizes, a whopper sneeze can blast air out of your nose at 500 miles per hour, Benninger says. If you redirect that force inward, your suppressed sneeze can send waves of force rippling through your head and body.

Usually that’s not a big deal. After all, most of us have bottled a sneeze here or there without issue. But Benninger says a preexisting musculoskeletal injury or weakness, odd ear or throat physiology or some other anatomical quirk could lead to an adverse reaction to a held-in sneeze.

While such reactions are unlikely, Benninger says sneezes aren’t meant to be caged. “Sneezing probably cleanses the nose of irritants, viruses and those types of things,” he explains. He uses the word “probably” because there’s research to suggest sneezing might perform other functions, from signaling to people that you’re sick to resetting the homeostatic environment in your nose.

“I’ve read reports that people sneeze differently in different cultures—almost like a learned behavior,” he says. He adds that everything from your lung capacity to the structure of your face and nose can play a role in how forcefully you sneeze, and the potential of your sneeze to cause or exacerbate an injury.

His advice? Don’t hold in a sneeze. “If you feel one coming on and you want to stop it, rubbing your nose can help,” he says. For patients who may feel pain when sneezing—those who’ve recently undergone surgery or broken a bone—Benninger advises opening your mouth wide to minimize a sneeze’s strength. “It’s like forcing water through a pipe,” he says. “If the air can escape through your nose and mouth, that creates less pressure than forcing it through a smaller opening.”

Just make sure that when you sneeze, you’re doing it into the crook of your arm, not your hand. “We know sneezing can project smaller particles 10 to 12 feet, so it’s important to cover your mouth,” Benninger says. “But if you sneeze into your hand, everything you touch is going to be contagious.” Your clothes help absorb particles, and you probably won’t be touching much with the inside of your arm, he adds.

Is Halal Meat Healthier than Conventional Meat?

Halal refers to Muslim criteria for how food is raised, slaughtered and prepared. But do the requirements make the food healthier?

Correction appended, July 30

Denmark announced last year it would ban Halal and Kosher slaughtering practices. Halal meat is reared—and slaughtered—differently from conventional meat. But is it healthier?

Like kosher food, Halal food is guided by religious criteria that govern everything from how the animals destined to be eaten are fed and raised, to how they are slaughtered and prepared for consumption.

According to the Muslims in Dietetics and Nutrition, a member group of the Academy of Nutrition and Dietetics, Halal food can never contain pork or pork products (that includes gelatin and shortenings), or any alcohol. Rasheed Ahmed, founder and president of the Muslim Consumer Group (MCG), which both certifies Halal food and educates Muslims about different foods’ Halal status, says that to be truly Halal, how the animals are raised is taken into account. Animals must be fed vegetarian diets, which means that many chickens and cows raised on U.S. farms don’t qualify (some feed contains animal byproducts). Halal animals also can’t be treated with antibiotics or growth hormones, since the hormones may contain pork-based ingredients.

Halal animals must be slaughtered by a Muslim, who says a blessing, and by hand, not by machine (which is the way many chickens in the U.S. are killed). Once killed, the animal’s blood must drain completely, since Muslims who eat Halal do not consume the fresh blood of animals.

Ahmed admits that his criteria for certification are a bit stricter than others’; for example, MCG won’t certify fish if it’s farm-raised, since it’s not clear whether the fish was fed animal byproducts. Only wild-caught fish are Halal certified by MCG standards.

While some people believe that these criteria make Halal food healthier, Carol O’Neil, professor of nutrition and food sciences at Louisiana State University Agricultural Center, says that there simply aren’t studies showing that to be true. The U.S. Department of Agriculture, which serves as the reference for nutritional content of food, does not separate out Halal meat (or kosher meat, for that matter) from other meats in its nutritional information.

“It’s difficult to know if there are any kind of nutritional differences,” says O’Neil. “There are certainly no studies done looking at people who consume Halal meat to see if their cholesterol levels are different, or anything like that. We just don’t know.”

O’Neil does note, however, that Halal practices may be more humane for the animal, and therefore that may make a difference for some people. “Our religion does not allow us to put any pressure on the animals,” says Ahmed. “So we treat them as humanely as possible.”

Correction: The original version of this article misstated when the ban was enacted. It was in February of 2014.