The Most Bonkers Scientific Theories (Almost) Nobody Believes Anymore

Look, I’m not here to tell you what to think. But you need to be grateful for science – all of it. Even the crazy stuff. Even for folks who could have sworn it was possible to turn iron into gold, and even for that time when a brilliant astronomer thought that there could be life on the sun. Because great leaps in human understanding are always surrounded by backward steps and dead ends. It has to be OK to fail. Spectacularly.

In this gallery we present to you the silliest of the silly scientific theories, some of which you’ll no doubt recognize because for whatever reason people refuse to give up on them. So remember: there are no dumb theories, just dumb reasons to keep believing in long-discredited ones.

Lambs That Grow Like Weeds – Literally

The ancient Greeks were the first to have the crazy idea that a lamb could grow right out of the ground, with a stem attached to its navel. Pliny the Elder later mentioned it, and Europeans picked up the idea again in the 14th century.

This is the exceedingly strange legend of the Vegetable Lamb of Tartary.

Now, these folks were well aware of where lambs came from. They were baby sheep that came out of mommy sheep. Or a stork dropped them off. Or whatever. But their story may have arisen from the first Western accounts of cotton plants, which an ancient Greek by the name of Megasthenes found in India, referring to them as “trees on which wool grows.”

Then boom, people start thinking that lambs can grow out of the ground. Said Sir John Mandeville, a sort of 14th century travel writer, of India: “There grew there a wonderful tree which bore tiny lambs on the ends of its branches. These branches were so pliable that they bent down to allow the lambs to feed when they are hungry.”

The allure of the Vegetable Lamb of Tartary continued into the late 1700s, when it was still debated by botanists. Those being experts in plants.

Phrenology: Like Reading a Crystal Ball, But With People’s Heads

They say you shouldn’t judge a book by its cover, but they never said anything about not judging a man by the bumps on his skull. So phrenologists went out and did just that.

By taking measurements of the skull, phrenology’s founder Franz Josef Gall argued around the turn of the 19th century, you could determine a person’s personality traits, propensities, and intelligence. This was because the region of the brain responsible for, say, self-esteem, would grow larger in a more confident person, consequently pushing out the skull. Phrenologists could feel a subject’s head for these abnormalities, a bit like a psychic reading a crystal ball, in the sense that both practices are absurd ways to relieve people of their money.

Aside from its many scientific problems, phrenology also had social consequences. The theory came about in the dark days of colonialism and wormed its way into Europeans’ judgments of the peoples they were subjugating. For instance, in Rwanda, the Belgians' phrenological assessments deemed the Tutsis more intelligent than the Hutus, helping to sow the seeds for a genocide that took place decades later. And these methods were of course broadened to deem Europeans inherently more intelligent than Africans.

But Franz Josef Gall’s theory that behaviors and thought processes are contained in certain parts of the brain turned out to have a grain of truth. Decades later, brain scans showed that many parts of the brain are indeed more active during specific activities. More recently, there's been a backlash against the kind of overly simplistic "neophrenology" that pins specific functions on specific brain regions. Exactly what job – or jobs – any given part of the brain does is a question that will keep neuroscientists scratching their heads for years.

Life on the Sun and, for That Matter, Life on Every Planet. Life for Everyone!

The sun is such a violent ball of energy that if you look at it from Earth, it'll burn your eyeballs right out of your head. But in the 1700s, when telescope-aided astronomy was still a young discipline, some thought the sun to be a regular old terrestrial body covered by a “resplendent surface.”

It was only a matter of time before someone came along and argued extensively that life could exist there. That dubious honor fell to the discoverer of Uranus, William Herschel. Sunspots, he said, were atmospheric openings providing glimpses of the sun’s surface, or perhaps mountains poking above the cloud layer. And according to Herschel, on that surface, like the surfaces of all the other planets, there existed life.

He wasn’t alone in his speculation of life elsewhere in the solar system. Herschel was working in a telescope-fueled era of intense interest in what creatures might populate other worlds. And it’s an interest that continues to this day. While creatures could not exist on the sun, except maybe the blargg from Super Mario, we’re currently exploring other seemingly wild theories, such as life forms inhabiting the ice-encased seas of Jupiter’s moon Europa.

A few years after Herschel put forth his theory, Thomas Young published a response in a collection called Natural Philosophy, in which he noted that, among other obstacles, the sun’s immense gravity would pose problems for life. Our understanding of the lifeless sun grew as the 1800s progressed, culminating in the early 20th century theory that the sun is a gigantic nuclear furnace.

Homeopathy: Medicine for People Who Failed Medical School, and Probably Middle School as Well

Here’s a fun little home experiment. Get a beer and dilute it with water to the point where the beer constitutes one part per 10,000. Then bump up the dilution until not a single molecule of beer exists in the solution. Now drink the “beer.” Do you feel buzzed?

Of course you don’t, because there’s no alcohol left. But the 18th century principles of homeopathy claimed that your water “remembers” the drug that it once mingled with. And it’s this principle that was once applied, totally ineffectively and therefore dangerously, to administering medicine. You see, practitioners of homeopathy believed that to cure a sick patient, you take a drug that causes similar symptoms in a healthy person and dilute it as much as possible. The more you dilute a drug, the more powerful it becomes for the patient. And various European governments bought into this, covering homeopathic treatments under their public healthcare programs.
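Just how empty do those remedies get? Here’s a quick back-of-the-envelope sketch in Python (my own illustration, not from any homeopathic text; the 0.4-mol figure, roughly the ethanol in a pint of 5% beer, is an assumption for the example). It counts how many serial 1:100 dilutions, homeopathy’s “C” scale, it takes before, statistically, not a single molecule of the original substance remains.

```python
# Back-of-the-envelope check (illustration only): how many serial
# 1:100 "C" dilutions until, on average, zero molecules of the
# original substance are left in the sample?

AVOGADRO = 6.022e23  # molecules per mole

def dilutions_until_empty(moles: float) -> int:
    """Smallest number of 1:100 dilution steps after which the
    expected count of surviving molecules drops below one."""
    molecules = moles * AVOGADRO
    steps = 0
    while molecules >= 1:
        molecules /= 100  # each "C" step keeps 1 part in 100
        steps += 1
    return steps

# A pint of 5% beer holds roughly 0.4 mol of ethanol (assumed figure).
print(dilutions_until_empty(0.4))  # prints 12
```

In other words, by the twelfth 1:100 dilution you are, on average, drinking pure water. Many homeopathic preparations are sold at 30C, a further factor of 10^36 beyond that point.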

Ha. I was kidding when I was talking in the past tense. People still believe this stuff. And some public healthcare programs in Europe, including in the U.K., still cover homeopathic treatments.

Alchemy: That’s Gold, Jerry! Gold!

Gold isn’t just about vanity. It has all manner of uses, from dentistry to electronics to monetary exchange. But long before we knew just what the stuff was truly capable of, the human fascination with gold helped kick off a scientific revolution. You see, we can thank the alchemists and their kooky gold-laden experiments for the ascendancy of chemistry in the 1600s.

Medieval alchemists placed so much value in gold that they worked tirelessly to somehow transform base metals, such as iron, into it. But to do so, they’d need to first discover the philosopher’s stone, the substance capable not only of initiating this transformation, but of bestowing immortality on man. Alchemy was about perfection – gold being the perfect element and immortality being pretty sweet as well.

And the discipline was serious business. Alchemists wrote in code, lest their successes escape the lab (none ever did, but not because they were coded). In 1601 an alchemist under the patronage of Friedrich I of Württemberg promised the duke he could extract gold from silver, which, it turns out, is impossible. So he faked it, got caught, and was hanged. Even Isaac Newton dabbled in the art, and by dabble I mean he wrote more about it than he did about physics, amassing thousands of pages of notes over three decades of research.

For millennia, alchemy involved decidedly unscientific things like magic and spirituality, yet the alchemists’ methods grew increasingly sophisticated. In the 17th century the silly superstitious aspects finally began to fall away, as chemistry, always a part of alchemy, emerged as a discipline in its own right. Which is just as well, since now we know exactly why Mentos and Diet Coke don’t get along.

Astrology: I Sure Hope Those Rovers Aren’t Monkeying Around on Mars and Messing Up My Horoscope

The millennia-old idea that the stars and planets have some sort of influence on earthly events developed independently all over the world, from Europe to Central America, and was considered a legitimate scientific discipline until only a few centuries ago.

But just because everyone else is doing it doesn’t mean you should too. Astrology is a pseudoscience, through and through. Never pay someone to give you an astrological reading, unless it comes with a free sandwich or something. And they can validate your parking. Then it may be worth it.

Yet astrology was once intertwined with astronomy. In fact, they were often indistinguishable until the 1600s, when the invention of the telescope set off a celestial revolution. Tracking the movements of planets and stars, while an ancient practice that didn’t necessarily involve divining your future, was supercharged with the introduction of the instrument, leaving astrology to descend further and further into the realm of pseudoscience. But it should be noted that astrology lent much to our understanding of the cosmos.

The Rain Follows the Plow Because the Plow Is Surprisingly Charismatic. Or Something.

A continent is a terrible thing to waste. Why would we Americans go through all the trouble of acquiring our arid western lands if we couldn’t make money off of them? Damn Nature and her stubbornness!

But what if we could give her a little boost?

As farmers marched west in the late 1800s, the traditionally dry territories they settled were suddenly gifted with increased rainfall. From this, some scientists concluded that the newly cultivated lands had caused the climate shift. The thinking went that broken soil absorbed more water, which then readily evaporated, leading to increased rainfall. And it was a theory that amassed a following in the scientific community.

Never an industry to miss an opportunity, the railroads happily disseminated the theory in their advertisements. Could such an ad campaign have had something to do with railroads owning a good amount of land out west? Let’s go ahead and say it probably did.

Well, it turned out the increased rainfall the farmers were enjoying was just weird weather. The climate soon reverted to aridity, and the theory that rain follows the plow was left high and dry. Yet modern climate studies suggest that increased vegetation and urbanization might indeed affect rainfall, though far from reliably and far from the scale of an entire region like the Midwest.

Spontaneous Generation: You Too Can Grow a Mouse From a Sweaty Shirt

Aristotle put forth the first comprehensive theory of spontaneous generation, the idea that a creature could arise out of an inanimate (or formerly animate) object, as a maggot “grows” from rotting meat. And it went unchallenged for 2,000 years, with a 17th century physician even proposing a home experiment to prove it: Drop a sweaty shirt – yes, a sweaty shirt – into an open-mouthed jar and add some wheat husks. Voila, the amalgamation has grown mice, which, it turns out, isn’t so much a matter of mice sprouting from the mixture as of mice being partial to wheat.

Enter that physician’s contemporary, Francesco Redi, who had a sneaking suspicion that maggots were actually the larvae of flies. To prove it, he laid out different flasks, each containing rotting meat. Some of these flasks he sealed, others he left open, and still others he covered with gauze, allowing airflow but restricting flies. As we would today expect, he found maggots only in the open flasks.

Then along came Louis Pasteur, of dairy-aisle fame, to put spontaneous generation down for good. He boiled meat broth in a flask, then heated the flask’s neck and bent it into an S shape, thus allowing air to enter, but trapping particulate matter and microbes in the curve. The broth grew no microorganisms, until Pasteur tipped the flask, flooding the curve and pulling the microbes into the solution. Life then proliferated.

Now considered to be a founder of microbiology, Pasteur had smashed a millennia-old assumption and in the process lent pivotal evidence to the understanding that disease is not due to miasma – or “bad air” caused by rotting flesh – but to microbes floating around us and, rudely, invading our personal space.

An Earth as Hollow as Rush Limbaugh’s Head

If you lived 10,000 years ago, you'd be forgiven if you peered down into a cave and assumed it went on forever. Ancient peoples had a certain penchant for hyperbole. Jules Verne went epic with the idea in Journey to the Center of the Earth, and nearly 200 years before him a scientific theory about a hollow world came from none other than astronomer Edmond Halley, after whom the famous comet is named.

Now, Halley was a damn good scientist working with as much data as was available to him in the 17th century, and he admitted to his readers that this was certainly a strange tale. He put forth a theory that is fantastically wrong, but he supported his arguments every step of the way, sometimes drawing on Isaac Newton’s Principia, and then addressed counterarguments. It was a milestone in theorizing about what can’t, and never will, be seen.

The hollow Earth was bunk, yes, but hilariously more on point 300 years ago than the flat-Earthers of today, who maintain a staggeringly vacuous wiki.