
Polaroids and Helicopters: Critical Innovations From 1941-1950

Today's leading-edge technology is headed straight for tomorrow's junk pile, but that doesn't make it any less awesome. Everyone loves the latest and greatest.

Sometimes, though, something truly revolutionary cuts through the clutter and fundamentally changes the game. And with that in mind, Wired is looking back over 12 decades to highlight the 12 most innovative people, places and things of their day. From the first transatlantic radio transmissions to cellphones, from vacuum tubes to microprocessors, we'll run down the most important advancements in technology, science, sports and more.

This week's installment takes us back to 1941-1950, when Orwell released his frighteningly accurate dystopia, nuclear bombs changed human history forever and photography finally became instant.

We don't expect you to agree with all of our picks, or even some of them. That's fine. Tell us what you think we've missed and we'll publish your list later.

1942: The Manhattan Project (Science)

On June 28, 1941, months before the United States officially entered World War II, President Roosevelt signed Executive Order 8807, creating the Office of Scientific Research and Development. This agency was tasked with coordinating American military scientific research, including the secret “S-1 Uranium Committee” that would later evolve into the Manhattan Project.

At the start of the war, in 1939, Albert Einstein, Leo Szilard, and other physicists sent Roosevelt a letter that outlined a German project to harness atomic physics and create bombs of unprecedented destruction. The U.S. quickly went about locating and securing sources of the radioactive element uranium and fast-tracking a program to unlock its power. The S-1 Committee produced reports outlining how this material could be enriched to produce an atomic bomb.

By 1942, the Manhattan Project was formally established. The enormous undertaking would go on to employ 130,000 people and cost roughly $2 billion (about $25.8 billion in modern terms) so that the U.S. could create and test the world’s first atomic weapons. Reactors and enrichment plants produced enriched uranium as well as the newly discovered plutonium to fuel the powerful bombs. In 1945, the U.S. Army detonated the first nuclear device during the Trinity test in New Mexico. Physicist J. Robert Oppenheimer – known as the “father of the atomic bomb” – said the test brought to mind words from the Bhagavad Gita: “Now I am become Death, the destroyer of worlds.”

Only two atomic bombs, Little Boy and Fat Man, were used in military action during the war. They rained fiery destruction upon the Japanese cities of Hiroshima and Nagasaki, killing hundreds of thousands. The world saw first-hand the potential for atomic devastation and lived for decades under the shadow of potential nuclear war between the United States and the Soviet Union. Nuclear physics, whether harnessed in power plants or in weapons, has since been a constant reminder that science has the potential both for good and for evil.

One of a sequence of photos taken six miles from the first nuclear bomb explosion site. Image: Library of Congress

1949: George Orwell's Nineteen Eighty-Four (Entertainment)

With the 1949 publication of dystopian novel Nineteen Eighty-Four, George Orwell peered into humanity's future and saw something utterly disturbing. Unfortunately, his dark fiction proved prescient: With its chilling vision of constant government surveillance, effective propaganda and perpetual war, Orwell's book described a horrifying world that looks shockingly like Earth in the 21st century.

From Big Brother and thoughtcrimes to the Newspeak language used to limit free thinking, the British author delineated the basic template later used by so many science fiction writers and filmmakers to portray government gone terribly wrong. The fact that we live today in a world in which many of his concepts have been woven into the basic fabric of life -- well, it's downright Orwellian.

1948: The First Polaroid Camera (Photography)

Before there was Instagram, there was Polaroid. The company’s name has become synonymous with instant photography, even though it stemmed from founder Edwin H. Land’s Harvard research in polarization (before cameras, Polaroid was known for its sunglasses).

In 1948 the company launched the Land Camera. The landmark invention developed film on the spot by pressing a negative onto paper with rollers. The image was transferred via a developing reagent pressed between the paper and film. The paper sandwich then came out the back and the photographer peeled away the negative to expose the print. The camera sold 900,000 units between 1948 and 1953.

According to the camera’s origin myth, the idea came to Land when he was vacationing in Mexico with his daughter and she asked why she couldn’t see the photo immediately.

Much like Instagram, the Land Camera was panned as not being “real” photography, an amateur’s toy. To combat this impression, Land put Ansel Adams on retainer, and the photo legend remained an advocate and adviser to the company until he retired.

Land himself was a visionary akin to Steve Jobs in the way he designed beautiful products that people loved (though Land asked Jobs to stop comparing them in interviews because unlike Jobs, Land knew the science behind his products).

The Polaroid Model 95 went on sale in 1948, outselling Edwin Land’s optimistic projections. Photo by David Kim from the book Instant: The Story of Polaroid.

1947: The Computer Bug (Computers)

In the '40s, the computer evolved from idea to electrical reality. During the war, in an effort to crack German cryptography, the British built the Colossus, the first programmable computer that was both electronic and digital. Here in the States, at the University of Pennsylvania, J. Presper Eckert and John Mauchly created the ENIAC, an 1,800-square-foot machine packed with nearly 18,000 vacuum tubes. And at Harvard, researchers unveiled the Mark I, aka the IBM Automatic Sequence Controlled Calculator.

We hold a special place in our Wired hearts for all of these machines. But if we’re forced to choose one seminal moment from the 1940s, we can’t help but pick the arrival of the world’s first computer bug.

The bug is one of computing’s great metaphors. But in the beginning, it wasn’t a metaphor. It was the real thing.

At Harvard, in the late 1940s, the Mark I was succeeded by two other machines, known -- appropriately enough -- as the Mark II and the Mark III. One fateful day, the Mark II stopped working, and an operator named Bill Burke traced the glitch to a moth stuck in one of the machine's relays. The term bug had been used in the past with other devices, but Grace Hopper -- the U.S. Naval officer who was one of the Mark I’s first programmers -- took the word and ran with it. When she wrote up the incident in her log, she included the moth.

“First actual case of bug being found," the log read. And computing would never be the same.

1948: The First Videogame (Games)

Issued in 1948, the patent for a “Cathode-Ray Tube Amusement Device” covers what is currently thought to be the first electronic videogame. It was invented at DuMont Laboratories, a company critical to the origin of the television business: Its founder Allen B. DuMont had made considerable improvements to cathode ray tube (CRT) display technology, making consumer TV sets commercially viable. It didn’t take long for DuMont engineers Thomas T. Goldsmith and Estle Ray Mann to dream up a more interactive use for CRTs.

The game played by their device was, naturally, a first-person shooter. The graphics were composed of a single dot -- the beam of the cathode ray gun -- which the player could move across the screen. Certain areas of the screen were designated as targets, but these would be visible to the player only by means of an acetate overlay printed with translucent drawings of airplanes. If the player scored a hit, the beam would become defocused, simulating an explosion.
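The gameplay loop is simple enough to sketch in a few lines of Python (purely illustrative: the target coordinates, hit radius and function names here are invented, and the original device was analog circuitry, not software). The idea is the overlay hit test: a "hit" registers when the player's dot lands close enough to where an airplane is printed on the acetate.

```python
# Illustrative sketch of the Amusement Device's hit test (not the patent's
# actual circuitry). Coordinates and radius are made-up values.

TARGETS = [(30, 40), (70, 20)]  # hypothetical overlay airplane positions
HIT_RADIUS = 5                  # how close the dot must get to score

def check_hit(dot_x, dot_y):
    """Return True if the player's dot is within HIT_RADIUS of any target,
    i.e. the moment the beam would be defocused to simulate an explosion."""
    return any(abs(dot_x - tx) <= HIT_RADIUS and abs(dot_y - ty) <= HIT_RADIUS
               for tx, ty in TARGETS)

print(check_hit(32, 42))  # True: within 5 units of the first airplane
print(check_hit(50, 50))  # False: open sky
```

Since the screen itself drew nothing but the dot, all the "graphics" lived on the printed overlay, which is why the hit regions could be hard-wired rather than rendered.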

DuMont never manufactured or marketed the Amusement Device, and the idea of using a CRT to play interactive games wouldn’t catch on for another quarter-century.

1944: The Guided Missile (War)

London in the 1940s was a city under siege from the skies. Most nights would be peppered with the ominous sound of air raid alarms as the Luftwaffe’s bombers approached. This was so often the case that many Londoners took to heading straight to an underground station for a decent night’s sleep, rather than suffer an interruption in their own bed, followed by a mad dash to the bomb shelter.

This period – their “finest hour” to quote Churchill – is worn by today’s London as a badge of pride. Plaques mark where the first bombs fell, and prominent buildings such as the Victoria and Albert Museum chose not to repair shell damage and the scars are still visible today.

Despite the havoc and fear that the blitz wreaked on London, it was still a hell of a risky job to be a German pilot. Anti-aircraft guns and fighter planes were waiting to defend the city.

That’s why the Germans came up with the Doodlebug. Don’t be fooled by its strangely peppy name. The V1 (as it was more formally known) was the world’s first guided missile, and it brought Londoners a new kind of hell. The Germans developed it and began to deploy it against the British capital in 1944. The guidance system was pretty basic, but a large part of the V1’s deadliness lay in the simplicity of its design. A gyroscope kept the missile on its preset heading, and an airscrew in the nose kept track of how far it had flown. When the missile reached the correct distance, the engine cut out and the V1 would eerily plummet down onto its target.
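That distance-counting cutoff can be sketched in a few lines of code (a loose illustration only: the numbers and names are invented, and the real V1 counted airscrew revolutions with a mechanical odometer, not software).

```python
# Illustrative sketch of the V1's cutoff logic: an odometer driven by the
# nose airscrew accumulates distance until a preset value is reached,
# then the engine cuts and the missile dives. Values are hypothetical.

def fly_v1(target_distance_km, km_per_tick=0.5):
    """Simulate the distance fuse: advance the odometer one airscrew
    'tick' at a time until it reaches the preset, then cut the engine."""
    distance = 0.0
    while distance < target_distance_km:
        distance += km_per_tick  # each airscrew revolution advances the count
    return distance  # engine cuts out here; the V1 dives on its target

print(fly_v1(200))  # engine cut after 200.0 km flown
```

The elegance (and the terror) is that nothing in the loop depends on seeing the target: the weapon simply flies its heading until the counter runs out.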

The V1 was so successful in maintaining the Blitz effort at minimal risk to German airmen that after WWII, militaries around the world pounced on the missile concept and developed it further, coming up with ever more sophisticated guidance systems like the infrared and heat-seeking missiles of today.

1945: The House Committee on Un-American Activities (Security)

In the 1940s, communism was seen as one of the nation's biggest threats. In response, the House Committee on Un-American Activities became a permanent committee of nine House members in 1945. Its mission was to investigate subversion and propaganda that attacked "the form of government guaranteed by our Constitution."

By 1959, however, the committee was losing its prestige. Former President Harry Truman declared it the "most un-American thing in the country today."

Along the way, the committee's investigations led to the conviction of Alger Hiss, a former State Department official who helped found the United Nations, on perjury charges stemming from espionage allegations. The refusal of the so-called "Hollywood Ten" in 1947 to testify before the committee during a Communist witch hunt resulted in contempt of Congress charges and the blacklisting of these screenwriters, actors, directors and musicians by the Motion Picture Association of America.

1940s: Materials Rationing (Design)

In the 1940s, the U.S. and other countries started rationing goods they felt could be scarce during the war. It meant a system of ration stamps and an enforced austerity, but it also meant the world had to explore new materials and new ways of using old ones.

The U.S. rationed rubber; Silly Putty was born. The U.S. rationed silk; nylon use exploded in ladies’ stockings. In Britain, steel rationing continued into the '50s, and a car company called Rover invented an off-road vehicle made with an aluminum body.

In the WWII time of plowshares-to-swords, even food required some creative solutions, like replacing rationed white sugar with honey. The war also led to a strong recycling initiative, especially for aluminum and tin cans, scrap metal, and rubber.

Printing United States Office of Price Administration war ration book number 2. Image: Library of Congress

1940s: The Helicopter (Vehicles)

In April of 1944 a teenage boy was stranded on a sandbar in Jamaica Bay not far from where New York's John F. Kennedy Airport sits today. The tide was rising and the risk posed by the cold water forced the Coast Guard to act fast. Instead of using a boat, the rescuers decided to use what would become one of their most important search and rescue tools in the coming decades: a helicopter. Plucking the young boy from the sandbar marked the first time a helicopter had been used for rescue. Later that same month, another Sikorsky R-4 helicopter would rescue injured soldiers from behind enemy lines in Burma. The helicopter was quickly proving its worth.

The idea of a vertical-lift flying machine dates back to at least Leonardo da Vinci's famous sketch of an "aerial screw." Ideas progressed through the 18th and 19th centuries, and shortly after the Wright brothers' first flight, there were a few reports of short vertical hops in 1907. In the 1920s Argentine engineer Raúl Pateras-Pescara pioneered the use of cyclic pitch control, allowing the blades of the helicopter to change pitch as they swept around, with the rotary wing steering as well as providing lift. Several helicopter designs would follow until Igor Sikorsky designed, built and flew his VS-300 untethered on May 13, 1940. The helicopter used the single main rotor and tail rotor design that would pave the way for the R-4 and most of the helicopters to follow.

1947: The Microwave (Gadgets)

The microwave didn't just transform the kitchen; for many a college student and code monkey, a microwave is a kitchen. But imagine a world without the microwave, where it takes 45 minutes to heat up a Hot Pocket. Fortunately, you don't live in that world, thanks to an accidental discovery that allows you to feed yourself warm food without ever having to crack open a cookbook or even learn how to turn on a stove.

In 1945 Percy Spencer, an engineer at defense contractor Raytheon, noticed that his Mr. Goodbar candy bar melted in his pocket while he was working on a high-powered microwave radar transmitter. His chocolatey loss was our gain, as Spencer decided to investigate the heating properties of microwave beams. The first thing he heated on purpose was corn, which popped, thus introducing the world to microwave popcorn. The second was an egg that exploded in the face of one of Spencer's colleagues. (They hadn't yet invented the microwave oven door.)

The first commercially available Raytheon Radarange went on sale in 1947, but at nearly 6 feet tall and 750 pounds, and with a price tag of $5,000, it was still a long way from American homes. The first public taste of microwave-heated food was from a hot dog vending machine placed in Grand Central Station that same year. The microwave oven wasn't a commercial success until the 1970s, thanks to the falling prices of microprocessors and improvements in microwave technology.

1945: Wesley "Branch" Rickey Signs Jackie Robinson (Sports)

Everyone knows Jackie Robinson. But only hard-core baseball fans know Wesley "Branch" Rickey, the man who hired him. If that were Rickey's only claim to fame, he'd be a remarkable figure. But Rickey revolutionized the game in many ways. Aside from signing Robinson and, later, drafting Roberto Clemente (one of the game's first Latino stars), Rickey was instrumental in developing baseball's farm system and introducing innovations ranging from batting helmets to statistical analysis.

It's odd that, given his visionary innovations, Rickey had a hit-or-miss (mostly miss) career playing for various teams just after the turn of the century. He quit playing ball and went back to school, earned a law degree while coaching at Michigan and then got a job managing the St. Louis Browns (later the Baltimore Orioles). After a stint in the Army during World War I, he left the Browns for the Cardinals.

Rickey was a mediocre manager but a big thinker. He inaugurated the farm team system when he convinced Cardinals owner Sam Breadon to buy stock in minor league teams in Houston and Fort Smith, Arkansas, ensuring the Cards had their pick of young talent. Later, as president and general manager of the Brooklyn Dodgers from 1943 to 1950, Rickey pioneered the use of statistical analysis when he hired statistician Allan Roth. With Roth on board, Rickey championed the idea that on-base percentages were more important than batting averages.
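Roth's core argument is easy to demonstrate with the standard formulas (the two hitters below are invented for illustration): batting average ignores walks entirely, while on-base percentage credits every way of reaching base.

```python
# Batting average vs. on-base percentage, using the standard formulas.
# The two hitters' stat lines are made up for illustration.

def batting_average(hits, at_bats):
    """Hits divided by official at-bats."""
    return hits / at_bats

def on_base_percentage(hits, walks, hbp, at_bats, sac_flies):
    """Times on base (hits + walks + hit-by-pitch) divided by
    plate appearances (at-bats + walks + hit-by-pitch + sac flies)."""
    return (hits + walks + hbp) / (at_bats + walks + hbp + sac_flies)

# Two hypothetical hitters with identical .300 batting averages,
# but hitter A draws far more walks than hitter B:
a = {"hits": 150, "walks": 90, "hbp": 5, "at_bats": 500, "sac_flies": 5}
b = {"hits": 150, "walks": 20, "hbp": 0, "at_bats": 500, "sac_flies": 5}

print(round(batting_average(a["hits"], a["at_bats"]), 3))  # 0.3 (same for both)
print(round(on_base_percentage(**a), 3))  # 0.408
print(round(on_base_percentage(**b), 3))  # 0.324
```

By batting average the two hitters look identical; by OBP, hitter A reaches base far more often, which is exactly the kind of value Rickey and Roth argued the traditional stat was hiding.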

Beyond the numbers, Rickey promoted the use of batting helmets, batting cages and pitching machines, and even created the first full-time spring training facility, in Vero Beach, Florida.

Still, Rickey is perhaps best known for signing Robinson to a minor league contract with the Montreal Royals, a Dodgers farm team, in late 1945. Robinson played with the Royals through the '46 season before the Dodgers called him up to the Show in 1947. It was a monumental moment, but only one of many reasons Rickey was elected to the Baseball Hall of Fame in 1967.

John Bardeen, William Shockley and Walter Brattain, the inventors of the transistor, 1948 -- the birth of Silicon Valley.

1947: The Start of Silicon Valley

The concept of the transistor had been floated for decades, but it was a team at Bell Labs that made the transistor a reality using gold contacts that were pressed to a “germanium surface that had been anodized to 90 volts, electrolyte washed off in H2O and then had some gold spots evaporated on it,” according to the laboratory notebook of Walter Houser Brattain in an entry dated December 16, 1947. “Both gold contacts to the surface rectified nicely.” In other words, it worked.

Brattain and his colleagues at Bell Labs, John Bardeen and William Bradford Shockley, would be awarded the Nobel Prize in physics in 1956 for their efforts. But in addition to furthering scientific knowledge, the transistor they invented set in motion the entire technology industry as we know it.

Shockley left Bell Labs and headed to Mountain View, California, to found Shockley Semiconductor. From Shockley’s startup came Fairchild Semiconductor and ultimately Intel, also birthing the venture capital industry along the way. Today, Intel crams billions of transistors onto chips, and they are found in everything from radios and computers to cellphones and screens of every description. If there is an obvious pick for the most important invention of the 20th century, the transistor is it.

Since 2007, Wired.com’s This Day In Tech blog has reflected on important and entertaining events in the history of science and innovation, pursuing them chronologically for each day of the year. Hundreds of these essays have now been collected into a trivia book, Mad Science: Einstein’s Fridge, Dewar’s Flask, Mach’s Speed and 362 Other Inventions and Discoveries that Made Our World. It goes on sale Nov. 13, and is available for pre-order today at Amazon, Barnes and Noble and other online book stores.