It seems like there has never been a worse time for retail. Sears looks doomed, many other large retailers, including J.C. Penney and Macy’s, are closing stores, and even fashionable boutiques like BCBG Max Azria are filing for bankruptcy. CNN reports that store closings this year are already outpacing 2008, the worst year on record.

Yet take a broader view and things don’t look so bad. Apple’s stores have been a hit, and a number of digital companies, like Amazon, Warby Parker and Bonobos, have been opening brick-and-mortar locations. Sure, e-commerce is growing, but the Census Bureau estimates that it still accounts for less than 10% of total retail.

As Darrell Rigby explained in an article in Harvard Business Review, every 50 years retail goes through a major disruption. The rise of urban centers led to department stores. Automobiles created suburbs and shopping malls. Then category killers and discount stores challenged that status quo. Today, retail is in the process of being reinvented once again.

Yet the debate over who had an idea first is often beside the point. The truth is that innovation is never a single event. So a moment of epiphany matters far less than the work that comes after, which is often as uncertain as it is thankless. That’s what makes the difference between great innovators and “coulda-beens.”

We’re often told that innovation is about ideas, but that’s only partly true. If your idea is revolutionary enough, your biggest problem won’t be having it stolen, but getting anyone to take it seriously at all. That’s why the hardest part of innovation is not coming up with an idea, but getting it adopted. Talent will only take you so far; you also need the grit to stick with it.

In 1795, to fuel his zeal for conquest, Napoleon offered 12,000 francs to anyone who could come up with a way to preserve food to sustain his army on long campaigns. Inspired by the challenge, as well as the prize money, an unknown confectioner named Nicolas Appert invented the canning methods that are still largely in use today.

The most recent award, the Qualcomm Tricorder XPRIZE, based on the famous Star Trek device, challenged teams to create a mobile device that can “diagnose patients better than or equal to a panel of board certified physicians.” The winners were a team of middle-aged siblings, who got their start as kids watching science fiction and tinkering together.

In 2014, Thomas Royen, a retired German statistician, solved a puzzle that had stumped mathematicians for decades. Called the Gaussian correlation inequality, it involves the probability of a random point lying in overlapping shapes and is one of those seemingly simple mathematical ideas that is devilishly hard to figure out.
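For readers who want the precise claim behind that description: stated formally, the inequality says that for a centered Gaussian distribution in any number of dimensions, and any two convex regions that are symmetric about the origin, the probability of a random point landing in both regions is at least the product of the individual probabilities (a sketch of the standard statement, using conventional notation):

```latex
% Gaussian correlation inequality (proved by Royen, 2014):
% for any centered Gaussian measure \mu on \mathbb{R}^n and any
% convex sets K, L \subseteq \mathbb{R}^n symmetric about the origin,
\mu(K \cap L) \;\ge\; \mu(K)\,\mu(L)
```

In other words, membership in the two overlapping shapes is positively correlated, which is exactly the "seemingly simple" claim that resisted proof for decades.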

Yet the mathematics community was skeptical. Royen published his proof, which was not formatted in the standard way, in an obscure journal on whose editorial board he sat. So many mathematicians simply assumed that Royen’s work was just another of the many false proofs that had emerged over the years.

Eventually, two other mathematicians restructured Royen’s paper and published it on arXiv.org, and his work was recognized, but it could easily have been lost for another generation. “I am not so talented for ‘networking’ and many contacts,” Royen said dismissively, but his indifference points to a larger problem. Innovating is not just about having ideas, but about communicating them.

In The One Percent Doctrine, veteran journalist Ron Suskind described how former Vice President Dick Cheney, when told that an unlikely but dire threat had merely a one percent chance of happening, argued that, in terms of response, it should be treated as a certainty.

The idea has a certain logic to it. Clearly, if the potential impact of an event is consequential enough, then it needs to be addressed whether it is likely to happen or not. After all, unlikely things happen all the time. However, nobody’s resources are unlimited. So you need to balance the need to prepare for low probability events with the need to address more likely ones.

That, in essence, is the dilemma every business finds itself in. We spend most of our time working on the regular stuff, even though, as Nassim Taleb explained in The Black Swan, even very low probability events can have an enormous impact. Fortunately, today technology is transforming our ability to unlock opportunity even in the most unlikely places.

When we think of Albert Einstein, we inevitably conjure up images of the icon rather than the man. We see Einstein with his wild hair and his tongue sticking out or Einstein as a playful old man, riding a bicycle. We remember his cheerful confidence and his easy comfort with his own genius. He wasn’t always that way.

The younger Einstein, the one who actually came up with the ideas that established his place in history rather than the world famous scientist he became, was far different. Reeling from chronic unemployment and a troubled marriage, he was working as an obscure clerk in a patent office when he changed the world.

Yet we can learn far more from that awkward young man than we can from the icon. While the older Einstein was, as Robert Oppenheimer put it, “a landmark, not a beacon,” the early Einstein was a creative force who transformed how we see our universe. So don’t be fooled by the witty myth in the tattered sweater. Here are four lessons the real genius can still teach us.

In the process of researching my book, Mapping Innovation, I talked to dozens of successful innovators, from world class scientists seeking to cure cancer and create new computing architectures, to senior executives at large corporations and entrepreneurs at startups. It was a pretty diverse group.

One of the underlying premises of the book is that there is no one “true” path to innovation, so I expected to see a variety of approaches and that’s indeed what I found. Some of the people I talked to were slow and deliberate, spending years or even decades on a difficult problem. Others were fast and agile, iterating and pivoting toward a viable solution.

However, I also noticed that some remarkably consistent themes emerged. Over time, it became clear that while the people I talked to were vastly different in background, training, personality type and method, they tended to have four attributes in common. While none of these will make you a great innovator on its own, you are unlikely to innovate without them.

The principles of running a business are fairly straightforward. You create clear objectives, achieve them efficiently and try to get better as you go. Business school professors have fancy names for this stuff, like “strategic DNA,” “core competencies” and “continuous improvement,” but in a nutshell all that stuff means is that you try to do things better, faster and cheaper.

The problem comes when you find yourself running a square-peg business in a round-hole world. When that happens, following traditional best practices will only result in getting better and better at doing things people want less and less. Round holes don’t care how good your square pegs are or how efficiently you can produce them.

Make no mistake: eventually, every business finds itself in a round-hole world. That’s why good companies fail. Not because they suddenly become stupid or lazy, but because the world changes and they lose relevance. Then the same practices that once led to success begin to lead to failure. We need to learn to prepare for a future we cannot yet see.

Many people think that Charles Darwin came up with the idea of evolution. He didn’t. In fact, by the time he hit the world stage, many people already believed in evolution and there were already a number of theories, such as those of Jean-Baptiste Lamarck, that sought to explain it. Darwin was merely the first to come up with a workable hypothesis.

Clearly, Darwin is one of the most influential scientists who ever lived. Today, more than 150 years after he first published On the Origin of Species, his theory remains one of the most essential and pervasive scientific tools we have. Yet it is not only the product of his work that’s valuable; Darwin’s innovation process is something that we can all still learn from.

In 1991, Linus Torvalds released the Linux kernel on the Internet and invited anyone to download, use and modify it. In an amazingly short amount of time, a community built up around Torvalds’ initial code, and its contributions transformed Linux into an operating system that rivaled those of even corporate giants like Microsoft.

Even now, it seems a somewhat unlikely story that such a fledgling effort could have such a transformational impact. Yet today, open communities have become so pervasive that the term “proprietary,” to a large extent, just means the stuff we build on top of open source software. And we’re just beginning to scratch the surface.

Today, we’re entering a new era, where open platforms are going beyond just software and starting to take hold in everything from scientific research to manufacturing. In fact, as our ability to connect to ecosystems of talent, technology and information continues to increase exponentially, the solution to many tough problems is becoming more social than technical.