Wolfram brings computational science to the table: he has posited that the earth and the universe can be understood as a computer program, one we may eventually be able to alter significantly as technology advances.

Wolfram is a genius, I'm just not clear what "advancements" he's brought to computational science or bit-string physics. I mean, that "universe as a computer" stuff is all still theory right now, right?

Call me cynical, but I fear this will result in more Futurism: people crossing into other fields of expertise, reading papers, and then holding them up as the holy grail for undoing aging and death. Sure, it's amusing, but at best I think this is going to be a lot of smart people pounding square pegs into round holes all day long. At worst it's going to sidetrack people from doing real work into daydreaming about interdisciplinary possibilities (as some of the Macy Conferences did for Cybernetics).

Welp, better settle in and prepare for the crazy Kurzweil stories to fire back up!

Remember that he wrote A New Kind of Science at night, while continuing to run a successful multi-million-dollar software enterprise during the day. The peer-review jury is still partly out on NKS, but his highly original ideas continue to thrive and spur further research. His deep insight that apparent chaos emerges from ordered deterministic processes (e.g. cellular automata) across all of nature is nothing short of astounding. Focused he is. Obsessive and a bit eccentric, certainly. But a nut? Not by a long shot. Same goes for Ray K.
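Wolfram's flagship example of a simple deterministic rule producing chaotic-looking behavior is the elementary cellular automaton Rule 30. A minimal sketch in Python (the grid width and step count here are arbitrary choices for display):

```python
# Elementary cellular automaton Rule 30: each cell's next state depends
# only on its left neighbor, itself, and its right neighbor. The rule
# number 30 = 0b00011110 encodes the output bit for each of the 8
# possible 3-cell neighborhoods (111 -> 0, 110 -> 0, ..., 000 -> 0).

RULE = 30

def step(cells):
    """Advance one generation, treating cells outside the edges as 0."""
    padded = [0] + cells + [0]
    return [
        (RULE >> (padded[i - 1] * 4 + padded[i] * 2 + padded[i + 1])) & 1
        for i in range(1, len(padded) - 1)
    ]

# Start from a single live cell; the pattern that unfolds (especially
# its center column) looks random despite the fully deterministic rule.
width = 31
cells = [0] * width
cells[width // 2] = 1
for _ in range(15):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```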

Thinking about a universe simulator that predicts the future is fun, but it should be impossible on information-theoretic grounds. The smallest space you could use to record the position, rotation, and composition of an atom would be at least the size of an atom; even if your machine runs at the quark scale, you need to record information for every quark in the universe. Your machine could never achieve a greater bit "resolution" than the universe it occupies, so you could only ever simulate a portion of the known universe. To simulate the entire universe, you would need a computer at least the size of the universe (if not much larger), i.e. the universe itself. You cannot fit a perfect copy of the universe inside the universe, so short of somehow creating additional dimensions, you're SOL. That is, if the universe is indeed digital (as quantized particles would suggest). If instead everything is continuous with infinite resolution... there's a whole lot of questions to be answered.
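The counting argument above can be made concrete with rough numbers. The ~10^80 atoms figure is the usual estimate for the observable universe; the bits-per-atom count is an arbitrary illustrative assumption, but the conclusion holds for any value of it:

```python
# Back-of-envelope version of the argument: however few bits you assume
# per particle, the total storage needed scales with the particle count
# of the universe itself.
ATOMS_IN_OBSERVABLE_UNIVERSE = 10**80   # standard rough estimate
BITS_PER_ATOM = 100                     # illustrative assumption

bits_needed = ATOMS_IN_OBSERVABLE_UNIVERSE * BITS_PER_ATOM
print(f"~10^{len(str(bits_needed)) - 1} bits required")

# Even an ideal memory storing one bit per atom would itself need
# BITS_PER_ATOM times more atoms than the universe contains.
atoms_for_memory = bits_needed  # at the one-bit-per-atom limit
print(atoms_for_memory > ATOMS_IN_OBSERVABLE_UNIVERSE)
```

Whatever constants you plug in, the simulator's storage requirement exceeds the atom budget of the universe it sits in, which is the poster's point.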

Mathematica [wikipedia.org], used by grad students everywhere. The impact of this software is huge: grad students rely on it to visualize equations they otherwise wouldn't understand, and it has been a tremendous boon to computer scientists, astronomers, chemists, physicists, etc.

He also wrote some papers on particle physics. And then there's Wolfram-Alpha, which I use at least once a week.

NKS does not make a good case for Wolfram's genius, but rather for his arrogance and ignorance. It's good work from the point of view of being correct and being about a significant subject. It's not so good from the point of view of originality, nor is it a superior treatment of a known subject, nor even a novel approach. Wolfram really went gaga over cellular automata, but they've been well known ever since Conway's Game of Life popularized them in 1970, and they were studied well before that. He talks as if the subject had languished and his research singlehandedly revived interest in it. Perhaps so, among physicists. He also excuses his own failure to understand its significance as a consequence of its being presented as just a game. Obviously he didn't talk with any computer scientists before writing that book; he merely rediscovered what computer scientists have known for decades. Worse, he's not even the first physicist to have rediscovered computer science! That man and his arrogant physicist buddies need to get out of their bubble more often.

I've seen this kind of thing before, where the people at the top of a particular discipline start acting as if all other science is secondary, only an aspect of their chosen field. I saw that attitude toward Computer Science in professors and students of Electrical Engineering: they didn't get that algorithms were more than simple, trivial little lists of instructions for hooking up logic gates. Mathematicians also have a tendency to view CS as just a branch of math, and algorithms as something that can be expressed as "just" a series of formulas. It's like the view that a person is only a bag of water with a few other chemicals mixed in, or the "Big Iron" implication that a computer is only a lump of metal: both go over the top in overlooking the organization.
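For anyone who hasn't seen it, the Game of Life mentioned above is itself a cellular automaton with a two-clause rule. A minimal sketch in Python, using a set of live cells (the "blinker" pattern at the end is a standard period-2 oscillator):

```python
# Conway's Game of Life: a live cell survives with 2 or 3 live
# neighbors; a dead cell becomes live with exactly 3 live neighbors.
from collections import Counter

def life_step(live):
    """Advance one generation; `live` is a set of (x, y) cells."""
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        cell
        for cell, n in neighbor_counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# A "blinker" (three cells in a row) oscillates with period 2.
blinker = {(0, 1), (1, 1), (2, 1)}
print(life_step(life_step(blinker)) == blinker)
```

Two clauses of logic are enough to get gliders, oscillators, and full Turing-complete computation, which is exactly why computer scientists never stopped studying these systems.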

Wolfram's work illustrates that Computer Science should be a discipline of its own, on the same level as Math. The concept of the computer algorithm ranks with the mathematical formula in importance. You can't do any serious physics without advanced math. These days, you also need advanced computer science to do physics. His much vaunted NKS is in fact Computer Science.

It took genius to invent the wheel. In that sense, Wolfram is a genius. What does it take to avoid reinventing the wheel? Wisdom.