Wow. I'm absolutely amazed how someone could use EDUCATION as a reason to doubt someone's intelligence. You know, before I went back to grad school, I worked for nearly a decade developing Linux drivers, X11 drivers, designing chips, developing web sites, and among other things, writing parallel applications for real products used in the real world. So I also have plenty of practical experience.

But you probably wouldn't know much about practical experience either, being an uneducated toothless hick from the backwoods of the Appalachian mountains, living in a tattered wooden shack and using readin', 'ritin', and 'rithmetic books from the 1850's.

Ha! How do you feel now? You don't like it when the ad hominem attack comes in your direction, do you? You don't like it that I assume you're an idiot because you make ignorant comments, do you?

And really, I would love to see someone make a real, logical argument that explains why system-side knowledge of CPU resource allocation is "useless". Or why developing a tool that makes it more convenient for developers to use parallel resources is "useless". What do you consider to NOT be useless anyhow?

I can't help it. Sometimes I just HAVE to feed the trolls.

What area are you focusing on in grad school?

And, how has the transition been back after being gone so long? [I'm contemplating a masters or two and want to know]

I'm doing Computer Architecture. Actually, I started out doing AI, but by the time I finished the core courses, all of the people in the AI lab I wanted to join had graduated, and I found myself alone. So I fell back on my professional background and switched to architecture.

The transition wasn't terribly easy. I hadn't been in school in a little over 9 years. I knew how to work hard, but I had forgotten what it was like to study and take classes, so I had to redevelop my study habits. My first quarter back, I didn't have financial aid, so I worked 20 hours/week and went to school. Of the two courses I took, I got an A in one and a B+ in the other. The B+ was in distributed operating systems, which is my weakest area anyhow, so some of it was that I was rusty, and some because I'm not strong in that area.

Just because someone is a PhD doesn't mean they are totally clueless. Obviously. But I can understand why people have had it up to here with the Andrew Tanenbaums and other academics who seem to live in a theoretical world.

Ok, this I can totally get on board with. Many of the professors I know and most of the grad students have no "real world" experience. And you can really tell the difference between those who have experience and those who don't. Those without experience are often idealists who can be clueless about what it really takes to implement something practical, while those with industry experience tend to be more grounded and willing to compromise for the sake of tractability. Note that by the time a professor is in their 60's, the probability that they've been involved in one or more startups is quite high, so it's not as bad as you might think. There are a lot of old professors here.

I don't know if I'm a scientist in engineer's clothing or an engineer trying to pretend to be a scientist, but my industry experience has given me massive advantages over many of my colleagues. My engineering skills give me the tools to actually implement things that others would try to estimate, and as a consequence, my results are generally more precise and convincing in publication. (It turns out that a lot of reviewers for top CS conferences have significant industry experience and can be very demanding about authors proving that their idea is actually practical to implement.) My practical bias has also been a disadvantage in some circumstances where I find myself unfamiliar with the politics. I can fall back on general knowledge of psychology (egos, money, etc.), but there are peculiarities of CS that I still have to learn. Fortunately, my advisor is better at the politics, so we make a good team.

Now, keep in mind that in any field, the scientists are always decades (or more) ahead of the engineers. Consider what we KNOW about chemistry and physics versus what's actually been used to make practical products. It's a small fraction. Computer Science is absolutely no different. Parallel computing is now becoming a ubiquitous reality, but the computer scientists already explored the hell out of it, got bored of it, and moved on in the 90's. Today, Intel and AMD are only just beginning to implement primitive forms of the things that the academics came up with long ago.

It's important to recognize, though, that many, many things that are cool in theory are highly impractical to implement. Someone who can distinguish the two has a major advantage, and a great many academics can't.

Of course, there are some things that are SO impractical that even the academics don't bother. For instance, there's this really interesting O(N) algorithm for finding the median of an unsorted set of numbers. The algorithm is described in detail in textbooks, but hardly anyone ever implements it in practice. Moreover, the constant factor is so large that for most situations, applying an O(N log N) sort and then picking the middle element is not only much simpler but also significantly faster.
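For the curious, the algorithm in question is usually called median-of-medians (or BFPRT, after its authors). Here's a toy sketch in Python, which also illustrates the constant-factor overhead: every level of recursion re-sorts groups of five and makes a second recursive call just to pick a pivot.

```python
def select(arr, k):
    """Return the k-th smallest element of arr (0-indexed) in O(N) worst case."""
    if len(arr) <= 5:
        return sorted(arr)[k]
    # Median of medians: split into groups of 5, take each group's median,
    # then recursively find the median of those medians to use as the pivot.
    medians = [sorted(arr[i:i + 5])[len(arr[i:i + 5]) // 2]
               for i in range(0, len(arr), 5)]
    pivot = select(medians, len(medians) // 2)
    # Three-way partition around the pivot (handles duplicates).
    lows = [x for x in arr if x < pivot]
    pivots = [x for x in arr if x == pivot]
    highs = [x for x in arr if x > pivot]
    if k < len(lows):
        return select(lows, k)
    elif k < len(lows) + len(pivots):
        return pivot
    else:
        return select(highs, k - len(lows) - len(pivots))

def median(arr):
    """Median (lower of the two middles for even-length input)."""
    return select(list(arr), (len(arr) - 1) // 2)
```

The pivot is guaranteed to be greater than roughly 30% of the elements and less than another 30%, which is what bounds the recursion to O(N) overall; but the bookkeeping is exactly why a plain sort usually wins in practice.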

Anyhow, here's the deal about academics. You can never have too much knowledge. When I came back to school, I learned a hell of a lot of theory that I didn't know. And that knowledge has turned me into a better engineer. If I have 5 options to engineer a solution, I now know more about things like algorithmic complexity that will help me decide in advance that one is better than another, or perhaps that the problem is NP-complete and so I already know in advance that I need an AI solution. I was already good with multithreaded programming with pthreads, but studying distributed operating systems taught me a lot of approaches that I didn't know, ways to visualize solutions to problems, and the sorts of pitfalls (data races, etc.) that one can run into. In each of the areas where I was already good, I am now better.
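The data-race pitfall mentioned above is easy to demonstrate even outside of pthreads. Here's a minimal Python sketch: the increment is a read-modify-write sequence, so without the lock two threads can interleave and lose updates; the lock makes it atomic.

```python
import threading

counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        # "counter += 1" compiles to a load, an add, and a store. Without
        # the lock, a thread switch between the load and the store lets two
        # threads read the same old value and one update gets lost.
        with lock:
            counter += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# With the lock, all 4 * 100,000 increments survive.
```

Delete the `with lock:` line and the final count will typically come up short, and by a different amount on every run, which is exactly what makes these bugs so miserable to reproduce.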

Good practical industry experience can make a person very competent. Academic knowledge of theoretical computer science can make one insightful. Their combination can make a person brilliant, with a deeper understanding of the field combined with a stronger ability to actually build things that work. (I hope some day to become brilliant.)