If you are new to Computer Science, please read our FAQ before posting. Your question might have already been answered!

Do not post questions such as "should I study computer science?", "how do I get an internship?", "what sort of job can I get after school?", etc... There have been too many of these threads; they bore the regulars and scare away experts. If you have a question like this, please consider posting on cscareerquestions or askcomputerscience.

I remember reading a post in a computer science subreddit that outlined some areas in computer science that will have more relevance and importance in the future, such as parallel computing for processing large amounts of data and such. I couldn't find it, so I'm asking you guys. Which field in computer science do you think will be important to know in the future?

EDIT: Yeah, I'm getting a LOT of different fields... it would be nice if you could explain why you think yours is important.

Not not important, but less important (at least, relatively speaking).

Perhaps databases? They have been a pretty big thing so far, but I can't see them at the forefront of research. There will definitely be incremental improvements, but for the most part I think what we have doesn't leave as much to be desired as some other fields do.

As for what I think will be more important: language design, especially as applied to parallel computing. For the coder of average skill, writing and reasoning about complex parallel code isn't an easy task. I think most coders can do a decent job of writing well-performing single-threaded code these days, but multi-threaded code is another story.

We aren't going to get much faster cores, so all programmers will need to be able to write parallel code.
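
To make the difficulty concrete, here's a rough Python sketch (purely illustrative; the counts and thread layout are made up): an unsynchronized shared counter can silently lose updates, and the fix requires explicitly reasoning about which operations must be atomic.

```python
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n):
    # counter += 1 is a read-modify-write; two threads can interleave
    # between the read and the write and lose updates (timing-dependent).
    global counter
    for _ in range(n):
        counter += 1

def safe_increment(n):
    global counter
    for _ in range(n):
        with lock:          # make the read-modify-write atomic
            counter += 1

def run(worker, n=100_000, threads=4):
    global counter
    counter = 0
    ts = [threading.Thread(target=worker, args=(n,)) for _ in range(threads)]
    for t in ts:
        t.start()
    for t in ts:
        t.join()
    return counter

print("unsynchronized:", run(unsafe_increment))  # may come out below 400000
print("with lock:     ", run(safe_increment))    # always 400000
```

And that's just one shared counter; real parallel code has to reason about many such interactions at once, which is exactly where language-level support (Chapel and friends) is supposed to help.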

Do you mind if I ask where you teach? That sounds like a very interesting curriculum!

I came across Chapel while doing my regular surfing of the Wikipedia list of programming languages by category. Programming languages and PL design are very much my interests in CS and will be my focus in my MS CS, so I often search for new and interesting languages.

I totally found out about Chapel by accident. I happened to be in a talk about Chapel at SC 2009 and my ears perked up because I recognized someone's name. By the end of the talk I thought, "I should teach this in PL next semester."

Not a great reason to pick up a language, for sure. It's worked out very well, though!

If parallelism and formal verification (listed above) are becoming increasingly important, then Programming Languages is becoming increasingly important. I would particularly note that effect analysis is a relatively new subfield of Programming Languages with lots of work still to be done.

Well, you could consider the study of FORTRAN a very narrow specialty within CS... nit-picky, I guess. But it is interesting that it's so hard to come up with an answer to the question of which CS fields will not be useful.

People don't realize how important cryptography is; we use it every time we access the secured version of a website. It's just that our browser handles everything for us behind the scenes.
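
To see just how much gets hidden behind the scenes, here's a minimal sketch using Python's standard library that opens a TLS connection and prints what was negotiated (example.com is just a placeholder host):

```python
import socket
import ssl

hostname = "example.com"  # placeholder host
context = ssl.create_default_context()  # certificate validation, modern protocol versions

with socket.create_connection((hostname, 443)) as tcp:
    with context.wrap_socket(tcp, server_hostname=hostname) as tls:
        # The whole handshake (key exchange, certificate check, cipher
        # negotiation) already happened inside wrap_socket().
        print("protocol:", tls.version())   # e.g. 'TLSv1.3'
        print("cipher:  ", tls.cipher())    # (name, protocol, key bits)
```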

Specifically, research in crypto algorithms safe from quantum computers will see growing interest, since some algorithms we currently use could be broken in reasonable time by a sufficiently fast quantum computer.

Another area is lightweight crypto: crypto algorithms that are reasonably secure but have very low processing power requirements. They are required in standalone devices with very small batteries, like wireless sensor networks, RFID-based devices, and the like.

The growing intersection of user interfaces and programming experience.

When most aspects of our lives are supported by digital technology, people will interact with machines all the time -- partly as users, partly as programmers. From the specialist trying to maintain and improve hyper-complex code bases to the end-user giving requests to a fairly independent and context-sensitive system, the more power we have, the better we need to be able to express it.

Honestly, this is like trying to predict the exact daily weather forecast several years from now. We have neither the data, the computational power, nor the expertise to make any reasonable long-term predictions about such a general question.

Just look at predictions about the future of AI from 20 years ago, or hardware from 30 years ago. They are hilarious.

AI algorithms are very important to our daily lives today. The difference is they are "weak" AI systems which basically means they are just a subset of human intelligence and won't talk to you. If you use Google, or you go to the hospital (that has an expert system to help with diagnosis) you have benefited from AI. Hell if you use Siri you've really benefited.

That is not true at all. AI is a very specific class of algorithms. I highly recommend reading the wiki page on it. One of the reasons it seems like AI hasn't made much progress over the years is because the bar keeps getting moved each time we accomplish something.

We will see a more widespread use of expert systems in medical diagnostics and other various fields as computing speeds and knowledge databases continue to grow. That being said no one but us will have any idea what an expert system is. It will be everywhere but not many people will know that it is everywhere.
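
For anyone wondering what an expert system even is, here's a toy forward-chaining sketch in Python. The rules and "diagnoses" are invented purely for illustration; real systems have far larger knowledge bases and handle uncertainty.

```python
# Each rule: if all conditions are known facts, add the conclusion as a new fact.
RULES = [
    ({"fever", "cough"}, "possible respiratory infection"),
    ({"fever", "rash"}, "possible viral exanthem"),
    ({"possible respiratory infection", "shortness of breath"}, "recommend chest imaging"),
]

def infer(facts):
    facts = set(facts)
    changed = True
    while changed:                      # keep firing rules until nothing new is derived
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"fever", "cough", "shortness of breath"}))
```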

From a more practical perspective, I think distributed systems and concurrency (and concurrency models, support from programming languages, correctness verification, etc.) are now more relevant than ever, with multi-core systems everywhere and "cloud" services. I think these areas will be very active and of major importance for at least the next decade.

Of course, these areas are not "more important" than others, but right now there is a huge motivation for investing in them; programming languages, algorithmics, complexity theory, and artificial intelligence will always be important.

Other autonomous robots, like Vijay Kumar's swarm bots, need less computationally intensive algorithms in order to operate outside a lab filled with motion-capture cameras.

A lot of people are 'cheating' vision, though, by using the Microsoft Kinect and other cheap depth-aware cameras in real-world applications. Still, face detection and identification remain heavily wanted technologies, not just for every terrorist-hunting organization in the world, but also for gimmicky log-in methods.
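
Face detection (as opposed to identification) is actually one of the more accessible bits to play with. Here's a minimal sketch using OpenCV's bundled Haar cascade; it assumes the third-party opencv-python package is installed, and "photo.jpg" is just a placeholder path:

```python
import cv2

# Classifier file ships with opencv-python under cv2.data.haarcascades.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

image = cv2.imread("photo.jpg")               # placeholder image path
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)  # box per detection

print(f"found {len(faces)} face(s)")
cv2.imwrite("photo_annotated.jpg", image)
```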

One of the coolest things I've seen that I believe would be considered computer vision is assembly-line robots welding metal together and then inspecting each weld to make sure it is, in fact, solid and correct.

Web development.
Native apps will always have a place, but the browser is becoming the OS. Google apps, mobile sites, e-commerce, social networking, cloud computing, etc. all happen in the browser. Good web developers will be vital as the migration continues. We're not talking about your nephew writing a static HTML page in his basement, or a crotchety C programmer who only writes printer drivers. We need developers who have a firm grasp of HTTP and understand how many technologies fit together to create a web app. These people will be needed in the near and far future.

I think they're both going to go hand in hand. Because it's so important to the consumer, I think networking is going to become the most researched piece of hardware. In twenty years, I see a very small chance we'll have native apps, and the network is going to have to accommodate this. It will also, I believe, tie into cybersecurity.

They don't. Web development is not computer science any more than Java programming is. There is no science in what /u/ckennington suggested; it's just programming. Science is about studying phenomena and learning about the universe, not creating an application for people to use (although studying the usage of that application, or showing that certain designs are better/worse via experimentation etc... could very well be science!)

The product of science is knowledge.

Because it's so important to the consumer, I think networking is going to become the most researched piece of hardware.

The consumer doesn't really drive research; they drive product development. Academic curiosity drives research. "Networking" is already one of the most highly researched areas in computer science. I honestly don't know much about hardware research, but that tends to fall into the lap of the EE guys for the most part.

Most of the papers that I read operate at a higher level than physical media. No matter how much time we put into network hardware research, it's still fundamentally limited by the speed of light. What we can do is make more efficient use of the network resources that are available.

In twenty years, I see a very small chance we'll have native apps, and the network is going to have to accommodate this.

Predictions like this have been made since the dawn of computing (e.g., John Gage coined the phrase "the network is the computer" in 1984 at Sun). That doesn't mean that there won't be a continuing push towards "cloud computing" (and whatever the next buzzword is), but I think there will likely always be a set of applications that can't deal with the trade-offs of remote computation.

Wow, I really wish that were true, since I'm currently getting a PhD in the field. But how can you tell whether we have reached a plateau or are at the edge of a great paradigm shift, after 60 years of research? My gut feeling goes with the former, but then again, how would I know?