I don't know if this is an acceptable question to make here, but I wanted to know what kinds of things are being researched that can lead to major breakthroughs in math. I'm not talking about usual research but something that can be really big and promising.

I think the question as it's stated is too general. I can't bring myself to vote to close, though. Any of the moderators want to help Ivan make the question more specific?
– Mathemagician1234, Nov 26 '12 at 0:04

I know the question is general, but I'm expecting general answers too. That is exactly why I was in doubt whether this kind of question is acceptable (thank you for clarifying that), and why I used the soft-question tag.
– Ivan Lerner, Nov 26 '12 at 0:15

Donald Knuth said that it would be better to support the normal ideas and improvements of working researchers than to look for the big research programs at the frontiers of math. I think he is spot on with this remark. Also note that what constitutes a major breakthrough depends on the observer. Is a fast time-harmonic Maxwell solver (say, 1000 times faster than an FDTD solver on a single CPU) a major breakthrough? What about a faster-than-fast Fourier transform?
– Thomas Klimpel, Nov 26 '12 at 0:54

@Mathemagician1234 : You don't have nearly enough reputation to vote to close, and you know it. Please don't pretend to have moderation powers until you earn them.
– Adam Smith, Nov 27 '12 at 6:44

5 Answers

There are essentially two different ways to approach the question that might lead you to different conclusions.

But before that, I think it would be nice to share a quote about the current state of knowledge in mathematics, and where we are headed. This is paraphrased from an anatomist who was remarking on the state of his field near the turn of the century.

Knowledge is like a field of corn. First came the farmhands, who with great sweeps gathered entire bushels for themselves. Then came the peasants, who scraped the untouched pockets near the rough parts of the land, gathering handfuls of material, but nonetheless grateful that it was enough just to get by. Finally came the crows, who with tremendous effort picked out their meal from the smallest of morsels left in the dirt by those who came before. Gentlemen, we are the crows.

As our level of mathematical knowledge increases over time, it becomes correspondingly difficult to comprehend, let alone answer, the questions that remain. Some of them fall into the category of easy-to-state questions with incredibly complicated answers (like Fermat's Last Theorem), while others are seemingly vague and nebulous to all but the specialists who engage with them. A question like P vs NP is considered very deep with an incredibly complicated answer, yet it would take quite a while to explain to someone without extensive background what the questions surrounding it are, and why they need to be answered.

The first approach to the question would be "what is the current trend in applied mathematics?" I don't have very much experience with the subject, but there are very important open questions in computer science and number theory, such as the aforementioned P vs NP question. Integer factorization, for instance, is highly relevant to cryptography, and a "fast" algorithm to factor large integers would be considered quite groundbreaking.

The second approach would be to ask about the trends in "pure" mathematics, or just mathematics for mathematics' sake. Current research is highly specialized, and it is difficult to prioritize one field over another. Even highly abstract fields like topology have important applications to other theoretical fields; theoretical physics has borrowed heavily from group theory and topology. Mathematicians like to ask questions about things without regard to practical application. We like to classify manifolds because it's fun (the classification of manifolds is not complete, but it is a highly studied field). It just so happens that manifolds are enormously useful for framing problems in physics and other sciences.

The quote is inappropriate. We are not the crows, but the peasants. Your answer also approaches the relationship of math to its applications too much from an executive summary point of view. Math crucially depends on its applications, be it in physics, philosophy, logistics or uncertainty. It generates many types of languages. A language is most useful if there is something interesting to talk about in that language.
– Thomas Klimpel, Nov 26 '12 at 9:08

I just wanted to add that sometimes results from "the usual research" happen to "hit on" something "phenomenal", or, as is often the case, some result from "usual research" has far-reaching consequences that one could not have anticipated.

Theory often "trickles down" to applications, in ways that one cannot necessarily know, and in turn, "needs in the real world" often motivate breakthroughs in theoretical research.

So I wouldn't dismiss, outright, "the usual research."

I simply felt obliged to point out that the foundations of "breakthrough" research are years in the making, and such work, behind the scenes, often goes unnoticed. I didn't take your question to be dismissive; it's just that those credited for breakthroughs wouldn't be in the spotlight (historically, or in the future) were it not for the "usual" work of their teachers, predecessors, and contemporaries.

You are absolutely right; my intention was not to dismiss it, not least because I'll probably be doing the usual research myself in a few years. I just wanted to know whether we have anything extraordinary going on today, like what happened with physics in the 20th century. It just seems that we are kind of stagnant.
– Ivan Lerner, Nov 25 '12 at 22:15

I think P vs NP is currently the problem with the most important implications for society. It concerns the complexity of algorithms, which can be described as the asymptotic growth rate of the number of calculations a computer needs to solve a problem. P denotes the class of problems, like solving systems of linear equations or calculating the determinant of a matrix, which can be solved comparatively easily (in a number of steps proportional to a Polynomial in the number of digits of the input, hence the name). NP is a class of problems, like factorization or the discrete logarithm, whose solutions may not be easy to find but can be easily checked if one is given a correct solution. Clearly $P \subseteq NP$. Most mathematicians believe that $P \ne NP$, meaning that there exist problems belonging to NP but not to P.
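This find-versus-check asymmetry can be illustrated with a toy example: subset sum (an NP-complete problem). The sketch below is only illustrative; verifying a proposed solution takes polynomial time, while the naive search examines up to $2^n$ subsets.

```python
from itertools import combinations

def verify(nums, target, certificate):
    """Check a claimed solution -- polynomial time in the input size."""
    return all(x in nums for x in certificate) and sum(certificate) == target

def search(nums, target):
    """Naive search -- tries up to 2^n subsets, exponential time."""
    for r in range(len(nums) + 1):
        for subset in combinations(nums, r):
            if sum(subset) == target:
                return list(subset)
    return None

nums = [3, 34, 4, 12, 5, 2]
cert = search(nums, 9)          # slow part: finding a solution
print(verify(nums, 9, cert))    # fast part: checking it -- prints True
```

Of course, nobody has proved that *every* algorithm for such problems must be slow; that is exactly what the P vs NP question asks.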

Most currently used encryption algorithms rely on the difficulty of solving certain NP problems. For example, the RSA cipher could be deciphered if one had a fast algorithm that, given the product $pq$ of two large primes $p$ and $q$ (say, 50 or 100 digits each), returns those two primes.
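The asymmetry is easy to see in code. Multiplying two primes is cheap, while the naive way to recover them, trial division, takes on the order of $\sqrt{n}$ steps, which is exponential in the number of digits of $n$. This is only a toy sketch with small primes; real RSA moduli have hundreds of digits, far beyond trial division.

```python
def factor_semiprime(n):
    """Recover (p, q) by trial division -- exponential in the digit count of n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None

p, q = 104723, 104729     # two small primes for illustration
n = p * q                 # multiplying: easy, polynomial time
print(factor_semiprime(n))  # recovering the factors: already ~10^5 steps here
```

For 100-digit primes the same loop would need around $10^{100}$ steps, which is why the best known factoring algorithms (still superpolynomial) are a subject of intense research.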

In other words, our whole encryption system could break down if someone published an efficient algorithm for solving NP-complete problems (the "maximally hard" NP problems, i.e. problems which would enable us to solve every NP problem efficiently if we could solve them efficiently; the factorization problem is presumably not NP-complete, so it could be possible to factor integers quickly while still being unable to find a Hamiltonian cycle in a graph in polynomial time).

On the other hand, if someone proved $P \ne NP$, we could be more optimistic that our current encryption standards are not effectively breakable (although even that would be no guarantee, since $P \ne NP$ only concerns worst-case hardness).

Joseph Teran at UCLA works on things like computer graphics and virtual surgery, and has a lot of cool videos (showing output from his algorithms) on his web page. Ron Fedkiw at Stanford also does impressive computer graphics work.

The "deep learning" algorithms for machine learning being developed by Geoffrey Hinton, Andrew Ng, etc. seem very exciting. This area is making a lot of progress right now.