When I was a graduate student in math (mid-to-late eighties and early nineties) the arena was dominated by a few grand projects: for instance, Misha Gromov's hyperbolic groups, which spread into many seemingly heterogeneous domains (such as combinatorial group theory, to name just one), and Bill Thurston's classification of low-dimensional manifolds.

A few years have passed, and I must admit that, aside from my pet domains, such as categorical algebra and applied logic, I have not kept up with innovations.

I am pretty sure, though, that new trends, generated by some core new ideas, are leading contemporary math research, or substantial portions thereof. One such idea I already know, namely Voevodsky's homotopical foundations of logic, which bring together abstract homotopy theory and type theory.

What else?

PS I am not interested (at least not directly) in new solutions to old problems, unless these solutions are the germs of (and conducive to) new directions.

Rather, I would like to hear about new research projects which unify previously disparate fields, or create entirely new ones, or shed light on old domains in a radically new way (in other words, I am interested in what Thomas Kuhn called paradigm shifts). These "or"s are of course not mutually exclusive.

Addendum: It looks like there is an ongoing debate on whether this question should be kept open or not. As I have already answered below, I submit entirely and with no reservations to the policy of this respectable forum, whatever the outcome. As a dweller of MO, though, I am entitled to my personal opinion, and that is to keep my question in the air. Nevertheless, I am well aware of the potential risks of it either turning into a "what I like best in math right now" thread or attracting generic answers that provide no meat for this community. Therefore, allow me to add a clarification:

My wish is the following: to learn from informed individuals which new paradigm-shifting trends are currently under way.

To this effect I appreciate all the answers so far, but I invite everybody to provide some level of detail as to why they have mentioned a specific idea and/or research project (some folks have already done so). For instance, one answer was tropical math, which indeed seems to have risen to prominence in recent years. It would be nice to know which areas it impacts and in which way, which core ideas it is founded on, which threads it brings together, etc. The same applies of course to all other proposals.

This could be quite an interesting discussion, which I personally could likely enjoy and might even contribute to. That being said, I firmly believe that this is too subjective and possibly argumentative for MO. Voted to close as such.
–
quid, Dec 30 '12 at 21:18

dear quid, I have no issues if you folks decide to close it down, and I do understand your misgivings about possible arguments on what is important and what is not. But the fact is, there have always been and there will always be leading trends in math and the sciences, and the only thing I am after here is a set of honest answers: why not let working mathematicians talk about what is important right now in their great field? If people feel differently about what is relevant, so be it. All the best, Mirco
–
Mirco Mannucci, Dec 30 '12 at 21:41

This is a nice question. Of course it is subjective; most of mathematics is subjective! We are faced with subjective criteria every time we referee or submit a paper, every time we choose a research problem, every time we propose a topic for a thesis. I would like to offer another criterion to those who are in haste to close questions like this one: will the average reader of this forum learn something from the question, its answers, and/or the discussion? If yes, let it run; if no, close it if you must.
–
alvarezpaiva, Dec 31 '12 at 1:26

Meta thread created: tea.mathoverflow.net/discussion/1505 Please post further contributions on the appropriateness of the question there; for those not yet meta-active: there is an extra sign-up for meta (do not try to use your MO login) but it is simple and instant (despite the wording 'apply for membership').
–
quid, Dec 31 '12 at 14:16

28 Answers
The Langlands program. It goes back to the sixties, but in recent years, with the proof of the fundamental lemma by Ngô Bảo Châu and with several results in the local case, it has become one of the most active areas in number theory, and I think there is no hope of finishing the job in the next, say, 50 years.

I really meant no. This is of course just a personal opinion (and I am not an expert!). But I think there are very important recent results in the local case, for example the local-global compatibility proved by Emerton.
–
Ricky, Jan 2 '13 at 11:36

I agree with Kevin. It is not true anymore that there is no hope of finishing it in the next 50 years. There has been tremendous progress in the last 20 years (basically since Wiles proved FLT). The announcement last year (is there a preprint yet?) by Thorne, Li, Harris and Taylor of the construction of a Galois representation attached to an arbitrary regular automorphic form for $GL_n$ is great progress. The importance of the general proof of the fundamental lemma by Ngô can hardly be overstated, and could lead him and others to new progress in functoriality. Laurent Lafforgue's ...
–
Joël, Jan 4 '13 at 12:02

project on base change without the twisted trace formula has not come to fruition yet, but it may well do so soon, and open new ways. The progress in modularity results for Galois representations is also very impressive. And someone unknown now may very well come tomorrow with a great new idea. So, sure, several very important breakthroughs, plus a tremendous amount of technical work, are still needed to "finish" the Langlands program -- but there is hope now that this may happen much sooner than we expected 20 years ago.
–
Joël, Jan 4 '13 at 12:07

Dear Quid, you are right, and this is why I have put "finish" between quotation marks. The Langlands program is pretty much open-ended. Let me make this precise by giving a definition of the Langlands program. A certainly narrow interpretation, one that Langlands would certainly reject as too narrow, but within which many people are working now, is the existence of a natural bijection, satisfying some properties too long to be stated here, between algebraic cuspidal automorphic forms for $GL_n$ over a number field $K$ and geometric $l$-adic Galois representations of $G_K$. This program...
–
Joël, Jan 6 '13 at 20:51

There is the derived algebraic geometry program of Jacob Lurie, starting with his thesis in 2007 and building on the work of ..., Simpson, Toën-Vezzosi, etc. In the words of Lurie, this is basically "jazzing up" the foundations of algebraic geometry with homotopy theory. Lurie has written two 1000-page books called Higher Topos Theory and Higher Algebra, and a series of papers called DAG. He has already had success in applying the theory to the classical topic of elliptic cohomology.

What is the grand project here? I was on the ICM topology committee that chose him to speak. One of the main reasons we chose him is that he had constructed a spectrum for equivariant K-theory (I'm not exactly sure what that means, so I might have misstated it). He also constructed various other exotic cohomology theories like tmf and TMF. There was some talk that these will be useful for classifying the stable homotopy groups of spheres, for example (I think Behrens is working on one aspect of this). But I'm not really sure what kind of grand project you have in mind?
–
Ian Agol, Jan 1 '13 at 19:38

I think homological algebra in its various guises is one of the universally acknowledged tools of modern mathematics. This deals with LINEAR settings, such as vector spaces, modules over rings and other abelian categories. I think one way to understand this grand project, which goes back at least to Quillen, is that one expects an even more universal and powerful tool in the nonlinear setting, HOMOTOPICAL algebra. This is a tool that helps us "correct" or derive operations with nonlinear objects such as spaces, rings, schemes, categories, etc. The cited books greatly develop [cont.]
–
David Ben-Zvi, Jan 2 '13 at 0:01

[cont.] homotopical algebra in a new setting, making it a powerful and streamlined tool that's (from the POV of an end user) easier to take "off the shelf" than previous ones. There are already many many applications (including some of the most spectacular by Lurie himself - in particular the Cobordism Hypothesis), though as should be clear the "grand project" goes much further back and is much broader. Algebraic geometry and representation theory are some of the areas where homological algebra is most deeply embedded and where the much broader nonlinear version will make a tremendous impact.
–
David Ben-Zvi, Jan 2 '13 at 0:04

(Let me re-reiterate, no one here is ascribing homotopical algebra to Lurie, and as has been often claimed this is not the format to debate relative merits of individual mathematician's contributions, but I think it's incontrovertible that Lurie is a leader in this grand project, and that great progress has been made in recent years.)
–
David Ben-Zvi, Jan 2 '13 at 0:11

It's an interesting question "what is expected to be done" (or "what's DAG's version of the Weil Conjectures"..) - I don't know. One example of a (not overly facetious) answer is, "the geometric Langlands program". I also expect the classical (especially the p-adic) Langlands program will benefit greatly. One can say the same for lots of specifics in algebraic geometry (Donaldson-Thomas theory, homological mirror symmetry) and maybe someone with a better understanding of the motivic world could say something useful there.
–
David Ben-Zvi, Jan 2 '13 at 3:44

I think it is certainly appropriate to call a "grand project" the remarkable new progress in the area sometimes called additive combinatorics or additive number theory, though the subject has expanded to the point that neither of these is a good name any longer, if they ever were. I am talking about the strain of thought whose modern form starts with Gowers and continues through the work of Tao, Green, Helfgott, Breuillard, Ziegler, Pyber-Szabó, and many, many others: loosely speaking, all this work centers around the idea that "things that are approximately structured approximate a structure" -- so that if I am a subset of a group and I am approximately closed under multiplication, I must be close to some literal subgroup; or if I am a subset of Z which contains too many arithmetic progressions, I must actually have big intersection with some infinite arithmetic progression; or if I am a subset of R^2 such that lines containing two points of my set are overly likely to contain a third, then I must look something like a subgroup of a real elliptic curve....

Another way to put it is that this field is concerned with structural dichotomies -- subsets either obey the laws that completely random subsets do, OR they are "structured" in some appropriate way; there is no in between.
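The "small sumset" side of this dichotomy can be seen numerically; a toy sketch of my own (not from the answer above), contrasting an arithmetic progression, whose sumset is as small as possible, with a generic set of the same size:

```python
import random

def sumset(A):
    """Return the sumset A + A = {a + b : a, b in A}."""
    return {a + b for a in A for b in A}

# Structured: an arithmetic progression of length n has the smallest
# possible doubling, |A + A| = 2n - 1.
n = 50
ap = set(range(n))
assert len(sumset(ap)) == 2 * n - 1

# Generic: n random elements of a large interval have almost no additive
# coincidences, so |A + A| is close to the maximum n(n + 1)/2 = 1275.
random.seed(0)
rnd = set(random.sample(range(10**6), n))
print(len(sumset(ap)), len(sumset(rnd)))
```

Freiman-type theorems run this observation in reverse: a set whose sumset is small must be efficiently covered by generalized arithmetic progressions.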

To me this perfectly meets the definition of a grand program -- like Gromov's take on group theory (with which it shares both some content and some philosophical affinity!) it provides a really new paradigm for "how things are," and at the same time it has given us real progress in a multitude of areas (analytic number theory, harmonic analysis, combinatorial geometry, etc.)

Another proposal is the search for the field with one element, $\mathbb{F}_1$. Having a precise notion of such a field would allow us to further exploit the analogy between number fields and function fields (much as discrete valuation rings, Dedekind domains, schemes, etc. have done in the past). In particular, with a suitable notion of $\mathbb{F}_1$ we should be able to find a proof of the Riemann hypothesis based on Weil's proof of the Riemann hypothesis for curves over finite fields.

The integers are not an algebra over any field in the classical sense, which makes it a priori impossible to adapt Weil's argument to this case. For this analogy to work, a good definition of $\mathbb{F}_1$ should have the property that $\mathbb{Z}$ is an $\mathbb{F}_1$-algebra.

It should be noted that Durov's definition of $\mathbb{F}_1$ is surprisingly simple and that $\mathsf{Mod}(\mathbb{F}_1)$ is just the category of pointed sets (which supports the claim about combinatorics; note that $\mathsf{Mod}(\mathbb{F}_{\emptyset})$ is the category of all sets). But his approach is geared toward certain problems in Arakelov geometry and does not qualify as a route to the Riemann hypothesis.
–
Martin Brandenburg, Jan 2 '13 at 10:42

Does compressed sensing count as math? If it does, here is a blog post from the horse's mouth.

Edit: For those who would like a popular article, here is a good one in Wired (by JSE if I'm not mistaken). Also, it is encouraged to read the highly upvoted comment by JSE below.

Because I don't think I can ever explain it better than Terence Tao's brilliant blog post, nor do I think I'm qualified to, I'll just refer to the blog, and here simply mention the field in which I, as someone working on combinatorial design theory, personally stumbled on it as an interaction between fields (please read the following only when you have nothing better to do). I hope experts edit and improve this post.

I had heard good things about compressed sensing before, but the first paper I read was about its application to error correction by Candes, Rudelson, Tao, and Vershynin. I don't know if it's comparable to other recent truly remarkable progress in coding/information theory (e.g., polar coding, which could be a candidate for the answer to OP's question), but it was a refreshing read for me, someone who dabbles in coding theory. It's in one sense similar to normal linear codes in that the goal is to recover a vector $f \in R^n$ from knowledge of $y = Af + e$, where $A$ is an $m \times n$ matrix and $e \in R^m$ is the error vector. But the paper studies when $f$ is uniquely determined by $\ell_1$-minimization à la compressed sensing. Then I learned that some combinatorial design theorists I follow were applying design theory to compressed sensing, in a very rough sense, to give a nice deterministic method for explicitly providing ideal $A$. And when I checked what was up in quantum information these days (I also dabble in quantum information), I ran into this paper by Gross, Liu, Flammia, Becker, and Eisert, where compressed sensing is applied to quantum state tomography, a method for determining the quantum state of a system. And this is the one-paragraph version of how I wound up with an endless to-read backlog of papers spanning multiple fields.
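The $\ell_1$-minimization step can be made concrete in a few lines. A minimal sketch of my own (not from any of the cited papers), for the noiseless basis-pursuit problem $\min \|x\|_1$ subject to $Ax = y$, recast as a linear program in variables $(x, t)$ with $-t \le x \le t$:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 30, 60, 4                  # measurements, ambient dimension, sparsity

A = rng.standard_normal((m, n))      # random Gaussian sensing matrix
f = np.zeros(n)
f[rng.choice(n, k, replace=False)] = rng.standard_normal(k)  # k-sparse signal
y = A @ f                            # noiseless measurements

# Basis pursuit:  min ||x||_1  s.t.  A x = y,
# as an LP over z = (x, t):  min sum(t)  s.t.  x - t <= 0,  -x - t <= 0,  A x = y.
c = np.concatenate([np.zeros(n), np.ones(n)])
I = np.eye(n)
A_ub = np.block([[I, -I], [-I, -I]])
b_ub = np.zeros(2 * n)
A_eq = np.hstack([A, np.zeros((m, n))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * n + [(0, None)] * n, method="highs")
x = res.x[:n]
print("recovery error:", np.linalg.norm(x - f))
```

With 30 Gaussian measurements of a 4-sparse signal in $R^{60}$, exact recovery is expected with overwhelming probability; what is guaranteed in any case is that the LP solution is feasible and has $\ell_1$ norm at most $\|f\|_1$.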

If you can have single-pixel cameras then can you create a giant telescope just by using a small telescope?
–
user30304, Dec 30 '12 at 22:26

Thank you for including some more specialised and personalised information.
–
quid, Dec 31 '12 at 17:51

A great resource for more information is the repository for compressed sensing at Rice University (dsp.rice.edu/cs). And I do think CS is very important ... it is great that you brought it up!
–
Kevin R. Vixie, Dec 31 '12 at 20:28

I would say that the grand project is better described as "sparse inference," where we try to reconstruct data that is known or expected to be sparse in some basis (or low-rank, or in some other way restricted to a low-dimensional but badly nonconvex subspace of parameter space.) This includes compressed sensing but also a much bigger circle of ideas (L^1 minimization, convex relaxation more generally, hierarchical clustering, manifold learning, etc.) I have learned a ton from talking to people about this stuff and I hope more pure mathematicians will get in on it!
–
JSE, Jan 2 '13 at 3:08

But e.g. the paper of Candes and Recht that just won the Lagrange Prize places compressed sensing within a bigger and more conceptual theoretical framework. That's what I mean by pushing back on "compressed sensing" as a name for the whole field. By the way, thanks to YF for linking to my Wired piece -- but for anybody reading MathOverflow, Terry's blog post is going to offer you much more than my magazine article, which is very simplified!
–
JSE, Jan 4 '13 at 15:50

Is quantum computing too far away from pure math to qualify? Perhaps it has not shaken up mathematics so much, but it has brought together theoretical physics, computer science, and mathematics in a way unseen before.

Graph minor theory. The first success of this theory was the proof of Wagner's conjecture that in every infinite set of finite graphs, one is a minor of another. However, the theory developed over the course of twenty-odd papers by Robertson and Seymour has been enormously fruitful and its consequences are still being actively explored, with no end in sight. The proof of the strong perfect graph conjecture was the next spectacular success, and then came applications to the structure of claw-free graphs. Graph minor theory almost singlehandedly transformed people's perception of graphs, from structureless combinatorial gadgets about which only countless ad hoc theorems could be proved, to highly structured objects about which general theories can be developed.
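The prototype of the finite obstruction sets delivered by graph minor theory is Wagner's theorem: a finite graph is planar iff it has neither $K_5$ nor $K_{3,3}$ as a minor. A quick sanity check of my own using the networkx library (which tests planarity directly, rather than minor containment):

```python
import networkx as nx

# Wagner's theorem: planar <=> no K_5 or K_{3,3} minor.  The graph minor
# theorem says EVERY minor-closed property has such a finite obstruction set.
assert nx.check_planarity(nx.complete_graph(4))[0]       # K_4 is planar
assert not nx.check_planarity(nx.complete_graph(5))[0]   # K_5 is not
assert not nx.check_planarity(nx.petersen_graph())[0]    # has a K_5 minor
```

`check_planarity` returns a pair `(is_planar, certificate)`; only the boolean is used here.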

Thanks Timothy, this is really spectacular stuff! I confess (shame on me!) that up to 10 minutes ago I did not even know what a minor of a graph is.... But after your great answer, I checked the wiki and found a whole fascinating universe there. For instance, to someone like me who cringes at the word infinity, the Friedman-Robertson-Seymour theorem, which is the finitistic version of the above, comes as a happy surprise. I have the feeling that this is exactly one of those paradigm-shifting events I was looking for, and that we will see much more along similar lines.
–
Mirco Mannucci, Jan 4 '13 at 23:54

Mirco: The original graph minor theorem can be phrased finitistically as follows: For any hereditary property, there is a finite set $S$ of finite graphs such that having the property is equivalent to not having any graph in $S$ as a minor. I am not sure exactly where you stand philosophically, but the work of Friedman, Robertson, and Seymour may discomfit finitists of some stripes because it shows that the graph minor theorem cannot be proven except by using stronger induction axioms than are allowed in first-order Peano arithmetic.
–
Timothy Chow, Jan 6 '13 at 2:07

Tim, thanks for this add-on. This result by HF not only does not bother me, but actually is music to my ears! My philosophical position could be summarized in two main tenets: 1) all math objects are configurations of syntactic games (including the "natural numbers"); 2) the apparent dichotomy finite-infinite is contextual. Now, 1) allows me to avoid restrictions of any sort as far as which axiomatic system I can play with (of course, just as with ordinary games, someone likes bridge and someone likes poker). 2) tells me basically this: there is infinitely (pardon the pun) more in the so-called
–
Mirco Mannucci, Jan 11 '13 at 11:31

finite realm than what we ever dreamed so far. I reiterate that your answer is especially dear to my heart, because I am certain that in the years to come we will see much more of that. The complexity (and beauty) of the finite realm (particularly finite graphs, finite categories, etc) will literally astound us
–
Mirco Mannucci, Jan 11 '13 at 11:37

Dear Timothy, I have a small bone to pick with this answer. While graph minor theory is indeed a grand project, the proof of the strong perfect graph conjecture and the characterization of the structure of claw-free graphs are not part of graph minor theory. Both are concerned with forbidden induced subgraphs, rather than forbidden minors. The tools used in studying forbidden induced subgraphs are rather different, as witnessed by the fact that the paper containing the proof of the strong perfect graph conjecture does not reference a single paper from the graph minors sequence.
–
Louigi Addario-Berry, Mar 11 '13 at 13:23

One grand project which has generated much work is the quest for mathematical understanding of mirror symmetry (via homological mirror symmetry, or the Strominger-Yau-Zaslow/Gross-Siebert picture). Attempts to formulate and prove the conjecture have led to interesting new ideas in symplectic geometry (like the work of Paul Seidel) and attempts to confirm enumerative predictions from string theory have led to new techniques in algebraic geometry. While this grand project has been around since the early 90s (e.g. Kontsevich's ICM talk which introduced homological mirror symmetry was in 1994) it is still going strong and much progress has been made.

Another very active program in geometry was initiated by the paper of Donaldson-Thomas (see also the more recent paper of Donaldson-Segal) and is an effort to define instanton counting/Floer-theoretic invariants in the context of higher-dimensional gauge theory and exceptional geometry.

The search for constant scalar curvature Kaehler metrics (see Donaldson's lecture from the Fields Medallists' Lectures Volume or Tian's book "Canonical metrics in Kaehler geometry") and the related Donaldson-Tian-Yau conjecture on existence of Kaehler-Einstein metrics on Fano varieties was recently resolved after nearly twenty years' work by many of the world's leading geometric analysts.

The 2000 paper "Introduction to Symplectic Field Theory" by Eliashberg-Givental-Hofer certainly counts as the initiation of a grand project: the systematic study of punctured pseudoholomorphic curves in certain non-compact symplectic manifolds. The theory has many applications, and a by-product is the new foundational polyfold approach to elliptic moduli problems.

just a minor comment about 3: Donaldson's article was actually written in 1996, and although it is supposed to be a reworking of his 1986 Fields Medallist lecture, the material on constant scalar curvature Kahler metrics (including the conjectures) is from 1996, which is roughly when Donaldson started thinking about those matters.
–
YangMills, Jan 5 '13 at 21:07

Manjul Bhargava's new field of arithmetic invariant theory is a perfect example of a new grand project. It began with Manjul's doctoral thesis, in which he presented a completely new view of Gauss's composition law for binary quadratic forms in a way that led to generalizations. This led to a series of papers on generalizations to cubic forms and beyond, with a general framework coming from representation theory.

In addition to being intrinsically interesting, this has led to new results on counting quadratic rings, cubic rings, etc, whose crowning achievement is an important new result on the Birch and Swinnerton-Dyer conjecture (cf. Bhargava and Shankar). There is much more to research in the theory, and numerous people (including some of his students) have found manifold connections with other areas of math, such as knot theory and algebraic geometry.

Is this a "theory", or a set of striking applications of the geometry of numbers (of which Bhargava is a master) via insightful use of the representation theory, geometry, and arithmetic of algebraic groups? For an algebraic group acting on an "arithmetic" scheme, the set $S$ of integral points of an "orbit" injects into the degree-1 cohomology of the stabilizer. If the stabilizer is commutative then the cohomology set is a group, and when those group operations preserve $S$ we get a "composition law" on $S$. Making such laws explicit is a subtle art, but do any not arise in this way?
–
user29720, Dec 31 '12 at 1:18

Thanks, Will. For connected ss groups over local and (non-real) global fields, the degree-1 cohomology injects into an ${\rm{H}}^2$ with coefficients in the Cartier dual of the commutative "fundamental group" (due to vanishing of degree-1 cohomology of simply connected ss groups), so it raises the question: is there a central extension of $S_3$ for which those degree-1 cohomologies with $S_3$-coefficients inject into an ${\rm{H}}^2$ with commutative "coefficients" recovering the composition law? That is, is there no known "cohomological" explanation of that composition law?
–
user29720, Dec 31 '12 at 5:09

(Forgive me if any of this is wrong, I'm rushing.) For binary cubic forms, you have a composition law only among those pairs with the same quadratic resolvent; the law is actually coming cohomologically from the SL_2 action, whose stabilizer is a form of Z/3Z whose H^1 reads off (more or less) the 3-torsion in the class group of an appropriate quadratic ring. So this one fits kreck's formulation. The only one I know which doesn't is the case studied by Gross and Lucianovic (continued next comment)
–
JSE, Jan 2 '13 at 3:16

where the stabilizer is a form of SO_3 -- but now the group law is coming from the map H^1(K,SO_3) -> H^2(K,+-1) and there is a cohomological explanation in kreck's other sense. As far as I know, "every composition law is a cohomological law" among those observed so far.
–
JSE, Jan 2 '13 at 3:17

A candidate: the theory of graph limits. From the description and endorsements of László Lovász's book Large Networks and Graph Limits:

"The theory has rich connections with other approaches to the study of large networks, such as 'property testing' in computer science and regularity partition in graph theory. It has several applications in extremal graph theory, including the exact formulations and partial answers to very general questions, such as which problems in extremal graph theory are decidable. It also has less obvious connections with other parts of mathematics (classical and non-classical, like probability theory, measure theory, tensor algebras, and semidefinite optimization)."

"This book explains many of these connections, first at an informal level to emphasize the need to apply more advanced mathematical methods, and then gives an exact development of the theory of the algebraic theory of graph homomorphisms and of the analytic theory of graph limits. This is an amazing book: readable, deep, and lively. It sets out this emerging area, makes connections between old classical graph theory and graph limits, and charts the course of the future." --Persi Diaconis, Stanford University

"This book is a comprehensive study of the active topic of graph limits and an updated account of its present status. It is a beautiful volume written by an outstanding mathematician who is also a great expositor." --Noga Alon, Tel Aviv University, Israel

"Modern combinatorics is by no means an isolated subject in mathematics, but has many rich and interesting connections to almost every area of mathematics and computer science. The research presented in Lovasz's book exemplifies this phenomenon. This book presents a wonderful opportunity for a student in combinatorics to explore other fields of mathematics, or conversely for experts in other areas of mathematics to become acquainted with some aspects of graph theory." --Terence Tao, University of California, Los Angeles, CA

"Laszlo Lovasz has written an admirable treatise on the exciting new theory of graph limits and graph homomorphisms, an area of great importance in the study of large networks. It is an authoritative, masterful text that reflects Lovasz's position as the main architect of this rapidly developing theory. The book is a must for combinatorialists, network theorists, and theoretical computer scientists alike." --Bela Bollobas, Cambridge University, UK

Optimal transport. Both its study (generalizations, the Monge problem, regularity issues, and geometric properties, to cite the part I work in) and its applications (to geometry, notably with the work of Sturm and Lott-Villani; to image processing and recognition; etc.) have developed hugely since the 1990s.
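For readers new to the area: in the discrete setting the Kantorovich formulation of optimal transport is just a linear program. A minimal sketch of my own, with a made-up two-point cost matrix:

```python
import numpy as np
from scipy.optimize import linprog

# Kantorovich problem: given source masses a, target masses b, and costs
# C[i, j], find a coupling P >= 0 with row sums a and column sums b
# minimizing <C, P>.
a = np.array([0.5, 0.5])
b = np.array([0.5, 0.5])
C = np.array([[1.0, 2.0],
              [3.0, 1.0]])
m, n = C.shape

# Marginal constraints as equalities on the flattened plan P.ravel().
A_eq = np.zeros((m + n, m * n))
for i in range(m):
    A_eq[i, i * n:(i + 1) * n] = 1.0          # row sums = a
for j in range(n):
    A_eq[m + j, j::n] = 1.0                   # column sums = b
res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]),
              bounds=(0, None), method="highs")
print("optimal cost:", res.fun)               # diagonal coupling, cost 1.0
```

For this cost matrix any coupling has cost $2.5 - 3p$ with $p \le 0.5$ on the diagonal, so the optimum is the diagonal coupling with cost $1.0$; larger instances only change the sizes of the arrays.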

I have three answers. The first two involve mathematics with an applied flavor, with strong connections to mathematics of a purer flavor. The last one is purer in origin, but full of potential for applications.

$\color{red}{\large\text{I)}}$ $\color{blue}{\large\text{Mathematical Image Analysis}}$

This is clearly an applied area. But it has strong connections to purer areas like harmonic analysis, PDE, geometric measure theory, and variational analysis.

The mathematical branch of image analysis heated up a great deal in the 1990s as a cumulative result of S. Geman and D. Geman (1984), Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images (Google citations = 13672); Mumford and Shah (1989), Optimal approximations by piecewise smooth functions and associated variational problems (Google citations = 3140); and Rudin, Osher and Fatemi (1992), Nonlinear total variation based noise removal algorithms (Google citations = 5079). Innovations in applied harmonic analysis - wavelets! - also had a large impact.

Interesting mathematics could be applied to problems that could be seen, literally! This inspired many applied and (pure) mathematicians to explore and contribute. This was strengthened by the formation of the SIAM activity group in about 2001, as well as by the fact that it had strong, influential participants like Guillermo Sapiro, Andrea Bertozzi, Don Geman, Stan Osher, Tony Chan, David Mumford ... to name only a very few. Another reason image analysis became interesting to mathematically serious folk (as well as many dabblers) was that the ideas went both ways -- cool mathematics could be applied, but also, applications generated exciting, new mathematical problems.

$\color{blue}{\text{Examples}}$

The Mumford-Shah functional: This variational functional, introduced by David Mumford and Jayant Shah to solve segmentation problems, became an object of intense scrutiny from the likes of E. De Giorgi, L. Ambrosio, G. David and others. And as far as I know, the structure theory is still not complete.

ROF functional -- TV denoising: This functional and its variants generated a huge amount of interest. In fact that interest has not died off, especially if one looks at the endless variations that have been generated. Interesting algorithms as well as purer investigations using the tools of geometric measure theory have generated new ideas, even in geometric measure theory itself. Example: Allard's 2007 paper, Total variation regularization for image denoising, I. Geometric theory, uses geometric measure theory tools to definitively expose the nature of TV-regularized image functional minimizers.
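To make the ROF idea concrete, here is a toy 1-D sketch of my own (the actual ROF model is 2-D and non-smooth; I smooth the TV term so that plain gradient descent applies):

```python
import numpy as np

def rof_objective(u, f, lam, eps=1e-2):
    """Smoothed 1-D ROF energy: 0.5*||u - f||^2 + lam * sum_i sqrt((u_{i+1}-u_i)^2 + eps)."""
    du = np.diff(u)
    return 0.5 * np.sum((u - f) ** 2) + lam * np.sum(np.sqrt(du ** 2 + eps))

def rof_grad(u, f, lam, eps=1e-2):
    du = np.diff(u)
    w = du / np.sqrt(du ** 2 + eps)   # derivative of the smoothed |du|
    g = u - f                          # gradient of the data-fidelity term
    g[:-1] -= lam * w                  # each difference term touches two samples
    g[1:] += lam * w
    return g

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0], 25)            # piecewise-constant signal
f = clean + 0.2 * rng.standard_normal(50)    # noisy observation
u, lam, tau = f.copy(), 0.5, 0.04
for _ in range(1000):
    u -= tau * rof_grad(u, f, lam)           # plain gradient descent
print(rof_objective(f, f, lam), "->", rof_objective(u, f, lam))
```

The step size 0.04 is below the inverse Lipschitz constant of the smoothed energy's gradient, so the energy decreases monotonically; real implementations instead use non-smooth solvers (e.g. dual or primal-dual methods) on the exact TV term.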

Geman and Geman: as is clear from the Google citations, it has had an enormous influence on applications. I know the least about this subject, so I am not aware of the details of its impact on mathematics.

The area is stronger than ever and is characterized by a constant influx of fresh ideas, some of which generate very interesting and rich innovations in mathematics. For example, the CS topic brought up by Yuichiro has a big intersection with mathematical image analysis.

$\color{blue}{\text{Discussion:}}$ Is this a grand challenge? I would argue that it is, but it is much more of a grassroots effort, not dominated by one personality but rather driven by a large number of ingenious people and real-world problems. So it is different from the Grothendieck or Lurie or Thurston programs. It is more chaotic, more accessible, yet rich with motivations and inspirations that run very deep as well. It feels to me like something at the intersection of mathematics and physics.

$\color{red}{\large\text{II)}}$ $\color{blue}{\large\text{Mathematics for and from the Data Deluge}}$

It is not news that massive overloads of data are being generated, nor is it a new idea that old analysis tools are not enough. Those who know something of both the current data challenges and available mathematical technology realize that:

Those mathematical tools are largely unexplored for their potential applications to data, and

data problems are powerful sources of new ideas in those (purer) mathematical areas.

This is definitely another grand challenge which in fact subsumes the previous grand challenge of mathematical image analysis. It is of course driven by real world applications, but this in no way lessens the mathematical challenges. But it does broaden them tremendously.

$\color{blue}{\text{What is the nature of the mathematics involved in this challenge?}}$ It is very wide ranging, from geometric measure theory, harmonic analysis and PDE to graph theory, probability and statistics. Real problems are agnostic as to where insights might come from!

$\color{blue}{\text{What are the big questions?}}$ How do we extract information from very high dimensional data? How do we characterize streaming data on the fly? How do we find the proverbial needle in the haystack? etc. etc. etc.

How does this translate into mathematical programs of research? In tremendously varied ways. One has to look at research in mathematics, electrical engineering and computer science (at least) to get a grasp on the large scale of the intellectual energy devoted to these problems.
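One concrete flavor of the high-dimensional questions above can be illustrated with a Johnson-Lindenstrauss random projection, a now-standard tool for dimension reduction. This is a hedged sketch with sizes chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
d, k = 10_000, 500
x, y = rng.standard_normal(d), rng.standard_normal(d)

# A Gaussian random projection (Johnson-Lindenstrauss): with high probability
# it preserves pairwise distances up to small distortion while cutting the
# dimension from d to k.
P = rng.standard_normal((d, k)) / np.sqrt(k)
ratio = np.linalg.norm(x @ P - y @ P) / np.linalg.norm(x - y)
print(abs(ratio - 1) < 0.25)
```

The design choice here is that the projection is oblivious to the data: a single random matrix works for any point set, which is exactly what makes such tools attractive for streaming and massive-data settings.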

I am a neophyte here, but this area is both rather hot and very intriguing. Currents in metric spaces by L Ambrosio, B Kirchheim (2000), Differentiability of Lipschitz functions on metric measure spaces by J Cheeger (1999), and monographs like Heinonen's Lectures on Analysis in Metric Spaces (2001) as well as various papers on analysis in sub-Riemannian spaces are examples and starting points for exploration. (The Helsinki school in analysis seems to me one major driving force here.)

There appear to me to be huge opportunities for progress here. Lots of exciting questions!

I also believe that the potential of this area for understanding and modeling data in metric spaces is just beginning to be realized. Data often comes with some notion of distance, but no natural embedding in a vector space. As the number of mathematicians working simultaneously in both pure and applied modes grows, I believe areas like analysis in metric spaces will be exploited for their power to illuminate applied problems.

Very interesting and well-presented answer Kevin, KUDOS! Somehow I feel that one of the next breakthroughs will be in new math tools to manage and get insights from large data sets, so I look forward especially to digging into your number 2 above
–
Mirco Mannucci Jan 2 '13 at 11:39

I want to second the suggestion of Magnus' ... computational topology is another great example of what I am talking about. (In fact, a piece of what I have done with the flat norm has been spun off by a couple of my collaborators into things similar to the bar codes from the computational topology folk.)
–
Kevin R. Vixie Jan 4 '13 at 19:04

Universality phenomena for determinantal point processes and relatives.

After the deep results obtained by many great researchers concerning independent random variables, a lot of attention has recently been paid to a certain kind of interacting random variables, arising from several (a priori unrelated) fields of mathematics, which behave in the same way as the number of such random variables goes to infinity (appearance of the sine kernel, the Tracy-Widom distribution, ...): the so-called universality phenomenon. This class of interacting random variables is not yet fully identified, but it includes

the eigenvalues of many random matrix models

the lengths of the rows of Young diagrams distributed according to the Plancherel measure
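A toy numerical illustration of this circle of ideas (my own sketch, not tied to any specific result above): the rescaled eigenvalues of a large random Hermitian matrix concentrate on the interval $[-2, 2]$ predicted by Wigner's semicircle law, largely independently of the fine details of the entry distribution.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
G = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
H = (G + G.conj().T) / 2                   # a GUE-type Hermitian random matrix
eig = np.linalg.eigvalsh(H) / np.sqrt(n)   # rescaled eigenvalues

# Wigner's semicircle law: the rescaled spectrum concentrates on [-2, 2];
# the fluctuations of the largest eigenvalue around the edge at 2 are
# governed by the Tracy-Widom distribution.
inside = np.mean(np.abs(eig) <= 2.1)
print(inside >= 0.99)
```

The universality phenomenon is precisely that the same limiting objects (sine kernel in the bulk, Tracy-Widom at the edge) appear for many other models besides this Gaussian one.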

The theory of computing, and more specifically computational complexity and the P ≠ NP problem, represents an important new paradigm in mathematics. It offers a new look at classical issues like the study of algorithms, optimization, and randomness, and provides an interesting lens for many areas/examples/results of math.

Among other things, computational complexity describes several novel notions of "proof," among them the possibility of proving a mathematical statement only "beyond a reasonable doubt."
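A classical concrete instance of a "proof beyond a reasonable doubt" is the Miller-Rabin primality test (my choice of example, not mentioned in the answer above): each passed round leaves at most a 1/4 chance that a composite number slipped through, so $k$ rounds certify primality with confidence $1 - 4^{-k}$.

```python
import random

def miller_rabin(n, k=40, rng=random.Random(0)):
    """Probabilistic primality test: 'prime beyond a reasonable doubt'."""
    if n < 4:
        return n in (2, 3)
    if n % 2 == 0:
        return False
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(k):
        a = rng.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False      # found a witness: n is certainly composite
    return True               # no witness found in k rounds

print(miller_rabin(2**61 - 1))   # a Mersenne prime: True
print(miller_rabin(561))         # a Carmichael number, still caught: False
```

Note the asymmetry: a "composite" verdict is a genuine proof (a witness is exhibited), while a "prime" verdict is only overwhelmingly probable.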

As already mentioned in another answer, quantum information/computation and mathematics related to it represent a major new programme/paradigm.

Ricci flow. It solved the Poincaré conjecture and the $1/4$-pinching conjecture, but it has also become an object of study in its own right. More generally, it has launched a large amount of work on geometric flows (mean curvature flow and others), notably around the idea that other problems can be solved by designing an ad hoc flow.
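The "let the geometry flow" idea can be caricatured at toy scale. The sketch below is a crude discrete stand-in for curve-shortening flow (not Ricci flow itself): each vertex of a closed polygon moves toward the average of its neighbors, a discrete analogue of motion by curvature, and the total length decreases.

```python
import numpy as np

def shorten(curve, dt=0.1, steps=100):
    """Discrete heat/curve-shortening-type flow on a closed polygon:
    each vertex moves toward the average of its neighbors."""
    for _ in range(steps):
        laplacian = np.roll(curve, 1, axis=0) + np.roll(curve, -1, axis=0) - 2 * curve
        curve = curve + dt * laplacian
    return curve

def length(curve):
    return np.linalg.norm(np.roll(curve, -1, axis=0) - curve, axis=1).sum()

t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
ellipse = np.column_stack([2 * np.cos(t), np.sin(t)])
flowed = shorten(ellipse)
print(length(flowed) < length(ellipse))  # True: the flow strictly shrinks the curve
```

The Gage-Hamilton-Grayson theorem says the genuine curve-shortening flow rounds out any embedded closed curve before collapsing it to a point; designing a flow whose long-time behavior forces the structure you want is exactly the strategy that Ricci flow carried out for 3-manifolds.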

Didn't curvature flow (for curves, for example) predate Ricci flow? But you're right in that the idea that Riemannian metrics or submanifolds of Riemannian spaces can "flow" in naturally defined ways has been a driving force in Riemannian geometry. I wonder if there are any examples of this philosophy in other geometries? I only know of the work of Stancu and others in affine convex geometry.
–
alvarezpaiva Dec 31 '12 at 10:14


@alvarezpaiva: As far as I know the history, the first geometric flow was the harmonic map flow introduced by Eells-Sampson in 1964.
–
Robert Haslhofer Dec 31 '12 at 15:09

Has the "grand project" promised by this book flourished, or at least evolved, in the last decade? I don't mean this question to be critical; I am just curious where his line of thought (e.g., "logical complexity engenders geometric complexity") stands ten years after.

Well: understanding the structure of finite simple groups (e.g. maximal subgroups); understanding the way general groups are built from finite simple groups (formally this includes the theory of p-groups, which is a separate thing, but probably you can study this "modulo p-groups"); classifying representations of finite simple groups; understanding primitive permutation groups; and finding other proofs/approaches to the classification itself.
–
Gil Kalai Jan 2 '13 at 15:40


Regarding the representations, a truly grand and, I think, underappreciated achievement is Lusztig's construction of all irreducible complex characters of all finite groups of Lie type (including the vast majority of the finite simple groups). However, the modular representation theory of finite Lie groups is, I believe, still a wide open and exciting grand project.
–
David Ben-Zvi Jan 2 '13 at 16:31

A follow-up in a different direction would be Aschbacher's work on fusion systems. Aschbacher's aim is to extend (!!) CFSG to give a complete classification of all finite fusion systems (of a particular type). This is different from the other follow-ups mentioned above in that Aschbacher is not using CFSG as such; rather, he is applying the techniques developed in the course of proving CFSG to the study of a wider class of objects.
–
Nick Gill Jan 7 '13 at 12:53

In numerical mathematics there is a recent grand theme called randomized numerical linear algebra (RandNLA). One example (probably even a paradigm) is the "randomized range finder" from the paper "Finding structure with randomness". In a nutshell, you hit a (probably very large) matrix $A$ from the right with a random matrix $\Omega$ (which should have far fewer columns than $A$) and then use a traditional numerical algorithm to find an orthonormal basis of the range of $A\Omega$. By hitting the matrix from the right, one reduces the number of columns and hence the potential computational effort. On the other hand, one loses some dimensions of the range of the matrix, but one hopes that, due to the randomization, the most important dimensions are kept.

The general idea is that in most cases the interesting quantities are not as high-dimensional as they look at first glance. In the case of a matrix $A$ with a high-dimensional range, it may be that the "usual element" $Ax$ lives in a space of much lower dimension.

Interesting questions in this area include: What guarantees can be given for the output of a randomized algorithm? To what extent does the distribution from which the random object in the algorithm is drawn influence the quality of the output? Under what circumstances does randomization pay off (e.g. in terms of computational effort or storage)?
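The randomized range finder described above can be sketched in a few lines. This is a hedged, simplified version of the procedure in Halko, Martinsson, and Tropp's "Finding structure with randomness", with sizes and the oversampling amount chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def randomized_range_finder(A, k, oversample=10):
    """Sample the range of A with a random test matrix Omega, then
    orthonormalize the samples to get an approximate range basis Q."""
    n = A.shape[1]
    Omega = rng.standard_normal((n, k + oversample))  # random test matrix
    Y = A @ Omega                                     # columns of Y sample range(A)
    Q, _ = np.linalg.qr(Y)                            # orthonormal basis of range(Y)
    return Q

# A 200 x 150 matrix of exact rank 5: the sketch recovers its range.
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 150))
Q = randomized_range_finder(A, k=5)
err = np.linalg.norm(A - Q @ (Q.T @ A)) / np.linalg.norm(A)
print(err < 1e-8)
```

For exactly low-rank $A$ the projection $QQ^T A$ recovers $A$ essentially to machine precision; for numerically low-rank matrices the oversampling parameter controls the failure probability, which is one of the guarantees the questions above ask about.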

The simultaneous study of a space $X$ and its observables $F(X)$ (real, complex, or operator-valued functions on $X$) is an old topic, but with quantum groups and non-commutative geometry it has been the source of much modern mathematics. The introductory paper by Connes does a really nice job of explaining this.

In the paper Connes underlines the pioneering work of I.M. Gelfand in this area. However, he misses one little thing: Gelfand's work on integral geometry was also motivated by this philosophy. The idea is to consider the incidence relation as a special type of multivalued map between the two spaces and to consider how functions, forms, densities, and other functional objects correspond under the map.
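The space/observables dictionary has a finite toy shadow (my own illustration, necessarily circular at this scale): for a finite set $X$, the points of $X$ correspond to the characters (multiplicative functionals) of the function algebra $F(X)$, a finite caricature of Gelfand duality.

```python
# Toy Gelfand duality on a finite set X: functions are dicts, the algebra
# operations are pointwise, and characters are evaluations at points.
X = ["a", "b", "c"]
indicators = {x: {y: float(y == x) for y in X} for x in X}

def char(x):                      # evaluation at x: a character of F(X)
    return lambda f: f[x]

def mul(f, g):                    # pointwise product in the algebra F(X)
    return {y: f[y] * g[y] for y in X}

# Characters are multiplicative, and distinct points give distinct
# characters, so X can be read off from the algebra of its observables.
f, g = indicators["a"], indicators["b"]
print(char("a")(mul(f, g)) == char("a")(f) * char("a")(g))  # True
print([char(x)(indicators["b"]) for x in X])                # [0.0, 1.0, 0.0]
```

Non-commutative geometry runs this dictionary backwards: keep an algebra of "observables" that no longer commutes, and treat it as the functions on a space that need not exist classically.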

Within the realm of finite permutation group theory there is a series of projects that could be collectively entitled The classification of finite combinatorial objects subject to transitivity assumptions. These kinds of classifications have, of course, been around a long time (the Greeks' interest in Platonic solids is a particular instance), but the nature of this work changed very dramatically with the completion of the Classification of Finite Simple Groups.

Particular threads of this grand project include:

The classification of distance-transitive graphs (cf. work of Saxl, Van Bon, Inglis and others);

The classification of flag-transitive designs (cf. the paper of Buekenhout, Delandtsheer, Doyen, Kleidman, Liebeck and Saxl which gives an almost-complete classification). More recently the flag-transitivity condition has been relaxed, and progress has been made on classifying designs which are, for instance, line-transitive or point-primitive (cf. work by many authors!)

The classification of finite projective planes subject to various assumptions. This is a special case of the previous item. In 1959 Ostrom & Wagner gave a full classification of projective planes admitting 2-transitive automorphism groups; in 1987, and using CFSG, Kantor gave an almost-classification of projective planes admitting point-primitive automorphism groups; results have appeared subsequently dealing with the weaker situation of point-transitivity.

The classification of generalized polygons subject to various assumptions. The previous item is a special case of this. (I know of recent work on generalized quadrangles due to Bamberg, Giudici, Morris, Royle and Spiga; not sure about hexagons and octagons.)

The classification of `special geometries'. This is work initiated (I believe) by Francis Buekenhout in an attempt to understand the sporadic groups (see the earlier answer by J. McKay). The idea is to find geometries on which the sporadic groups act, analogously to the way the groups of Lie type act on Tits buildings.

The classification of regular maps (i.e. graphs embedded nicely on topological surfaces and admitting an automorphism group that is regular on flags/ directed edges). This is the thread that involves the Platonic solids; more recently there is a wealth of work by people like Conder, Siran, Tucker, Jones, Singerman, and many others.

There are many others but these give a flavour (skewed to my own interests).

In many of the threads just mentioned (but not all) a crucial first step in classifying objects is to use the Aschbacher-O'Nan-Scott theorem which describes the maximal subgroups of $S_n$. One then often needs information about maximal subgroups of the almost simple groups and so another famous theorem of Aschbacher comes into play (along with results by Kleidman, Liebeck, and others). These theorems are closely related to the answer given by Gil Kalai - the production of results of this ilk (facts about the finite simple groups) is, in itself, a grand project!

"Krasner, Marshall, Connes and Consani and the author came to hyperfields for different reasons, motivated by different mathematical problems, but we came to the same conclusion: the hyperrings and hyperfields are great, very useful and very underdeveloped in the mathematical literature. Probably, the main obstacle for hyperfields to become a mainstream notion is that a multivalued operation does not fit to the tradition of set-theoretic terminology, which forces to avoid multivalued maps at any cost. I believe the taboo on multivalued maps has no real ground, and eventually will be removed. Hyperfields, as well as multigroups, hyperrings and multirings, are legitimate algebraic objects related in many ways to the classical core of mathematics. They provide elegant terminological and conceptual opportunities. In this paper I try to present new evidences for this."

In information theory (error-correcting codes), the grand achievements of the '90s are turbo codes and LDPC codes. A more recent discovery (2009) that has become a very hot topic is polar codes.

It is tempting to say that the paradigm shift that came with turbo and LDPC codes, as opposed to the earlier popular approaches (convolutional codes, Reed-Solomon codes, BCH codes, et al.), is a shift from algebra to probability, from order to chaos. I mean that the earlier constructions were much dominated by algebraic considerations; e.g., non-recursive convolutional codes are just the ideals in the ring $F_2[x]\oplus \dots \oplus F_2[x]$. Turbo and LDPC codes, in contrast, are constructed and decoded with methods much influenced by probabilistic and randomized considerations: roughly speaking, good LDPC codes can be constructed from a sufficiently sparse random matrix. The decoding method used for LDPC codes, belief propagation, naturally belongs to the mathematics of probability and machine learning rather than algebra.
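The "sparse random matrix" idea can be shown at toy scale. This is a hedged sketch of my own: real LDPC codes use far larger, carefully designed sparse matrices, and are decoded by belief propagation on the bipartite graph of $H$ rather than by enumeration.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, row_weight = 4, 8, 3   # toy sizes; real LDPC codes are far larger

# A random sparse parity-check matrix H over GF(2): the defining object of
# an LDPC code. Codewords are exactly the vectors c with H c = 0 (mod 2).
H = np.zeros((m, n), dtype=int)
for i in range(m):
    H[i, rng.choice(n, size=row_weight, replace=False)] = 1

# Brute-force enumeration of the code (feasible only at toy scale).
codewords = [c for c in np.ndindex(*(2,) * n)
             if not (H @ np.array(c) % 2).any()]
print(len(codewords) >= 2 ** (n - m))  # at least 2^(n - rank H) codewords
```

The sparsity of $H$ is what makes belief propagation work: each parity check involves only a few bits, so the local message-passing updates are cheap and the underlying graph is locally tree-like.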

Actually, a turbo code is almost the same as a convolutional code, modulo one "small" detail: the interleaver. The interleaver is a "randomizer" added to the algebra-flavored convolutional code, and it is the crucial thing that makes everything work. That concerns the encoder. The decoder of turbo codes "resembles" a turbine, hence the name "turbo" code; it is crucially based on probabilistic techniques in coding theory. Admittedly, the key technique, the BCJR algorithm, was developed much earlier, so the division into old and new paradigms is not very precise, but there nevertheless seems to be something behind it.

These ideas have found rich practical applications. If you are reading this on a smartphone, say thanks to turbo codes: they are working there.

The new discovery, polar codes, can probably be characterized as algebra striking back: they seem to be of a quite algebraic nature. Sorry, I cannot say much more for the moment.

The CFSG (classification of finite simple groups) yields L: the finite groups of Lie type, and S: the non-Lie groups, i.e. the 26 sporadic simple groups. We do not know how natural this taxonomy is.

One approach is that of (categorical) 2-groups. Another is that of BIRS Banff 12frg158, which is an attempt to tame the sporadics using integrable systems, symplectic geometry, characteristic classes, and mathematical physics. This may lead to a flourishing of new interconnections between many fields.

I might say that algebraic stacks and their development constitute a "grand project" (although, admittedly, this is something I do not know a whole lot about). In particular, there is the "Stacks Project", found here: stacks.math.columbia.edu/

I don't really know if this fits the question; it only fits the title. After all, algebraic stacks were already studied in the mid-'60s, and the idea of moduli spaces/stacks is even older.
–
Martin Brandenburg Jan 2 '13 at 10:45