Since Euclid's axiomatization of space, we have developed a sophisticated mathematical model of space. Given a category of structures (measures), local space is modeled as the spectrum of measurements that these structures can possibly make. Global space is construed as patches connected via transport, which identifies measurements across patches.

I'm troubled that I have not come across any axiomatization of time. Assuming that mathematics is an a priori science, the great variety of theories of space in physics can be attributed to our sophisticated mathematical model of space: there is relativity, string theory, quantum theory, and M-theory.

Perhaps the reader may object that these theories are theories of space-time, rather than theories of space. However, I wish to note that in these theories, time is essentially treated in the same manner as space. In classical physics, time is but another dimension of space. In relativity, time is distinguished from space by the $(3,1)$ signature, but this is just a metric. Riemannian geometry is still considered a theory of space rather than a theory of time.

I'm wondering, then, whether you have encountered a mathematical axiomatization of time that treats time in a way that is not inherently spatial. Assuming once more that mathematics is an a priori science, perhaps such an axiomatization could lead to breakthroughs in physics and finance.

Finally, there is a physical theory that I think comes close to a model of time: entropy. Just as space is dual to measures, we can think of time as dual to entropy. Given that entropy can be defined using combinatorics and probability, this could be viewed as a mathematical theory.
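To make "entropy via combinatorics and probability" concrete, here is a minimal sketch (function names are mine, not standard): Boltzmann entropy counts microstates, Shannon entropy weighs probabilities, and the two agree on the uniform distribution.

```python
import math

def boltzmann_entropy(num_microstates):
    """Boltzmann entropy S = ln W (in units where k_B = 1)."""
    return math.log(num_microstates)

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum p_i ln p_i of a discrete distribution."""
    return -sum(p * math.log(p) for p in probabilities if p > 0)

# A system of n fair coins has W = 2^n equally likely microstates,
# so S = n ln 2, and the uniform distribution attains this as its
# Shannon entropy -- a purely combinatorial/probabilistic statement.
n = 10
S = boltzmann_entropy(2 ** n)
H = shannon_entropy([1 / 2 ** n] * 2 ** n)
assert abs(S - n * math.log(2)) < 1e-12
assert abs(S - H) < 1e-12
```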

EDIT: Steve mentioned that perhaps one can view entropy as a theory of time via the Thermal Time Hypothesis. Other than entropy, are there any other axiomatizations of time?

ANOTHER EDIT: In the answers given below, most of the models of time are Archimedean. I'm wondering whether these models can be tweaked to allow a cyclic conceptualization of time. Many ancient cultures, e.g. from India, consider time to be cyclic rather than Archimedean. Should I ask this as a separate question?

I think of this cyclic/Archimedean dichotomy as something like the Euclidean/non-Euclidean geometry dichotomy.

The space in which I live is Hausdorff, I don't know yours...
–
Qfwfq Mar 8 '11 at 13:39


If you're looking for axiom systems that give rise to entropy, you should look in the field of statistical mechanics and thermodynamics, and not in the fields of relativity/geometry. Thermodynamics and the increase of entropy are characteristics of all reversible systems, and do not depend on the geometry of four-dimensional space-time. And I feel certain that some mathematician must have axiomatized thermodynamics at some point.
–
Peter Shor Mar 8 '11 at 13:45


@unknowngoogle: I don't quite understand your comment. I guess I would say the space I live in is Hausdorff.
–
Colin Tan Mar 8 '11 at 14:00

10 Answers

In probability, time is usually handled as a nested sequence of $\sigma$-algebras (say $B_t$, with $B_t \subset B_s$ if $t\leq s$), and to find the reality (call the reality $f$, and it includes the state at all times past and future) at time $t$, one takes the conditional expectation $f_t := E[f | B_t ]$. The sequence $(f_t)$ is then a martingale (a uniformly integrable martingale, more precisely), and this construction is the essence of what the big deal is about martingales.

Brownian motion is a martingale that you've probably heard of, but this also handles simpler situations. For example, consider the experiment: toss a coin repeatedly, and keep track of how many heads you've thrown, minus how many tails. We can capture this experiment in the following way: For $0\leq x <1$, let $f_n(x)$ be the number of 1's minus the number of 0's among the first $n$ digits of the binary expansion of $x$, and let $B_n$ be the $\sigma$-algebra (in this case, a boolean algebra) generated by the intervals $[i/2^n,(i+1)/2^n)$. Then $f_n$ is $B_n$-measurable, and $E[f_t | B_s]=f_s$ for any natural numbers $s < t$, and the sequence $(f_n)_{n=1}^\infty$ is a martingale (albeit different from the type mentioned above). If you want to play any fair game on the "coin tosses" as they come up (allowing use of knowledge of all previous-in-time tosses), then your fortune at time $t$ is still a martingale.
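This coin-tossing martingale is easy to verify by brute force. A minimal sketch (function names are mine): the conditional expectation with respect to $B_s$ is just the average of $f_t$ over each dyadic interval of level $s$, and that average equals $f_s$.

```python
from itertools import product

def f(bits):
    """Number of 1's minus number of 0's among the given binary digits."""
    return sum(1 if b else -1 for b in bits)

def cond_exp(prefix, t):
    """E[f_t | B_s] on the dyadic interval fixed by `prefix` (s = len(prefix)):
    average f_t over all equally likely extensions of the first s digits."""
    s = len(prefix)
    exts = list(product([0, 1], repeat=t - s))
    return sum(f(list(prefix) + list(e)) for e in exts) / len(exts)

# Martingale property: conditioning f_t on the first s coin tosses
# recovers f_s exactly, on every dyadic interval at level s.
s, t = 3, 7
for prefix in product([0, 1], repeat=s):
    assert cond_exp(prefix, t) == f(prefix)
```

The assertion holds exactly because the future increments are fair coin tosses with mean zero, which is the whole content of the martingale property in this example.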

In other words, the passage of time is captured as un-conditional-expectating a function.

For a practical introduction to martingales, I recommend Williams' "Probability with martingales." It is a marvel of writing, and in my humble opinion should be taken as a model for how to write a monograph.

+1. One can also say that the passage of time is captured in this axiomatization as the gradual refinement of the $\sigma$-algebra $B_t$, which in turn could be understood as a suitable abstraction of the intuitive idea of accumulating memory. This is in fact a very old concept, which might predate the concept of time as an extension similar to space; Augustine has a nice discussion of it in Book 11 of his "Confessions" (which is almost the only thing I remember from the introduction to philosophy I had as a physics major).
–
ansobol Mar 8 '11 at 20:21

+1 for "un-conditional-expectating". I hope I find occasion to use this in a paper.
–
Louigi Addario-Berry Mar 10 '11 at 15:19

I am going to argue that it is not so easy to dismiss the theory of relativity as a reasonable description of time just because Lorentzian geometry is superficially similar to Riemannian geometry.

Let us consider special relativity as an analogue of Euclidean geometry. In terms of an axiomatic theory based on a priori physical observations, special relativity can be thought of as coming from two physical "axioms":

1) Light propagates in a vacuum, and therefore has constant speed relative to an arbitrary inertial reference frame.

2) Most physical laws should transform covariantly.

Based on this, one can write down the group of transformations between inertial reference frames, and then observe that this is the (linear) isometry group of the metric

$\eta = -dt^2 + dx^2 + dy^2 + dz^2$

Evidently space and time interact with one another, just from the naive way we perceive our existence. So one way to frame the question is: what other axioms do we need to add to Euclidean geometry to be able to accurately add a time dimension? Is the idea of a unified space-time even possible? A brilliant aspect of SR is that you really shouldn't think of space and time as separate entities, with separate axiom schemes, at least in this classical sense. You can just somehow add 1) and 2) to Euclidean geometry, and with hindsight get the space $(\mathbb{R}^{1,3}, \eta)$. Now we don't have to think about this as a static geometric object. In fact the Laplacian turns into the wave operator, i.e. $\Delta = \partial_x^2 + \partial_y^2 + \partial_z^2$ becomes $\Box = -\partial_t^2 + \partial_x^2 + \partial_y^2 + \partial_z^2$.
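The claim that the transformations between inertial frames are exactly the isometries of $\eta$ can be checked numerically: a boost $\Lambda$ satisfies $\Lambda^T \eta \Lambda = \eta$. A minimal sketch in the $(t,x)$ plane with units $c=1$ (function names are mine):

```python
import math

def boost(v):
    """Lorentz boost along x with velocity v (units c = 1), acting on (t, x)."""
    g = 1.0 / math.sqrt(1.0 - v * v)  # the Lorentz factor gamma
    return [[g, g * v],
            [g * v, g]]

def conjugate(L, eta):
    """Compute L^T eta L for 2x2 matrices, without external libraries."""
    n = 2
    LT_eta = [[sum(L[k][i] * eta[k][j] for k in range(n)) for j in range(n)]
              for i in range(n)]
    return [[sum(LT_eta[i][k] * L[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

eta = [[-1.0, 0.0], [0.0, 1.0]]  # the (t, x) block of -dt^2 + dx^2
L = boost(0.6)
result = conjugate(L, eta)
# The boost preserves the metric: L^T eta L = eta.
for i in range(2):
    for j in range(2):
        assert abs(result[i][j] - eta[i][j]) < 1e-12
```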

So in particular the causal character is built into the geometry, and furthermore this is coming from more or less an axiomatic description. The way that time is built into the geometry is "hidden" in the sense that there is no universal frame describing time, but again this just reflects the same physical fact. Then by analogy one would compare nontrivial solutions of the Einstein equations with noneuclidean geometry.

I know that this is likely already familiar to you, and that it seems from your question that you may want to think of things from a different viewpoint. However I wanted to point out that in some sense relativity is like an axiomatization of time, and that the deep relation to geometry isn't some mathematical bias, but rather based on physical observations (so, a priori a science).

It is interesting to note that axiom 1) may be reformulated in a more mathematical fashion: "The transformation between any two inertial reference frames is a linear affine transformation of coordinates." If this is assumed, then in the theory there exists a parameter which has the dimension of velocity and is universal for all observers. This gives us the Poincaré group of symmetries. Then we may look for a theory with such symmetry built in; it is realized in the Maxwell equations, and the discovered parameter is equal to the speed of light in vacuum.
–
kakaz Mar 8 '11 at 22:00

Your comment reminded me that all of this is explained beautifully in an address Minkowski made in 1908 called "Space and Time." I haven't found it online but it is published in the Dover book titled "The Principle of Relativity." In the article Minkowski gives a mathematical way to go from Newtonian mechanics to SR and also argues that one should not treat space and time separately.
–
Ken Knox Mar 9 '11 at 0:31

In algebraic quantum field theory, time evolution can be identified with the modular flow of Tomita-Takesaki theory.

The basic setup of algebraic quantum field theory is that we are given a covariant functor $\mathcal A$ : {double cones in our space time $\mathbb R^4$} $\mapsto$ {von Neumann algebras on H}.
Here, H is a Hilbert space. A double cone is the intersection of a light cone with an inverse light cone. The von Neumann algebra $\mathcal A (\mathcal O)$ associated to a double cone $\mathcal O$ is to be thought of as being generated by all possible observables that can be observed within the limited amount of space and of time that you have in $\mathcal O$.
There's also a distinguished vector $\Omega \in H$ called the vacuum vector.

Ok. That's the setup.
Then, there are a whole bunch of axioms...
I'll skip them.
Among other things, you want the algebras $\mathcal A(\mathcal O)$ to be generated by Wightman fields (= operator-valued distributions) paired against functions with support in $\mathcal O$.

The result I'm talking about is called the Bisognano-Wichmann theorem. It says that the modular flow of Tomita-Takesaki theory for the algebra $\mathcal A (\mathcal O)$ with respect to the vector $\Omega$ can be identified with a version of time translation. Here, I say "a version" because you want an action of $\mathbb R$ that preserves $\mathcal O$. It's a flow that fixes the two cone points of $\mathcal O$, and that respects the conformal class of the Minkowski metric on $\mathbb R^4$.

I recommend Araki's book Mathematical Theory of Quantum Fields for a gentle introduction to algebraic quantum field theory.

The thermal time hypothesis (TTH) of Connes and Rovelli might be the sort of thing you're looking for.

By way of background, let $\mathcal{H}$ be a Hamiltonian. The thermal density matrix is $\omega = Z^{-1}e^{-\beta \mathcal{H}}$, where $Z = \mbox{Tr}(e^{-\beta \mathcal{H}})$, and the time evolution of an observable $A$ is given as usual by $e^{i\mathcal{H}t/\hbar} A e^{-i\mathcal{H}t/\hbar}$. Now the one-parameter modular group of $\omega$ that appears in the Tomita-Takesaki theory of von Neumann algebras can be shown to coincide with the time evolution group: if $s$ is the modular parameter and $t$ is the physical time, then $t = \hbar \beta s$. In particular, $s$ does not depend on $\beta$.

The TTH states that physical time is determined by the modular group, which is in turn determined by the state. Besides implying Hamiltonian mechanics, the TTH simultaneously inverts and generalizes the Kubo-Martin-Schwinger condition and hence also the Gibbs relation, with temperature providing the physical link between time evolution and equilibria.
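In finite dimensions the coincidence of the modular group with time evolution can be checked directly: for a faithful state $\omega(A) = \mathrm{Tr}(\rho A)$ with $\rho = Z^{-1}e^{-\beta H}$, the modular flow is $\sigma_s(A) = \rho^{is} A \rho^{-is}$. A sketch for a diagonal two-level Hamiltonian; note that with the conventions chosen below the physical time comes out as $t = -\hbar\beta s$, the overall sign being a matter of convention for the modular group.

```python
import cmath

hbar, beta = 1.0, 2.0
E = [0.3, 1.1]                    # eigenvalues of a diagonal Hamiltonian H
A = [[0.0, 1.0], [1.0, 0.0]]      # an observable (Pauli x)

def conj_by_diag(phases, A):
    """Compute U A U^{-1} for the diagonal unitary U = diag(phases)."""
    return [[phases[i] * A[i][j] / phases[j] for j in range(2)] for i in range(2)]

def modular_flow(s, A):
    """sigma_s(A) = rho^{is} A rho^{-is} for rho = e^{-beta H}/Z (the Z's cancel)."""
    return conj_by_diag([cmath.exp(-1j * s * beta * e) for e in E], A)

def heisenberg(t, A):
    """Heisenberg evolution A(t) = e^{iHt/hbar} A e^{-iHt/hbar}."""
    return conj_by_diag([cmath.exp(1j * t * e / hbar) for e in E], A)

# The modular flow at parameter s is Heisenberg evolution at t = -hbar*beta*s
# (with this sign convention for the modular group).
s = 0.7
M, Ht = modular_flow(s, A), heisenberg(-hbar * beta * s, A)
for i in range(2):
    for j in range(2):
        assert abs(M[i][j] - Ht[i][j]) < 1e-12
```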

Hmmm ... the question asked intimately blends quantum mechanics and general relativity, in the sense that time (as we experience it in everyday life) is associated to our ability to causally order events.

For example, we readily communicate information forward in time, but not backwards in time, and in particular, we cannot send information faster than speed-of-light. How does this work?

Even more challenging: how can we reduce these physical puzzles to well-posed mathematical problems?

The discussion in Nielsen and Chuang's textbook Quantum Computation and Quantum Information of "The Principles of Deferred and Implicit Measurement" bears directly upon these mysteries ... and in turn, the Nielsen and Chuang discussion derives largely from work by Kraus, Lindblad, and Choi ... and in turn, Kraus, Lindblad, and Choi based their work largely on theorems derived in a dry 1955 article by W. Forrest Stinespring titled "Positive Functions on {$C^\ast$}-Algebras."

So we are led to ask: how does one explicitly link Stinespring's dry algebraic theorems to the juicy physical mysteries of causality, relativity, and quantum mechanics?

Well, I had occasion earlier this week to post about this link on Scott Aaronson's Shtetl Optimized weblog, and I append that discussion.

The short answer is "A seminal 1955 experiment by Hanbury Brown and Twiss established the connexion" ... and the details are very interesting.

Entire books have been written upon this subject, and so I hope MathOverflow readers don't mind a fairly lengthy answer ... which nonetheless covers only a tiny fraction of this fascinating topic ...

One wonderful aspect (of many, IMHO) of Scott [Aaronson] and Alex [Arkhipov]'s new class of linear optics experiments is the motivation these experiments provide for students to go beyond Feynman's celebrated Lectures on Physics in understanding the physics of photon counting.

The quantum physics of photon detection is a subtle topic that even Richard Feynman got wrong on occasion. The story of Feynman's mistake is vividly told in the Physics Today obituary for Robert Hanbury Brown (volume 55(7), 2002), which tells of Feynman standing up during a talk by Hanbury Brown, proclaiming (wrongly) "It can't work!", and walking out of the lecture.

The quantum physics associated to this Feynman story is summarized in a series of six short letters, totaling 12 pages in all, that appeared in Nature during 1955-6. These letters describe what is today called the "Hanbury Brown and Twiss Effect"—the first-ever observation of higher-order photon counting correlations.

The story of the Hanbury Brown and Twiss Effect, as recounted on the pages of Nature, in effect has six thrilling episodes:

Episode 2: Brannen and Ferguson announce (in effect) "The claims of Hanbury Brown and Twiss, if true, would require major revision of some fundamental concepts of quantum mechanics; moreover when we did a more careful experiment, we saw nothing." (The question of correlation between photons in coherent light rays, Nature 178(4531), 1956).

Episode 3: Not yet having seen Brannen and Ferguson's criticism, Hanbury Brown and Twiss further announce (in effect) "We observe nontrivial correlations even in photons from the star Sirius, and our theory allows us to determine its diameter" (Test of new type of stellar interferometer on Sirius, Nature 178(4541), 1956).

Episode 4: Hanbury Brown and Twiss reply "The experiment of Brannen and Ferguson was grossly lacking in sensitivity; had they analyzed their experiment properly, they would have expected to see no effect" (The question of correlation between photons in coherent light rays, Nature 178(4548), 1956).

Episode 6: Hanbury Brown and Twiss announce (in effect) "When the experimental methods of Brannen and Ferguson are implemented with higher sensitivity, and analyzed with due respect for quantum theory as explained by Purcell, the results wholly confirm our earlier findings." (Correlation between photons, in coherent beams of light, detected by a coincidence counting technique, Nature 180(4581), 1956).

When we read the 12-page story of Hanbury Brown and Twiss side-by-side with the discussion of photon counting in The Feynman Lectures on Physics, we are struck by three aspects of the Hanbury Brown and Twiss experiments that are not emphasized in the Feynman Lectures.

First, the Hanbury Brown and Twiss articles exhibit a charming physicality that is largely absent from the Feynman Lectures. For example, Hanbury Brown and Twiss describe the use of an "integrating motor" to measure the total current associated to photon detection during an experimental run. Modern physics students will wonder "What the heck is an integrating motor?", yet in the physics literature of the 1950s this concept was viewed as being so intuitively obvious as to require no explanation: the total number of revolutions of an electric motor (as counted by purely mechanical means!) obviously can be made proportional to the integral of the current flowing through it ... that's how electric meters work, right?

As Ed Purcell's letter to Nature rightly observes, the observation of subtle quantum correlations with purely mechanical counters "adds lustre to the notable achievement of Hanbury Brown and Twiss."

Second, the experimental protocol of Hanbury Brown and Twiss includes elements that are highly sophisticated from the viewpoint of modern quantum information theory. In particular, while aligning their apparatus, they reverse the flow of photons by placing their eyes at the position of the source, and while physically looking at two photodetectors through a half-silvered mirror, they adjust the mirrors such that the images of the photodetectors are coherently superimposed. We nowadays appreciate that from the viewpoint of QED, this time-reversed coherence is necessary to ensure that quantum fluctuations in the photon detector currents are deterministically associated to quantum fluctuations in the photon source currents.

Third, it follows that in the observations of Sirius recorded by Hanbury Brown and Twiss, their experimental record of correlated photocurrents here on earth is deterministically associated to currents that span the surface of the remote star Sirius -- eight light-years away! This counterintuitive implication was why many theoretical physicists (including Feynman) at first considered the results of Hanbury Brown and Twiss to be (literally) incredible.

Nowadays we appreciate that this seeming paradox is naturally reconciled via the quantum informatic mechanism that Nielsen and Chuang call the "Principles of Deferred and Implicit Measurement" -- principles that are formally associated to work by Kraus and Lindblad in the 1970s; principles that were not readily appreciated by Feynman and his colleagues in the 1950s.

[note added: although Stinespring published his theorems in 1955, it took decades for physicists to appreciate their implications.]

Moreover, the experiments of Hanbury Brown and Twiss were vastly wasteful of photonic resources. The star Sirius emits about $10^{46}$ photons/second, of which Hanbury Brown and Twiss detected about $10^{9}$ two-photon entangled states/second ... the relative production efficiency thus was a dismal $10^{-37}$. Even today, more than 50 years later, the production of six-photon entangled states still is dismally inefficient: in recent experiments $10^{18}$ photons/second of pump power yield about one six-photon state per thousand seconds, for a relative production efficiency of order $10^{-21}$.
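The efficiency figures above are simple order-of-magnitude ratios; a quick sanity check of the arithmetic:

```python
import math

# Order-of-magnitude bookkeeping for the photon-budget figures quoted above.
hbt_emitted, hbt_detected = 1e46, 1e9    # photons/s from Sirius vs. detected pairs/s
modern_pump, modern_rate = 1e18, 1e-3    # pump photons/s vs. six-photon states/s

hbt_efficiency = hbt_detected / hbt_emitted
modern_efficiency = modern_rate / modern_pump

assert round(math.log10(hbt_efficiency)) == -37
assert round(math.log10(modern_efficiency)) == -21
```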

We see that one of the fundamental challenges (among many!) that Scott and Alex's experiment poses for 21st century physicists, is to devise methods for generating entangled photon states that are exponentially more efficient than existing methods. To achieve this, modern physicists will have to do exactly what Hanbury Brown and Twiss did ... "look" at the photon detectors from the time-reversed viewpoint of the photon source ... and then (by careful design) arrange for the photon source currents to have near-unity correlation with the photon detector currents.

This is an immense practical challenge in cavity quantum electrodynamics, and we are certain to learn a great deal in trying to solve it. At present we are about as far from having scalable quantum-coherent n-photon sources as we are from having scalable quantum-coherent n-gate quantum computers.

These considerations are why, from an engineering point of view, it is prudent to regard n-photon linear optics experiments not as being obviously easier than building n-gate quantum circuits, but rather as being comparably challenging from a technical point of view. And this is why it will not be surprising (to me) if the Aaronson/Arkhipov distribution-sampling algorithms prove in the long run to be comparably seminal, mathematically and theoretically, and comparably challenging experimentally, to Peter Shor's number-factoring algorithms.

Summary: A satisfactory understanding of mathematical/physical time is intimately bound-up with our understanding of experiments like that of Hanbury Brown and Twiss ... and even after many decades of work, we still have a long way to go, to achieve this understanding.

In particular, despite more than a century of work, we still lack a mathematical roadmap that naturally accommodates the quantum dynamics of field theory, the informatic causality of Stinespring/Kraus/Choi/Lindblad, and the dynamical state-space geometry of Riemann and Einstein ... see the concluding section of Ashtekar and Schilling's arxiv manuscript Geometrical formulation of quantum mechanics, and also Troy Schilling's thesis Geometry of Quantum Mechanics (Penn State, 1996) for further discussion.

Added comment: Troy Schilling's 1996 thesis Geometry of Quantum Mechanics is well-conceived, and I have often wondered about Schilling's subsequent career. If anyone has information, please post a comment.

I know Troy (we worked at a think tank together around 2000). He is doing crypto stuff in industry now.
–
Steve Huntsman Apr 11 '11 at 15:57

Thank you very much, Steve! Perhaps Troy will see this post, and will enjoy knowing that his thesis work has found a fan base. Last week we presented quantum simulation methods derived from Troy's geometric dynamics formalism at the 52nd ENC at Asilomar, and these geometric methods were well received by nuts-and-bolts spectroscopists and microscopists. Presentation here: faculty.washington.edu/sidles/ENC_2011/
–
John Sidles Apr 16 '11 at 21:51

An interesting approach is causal sets together with a causality relation. There are physical models of quantum gravity based on this approach, for example causal dynamical triangulations, developed by Renate Loll, Jan Ambjørn and Jerzy Jurkiewicz. In these theories time is an order defined by the causality of events, which is the basic notion, and the spacetime structure follows from it in a dynamical way.

There are several axiomatic systems of temporal logic (aka tense logic). You can find more e.g. in Hodkinson and Reynolds's chapter in the Handbook of Modal Logic, in this introduction by Venema, or, from a more philosophical standpoint, in the SEP.
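To make the flavor of tense logic concrete, here is a minimal evaluator for a few temporal operators over finite traces (a toy fragment of my own devising; the systems surveyed in the references above are axiomatized over richer flows of time):

```python
def always(phi, trace, i=0):
    """G phi: phi holds at every time from i onward."""
    return all(phi(trace, k) for k in range(i, len(trace)))

def eventually(phi, trace, i=0):
    """F phi: phi holds at some time from i onward."""
    return any(phi(trace, k) for k in range(i, len(trace)))

def until(phi, psi, trace, i=0):
    """phi U psi: psi eventually holds, and phi holds at every earlier time."""
    return any(psi(trace, k) and all(phi(trace, j) for j in range(i, k))
               for k in range(i, len(trace)))

# A trace is a finite sequence of states; atomic propositions inspect a state.
trace = ["red", "red", "green", "blue"]
red = lambda tr, k: tr[k] == "red"
green = lambda tr, k: tr[k] == "green"

assert eventually(green, trace)
assert not always(red, trace)
assert until(red, green, trace)  # red holds until green first becomes true
```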

Probably the evolution of our knowledge is governed by nonmonotonic logic. As gathering more knowledge is similar to gathering information, this may provide a relation from information theory and the entropy formulation to logic. It is worth noting that even in nonmonotonic logics there may be a monotonic core and a nonmonotonic periphery of knowledge, so in some way it may be more general than temporal logic, which may be contained within it. Of course this is only speculation about a system more general than temporal logic.
–
kakaz Mar 8 '11 at 22:09

Also, nonmonotonicity is a semantic notion, while you link to a system defined by a syntactic language.
–
kakaz Mar 8 '11 at 22:09

I'm afraid you misunderstand the concepts. In any kind of formal system, if you change the list of axioms, it changes (at least potentially) the list of derivable theorems. The only thing special about non-monotonic logic is that this dependence of the set of theorems on the set of axioms is not necessarily increasing (adding axioms may remove theorems). Otherwise the system is completely static, just as classical logic. There is no way to refer to change of information from within the system.
–
Emil Jeřábek Mar 9 '11 at 11:17

A bit tangential, but I'm surprised no one mentioned time in general relativity (Ken Knox discussed special relativity, but the case in general relativity is subtly different). This answer discusses relativity from a perspective closer to a physicist's, since it's a bit more elementary (and hence easier to understand) in my view. As discussed at the end, relativity in general is somewhat incompatible with statistical mechanics, at least under the standard approximations, so this is almost surely not what you're looking for, but it may be of use to someone.

In GR, space-time is a 4-manifold which is endowed with a Lorentzian metric $g_{ab}$, which is a rank 2 covariant tensor. The scalar product of two vectors $V$ and $W$ is then $g_{ab}V^aW^b$, from which we can compute things as in special relativity (where the metric is $\eta_{ab}$, which is $0$ if $a \ne b$, $-1$ if $a=b$ and $x^a$ is a spatial coordinate (i.e. $x$, $y$, $z$), and $1$ if $a=b$ and $x^a=t$ is the time coordinate).

If a vector (field) $V$ satisfies $g_{ab}V^aV^b >0$, we call it time-like, and similarly for trajectories based on their tangent vectors. These are the possible trajectories for particles with positive mass. Any time-like trajectory corresponds in the limit of low mass and low speed to a local frame of reference in which it is the 'time', and if the trajectory is a geodesic then the frame is inertial. Particles with 0 mass (i.e. light) have null trajectories, with $g_{ab}V^aV^b =0$. If $g_{ab}V^aV^b <0$, the vector is space-like. Particles with no forces (other than gravity) acting on them travel on geodesics. The trajectory of a massive particle is called its world line. The sign conventions here are often reversed, so care is advised. For a time-like path, we can compute its length by $ds^2 = g_{ab}\, dx^a dx^b$, where $x^a$ are your coordinates.
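The classification of tangent vectors by causal character, in the $(+,-,-,-)$ convention used in this answer, can be sketched for the flat metric $\eta$ as follows (function names are mine):

```python
def minkowski_square(V):
    """g_{ab} V^a V^b for flat eta with signature (+,-,-,-); V = (t, x, y, z)."""
    t, x, y, z = V
    return t * t - x * x - y * y - z * z

def causal_character(V):
    """Classify a tangent vector by the sign of its metric square."""
    q = minkowski_square(V)
    if q > 0:
        return "timelike"
    if q < 0:
        return "spacelike"
    return "null"

assert causal_character((1, 0, 0, 0)) == "timelike"   # observer at rest
assert causal_character((1, 1, 0, 0)) == "null"       # a light ray along x
assert causal_character((0, 1, 0, 0)) == "spacelike"
```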

For any observer, they observe themselves as stationary, traveling forward in time with unit speed. Time for that observer corresponds exactly to the length of their trajectory. That observer can even set up a local set of coordinates in which the metric is approximately $\eta_{ab}$, provided all masses are sufficiently far away. Locally, then, time behaves as in special relativity. The big difference between special and general relativity is that the latter has no inertial reference frames in general, so observers can only measure times at their own location.

To summarize, time is another coordinate in space-time, just like space, but it isn't universal (it's observer dependent), and the only real restriction on it is that it must be the part of the metric that has positive signature (in the sense above).

Since this was no doubt useless in explaining the difference between time in special and general relativity, I recommend Hughston & Tod's An Introduction to General Relativity. It's fairly light reading and has an introduction to special relativity, but it's rigorous enough for mathematicians. A number of other books at a higher level are available, of which Hawking & Ellis, Wald, and Misner, Thorne & Wheeler are all good references.

However, if you're looking for a formulation in which the second law of thermodynamics is provable, or even where entropy is defined, relativity is the wrong place to look. Even in special relativity, concepts like thermal equilibrium depend on a particular reference frame, so entropy may be defined in one inertial frame but not another. There has historically been great debate over how temperature (which is the thermodynamic conjugate of entropy) should transform under Lorentz transformations, and it's still not totally resolved.

I've been avoiding answering this question, as I think axiomatization of time is a rather fruitless activity. But, as no one else has mentioned this, I feel, unfortunately, compelled to post this answer.

The axiomatic approach to topological quantum field theory [Blanchet and Turaev] defines a topological quantum field theory as a functor from a cobordism category to vector spaces, subject to axioms that include a normalization axiom: the cylinder $\Sigma \times [0,1]$ is sent to the identity map on the vector space assigned to $\Sigma$.

This normalization axiom is an axiomatization of time as it occurs in any diffeomorphism invariant theory.

In more detail, any theory that is diffeomorphism invariant is, in particular, invariant with respect to diffeomorphisms in the time direction $t'(t)$. The generator of time evolution is the Hamiltonian $H$. Thus, any state in the Hilbert space is invariant under the action of the Hamiltonian $H$. This is the exact content of the normalization axiom.
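The step from the normalization axiom to the invariance of states can be spelled out in one line; this is a sketch, with the usual identification of the cylinder $\Sigma\times[0,T]$ with time evolution by $T$ (factors of $\hbar$ and the overall sign are conventional):

```latex
Z(\Sigma \times [0,T]) = \mathrm{id}_{Z(\Sigma)} \ \text{for all } T
\quad\Longrightarrow\quad
e^{-iHT/\hbar} = \mathrm{id} \ \text{for all } T
\quad\Longrightarrow\quad
H = 0,
```

so every state $\psi$ in the Hilbert space satisfies $H\psi = 0$, which is exactly the invariance of states under the Hamiltonian described above.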

The mathematician's concern with the axiomatization of time arises as a result of the physicist's concern with the modelling of time. It strikes me that there is no non-Archimedean or cyclic model of time in physics. Perhaps this is a result of the belief in the 2nd law of thermodynamics. I'm interested in where it makes sense to model time as cyclic.
–
Colin Tan May 15 '11 at 10:52

Again this is not only about time, but rather about space-time, though quite different from the other approaches mentioned:

The Hungarian logicians Andreka, Madarasz, Nemeti, Toke and Sain have actually explored the possibilities of treating space-time and relativity in terms of model theory, so they really give axiomatizations in the technical sense and not just mathematical models. See the preprints on this page here, which also include a survey of previous axiomatizations of space-time.