Topological entropy is one measure of the topological order of stuff. I know the term from Kitaev and Preskill, and Levin and Wen. Livine and Terno allude to it in computing the entropy of a boundary in BF theory and spin networks (without gravity - they are hoping to address that in the next paper).

http://arxiv.org/abs/0801.0861
Quantum Graphity: a model of emergent locality
Tomasz Konopka, Fotini Markopoulou, Simone Severini
25 pages
(Submitted on 6 Jan 2008)
"Quantum graphity is a background independent model for emergent locality, spatial geometry and matter. The states of the system correspond to dynamical graphs on N vertices. At high energy, the graph describing the system is highly connected and the physics is invariant under the full symmetric group acting on the vertices. We present evidence that the model also has a low-energy phase in which the graph describing the system breaks permutation symmetry and appears to be ordered, low-dimensional and local. Consideration of the free energy associated with the dominant terms in the dynamics shows that this low-energy state is thermodynamically stable under local perturbations. The model can also give rise to an emergent U(1) gauge theory in the ground state by the string-net condensation mechanism of Levin and Wen. We also reformulate the model in graph-theoretic terms and compare its dynamics to some common graph processes."

This impresses me as a risky idea, still at a preliminary stage of development. I'm not sure I would be reading this paper now if it hadn't been for that 5-page note by Jerzy K-G.
Jerzy's note is solidly grounded (IMHO) in the constrained BF formulation of gravity. He doesn't need to postulate much of anything new. And yet it associates a particle of matter with a tunnel singularity stretching to infinity.

Fra and Atyy, thanks for responding. The ideas here may be more familiar to you than they are to me and I suspect that I am confused about some things here.

From my perspective, whether or not it seems to follow reasonably, the Kowalski-Glikman note lends credibility to this radical graph-theoretic picture by Markopoulou. Or if not credibility at least interest.

The graphity people have a kind of cosmogony where the universe begins as a complete graph---fully connected---each node connected to every other node.

They have a concept of temperature defined in such a way that as the graph "cools", it gets more and more like a nice hexagonal lattice----more like the space we are used to.

The graph evolves by repeated application of "moves" which are able to turn off and turn on the links between nodes, and locally rearrange how the nodes are connected.

I found it interesting to see how they defined energy of a graph E(G).
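Their actual Hamiltonian has several terms (valence terms, loop terms); just to fix ideas, here is my own much-simplified toy sketch of the flavour of such a graph energy - a function that penalizes vertex degrees away from a preferred valence, so the hot complete graph scores high and a cold regular lattice scores low. The function names and the specific form are mine, not the paper's:

```python
# Toy sketch of a graphity-flavoured graph energy (my simplification,
# NOT the paper's Hamiltonian): penalize vertices whose degree deviates
# from a preferred valence v0, so the fully connected high-energy graph
# scores high and a regular low-valence lattice scores low.

def degree(adj, i):
    """Degree of vertex i in an adjacency-matrix graph."""
    return sum(adj[i])

def energy(adj, v0=3):
    """E(G) = sum over vertices of (degree - v0)^2."""
    return sum((degree(adj, i) - v0) ** 2 for i in range(len(adj)))

def complete_graph(n):
    """The 'hot' initial state: every node linked to every other."""
    return [[0 if i == j else 1 for j in range(n)] for i in range(n)]

def ring_graph(n):
    """A 'cold' regular lattice: each node linked to two neighbours."""
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        j = (i + 1) % n
        adj[i][j] = adj[j][i] = 1
    return adj

print(energy(complete_graph(10), v0=3))  # 360: every vertex has degree 9
print(energy(ring_graph(10), v0=2))      # 0: every vertex has degree 2
```

The "moves" that turn links on and off would then be accepted or rejected according to how they change this energy, driving the graph toward the ordered phase as it cools.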

This "Quantum Graphity" paper was published in Physical Review D a few months after it appeared on arxiv, in 2008.

There is a follow-up that I've also been taking a look at, about conserved quantities in this kind of "graphity" set-up.

http://arxiv.org/abs/0805.3175
Conserved Topological Defects in Non-Embedded Graphs in Quantum Gravity
Fotini Markopoulou, Isabeau Prémont-Schwarz
42 pages, 34 figures
(Submitted on 20 May 2008)
"We follow up on previous work which found that commonly used graph evolution moves lead to conserved quantities that can be expressed in terms of the braiding of the graph in its embedding space. We study non-embedded graphs under three distinct sets of dynamical rules and find non-trivial conserved quantities that can be expressed in terms of topological defects in the dual geometry. For graphs dual to 2-dimensional simplicial complexes we identify all the conserved quantities of the evolution. We also indicate expected results for graphs dual to 3-dimensional simplicial complexes."

It's nice they moved over to non-embedded graphs, because embedding brings with it a lot of extra machinery.

Thanks for bringing this up. I stumbled on the quantum graphity model before but never looked into the details. If I remember right it's not quite what I was looking for, but I will look again.

Some general traits to my liking are

- The idea of considering a permutation symmetry of all points. This is very natural if one starts from a simple distinguishability concept, like I do as well. This may seem simple as in simplistic, but I think it's powerful if used the right way.

- Then, as observer complexity increases (~lowering the energy), maintaining this symmetry is unstable, and new structures emerge. I think they use condensed-matter analogies to make conjectures about how the low-energy self-organisation is to take place. It's certainly not the intuition I use (I rather think in terms of intrinsic inferences, as in seeking equilibrium of games), which is probably why I found it non-intuitive and speculative the last time. But I didn't see this connection to my own ideas the last time I went over this.

For those who may want to step back and review what "quantum graphity" is about: it seems one of the first papers was from 2006 and, interestingly, even has Smolin as a co-author.

Quantum Graphity by Tomasz Konopka, Fotini Markopoulou and Lee Smolin

"We introduce a new model of background independent physics in which the degrees
of freedom live on a complete graph and the physics is invariant under the
permutations of all the points. We argue that the model has a low energy phase
in which physics on a low dimensional lattice emerges and the permutation symmetry
is broken to the translation group of that lattice. In the high temperature, or
disordered, phase the permutation symmetry is respected and the average distance
between degrees of freedom is small. This may serve as a tractable model for the
emergence of classical geometry in background independent models of spacetime.
We use this model to argue for a cosmological scenario in which the universe underwent
a transition from the high to the low temperature phase, thus avoiding the
horizon problem."

Starting to read the original papers, looking for design principles and guiding principles.

One of their ideas - to my liking - seems to be that permutation symmetry is broken as temperature drops, leading to the emergence of more structure, so that at sufficiently high temperature the points of spacetime are not distinguishable.

Though this strikes me as an induction step, in the sense that one may similarly ask about the origin of the points subject to permutation symmetry. So maybe in the hierarchy of phase transitions there really is no beginning and no end on some absolute scale. This is why I hope that they are envisioning a "general phase transition". This is exactly what I have tried to do as well - consider phase transitions where one microstructure spontaneously splits into smaller ones. The conserved quantity is overall complexity, as in the number of points. But I think one also needs to explain the origin of the complexity in the first place, not just how a given complexity may break an initial symmetry. I've always felt these were related, since "symmetries", as they emerge, can IMO be reinterpreted as "new degrees of freedom" (think: either there is a symmetry, or not (0,1)), because the total is more than the sum of the parts. I think of the breaking of the symmetries as "adding information", since we add information about the transformation between the phases. I see this as a "compression" algorithm, and the reason the transition is favourable is that the new code is more efficient.

I'll keep reading their stuff to see if I can connect it to anything that may help.

I just realised, when reading up on Penrose before, that I might be able to "reinterpret" what I call microstructures into graph-like things. Not because I think it helps per se, but maybe it will help me _relate_ to the bunch of graph/network-inspired ideas out there.

Clearly each microstate can be seen as a "point", and each microstate is connected by means of a quantized transition probability I can calculate. "Transition probability" is closely related to action, and might be an alternative way instead of spin networks. It's more like action networks. This is what I thought LQG was when I started Rovelli's book. Now I see hope that maybe it still could be, even though it wasn't Rovelli's original thought. Or maybe some of the other spin/graph-related ideas are an even closer fit.

In a certain sense an "action network" should relate to a "spin network" as the permutation symmetry relates to the broken one. I.e., take a "spin network" and remove the structure so as to get permutation symmetry - wouldn't we get a more naked action network? Something which would be much easier to justify.

The Transactional Interpretation is based on a similar assumption. Cramer built on the Wheeler-Feynman absorber theory http://en.wikipedia.org/wiki/Wheeler–Feynman_absorber_theory
The path integral also relates quantum and stochastic processes, and this provided the basis for the grand synthesis of the 1970s which unified quantum field theory with the statistical field theory of a fluctuating field near a second-order phase transition. The Schrödinger equation is a diffusion equation with an imaginary diffusion constant, and the path integral is an analytic continuation of a method for summing up all possible random walks. http://en.wikipedia.org/wiki/Path_integral_formulation
A Feynman diagram is then a contribution of a particular class of particle paths, which join and split as described by the diagram. A Feynman diagram consists of points, called vertices, and lines attached to the vertices.
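The claim that the path integral is an analytic continuation of a sum over random walks can be illustrated numerically: summing many discrete walks gives the diffusive spread <x^2> = N*dx^2 of the heat kernel, which becomes the free Schrödinger kernel under an imaginary diffusion constant. A small sketch of my own, just illustrating the random-walk half of that statement:

```python
import random

# Sum over random walks: after N steps of size dx, the walker's endpoint
# distribution approaches the heat kernel ~ exp(-x^2 / 4Dt), the solution
# of the diffusion equation. The Schrodinger kernel is the same expression
# continued to an imaginary diffusion constant.

random.seed(0)
N_STEPS, N_WALKS, DX = 100, 20000, 1.0

endpoints = []
for _ in range(N_WALKS):
    x = sum(random.choice((-DX, DX)) for _ in range(N_STEPS))
    endpoints.append(x)

# The sample variance should match the diffusive spread <x^2> = N * dx^2.
var = sum(x * x for x in endpoints) / N_WALKS
print(var)  # close to 100 = N_STEPS * DX**2
```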

I assume each such vertex represents one quantum event. It is interesting that the surface of the Black Hole's event horizon contains all quantum events of the Black Hole's interior, if one unit of information is just one oscillation of the particle and causes a dilation of one Planck unit (Planck time, Planck length).
For the static Black Hole surface A = 2π R², there are A/4 units of information in Planck lengths squared:

(M/m) * (R/2L) = A / (4 Lp²)

where:
M/m = number of particles (spin-1/2 fermions), with M = R c² / 2G,
L = Compton wavelength = h/mc,
Lp = Planck length, with Lp² = 2π hG/c³.

It shows a direct connection of the information content with a surface, without the Boltzmann constant, just as in Shannon's entropy.
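For what it's worth, the identity as stated can be checked numerically: with L = h/mc, M = Rc²/2G and A = 2πR², it holds provided one takes Lp² = 2πhG/c³ (a convention that differs from the usual Planck length by a factor of 2π). A sketch with SI values; the particle mass m cancels, so its value is arbitrary:

```python
import math

# Numeric check of the identity (M/m)*(R/2L) = A/(4*Lp^2) under the
# poster's conventions: A = 2*pi*R^2, L = h/(m*c), M = R*c^2/(2*G),
# and Lp^2 = 2*pi*h*G/c^3 (NOTE: this differs by 2*pi from the usual
# Planck length, but it is what makes the chain internally consistent).

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8      # speed of light, m/s
h = 6.626e-34    # Planck constant, J s
m = 9.109e-31    # electron mass, kg (arbitrary; it cancels)

R = 3.0e3        # horizon radius of roughly a solar-mass black hole, m
M = R * c**2 / (2 * G)          # horizon mass
L = h / (m * c)                 # Compton wavelength of the particle
A = 2 * math.pi * R**2          # surface, in the poster's convention
Lp2 = 2 * math.pi * h * G / c**3

lhs = (M / m) * (R / (2 * L))
rhs = A / (4 * Lp2)
print(lhs / rhs)  # 1 up to floating-point rounding
```

Both sides reduce algebraically to R²c³/(4Gh), which is why the ratio is exactly 1.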

I have a question:
Does it suggest the Boltzmann constant is an emergent number from the quantum information and we may avoid it in quantum information ?

I can't comment on Quantum Graphity yet, but in the way I envision this, I would like to say that the equivalent of "Boltzmann's constant" is emergent. From our past discussion I have a feeling you have a somewhat more realist view of "quantum information" than I do. But setting those things aside, Boltzmann's constant can be seen as emergent.

I think in each transition, there is an effective "boltzmann constant".

With "Boltzmann's constant" in quotes I simply refer generally to the SCALE factor in front of the entropy measure, defining its relation to probability, so as not to confuse it with the specific Boltzmann constant.
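The point that only this scale factor distinguishes the different "entropies" can be made concrete: the same probability distribution has entropy in nats (k = 1), bits (k = 1/ln 2), or J/K (k = k_B), and the three differ only by that prefactor. A minimal sketch:

```python
import math

def entropy(probs, k=1.0):
    """Generic entropy S = -k * sum(p * ln p); k sets the units/scale."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]
k_B = 1.380649e-23                    # Boltzmann constant, J/K

S_nats = entropy(p)                   # natural units (k = 1)
S_bits = entropy(p, k=1/math.log(2))  # Shannon entropy in bits
S_thermo = entropy(p, k=k_B)          # thermodynamic units, J/K

print(S_bits)             # 1.5 bits
print(S_thermo / S_nats)  # equals k_B: only the scale factor differs
```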

A Feynman diagram is then a contribution of a particular class of particle paths, which join and split as described by the diagram. A Feynman diagram consists of points, called vertices, and lines attached to the vertices.

I share the association, but one important difference is that Feynman diagrams and the path integral USE QM.

From my perspective, I am trying to reconstruct the rules of QM from a deeper idea, not assume them.

I think QM formalism is also emergent.

Some approaches take QM as fundamental, some take GR as fundamental. I consider neither as fundamental. Instead I think that there are traits from both GR and QM that belong in the fundamental view: the evolving relation between context and the described, as described by GR, while QM acknowledges the operational perspective. I think both are non-negotiable.

I think that the non-commutativity of QM is emergent from evolving state networks, since non-commutative structures, while weird, are more efficient at coding changing information streams under complexity constraints.

I find it too lame to start off with all the baggage of QM and apply it to these networks. It would miss my goal, at least.

The quantum network in Loop Quantum Gravity is very complicated. I do not know what creates the Quantum Graphity graph or where the information for its structure comes from.
In my model the base for the quantum network is the non-local information of the Compton wavelength. It is simple and creates interesting relations:
1. Gravitational interaction / EM interaction
2. The holographic structure of the information.
3. Explains Dark Matter and Dark Energy effects.

Like with LQG and Penrose: when you start reading about the motivation I feel excitement, but when I see the model "invented" I feel some disappointment, because the first steps taken often come out as too large in the wrong direction. I guess for me it's because I am looking for something particular. I guess I hope to find some approach that already has some answers to the questions I prefer to ask.

Sorry for my minor diversion. I'm interested to see what you other guys have to say about Quantum Graphity.

Quantum Graphity shows a possibility to create space and matter as a derivation of quantum information. It is a general mathematical idea.
I think it is a natural direction for physics, where space, matter, time and gravity are emergent from information.
The change of the information content causes a change in the topology, and we may speak of the entropy of the topology.
I assume the basic information is the Compton wavelength and the Planck length, which show the relations as a space. Phase transitions in the structures then create vacuum or matter.

Quantum Graphity shows a possibility to create space and matter as a derivation of quantum information. It is a general mathematical idea.
I think it is a natural direction for physics, where space, matter, time and gravity are emergent from information.

I certainly appreciate information-based approaches. However, it is unsatisfactory to treat information in the realist sense. One must also explain the origin of the degrees of freedom needed to define information. Most approaches I know of do not.

I see this as having several components.

Given any information structure, we can define measures and explain expectations and expected actions, which are then conditional on the starting premise.

So far this is nothing strange. But to complete the argument: what information structure do we start with (note that there is probably an uncountable number of possible starting premises), and how do we motivate it?

In my analysis, when you start with a premise, this premise is like an initial assumption - uncertain bits that are simply our best bets - but this initial assumption is not distinguishable from the initial assumptions of physical laws. So all these things are one, and furthermore, in a real interaction, real unpredictable feedback causes these things to change.

So one key is: given an initial information structure, argue how and why it must evolve, and trace this back to a minimal starting point of basically some bits acting as a "seed". Then argue that it's likely that this seed will acquire more complexity.

So, IMO, the information-theoretic approaches also face the problem of the origin of information. To just assume that the universe is a big set of information that is conserved is an unacceptable and unsatisfactory picture, IMO.

The initial assumption is the most important, I think.
They observed Euler's beta function 40 years ago and created String Theory. Their assumption was that everything is built of strings at the Planck scale.

I did play with Newton's and quantum equations, and it showed me a background space of Compton wavelengths with Planck length contraction. Here the assumption is that these laws are correct.

It is interesting that we measure the speed of light in the vacuum. Vacuum is not an empty space; there are virtual particle-antiparticle pairs at each point. Information is for us a difference in energy, and it is always relative. We can't observe a wave function alone. Does it exist alone if we do not observe it?

We observe the Universe not as it is but as we can measure it. It depends on our senses and apparatus.
It seems to me that information is the most fundamental phenomenon, but what is true?

I've read many of her papers. Her work goes so far as to suggest that you can remove the geometry of space from the suspected final theory of physics, arguing that geometry will not be fundamental in that theory.