¡Bienvenidos! Welcome! A blog about Tau-Chain, TML and Agoras.

Tau-Chain.info is an unofficial blog that gathers information from various Internet sites about IDNI projects: Tau, Tau-Chain, the Tau Meta Language and Agoras. This blog neither belongs to nor is part of the official Tau project. The purpose of the blog is to make the project known to people in an independent way.

Tau is a revolutionary blockchain platform designed to scale up social consensus and accelerate knowledge creation in a decentralized network. Agoras will be an advanced marketplace for knowledge and computational resources built on this framework.

These concepts are not often discussed, so let's have the discussion from the beginning. The first concept to think about is pancomputationalism: put another way, the ubiquitous computers which exist everywhere in our environment. We can, for example, look at physical systems, living and non-living, and see computations taking place all around us. If you look at rocks and trees you can see memory storage. If you look at DNA you can see code, and if you look at viruses you can see microscopic programmers adding new code to DNA. Even the weather, such as a hurricane, is computing.

If you look at nature you see algorithms. You will also see learners (yes, the same as in AI) in nature. The process is basically the same for all learning. Consider that everything which is physical is also digital. Consider that the universe is merely information patterns.

If we look at society we can also think of society as a computer. What does society compute, though? One way people talk about a society is as a complex adaptive system, but this is also how people might talk about the human body. The human body computes with the purpose of maintaining homeostasis: to persist through time and reproduce copies of itself. The human brain computes to promote the survival of the human body. Just as viruses pass on code to our DNA, the human brain is infected with mind viruses called memes. Memes are pieces of information which can physically alter how the brain works.

The mind isn't limited to the brain. The mind is all the resources the brain can leverage to compute. In other words, a person has a brain to compute with, but when language was invented it allowed a person to compute not just with their own brain but with the environment itself. To draw on a cave wall is to use the cave to enhance the memory of the brain. To use mathematics is to use language to enhance the brain's ability to compute by relying on external storage and symbol manipulation. To use a computer with a programming language is essentially to use mathematics, only instead of writing on the cave wall we are writing in 1s and 0s. The mind exists to augment the brain in a constant feedback loop where the brain relies on the mind to improve itself and adapt. If there were no external reality, the brain would have no way to evolve and improve itself.

A society, in the strictly human sense of the word, is the aggregation of minds: at minimum, all the human minds in that society. As technology improves, the mind's capacity increases, because each human can remember more, can access more computational resources, and can in essence use technology to continuously improve their mind and then leverage the improved mind to improve their brain. The Internet is the pinnacle of this kind of progress, but it's obviously not good enough. While the Internet allows for the creation of a global mind by connecting people, things, and minds, it does nothing to actually improve the feedback loop between the mind and the brain, nor does it really offer what could be offered.

Bitcoin came into the picture, and perhaps we can think of it as a better memory: a decentralized memory in which, essentially, you can have money. The problem is that money is a very narrow application. It is a start, just as learning to write on the cave wall was a start, but it's not ambitious enough in my opinion.

Humans in the current blockchain or crypto community do not have many ways to exchange human computation. Human computation is just as valuable as non-biological machine computation, because there are some kinds of computations which humans can do quite easily but which non-biological machines still cannot do as well. Translation, for example, is something non-biological machines have a difficult time with but human beings can do well. This means a market will be able to form where humans can sell their computation to translate things. If we look at Amazon Mechanical Turk we can see many tasks which humans can do and which computer AI cannot yet do, such as labeling and classifying data. For things to go to the next level we will need markets which allow humans to contribute human computation and/or human knowledge in exchange for crypto tokens.
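The mechanics of such a market can be sketched very simply. The following toy model (all names and structures are my own invention, not any real protocol) just shows the shape of it: requesters post tasks machines can't yet do well, and humans claim them in exchange for tokens.

```python
# Toy sketch of a human-computation market (all names invented for illustration).
from dataclasses import dataclass, field

@dataclass
class Task:
    description: str
    reward: int          # reward denominated in some crypto token
    result: str = ""

@dataclass
class Market:
    tasks: list = field(default_factory=list)
    balances: dict = field(default_factory=dict)

    def post(self, description, reward):
        """A requester posts a task the machines cannot do well."""
        self.tasks.append(Task(description, reward))

    def claim_and_solve(self, worker, solver):
        """A human worker claims the next task, solves it, and is paid."""
        task = self.tasks.pop(0)
        task.result = solver(task.description)   # the human computation happens here
        self.balances[worker] = self.balances.get(worker, 0) + task.reward
        return task

market = Market()
market.post("Translate 'hola' to English", reward=5)
done = market.claim_and_solve("alice", lambda text: "hello")
print(done.result, market.balances)  # -> hello {'alice': 5}
```

A real decentralized version would of course need verification of results and settlement on a chain, which is exactly where the hard design problems live.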

The concept of a decentralized operating system is interesting. First, if there is such a thing as social computation (such as collaborative filtering, subjective ranking, Waze, etc.), then what about the new paradigm of socially dispersed computing?

First, think of computation as a commodity. It is something which can be bought and sold (with Bitcoin, if you like).

Second, think of computation as ubiquitous, while also thinking of the trend toward hyper-connectivity.

Third, think of how apps can be distributed across multiple platforms and physical computing systems.

Consider the impact this could have on life as we know it.

The question becomes: what do we want to do with this computing power? Will we use it to extend life? Will we use it to spread life into the cosmos? Will we use it to become wise? To become moral? To become rational? If we want to focus on these kinds of concerns then we definitely need something more than Bitcoin, Ethereum, or even EOS. While EOS does seem to be pursuing the strategy of a decentralized operating system, which seems to be the correct course, it does not get everything right.

One problem is, as I mentioned before, the importance of the feedback loops between minds and brains. The reason I always return to the concept of the external or extended mind is the fact that it is the mind which creates the immune system to protect the brain from harmful memes. The brain keeps the body alive. The brain is not really capable of rationality, or morality, or logic, and relies on the mind to achieve these. The mind is essentially all the computational resources that the brain can leverage.

EOS has a problem in the sense that it doesn't seem to improve the user. The user can connect, can join, can earn or sell, can participate, but unless the user can become wiser, more rational, more moral, then EOS has limits. EOS does have Everipedia, which is quite interesting, but again there are still problems. What can EOS do to improve people in society, and thus improve society, if society is a computer in need of being upgraded?

Well, if society is a computer, then first, what does society compute? What should it compute? I don't even know how to answer those questions. I could suggest that if computation is a commodity, along with data, then whichever decentralized operating systems compete and exist will compete for these commodities. The total brain power of a society is just as important as the amount of connectivity. And the mind of the society is the most important part of a society, because it is what can allow the society to become better over time, allow the people in the society to thrive, and allow the life forms to continue to evolve and avoid extinction.

A decentralized operating system, on a technical level, would have a kernel or something similar to it. This is the resource-management part. For example, Aragon promises to offer a decentralized OS and it too mentions having a kernel. A true decentralized operating system has to go further and requires autonomous agents. Autonomous agents which can act on behalf of their owners are, philosophically speaking, the extended mind. But the resources of a society are still finite and have to be managed, so a kernel would provide the ability to manage those resources.

The total computational ability of a society is likely a massive amount of resources; a lot more than just connecting a bunch of CPUs together. Every member of the society who can compute could participate in a computation market. Of course, as we are beginning to see now, the regulators seem concerned about certain kinds of social computations, such as prediction markets. So it is unknown how truly decentralized operating systems would be handled, but my guess is that if designed right they could be pro-social, be capable of producing augmented morality by leveraging mass computation, and, by leveraging human computation, be able to be compliant. To be compliant is simply to understand the local laws, and these can be programmed into the autonomous agents if people think it is necessary.

What is more important is that if a law is clearly bad, and people have enhanced minds, then it will be very clear why the law is bad. This clarity will help people to dispute and seek to change bad laws through the appropriate channels. If there is more wisdom, due to insights from big data, from data scientists, etc, then there can be proposals for law changes which are much wiser and more intelligent. This is something specifically that people in the Tauchain community have realized (that technology can be used to improve policy making).

A lot is still unknown so these writings do not provide clear answers. Consider this just a stream of consciousness about concepts I am deeply contemplating. This is also a way to interpret different technologies.

Size matters. Some people object that it does not matter, only meaning does. But meaning always matters, so it comes to the same thing.

The bigger the problems one solves, the bigger the gains. Big problems require big solutions. We live in a big universe, and our very survival consists in dealing with bigger and bigger problems, which require bigger and bigger solutions to cope.

Nevertheless, building big is hard, so we naturally prefer to create small things which can grow; small in the sense of being both understandable and affordable to build. So the best fit is small solutions, cheap and easy to make, which scale out, unfold, or unleash into big means to address big problems. Scaling is everything.

Scaling. [1] Scalable! Scalability!!

The root word 'scale' possesses marvelous riches of meaning in the English language [2], with lots of poetics inside:

The scalability issue can be grokked [6] with the following anecdote:

A bunch of workers on a construction site and a huge log. The on-site manager commands a few of them to lift and move it. They try and object: ''Too heavy!'' The manager adds more and more workers, until they shout back again: ''Too short!''

A few real examples, the first two bad and the last three excellent:

[a] I won't name this 'crypto'; I will just say it is named after a mythical element of the universe, according to prescientific gnostic [7] imaginations. Its core 'value proposition' is to shovel meaningful computation into a thread of computation whose very value proposition is to be as random, meaningless and unidirectional (hard to do, easy to prove) as possible - the blockchain. The theoretically most expensive form of computation. Visualize: cars and airplanes made of gold and diamonds, burning the most expensive perfumes. Or mass production of electricity by raising trillions of cats and hiring trillions of people to pet them, with a grid of pure gold wires to discharge and collect the electrostatics. Had they chosen the original Satoshi blockchain [8] for their 'experiments' - where the futility of such an attempt would have become instantly clear and died out outright due to the impending unbearable cost - it would of course have been the fairer way to do it, and would have spared Mankind dozens of billions of dollars; but logically they preferred a 'controlled' blockchain of their own. In the sense that the guys with a vested interest in it have the power to hand-drive, stop, restart and vivisect it. The only use of this 'blockchain supercomputer' is ... tokenomics by Layering. Why was it at all necessary for a blockchain advertised as being so good it could do all general computation to be made so hairy and bushy with layered tokens?

[b] Another trio of chaps, whom I won't name either, were really in awe of Satoshi's creation; so much so that they not only liked it but wanted it, and decided to have it. For themselves. All of it. And they rebelled and forked out and provided a 'scaling' errrmm ... uhhh... solution. By increasing the blocksize. Something which Satoshi meditated on, extensively discussed with his disciples, and, not accidentally, decided to put the brakes on. [9] Very recently the crypto news headlines said that the blocksize-increase solution providers are eyeing ... Layering. Which they were furiously advocating the blocksize increase makes unnecessary. Because it is the solution, isn't it? Or maybe it just was. And is not anymore? Well, I'd say that all the so-called 'alts' [10] - which provide a rejuvenated clone of Bitcoin, tweaked here and there to provide a momentary ease of difficulty and transaction fees - suffer from one and the same problem: traveling back in time does not tell you the future.

[c] Let's jump half a century back in time. It is the 1960s. The very making of the Internet. Computers are already here and have scaled up in numbers enough for their networking to become a problem/juice worth the solution/squeeze. The birth of TCP/IP [11], and the report of its very makers. Of the solution for network scaling. Enjoy the ancient wisdom:

Initially, the TCP managed both datagram transmissions and routing, but as the protocol grew, other researchers recommended a division of functionality into protocol layers. Advocates included Jonathan Postel of the University of Southern California's Information Sciences Institute, who edited the Request for Comments (RFCs), the technical and strategic document series that has both documented and catalyzed Internet development. Postel stated, "We are screwing up in our design of Internet protocols by violating the principle of layering." Encapsulation of different mechanisms was intended to create an environment where the upper layers could access only what was needed from the lower layers. A monolithic design would be inflexible and lead to scalability issues. The Transmission Control Program was split into two distinct protocols, the Transmission Control Protocol and the Internet Protocol.

The layering made the Internet as we know it. By the simple trick that just one node needs to permit another. Unstoppable inclusivity!

[d] The Mastercoin / Omni Layer [12]:

«A common analogy that is used to describe the relation of the Omni Layer to bitcoin is that of HTTP to TCP/IP: HTTP, like the Omni Layer, is the application layer to the more fundamental transport and internet layer of TCP/IP, like bitcoin».

[e] The Lightning Network (LN) [13]:

The Lightning Network is a "second layer" payment protocol that operates on top of a blockchain (most commonly Bitcoin).

Satoshi spoke of payment channels in his masterpiece, foreseeing the way to scale.

An estimate of the power of LN layering [14]:

''The bitcoin devs accept that eventually larger block sizes will be needed. The current transaction rate isn't going to cut it if people all over the world actually start using bitcoin daily. They estimate that eventually, if everyone in the world uses bitcoin and makes 2 transactions a day, but uses the lightning network, a 133mb blocksize will be needed. Without the lightning network, something like a 200gb (GIGABYTE) size PER BLOCK would be needed to accommodate that much usage.''

Layering scales it up by orders of magnitude in efficiency.
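The shape of that arithmetic is easy to reproduce. The parameters below (population, transaction size, channel-transaction frequency) are my own assumptions for illustration, not the devs' actual inputs, so the absolute numbers differ from the quote; the point is the ratio between the two regimes.

```python
# Back-of-the-envelope check of the block-size arithmetic above.
# All parameters are assumptions for illustration only.

USERS = 8e9              # assumed world population
TX_PER_DAY = 2           # on-chain transactions per user per day (no layering)
TX_BYTES = 250           # assumed average transaction size in bytes
BLOCKS_PER_DAY = 24 * 6  # one block every ~10 minutes

# Without layering: every single payment is an on-chain transaction.
no_ln_block = USERS * TX_PER_DAY * TX_BYTES / BLOCKS_PER_DAY
print(f"without LN: ~{no_ln_block / 1e9:.1f} GB per block")

# With layering: assume each user needs only ~2 on-chain channel
# open/close transactions per YEAR; payments happen off-chain.
CHANNEL_TX_PER_YEAR = 2
ln_block = USERS * CHANNEL_TX_PER_YEAR / 365 * TX_BYTES / BLOCKS_PER_DAY
print(f"with LN:    ~{ln_block / 1e6:.0f} MB per block")
```

Under these assumptions the gap between the two regimes is a factor of a few hundred, which is the same qualitative story the quoted estimate tells: layering turns gigabytes per block into megabytes per block.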

If Bitcoin is the 'first layer' and Omni and Lightning are the 'second layer', I see which one is the 'Zeroth Layer', and I also foresee [15] the inevitability of the merger, or 'Amalgamation', of all second layers over all blockchains, so that the user will be able to transact everything into anything with anybody, without having to know or care which chain is in use ... I have special nicknames for these and will come back to these topics in a series of future posts.

Enough of examples I reckon.

Postel's sacred Principle of Layering comes from the implementation-levels paradigm,

or abstraction layering [16]:

''separation of concerns to facilitate interoperability and platform independence''

In other words: delegate each task to the layer of the system which does that particular job best. We can generalize this into The Scaling Commandment. Only one is enough:
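The principle is easiest to see in code. Here is a toy sketch (the class and method names are invented for illustration) of two layers where each does one job and exposes only a narrow interface upward, in the spirit of the TCP/IP split described earlier:

```python
# Toy illustration of abstraction layering (all names invented).
import json

class TransportLayer:
    """Lowest layer: moves raw bytes; knows nothing about their meaning."""
    def __init__(self):
        self.wire = []

    def send(self, payload: bytes):
        self.wire.append(payload)

    def receive(self) -> bytes:
        return self.wire.pop(0)

class ApplicationLayer:
    """Upper layer: deals in structured messages; knows nothing about wires."""
    def __init__(self, transport: TransportLayer):
        # Uses only the narrow send/receive interface, nothing else.
        self.transport = transport

    def send_message(self, message: dict):
        self.transport.send(json.dumps(message).encode())

    def receive_message(self) -> dict:
        return json.loads(self.transport.receive().decode())

# Swapping TransportLayer for any other object with send/receive would
# leave ApplicationLayer untouched -- that is the whole point of layering.
transport = TransportLayer()
app = ApplicationLayer(transport)
app.send_message({"task": "translate", "text": "hola"})
print(app.receive_message())  # -> {'task': 'translate', 'text': 'hola'}
```

Because the upper layer touches only the narrow interface, either layer can evolve or scale independently; a monolithic design, as the TCP/IP history above notes, would be inflexible.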

What do I mean by the concept of a "liquid platform"? This is merely a re-articulation of the concepts of self-amendment and self-definition. In other words, it is very much like an autopoietic design. Bruce Lee once said to "be like water", and the reason is that water can adapt to any environment it is placed in by taking the form of its container.

So by the liquid paradigm I mean that the core feature of true next-generation platform design is going to be maximum adaptability.

Feedback loops and the virtuous cycle

How can we have a platform which promotes continuous self-improvement? If you have a platform with no hard-coded "self", then even the design of the platform is under constant negotiation and creation. This is key because it means Tauchain will be able to adapt quicker than all other competing platforms. Quicker than Tezos, because Tezos merely provides self-amendment but lacks the virtuous cycle, the meta language, etc.

The Tau Meta Language allows for self-definition at the level of languages. This means even the communication mechanism between humans and machines can be updated continuously. This continuous updating is the key design breakthrough of Tauchain, because it means Tauchain will always be state of the art in any area. Think of a platform like Wikipedia, where anyone can update any part of it in real time, continuously, so that every part of it is always state of the art.

Starting at languages, a feedback loop can be created between humans and intelligent machines. Humans must make decisions on how to design Tau. These design decisions benefit from the virtuous cycle: the feedback loop between humans and machines allows the decision-making ability itself to be upgraded. This could even allow humans to transcend traditional human capabilities by relying on intelligent machines to assist in design, which means better future designs, which means better decision making, which means better future designs, and so on. This is the "virtuous cycle" by way of a feedback loop from humans to machines to humans to machines, etc. The humans improve the quality of the machines by feeding in knowledge and new algorithms: feeding just enough for the machines to become intelligent enough to help the humans help the machines even more efficiently in the next iteration of Tauchain, over and over again.

Humans and machines will seek more good and less bad in the formal specification of Tau itself. Good and bad designs will be defined collaboratively by the human participants by way of intelligent discussion. As discussion scales, bigger crowds mean more human minds involved, which means improved design, which leads eventually to a better and perhaps wiser Tau, which of course would lead to wiser, even more intelligent discussions, which can lead to an improved formal specification and to a better Tau. So that is a loop. It is also a loop between improving Tau and improving society, improving Tau and improving society.

tau-chain.info is not the official site of Tau-Chain. This website has NO affiliation or relationship with the IDNI company or organization, and its contents are neither approved nor supervised by it. The website is dedicated to collaborating with the community and helping Tau and Agoras progress in a sustainable way.