Bienvenidos! Welcome!

A blog about Tau-Chain, TML and Agoras.

Tau-Chain.info is an unofficial blog that gathers information from various Internet sites about IDNI projects: Tau, Tau-Chain, Tau Meta Language and Agoras. This blog does not belong to, nor is it part of, the official Tau project. The purpose of the blog is to make the project known to people in an independent way.

Tau is a revolutionary blockchain platform designed to scale up social consensus and accelerate knowledge creation in a decentralized network. Agoras will be an advanced marketplace for knowledge and computational resources built on this framework.

Smaller vs larger denominations in crypto

Large denominations produce a different psychology (the psychology of scarcity). This has a problem though: if, for example, 1 ETH or 1 BTC is $1000, it eventually begins to look like it's just for rich people. To some in developing countries it begins to look simply too expensive. On the other hand, Ripple has smaller denominations and still has quite a high market cap regardless.

From the official documents, it is known that the Agoras tokens currently being sold on exchanges are "intermediate tokens". There are going to be roughly 42 million intermediate tokens. When people look at it this way, they might think the price of AGRS is high after a certain psychological barrier such as $100. At $100 per intermediate token the market cap would be four billion two hundred million dollars. In crypto this is not that high a market cap, and in tech it is not so high either. A tech company can easily reach a market cap of 4 billion, and if we remember, Snapchat had a market cap far beyond that.

In the case of Tauchain, whose goal is to reveal to the world truly novel technological breakthroughs that provide unique features, we cannot predict where the high end for AGRS will be. What we can know is that the price looks vastly different if we look at it via intermediate tokens vs official Agoras tokens. According to Ohad, 147,000,000,000 tokens can exist.

In this case the true number is 147 billion. These are Ripple-like numbers. So a price of between 0.01 and 0.04 USD per token is reasonable in a good market. The $1 range is if AGRS achieves Bitcoin-like price success, which in my opinion would be extremely optimistic. Ultimately no one can predict where the price could move, and currently 1 true AGRS is less than a penny.

For people who did take the risk to buy Agoras tokens at $100 to get 3.5 million: if it ever does reach the $1 level (Ripple or Bitcoin scale success) then you folks are multi-millionaires in the making.
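To make the denomination arithmetic concrete, here is a small sketch using the figures quoted in this post (roughly 42 million intermediate tokens, 147 billion true tokens). These numbers come from the post itself, not from any official source, and the sketch is only an illustration of the conversion:

```python
# Supply figures as quoted in this post (unofficial, for illustration only).
INTERMEDIATE_SUPPLY = 42_000_000      # roughly 42 million intermediate tokens
TRUE_SUPPLY = 147_000_000_000         # 147 billion true tokens, per Ohad

# Each intermediate token corresponds to this many true tokens.
ratio = TRUE_SUPPLY / INTERMEDIATE_SUPPLY

def market_cap(price_usd, supply):
    """Market cap is simply price times supply."""
    return price_usd * supply

# At $100 per intermediate token we get the $4.2 billion figure from above.
cap = market_cap(100, INTERMEDIATE_SUPPLY)

# The same valuation expressed per true token looks far smaller.
true_price = cap / TRUE_SUPPLY

print(ratio)       # 3500.0 true tokens per intermediate token
print(cap)         # 4200000000, i.e. $4.2 billion
print(true_price)  # roughly 0.0286 USD per true token
```

The same $4.2 billion valuation reads as $100 per intermediate token but under three cents per true token, which is exactly the psychological difference the post describes.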

Just several hours ago, the lead developer and founder of the Tauchain project, Ohad Asor, released his most significant code update yet. This blog post will discuss some of those updates and put them into context. In order to make sense of the current "Tauchain Codebase" I will also discuss a bit about the makeup of the code.

The significant breakthrough - Ohad implements the BDD

First, some might be wondering: what is a BDD? BDD stands for binary decision diagram, a data structure. This data structure is, in my opinion, as significant to Tauchain as the "blockchain" data structure was to Bitcoin. For those who do not have a computer science degree I will elaborate below on what exactly a data structure is before discussing what a BDD is and why it is so significant.

Brief discussion on what a data structure is

In programming, a data structure is a concept which represents a method of organizing data. For example, blockchain is all about how records are stored as blocks. There are other data structures which represent decentralized data management and storage, such as the distributed hash table.

Really good programmers choose the appropriate data structure to meet the requirements of the project. BDD was chosen specifically by Ohad because it provides efficiency boosts in a key area necessary for Tauchain to function as intended. Specifically, we know Tauchain requires partial fixed point logic in order to have decidability in PSPACE. We also know Tauchain requires decentralization and efficiency. Efficiency can be understood better in terms of the trade-off between time and space: we do not have unlimited time or space, so we must sacrifice one in order to get more of the other.

Example:

Compression saves space but increases processing time (this is also related to encryption, which costs processing time).

A lookup table saves time but costs space.

When we look at the codebase we know that Ohad can optimize the code either by sacrificing space, in which case the executable will be bigger but the code runs faster, or by sacrificing time, in which case the executable is smaller to save memory but might run slightly slower. This highlights the essential trade-off between time and space when optimizing code, but of course there is more to it, because algorithms within a codebase have to make similar trade-offs.
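As a toy illustration of this trade-off (my own example, nothing to do with the Tauchain code itself), compare recomputing a value on every call against precomputing a lookup table:

```python
import math

def sqrt_floor_recompute(n):
    # Time-heavy choice: no extra memory, but the work is redone on every call.
    return math.isqrt(n)

# Space-heavy choice: pay memory once so each query becomes a constant-time lookup.
SQRT_TABLE = [math.isqrt(n) for n in range(1_000)]

def sqrt_floor_lookup(n):
    return SQRT_TABLE[n]

# Both answer the same question; they just spend different resources doing it.
assert sqrt_floor_recompute(144) == sqrt_floor_lookup(144) == 12
```

The table answers faster but occupies memory for all thousand entries whether they are ever queried or not; the recomputing version stores nothing but pays the computation price each time.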

Now what exactly is a BDD (binary decision diagram)?

Now that we understand the basics about efficiency and what a data structure is, we can make a bit more sense of what a BDD is. In order to understand why the BDD as a data structure is so important to Tauchain we have to remember that Tauchain is about logic. We can take the most basic example of Socrates:

A predicate takes an entity or entities in the domain of discourse as input while outputs are either True or False. Consider the two sentences "Socrates is a philosopher" and "Plato is a philosopher". In propositional logic, these sentences are viewed as being unrelated and might be denoted, for example, by variables such as p and q. The predicate "is a philosopher" occurs in both sentences, which have a common structure of "a is a philosopher". The variable a is instantiated as "Socrates" in the first sentence and is instantiated as "Plato" in the second sentence. While first-order logic allows for the use of predicates, such as "is a philosopher" in this example, propositional logic does not.[5]

Based on the rules of first-order logic we can have our inputs and receive our outputs. In the most basic example above we can see a bit about how logic works. To elaborate further:

Relationships between predicates can be stated using logical connectives. Consider, for example, the first-order formula "if a is a philosopher, then a is a scholar". This formula is a conditional statement with "a is a philosopher" as its hypothesis and "a is a scholar" as its conclusion. The truth of this formula depends on which object is denoted by a, and on the interpretations of the predicates "is a philosopher" and "is a scholar".

A truth table has one column for each input variable (for example, P and Q), and one final column showing all of the possible results of the logical operation that the table represents (for example, P XOR Q). Each row of the truth table contains one possible configuration of the input variables (for instance, P=true Q=false), and the result of the operation for those values. See the examples below for further clarification. Ludwig Wittgenstein is often credited with inventing the truth table in his Tractatus Logico-Philosophicus,[1] though it appeared at least a year earlier in a paper on propositional logic by Emil Leon Post.[2]

When we are dealing with logic we may find that a truth table helps with visualization.
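As a quick sketch (my own, not from the Tauchain code), a truth table like the one described can be generated mechanically by enumerating every input combination:

```python
from itertools import product

def truth_table(fn, names):
    """Return one row per input combination: the input values plus the result."""
    return [(*values, fn(*values))
            for values in product([False, True], repeat=len(names))]

# P XOR Q, the connective used in the quoted example: true when P and Q differ.
for p, q, result in truth_table(lambda p, q: p != q, ["P", "Q"]):
    print(f"P={p!s:5} Q={q!s:5} | P XOR Q = {result}")
```

Each of the four printed rows is one configuration of the inputs together with the operation's result, exactly as the quoted passage describes.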

Now with this knowledge we have the most basic Socrates example:

All men are mortal.

Socrates is a man.

Socrates must be mortal.

This can be represented via a truth table and is called a syllogism. To solve it we simply apply a kind of reasoning called deductive reasoning: if "all men are mortal" is true and "Socrates is a man" is also true, then "Socrates is mortal" must be true. If we were to say all men are mortal but Socrates is immortal, then Socrates cannot be a man. So if Socrates is a man he must be mortal, or there is what we call a contradiction. Logic is all about avoiding these sorts of contradictions, and binary or boolean logic in particular is about reaching a conclusion which must always be one of two possible values.
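The deductive step can itself be checked mechanically. In this illustrative sketch (mine, not Ohad's code), "all men are mortal" is modeled as the implication man → mortal, and we search for an assignment that satisfies the premises while falsifying the conclusion:

```python
from itertools import product

def implies(a, b):
    # Material implication: false only when a is true and b is false.
    return (not a) or b

# Look for a counterexample: both premises true, but the conclusion false.
counterexamples = [
    (man, mortal)
    for man, mortal in product([False, True], repeat=2)
    if implies(man, mortal) and man and not mortal
]

# An empty list means the syllogism is valid: the premises force the conclusion.
print(counterexamples)  # []
```

No row of the truth table satisfies the premises without also satisfying the conclusion, which is exactly what makes the syllogism contradiction-free.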

If I ask you to play a game which we can guarantee will end with one of exactly two possible outcomes, then we have a good example of a boolean function: 1 or 0, true or false, on or off, a or b.

Some of you may be familiar with the data structure we call a DAG (directed acyclic graph). For those of you who understand this concept, you can visualize a BDD as being very similar to a propositional DAG.

By David Eppstein [CC0], from Wikimedia Commons

We know from DAGs that there is a finite set of vertices, edges, etc. We may also be able to visualize topological ordering, and if you remember my post on transitive closure you might also remember the visuals on how that can work:
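To make the DAG analogy concrete, here is a minimal, illustrative BDD sketch of my own (not the structure in Ohad's codebase): each internal node tests one variable and points to a low and a high child, and evaluating a formula is just a walk from the root down to a terminal:

```python
from dataclasses import dataclass

TRUE, FALSE = "1", "0"  # terminal nodes of the diagram

@dataclass(frozen=True)
class Node:
    var: str      # variable tested at this node
    low: object   # child followed when var is False
    high: object  # child followed when var is True

def evaluate(node, assignment):
    """Follow the diagram under a variable assignment down to a terminal."""
    while isinstance(node, Node):
        node = node.high if assignment[node.var] else node.low
    return node == TRUE

# A BDD for the formula (P and Q): test P first, then Q.
bdd = Node("P", low=FALSE, high=Node("Q", low=FALSE, high=TRUE))

print(evaluate(bdd, {"P": True, "Q": True}))   # True
print(evaluate(bdd, {"P": True, "Q": False}))  # False
```

Because identical subgraphs can be shared between branches, the structure is a DAG rather than a tree, and that sharing is where a reduced BDD's compactness comes from.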

This highlights the fact that a BDD can be used to create a SAT solver. For example:

A DPLL SAT solver employs a systematic backtracking search procedure to explore the (exponentially sized) space of variable assignments looking for satisfying assignments. The basic search procedure was proposed in two seminal papers in the early 1960s (see references below) and is now commonly referred to as the Davis–Putnam–Logemann–Loveland algorithm ("DPLL" or "DLL").[18][19] Theoretically, exponential lower bounds have been proved for the DPLL family of algorithms.
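To ground the quoted description, here is a toy DPLL-style solver of my own over CNF clauses (lists of signed integer literals); real solvers add unit propagation, clause learning and branching heuristics on top of this backtracking skeleton:

```python
def dpll(clauses, assignment=None):
    """Return a satisfying assignment {var: bool}, or None if unsatisfiable."""
    assignment = dict(assignment or {})
    # Simplify the clause set under the current partial assignment.
    simplified = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue  # clause already satisfied; drop it
        rest = [l for l in clause if abs(l) not in assignment]
        if not rest:
            return None  # conflict: every literal in the clause is falsified
        simplified.append(rest)
    if not simplified:
        return assignment  # all clauses satisfied
    var = abs(simplified[0][0])  # branch on the first unassigned variable
    for value in (True, False):  # systematic backtracking search
        result = dpll(simplified, {**assignment, var: value})
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x2): satisfiable, with x2 = True in any model.
model = dpll([[1, 2], [-1, 2]])
print(model)
```

The exponential worst case mentioned in the quote shows up here directly: in the worst case the two-way branching explores the full space of assignments.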

Without getting overwhelmed by technical details the key points are below:

BDD is a data structure of immense importance to the Tauchain project.

BDD enables Tauchain to "come alive" by allowing even the basic truth table to be applied or the SAT solver to be implemented.

BDD + PFP is what we see in the Github code base. We see that Ohad has implemented BDD for PFP (binary decision diagram for partial fixed point logic).

To read the code for yourself and track the progress of Tauchain development take a look at Github:

I continue herewith with sharing my contemporary state-of-grok [1] of the up-to-now four [2] scriptures of the aka newtau [3]. Sorry for the delay, but it comes mostly from the effort to contain the outburst of words, catalyzed by the very exegetic process of such rich content, into a reader-friendly shorter form.

The subject of vivisection textographically identifies as the first three paragraphs of ''Tau and the Crisis of Truth'', Ohad Asor, Sep 11, 2016 [4].

The four core themes extracted are enumerated below, with a streak of comments of mine, kept modest so as not to sidetrack the thought or spoil the original message:

I. LAW

As a guy who has been immersed in Law for more than a quarter of a century [5], I can swear with both hands on my heart to the notion of the unknowability of Law.

Since my young years in law school [6] I have been asking myself how it is possible at all to have 'rule of law' [7] when every legal system ever known has required humans to operate it!?

It seemed that the only requisite or categorical difference between mere arbitrary 'rule of man' [8] and 'rule of law' was that in some isolated cases some ruling men happened to be internally programmed by their morals [9] to produce 'rule of law' appearance effects by 'rule of man' means.

Otherwise, 'rule of law' done via 'rule of man' poses the extremely serious threat of law being used by some to exploit and harm others.

In that line of thought, my conclusion was that the Law is ... yet to come.

What we know as Law is not good networking protocol software of mankind as such; rather, we see comparatively rare examples of individually well-programmed ... lawyers.

The Law will come on the wings of a technological breakthrough, just as flying came with the invention of airplanes, the moonwalk needed the advent of rocketry, and remembering beyond one's own lifetime needed writing. The Law is an old dream. If we judge by the depth of the abyss of folklore - one of humanity's most ancient dreams, indeed. Needless to repeat that this is what sucked me into Tau as relentlessly as black hole spaghettification [10] :)

The frustration with Law of the great Franz Kafka [11], referred to by Ohad and expressed in Kafka's book The Trial [12], becomes very understandable for Kafka's epoch, which lacked the comforting hope of a technology we already have - the computers - and of the overall progress in the fields of logic, mathematics, engineering ... forming a self-reinforcing loop centered around this sci-tech of artificial cognition.

Similarly to nuclear fusion, which is always a few decades away but whose Fusion gap closes noticeably nowadays [13], we are standing on the cliff of a Legal gap.

Mankind's heavy involvement in cognition technologies, especially in the last several decades, has outlined multiple promising directions of further development, which seem to bring us closer to the ability to compensate for the fundamental deficiencies of Law and, in fact, to finally bring it into existence.

It took an entire Ohad Asor, however, to identify the major reasons why the Law is still bottlenecked out of our reach, and to propose viable means to bridge us through that Legal gap ... The other side is already in sight.

II. WORDS

It is, in the first place, the language that is to blame!

The human natural language [14]. Our most important attribute as a species. The mankind maker. The glue of society. It just emerged; it hasn't been created. It has patterns, vaguely conventional, rather than an intentionally coined set of solid rules. There ain't firm rules to change its rules, either ... The natural human language is mostly a wilderness of untamed, pristine, naked nature, dotted here and there with very expensive and hard-to-install-and-maintain ''artefacts'' [15]. Leave it alone, outside the coercion of state mass media, mass education and national language institutes, and it falls back into a host of unintelligible dialects. Even when aided by the mnemonic amplifier which we call writing.

Ambiguity is characteristic of natural language: a feature in poetry and politics, but a deadly bug in logic and law.

We'll put aside for now the postulate of the impossibility of a single universal language, to revisit it later when its exegetic turn comes, in another chapter on another scripture. Likewise, not in this chapter will we cover the neurological human bottlenecks which Tau targets to overcome. Let's observe the sequence of the author's thoughts and not fast-forward.

Instead, I'll dare to share with you my own hypothesis about why the natural human languages are the way they are. (I'm smiling while I type this, because I can visualize Ohad's reaction upon reading such a frivolous lay narrative. I hope that, being too busy, he actually will not.) To say that human languages are just too complex does not bring us any nearer to a decent explanation. Many logic-based languages are more than a match for the natural human ones in terms of expressiveness and complexity. That shouldn't be the reason.

My suspicion is rather that the natural human languages pose such Moravec hardness [16] because they are not exactly languages. Languages are conveyors of meaning. Human languages convey not meaning, but indexes, or addresses, or tags of mind states. The meaning is the mind state. Understanding between humans is a function not only of shared learnt syntax, but also of shared lives - of an aggregation of similar mind states referred to by matching word keys.

If this is true, it is another angle for grokking the solution of human users leaning towards the machine by use of a human-intelligible Machinish, instead of Tau waiting for the language barrier to be broken and machines to start speaking and listening to Humanish.

In a nutshell, we still await the Law because Law is not doable in Humanish. Bad software. And the other side of the no-law coin is that humans are no cognitive ASICs [17]. We do cognition only in passing, and in order to do what other animals do - survive. Bad hardware.

In order for law to become law, it must become hands-free [18].

Not humans to read laws, but laws to read laws.

The technology to enable that looks to be within arm's reach.

III. TRUTH

OK, so far we have butchered the law and the language. What's left?

The nature and essence of human language brought one of the most harmful and devastating notions ever. Literally, a thought of mass destruction.

The ''crisis of truth''. The wasteland left by the toxic idea spillover of ''there is no one truth'', or even ''there ain't truth at all''. This is not only an abstract, philosophical problem: billions of people actually got killed for somebody else's truth. Not by accident are the philosophers who immersed themselves in this pool nicknamed 'Deconstructivists' [19]. Tracing back their epistemic genealogy, we see, by the way, that they are rooted in faith rather than in reasoning, but this is another story.

The general problem of truth, of which the problem of law is just a special case, opens up two important aspects:

Number one is that all knowledge is conjectural with respect to truth, and that truth is an asymptotic boundary - forever to close in on but never to reach. Like the speed of light or absolute zero. Number two is that human languages make pretty lousy vehicles to chase the truth with.

If words really are just to match people's thoughts together, then there are thoughts without words and words without thoughts. Words mismatch thoughts, so how can we expect them to bridge thoughts to things? Entire worlds of nonsensical wording emerge, dangerously disturbing the seamless unity of things and thoughts. Truth displaced.

''But can we at least have some island of truth in which social contracts can be useful and make sense?''[4]

This island of shared truth is made of consensus [20] bedrock and synchronization [21] landmass.

Truth and Law self-enforced. From within, instead of by violence from without. And in a self-referential, non-regressive way.

IV. NOMIC

''We therefore remain without any logical basis for the process of rulemaking, not only the crisis of deciding what is legal and what is illegal.'' [4]

Peter Suber, with his ''The Paradox of Self-Amendment: A Study of Law, Logic, Omnipotence, and Change'' [22], proposed a rulemaking solution which he called Nomic [23].

''Nomic is a game in which changing the rules is a move.'' [22]

The merit of Nomic is that it really eliminates the illths of the infinite regress [24] of laws-of-changing-the-laws-of-changing-the-laws, ad infinitum, by the use of transmutable self-referential rules. But Nomic suffers from a number of issues. The first one, in the spotlight of this chapter, is the fact that we still remain with the "crisis of truth" in which there is no one truth; the other ones - like scalability of sequencing and voting - we'll revisit in their order of appearance in the discussed texts.

The aka 'newtau' [3] goes past the inherent limitations of the Nomic system and resolves the 'crisis of truth' problem.

The next few chapters will dive into Decidability and how it applies to provide a solution to the problems described above.

“A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects.”

― Robert A. Heinlein [1]

No, it is not a vow that everybody must be everything. It is a reflection of the fundamental human fungibility [2]. The average human can be taught to take any human role. The exceptions of true organic geniuses (those who are hard to replace) and morons (those who are incapable of replacing) only confirm this general rule of sheer numbers [3]. This is what makes mankind so scalable [4].

''Know'' is synonymous with ''can''. Literally. Knowledge = technology. Even etymologically [5]. Knowledge is praxis [6]. Only. There ain't such a thing as impractical knowledge. If it is not a skill, it is not knowledge. I mentioned once [7] that we're all AIs. Ref.: feral children [8].

We are not what we eat [9], but we are what we've learnt. You are what you know/can. And you can what you have learnt. Learning is on the taking side. Teaching is on the giving side. Of one and the same process. We do not seem to have a word to denote the modulus [10] of learning/teaching. But it will come.

We are taught by the others, the society. We are the cherry on top of a layer cake of culture upon nature [4]. We are learning by ... living. We acquire skills in a plethora of contexts: from family, street, school, job, media ... Learning [11] is not a monopoly of man; countless systems are also learners. Maybe one of the basic definitions of life and intelligence is the ability to learn [12]. A giant topic, yeah. We won't graze here into what learning is, but into how we learn.

Due to our neurological bottlenecks we spontaneously form hierarchies [13]. This hinders our scalability [4] by forcing humanity to be more or less a fractal of 5. We are close to a number of breakthroughs which will mitigate these innate limitations of ours in a number of ways [7] [14] [15] [16]. But the general case is not the subject of this article - herein we focus on HOW we are taught: how we acquire knowledge, and how this knowledge of ours gets recognized and utilized by society. And the emergent hierarchic structuring is of course in full force upon us in teaching, as in everything else social.

So come education [17], exams [18], knowledge certification [19], certified skills application [20], knowledge creation verification [21], job fitness testing [22], CVs and employer recommendations ... etc., etc. With all the bugs, and the so few features, of this 'map is not the territory' [23] situation.

It is all centralized and hierarchic - exactly as the global fractal of double-entry accountancy ledgers which we call the fiat financial system is. In fact it is so interwoven with fiat finance that it is almost inextricable from it [24]. And just as inefficient and imprecise.

In all these years of talking and thinking about Tauchain [25] I have noticed - and this suspicion of mine incrementally turns into sheer conviction - that Tau, the upscaler of humanity, is inevitably also the ultimate teaching machine. If education is the facilitation of learning, Tau is the maximizer of learning. By its very construction, it comes out so.

People talk and listen whenever and about whatever they want. Tau has unlimited capacity to listen and attend and remember, and answer - limited only by the hardware capacity allocated. Tau extracts meaning. It purifies the stream, distills it down to the essence. It detects repetitions, contradictions and all the other conversation bugs so ubiquitous nowadays. It remembers changes of opinion of the individual user. And points them out. Sounds like the best tool to know oneself. And for the others to know you, if you let them.

Your Tau account or profile is what you know. You say what you say and also ask: statements and questions. Tau pools you together with the others who state the same and, more importantly, who ask the same type of questions. Knowing what you know, and asking about what you don't know but want to know, maps not only your knowledge state but also your knowledge dynamics. It records and drives how your knowledge changes. You even have access to what you forget, and can recollect it. True real-time knowledge state reporting. For the first time in human history.

If consciousness [26] is - aside from the clinical state of being merely awake - the post-factum integration of sensorimotor experience [27], the Accountant of the mind, the speaker of the narrative which is you, then Tau is your consciousness booster. That is - stronger than thought.

The ultimate teaching, the ultimate fair testing or exam, the ultimate real-time comprehensive diploma or certificate, the super-peer-reviewed paper(s) of you as an academic career ... the ultimate job interview AND the ultimate ... job of working as yourself, with anything useful you create instantly scarcifiable and monetizable - that is your Tau account! And all the rest of accessible society being your own workforce. And you theirs. In the billions. In a move. In real time.

Including control over the pathways of increasing your skills towards the learning directions personally most productive for you, because it aids you in analyzing the you-Tau history, in applying knowledge-maximizer techniques, and in participating profitably in the creation of newer, better ones. A maximizer of self. And a maximizer of society, making it consist of max-selfs. Ever improving. A merger of education with work occupation. Work-as-you-live.

The literal Knowledge Economy, as described by @trafalgar in his article [28] from a few months ago. Where search, creation, reflection, certification, recognition, commercialization, accumulation, modification, improvement ... everything of knowledge - is all in one.

And it is not only Humans' and Tau's lonely job. I foresee the other Machines joining the party [15]. Yes, I mean machines capable of having interests and of asking and seeking answers to palatable questions.

This - the education amplification - coming down the technology way has, of course, been anticipated by many. A few arbitrary examples:

- A distant rough-sketch hint of the inevitable tuition power of Tau is Neal Stephenson's [29] ''The Diamond Age'' [30], with the depicted ''Or, A Young Lady's Illustrated Primer'' [31], an interactive networked teaching device.

- Or, if I'm right about the inevitable conquest of the natural languages' territory [15], a UX [32] like in the film 'Her' (2013) [33].

- Thomas Frey [34] of the futurist DaVinci Institute [35], in his book ''Epiphany Z'' [36], paid special attention to this: down the way of micro- and nano-education lies an effective merger of the processes of education, diploma issuing, job application, examination and actual execution of job obligations. Tom does not know about Tau. But I'll tell him.

These examples come with a big smile of irony and self-irony, of course - just picks from here and there as proofs of the giant anticipation of what's to come. And they should be taken with a few big grains of salt, because the reality will be immensely more powerful.

Tutor [37], tuition [38] - my emphasis, via using exactly this wording, comes to denote the economic side of learning/teaching. It is about the cost of learning (the association of tuition with fees), about the placement of the acquired skills, about the business organization of those, about the protection of ownership and the security of transactions of knowledge ... Let me introduce here a neologism [39] to reflect the business side of it:

Scrooge Factor [40] - simply denoting the money-making power of a technology used by a business. The 'money suction power' of a business entity or organization of any kind, coming from the application of a technology, if you want. Technology as socialized knowledge, scaled up over multiple humans, over a society. Of course the Scrooge Factor can pump in different directions. The Scrooge Factor of traditional hierarchic education, governance and everything else ... is apparently very often negative - hierarchies decapitalize, dissipate, waste. Orders of magnitude more wasteful than any PoW [41], but on this - some other time.

So, aside from all the niceties of the abstractions of the full supply and value chains of a Knowledge Economy, let's round up some numbers:

- We know that a truly functional semantic search engine alone is worth $10t. Yeah: tens of trills, trillions. As per the assessments of Davos WEF attendees of, as far as I remember, 2015 or 2016 ...

- Also, Bill Gates stated back in 2004 [42]: ''If you invent a breakthrough in artificial intelligence, so machines can learn, that is worth 10 Microsofts.''

- Tom Frey [34] also argued [43] that by 2030 the biggest corporation in the world will be an online school. Given the present-day size and growth rate [44] of, say, Amazon [45], this 'online school' should be in the range of a good deal of trillions of market cap if it is to be bigger than the biggest corporations. But we do not need such indirect analogies upon analogies to assess the scale: the sheer size of the global education industry is the most eloquent indicator [46]. Note that Tom talks about a 'corporation', i.e. a clumsy and inefficient hierarchic human collective - not about a system which does this orders of magnitude more efficiently and powerfully by being intrinsically P2P, i.e. geodesic [13]. Even the best futurologists can be forgiven for failing to predict Tau. :)

And this mind-boggling hail of trillions does not even account for the Hanson Engine [16] factor.

Tau the Tutor ex Machina is just another unintended useful consequence outta the overall design.

It is nearly impossible to track and contemplate exactly what all these 'side-effects' would be and how they will synergetically boost each other.

With my articles I intend only to touch some lines of the immense phase space [47] of the possibilia, with neither any ambition to think it possible to cover it all, nor for this to represent any form of advice.

Future is incompressible. Compression is comprehension. Comprehensible only by living.

Failure to go the geodesic way of learning will turn these beautiful but chilling words into prophecy:

"The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age." H.P.Lovecraft [48] (1926 ).''

tau-chain.info is not the official site of Tau-Chain. This website has NO affiliation or relationship with the company or the IDNI organization, and its contents are neither approved nor supervised by it. The website is dedicated to collaborating with the community and helping Tau and Agoras progress in a sustainable way.