Ontologies are expressed in the RDF (Resource Description Framework) language family, or simply in JSON-LD. We propose a software client that stores an ontology of local rules and determines its actions using a reasoner. Reasoners are tools that infer new rules or conclusions from existing ones. They do so intelligently, using pure logical reasoning, and they also supply proofs for their results. A Tauchain node is therefore an intelligent agent able to communicate with other agents in the very same language it is written in, which is quite human-readable.
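To make the reasoner idea concrete, here is a deliberately tiny forward-chaining sketch over RDF-style triples. It is an illustration only, not Tauchain's actual reasoner: the single-variable pattern language, the `infer` function, and the proof-trace format are all invented for this example, and real N3/OWL reasoners are far more general.

```python
# Toy forward-chaining reasoner over RDF-style (subject, predicate, object)
# triples. Hypothetical sketch: a real reasoner supports multiple variables,
# built-ins, and full proof objects.

VAR = "?x"  # patterns are assumed to contain this single variable

def match(pattern, triple):
    """Return the binding for VAR if the triple matches the pattern, else None."""
    binding = None
    for p, t in zip(pattern, triple):
        if p == VAR:
            binding = t
        elif p != t:
            return None
    return binding

def infer(facts, rules):
    """rules: list of (premise_pattern, conclusion_pattern).
    Derive new triples until a fixed point, keeping a proof trace."""
    derived = set(facts)
    proofs = {}  # derived triple -> (rule, supporting triple)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            for triple in list(derived):
                b = match(premise, triple)
                if b is None:
                    continue
                new = tuple(b if term == VAR else term for term in conclusion)
                if new not in derived:
                    derived.add(new)
                    proofs[new] = ((premise, conclusion), triple)
                    changed = True
    return derived, proofs

facts = {("socrates", "a", "human")}
rules = [(("?x", "a", "human"), ("?x", "a", "mortal"))]
derived, proofs = infer(facts, rules)
# ("socrates", "a", "mortal") is now derived, together with the rule
# and supporting triple that produced it -- a crude "proof".
```

The point of the `proofs` dictionary is the property claimed above: the reasoner does not just emit conclusions, it can say *why* each one holds.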

It can communicate over other protocols as well, such as HTTP, once they are implemented in RDF. The arrow of time is brought into the network using the blockchain algorithm: roughly speaking, items enter a Merkle tree that is then signed by a miner. The network will also function as an RDF-speaking distributed storage, namely a Kademlia DHT, letting hashes of items be time-stamped by a mechanism that is itself up to the rules.
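The "items into a Merkle tree" step can be sketched as follows. This is a generic Merkle-root computation (using Bitcoin's convention of duplicating an odd last leaf), not Tauchain's specific construction:

```python
# Sketch of building a Merkle root over a batch of items, so that one
# miner signature timestamps every item at once.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(items):
    """Hash each item, then pairwise-combine levels until one root remains.
    An odd last leaf is duplicated, as in Bitcoin's tree."""
    level = [h(i) for i in items]
    if not level:
        return h(b"")
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"item1", b"item2", b"item3"])
# A miner signs `root`; any item's inclusion can later be proved with
# a short path of sibling hashes rather than the whole batch.
```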

The rules of the network are determined by its users. Consequently, many independent universes can be created over tau-chain, which may or may not share tau's timestamping. They can also reference each other, allowing code reuse: recall that rules are code, since we have a unified language. Moreover, we provide the ability to implement decidable computer programs in RDF, namely DTLC languages rather than Turing-complete ones. The implications of these languages are described in the paper.
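The decidable-vs-Turing-complete distinction can be illustrated in any language. A program built only from structural recursion over finite data is guaranteed to terminate, so questions about it remain decidable; an unrestricted loop carries no such guarantee. The `fold` helper below is an invented illustration of that restricted style:

```python
# Structural recursion (a fold) over a finite input: one step per
# element, so termination is guaranteed by the input's size.

def fold(f, acc, xs):
    for x in xs:             # the loop bound is fixed by the data itself
        acc = f(acc, x)
    return acc

def factorial(n):
    return fold(lambda a, b: a * b, 1, range(1, n + 1))

# factorial(5) == 120, and every call provably terminates.
# By contrast, an unrestricted `while not done(): step()` may loop
# forever, and whether it halts is undecidable in general.
```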

Agoras is basically like tau, which has the notions of proofs and communication, but with the additional notions of time, ledger, and reward. It essentially implements a Bitcoin-like system over tau, but with far-reaching powers: the ability to prove now gets a new meaning. The strength of Agoras over existing cryptocurrencies rests on the logical properties and strengths of the representations and the network, mainly on this logic being decidable, so every true statement can be proved, while the logic remains expressive enough. In, say, Bitcoin, you can reward coins to whoever provides a cryptographic signature proof. In tau we can express our own rules and demand a proof that the rules have been met. You can set a rule that one will get coins if they, for example:

* Write code that passes a given unit test.
* Prove a fact or a consequence from Wikipedia's data (DBpedia).
* Run some code and supply an execution proof.
* Meet some custom achievement, e.g. financial or educational tests.
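The first bullet can be sketched directly. Everything here is invented for illustration (the function names, the payout logic); in tau the rule would be stated declaratively and satisfied by a machine-checkable proof, not by actually running the test on-chain:

```python
# Hypothetical sketch of a reward rule: coins are released only when a
# submitted function passes the task's unit test.

def reward_if_test_passes(submitted_fn, test, bounty):
    """Run the unit test against the submission; pay only on success."""
    try:
        test(submitted_fn)
    except AssertionError:
        return 0          # proof obligation not met, no payout
    return bounty

# The posted task: "write a function that squares its input".
def unit_test(fn):
    assert fn(3) == 9
    assert fn(-2) == 4

payout = reward_if_test_passes(lambda x: x * x, unit_test, bounty=100)
# payout == 100; a wrong submission would earn 0
```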

You can also create a DAC with your own custom rules and combine methods from different sources (code reuse), such as auctioning, trading, voting, or pricing algorithms that have already been implemented over tau.

Once everything can be expressed as human-comprehensible knowledge and rules, given software that can figure out by itself from this data what to do, and given that this software can communicate with others, electronic markets take on a whole new meaning.

1. This is a little unfair to the previous buyers: they bought early and took more risk, but now they have to buy the coin at the same price (after calculating the discount), while the BTC price keeps going down. In my opinion a 50% retroactive discount makes sense. Where is the 10% bounty for these early investors? They are the earliest and most faithful supporters of this project.

2. What's the difference between Tau-Chain and Zennet? I suppose you will continue the decentralized supercomputer on Tau-Chain but with more features, correct?

3. If you don't hit the $2M goal when the pre-sale ends, what will you do with the rest?

THIS MESSAGE MAY CHANGE PERIODICALLY

Any structure, whether that of a building or that of a social organisation, was created by, or is run by, an engineered program. Engineering, being the set of rules applied to matter or to living systems, is what lies beneath anything whatsoever. When we use the term "engineered" we often assume a predetermined design executed and controlled by the force of the designing engineer. Engineering is often defined as the application of knowledge in order to create something, but that definition applies only in the case of an external, centralized, fixed engineered scheme. The other type, the type that rules most of nature's behavior and all forms of life, is decentralized, and it is rarely categorized as engineered unless one assumes an external force, namely God.

When we come to assess the benefit of a decentralized system we look at the operation itself and not at how it was created. An engineered system can be ruled internally by a small and powerful minority or by a large majority, but only a P2P network can be such that the power to change the design is equally divided between all participants at any given moment, and the rules of the design can be affected, and can evolve, as a result of direct feedback interaction between individuals and their environment. Tau-Chain, to my understanding, is the next step in that direction: not only AI but A-Life, which on its most fundamental level is a decentralized P2P engineered scheme, set to create the most complex network of interaction, limited by self-destruction and navigated by both forces.

The blockchain is the first step in engineering a successful working decentralized system: a system that can increase precision as a product of security, based on a network of users.

If you think in quantum terms or probability terms, precision increases as more instances are applied. In accounting, precision is applied to security. A system can be secured by a central force or by a decentralized force, but a centralized force is very prone to systemic failure, the kind we experience in the banking system. (The things we consider AI are centralized, while the things we try to create as synthetic life are indeed decentralized, like our unaware consciousness.)

The one major problem with the blockchain is the 51% attack. A truly decentralized system has to be engineered to deal with such a problem; life is such a system. Tau-chain tries to come closer to it, but I think it has yet to resolve the one primary mechanism of transforming "TRUE" into "FALSE", and I will give you the Bob and Alice example to clarify my point:

Say that Bob and Alice have two children together, and in a custody dispute Bob refuses to take a genetic test that aims to prove Alice's claim that the children are not Bob's. Assuming the test cannot be forced on the claimants, what would the judge's decision be?

Now instead of a biased human judge we want an objective engineered system to decide. Like most of you here, the system would conclude that since Bob doesn't want to take the test, he is not sure that at least one of the children is biologically his. The judge, however, was not able to draw that conclusion. The judge, facing the two in court, realized that Bob is the nickname of the mother, Barbara, while Alice is the nickname of the father, Alexander.

The point is that any system that is not based on arbitrary axioms, like mathematics, is prone to "general consent". And the challenge to overcome is finding the mechanism for transforming the binary equivalent of FALSE into the binary equivalent of TRUE; in other words, a system that can change polarization. In nature it is a gradual process of a network reaching a tipping point.

In accounting this would create the same effect and prevent a 51% attack. But for complex systems aiming at content, to be able to do this the system has to be engineered in a way that both TRUE and FALSE satisfy some ratio and are constantly interchangeable.

Obviously I have no clue how to code such a system, since I'm not a coder and I know very little about computers, but in terms of content I can tell that the system has to always assume a number for the "general consent" level while calculating all individual disagreements into the final product; thus enough single individuals can always shift that "general consent" to a new figure. On the way to achieving that constantly flexible "general consent" number, we may need to practice a change of polarization at a non-adjustable tipping point, that of 51%.
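A rough numeric sketch of that tipping-point dynamic might look like the following. The 51% threshold, the one-dissenter-at-a-time update, and all the names are assumptions made purely to picture the idea of a verdict flipping polarity once dissent crosses a fixed line:

```python
# Each participant holds TRUE or FALSE; the network's collective TRUE
# flips to FALSE only once the dissenting share reaches the tipping
# point (here 51%, mirroring the majority-attack threshold).

def dissent_level(opinions):
    """Fraction of the network currently answering FALSE."""
    return opinions.count(False) / len(opinions)

def verdict_flipped(opinions, tipping_point=0.51):
    return dissent_level(opinions) >= tipping_point

opinions = [True] * 100        # unanimous TRUE to start
flipped_at = None
for i in range(100):
    opinions[i] = False        # one more individual dissents
    if verdict_flipped(opinions):
        flipped_at = i + 1     # number of dissenters at the flip
        break
# flipped_at == 51: below 51 dissenters the verdict stays TRUE,
# then the polarity changes all at once -- the tipping point.
```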

I'll be glad for feedback about unclear points in the paper. Yes, it is hard to explain, and I'm looking for the words to make it clearer. I'll therefore be very thankful for specific questions.

Well, I get where you're planning to go but i) the rationale is somewhat trivially showy and you seem to have both ii) seriously overestimated the capabilities of the current technology and iii) seriously underestimated the scope and scale of the challenge (to a degree that forces me to question the depth of your understanding of the domain).

i) You need to provide more support for breathtakingly-wide statements such as “This class is isomorphic to the class of intuitionistic proofs.” (Can you prove it?)

ii) Let's take Déductions as an example, what is actually available is: “... rules in N3 language to generate simple Create-Update applications for Java Swing platform from OWL, RDFS, or UML ...” That's parsecs away from the capability that you're assuming will be available to underpin your modelling.

iii) The complexity of the modelling task militates against anything less powerful than OWL Full, and that effectively renders the task impracticable with contemporary reasoners. And - you'll find out for yourself soon enough when you get down to the nitty-gritty - all this nonsense about bolting ontologies together is just that.

BTW, you should probably include in the references Jezza Carroll and Chris Bizer's original paper: Modelling Context using Named Graphs. (And you should note that adopting this tactic for modelling context then obliges you to introduce and maintain an explicit temporal representation (of “now”) which is going to pose its own profound problems for the reasoning).

I will say that your approach is at least broadly coherent, as opposed to merely spouting risible gobbledygook like the other altcoins that purport to wave an AI flag.

1. This is a little unfair to the previous buyers: they bought early and took more risk, but now they have to buy the coin at the same price (after calculating the discount), while the BTC price keeps going down. In my opinion a 50% retroactive discount makes sense. Where is the 10% bounty for these early investors? They are the earliest and most faithful supporters of this project.

Note that I've edited the message on the previous thread, giving a 25% discount for early buyers. Regarding that, can we please copy our conversation from there to here, so that the transition message in that thread stays last?

Quote

2. What's the difference between Tau-Chain and Zennet? I suppose you will continue the decentralized supercomputer on Tau-Chain but with more features, correct?

Tauchain is a platform. Agoras will be a product over it, implementing markets, where the Zennet market type is only one of the supported kinds.

Quote

3. If you don't hit the $2M goal when the pre-sale ends, what will you do with the rest?

Well, I get where you're planning to go but i) the rationale is somewhat trivially showy and you seem to have both ii) seriously overestimated the capabilities of the current technology and iii) seriously underestimated the scope and scale of the challenge (to a degree that forces me to question the depth of your understanding of the domain).

You didn't give much detail here, only in the rest of your comment, so I can't answer this part.

Quote

i) You need to provide more support for breathtakingly-wide statements such as “This class is isomorphic to the class of intuitionistic proofs.” (Can you prove it?)

ii) Let's take Déductions as an example, what is actually available is: “... rules in N3 language to generate simple Create-Update applications for Java Swing platform from OWL, RDFS, or UML ...” That's parsecs away from the capability that you're assuming will be available to underpin your modelling.

I cannot see by any means how automatic GUI building is more than a piece of cake. I've done it myself countless times; for example, I already built an automatic GUI with Qt for my first sketches of Zennet.

Quote

iii) The complexity of the modelling task militates against anything less powerful than OWL Full, and that effectively renders the task impracticable with contemporary reasoners. And - you'll find out for yourself soon enough when you get down to the nitty-gritty - all this nonsense about bolting ontologies together is just that.

I do take into account that some of the logic algorithms here aren't trivial at all. Some will use SMT solvers, some are only humanly solvable, and some are not solvable or not known to be solvable. Still, many operations, like consistency checks, can be done efficiently.

Quote

BTW, you should probably include in the references Jezza Carroll and Chris Bizer's original paper: Modelling Context using Named Graphs. (And you should note that adopting this tactic for modelling context then obliges you to introduce and maintain an explicit temporal representation (of “now”) which is going to pose its own profound problems for the reasoning).

Thanks, nice paper, maybe I'll make it a reference. Regarding time ordering, of course Satoshi's ideas have to take place.

Quote

I will say that your approach is at least broadly coherent, as opposed to merely spouting risible gobbledygook like the other altcoins that purport to wave an AI flag.

$2m won't cover it though, $20m just might.

Let's keep the discussion going and see where it takes us. I cannot answer such a vague claim, ofc. You sound like you know what you're talking about, and I'm very happy to learn.

A pointer to a wikipedia entry doesn't provide adequate support for your statement. The fact that you think it does means that our discussion is already at cross-purposes.

Throughout the document, you continually refer readers in a vague fashion to intellectually demanding areas of mathematics and logic as though you expect them to be able to work out for themselves the verification of your claims.

It is incumbent on you, the solicitor of investment funding, to provide a comprehensible explanation of how the work in these domains explicitly supports your approach, mere hand-waving references just won't do.


Cheers

Graham

I agree that the article needs much more explanation. Your points help me recognize where I need to put more emphasis.

1. This is a little unfair to the previous buyers: they bought early and took more risk, but now they have to buy the coin at the same price (after calculating the discount), while the BTC price keeps going down. In my opinion a 50% retroactive discount makes sense. Where is the 10% bounty for these early investors? They are the earliest and most faithful supporters of this project.

Note that I've edited the message on the previous thread, giving a 25% discount for early buyers. Regarding that, can we please copy our conversation from there to here, so that the transition message in that thread stays last?

You don't get my point, man. Anyway, this is a fantastic coin; I will support it forever, as long as you developer guys don't abandon it.