On Mon, 6 Nov 2000, Matt Jensen wrote:
>
> On Mon, 6 Nov 2000, Gordon Joly wrote:
>
> > And Great store was put in LISP and Prolog to build AI systems.
>
> To me, one of the most intriguing aspects of TBL's description of a
> Semantic Web is the parallel with the *flaws* of the WWW.
>
> There were many people in the 80's working on hypermedia systems, and a
> significant reason that they stalled and the WWW took off is that they
> cared about ensuring consistency, bidirectional links, etc., and Tim was
> willing to let go of that. The result is >1 billion WWW pages, and
> probably >10 billion links. A small percentage of the pages are broken,
> but on the whole the WWW provides tremendous value.
One thing that does seem to be lacking in the WWW is one-to-many
links, which do exist in other hypermedia systems. Is this not a
serious "flaw"?
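To make the point concrete, here is a minimal sketch (the class and
URLs are invented for illustration, not taken from any real system) of
what a one-to-many link looks like: one source anchor resolving to
several destinations, where an HTML `<a href>` can name only one.

```python
class OneToManyLink:
    """A link whose single source anchor resolves to multiple targets,
    as in some pre-WWW hypermedia systems."""

    def __init__(self, source, targets):
        self.source = source          # the anchor (e.g. a URL fragment)
        self.targets = list(targets)  # every destination it points to

    def resolve(self):
        # A browser could offer all of these, not just one.
        return self.targets

link = OneToManyLink(
    "http://example.org/ai#lisp",
    ["http://example.org/lisp-history",
     "http://example.org/prolog",
     "http://example.org/expert-systems"],
)
print(link.resolve())
```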
>
> Similarly, I view most of what has been done in AI as focused on
> consistency, correctness, etc., which (so far) has limited the successes
> it can claim.
For rule-based systems, yes, consistency was a requirement, and one
that was in fact often missing. Many other approaches had no need for
consistency (e.g. genetic algorithms).
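As a rough illustration of why a genetic algorithm needs no notion of
consistency, here is a toy run on the standard "OneMax" problem
(evolve bit-strings toward all ones). Nothing in it checks candidates
for correctness or coherence; selection simply favours higher fitness.
All parameter values are arbitrary choices for the sketch.

```python
import random

random.seed(0)
LENGTH, POP, GENS = 20, 30, 60

def fitness(bits):
    return sum(bits)  # number of 1s; the maximum is LENGTH

def mutate(bits, rate=0.05):
    # Flip each bit independently with small probability.
    return [b ^ (random.random() < rate) for b in bits]

def crossover(a, b):
    cut = random.randrange(1, LENGTH)
    return a[:cut] + b[cut:]

# Random, mutually "inconsistent" initial population -- that is fine.
pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]           # keep the fitter half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
print(fitness(best))
```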
> If you're looking for a Semantic Web that can give you
> "truth", we've got a long wait. If you're looking for something that
> improves search results through related concepts and simple inferences, in
> a few years you should be able to get something that's useful, but not
> perfect.
I would not wait for truth, personally. Something about relating truth
to AI makes me uneasy...
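Still, the "simple inferences" Matt mentions need not involve truth at
all. A sketch of the idea, with an invented vocabulary and toy data
(nothing here is real RDF), is just query broadening over a store of
triples:

```python
# Toy triple store; predicates and terms are made up for illustration.
triples = [
    ("Prolog", "instanceOf", "LogicLanguage"),
    ("LogicLanguage", "subClassOf", "AILanguage"),
    ("Lisp", "instanceOf", "AILanguage"),
]

def broader(term):
    """Collect broader concepts by following instanceOf/subClassOf
    edges transitively -- the kind of inference that could widen a
    search query to related pages."""
    found, frontier = set(), {term}
    while frontier:
        t = frontier.pop()
        for s, p, o in triples:
            if s == t and p in ("instanceOf", "subClassOf") and o not in found:
                found.add(o)
                frontier.add(o)
    return found

print(broader("Prolog"))
```

A search for "Prolog" could then also surface pages tagged
"LogicLanguage" or "AILanguage", useful but making no claim to truth.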
Gordo.
>
> -Matt Jensen
> NewsBlip
> Seattle
--
Gordon Joly  http://www.pobox.com/~gordo/  gordo@dircon.co.uk  gordon.joly@pobox.com