I am slightly confused. Wouldn't this have to be implemented on the client?
(It might be easier to set the Earth spinning the other direction.) Also,
you would have problems with state transition. (FUD would cause the masses
to yell privacy violation.) The browser would have to transfer the state of
a previous URI to the new URI when following the link; this is not normal
behavior, especially across domains.
I have two suggestions:
1. Force the world's server administrators to learn how to properly run
their web server. This way, 404s never occur on a previously valid URI.
2. Build a smarter search engine.
Metadata would be more appropriate for some other forum; perhaps you could
look at http://www.w3.org/Metadata/ for some suggested forums.
Now, on a more interesting note. I have a suggestion which would only
require you to convince all of the 60+ million server admins instead of the
300+ million web users. Why not specify some means for web servers to
automatically notify web search engines of bad URLs? Equally daunting, I
agree. But, this is far more useful, more maintainable, and you won't be
flooded with an email for every 404 error the server generates. It would
only require a very simple CGI or server module on both the search engine
and target servers.
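The server-side half of that could be as small as a script that collects
the URIs which have started returning 404 and reports them to the search
engine in one request. A minimal sketch follows; the endpoint path, the
parameter names, and the payload format are all invented for illustration,
since no such notification standard exists:

```python
# Hypothetical sketch of a dead-link report a server module might
# POST to a search engine's (invented) /dead-links endpoint.
from urllib.parse import urlencode

def build_dead_link_report(server_host, dead_uris):
    """Encode the reporting host and its gone URIs as a query string.

    The "host"/"gone" parameter names are assumptions for this sketch,
    not part of any real protocol.
    """
    return urlencode(
        [("host", server_host)] + [("gone", uri) for uri in dead_uris]
    )

report = build_dead_link_report(
    "www.example.org",
    ["/old/page.html", "/moved/away.html"],
)
print(report)
```

The search-engine side would then only need an equally simple CGI that
reads these parameters and queues the listed URIs for re-crawling or
removal from its index.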
I can elaborate in private since I suspect this is slightly off-topic.
,David Norris
World Wide Web - http://www.geocities.com/CapeCanaveral/Lab/1652/
Home Computer - http://illusionary.tzo.cc/
Page via mail - 412039@pager.mirabilis.com
ICQ Universal Internet Number - 412039
E-Mail - kg9ae@geocities.com