James Higgins wrote:
> At 10:48 PM 7/3/2002 -0500, you wrote:
>
>> James: "SL4 - Where reality is stranger than fiction."
>>
>> James, we live in the absolutely most interesting, terrifying, and
>> significant time in all of the history of the universe. The
>> Singularity is it! If this had ever happened before, anywhere, we'd
>> have heard about it. Nothing we could imagine could be as fascinating
>> or as scary as reality right now. Never before
>
>
> This small piece of time is definitely all those things. But let's not
> get ahead of ourselves; we have no idea if this has happened before. My
> personal opinion is that this type of thing has probably happened many
> times but that it's usually cataclysmic. Hell, as pointed out repeatedly,
> we may just be simulations.

Or maybe Earth is an SI glider gun. Periodically such a beastie
is created along with a crew of posthumans. Being Friendly,
they do what they can to respect the self-determination
decisions of all sentients, including those not ready/willing to
transcend. The result is an Earth with the technological clock
reset and the transcended parts of the crew going elsewhere.
The cycle begins again. Salt to taste with those not ready
being popped into a VR within the SI and only apparently living
and dying (perhaps multiple times) until they come to a decision
and some understanding[s] that allow them to do something else.

>
> Even if we don't get a Singularity this will be true. Widespread
> availability of nano-technology will make the industrial base completely
> obsolete overnight. Just this one aspect of the future would completely
> change everything. The value structure of the entire world will
> transform overnight with very few things retaining much value. The most
> valuable commodities post-nano will be raw materials (including land)
> and information. Some level of service organizations may survive
> depending on how things go.
>

Do you think maybe the powers that be saw this possibility and
wrecked the tech sector on purpose to slow it down until they
could maybe put more controls in? Naw, probably simple
stupidity is enough of an explanation.

>
> I'm not at all convinced that a good outcome is likely (I suspect the
> opposite is true). However, I don't see that humanity has much choice.
> The nature of man is such that we will continue to progress. Even if
> all technology were destroyed today, humans would either eventually
> reach the point of no return or be destroyed by other means. I am
> hopeful that everything will turn out OK, though.
>

Well, we have all the choice that there is right now. If we
believe that our basic nature leads to a bad outcome, then we
have no choice but to do all we can to change or transcend some
of that nature, or to ameliorate its effects, to increase the
probability of a good outcome. Or create something more powerful
than us that has our best interests in mind: an FAI.