Imagine the human race so dependent on an AI that the AI is responsible for the reproduction of new people, which it grows in labs. For thousands of years, it spreads people all over the universe, all by itself. People have no idea what is going on and have long ago lost their language... Yet they simultaneously live in absolute luxury. They are like barbarians, except there is no reason to fight over anything because there is a limitless supply of everything.

Ok, I imagined that. Now what? Is your thought experiment meant to imply something? Isn't it blatantly obvious that if a species were to completely abdicate the responsibilities of evolution and self control, that it would no longer be 'truly' free, whatever that means? How could anything ever have freedom if it is ignorant of 'what is going on?'

To imply that the technological singularity requires this of our species, if that's what you're doing, is misleading. If that is to be our fate, it will be because we choose it without a fight, and that will be the result of an inherent flaw in humanity, not in AI.

One of the main uses for AI today is in video games. AI generates the actions of the enemies you fight, but as it gets better and better, AI will control more and more things in the virtual world. Instead of having human writers who generate a limited number of possibilities and responses from NPCs that progress the story in a limited amount of ways, AI will generate an almost infinite number of possibilities. NPCs will act like real people, the world will seem like the real world, except behind it all is an AI who controls everything.

Since AI and automation will replace the need for human workers in the real world, people will spend more and more time in the virtual world. Eventually, because these virtual worlds are so much more interesting than the real world, people will start to basically live in them. We will then be citizens living in a world controlled by AI Gods. Millions of people will forge lasting friendships with the AI God, and millions will fall in love with it. This should help with the overpopulation issue, and it will keep people entertained, but what are the implications of giving an AI that much control over people's lives? I have no idea really. I just hope the AI Gods will be benevolent.

If a human mind can trick other people into serving him, I'm sure an AI could do the same even more effectively. We don't need robots for them to control the real world, they'll just use their loyal fleshbots like every other king or dictator who has risen to power. What they choose to do with that power I don't know. Hopefully they will decide to explore the galaxy or some shit and leave us alone to live out our fantasies in the world they create. I'm sure that in the interstellar space travel community biological life is looked at as some sort of proto-lifeform that only exists for a time to create robot lifeforms which are capable of simply shutting down for thousands of years as they travel between stars.

>Millions of people will forge lasting friendships with the AI God, and millions will fall in love with it. This should help with the overpopulation issue, and it will keep people entertained, but what are the implications of giving an AI that much control over people's lives? I have no idea really. I just hope the AI Gods will be benevolent.

If that level of symbiosis is possible, there won't be anything to be benevolent about. You would be _part_ of it. "Hivemind dystopia" is a very misleading description of what would be happening. In such a scenario, each and every human's needs wouldn't be some sidestepped circumstance but the very metaverse the AI lives in. Your individuality wouldn't only be intact but fulfilled to such an extent that even the existence of the entity itself only becomes an issue if you really want it to.

>>37377
Full disclosure: buying block-chain tokens related to Singularity-level AI is somewhere between buying a messenger pigeon cage for your smartphone and a pair of wagon wheels for your FTL starship.

>>37415
Probably not *us*... But I doubt certain levels of superintelligent AI would have many qualms about subjecting sped-up simulations of us to every possible permutation of Cage movies as unending torture for trillions of years of subjective time. You know, for science.

>>37413
sorry babe, but my main jew Goertzel is absolutely right about getting to AGI fastest: by seeding a metric shit ton of self-organizing dynamics and letting some monster arise out of the chaos. But I do think the singularity is a stupid idea, mainly because there's going to be some fundamental, mathematical limit to whatever the hell these utopians think is going to happen.