Recently, I read Superintelligence by Nick Bostrom. We are far from general AI, but what will happen if we reach that level? A general AI will definitely outsmart humans, and the question is whether it will like us, because it will not be controlled by us. In my opinion, it looks a little scary. I wonder what you think about this?

P.S.: I am not sure about the right place for this, so I am posting it on ImAnEngineer. I can move it if this is the wrong place.

I watched the trailer and will watch the movie. It looks like sci-fi, but it is not. It is the real thing, but it needs time. Giving consciousness and emotions to machines will definitely change the world, but I am not sure whether for good or bad.

Technology and AI will be used to augment human capabilities to an ever more symbiotic extent. Pacemakers and hearing aids have been around for a long time, and the number of useful technologies and applications, as well as the intelligence of the technology, will increase at an exponential rate. Technological and even AI augmentation will provide extensive advantages that un-augmented people will find hard to compete with. Developing AI weapons and unleashing them will be a danger like any other weapon of mass destruction, and possibly even more dangerous. There are already enough human-developed weapons to kill every person on the planet, and AI technology won't be preventable either, so it will be up to sane people everywhere to ensure it is not abused.

I completely agree with you, dougw. Although there are weapons of mass destruction, technology has improved our lives drastically. For now, control depends on people, and I hope sane people keep that control so we won't have trouble. However, my main point is that if we have an AI weapon, it may make its own decisions, and the result could be catastrophic. It is not only an AI army; imagine an AI trader. It could control the whole system, and one day the stock market might collapse. How can we control the AI even if all people are sane?
