... if it is not equipped with a "moral code" or something that resembles a conscience. Sure, you have Asimov's Laws of Robotics, but I don't believe three rules are enough to accommodate the decision-making process of a thinking computer.

Being able to think and to organize in societies implies some sort of "moral code". Even relatively primitive animals have it.

It will likely be different from the human "moral code", but provided that machines have some interest in keeping us around (whether for safety, as an effort to "preserve an environment", or simply as farm stock), that doesn't automatically mean our extinction. Just like we didn't kill off all species of animals.

Yeah, about that... we're already most likely behind the ongoing extinction event, which will be one of the most rapid in the geological record - biologists estimate extinction rates at least hundreds of times higher than the background level, with a large part of all species gone by the end of this century.

Hey, please stop right there. It is not a machine's "moral code" but the developer's moral code injected into that machine. There is no such thing as a machine's own consciousness or free will, now or in the future, according to the evidence from computer science.