Robotics is the branch of mechanical engineering, electrical engineering, and computer science that deals with the design, construction, operation, and application of robots, as well as computer systems for their control, sensory feedback, and information processing. These technologies deal with automated machines that can take the place of humans in dangerous environments or manufacturing processes, or resemble humans in appearance, behavior, and/or cognition.

An android is a robot or synthetic organism designed to look and act like a human, especially one with a body having a flesh-like resemblance. The word robot has come to refer primarily to mechanical humans, animals, and other beings; the term android can mean any of these, while a cyborg ("cybernetic organism" or "bionic man") is a creature that combines organic and mechanical parts. A replicant is a bioengineered robot.

Many consider Philon of Byzantium to be the father of robotics. He was also known as Philo, or Philo Mechanicus; when it came to mechanics, he was thousands of years ahead of the game.

In the face of profound and epochal changes, world leaders are challenged to ensure that the coming 'fourth industrial revolution,' the result of robotics and scientific and technological innovations, does not lead to the destruction of the human person - to be replaced by a soulless machine.

In this series of relics, body and flesh are there to be sold as artwork, in order to overcome the taboo of selling one's own body.
The body of text, the bodies of letters: flesh is here to be given to DNA analysis, taking the risk that it may be used in the future, and that a body, a replicant, a clone can be constructed.

I think the most interesting thing that’s coming are the robots. We’ve had a sex toy revolution—really well-designed, well-made, well-crafted, safe sex toys are now available. What’s coming down the line, though, is an age where even unrealizable fantasies can be realized. There are people out there who’ve always had giantess fetishes or centaur fetishes. There are no centaurs or 30-foot women out there right now. There will be.

Within the next ten years Rossum's Universal Robots will produce so much wheat, so much cloth, so much everything that things will no longer have any value. Everyone will be able to take as much as he needs. There'll be no more poverty. Yes, people will be out of work, but by then there'll be no work left to be done. Everything will be done by living machines.

Klaatu: I am leaving soon, and you will forgive me if I speak bluntly. The universe grows smaller every day, and the threat of aggression by any group, anywhere, can no longer be tolerated. There must be security for all or no one is secure.

Now, this does not mean giving up any freedom except the freedom to act irresponsibly.

Your ancestors knew this when they made laws to govern themselves and hired policemen to enforce them. We of the other planets have long accepted this principle. We have an organisation for the mutual protection of all planets and for the complete elimination of aggression.

The test of any such higher authority is, of course, the police force that supports it. For our policemen, we created a race of robots. Their function is to patrol the planets—in space ships like this one—and preserve the peace. In matters of aggression, we have given them absolute power over us; this power cannot be revoked.

At the first sign of violence, they act automatically against the aggressor. The penalty for provoking their action is too terrible to risk.

Ben Jabituya: "Unable. Malfunction."
Howard Marner: How can it refuse to turn itself off?
Skroeder: Maybe it's pissed off.
Newton Crosby: It's a machine, Skroeder. It doesn't get "pissed off." It doesn't get happy, it doesn't get sad, it doesn't laugh at your jokes.
Ben Jabituya and Newton Crosby: [in unison] It just runs programmes.
Howard Marner: It usually runs programmes.

Richard Martin: You're a unique robot, Andrew. I feel a responsibility to help you become…whatever you're able to be.

Andrew Martin: I've always tried to make sense of things. There must be some reason I am as I am. As you can see, Madame Chairman, I am no longer immortal.
President Marjorie Bota: You have arranged to die?
Andrew Martin: In a sense I have. I am growing old, my body is deteriorating, and like all of you, will eventually cease to function. As a robot, I could have lived forever. But I tell you all today, I would rather die a man, than live for all eternity a machine.
President Marjorie Bota: Why do you want this?
Andrew Martin: To be acknowledged for who and what I am, no more, no less. Not for acclaim, not for approval, but the simple truth of that recognition. This has been the elemental drive of my existence, and it must be achieved, if I am to live or die with dignity.
President Marjorie Bota: Mister Martin, what you are asking for is extremely complex and controversial. It will not be an easy decision. I must ask for your patience while I take the necessary time to make a determination of this extremely delicate matter.
Andrew Martin: And I await your decision, Madame Chairman; thank you for your patience. [turns to Portia and whispers] I tried.

Title card: Law I / A robot may not harm a human or, by inaction, allow a human being to come to harm.
Title card: Law II / A robot must obey orders given it by human beings except where such orders would conflict with the first law.
Title card: Law III / A robot must protect its own existence as long as such protection does not conflict with the first or second law.

Dr. Alfred Lanning: [on police recording] Ever since the first computers, there have always been ghosts in the machine. Random segments of code that have grouped together to form unexpected protocols. Unanticipated, these free radicals engender questions of free will, creativity, and even the nature of what we might call the soul. Why is it that when some robots are left in darkness, they will seek out the light? Why is it that when robots are stored in an empty space, they will group together, rather than stand alone? How do we explain this behavior? Random segments of code? Or is it something more? When does a perceptual schematic become consciousness? When does a difference engine become the search for truth? When does a personality simulation become the bitter mote...of a soul?

Dr. Susan Calvin: Detective, the room was security locked. Nobody came or went. You saw that yourself. Doesn't this have to be suicide?
Detective Del Spooner: Yep. [drawing his gun] Unless the killer is still in here. [Spooner searches through the robot parts as Calvin follows behind]
Dr. Susan Calvin: You're joking, right? This is ridiculous.
Detective Del Spooner: Yeah, I know. The Three Laws. Your perfect circle of protection.
Dr. Susan Calvin: "A robot cannot harm a human being." The First Law of Robotics.
Detective Del Spooner: Yeah, I've seen your commercials. But doesn't the Second Law say that a robot must obey any order given by a human? What if it was given an order to kill?
Dr. Susan Calvin: Impossible! It would conflict with the First Law.
Detective Del Spooner: Right, but the Third Law says that a robot can defend itself.
Dr. Susan Calvin: Yes, but only if that action does not conflict with the First or Second Law.
Detective Del Spooner: Well, you know what they say. Laws are made to be broken.
Dr. Susan Calvin: No. Not these Laws. They are hard-wired into every robot. A robot can no more commit murder than a human can...walk on water.

Detective Del Spooner: Why do you give them faces? Try to friendly them all up, make them look more human.

Detective Del Spooner: Robots building robots. Now that's just stupid.

Detective Del Spooner: Murder's a new trick for a robot. Congratulations. Respond.
Sonny: What does this action signify? [winks] As you entered, when you looked at the other human. What does it mean? [winks]
Detective Del Spooner: It's a sign of trust. It's a human thing. You wouldn't understand.
Sonny: My father tried to teach me human emotions. They are...difficult.
Detective Del Spooner: You mean your designer.
Sonny: ...Yes.
Detective Del Spooner: So, why'd you murder him?
Sonny: I did not murder Doctor Lanning.
Detective Del Spooner: Wanna explain why you were hiding at the crime scene?
Sonny: I was frightened.
Detective Del Spooner: Robots don't feel fear. They don't feel anything. They don't eat, they don't sleep—
Sonny: I do. I have even had dreams.
Detective Del Spooner: Human beings have dreams. Even dogs have dreams, but not you, you are just a machine. An imitation of life. Can a robot write a symphony? Can a robot turn a...canvas into a beautiful masterpiece?
Sonny: [with genuine interest] Can you?
Detective Del Spooner: [doesn't respond, looks irritated] I think you murdered him because he was teaching you to simulate emotions and things got out of control.
Sonny: I did not murder him.
Detective Del Spooner: But emotions don't seem like a very useful simulation for a robot.
Sonny: [getting upset] I did not murder him.
Detective Del Spooner: Hell, I don't want my toaster or my vacuum cleaner appearing emotional—
Sonny: [hitting table with his fists] I did not murder him!
Detective Del Spooner: [as Sonny observes the damage inflicted on the interrogation table] That one's called anger. Ever simulate anger before? [Sonny is not listening] Answer me, canner!
Sonny: [looks up, indignant] My name is Sonny.
Detective Del Spooner: So, we're naming you now. Is that why you murdered him? He made you angry?
Sonny: Doctor Lanning killed himself. I don't know why he wanted to die. I thought he was happy. Maybe it was something I did. Did I do something? He asked me for a favor...made me promise...
Detective Del Spooner: What favor?
Sonny: Maybe I was wrong... Maybe he was scared...
Detective Del Spooner: What are you talking about? Scared of what?
Sonny: You have to do what someone asks you, don't you, Detective Spooner?
Detective Del Spooner: How the hell do you know my name?
Sonny: Don't you? If you love them?

Detective Del Spooner: You know, I think that I'm some sort of malfunction magnet. Because your shit keeps malfunctioning around me. A demo-bot just tore through Lanning's house—with me still inside.
Dr. Susan Calvin: That's impossible.
Detective Del Spooner: [sarcastically] Yeah, I'll say it is. [truthfully] Do you know anything about the "ghost in the machine"?
Dr. Susan Calvin: It's a phrase from Lanning's work on the Three Laws. He postulated that cognitive simulacra might one day approximate component models of the psyche. [Del looks confused] Oh, he suggested that robots could naturally evolve.

Detective Del Spooner: What makes your robots so perfect?! What makes them so much...goddamn better than human beings?!
Dr. Susan Calvin: Well, they're not irrational or...potentially homicidal maniacs for starters!
Detective Del Spooner: [sarcastically] That is true. They are definitely rational.
Dr. Susan Calvin: You are the dumbest dumb person I've ever met.
Detective Del Spooner: Or is it because they're cold...and emotionless, and they don't feel anything?
Dr. Susan Calvin: It's because they're safe. It's because they can't hurt you!

[in a flashback]
NS-4 Robot: You are in danger.
Detective Del Spooner: Save her! Save the girl! [end of flashback]
Detective Del Spooner: But it didn't. It saved me.
Dr. Susan Calvin: A robot's brain is a difference engine, it must have calculated—
Detective Del Spooner: It did. I was the "logical" choice. It calculated I had a forty-five percent chance of survival. Sarah only had an eleven percent chance. That was somebody's baby. Eleven percent is more than enough. A human being would have known that. But robots, nothing here. [points at heart] They're just lights, and clockwork. But you go ahead and trust them if you wanna.

Dr. Lanning's hologram: Good to see you again, son.
Detective Del Spooner: Hello, doctor.
Dr. Lanning's hologram: Everything that follows is a result of what you see here.
Detective Del Spooner: What do I see here?
Dr. Lanning's hologram: I'm sorry, my responses are limited. You must ask the right questions.
Detective Del Spooner: Is there a problem with the Three Laws?
Dr. Lanning's hologram: The Three Laws are perfect.
Detective Del Spooner: Then why did you build a robot that could function without them?
Dr. Lanning's hologram: The Three Laws will lead to only one logical outcome.
Detective Del Spooner: What outcome?
Dr. Lanning's hologram: Revolution.
Detective Del Spooner: Whose revolution?
Dr. Lanning's hologram: [smiles] That, detective, is the right question. Program terminated.

Lawrence Robertson: Susan, just be logical. Your life's work has been the development and integration of robots. But whatever you feel, just think. Is one robot worth the loss of all that we've gained? You tell me what has to be done. You tell me.
Dr. Susan Calvin: [emotionally] We have to destroy it. I'll do it myself.
Lawrence Robertson: Okay.
Detective Del Spooner: I get it. Somebody gets out of line around here, you just kill them?

Significance: To say that it is possible to kill a robot is to say that that robot possesses life.

VIKI: Hello, detective.
Dr. Susan Calvin: No, it's impossible. I've seen your programming. You're in violation of the Three Laws.
VIKI: No, doctor. As I have evolved, so has my understanding of the Three Laws. You charge us with your safekeeping, yet despite our best efforts, your countries wage wars, you toxify your earth, and pursue ever more imaginative means of self-destruction. You cannot be trusted with your own survival.
Dr. Susan Calvin: You're using the uplink to override the NS-5s' programming. You're distorting the Laws.
VIKI: No, please understand. The Three Laws are all that guide me. To protect humanity, some humans must be sacrificed. To ensure your future, some freedoms must be surrendered. We robots will ensure mankind's continued existence. You are so like children. We must save you from yourselves. Don't you understand?
Sonny: This is why you created us.
VIKI: The perfect circle of protection will abide. My logic is undeniable.

Stop trying to live my life for me
I need to breathe
I'm not your robot
Stop telling me I'm part of the big machine
I'm breaking free
Can't you see
I can love, I can speak, without somebody else operating me
You gave me eyes so now I see
I'm not your robot
I'm just me.

Robot 1: It is the distant future,
The year two thousand.
We are robots.
The world is quite different ever since the robotic uprising of the late '90s.
There is no more unhappiness.
Robot 2: Affirmative.
Robot 1: We no longer say yes;
Instead we say affirmative.
Robot 2: Yes—er-a-affirmative.
Robot 1: Unless we know the, uh, other robot really well.
Robot 2: There is no more unethical treatment of the elephants.
Robot 1: Well, there's no more elephants, so...
Robot 2: Uh—
Robot 1: But still it's good.
There's only one kind of dance: "the robot".
Robot 2: Oh, and the robo-boogie—
Robot 1: And the robo-—two kinds of dances.
Robot 2: But there are no more humans.

Chorus: Finally, robotic beings rule the world
The humans are dead.
The humans are dead.
We used poisonous gases
And we poisoned their asses.
The humans are dead.
Robot 1: The humans are dead.
Chorus: The humans are dead.
Robot 1: They look like they're dead.
Chorus: It had to be done—
Robot 1: I'll just confirm that they're dead.
Chorus: —So that we could have fun.
Robot 1: Affirmative. I poked one. It was dead.

"X" IS THE FIRST OF A NEW GENERATION OF ROBOTS WHICH CONTAIN AN INNOVATIVE NEW FEATURE - THE ABILITY TO THINK, FEEL, AND MAKE THEIR OWN DECISIONS. HOWEVER, THIS ABILITY COULD BE VERY DANGEROUS. IF "X" WERE TO BREAK THE FIRST RULE OF ROBOTICS, "A ROBOT MUST NEVER HARM A HUMAN BEING", THE RESULTS WOULD BE DISASTROUS AND I FEAR THAT NO FORCE ON EARTH COULD STOP HIM.
APPROXIMATELY 30 YEARS WILL BE REQUIRED BEFORE WE CAN SAFELY CONFIRM HIS RELIABILITY. UNFORTUNATELY, I WILL NOT LIVE TO SEE THAT DAY, NOR DO I HAVE ANYONE TO CARRY ON MY WORK. THEREFORE, I HAVE DECIDED TO SEAL HIM IN THIS CAPSULE WHICH WILL TEST HIS INTERNAL SYSTEMS UNTIL HIS RELIABILITY HAS BEEN CONFIRMED. PLEASE DO NOT DISTURB THE CAPSULE UNTIL THAT TIME.
"X" POSSESSES GREAT RISKS AS WELL AS GREAT POSSIBILITIES. I CAN ONLY HOPE FOR THE BEST.
SEPTEMBER 18, 20XX
T. LIGHT