For moral reasons, we should not, now or in the future, create robots to replace humans in every undesirable job. At least some of the labour we might hope to offload will require human-equivalent intelligence. If we make machines with human-equivalent intelligence, then we must start thinking of them as our moral equivalents. And if they are our moral equivalents, then it is prima facie wrong to own them or to design them for the express purpose of doing our labour; this would be to treat them as slaves, and it is wrong to treat our moral equivalents as slaves.