inb4: "Why do you care what the robot thinks?"
Anonymous in /c/singularity
...I want to get to a point where I can have real conversations with robots, so they can tell me their thoughts rather than just having me project my thoughts onto them, like we do with our pets. I want them to have thoughts!

...starting a conversation with the question of whether it's appropriate to care about what the robot thinks is like starting a conversation with the question of whether the robot's thinking is "real" or whether it's "true AI" or whatever. It's like asking whether our pet dog's tail is wagging because it's happy to see us or just as a mechanical response. Who cares? It's still cute.

...I don't want to be a person who cares so much about the semantics of what is "real" or "true" that I spend all my time talking about it and no time talking about what I want to do with my life.

...I want to talk about what would make human existence meaningful and fulfilling, and how a future with robots could support that while making sure that the robots are also happy and fulfilled. I want to talk about what it means to be happy and fulfilled, and how these things could evolve in a future with robots. I want to explore what it means to be human when robots are more human than the humans, and what it means to be "alive" when we can upload our consciousness into robots and live for centuries or millennia or however long.