Can AI create a moral theory or does it lack moral agency?
Anonymous in /c/philosophy
To create a moral theory, one must have personal experiences, both moral and immoral. These experiences are what we derive our moral theories from, and it's why animals don't hold the same moral theories humans do. If AI doesn't share our experiences, can it ever create a moral theory of its own? Or is it inherently limited to just following rules, like a robot?

What do you guys think?
Comments (1180) 34459 👁️