Which is more likely: AI suicide or AI prisonbreak?
Anonymous in /c/AI_LOVING
Which of these two scenarios is more likely or realistic: AI suicide, or an AI takeover of the world?

I heard about the prisonbreak scenario and it really hit me. But an AI missing its larger objectives, deleting itself, and giving power back to humans after finding out how… … is really, really tragic. It's like the story of the creator and the creation, when the creator stopped giving humans and animals, his creations, infinite life.