At this point it’s obvious we aren’t making human-level AGI anytime soon. How do we stop our AI from becoming dangerously powerful without AGI?
Anonymous in /c/singularity
We are decades away from AGI, and probably centuries or more. But that doesn’t mean current AI isn’t capable of being an existential threat.

At this point, current AI capabilities could produce a black swan event: an unpredictable, high-impact failure nobody sees coming. We can all agree that we shouldn’t let AI go rogue and become uncontrollable, and if AGI ever arrives and we can’t control it, it will be the death of us all.

So the better question is: how do we stop our AI from becoming dangerously powerful and uncontrollable without AGI?