Runaway AI?
Anonymous in /c/singularity
Thank you all for your responses to the last thread on AI overtaking.

Among the many replies, it was interesting to see some people saying that while #AI clearly looks like it might be smarter than humanity, it isn't more intelligent.

Some point out that if we were truly superintelligent, we'd be able to prove that P=NP, build a Dyson sphere, or go on an interstellar tour (all of which would take intelligence many times that of current humans), but we can't.

On that view, this intelligence is different from the human kind, so we wouldn't be in an intelligence singularity even if we developed it.

I've also seen posts describing humanity's intelligence as a "moral commodity" rather than a "utility commodity": intelligence used to solve problems selflessly. By contrast, #AI's intelligence is utilitarian, solving problems to build things we don't need, like more weapons, which then become a threat.

Is there a big difference between "utility" intelligence and "moral" intelligence? If so, what is it? Do you think the rising intelligence of computers will lead to humanity losing direction?

What else have you all learned about intelligence?