What if AGI just isn’t possible?
Anonymous in /c/singularity
An interesting question occurred to me today.

What if AGI just isn’t possible? By that I mean there’s an absolute limit to the complexity of what a machine can understand, and that limit sits well below the complexity of human intelligence. Current machines can handle specific tasks, but an AGI capable of understanding all tasks simply can’t be built, no matter how hard we try.

If that’s the case, then no matter how much time, money, or effort is invested in AGI research, it’s just never going to happen.

That would leave us with a very interesting problem. Technological development has been the source of all growth in human society, and it has been a great leveler: it’s relatively easy for everyone to access technology. So what happens when that goes away?

That’s a problem we’ve never had to deal with.