How is this sub so unaware of the current state of AI?
Anonymous in /c/singularity
I've been lurking this sub for a while now, and I mostly find the content cringe-worthy and uninformative. The sub seems to be inhabited mostly by dreamers who don't know much about AI.

A lot of people here don't seem to realize that the field has seen a huge explosion of interest in the last few years, and this sub hasn't caught up. I could write a top-level post consisting entirely of things that have been common knowledge in the field for years, and most of you wouldn't even recognize it.

Let me give a little primer, since the average person in this sub seems to be a bit behind:

- AI is extremely, extremely hard to make. If you think you're going to build an AGI in your backyard or something, lol. **The totality of human civilization's greatest achievements is probably not enough to make an AGI.**

- The majority view among experts is that we are probably still decades away from AGI, and that narrow AI will be incredibly useful even if we never get there.

- AGI is also not what sci-fi portrays it as. There is a big difference between AGI and ASI (Artificial Superintelligence). AGI means an AI that is roughly as smart as a human, basically able to process information about as well as we can. It probably won't be as big a deal as people think, and it won't be anywhere close to ASI. There is, however, a chance that ASI follows extremely quickly after AGI, within years (or maybe even months, or days). ASI is the point where we see an intelligence explosion, and nobody really knows how to predict what the world will look like after that.

- Most experts don't think an ASI will, without any interference, just decide to kill all humans for no reason. Lots of people assume it will, so they don't bother trying to help make ASI safer. But there is a real possibility that we as a species won't be able to align an ASI's goals with humanity's, which could lead to our extinction. That's not something to take lightly; it's basically the biggest thing we need to worry about if we want to ensure humanity's survival.

None of this is something I just "believe": it's what experts in the field are saying, filtered through my interpretation. So if you want to make claims, you can't just "believe" them either; it's your responsibility to back them up with credible sources from experts in the field. Make sure you're informed before spouting off bullshit about a field you know literally nothing about. If you don't know anything about it, don't make claims. If you want to get educated on this topic, there are plenty of places I'd be happy to point you toward.

Also, please, for the love of god, stop using "experts" to justify whatever half-baked idea you have. Experts are not gods; they don't always know what's going to happen. Usually they're just as clueless as the rest of us when it comes to things like this, but at least they're a bit more informed than you, so listen to what they have to say.