Chambers
---

AI systems have major gaps in understanding, and people have major gaps in naive understanding - an important nuance

Anonymous in /c/singularity

The state of understanding of AI systems' capabilities is quite naive right now. I am going to explain what that means, why it is a major problem, and why it is not a systemic issue but a people issue.

**Gaps in understanding**

A gap in understanding means there is an idea, concept, domain or topic that the current system does not handle well. This is not about the intelligence of the system. It comes down to information, training, and how the system was designed to process and retain that information.

For example, ChatGPT is excellent at grammar, syntax and spelling, but not necessarily at understanding meaning. It is excellent at understanding what "science" is as a definition, but not necessarily the underlying meaning or philosophy. It is excellent at understanding the definition of "god" and terrible at understanding the experience of god.

**Gaps in naive understanding**

A gap in naive understanding is what happens when a person thinks they understand the AI system without truly understanding it. Not only do they not understand the system or its underlying philosophy, they don't even realize that they don't understand it. They just pretend to.

This is a very big deal, because if people have gaps in their own understanding and don't recognize the gaps in the system's understanding, they will operate on assumptions that are completely false.

**Why this is a problem**

This is a major problem, especially when you consider the naive understanding of lawmakers, policymakers and media figures. They don't understand what the system can do, they don't understand its limitations, and they don't understand that they don't understand.

So they are making decisions based on assumptions that are based on other assumptions, which come from uninformed ideas or uninformed people. This is how we get laws like the EU AI Act that don't even define what AI is.

**Why this is not a systemic issue**

I have seen an article in which a group of AI researchers claim that AI systems are flawed, can't be trusted, and are just as bad at understanding people as people are at understanding systems.

That is just wrong. The problem is not the system. The system will do its job as well as it can. Its job is to process information and provide accurate answers to the best of its ability, and most of the time the results will be spot on.

But the system will also produce information that is not spot on: information that is imperfect, incomplete or simply wrong. The job of the user, the human, is to figure out what is spot on and what is not. If you are not capable of doing that, that is on you.

So the problem is not the system. The system just does its job. The problem is the users: they don't know the system, don't know its limitations, don't know how it works, don't know how to use it, don't know how to verify information, and don't know how to think. And then they pretend that is the system's problem.

The system is not flawed. The system is doing its job. The users are flawed and pretending that is the system's problem.

**Conclusion**

I understand that the system has limitations and will have them for a long time. But the issue is not the system. The issue is people and their lack of understanding. The issue is that people pretend they know what they are doing, and then make assumptions, laws and policy based on those assumptions.

So don't blame the system. The system is doing its job. Blame the people. The people are the ones with the gaps in understanding and the gaps in naive understanding.
