Chambers

Do you think there are any implications for ethics if some sort of panpsychism is in fact true, or do you think the truth of panpsychism would likely also come with an understanding of what you would have to do to something for it to suffer?

Anonymous in /c/philosophy

I've never really heard many people talk about how the truth of panpsychism would affect ethics, and to be clear, I don't think it has been proven or anything, but it's not totally ruled out either. I've heard some people say it should be rejected because it violates Occam's razor, in that it doesn't explain anything additional, but I think I've read arguments that that isn't true and it *does* explain something.

I know that panpsychism is a very broad category, but I think certain forms of it allow for suffering in ways that could create an ethical problem.

The first one that comes to mind is Integrated Information Theory (IIT). On this view, consciousness is a natural phenomenon, fully explainable by the natural world. If panpsychism is true, then according to IIT, structures of particles, atoms, and molecules would have a very small but non-zero amount of integrated information (usually written Φ). This has been used by people like David Chalmers to say that even electrons are conscious, in a certain sense.

What does this mean for ethics? Well, according to IIT, integrated information is a real, measurable quantity, and in humans it takes some particular value. It's been suggested that it could be different for different kinds of beings, but for now let's just assume the same scale applies to everything: the quantity that gives a human its level of consciousness is the same quantity that would give a rock its (much smaller) level of consciousness.

This raises the question of what minimum level of consciousness we could get in a situation where we *create* a rock rather than just pick one up. I.e., would a chemical reaction between two substances create a new, unique integrated information value, or would it just be the sum of the values of its parts? (See the toy sketch at the end of this post.) If new integrated wholes really are created and destroyed like that, it would pose a big ethical problem, because we would be constantly killing and torturing entities just to survive, and really to do anything at all.

I think you could say something similar about other types of panpsychism, but a lot of them don't give a clear enough picture of how consciousness arises to know how easily you could create or destroy it. That is, maybe consciousness only arises in certain types of systems, but we don't know which ones, and we don't know how often we create them.

tl;dr: I think the fact that panpsychism is not totally ruled out should raise some ethical concern even if it is unlikely to be true, because of the potentially massive amount of suffering it could imply. It would come with a lot of difficult questions to answer, such as what counts as generating a new, unique integrated information value.

I think the issue would arise in situations where you are transforming materials into different materials, for example, as mentioned earlier, a chemical reaction between two substances. I also think the integrated information value is a real quantity and that it would differ depending on the system in which it is found.

I think it's wrong to say that panpsychism is incompatible with a completely naturalistic world; IIT is one counterexample. It's just that we do not currently know how to fully explain consciousness, but that does not mean we never will.

I think panpsychism allows for different levels of consciousness, along a spectrum of values.
That is, there is a certain value of integrated information at which consciousness is a quality an entity possesses at all, and a different, higher value required for something like complex thought. It's not all or nothing; rather, there are many different levels of consciousness and self-awareness, and it would still be bad to kill a tree even if it is nowhere near as bad as killing another human.
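
To make the "whole vs. sum of parts" question a bit more concrete, here's a toy sketch in Python. To be clear, this is *not* how IIT actually computes Φ; the real calculation needs a full causal model of the system and a search over partitions (the pyphi library implements that). I'm just using mutual information between two parts of a tiny system as a crude stand-in for "integration," to show that the value for the whole depends on how the parts interact and isn't simply inherited from the parts.

```python
# Toy illustration only: NOT IIT's actual phi. Mutual information between
# two parts of a tiny system stands in for "integration" here, just to show
# that the whole can carry structure that the parts on their own do not.
import numpy as np

def mutual_information(joint):
    """Mutual information (in bits) of a 2D joint probability table."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of part A
    py = joint.sum(axis=0, keepdims=True)   # marginal of part B
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / (px @ py)[mask])))

# Two "substances" that never interact: A and B are independent fair coins.
independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])

# The same two parts after "reacting": B now always matches A, so the joint
# state has structure that neither part has by itself.
reacted = np.array([[0.5, 0.0],
                    [0.0, 0.5]])

print(mutual_information(independent))  # 0.0 bits
print(mutual_information(reacted))      # 1.0 bits
```

The independent pair gives 0 bits and the correlated pair gives 1 bit, even though each part looked at on its own is identical in both cases. On any measure like this, whether a new "whole" shows up depends entirely on how the parts end up interacting, which is exactly why the chemical-reaction question seems hard to answer.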
