Does half of a brain feel half of a pain?

Is the mind the same as the body? What is consciousness? Can machines have it?

Post by WillSanguine »

TL;DR: If you run a computer simulation that simulates 1/1,000th of a brain, does it experience any sensations? Even 1/1,000th of a sensation?

I am going to propose a thought experiment. This thought experiment assumes you believe it is possible to build a computer or other human-made system that is conscious and experiences sensations. You might then believe that it is unethical to cause pain to such a system. The thought experiment is meant to show a potential contradiction or paradox arising from such a belief. I am not sure how to get out of this paradox, except perhaps through some radical sort of dualism.

Neuroscientist Dr. Green has a special "nerve simulator" kit which lets him build a working system of neurons, perhaps in digital / software form or perhaps in a more material form. Let's say that this system is just complex enough that it is capable of experiencing pleasure or pain. This requires 100,000 neurons to be simulated. Let's say that several philosophers and neuroscientists who inspect the system agree that it is in fact _unethical_ to activate the system in a way that will cause pain to the system.

Dr. Green, however, is able to simulate the left half of the system alone (50,000 neurons) or the right half alone (50,000 neurons). Would this be unethical, even a little bit? If you think that it would be a little bit unethical, how far "down" are you willing to go with this? For instance, would it be unethical to simulate _one_ neuron experiencing pain, even a little bit unethical (say, 1/100,000 as much)? What about one part of a neuron, such as a single axon? What about one molecule of one neuron? That seems unlikely. But if the answer becomes "no" somewhere on the way down, then there must be some smallest possible system that can experience any pain at all.

Say it turns out that 100,000 neurons can experience pain but 50,000 neurons cannot. Dr. Green can then take the left and right halves of the system (50,000 neurons each, divided into two boxes) and simulate them side by side, simultaneously. Would this be unethical? If the answer is no, there is a problem: how is this different than simulating the whole system at once? If the answer is yes, there is still a problem: why would it be ethical to do these things at different times and in different locations, but unethical to do them at the same time, side-by-side?
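
To make the setup concrete, here is a minimal toy sketch in Python (my own illustration, nothing like a real nerve simulator: a crude leaky-integrate-and-fire update over 1,000 random "neurons", with every number and name made up). It only shows what "simulate the left half alone" means mechanically: keep just the neurons on that side, plus the synapses whose source and target both lie on that side.

import numpy as np

# Toy stand-in for Dr. Green's kit: N random "neurons" with sparse synapses.
rng = np.random.default_rng(0)
N = 1000                                   # scaled down from 100,000
left = np.arange(N // 2)                   # indices of the left half
right = np.arange(N // 2, N)               # indices of the right half
W = rng.normal(0.0, 0.1, (N, N)) * (rng.random((N, N)) < 0.05)

def simulate(weights, steps=200, tau=0.9, threshold=1.0, seed=1):
    """Crude leaky-integrate-and-fire update; returns a (steps, n) spike raster."""
    noise = np.random.default_rng(seed)
    n = weights.shape[0]
    v = np.zeros(n)                        # membrane potentials
    prev = np.zeros(n)                     # spikes from the previous step
    raster = np.zeros((steps, n), dtype=bool)
    for t in range(steps):
        v = tau * v + weights @ prev + noise.normal(0.0, 0.3, n)
        fired = v >= threshold
        v[fired] = 0.0                     # reset neurons that fired
        raster[t] = fired
        prev = fired.astype(float)
    return raster

whole_system = simulate(W)                          # all neurons, all synapses
left_alone = simulate(W[np.ix_(left, left)])        # left half only; cross-midline synapses gone
right_alone = simulate(W[np.ix_(right, right)])     # right half only

Whether anything morally relevant disappears along with the severed cross-midline synapses is, of course, exactly what the code cannot tell us.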

Post by GreatandWiseTrixie »

Society is shite, yet nobody says it is unethical to bring babies into a world where they will definitely experience pain.
There's also no way to measure whether the pain exists, or whether the simulation is conscious.

Post by EchoesOfTheHorizon »

You can do this with under 100 connections, not 100,000. I used to do a lot of work on mnemonic logic clones to double-check my theories, and would purposefully bifurcate them network-wise (4 networked together, 8 halves, 44 points altogether prior to the 4 being networked to the problem I had to solve). I called these Logic Clones, as they gave me a pretty good idea of what my ideas would be, all at once, on quick inspection.

An obvious hurdle to your idea is that you have the philosophers claiming the simulation will only work if 100,000 neurons are working, yet you can shut it down to 50,000 in one lobe.

Ummm..... clearly they either understood this and thought nothing of it (the human brain has many reward and punishment centers scattered about, but the emotional pain we most focus on is generated in just one lobe), or they misdiagnosed the system without knowing this and gave an ethical opinion prematurely.

First off, pain isn't inherently unethical. Killing a person, ending their survival, generally is. Pain is a survival tool when it is working right: it tells us not to do things, to stop and heal, or to get someone to take care of us. Plenty of pain goes haywire, though, especially chronic pain.

What you are not exploring is why it would be "unethical" for this AI to feel pain with such a small neural network. I presume you claim it has consciousness? My logic clones didn't; they merely predicted my thoughts axiomatically, to their logical conclusions. For a system to feel consciousness, I think even an ant has to have a more complex mind than that..... I don't know that giving such a minuscule mind an impulse of "pain" is really torture. It might get the signal and not want it, desire to desist, but I have my doubts that torturing an ant rivals torturing a creature with a much larger and more sophisticated brain and neurology, like a gecko. I'm pretty sure a gecko can experience torture, and not like it, just like we can. It also has a neurological capacity that would blow your system out of the water.

So, how many ethical philosophers are going to rush in and study a system with less consciousness than an ant, reach a conclusion, and then lose any sleep at all if this mad scientist goes to town trying to torture it? Can he even be successful?

It is a question of magnitude coupled with complexity. You'd need the qualities of the neural network to increase dramatically: it needs to be coherent, a sense needs to be added, as well as the ability to react negatively, to recoil from and reject the stimulus, preferring a less painful alternative. You can't call the less painful alternative pleasure, either. When a doctor hits your knee testing your reflex, your knee isn't seeking pleasure.

Once you get to the complexity of a gecko, and only then, will I start to worry. There is no point building a brain that sophisticated and then throwing it into a permanent mindfuck mode.

For human-scale simulations, if you approach it not holistically..... in other words not as a full human mind, but rather exclusively and a priori, symptomatically (testing only what you are searching for), then yeah, it might be okay, so long as the AI isn't able to think fully like a human.

An example: let's say Jerry has PTSD, it is 100 years from now, and a doctor wants to operate to fix it, because that's how they approach the issue 100 years from now. They need to know the exact patterns of PTSD, how the thoughts around it form..... so every time they touch a neuron, an AI display actively shows the part of Jerry's inner consciousness that relates to a negative, PTSD-like trigger stimulus.

If all the AI is is a handful of memories inducted into it (very few), a simulated pain center, a simulated pleasure center..... with an effort to judge whether this needs to be nicked out or kept around, then I think no reason exists why a simulated AI taking only a handful of memories, simulating real human trauma, is unethical.

1) temporally, such a mind exists only for a few seconds
2) lacks the larger complexity of soul
3) lacks a sense of awareness of unique self, as a machine in a lab
4) any impulse to survive, while ethical by default in it, is ultimately pointless; see nos. 1, 2, 3


Torturing a full copy of a brain even for just a few seconds doesn't pass muster.... that brain is every bit as complex as you are. If I tortured you for a few seconds, you wouldn't like it. That is because a full copy of a human brain has that complexity of soul. Even if the brain isn't aware it is merely a simulation, an AI, it thinks it is real, and it functions just like us in our everyday lives. You could have conversations with it on a meaningful and emotional level, and vice versa.


But a mere limited machine that remembers just a handful of memories.... like that time you bumped your toe as a kid, or a dog licking your hand..... checking for pain or pleasure here, or even administering it..... that isn't a full-blown person. I can think such a moment, and it goes away in a second; other aspects of the mind then take over. We have complexity, a life to live. Such an AI is stuck in a mere handful of moments. It couldn't easily build a meaningful existence out of that, even if you left it running for 1,000 years, with such a small amount of memory to work off of.

Magnitude and complexity, as well as length of time, are important here.

Post by Arising_uk »

GreatandWiseTrixie wrote: Fri Nov 17, 2017 2:12 pm Society is shite, yet nobody says it is unethical to bring babies into a world where they will definitely experience pain.

...
Actually some do; they are called antinatalists, and we have a poster on this board, Dalek prime, who advocates this cause.

https://en.wikipedia.org/wiki/Antinatal ... id_Benatar

Post by Impenitent »

anthropomorphic fallacy

-Imp

Post by Arising_uk »

EchoesOfTheHorizon wrote:You can do this with under 100 connections, not 100,000. I used to do a lot of work on mnemonic logic clones to double-check my theories, and would purposefully bifurcate them network-wise (4 networked together, 8 halves, 44 points altogether prior to the 4 being networked to the problem I had to solve). I called these Logic Clones, as they gave me a pretty good idea of what my ideas would be, all at once, on quick inspection. ...
That's interesting. Can you give an example of a theory, and the results, that you used this method on?

Post by Dimebag »

WillSanguine wrote: Sat Nov 11, 2017 4:31 pm Say it turns out that 100,000 neurons can experience pain but 50,000 neurons cannot. Dr. Green can then take the left and right halves of the system (50,000 neurons each, divided into two boxes) and simulate them side by side, simultaneously. Would this be unethical? If the answer is no, there is a problem: how is this different than simulating the whole system at once? If the answer is yes, there is still a problem: why would it be ethical to do these things at different times and in different locations, but unethical to do them at the same time, side-by-side?
Unless the two halves were connected in such a way that they could causally interact to allow the pain experience to occur, I don't see any problem. Alone, each half is unable to cause an experience to occur. No matter how close you put the two systems, they could not interact with one another to allow an experience of pain to occur. Just as putting my head close to your head doesn't give me any privileged access to your experiences, so too the two systems remain inert to each other.

The interaction is of key importance. Just as my holding a match 1 metre away from a tin of gas will not cause ignition, the potential of the combined parts can only be realised when they are interacting.
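
To put the same point in toy-simulation terms (a rough, purely illustrative sketch: a crude leaky-integrate-and-fire model with made-up numbers, not anyone's real system): once every synapse crossing the midline is severed, running the two halves "side by side" in one simulation is step-for-step identical to running each half in its own box. Proximity adds nothing unless there is interaction.

import numpy as np

# Toy network with NO synapses crossing the midline.
rng = np.random.default_rng(0)
N = 1000
left, right = np.arange(N // 2), np.arange(N // 2, N)
W = rng.normal(0.0, 0.1, (N, N)) * (rng.random((N, N)) < 0.05)
W[np.ix_(left, right)] = 0.0               # sever right -> left synapses
W[np.ix_(right, left)] = 0.0               # sever left -> right synapses

steps = 200
drive = rng.normal(0.0, 0.3, (steps, N))   # one shared external input, drawn in advance

def run(weights, drive, tau=0.9, threshold=1.0):
    """Leaky-integrate-and-fire update driven by a pre-drawn input."""
    v = np.zeros(weights.shape[0])          # membrane potentials
    prev = np.zeros(weights.shape[0])       # spikes from the previous step
    raster = np.zeros(drive.shape, dtype=bool)
    for t in range(drive.shape[0]):
        v = tau * v + weights @ prev + drive[t]
        fired = v >= threshold
        v[fired] = 0.0                      # reset neurons that fired
        raster[t] = fired
        prev = fired.astype(float)
    return raster

side_by_side = run(W, drive)                                     # both halves in one simulation
left_box = run(W[np.ix_(left, left)], drive[:, left])            # left half in its own box
right_box = run(W[np.ix_(right, right)], drive[:, right])        # right half in its own box

assert np.array_equal(side_by_side[:, left], left_box)           # spike-for-spike identical
assert np.array_equal(side_by_side[:, right], right_box)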