Evolutionary reason for consciousness?

Is the mind the same as the body? What is consciousness? Can machines have it?


Skepdick
Posts: 14366
Joined: Fri Jun 14, 2019 11:16 am

Re: Evolutionary reason for consciousness?

Post by Skepdick »

-1- wrote: Tue Nov 12, 2019 6:40 am Eventually every person, reasonable or not, smart or stupid, fat or lean, old or young, rich or poor, realizes that Skepdick is precisely that.

He is impossible. Not to argue with; he is just plain impossible. A real waste of humanity. His parents now think he'd be lucky to amount to nothing.

But to realize this, he needs to pull you in first, which everyone falls for. I did. The trouble is that it's easier, and far more permanent, to fall out of thinking that he is normal, or even human, than it ever was to think he is normal. That sentiment never returns.
When a philosopher tells me I am a waste of a human, I am almost offended. Almost.

Indeed, I'd be lucky to amount to nothing if the alternative is amounting to a philosopher.
Dimebag
Posts: 520
Joined: Sun Mar 06, 2011 2:12 am

Re: Evolutionary reason for consciousness?

Post by Dimebag »

Zelebg wrote: Tue Nov 12, 2019 12:14 am
Dimebag wrote: Mon Nov 11, 2019 1:56 pm I have higher hopes though, that to truly solve the easy problem as Chalmers coined it, which would be to understand the total functionality of consciousness, we would gain insight into why there is anything it’s like to be conscious.
Ok, but in the meantime I would like you to try and think of one thing that we get with qualia which would provide a basis for some functionality that cannot be computed without it.

Ideas, original concepts, for example. Can a computer do that, and to what degree? Or dreams: do they maybe provide some functionality that cannot be computed without them? Something along those lines. Just try, if you will, and see what comes out of your brain.
You talk about computing as if we currently had machines that can do the things we can do. We have machines that can identify a cat after we input millions of images into a neural network, but that is not the way we perceive the world. When a neural net sees a cat, there is nothing it can do with that information. There is no action it can take. It has no thoughts about it. It simply identifies something.
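The gap being described here is visible in the interface itself. As a purely illustrative sketch (the centroids, labels, and nearest-centroid rule are invented stand-ins for a real trained network), a classifier is just a function from an input to a label, and its entire output is that bare label:

```python
# Toy stand-in for an image classifier: nearest-centroid over 2-D
# "feature vectors". The point is the interface, not the model:
# the output is a bare label, with no attached possibilities for action.

CENTROIDS = {
    "cat": (1.0, 1.0),   # hypothetical feature centroids
    "dog": (4.0, 4.0),
}

def classify(features):
    """Return the label whose centroid is nearest; nothing more."""
    def dist2(c):
        return (features[0] - c[0]) ** 2 + (features[1] - c[1]) ** 2
    return min(CENTROIDS, key=lambda label: dist2(CENTROIDS[label]))

print(classify((1.2, 0.9)))  # prints "cat" -- a label, not an affordance
```

Whatever happens downstream of the label (patting, picking up, fleeing) has to be supplied by some other system; the classifier itself, as the post argues, "simply identifies something".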

We, on the other hand, when we see the cat, can at the same time apprehend all the ways we might interact with it: we can pat it, pick it up, play with it, or... if we were hungry and desperate enough, kill and eat it. These are affordances, which our brain creates through the interaction between perception and action.

When you reach for a cup, your hand automatically forms the correct grasping shape for the object in question. When your fingers make contact, a signal is sent from your fingertips into the somatosensory cortex; this incoming signal is modulated as a certain pressure is reached, and the sensory signal essentially tells the hand when to stop applying pressure. This is qualia in action.
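The feedback loop described above can be sketched as a few lines of code. This is a minimal, hedged illustration, not a model of real somatosensory processing: the toy object, the pressure target, and the step size are all invented for the example.

```python
# Sketch of the loop: grip force increases until fingertip
# pressure reaches a target, then the "stop squeezing" signal fires.

def grasp(read_pressure, increase_grip, target=1.0, step=0.25, max_steps=100):
    """Close the grip until the sensed pressure reaches the target."""
    for _ in range(max_steps):
        if read_pressure() >= target:
            return True          # contact signal says "stop applying pressure"
        increase_grip(step)      # otherwise apply a little more force
    return False                 # give up: the object never resisted

# Toy "object": sensed pressure rises one-for-one with applied grip force.
class ToyObject:
    def __init__(self):
        self.force = 0.0
    def read_pressure(self):
        return self.force        # stiff object: pressure == applied force
    def increase_grip(self, df):
        self.force += df

obj = ToyObject()
grasp(obj.read_pressure, obj.increase_grip)
print(obj.force)  # prints 1.0 -- the force at which the loop stopped
```

The controller only ever sees a number from the sensor; whether such a closed loop "happening in the dark" would amount to qualia is exactly the question the post goes on to raise.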

Could this happen in the dark? I don’t know. There is a sense in which it does happen in the dark, when we aren’t attending to that which is producing the signal, if our attention is focussed elsewhere. But even then, there is some sense there, so no, not totally in the dark.

Could we build a machine that could grasp a cup in the way we do? Yes, I think we have. Could that same machine also pick up a pencil and start writing? Or crack an egg perfectly without making a mess? Or change a baby's nappy? Not without us explicitly programming it to do so. It cannot program itself. It cannot learn.

When you look at raw sensory qualia separately from what they do, yes, it's hard to find a reason or purpose for them. But that is not the state of a human. Qualia are useful: they provide precise informational feedback in a compressed manner, in a “language” that is specific to the sensory organs in question and one that can be understood by different systems of the brain.

Humans appreciate sensation. They savour tastes, they savour bodily feelings, they appreciate music and percussion; without an inner world there could be no pleasure, only action and reaction. Similarly, we have states of pain so motivating that they can scar a person for life. That may not be the best state for well-being, but it could keep a person alive, instead of their making the same mistake and dying before they can pass on their genes. These qualia are more potent and motivating than mere information could ever be. They are essential for our survival.

Try to imagine the philosophical zombie: an organism exactly like us, structured in the same way, but with no inner world. This is actually not possible. We know we have consciousness, and we know it is due to our brains, so to imagine something like that is to imagine something which could not be. A delusion. When a human is assembled in a certain way and its brain is operating in a certain way, it will have an experience. We can imagine the philosophical zombie only because we haven't thought it through enough to see why it is nonsensical.

Qualia can't be separated from the brain and body in a certain state. And likewise, when the brain and body are in a certain state, there are qualia. On some level we may just need to accept this. It's not hard to accept. But we want to know how the brain creates them. We think of them as separate from what the brain is doing, but they are not; they are what the brain is doing. They are emergent from the brain in the same way that waves emerge from an ocean: a wave cannot exist without a body of water, and so it is for our experience. It is not anything separate from the brain's functioning; it is that functioning.
SteveKlinko
Posts: 800
Joined: Fri Apr 21, 2017 1:52 pm

Re: Evolutionary reason for consciousness?

Post by SteveKlinko »

Zelebg wrote: Mon Nov 11, 2019 4:52 am Some say it's pain and joy, or more precisely the ability to feel, or be conscious of, fear and desire, for the purpose of producing a creatively reacting organism as opposed to a merely reflex-reactive one. In other words, when an organism encounters danger, the ability to feel fear gives it options to deal with the danger creatively and in advance, whereas with reflex-reaction it can only run away, possibly too late. The same goes for desire and the seeking of pleasure. To sum up: experiencing feelings subjectively makes it possible to seek pleasure and avoid danger more efficiently through the new functionality of newly acquired sentience, the ability to plan in advance.

First, are there any other theories of why consciousness exists? Second, this all makes sense, except I do not see why that, or any other functionality, necessarily requires being accompanied by subjective experience, or qualia. Why the need to actually suffer pain? Why does it need to hurt, instead of the organism just having information about a 'pain signal'? Why feel unpleasant fear, instead of simply getting a 'fear signal' and 'computing' how to avoid the 'pain signal', without actually feeling or being conscious of anything?
Scientists can describe the Neural Activity that occurs in the Brain when we See. But they seem to be completely puzzled by the Conscious Visual experience that is correlated with that Neural Activity. Incredibly, some even come to the conclusion that the Conscious experience is not necessary! They cannot find the Conscious experience in the Neurons, so the experience must not have any function in the Visual process. They believe that the Neural Activity is sufficient for us to move around in the world without bumping into things.

This is insane denial of the obvious purpose of Visual Consciousness. Neural Activity is not enough; we would be blind without the Conscious Visual experience. From a Systems Engineering point of view, it is clear that the Conscious Visual experience is a further Processing stage that comes after the Neural Activity. The Conscious Visual experience is the thing that allows us to move around in the world. It contains vast amounts of information about the external world, all packed up into a single thing. To implement all the functionality of the Conscious Visual experience with Neural Activity alone would probably require a Brain as big as a refrigerator.