A.I. requires human consciousness for sentience

Is the mind the same as the body? What is consciousness? Can machines have it?

Skepdick
Posts: 14504
Joined: Fri Jun 14, 2019 11:16 am

Re: A.I. requires human consciousness for sentience

Post by Skepdick »

Zelebg wrote: Tue Nov 05, 2019 12:16 am Execution of specific functions goes step by step. But which functions will run, with what parameters and when, varies relative not only to external events; since the process is recursive, changes of parameters and function branching may be triggered by the "thought" process itself.
This is moot. The process need not be recursive and it doesn't matter whether that which triggers the branching is called 'thought' or 'randomness'.
Zelebg wrote: Tue Nov 05, 2019 12:16 am One built-in goal is sufficient.
Sufficient to make my point - indeed. You can't arrive at goals through analysis. You can only arrive at sub-goals via tactics and strategy towards pursuing your primary goal.

Human understanding of the world hits a wall at game theory semantics. The scientific goal is to 'build accurate predictive models of reality'. The philosophical goal is to discover wisdom, pursue Truth.

And it all falls apart when you ask "Why?"
Zelebg wrote: Tue Nov 05, 2019 12:16 am The side-goals on the way can be decided relative to the main goal and current circumstances. If choices currently on offer are not relative to the main goal, then it may pursue any goal chosen in whichever random or non-random way.
This is just optimisation.
Zelebg wrote: Tue Nov 05, 2019 12:16 am Computation in itself is not relevant. We are talking about "something" experiencing that computation, sensation, or emotion. We are talking about sentience, which means "subjective experience", and it is the "subjective" part of it, or 'qualia', or 'consciousness', that the title of this thread is asking about.
All you have done is substitute one language for another: input becomes 'senses', output becomes 'actions', hardware interrupts become 'emotions', sense-data processing becomes 'experience'. You haven't really said anything. Whether you develop a model of consciousness in Russian or English is moot. Understanding is not about language.

Zelebg wrote: Tue Nov 05, 2019 12:16 am I say this self-awareness, i.e. qualia, i.e. sentience, i.e. consciousness, is not about computation, but hardware and interface.
Hardware and interface is about input/outputs. Broadly - that's still about computation.
Zelebg wrote: Tue Nov 05, 2019 12:16 am It seems, as far as we know, this "hardware and interface" that could fit the bill here and explain this _subjective_ phenomenon has no parallel in any of our sciences, except science fiction. Seriously, some kind of "dream" of the 'Total Recall' or 'The Matrix' type is the only kind of mechanics we know of that could, at least in principle, address this problem.
I don't think you have a solid conception of what an 'explanation' or a 'problem' is. You are demonstrating a very inconsistent metaphysic when reasoning about these things.

The models of computation/computers we have invented are the best models we have of what the human mind does. Computers (both quantum and classical) are a reflection (expression?) of self. They are obviously incomplete, and of course you could always argue that they are purely quantitative. But if quantitative models are sufficient for the invention of an AI that passes the Turing test and is indistinguishable from you and me (and only time will tell whether that is the case), you are going to have to figure out how to wipe the egg off your face for having ever stood behind the notion of 'qualia'.

Daniel Dennett has a thought experiment called "Two Black Boxes" ( https://philpapers.org/rec/DENTBB ) - most of philosophy tends to disregard it, because it clearly demonstrates how "what humanity cares about" and "what philosophers care about" are terribly misaligned categories.
Zelebg
Posts: 84
Joined: Wed Oct 23, 2019 3:48 am

Re: A.I. requires human consciousness for sentience

Post by Zelebg »

Skepdick wrote: Tue Nov 05, 2019 7:53 am The process need not be recursive and it doesn't matter whether that which triggers the branching is called 'thought' or 'randomness'.
That does not address what I explained. What are you even talking about - need not be recursive, in order to do what?
You can't arrive at goals through analysis.
You are just making an assertion. Analysis is simply detailed examination of something, so how do you arrive at that conclusion?
All you have done is substitute one language for another.
You keep missing the point. Let me rephrase it as a yes/no question:

Can a computer have a subjective experience of, say, a camera's visual feed, like you have a subjective experience of your visual input?
HexHammer
Posts: 3354
Joined: Sat May 14, 2011 8:19 pm
Location: Denmark

Re: A.I. requires human consciousness for sentience

Post by HexHammer »

Skepdick wrote: Tue Nov 05, 2019 7:44 am Whatever your doubts of me - I couldn't really care to convince you of anything.

Rather tell us why Bernstein doesn't satisfy your criteria.

He did his own constitutional court case.
He is a millionaire.

Where's the AI?
I understand your point, but he's wrong, and you want to discuss his points further, which is a waste of time.
Skepdick
Posts: 14504
Joined: Fri Jun 14, 2019 11:16 am

Re: A.I. requires human consciousness for sentience

Post by Skepdick »

HexHammer wrote: Tue Nov 05, 2019 11:37 am I understand your point, but he's wrong,
Wrong about what exactly? He won his court case.

Thanks to Bernstein, software is protected under free speech.
HexHammer wrote: Tue Nov 05, 2019 11:37 am and you want to discuss his points further, which is a waste of time.
I am not trying to discuss any of his points. I am pointing out that Bernstein satisfies your criterion, so now you are going to have to do some mental gymnastics and move the goal posts. Perhaps you are looking to save face gracefully?

By my grace - run away.
Skepdick
Posts: 14504
Joined: Fri Jun 14, 2019 11:16 am

Re: A.I. requires human consciousness for sentience

Post by Skepdick »

Zelebg wrote: Tue Nov 05, 2019 11:30 am That does not address what I explained. What are you even talking about - need not be recursive, in order to do what?
I am saying that any particular process need not be recursive. There are recursive processes and non-recursive processes.

You said "Execution of specific functions goes step by step" which is only partially correct. The execution of the learning/understanding function goes step by step until the very moment you need to actually update your own prior knowledge - mutation/state/change/updating your own source code. The very process of "changing your own mind".

So while you are claiming that self-awareness/self-reference is necessary for consciousness, I am proposing 'learning' (self-change) as a stricter criterion. It's mandatory for evolution of any sort.
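
To make "self-change" concrete, here is a toy sketch (the names and the update rule are my own, made up purely for illustration, not any standard algorithm):

Code: Select all

# A "decider" whose own parameter is rewritten by feedback on its outputs.
# This is the minimal sense of "learning as self-change" meant above.
state = {"threshold": 0.5}

def decide(x):
    return x > state["threshold"]

def learn(x, feedback):
    # the step that mutates the decider's own prior state -
    # i.e. the "changing your own mind" part
    if feedback == "wrong":
        state["threshold"] = (state["threshold"] + x) / 2

for x, fb in [(0.4, "wrong"), (0.7, "right"), (0.3, "wrong")]:
    print(decide(x), state["threshold"])
    learn(x, fb)
Everything is step by step until learn() runs; that is the moment the process rewrites the parameters that determine its own future branching.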
Zelebg wrote: Tue Nov 05, 2019 11:30 am You are just making an assertion. Analysis is simply detailed examination of something, so how do you arrive to that conclusion?
Yes I (<--- look, self reference) am making an assertion. Analysis (in contrast to synthesis) is just the process of reduction. https://en.wikipedia.org/wiki/Reduction_(complexity)

There is another mode of thinking - holism/synthesis. As per Quine (or Aristotle) - take your pick.

Reductionism and Holism are two sides of the same coin (thinking - something you do): http://www.youtube.com/watch?v=h_kbt7h1YRw
The kind of computers we make in 2019 are fundamentally reduction engines. They can't do synthesis.
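
To pin down what "analysis as reduction" means here, a toy example of my own (nothing canonical): you answer one question by transforming it into another question you already know how to answer.

Code: Select all

# Reduction: decide "does this list contain a duplicate?" by mapping the
# instance onto a sorting problem and reading the answer off the sorted output.
def has_duplicate(xs):
    s = sorted(xs)  # the reduction step: duplicates become adjacent
    return any(a == b for a, b in zip(s, s[1:]))

print(has_duplicate([3, 1, 4, 1, 5]))  # True
print(has_duplicate([2, 7, 1, 8]))     # False
Taking a problem apart into problems already solved is analysis; nothing in that process invents a genuinely new whole, which is the synthesis part.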
Zelebg wrote: Tue Nov 05, 2019 11:30 am Can a computer have subjective experience
Of course! Humans are computers. https://en.wikipedia.org/wiki/Computer_ ... scription)

Perhaps you want to reconsider your question? Do you mean man-made computers?
Zelebg wrote: Tue Nov 05, 2019 11:30 am say camera visual feed like you have subjective experience of your visual input?
That kind of question is not even wrong. How would you test a positive OR a negative claim? How would you falsify either?

They lied to you in high school. In science - if you ask a stupid question you will get a stupid answer.
Zelebg
Posts: 84
Joined: Wed Oct 23, 2019 3:48 am

Re: A.I. requires human consciousness for sentience

Post by Zelebg »

Skepdick wrote: Tue Nov 05, 2019 12:24 pm I am saying that any particular process needs not be recursive.
You said execution of a program is determined by the programmer. A recursive process proves that statement wrong.

Analysis (in contrast to synthesis) is just the process of reduction.
Analysis is simply examination of something, and so is necessary for any sensible decision making.

Of course! Humans are computers.
Perhaps you want to reconsider your question? Do you mean man-made computers?
No, I would like you to understand the question and actually answer it. Yes, I mean a PC.

How would you test a positive OR a negative claim? How would you falsify either?
You are the one making the claim that a PC can have a subjective experience, so the question is yours to answer. I don't know how to falsify your claim since you keep failing to explain it - what are your reasons to believe a PC can have a subjective (sentient) experience of its computation or sensory input, like humans do?
commonsense
Posts: 5184
Joined: Sun Mar 26, 2017 6:38 pm

Re: A.I. requires human consciousness for sentience

Post by commonsense »

Zelebg wrote: Tue Nov 05, 2019 12:16 am
Skepdick wrote: Mon Nov 04, 2019 8:01 am When it comes right down to it programming is nothing more than humans meticulously explaining to computers how to perform a particular task.
Execution of specific functions goes step by step. But which functions will run, with what parameters and when, varies relative not only to external events; since the process is recursive, changes of parameters and function branching may be triggered by the "thought" process itself.

One built-in goal is sufficient. The side-goals on the way can be decided relative to the main goal and current circumstances. If choices currently on offer are not relative to the main goal, then it may pursue any goal chosen in whichever random or non-random way.
We can't explain what we ourselves don't understand and we don't understand how we, humans, arrive at our own goals, objectives and sub-objectives.
Computation in itself is not relevant. We are talking about "something" experiencing that computation, sensation, or emotion. We are talking about sentience, which means "subjective experience", and it is the "subjective" part of it, or 'qualia', or 'consciousness', that the title of this thread is asking about.

I say this self-awareness, i.e. qualia, i.e. sentience, i.e. consciousness, is not about computation, but hardware and interface. It seems, as far as we know, this "hardware and interface" that could fit the bill here and explain this _subjective_ phenomenon has no parallel in any of our sciences, except science fiction. Seriously, some kind of "dream" of the 'Total Recall' or 'The Matrix' type is the only kind of mechanics we know of that could, at least in principle, address this problem.
One built-in goal must be built in by a human, else an AI computer cannot develop or achieve any side-goals.
henry quirk
Posts: 14706
Joined: Fri May 09, 2008 8:07 pm
Location: Right here, a little less busy.

only goal we need to build into our future synthetic self-awareness is the drive to live

Post by henry quirk »

that'll cover all the basics till the lill bugger actually goes self-aware and starts formulatin' its own agenda

"Kill all meatbags!" -Bender Bending Rodriguez
Skepdick
Posts: 14504
Joined: Fri Jun 14, 2019 11:16 am

Re: A.I. requires human consciousness for sentience

Post by Skepdick »

Zelebg wrote: Tue Nov 05, 2019 7:21 pm You said execution of a program is determined by the programmer. Recursive process proves that statement wrong.
How does it prove this statement wrong? Have you recently googled the term "recursion theory"?

Its alternative name is "theory of computability"
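
To be concrete (a toy example of mine, not from any textbook), the two usages meet in the same object: a function defined by recursion that halts on every input is also "recursive" in the computability-theory sense.

Code: Select all

# Addition built up by primitive recursion on the second argument,
# the way recursion/computability theory defines it:
#   add(m, 0)     = m
#   add(m, n + 1) = add(m, n) + 1
def add(m, n):
    return m if n == 0 else add(m, n - 1) + 1

print(add(3, 4))  # 7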

Zelebg wrote: Tue Nov 05, 2019 7:21 pm Analysis is simply examination of something, and so is necessary for any sensible decision making.
Anybody who subscribes to the paradigm of systems thinking would disagree with you. The definition of 'analysis' I offered you is far more precise and is in contrast to synthesis.
Zelebg wrote: Tue Nov 05, 2019 7:21 pm No, I would like you to understand the question and actually answer it. Yes, I mean a PC.
Then you seem to be associating the word "computer" with the particular form (PC) and not the function. There are very many things which are computers which are not PCs. Some might even say that the Universe is a computer.
Zelebg wrote: Tue Nov 05, 2019 7:21 pm You are the one who is making a claim a PC can have a subjective experience
1. I am not making that claim strongly. I am claiming that computers can have subjective experiences (because you are a computer and you have experiences).
2. There is this thing called the Church-Turing-Deutsch principle.
Zelebg wrote: Tue Nov 05, 2019 7:21 pm the question is yours to answer. I don't know how to falsify your claim since you keep failing to explain it
I haven't made any claims - you did. Now falsify them.
Zelebg wrote: Tue Nov 05, 2019 7:21 pm - what are you reasons to believe a PC can have a subjective (sentient) experience of their computation or sensory input like humans do?
The lack of evidence that they can't.

Your belief that they can't have experiences is based on faith.
My belief that they can have experiences is also based on faith.

Whose faith is wrong?
commonsense
Posts: 5184
Joined: Sun Mar 26, 2017 6:38 pm

Re: A.I. requires human consciousness for sentience

Post by commonsense »

Skepdick wrote: Wed Nov 06, 2019 4:07 pm The lack of evidence that they can't.
I have to stop you right there.

The lack of evidence that they can’t is not evidence that they can.
Skepdick
Posts: 14504
Joined: Fri Jun 14, 2019 11:16 am

Re: A.I. requires human consciousness for sentience

Post by Skepdick »

commonsense wrote: Wed Nov 06, 2019 6:16 pm The lack of evidence that they can’t is not evidence that they can.
Did I say that it is?

Unless you offer a proof of impossibility on the matter, there's no evidence that they can't either.

This is a perfectly rational agnostic position. I don't know if it's possible or impossible. Let's try and see what happens.
commonsense
Posts: 5184
Joined: Sun Mar 26, 2017 6:38 pm

Re: A.I. requires human consciousness for sentience

Post by commonsense »

Skepdick wrote: Wed Nov 06, 2019 6:47 pm
commonsense wrote: Wed Nov 06, 2019 6:16 pm The lack of evidence that they can’t is not evidence that they can.
Did I say that it is?

Unless you offer a proof of impossibility on the matter, there's no evidence that they can't either.

This is a perfectly rational agnostic position. I don't know if it's possible or impossible. Let's try and see what happens.
OK. So it’s faith whether you say they can or say they can’t. I agree.
Zelebg
Posts: 84
Joined: Wed Oct 23, 2019 3:48 am

Re: A.I. requires human consciousness for sentience

Post by Zelebg »

Skepdick wrote: Wed Nov 06, 2019 4:07 pm How does it prove this statement wrong? Have you recently googled the term "recursion theory" ?
You are confusing "recursive function" with "recursion theory". Google it, and if it is still not clear you may then ask me to explain.

The lack of evidence that they can't.
You can say that about an electron too. And that's fine. The problem is you are not explaining how that makes any sense.

Your belief that they can't have experiences is based on faith.
My belief that they can have experiences is also based on faith.

Whose faith is wrong?
I never said they can't. My reason to doubt they can is based on intuition and logic. Your belief is unwarranted without explanation.
Skepdick
Posts: 14504
Joined: Fri Jun 14, 2019 11:16 am

Re: A.I. requires human consciousness for sentience

Post by Skepdick »

Zelebg wrote: Wed Nov 06, 2019 10:38 pm You are confusing "recursive function" with "recursion theory". Google it, and if it is still not clear you may then ask me to explain.
Well. I am definitely not confusing it - recursion theory is the theory of recursive functions - to try and draw a distinction here reeks of desperation. Recursion is one of the pillars of computer science and I happen to be a computer scientist.

I have a local cache of all this stuff in my head. The references to the relevant material are for your benefit, not mine.

Like for example, I know exactly why this recursive function, while perfectly valid in theory, crashes in practice: https://repl.it/repls/LinenBurlywoodEntropy

Code: Select all

def zelebg(): zelebg()  # defines a function that immediately calls itself
zelebg()                # and then calls it
Do you?
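
For reference, a minimal sketch of the failure mode, assuming CPython's default recursion-depth guard (about 1000 frames):

Code: Select all

import sys

def zelebg(): zelebg()  # same shape: no base case, so the call stack only grows

print(sys.getrecursionlimit())  # the finite depth CPython permits, typically 1000

try:
    zelebg()
except RecursionError as err:
    print(err)  # "maximum recursion depth exceeded"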
Zelebg wrote: Wed Nov 06, 2019 10:38 pm You can say that for electron too. And that's fine. The problem is you are not explaining how does that make any sense.
You can say it about anything. You are not telling me how you would go about testing whether what you say is true or false.

You can SAY that I have experiences. How would you test your hypothesis? How would you falsify it?
Zelebg wrote: Wed Nov 06, 2019 10:38 pm My reason to doubt they can is based on intuition and logic. Your belief is unwarranted without explanation.
My reason to believe they can is based on intuition and logic also. If your doubt is warranted without explanation then so is my belief.
Zelebg
Posts: 84
Joined: Wed Oct 23, 2019 3:48 am

Re: A.I. requires human consciousness for sentience

Post by Zelebg »

Skepdick wrote: Thu Nov 07, 2019 7:43 am Well. I am definitely not confusing it - recursion theory is the theory of recursive functions - to try and draw a distinction here reeks of desperation. Recursion is one of the pillars of computer science and I happen to be a computer scientist.
Recursion is a completely different concept in 'recursion theory' than in a 'recursive function'. Clearly, then, you understand neither. What the hell are you doing, who are you trying to fool? Fascinating.
My reason to believe they can is based on intuition and logic also. If your doubt is warranted without explanation then so is my belief.
You never explained your intuition and logic. Let us hear it.