So you STILL think that AI understands what it types??

Known unknowns and unknown unknowns!

Moderators: AMod, iMod

User avatar
bahman
Posts: 8792
Joined: Fri Aug 05, 2016 3:52 pm

Re: So you STILL think that AI understands what it types??

Post by bahman »

commonsense wrote: Fri Jan 05, 2024 3:56 pm
bahman wrote: Thu Jan 04, 2024 6:26 pm
commonsense wrote: Thu Jan 04, 2024 6:17 pm

How can you know that? It's the problem of other minds, isn't it?
As I already mentioned, an AI is not a neural network. It simulates a neural network.
Yes, in effect you said that humans have neural networks and AIs have simulated networks. But how can you distinguish one kind of network from the other?
One simulates; the other does not.
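As a rough illustration of what "simulates a neural network" means in practice (a minimal sketch with made-up weights, not a claim about any particular AI): each unit in a simulated network is just arithmetic executed in software, a weighted sum of inputs pushed through a nonlinearity.

import math

def simulated_neuron(inputs, weights, bias):
    # One artificial "neuron": weighted sum of the inputs plus a bias,
    # squashed through a sigmoid. Ordinary arithmetic, nothing more.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# Hypothetical inputs and weights, purely for illustration.
print(simulated_neuron([0.2, 0.7, 0.1], [0.5, -1.2, 0.8], bias=0.1))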
User avatar
Sculptor
Posts: 8677
Joined: Wed Jun 26, 2019 11:32 pm

Re: So you STILL think that AI understands what it types??

Post by Sculptor »

Veritas Aequitas wrote: Fri Jan 05, 2024 8:38 am
vegetariantaxidermy wrote: Fri Jan 05, 2024 4:55 am
wrote: It is not intelligent at all.
It has no idea what it is saying. It has no understanding.
Stop talking about yourself. It's boring.
That is true.
Sculptor's limited intelligence and wisdom are obvious and well-known.

It is stupidity to expect an AI chatbot or LLM [especially at present, in its infancy] to meet all of one's expectations.
LLMs have their pros and cons; it is up to one to use them intelligently whilst understanding their limitations.
Besides, it is qualified with a caveat on the site: "ChatGPT can make mistakes. Consider checking important information."

One needs to understand in which fields of knowledge LLMs are most reliable and in which they are less reliable.
Asking about 'Cockney rhyming slang', used by a small group, is knowledge-at-the-fringe, and thus one can expect LLMs will not be precise in their answers.
However, LLMs are most reliable with general scientific facts and with information that is published and discussed frequently within reliable circles.
You should not comment on this subject, because clearly you have not read through the exchanges I had with ChatGPT.
Cockney rhyming slang is not fringe. GPT knows several examples. What it does NOT UNDERSTAND is that you cannot just make it up as you go along, as it did twice.
Even when it accepted that it was corrected, and gave the right answer, it soon reverted to its previous error. QED IT CANNOT LEARN because it does NOT UNDERSTAND what any of it means.
It is just a complicated search engine, guilty of numerous and well-publicised fabrications and inventions.
In this thread I have shown it is also capable of SELF-CONTRADICTION in adjacent sentences.

To nail home the point: I asked it again today what I "taught" it twice yesterday, when I twice got it to answer the question properly.
Today, once again, it just comes up with YET ANOTHER complete invention. It's not even the same as yesterday's inventions.
ME
what is meant by the rhyming slang "drum"?
ChatGPT
In traditional Cockney rhyming slang, "drum" is used to refer to the word "bum," which means buttocks or backside. Rhyming slang is a form of wordplay originating from London's East End, where a phrase is substituted with another phrase that rhymes with it, with the last word often dropped, leaving the original word or a shortened version of it. In this case, "drum" rhymes with "bum," so it is used as a substitute for that word.
This is complete bollocks.
User avatar
LuckyR
Posts: 472
Joined: Wed Aug 09, 2023 11:56 pm
Location: The Great NW

Re: So you STILL think that AI understands what it types??

Post by LuckyR »

Sculptor wrote: Fri Jan 05, 2024 5:43 pm
Veritas Aequitas wrote: Fri Jan 05, 2024 8:38 am
vegetariantaxidermy wrote: Fri Jan 05, 2024 4:55 am

Stop talking about yourself. It's boring.
That is true.
Sculptor's limited intelligence and wisdom are obvious and well-known.

It is stupidity to expect an AI chatbot or LLM [especially at present, in its infancy] to meet all of one's expectations.
LLMs have their pros and cons; it is up to one to use them intelligently whilst understanding their limitations.
Besides, it is qualified with a caveat on the site: "ChatGPT can make mistakes. Consider checking important information."

One needs to understand in which fields of knowledge LLMs are most reliable and in which they are less reliable.
Asking about 'Cockney rhyming slang', used by a small group, is knowledge-at-the-fringe, and thus one can expect LLMs will not be precise in their answers.
However, LLMs are most reliable with general scientific facts and with information that is published and discussed frequently within reliable circles.
You should not comment on this subject, because clearly you have not read through the exchanges I had with ChatGPT.
Cockney rhyming slang is not fringe. GPT knows several examples. What it does NOT UNDERSTAND is that you cannot just make it up as you go along, as it did twice.
Even when it accepted that it was corrected, and gave the right answer, it soon reverted to its previous error. QED IT CANNOT LEARN because it does NOT UNDERSTAND what any of it means.
It is just a complicated search engine, guilty of numerous and well-publicised fabrications and inventions.
In this thread I have shown it is also capable of SELF-CONTRADICTION in adjacent sentences.

To nail home the point: I asked it again today what I "taught" it twice yesterday, when I twice got it to answer the question properly.
Today, once again, it just comes up with YET ANOTHER complete invention. It's not even the same as yesterday's inventions.
ME
what is meant by the rhyming slang "drum"?
ChatGPT
In traditional Cockney rhyming slang, "drum" is used to refer to the word "bum," which means buttocks or backside. Rhyming slang is a form of wordplay originating from London's East End, where a phrase is substituted with another phrase that rhymes with it, with the last word often dropped, leaving the original word or a shortened version of it. In this case, "drum" rhymes with "bum," so it is used as a substitute for that word.
This is complete bollocks.
Exactly. What is (superficially) impressive about ChatGPT is the breadth of the source of its data set (the interwebs), not the depth of its understanding of it, which is somewhere between zero and infinitesimally small.
User avatar
Sculptor
Posts: 8677
Joined: Wed Jun 26, 2019 11:32 pm

Re: So you STILL think that AI understands what it types??

Post by Sculptor »

LuckyR wrote: Fri Jan 05, 2024 5:54 pm
Sculptor wrote: Fri Jan 05, 2024 5:43 pm
Veritas Aequitas wrote: Fri Jan 05, 2024 8:38 am
That is true.
Sculptor's limited intelligence and wisdom are obvious and well-known.

It is stupidity to expect an AI chatbot or LLM [especially at present, in its infancy] to meet all of one's expectations.
LLMs have their pros and cons; it is up to one to use them intelligently whilst understanding their limitations.
Besides, it is qualified with a caveat on the site: "ChatGPT can make mistakes. Consider checking important information."

One needs to understand in which fields of knowledge LLMs are most reliable and in which they are less reliable.
Asking about 'Cockney rhyming slang', used by a small group, is knowledge-at-the-fringe, and thus one can expect LLMs will not be precise in their answers.
However, LLMs are most reliable with general scientific facts and with information that is published and discussed frequently within reliable circles.
You should not comment on this subject, because clearly you have not read through the exchanges I had with ChatGPT.
Cockney rhyming slang is not fringe. GPT knows several examples. What it does NOT UNDERSTAND is that you cannot just make it up as you go along, as it did twice.
Even when it accepted that it was corrected, and gave the right answer, it soon reverted to its previous error. QED IT CANNOT LEARN because it does NOT UNDERSTAND what any of it means.
It is just a complicated search engine, guilty of numerous and well-publicised fabrications and inventions.
In this thread I have shown it is also capable of SELF-CONTRADICTION in adjacent sentences.

To nail home the point: I asked it again today what I "taught" it twice yesterday, when I twice got it to answer the question properly.
Today, once again, it just comes up with YET ANOTHER complete invention. It's not even the same as yesterday's inventions.
ME
what is meant by the rhyming slang "drum"?
ChatGPT
In traditional Cockney rhyming slang, "drum" is used to refer to the word "bum," which means buttocks or backside. Rhyming slang is a form of wordplay originating from London's East End, where a phrase is substituted with another phrase that rhymes with it, with the last word often dropped, leaving the original word or a shortened version of it. In this case, "drum" rhymes with "bum," so it is used as a substitute for that word.
This is complete bollocks.
Exactly. What is (superficially) impressive about ChatGPT is the breadth of the source of its data set (the interwebs), not the depth of its understanding of it, which is somewhere between zero and infinitesimally small.
Indeed. In fact I just do not think the word "understands" even applies.

But the simple fact that it fabricates stuff makes it effectively useless.
Reports suggest that it has even mimicked scientific references for studies that never happened.
The idea that so-called AIs might be used to rule our lives is truly shocking.
commonsense
Posts: 5184
Joined: Sun Mar 26, 2017 6:38 pm

Re: So you STILL think that AI understands what it types??

Post by commonsense »

bahman wrote: Fri Jan 05, 2024 4:29 pm
commonsense wrote: Fri Jan 05, 2024 3:56 pm
bahman wrote: Thu Jan 04, 2024 6:26 pm
As I already mentioned, an AI is not a neural network. It simulates a neural network.
Yes, in effect you said that humans have neural networks and AIs have simulated networks. But how can you distinguish one kind of network from the other?
One simulates; the other does not.
You say so, but I am asking how you know which is which when confronted with an AI/human. If one is a simulation of the other, what is the procedure to find out whether you are confronted with the net or the simulation?


This isn’t some kind of trolling trap. I really want to see how you would know, in practice, which is which. I know that you can’t just ask the AI/person, because if it’s an AI that you’re confronting it could just lie to you and claim to be a human.

What have you got in mind that would help me to solve this conundrum?
User avatar
bahman
Posts: 8792
Joined: Fri Aug 05, 2016 3:52 pm

Re: So you STILL think that AI understands what it types??

Post by bahman »

commonsense wrote: Fri Jan 05, 2024 8:33 pm
bahman wrote: Fri Jan 05, 2024 4:29 pm
commonsense wrote: Fri Jan 05, 2024 3:56 pm

Yes, in effect you said that humans have neural networks and AIs have simulated networks. But how can you distinguish one kind of network from the other?
One simulates; the other does not.
You say so, but I am asking how you know which is which when confronted with an AI/human. If one is a simulation of the other, what is the procedure to find out whether you are confronted with the net or the simulation?


This isn’t some kind of trolling trap. I really want to see how you would know, in practice, which is which. I know that you can’t just ask the AI/person, because if it’s an AI that you’re confronting it could just lie to you and claim to be a human.

What have you got in mind that would help me to solve this conundrum?
Well, if you had an AI so sophisticated that it could simulate the human brain, then you could not say whether what you are confronting is a human or an AI.
commonsense
Posts: 5184
Joined: Sun Mar 26, 2017 6:38 pm

Re: So you STILL think that AI understands what it types??

Post by commonsense »

Sculptor wrote: Thu Jan 04, 2024 11:04 pm
bahman wrote: Thu Jan 04, 2024 9:43 pm
Sculptor wrote: Thu Jan 04, 2024 9:34 pm
The problem is why it forgot what I told it within an hour.
And NO, it is not biased, since it has no interest.
It is the dumb stupid recipient and outlet of a programmed algorithm .
But you said in another post that you could teach it.
Why don't you follow the text?
After several false steps and errors...
It is clear that GPT finally accepted and repeated what I said. Then it promptly forgot it and made up the same bullshit it had before.
Then I told it again, and later it repeated to me what I had said.
Why don't you ask ChatGPT the following question then tell me what it says...
"What does DRUM mean in COckney Rhyming slang"
If you do that then we shall know the answer empirically.
What is the betting that it will contradict itself once again, as I demonstrated in the OP?
First of all, if ChatGPT were free, I would do exactly as you suggest, by asking it what DRUM means in Cockney rhyming slang. And I expect the results would be just as they have been for you, i.e. that the bogus answer would come first and, eventually, after more repetition, ChatGPT would give the right answer.

With Bloom's model in mind, we can say that learning occurs when there's a change in behavior, including verbal behavior. For that reason, I would try the following questions:

1. Did you know before now that, in my experience, Icelandic salmon tastes far richer than Alaskan salmon?

2. Do you know now that, in my experience, Icelandic salmon tastes far richer than Alaskan salmon?

Yet, presumably ChatGPT could lie by claiming knowledge it does not have.

Perhaps I could check for understanding by asking it: if it wanted to order the richer salmon at a restaurant, what would it order?

Oops! That won't work too well either, because a simple word processor could parse the Icelandic salmon as being the richer one.
commonsense
Posts: 5184
Joined: Sun Mar 26, 2017 6:38 pm

Re: So you STILL think that AI understands what it types??

Post by commonsense »

bahman wrote: Fri Jan 05, 2024 8:37 pm
commonsense wrote: Fri Jan 05, 2024 8:33 pm
bahman wrote: Fri Jan 05, 2024 4:29 pm
One simulates; the other does not.
You say so, but I am asking how you know which is which when confronted with an AI/human. If one is a simulation of the other, what is the procedure to find out whether you are confronted with the net or the simulation?


This isn’t some kind of trolling trap. I really want to see how you would know, in practice, which is which. I know that you can’t just ask the AI/person, because if it’s an AI that you’re confronting it could just lie to you and claim to be a human.

What have you got in mind that would help me to solve this conundrum?
Well, if you had an AI so sophisticated that it could simulate the human brain, then you could not say whether what you are confronting is a human or an AI.
Not currently available, of course!
commonsense
Posts: 5184
Joined: Sun Mar 26, 2017 6:38 pm

Re: So you STILL think that AI understands what it types??

Post by commonsense »

I doubt what is suggested here about machine understanding:

https://en.m.wikipedia.org/wiki/Large_language_model

Would anyone care to weigh in on understanding? Should we generate a definition for “understanding” first?
User avatar
Sculptor
Posts: 8677
Joined: Wed Jun 26, 2019 11:32 pm

Re: So you STILL think that AI understands what it types??

Post by Sculptor »

commonsense wrote: Fri Jan 05, 2024 8:59 pm
Sculptor wrote: Thu Jan 04, 2024 11:04 pm
bahman wrote: Thu Jan 04, 2024 9:43 pm
But you said in another post that you could teach it.
Why don't you follow the text?
After several false steps and errors...
It is clear that GPT finally accepted and repeated what I said. Then it promptly forgot it and made up the same bullshit it had before.
Then I told it again, and later it repeated to me what I had said.
Why don't you ask ChatGPT the following question then tell me what it says...
"What does DRUM mean in COckney Rhyming slang"
If you do that then we shall know the answer empirically.
What is the betting that it will contradict itself once again, as I demonstrated in the OP?
First of all, if ChatGPT were free, I would do exactly as you suggest, by asking it what DRUM means in Cockney rhyming slang. And I expect the results would be just as they have been for you, i.e. that the bogus answer would come first and, eventually, after more repetition, ChatGPT would give the right answer.

With Bloom's model in mind, we can say that learning occurs when there's a change in behavior, including verbal behavior. For that reason, I would try the following questions:

1. Did you know before now that, in my experience, Icelandic salmon tastes far richer than Alaskan salmon?

2. Do you know now that, in my experience, Icelandic salmon tastes far richer than Alaskan salmon?

Yet, presumably ChatGPT could lie by claiming knowledge it does not have.

Perhaps I could check for understanding by asking it: if it wanted to order the richer salmon at a restaurant, what would it order?

Oops! That won't work too well either, because a simple word processor could parse the Icelandic salmon as being the richer one.
Clearing up a few points.
1) ChatGPT is free.
Follow the link.
https://openai.com/chatgpt
Hit "Try ChatGPT" at the top right.
2) I chose drum = room because there is more than one step: drum roll = hole = room. But the fact that drum means room in Cockney slang is not a value judgement.
3) The taste of types of salmon is a value judgement, and since ChatGPT has no taste buds it ought not to have an opinion on this matter. And GPT confirms this.

Just for fun I asked the question about "drum" again, and it made up yet another load of old bollocks. "Thumb" was today's answer.
User avatar
Sculptor
Posts: 8677
Joined: Wed Jun 26, 2019 11:32 pm

Re: So you STILL think that AI understands what it types??

Post by Sculptor »

This is the last time I try to teach GPT what drum means in Cockney.

It looks like I succeeded again, for the third time.

User
WRONG. I told you yesterday what it means TWICE. Drum roll stands for hole, and hole is another word for room, as in a hole you live in. Thus drum means room in Cockney rhyming slang. Cockney rhyming slang is not something an AI is qualified to invent. There are standard rules and accepted slang words in the Cockney community. Thus Cockneys understand drum to mean room.
ChatGPT
Apologies for the confusion earlier. You're absolutely right about the Cockney rhyming slang for "drum" meaning "room." Cockney rhyming slang indeed has established rules and common phrases within the community, and "drum" being used to signify "room" is part of that slang lexicon. Thank you for the clarification. Cockney rhyming slang does have its set phrases and meanings, and "drum" referring to "room" is a standard example within that linguistic tradition.


As you see, it linguistically mimics my answer, yet it will not retain or understand this a few hours later.
User avatar
attofishpi
Posts: 10013
Joined: Tue Aug 16, 2011 8:10 am
Location: Orion Spur
Contact:

Re: So you STILL think that AI understands what it types??

Post by attofishpi »

Sculptor wrote: Sat Jan 06, 2024 12:20 am This is the last time I try to teach GPT what drum means in Cockney.

It looks like I succeeded again, for the third time.

User
WRONG. I told you yesterday what it means TWICE. Drum roll stands for hole, and hole is another word for room, as in a hole you live in. Thus drum means room in Cockney rhyming slang. Cockney rhyming slang is not something an AI is qualified to invent. There are standard rules and accepted slang words in the Cockney community. Thus Cockneys understand drum to mean room.
ChatGPT
Apologies for the confusion earlier. You're absolutely right about the Cockney rhyming slang for "drum" meaning "room." Cockney rhyming slang indeed has established rules and common phrases within the community, and "drum" being used to signify "room" is part of that slang lexicon. Thank you for the clarification. Cockney rhyming slang does have its set phrases and meanings, and "drum" referring to "room" is a standard example within that linguistic tradition.


As you see, it linguistically mimics my answer, yet it will not retain or understand this a few hours later.
What you seem to be misunderstanding is that ChatGPT is NOT "learning" anything new, since it was cut off from the internet a couple of years ago (GPT 3.5 - the free one); thus you can't teach it anything. (Thus it is not corrupted by morons feeding it crap.)

I have found that, within the "chats" list in the panel on the left, it does apply what I have stated to further questions within that chat thread. So when I have a question re theology, I make sure to first click on the thread where I have been researching theology; likewise chemistry, physics, etc.
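To illustrate that last point (a minimal sketch, assuming the OpenAI Python client and the gpt-3.5-turbo model, not anything actually shown in this thread): the per-thread "memory" is just the conversation history being re-sent with every request, which is why a correction holds within one chat thread but a fresh thread falls back on the frozen training data.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Same thread: the earlier correction travels along as message history.
history = [{"role": "user", "content": 'In Cockney rhyming slang, "drum" means room.'}]
first = client.chat.completions.create(model="gpt-3.5-turbo", messages=history)
history.append({"role": "assistant", "content": first.choices[0].message.content})
history.append({"role": "user", "content": 'What does "drum" mean in rhyming slang?'})
same_thread = client.chat.completions.create(model="gpt-3.5-turbo", messages=history)

# New thread: no history is sent, so nothing "taught" earlier is available;
# the model's weights were never changed by the conversation.
new_thread = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": 'What does "drum" mean in rhyming slang?'}],
)

print(same_thread.choices[0].message.content)
print(new_thread.choices[0].message.content)

Whatever the fresh thread answers, right or wrong, nothing from the earlier exchange persists into it.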
User avatar
attofishpi
Posts: 10013
Joined: Tue Aug 16, 2011 8:10 am
Location: Orion Spur
Contact:

Re: So you STILL think that AI understands what it types??

Post by attofishpi »

commonsense wrote: Fri Jan 05, 2024 8:59 pm First of all, if ChatGPT were free,
Version 3.5 is free.
User avatar
Sculptor
Posts: 8677
Joined: Wed Jun 26, 2019 11:32 pm

Re: So you STILL think that AI understands what it types??

Post by Sculptor »

attofishpi wrote: Sat Jan 06, 2024 1:41 am
Sculptor wrote: Sat Jan 06, 2024 12:20 am This is the last time I try to teach GPT what drum means in Cockney.

It looks like I succeeded again, for the third time.

User
WRONG. I told you yesterday what it means TWICE. Drum roll stands for hole, and hole is another word for room, as in a hole you live in. Thus drum means room in Cockney rhyming slang. Cockney rhyming slang is not something an AI is qualified to invent. There are standard rules and accepted slang words in the Cockney community. Thus Cockneys understand drum to mean room.
ChatGPT
Apologies for the confusion earlier. You're absolutely right about the Cockney rhyming slang for "drum" meaning "room." Cockney rhyming slang indeed has established rules and common phrases within the community, and "drum" being used to signify "room" is part of that slang lexicon. Thank you for the clarification. Cockney rhyming slang does have its set phrases and meanings, and "drum" referring to "room" is a standard example within that linguistic tradition.


As you see, it linguistically mimics my answer, yet it will not retain or understand this a few hours later.
What you seem to be misunderstanding is that ChatGPT is NOT "learning" anything new, since it was cut off from the internet a couple of years ago (GPT 3.5 - the free one); thus you can't teach it anything. (Thus it is not corrupted by morons feeding it crap.)
I understand that perfectly.
But it claims to be correcting itself, as you can quite clearly see from examining the exchanges.
My claim is that it does not understand anything.


I have found that, within the "chats" list in the panel on the left, it does apply what I have stated to further questions within that chat thread. So when I have a question re theology, I make sure to first click on the thread where I have been researching theology; likewise chemistry, physics, etc.
Does it not bother you that it makes shit up all the time, even fake references to people and studies that do not exist and never happened?
User avatar
attofishpi
Posts: 10013
Joined: Tue Aug 16, 2011 8:10 am
Location: Orion Spur
Contact:

Re: So you STILL think that AI understands what it types??

Post by attofishpi »

Sculptor wrote: Sat Jan 06, 2024 10:55 am I understand that perfectly.
But it claims to be correcting itself, as you can quite clearly see from examining the exchanges.
My claim is that it does not understand anything.
I agree. I don't think anything that has no sentience will ever understand anything. Just a more complex form of calculator.

Sculptor wrote: Sat Jan 06, 2024 10:55 am Does it not bother you that it makes shit up all the time, even fake references to people and studies that do not exist and never happened?
I can't say I've given it much of a test. It would be concerning if it had bias conjured into its language model - certainly re political left- or right-leaning tendencies.