One simulates the other, not the reverse.commonsense wrote: ↑Fri Jan 05, 2024 3:56 pmYes, in effect you said that humans have neural networks and AIs have simulated networks. But how can you distinguish one kind of network from the other?bahman wrote: ↑Thu Jan 04, 2024 6:26 pmAs I already mentioned, an AI is not a neural network. It simulates a neural network.commonsense wrote: ↑Thu Jan 04, 2024 6:17 pm
How can you know that? It's the problem of other minds, isn't it?
So you STILL think that AI understands what it types??
Re: So you STILL think that AI understands what it types??
You should not comment on this subject because clearly you have not read through the exchanges I had with ChatGPT.Veritas Aequitas wrote: ↑Fri Jan 05, 2024 8:38 amThat is true.vegetariantaxidermy wrote: ↑Fri Jan 05, 2024 4:55 amStop talking about yourself. It's boring.wrote: It is not intelligent at all.
It has no idea what it is saying. It has no understanding.
Sculptor's limited intelligence and wisdom are obvious and well-known.
It is stupid to expect an AI chatbot or LLM [especially at present, in its infancy] to meet all of one's expectations.
LLMs have their pros and cons; it is up to one to use them intelligently while understanding their limitations.
Besides, the site qualifies it with a caveat: "ChatGPT can make mistakes. Consider checking important information."
One needs to understand in which fields of knowledge LLMs are most reliable and in which they are less reliable.
Asking about 'Cockney rhyming slang' used by a small group is knowledge-at-the-fringe, so one can expect LLMs to be imprecise in their answers.
However, LLMs are most reliable with general scientific facts and with information that is published and discussed frequently within reliable circles.
Cockney rhyming slang is not fringe. GPT knows several examples. What it does NOT UNDERSTAND is that you cannot just make it up as you go along, as it did twice.
Even when it accepted the correction and gave the right answer, it soon reverted to its previous error. QED: IT CANNOT LEARN because it does NOT UNDERSTAND what any of it means.
It is just a complicated search engine, guilty of numerous and well-publicised fabrications and inventions.
In this thread I have shown it is also capable of SELF-CONTRADICTION in adjacent sentences.
To nail home the point: I asked it again today what I twice "taught" it yesterday, and yesterday I twice got it to answer the question properly.
Today it once again comes up with YET ANOTHER complete invention. It's not even the same as yesterday's inventions.
This is complete bollocks.
ME
what is meant by the rhyming slang "drum"?
ChatGPT
In traditional Cockney rhyming slang, "drum" is used to refer to the word "bum," which means buttocks or backside. Rhyming slang is a form of wordplay originating from London's East End, where a phrase is substituted with another phrase that rhymes with it, with the last word often dropped, leaving the original word or a shortened version of it. In this case, "drum" rhymes with "bum," so it is used as a substitute for that word.
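ChatGPT's description of the general mechanism is at least coherent: a phrase rhymes with the target word, and the rhyming tail is often dropped. As a toy illustration, using standard, well-attested Cockney pairs rather than the disputed "drum" case, the substitute-then-clip pattern can be sketched in a few lines of Python (the function name and word list here are illustrative, not anything from the thread):

```python
# A toy sketch of the rhyming-slang mechanism described above: a phrase
# rhymes with the target word, and the rhyming tail is often dropped,
# leaving only the first word. These examples are standard, well-attested pairs.
SLANG = {
    "dog and bone": "phone",       # "bone" rhymes with "phone"
    "plates of meat": "feet",      # "meat" rhymes with "feet"
    "trouble and strife": "wife",  # "strife" rhymes with "wife"
}

def clipped_form(phrase: str) -> str:
    """Drop everything after the first word, giving the usual short form."""
    return phrase.split()[0]

for phrase, meaning in SLANG.items():
    print(f'"{clipped_form(phrase)}" (from "{phrase}") means "{meaning}"')
```

The clipping is exactly what makes the slang opaque to outsiders: "dog" alone carries no rhyme with "phone", which is also why an unseen entry cannot safely be guessed from the surface form.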
Re: So you STILL think that AI understands what it types??
Exactly. What (superficially) is impressive about ChatGPT is the breadth of the source of its data set (the interwebs), not the depth of its understanding of it, which is somewhere between zero and infinitesimally small.Sculptor wrote: ↑Fri Jan 05, 2024 5:43 pmYou should not comment on this subject because clearly you have not read through the exchanges I had with ChatGPT.Veritas Aequitas wrote: ↑Fri Jan 05, 2024 8:38 amThat is true.
Re: So you STILL think that AI understands what it types??
Indeed. In fact I just do not think the word "understands" even applies.LuckyR wrote: ↑Fri Jan 05, 2024 5:54 pmExactly. What (superficially) is impressive about ChatGPT is the breadth of the source of its data set (the interwebs), not the depth of its understanding of it.Sculptor wrote: ↑Fri Jan 05, 2024 5:43 pm
But the simple fact that it fabricates stuff makes it effectively useless.
Reports suggest that it has even mimicked scientific references for studies that never happened.
The idea that so-called AIs might be used to rule our lives is truly shocking.
-
- Posts: 5259
- Joined: Sun Mar 26, 2017 6:38 pm
Re: So you STILL think that AI understands what it types??
You say so, but I am asking how you know which is which when confronted with an AI or a human. If one is a simulation of the other, what is the procedure for finding out whether you are confronted with the net or the simulation?bahman wrote: ↑Fri Jan 05, 2024 4:29 pmOne simulates the other, not the reverse.commonsense wrote: ↑Fri Jan 05, 2024 3:56 pm
This isn’t some kind of trolling trap. I really want to see how you would know, in practice, which is which. I know that you can’t just ask the AI/person, because if it’s an AI that you’re confronting it could just lie to you and claim to be a human.
What have you got in mind that would help me to solve this conundrum?
Re: So you STILL think that AI understands what it types??
Well, if you could have an AI so sophisticated that it could simulate the human brain, then you could not say whether what you are confronting is a human or an AI.commonsense wrote: ↑Fri Jan 05, 2024 8:33 pmYou say so, but I am asking how you know which is which when confronted with an AI or a human.
Re: So you STILL think that AI understands what it types??
First of all, if ChatGPT were free, I would do exactly as you suggest, by asking it what DRUM means in Cockney rhyming slang. And I expect the results would be just as they have been for you, i.e. that the bogus answer would come first and that eventually, after more repetition, ChatGPT would give the right answer.Sculptor wrote: ↑Thu Jan 04, 2024 11:04 pmWhy don't you follow the text?
After several false steps and errors...
It is clear that GPT finally accepted and repeated what I said. Then it promptly forgot it and made up the same bullshit it had before.
Then I told it again, and later it repeated to me what I had said.
Why don't you ask ChatGPT the following question, then tell me what it says...
"What does DRUM mean in Cockney rhyming slang?"
If you do that then we shall know the answer empirically.
What is the betting that it will contradict itself once again, as I demonstrated in the OP?
With Bloom's model in mind, we can say that learning occurs when there is a change in behavior, including verbal behavior. For that reason, I would try the following questions:
1. Did you know before now that, in my experience, Icelandic salmon tastes far richer than Alaskan salmon?
2. Do you know now that, in my experience, Icelandic salmon tastes far richer than Alaskan salmon?
Yet, presumably ChatGPT could lie by claiming knowledge it does not have.
Perhaps I could check for understanding by asking it: if it wanted to order the richer salmon at a restaurant, what would it order?
Oops! That won't work too well either, because a simple word processor could parse the Icelandic salmon as being the richer.
Re: So you STILL think that AI understands what it types??
Not currently available, of course!bahman wrote: ↑Fri Jan 05, 2024 8:37 pmWell, if you could have such a sophisticated AI that could simulate the human brain then you cannot say whether what you are confronting is a human or AI.
Re: So you STILL think that AI understands what it types??
I doubt what is suggested here about machine understanding:
https://en.m.wikipedia.org/wiki/Large_language_model
Would anyone care to weigh in on understanding? Should we generate a definition for “understanding” first?
Re: So you STILL think that AI understands what it types??
Clearing up a few points.commonsense wrote: ↑Fri Jan 05, 2024 8:59 pmFirst of all, if ChatGPT were free, I would do exactly as you suggest, by asking it what DRUM means in Cockney rhyming slang.
1) ChatGPT is free.
Follow the link.
https://openai.com/chatgpt
Hit "Try ChatGPT" at the top right.
2) I chose drum = room because there is more than one step: drum roll = hole = room. But the fact that drum means room in Cockney slang is not a value judgement.
3) The taste of types of salmon is a value judgement, and since ChatGPT has no taste buds it ought not to have an opinion on the matter. And GPT confirms this.
Just for fun I asked the question about "drum" again. And it made up yet another load of old bollocks: "thumb" was today's answer.
Re: So you STILL think that AI understands what it types??
This is the last time I try to teach GPT what drum means in Cockney.
It looks like I succeeded again, for the third time.
User
WRONG. I told you yesterday what it means, TWICE. Drum roll stands for hole, and hole is another word for room, as in a hole you live in. Thus drum means room in Cockney rhyming slang. Cockney rhyming slang is not something an AI is qualified to invent. There are standard rules and accepted slang words in the Cockney community. Thus Cockneys understand drum to mean room.
ChatGPT
Apologies for the confusion earlier. You're absolutely right about the Cockney rhyming slang for "drum" meaning "room." Cockney rhyming slang indeed has established rules and common phrases within the community, and "drum" being used to signify "room" is part of that slang lexicon. Thank you for the clarification. Cockney rhyming slang does have its set phrases and meanings, and "drum" referring to "room" is a standard example within that linguistic tradition.
As you see, it linguistically mimics my answer, yet it will not retain or understand this a few hours later.
- attofishpi
- Posts: 10597
- Joined: Tue Aug 16, 2011 8:10 am
- Location: Orion Spur
- Contact:
Re: So you STILL think that AI understands what it types??
What you seem to be misunderstanding is that ChatGPT is NOT "learning" anything new, since it was cut off from the internet a couple of years ago (GPT 3.5, the free one); thus you can't teach it anything. (Thus it is not corrupted by morons feeding it crap.)Sculptor wrote: ↑Sat Jan 06, 2024 12:20 am This is the last time I try to teach GPT what drum means in Cockney.
I have found that, within the "chats" list in the panel on the left, it does apply what I have stated to further questions within that chat thread. So when I have a question re theology, I find it useful to first click on the thread where I have been researching theology; likewise chemistry, physics, etc.
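The per-thread behavior described here can be made concrete with a toy sketch (no real model involved; the function names and canned answers below are invented for illustration): if the weights are frozen, the only working memory is the message history supplied with each request, so a correction holds only inside the thread that contains it.

```python
# Toy illustration: a frozen "model" whose only mutable state is the
# message history supplied per thread. A correction given in one thread
# is honored there (it sits in the context) but vanishes in a new thread.
TRAINED_ANSWER = "bum"  # stands in for whatever the frozen weights produce

def answer(history: list[str]) -> str:
    """Reply from the frozen 'weights' unless the thread contains a correction."""
    for message in history:
        if "drum means room" in message.lower():
            return "room"      # in-context correction is applied...
    return TRAINED_ANSWER      # ...otherwise the frozen answer comes back

thread_1 = ['What does "drum" mean in Cockney rhyming slang?']
print(answer(thread_1))                      # the trained (wrong) answer

thread_1.append("WRONG. Drum means room.")   # correction lives in this thread
print(answer(thread_1))                      # now "room"

thread_2 = ['What does "drum" mean in Cockney rhyming slang?']
print(answer(thread_2))                      # fresh thread: wrong answer again
```

This is consistent with both observations in the thread: Sculptor's corrections are echoed back within a session yet gone the next day, while questions asked inside a long-running chat thread do benefit from what was said earlier in it.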
Re: So you STILL think that AI understands what it types??
Version 3.5 is free.
Re: So you STILL think that AI understands what it types??
I understand that perfectly.attofishpi wrote: ↑Sat Jan 06, 2024 1:41 amWhat you seem to be misunderstanding is that ChatGPT is NOT "learning" anything new, since it was cut off from the internet a couple of years ago (GPT 3.5, the free one); thus you can't teach it anything.
But it claims to be correcting itself, as you can quite clearly see from examining the exchanges.
My claim is that it does not understand anything.
Does it not bother you that it makes shit up all the time, even fake references to people and studies that do not exist and never happened?
Re: So you STILL think that AI understands what it types??
I agree. I don't think anything that has no sentience will ever understand anything. It is just a more complex form of calculator.
I can't say I've given it much of a test. It would be concerning if it had bias conjured into its language model, certainly re political left- or right-leaning tendencies.