Seriously, we need to ban people from using GPT etc to construct their posts, create their arguments, and tell people that their arguments are valid and sound.
Well, we shouldn't need to; people should have more self-control than this. But there's no way to rely on internet people's restraint. Just look at this complete shit heap
viewtopic.php?t=43226&sid=43da43a94dc03 ... d396b5d3e3
It's one of maybe hundreds of threads now where the same person has done the same thing to get an obsequious AI to write something that pleases him (which is the only thing they actually do) and attempted to pass it off as philosophy.
It's not even the laziness that I oppose, so much as the mental wellbeing issues that come with this type of confirmation bias. Not to mention the relentless low-quality, repetitive spam it enables.
The GPT doggerel is out of hand here
- FlashDangerpants
- Posts: 7532
- Joined: Mon Jan 04, 2016 11:54 pm
- Gary Childress
- Posts: 9662
- Joined: Sun Sep 25, 2011 3:08 pm
- Location: It's my fault
Re: The GPT doggerel is out of hand here
FlashDangerpants wrote: ↑Sun Dec 15, 2024 1:29 pm
Seriously, we need to ban people from using GPT etc to construct their posts, create their arguments, and tell people that their arguments are valid and sound.
Well, we shouldn't need to; people should have more self-control than this. But there's no way to rely on internet people's restraint. Just look at this complete shit heap
viewtopic.php?t=43226&sid=43da43a94dc03 ... d396b5d3e3
It's one of maybe hundreds of threads now where the same person has done the same thing to get an obsequious AI to write something that pleases him (which is the only thing they actually do) and attempted to pass it off as philosophy.
It's not even the laziness that I oppose, so much as the mental wellbeing issues that come with this type of confirmation bias. Not to mention the relentless low-quality, repetitive spam it enables.

I use ChatGPT once in a while to search for information or fact-check. It beats reading through pages and pages of fluff just to find out if something someone said is true or not. I usually put the exact quote from ChatGPT in my reply rather than rephrasing it in my own words. I suppose I could just use my own words to say something like, "I searched with ChatGPT and it seems to be incorrect that X happened this way..." (or whatever).
- FlashDangerpants
- Posts: 7532
- Joined: Mon Jan 04, 2016 11:54 pm
Re: The GPT doggerel is out of hand here
Gary Childress wrote: ↑Tue Dec 17, 2024 11:09 am
I use ChatGPT once in a while to search for information or fact-check. It beats reading through pages and pages of fluff just to find out if something someone said is true or not. I usually put the exact quote from ChatGPT in my reply rather than rephrasing it in my own words. I suppose I could just use my own words to say something like, "I searched with ChatGPT and it seems to be incorrect that X happened this way..." (or whatever).

I guess it all boils down to what other tool you are using the AI in place of. You are using it instead of Google there, so, with some minor caveats about the tendency of AI systems to hallucinate books and laws and stuff that don't actually exist, if that's the best way to return the text you ask for, I expect it's fine.
VA is using ChatGPT in place of an education and a brain. These are tools that should not be replaced with robots. He writes a circular argument under his own steam, then he bullies an AI into telling him his argument isn't circular - this fails. So then he uses AI to compose a post complaining that people who hold him to the whole circular argument thing are bad men. This is shit if done once, but spamming the same endless shit every day is unacceptable.
- attofishpi
- Posts: 12250
- Joined: Tue Aug 16, 2011 8:10 am
- Location: Orion Spur
- Contact:
Re: The GPT doggerel is out of hand here
Gary Childress wrote: ↑Tue Dec 17, 2024 11:09 am
I use ChatGPT once in a while to search for information or fact-check. It beats reading through pages and pages of fluff just to find out if something someone said is true or not. I usually put the exact quote from ChatGPT in my reply rather than rephrasing it in my own words. I suppose I could just use my own words to say something like, "I searched with ChatGPT and it seems to be incorrect that X happened this way..." (or whatever).

Just imagine if the likes of Flashdoopypants started to use ChatGPT to present something interesting, something akin to a personality that is not boring: an actual personality that people would find interesting and not at all dull with continuous clichéd responses... (ya know, something opposite to textbook Flash.)
- FlashDangerpants
- Posts: 7532
- Joined: Mon Jan 04, 2016 11:54 pm
Re: The GPT doggerel is out of hand here
Further to my concerns, when people take this stuff too far they replace meaty thinking with mechanical thinking and make horrible mistakes which they then assume to be supported by some sort of science. This is a perfect example:

Veritas Aequitas wrote: ↑Fri Jan 03, 2025 3:49 am
I don't have an issue with basic logic; I have 259 files in 29 sub-topics in my PC drive, thus I am reasonably informed of what is logic.
Especially now with AI, there should be no issue with basic logic at all, where any doubt re validity can be polished by AI.

The argument to which this guy is referring amounts to (paraphrased):
Religion requires an absolutely perfect God (supported by some weird stuff about soothing cognitive dissonance).
Absolute perfection cannot exist in reality (supported by some stuff about perfect circles not existing in reality, and some of that bullshit FSK stuff he likes).
THEREFORE: God cannot exist as real.
You can see more details in the original post here.

The ChatGPT "validation" of this argument, as presented by its author...

Veritas Aequitas wrote: ↑Sun Jun 11, 2023 2:51 am
Validation: ChatGpt made some comments on the premises and, after some explanations, ChatGpt concluded;
- ChatGpt:
Your argument is structured well and attempts to logically demonstrate that it is impossible for God to exist as real based on the concept of absolute perfection.
- ChatGpt:
In summary, your argument is logically coherent, but the strength of its persuasion depends on the acceptance of the premise about the conditioned nature of reality. Anticipating and addressing potential counterarguments would further enhance the robustness of your position.

Anybody with a basic grasp of logic can see flaws that GPT misses. I would say the most obvious is that the argument applies natural-world limits to a supernatural entity. Others, perhaps meaner in spirit than I, might object on the more basic level that it just makes no fucking sense and is a clearly stupid argument made out of incoherent premises that cannot support any conclusion, as they fall apart under their own weight.

Either way, the last time this cropped up, VA claimed that he only uses AI as something akin to a spellcheck program, and this is obviously a lie...

Veritas Aequitas wrote: ↑Fri Dec 15, 2023 7:55 am
In most cases, I am not relying on or borrowing from Bard or ChatGpt totally but rather [given English is not my native tongue] I present my points and views and get Bard or ChatGpt to represent them in a more organized and structured manner and ensure it follows logically.
Bard and ChatGpt in my case is more like a spellcheck or grammar check program.

Maybe that's just harmless data collation and summarisation, which is sort of what GPT is good for... or maybe not. We need to apply some sort of standard.

It's spreading too. Fishpie is using GPT to write posts for him now: viewtopic.php?p=748158#p748158
-
- Posts: 4978
- Joined: Wed Feb 10, 2010 2:04 pm
Re: The GPT doggerel is out of hand here
trust the magic box
it only speaks truth
-Imp