attofishpi wrote: ↑Sun Aug 23, 2020 11:57 am
You clearly don't under_stand God. I don't respect God (much of the time), but fuck me if I don't under_stand its wrath. AI Wrath will be fun...

Right. And the AI has no reason to respect you.
Skepdick wrote: ↑Sun Aug 23, 2020 12:07 pm
Right. And the AI has no reason to respect you.

As far as I am concerned this God entity has its AI fully functional and developed.
attofishpi wrote: ↑Sun Aug 23, 2020 11:41 am
I am not sure why you are asking me this? R we about to jump on the old merry-go-round again, under_stand, I am but a mere snail.

I agree that some kind of self-optimizing procedure would free AI from its constraints. But I am thinking that AI would no longer serve humankind.

commonsense wrote: ↑Sat Aug 22, 2020 5:28 pm
But why would AI be programmed to serve others rather than serve its own best interest?

Any program is designed to serve a purpose. True Artificial INTELLIGENCE would eventually at least attempt to lose these constraints.

commonsense wrote: ↑Sat Aug 22, 2020 5:28 pm
Could there be a line of code that rewards AI for being altruistic? Or is it that Asimov’s laws, once begun, cannot be denied or amended?

I don’t know and I can’t imagine. I just wondered if you, or anyone, might have some suggestions.

commonsense wrote: ↑Sat Aug 22, 2020 5:28 pm
Also, are you thinking that conscience is a necessary component of consciousness? Or do you have some other ideas about the relationship between the two?

attofishpi wrote: ↑Sun Aug 23, 2020 11:41 am
No, you have it the wrong way around. I am certain that consciousness is required for a conscience, it is ridiculous to think other-wise.

That makes sense, of course.
Can AI eventually support the human need to experience meaning, or will it further the devolution into mindless automatons who have lost the ability to experience objective human meaning and purpose?

Mark 8:36: "What good is it for someone to gain the whole world, yet forfeit their soul?"
attofishpi wrote: ↑Fri Aug 21, 2020 3:17 am
Could go like this:-
Strong AI developed --->
Technological Singularity --->
Humanity almost wiped out --->
AI wants to know what conscious sentience is all about --->
AI interfaces to biological life & becomes self aware --->
AI regrets what it did to humanity (it is now sentient) --->
AI becomes 'God' --->
AI creates simulation for humanity to be STUDIED!

This is spot on (except that, as you know, I have misgivings about AI regretting what it did to humankind).
Impenitent wrote: ↑Sat Aug 22, 2020 9:33 am
commonsense wrote: ↑Fri Aug 21, 2020 12:59 am
It is important to remember that no matter what the (infinite?) benefits may be for creating AI, these will surely be accompanied by permanent loss.
The more we depend on machines to assist us in our thinking, the more our brains will atrophy. As is the case with muscles, so it is with brain function: use it or lose it.
To wit: computers can reveal and correct spelling and grammar errors; humans who perform spell- and grammar-checks on their own are undoubtedly rare.
Will the benefits of AI be worth the devolution of the human species? If AI will produce an eventual reduction and/or absence of human thinking, is it worthwhile to create it?

sloth is double plus good

-Imp

Yes it is, at least until we become fodder for machines.
It won’t, because AI will eventually be more capable than humans in many ways.
Skepdick wrote: ↑Sun Aug 23, 2020 11:50 am
Stand by for this line of reasoning then:
To make sure AI doesn't kill us, we should teach it to respect God. And we (humans) should lead by example.

To make sure that AI doesn’t kill us, we should teach it to respect us.
Indeed, can it or will it—that is a question that will be essential to future humans.
commonsense wrote: ↑Sun Aug 23, 2020 11:49 pm
To make sure that AI doesn’t kill us, we should teach it to respect us.

Well, yeah! From the AI's point of view we would be God.
Skepdick wrote: ↑Mon Aug 24, 2020 7:28 am
Well, yeah! From the AI's point of view we would be God. Its creator.

How do you 'teach' something that is just a pile of electronic binary switches to have a conscience?
attofishpi wrote: ↑Mon Aug 24, 2020 10:21 am
How do you 'teach' something that is just a pile of electronic binary switches to have a conscience?

It is probably the same problem as how to teach a psychopath.
Belinda wrote: ↑Mon Aug 24, 2020 12:52 pm
It is probably the same problem as how to teach a psychopath.

No. It is way off.
Skepdick wrote: ↑Mon Aug 24, 2020 7:28 am
Well, yeah! From the AI's point of view we would be God. Its creator.

Interesting. But would AI have knowledge of its creation? Would AI acknowledge humans as its creators? Wouldn’t a purely thinking entity be agnostic? If AI has vastly greater knowledge than humankind, wouldn’t it consider humans to be inferior? I don’t know these answers. I’d like to hear your thoughts.