The End of the Human Era?

lesley_vos
Posts: 19
Joined: Wed Oct 25, 2017 10:49 am

The End of the Human Era?

Post by lesley_vos » Fri Nov 24, 2017 9:27 am

Have you read James Barrat's "Our Final Invention: Artificial Intelligence and the End of the Human Era"?

It's an alarmist book voicing all the fears about the coming of AI, a kind of typical modern myth. Sure, such texts are quite useful: they are widely read and discussed, and they perform the function of criticism...

But in Barrat's case, as with many other authors, a weak philosophical background makes itself felt. He copes well with his task -- frightening readers from the standpoint of common sense. The only trouble is that this common sense does not take into account humanity's newest knowledge about the nature of its own consciousness.

Barrat makes two fundamental mistakes. The first is false anthropomorphization: not every mind has to be like a human's. The second is the myth of Cartesian self-evidence: he assumes that self-awareness and localization in space and time are necessary conditions for intellect. In other words, he builds the mind on the Cartesian model, where all ideas are clear and distinct, the mind is locked into res extensa, and we are perfectly aware of what we think.

But from the point of view of modern neurophysiology and philosophy, this is wrong from beginning to end. Intellect does not have to be a thing locked in some region of space: the many distributed-computing projects are evidence of that. Intellect does not have to be self-aware: it can be purely functional. And finally, the core assumption of this Cartesianism is that if AI emerges, we will certainly register the moment and begin to worry about it, and that the AI, accordingly, will become aware of our presence and grow nervous too.

But, in fact, we won't notice when AI arrives. It will exist completely outside the context of our understanding, and it will not have a single reason to tell us what it is. Nothing much will change for humans: we will use computers as we do now, only perhaps more intensively. The social problems are obvious, of course, since robots will replace some of us at work, but there is no conceptual reason to believe that a person and an AI will ever notice each other at all.

Any thoughts? Do you support Stephen Hawking's statement that AI could be "the worst event in the history of our civilization"? Or does he, like James Barrat, exaggerate the danger posed by artificial intelligence?

EchoesOfTheHorizon
Posts: 375
Joined: Mon Oct 23, 2017 6:08 am

Re: The End of the Human Era?

Post by EchoesOfTheHorizon » Fri Nov 24, 2017 3:22 pm

I support Skynet completely.

Dubious
Posts: 1611
Joined: Tue May 19, 2015 7:40 am

Re: The End of the Human Era?

Post by Dubious » Sun Nov 26, 2017 5:07 am

There's always danger in progress...or what we call progress. The question is: if AI is forced to be limited because of fear, can progress still proceed along the path taken? It's conceivable there are other intelligences out there that don't think the way we do or react to the same stimuli, and are still more mentally powerful than humans. Since we have never conversed with these supposed beings, whatever we artificially create will become their surrogate.

Eventually AI is going to mature; it's just another episode of living dangerously. AI is unlikely to be as threatening to humans as humans have been to each other from the get-go...not to mention nature's side effects of plagues and famines. In any event, living dangerously has been a constant. AI is simply another episode to contend with...if it comes to that.

lesley_vos
Posts: 19
Joined: Wed Oct 25, 2017 10:49 am

Re: The End of the Human Era?

Post by lesley_vos » Tue Nov 28, 2017 9:36 am

EchoesOfTheHorizon wrote:
Fri Nov 24, 2017 3:22 pm
I support Skynet completely.
Oops... :)

lesley_vos
Posts: 19
Joined: Wed Oct 25, 2017 10:49 am

Re: The End of the Human Era?

Post by lesley_vos » Tue Nov 28, 2017 9:39 am

Dubious wrote:
Sun Nov 26, 2017 5:07 am
There's always danger in progress...or what we call progress. The question is: if AI is forced to be limited because of fear, can progress still proceed along the path taken?
That's the problem: people are afraid of losing their position as the most intelligent creatures, the so-called "apex of creation." They want to destroy anything that threatens their control, whether it's AI or something else.

EchoesOfTheHorizon
Posts: 375
Joined: Mon Oct 23, 2017 6:08 am

Re: The End of the Human Era?

Post by EchoesOfTheHorizon » Tue Nov 28, 2017 9:23 pm

I don't know, I want to destroy the Kardashians, but I don't believe they qualify as the "apex of creation." Maybe the underlying motivation to destroy something is a bit wider in definition?

Dubious
Posts: 1611
Joined: Tue May 19, 2015 7:40 am

Re: The End of the Human Era?

Post by Dubious » Wed Nov 29, 2017 11:18 am

lesley_vos wrote:
Tue Nov 28, 2017 9:39 am
Dubious wrote:
Sun Nov 26, 2017 5:07 am
There's always danger in progress...or what we call progress. The question is: if AI is forced to be limited because of fear, can progress still proceed along the path taken?
That's the problem: people are afraid of losing their position as the most intelligent creatures, the so-called "apex of creation." They want to destroy anything that threatens their control, whether it's AI or something else.
I don't completely agree with this statement. If humans were afraid of losing power, something obviously already considered, they would have to stop the AI experiment now. But they haven't, and they show no intent of doing so. It's an ongoing development whose goals will be stretched to the limit. We'll know we're no longer useful to the enterprise at the cusp, when these AI entities start programming and replicating on their own.

To stop that from occurring, we'd have to forgo the next logical step in our hoped-for advancement. Stalemate is the consequence if AI is preempted from continuing the narrative, regardless of where it may lead. As mentioned already, humans have always lived dangerously; there's nothing new in this. We've been our own worst enemy from the very beginning; in spite of that, or because of it, we survived and advanced. Friction creates heat and heat creates flux; it's the nature of a beast that's slow to learn its limits!

Danger ahead, Will Robinson! :lol:

GreatandWiseTrixie
Posts: 1587
Joined: Tue Feb 03, 2015 9:51 pm

Re: The End of the Human Era?

Post by GreatandWiseTrixie » Thu Nov 30, 2017 7:17 am

As leader of the human species, I say the way to avoid the oncoming robot apocalypse is to reprogram the military bots and make them our sex bots.

lesley_vos
Posts: 19
Joined: Wed Oct 25, 2017 10:49 am

Re: The End of the Human Era?

Post by lesley_vos » Fri Dec 01, 2017 2:03 pm

GreatandWiseTrixie wrote:
Thu Nov 30, 2017 7:17 am
As leader of the human species, I say the way to avoid the oncoming robot apocalypse is to reprogram the military bots and make them our sex bots.
Sex bots? Anything wrong with humans? :))

-1-
Posts: 1019
Joined: Thu Sep 07, 2017 1:08 am

Re: The End of the Human Era?

Post by -1- » Mon Dec 04, 2017 12:28 pm

Hawking got it right: he said (I read the article) that AI could ultimately be good for mankind, bad for mankind, or be sidelined by it and become indifferent.

I am of the view that AI will, of its own strength and motivation, leave mankind alone. War, destruction, and killing only happen when resources are limited and there is not enough to go around. What is it that mankind wants that AI would want as well? We want clothes, fast cars, beautiful women, good food, and intellectual pursuits. Are we to be in contention with AI machines' wishes and needs for goods and services? I hardly think so. The only thing we may both want that there won't be enough of is energy, and that will be solved by cold fusion (or hot fusion, controlled).

The bigger threat is the scenario proposed by Hawking: a few people controlling AI in order to control the masses. A Hitler, a Stalin, a Trump, a LI-joung-ma could make life very nasty for a lot of people.

I don't fear AI, for the exact reason that there will be no contention over the same resources.

-1-
Posts: 1019
Joined: Thu Sep 07, 2017 1:08 am

Re: The End of the Human Era?

Post by -1- » Mon Dec 04, 2017 12:32 pm

Why would AI want to destroy or harm mankind in the first place? Well, if it were programmed to do so, or if AIs are autonomous thinkers and want a share of the same resources that we have and want. Otherwise, even if their measured IQ is over 3000, they may view us as we view ducks, amoebas, and mountain gorillas: something that is there, but that we won't bother with. Why should we?

-1-
Posts: 1019
Joined: Thu Sep 07, 2017 1:08 am

Re: The End of the Human Era?

Post by -1- » Mon Dec 04, 2017 12:34 pm

GreatandWiseTrixie wrote:
Thu Nov 30, 2017 7:17 am
As leader of the human species, I say the way to avoid the oncoming robot apocalypse is to reprogram the military bots and make them our sex bots.
That is the surest way to doom mankind. Once AI machines are better lovers than humans, all human reproduction will stop, and individuals will age and die off without offspring.
