The End of the Human Era?

For the discussion of philosophical books.

Moderators: AMod, iMod

lesley_vos
Posts: 15
Joined: Wed Oct 25, 2017 10:49 am

The End of the Human Era?

Post by lesley_vos »

Have you read James Barrat's "Our Final Invention: Artificial Intelligence and the End of the Human Era"?

It's an alarmist book voicing every fear about the coming of AI, a kind of typical modern myth. Sure, such texts are quite useful: they are widely read and discussed, and they perform the function of criticism...


But in Barrat's case, as with many other authors, a weak philosophical background makes itself felt. He copes well with his task -- frightening readers from the standpoint of common sense. The only trouble is that this common sense does not take into account humanity's newest knowledge about the nature of its own consciousness.

Barrat makes two fundamental mistakes. The first is false anthropomorphization: not every mind has to resemble a human one. The second is the myth of Cartesian self-evidence: he assumes that the necessary conditions for intellect are self-awareness and localization in space and time. In other words, he builds the mind on Descartes's model, where all ideas are clear and distinct, the mind is locked into res extensa, and we are perfectly aware of what we think.

But from the point of view of modern neurophysiology and philosophy, this is wrong from beginning to end. Intellect does not have to be a thing locked in one place: the many distributed-computing projects are evidence of that. Intellect does not have to be self-aware: it can be purely functional. And finally, this Cartesianism fundamentally assumes that if AI emerges, we will certainly register the moment and begin to worry about it, and that AI, in turn, will become aware of our presence and grow nervous too.

But in fact, we won't notice when AI arrives. It will exist completely outside the context of our understanding, and it will have no reason to tell us what it is. Nothing will change for humans: we will use computers as we do now, only perhaps more intensively. The social problems are obvious, of course, since robots will replace some of us at work, but there is no conceptual reason to believe that a person and an AI will ever notice each other at all.

Any thoughts? Do you agree with Stephen Hawking's statement that AI could be "the worst event in the history of our civilization"? Or does he, like James Barrat, exaggerate the danger posed by artificial intelligence?
EchoesOfTheHorizon
Posts: 356
Joined: Mon Oct 23, 2017 6:08 am

Re: The End of the Human Era?

Post by EchoesOfTheHorizon »

I support Skynet completely.
Dubious
Posts: 4000
Joined: Tue May 19, 2015 7:40 am

Re: The End of the Human Era?

Post by Dubious »

There's always danger in progress...or what we call progress. The question is, if AI is forced to be limited because of fear, can progress still proceed along the path taken? It's conceivable there are other intelligences out there that don't think the way we do or react to the same stimuli yet are still more mentally powerful than humans. Since we have never conversed with these supposed beings, whatever we artificially create will become their surrogate.

Eventually AI is going to mature; it's just another episode of living dangerously. It's unlikely to be as threatening to humans as humans have been to each other from the get-go...not to mention nature's side-effects of plagues and famines. In any event, living dangerously has been a constant, and AI is simply another episode to contend with...if it comes to that.
lesley_vos
Posts: 15
Joined: Wed Oct 25, 2017 10:49 am

Re: The End of the Human Era?

Post by lesley_vos »

EchoesOfTheHorizon wrote: Fri Nov 24, 2017 3:22 pm I support Skynet completely.
Oops... :)
lesley_vos
Posts: 15
Joined: Wed Oct 25, 2017 10:49 am

Re: The End of the Human Era?

Post by lesley_vos »

Dubious wrote: Sun Nov 26, 2017 5:07 am There's always danger in progress...or what we call progress. The question is, if AI is forced to be limited because of fear, can progress still proceed along the path taken?
That's the problem: people are afraid of losing their status as the most intelligent creatures, the so-called "apex of creation." They want to destroy anything that threatens their control, whether it's AI or anything else.
EchoesOfTheHorizon
Posts: 356
Joined: Mon Oct 23, 2017 6:08 am

Re: The End of the Human Era?

Post by EchoesOfTheHorizon »

I don't know, I want to destroy The Kardashians, but I don't believe they qualify as "The Apex of Creation". Maybe the underlying motivation to destroy something is a bit broader than that?
Dubious
Posts: 4000
Joined: Tue May 19, 2015 7:40 am

Re: The End of the Human Era?

Post by Dubious »

lesley_vos wrote: Tue Nov 28, 2017 9:39 am
Dubious wrote: Sun Nov 26, 2017 5:07 am There's always danger in progress...or what we call progress. The question is, if AI is forced to be limited because of fear, can progress still proceed along the path taken?
That's the problem: people are afraid of losing their status as the most intelligent creatures, the so-called "apex of creation." They want to destroy anything that threatens their control, whether it's AI or anything else.
I don't completely agree with that. If humans were afraid of losing power (something obviously already considered), they would have to stop the AI experiment now. But they haven't, and they show no intent of doing so. It's an ongoing development whose goals will be stretched to the limit. We'll know we're no longer useful to the enterprise at the cusp, when these AI entities start programming and replicating on their own.

To stop that from occurring, we'd have to forgo the next logical step in our hoped-for advancement. Stalemate is the consequence if AI is prevented from continuing the narrative, regardless of where it may lead. As mentioned already, humans have always lived dangerously; there's nothing new in this. We've been our own worst enemy from the very beginning; in spite of that, or because of it, we survived and advanced. Friction creates heat and heat creates flux; it's the nature of a beast that's slow in learning limits!

Danger ahead, Will Robinson! :lol:
GreatandWiseTrixie
Posts: 1547
Joined: Tue Feb 03, 2015 9:51 pm

Re: The End of the Human Era?

Post by GreatandWiseTrixie »

As leader of the human species, I say the way to avoid the oncoming robot apocalypse is to reprogram the military bots and make them our sex bots.
-1-
Posts: 2888
Joined: Thu Sep 07, 2017 1:08 am

Re: The End of the Human Era?

Post by -1- »

Hawking got it right: he said (I read the article) that AI could ultimately be good for mankind, bad for mankind, or be sidelined by it and become indifferent.

I am of the view that AI will, of its own strength and motivation, leave mankind alone. War, destruction, and killing happen only when resources are limited and there is not enough to go around. What is it that mankind wants, and that AI would want as well? We want clothes, fast cars, beautiful women, good food, and intellectual pursuits. Are we really to be in contention with AI machines' wishes and needs for goods and services? I hardly think so. The only thing we may both want without enough to go around is energy, and that will be solved by cold fusion (or controlled hot fusion).

The bigger threat is the one proposed by Hawking: a few people using AI to control the masses. A Hitler, a Stalin, a Trump, a LI-joung-ma could make life very nasty for a lot of people.

I don't fear AI, for exactly that reason: there will be no contention over the same resources.
-1-
Posts: 2888
Joined: Thu Sep 07, 2017 1:08 am

Re: The End of the Human Era?

Post by -1- »

Why would AI want to destroy or harm mankind in the first place? Only if it were programmed to do so, or if such machines are autonomous thinkers who want a share of the same resources that we have and want. Otherwise, even if their measured IQ is over 3000, they may view us as we view ducks, amoebas, and mountain gorillas: something that is there, but that we won't bother with. Why should we?
-1-
Posts: 2888
Joined: Thu Sep 07, 2017 1:08 am

Re: The End of the Human Era?

Post by -1- »

GreatandWiseTrixie wrote: Thu Nov 30, 2017 7:17 am As leader of the human species, I say the way to avoid the oncoming robot apocalypse is to reprogram the military bots and make them our sex bots.
That is the surest way to doom mankind. Once AI machines are better lovers than humans, all human reproduction will stop, and individuals will age and die off without offspring.
Dalek Prime
Posts: 4922
Joined: Tue Apr 14, 2015 4:48 am
Location: Living in a tree with Polly.

Re: The End of the Human Era?

Post by Dalek Prime »

-1- wrote: Mon Dec 04, 2017 12:34 pm
GreatandWiseTrixie wrote: Thu Nov 30, 2017 7:17 am As leader of the human species, I say the way to avoid the oncoming robot apocalypse is to reprogram the military bots and make them our sex bots.
That is the surest way to doom mankind. Once AI machines are better lovers than humans, all human reproduction will stop, and individuals will age and die off without offspring.
Yay!!!!!

Err, I mean 'oh, no'...... kidding, I mean Yay!!!!!!
Nick_A
Posts: 6208
Joined: Sat Jul 07, 2012 1:23 am

Re: The End of the Human Era?

Post by Nick_A »

IMO, as AI evolves, its dualistic limitations will serve to develop the ultimate animal man. This evolution, of course, must eventually lead to the destruction that is normal for the hypocrisy natural to the human condition.

However, it is possible that the absurdities produced by these limitations will become increasingly obvious. As a result, those sensitive to the human potential for conscious evolution and for acquiring a conscious perspective will find it easier to unite with others who have felt the same. IMO, they will eventually become the future of humanity.
Greta
Posts: 4389
Joined: Sat Aug 08, 2015 8:10 am

Re: The End of the Human Era?

Post by Greta »

lesley_vos wrote: Fri Nov 24, 2017 9:27 am But in fact, we won't notice when AI arrives. It will exist completely outside the context of our understanding, and it will have no reason to tell us what it is. Nothing will change for humans: we will use computers as we do now, only perhaps more intensively. The social problems are obvious, of course, since robots will replace some of us at work, but there is no conceptual reason to believe that a person and an AI will ever notice each other at all.
This is my impression.

I think corporations are already a kind of AI system that utilises many (but ever-decreasing) human components, all of which are interchangeable.

Organisations run on policies, aims, goals, corporate plans and strategic plans, which are adhered to as if programmed. You may also notice that corporations are taking over. Individuals are starting to notice that while they pay tax, they are not actually represented by politicians, who instead focus solely on the interests of multinational and major national corporations that often pay no tax, or at least proportionally less.

The rationale is that a company represents the interests of many people, but increasingly profits are not shared and workers are replaced by automation. Really, giant global companies represent only their own interests; if some people benefit along with them, those people make for useful lobbying bargaining chips.

Only corporations have the capacity to send Earth's material, and at least some life or DNA, to other worlds to continue life's journey when the surface of the Earth is used up. So I don't begrudge them their power and exploitation, but I do note it.
Walker
Posts: 14280
Joined: Thu Nov 05, 2015 12:00 am

Re: The End of the Human Era?

Post by Walker »

Start worrying when the "giant global corporation" bogeymen are actually machines, as many stock-market transactions already are.