Why Philosophers Should Care About Computational Complexity

What is the basis for reason? And mathematics?

Moderators: AMod, iMod

Univalence
Posts: 492
Joined: Sun May 12, 2019 6:28 pm

Why Philosophers Should Care About Computational Complexity

Post by Univalence »

https://www.scottaaronson.com/papers/philos.pdf

One might think that, once we know something is computable, how efficiently it can be computed is a practical question with little further philosophical importance. In this essay, I offer a detailed case that one would be wrong. In particular, I argue that computational complexity theory—the field that studies the resources (such as time, space, and randomness) needed to solve computational problems—leads to new perspectives on the nature of mathematical knowledge, the strong AI debate, computationalism, the problem of logical omniscience, Hume’s problem of induction, Goodman’s grue riddle, the foundations of quantum mechanics, economic rationality, closed timelike curves, and several other topics of philosophical interest. I end by discussing aspects of complexity theory itself that could benefit from philosophical analysis.
surreptitious57
Posts: 4257
Joined: Fri Oct 25, 2013 6:09 am

Re: Why Philosophers Should Care About Computational Complexity

Post by surreptitious57 »

The more complex a computation is, the more likely it will contain extra mathematical or philosophical applications
It could be that beyond a certain degree of complexity such applications are inevitable, even if they may not always be known
It's probably true to say that in mathematics there is an infinite number of applications that can exist within a complex spectrum
This is because the number line itself - the basic foundation of mathematics - has an infinite number of members on its single axis
surreptitious57
Posts: 4257
Joined: Fri Oct 25, 2013 6:09 am

Re: Why Philosophers Should Care About Computational Complexity

Post by surreptitious57 »

The development of quantum computers will raise complexity to another level entirely
Here we may be pushing the absolute limit of what a human mind can actually understand mathematically
So we almost certainly will in the future have machines using machines to calculate beyond our capability
Univalence
Posts: 492
Joined: Sun May 12, 2019 6:28 pm

Re: Why Philosophers Should Care About Computational Complexity

Post by Univalence »

surreptitious57 wrote: Wed Jun 05, 2019 8:53 am The development of quantum computers will raise complexity to another level entirely
Quite the opposite, I think? The hallmark of a useful tool is its ability to help humans manage complexity.

Classical computers helped us develop empirical intuitions, thought-patterns and vocabularies that we didn't quite have 50 years ago.
They helped us solve (some) problems, but we are well aware of their limits.

I think (hope?) that quantum computation's popularisation will help us move even more problems into the 'tractable' category.
Eodnhoj7
Posts: 8595
Joined: Mon Mar 13, 2017 3:18 am

Re: Why Philosophers Should Care About Computational Complexity

Post by Eodnhoj7 »

Univalence wrote: Wed Jun 05, 2019 6:40 am https://www.scottaaronson.com/papers/philos.pdf

One might think that, once we know something is computable, how efficiently it can be computed is a practical question with little further philosophical importance. […]
First of all, you claim philosophy is dead... now you are crawling on your hands and knees like Oliver Twist asking, "Please sir, may I have some attention?".

Efficiency is subject to too many variables that make it inefficient... hence a paradox. One could claim programming allows me to efficiently get a better cup of coffee (controlling production rates, resource input/output, etc.). Or I can just be satisfied with what is in front of me. No one takes into account the number of man-hours, resources, etc. required to form the computer, the machines that create the computer, etc., versus planting "x" coffee bean in the ground instead of "y".

Computation is inefficient as it requires a manifestation of complex relations in order to achieve it.

You are just running around in circles. Computation is strictly the connection and separation of various patterns to create further various patterns...natural law already has been doing this for years. Computation is an illusion of choice.
Univalence
Posts: 492
Joined: Sun May 12, 2019 6:28 pm

Re: Why Philosophers Should Care About Computational Complexity

Post by Univalence »

Eodnhoj7 wrote: Fri Jun 07, 2019 7:14 pm Efficiency is subject to too many variables that make it inefficient...hence a paradox.
Computational complexity is only subject to three: space, time, and randomness.

It's in the very abstract I quoted.
Eodnhoj7 wrote: Fri Jun 07, 2019 7:14 pm Computation is an illusion.
So are placebos. But if it works that's all that matters.
Eodnhoj7
Posts: 8595
Joined: Mon Mar 13, 2017 3:18 am

Re: Why Philosophers Should Care About Computational Complexity

Post by Eodnhoj7 »

Univalence wrote:
Eodnhoj7 wrote: Fri Jun 07, 2019 7:14 pm Efficiency is subject to too many variables that make it inefficient...hence a paradox.
Computational complexity is only subject to three: space, time, and randomness.

It's in the very abstract I quoted.

False, considering time can be observed strictly as a variation of space where one space is divided through another. Thus we are left with space and randomness, where randomness is a perceived multiplicity of spaces conducive to an approximation of space itself. Thus we are left with space.

The terms you provide, because they are assumed, must effectively be defined further, and as such their categorization changes with the manner of how they relate. We can be left with "The One" and "The Many" as strict variables.

Eodnhoj7 wrote: Fri Jun 07, 2019 7:14 pm Computation is an illusion.
So are placebos. But if it works that's all that matters.
Placebos are a physicalization of belief conducive to an idol. People idolize pills, therefore they believe in them; hence their belief changes their perception of the illness, which effectively cures them given certain circumstances. The pill cancels itself out eventually, no different than the illusion, relative to its actual value, in light of the realization that belief was the variable that initiated the healing.
Univalence
Posts: 492
Joined: Sun May 12, 2019 6:28 pm

Re: Why Philosophers Should Care About Computational Complexity

Post by Univalence »

Eodnhoj7 wrote: Sat Jun 08, 2019 5:19 pm False considering Time can be observed strictly as a variation of space where one space is divided through another. Thus we are left with space and randomness where randomness is a percieve multiplicity of spaces conducive to an approximation of space itself. Thus we are left with space.
Since time doesn't exist, I decided to read your sentence backwards.

Can't make any sense of it.

.ecaps htiw tfel era ew suhT .flesti ecaps fo noitamixorppa na ot evicudnoc secaps fo yticilpitlum eveicrep a si ssenmodnar erehw ssenmodnar dna ecaps htiw tfel era ew suhT .rehtona hguorht dedivid si ecaps eno erehw ecaps fo noitairav a sa yltcirts devresbo eb nac emiT gniredisnoc eslaF
Impenitent
Posts: 4331
Joined: Wed Feb 10, 2010 2:04 pm

Re: Why Philosophers Should Care About Computational Complexity

Post by Impenitent »

"T"ruth is a placebo

-Imp
Scott Mayers
Posts: 2446
Joined: Wed Jul 08, 2015 1:53 am

Re: Why Philosophers Should Care About Computational Complexity

Post by Scott Mayers »

If this is an appeal to philosophers, shouldn't you first define "Computational Complexity" at least?
commonsense
Posts: 5116
Joined: Sun Mar 26, 2017 6:38 pm

Re: Why Philosophers Should Care About Computational Complexity

Post by commonsense »

Scott Mayers wrote: Sat Jun 08, 2019 7:29 pm If this is an appeal to philosophers, shouldn't you first define "Computational Complexity" at least?
Scott M, between Scott A’s essay and a flurry of Wikipedia references, I was able to grasp a working definition of what computational complexity is. (This was no easy task for me, a novice at best.)

Scott A, I wonder if the response to your thread can be reduced to a statement that computational complexity theorists can predict the amount of resources a philosopher will need to work a philosophical problem to its theoretical conclusion.

Accordingly, if the conclusion of a philosophical problem were not theoretical, one could determine beforehand whether it is desirable or not to expend the resources necessary to reach a conclusion.

However, if philosophical questions cannot be answered with a definitive conclusion, should the amount of resources needed to reach a conclusion be of any concern to philosophers?

Further, if the joy of the journey is more important to philosophers than the fruit of the finish, then your thread would more aptly be worded, “Why should philosophers care about computational complexity?”
Univalence
Posts: 492
Joined: Sun May 12, 2019 6:28 pm

Re: Why Philosophers Should Care About Computational Complexity

Post by Univalence »

Scott Mayers wrote: Sat Jun 08, 2019 7:29 pm If this is an appeal to philosophers, shouldn't you first define "Computational Complexity" at least?
Over-simplified: computational complexity asks two questions:
1. Can this question be answered in principle, and how? (Produce an algorithm.)
2. How many resources would it take to answer it in practice? (How long would the algorithm run for?)

To the question of "What is the meaning of life?" I think the algorithm is currently running. I do not believe it will ever halt.
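The distinction between the two questions can be sketched with a toy Python example of my own (not from the thread): both functions below answer the same question, "what is 1 + 2 + … + n?" (question 1), but they give very different answers to question 2.

```python
def sum_loop(n: int) -> int:
    # One algorithm that answers the question, in O(n) additions.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n: int) -> int:
    # Same answer in O(1) steps, using Gauss's closed form.
    return n * (n + 1) // 2

print(sum_loop(100), sum_formula(100))  # 5050 5050
```

Same answer in principle; very different resource bills in practice.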
wtf
Posts: 1178
Joined: Tue Sep 08, 2015 11:36 pm

Re: Why Philosophers Should Care About Computational Complexity

Post by wtf »

Scott Mayers wrote: Sat Jun 08, 2019 7:29 pm If this is an appeal to philosophers, shouldn't you first define "Computational Complexity" at least?
commonsense wrote: Sun Jun 09, 2019 2:40 pm Scott M, between Scott A’s essay and a flurry of Wikipedia references, I was able to grasp a working definition of what computational complexity is.
Among all the tasks that are computable, some can be computed more efficiently than others. What do we mean by efficiency? In this case it means the growth rate of the resources consumed by a computation, as a function of the length of its input. That's the key idea.

Let me illustrate with four examples.

* Constant time. Example: Determining whether a number is odd or even.

If I give you a number, like 3425, and ask you if it's even, you just look at the low-order digit, the 5: if that's even the entire number is even, and if it's odd the number is odd. If I give you the number 48594859043590438583 you do exactly the same. No matter how big the given number is, it always takes you exactly the same amount of computing work to determine whether the number is even or odd. So this problem can be done in "constant time", as they say.
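As a quick Python sketch of my own (not from the post), mirroring the "look at the last digit" rule:

```python
def is_even(n: int) -> bool:
    # Inspect only the low-order decimal digit; the cost does not
    # grow with the size of the number, so this is O(1).
    return int(str(n)[-1]) % 2 == 0

print(is_even(3425))                  # False
print(is_even(48594859043590438583)) # False
```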

* Linear time. Example: Counting characters in a string.

To count the characters in "dfsljg" we iterate through each character and increment a counter for each character we see. If the length is n then we have to count n characters. This is "linear time."
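A minimal Python sketch of that loop (my own illustration, not from the post):

```python
def count_chars(s: str) -> int:
    # Walk the string once, incrementing a counter per character: O(n).
    count = 0
    for _ in s:
        count += 1
    return count

print(count_chars("dfsljg"))  # 6
```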

* Polynomial time. Example: Multiplying two n-digit numbers.

We all know the old grade school algorithm. I hear Common Core doesn't teach that anymore and has a lot of bizarre new methods. I hear bad stories. But I'm assuming most readers learned multiplication the old-fashioned way.

If it's a 3-digit number by a 3-digit number, you have a total of 9 multiplications plus the addition at the end. If it's 4x4 you have 16 multiplications. In general you always have n^2 multiplications plus a big addition at the end. n^2 is a polynomial and the addition at the end doesn't add much work. So multiplication is "polynomial time."
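The multiplication count can be made explicit with a toy Python version of the grade-school method (my own sketch, not from the post); it returns the product along with how many single-digit multiplications it performed:

```python
def grade_school_multiply(a: str, b: str) -> tuple[int, int]:
    # Multiply two digit strings the grade-school way, counting the
    # single-digit multiplications: n * n of them for two n-digit
    # numbers, hence O(n^2) -- polynomial time.
    total, mults = 0, 0
    for i, da in enumerate(reversed(a)):
        for j, db in enumerate(reversed(b)):
            total += int(da) * int(db) * 10 ** (i + j)
            mults += 1
    return total, mults

print(grade_school_multiply("123", "456"))  # (56088, 9)
```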

Since n is a polynomial of first degree, we see that linear time is a subset of polynomial time.

In general, an algorithm that's polynomial or better is regarded as efficient; and anything worse is inefficient.

* Exponential time. Example: Brute-force guessing of an n-character password.

Suppose for simplicity a password must consist of n decimal digits. If the password length is 1, then we have to make possibly up to 10 guesses, averaging around 5.

If the passwords are two digits long, we have to make in the worst case 100 guesses and in the average case around 50.

In other words, for each additional character of password length, the amount of work to guess the password goes up by a factor of 10. So the formula for the amount of guesswork is around 10^n. This is called "exponential time" and it is regarded as bad. The resource usage goes up so fast that even though we can solve the problem in theory, we haven't got enough resources to solve it in practice.
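A brute-force guesser can be sketched in a few lines of Python (my own illustration, not from the post); it counts the guesses it makes, which in the worst case is 10^n for an n-digit password:

```python
from itertools import product

def brute_force(password: str) -> int:
    # Try every n-digit string in lexicographic order and return how
    # many guesses it took; worst case 10**n -- exponential in n.
    n = len(password)
    for guesses, attempt in enumerate(product("0123456789", repeat=n), start=1):
        if "".join(attempt) == password:
            return guesses
    raise ValueError("password is not a string of decimal digits")

print(brute_force("42"))  # 43 guesses: "00", "01", ..., "42"
```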

That's complexity theory. My examples all involve the time domain, but we can also analyze algorithms in terms of how much space (or memory) they require. Complexity theory builds out all these different complexity classes and figures out which problems are in which class.

Why is computational complexity important?

* One reason is that performing computations on a physical computer consumes resources and takes time. We always want to be able to go faster, so it's important to find more efficient algorithms.

* P vs. NP. You may have heard of this famous problem, one of the Clay Millennium problems, which pays a million dollars to the first person to prove it true or false, as the case may be.

P is the class of problems that can be solved in polynomial time by a deterministic Turing machine; and NP is the class that can be solved in polynomial time by a nondeterministic TM. A nondeterministic TM essentially takes all possible paths at once. So even though deterministic and nondeterministic machines can compute exactly the same set of problems overall, the nondeterministic ones may solve SOME problems more efficiently. We think. Nobody knows for sure.

* Quantum computers will break Internet security and cryptocurrencies. You may have heard or read this. All Internet and cryptocurrency security is based on Public key cryptography, which is based on the ancient mathematical problem of factoring integers into primes.

Since factoring on conventional computers is hard (the best known algorithms are better than brute-force exponential but still not polynomial), it takes the attacker too much time to decode an encrypted message.

On the other hand, if we use a quantum computer, there's an algorithm called Shor's algorithm that can factor an arbitrary integer in polynomial time. This is truly an amazing fact.

If we eventually develop practical quantum computers (much farther off than the hype would have you believe IMO), then public key cryptography is defeated and all of Internet security is compromised.

* Going beyond all that, Scott Aaronson believes that computational complexity has deep philosophical import. That's what his article's about.
Scott Mayers
Posts: 2446
Joined: Wed Jul 08, 2015 1:53 am

Re: Why Philosophers Should Care About Computational Complexity

Post by Scott Mayers »

wtf wrote: Wed Jun 12, 2019 10:20 pm
Scott Mayers wrote: Sat Jun 08, 2019 7:29 pm If this is an appeal to philosophers, shouldn't you first define "Computational Complexity" at least?
commonsense wrote: Sun Jun 09, 2019 2:40 pm Scott M, between Scott A’s essay and a flurry of Wikipedia references, I was able to grasp a working definition of what computational complexity is.
Among all the tasks that are computable, some can be computed more efficiently than others. […]
Thank you. I'm already aware of these concepts from computer science but never saw them specified/defined as "computational complexity". I express the particular issue with respect to problems on an architectural level. One example I had used at skepticforum.com (and possibly here back then) is how and why a computer program can't suffice as a test for the Monty Hall problem, as some think it can. So it CAN be a relevant potential part of discussion for 'natural philosophy' as to any limitations we may expect of a computer. However, beyond those of us already interested in the depths of computer science's relation to logic, I'm not sure this topic can be 'sold' through his paper to the lay person within the GENERAL field of philosophy without appealing to clear definitions and SAMPLES (as you at least provided here, for example).
Scott Mayers
Posts: 2446
Joined: Wed Jul 08, 2015 1:53 am

Re: Why Philosophers Should Care About Computational Complexity

Post by Scott Mayers »

Also, if I'm confused, then what does this mean for the average person lacking a background in logic and/or computing?