Why Philosophers Should Care About Computational Complexity

 Posts: 492
 Joined: Sun May 12, 2019 6:28 pm
Why Philosophers Should Care About Computational Complexity
https://www.scottaaronson.com/papers/philos.pdf
One might think that, once we know something is computable, how efficiently it can be computed is a practical question with little further philosophical importance. In this essay, I offer a detailed case that one would be wrong. In particular, I argue that computational complexity theory—the field that studies the resources (such as time, space, and randomness) needed to solve computational problems—leads to new perspectives on the nature of mathematical knowledge, the strong AI debate, computationalism, the problem of logical omniscience, Hume’s problem of induction, Goodman’s grue riddle, the foundations of quantum mechanics, economic rationality, closed timelike curves, and several other topics of philosophical interest. I end by discussing aspects of complexity theory itself that could benefit from philosophical analysis.

 Posts: 4225
 Joined: Fri Oct 25, 2013 6:09 am
Re: Why Philosophers Should Care About Computational Complexity
The more complex a computation is, the more likely it is to contain extra mathematical or philosophical applications
It could be that beyond a certain degree of complexity such applications are inevitable, even if they may not always be known
It's probably true to say that in mathematics there are an infinite number of applications that can exist within a complex spectrum
This is because the number line itself, the basic foundation of mathematics, has an infinite number of members on its single axis

 Posts: 4225
 Joined: Fri Oct 25, 2013 6:09 am
Re: Why Philosophers Should Care About Computational Complexity
The development of quantum computers will raise complexity to another level entirely
Here we may be pushing the absolute limit of what a human mind can actually understand mathematically
So in the future we will almost certainly have machines using machines to calculate beyond our capability

 Posts: 492
 Joined: Sun May 12, 2019 6:28 pm
Re: Why Philosophers Should Care About Computational Complexity
surreptitious57 wrote: ↑Wed Jun 05, 2019 8:53 am The development of quantum computers will raise complexity to another level entirely
Quite the opposite, I think? The hallmark of a useful tool is its ability to help humans manage complexity.
Classical computers helped us develop empirical intuitions, thought patterns and vocabularies that we didn't quite have 50 years ago.
They helped us solve (some) problems, but we are well aware of their limits.
I think (hope?) that quantum computation's popularisation will help us move even more problems into the 'tractable' category.
Re: Why Philosophers Should Care About Computational Complexity
Univalence wrote: ↑Wed Jun 05, 2019 6:40 am https://www.scottaaronson.com/papers/philos.pdf One might think that, once we know something is computable, how efficiently it can be computed is a practical question with little further philosophical importance. …
First of all, you claim philosophy is dead...now you are crawling on your hands and knees like Oliver Twist asking "please sir, may I have some attention?".
Efficiency is subject to too many variables that make it inefficient...hence a paradox. One could claim programming allows me to efficiently get a better cup of coffee (controlling production rates, resource input/output, etc.). Or I can just be satisfied with what is in front of me. No one takes into account the number of man-hours, resources, etc. required to build the computer, the machines that create the computer, etc., versus planting "x" coffee bean in the ground instead of "y".
Computation is inefficient as it requires a manifestation of complex relations in order to achieve it.
You are just running around in circles. Computation is strictly the connection and separation of various patterns to create further various patterns...natural law already has been doing this for years. Computation is an illusion of choice.

 Posts: 492
 Joined: Sun May 12, 2019 6:28 pm
Re: Why Philosophers Should Care About Computational Complexity
Computational complexity is only subject to three: space, time and randomness.
It's in the very abstract I quoted.
So are placebos. But if it works that's all that matters.
Re: Why Philosophers Should Care About Computational Complexity
Univalence wrote: ↑ Computational complexity is only subject to three: space, time and randomness. It's in the very abstract I quoted. So are placebos. But if it works that's all that matters.
Placebos are a physicalization of belief conducive to an idol. People idolize pills, therefore they believe in them; hence their belief changes their perception of the illness, which effectively cures them under certain circumstances. The pill eventually cancels itself out, no different from the illusion, relative to its actual value, in light of the realization that belief was the variable that initiated the healing.
False, considering Time can be observed strictly as a variation of space where one space is divided through another. Thus we are left with space and randomness, where randomness is a perceived multiplicity of spaces conducive to an approximation of space itself. Thus we are left with space.
The terms you provide, because they are assumed, must effectively be defined further, and as such their categorization changes within the manner of how they relate. We can be left with "The One" and "The Many" as strict variables.

 Posts: 492
 Joined: Sun May 12, 2019 6:28 pm
Re: Why Philosophers Should Care About Computational Complexity
Eodnhoj7 wrote: ↑Sat Jun 08, 2019 5:19 pm False considering Time can be observed strictly as a variation of space where one space is divided through another. Thus we are left with space and randomness where randomness is a percieve multiplicity of spaces conducive to an approximation of space itself. Thus we are left with space.
Since time doesn't exist, I decided to read your sentence backwards.
Can't make any sense of it.
.ecaps htiw tfel era ew suhT .flesti ecaps fo noitamixorppa na ot evicudnoc secaps fo yticilpitlum eveicrep a si ssenmodnar erehw ssenmodnar dna ecaps htiw tfel era ew suhT .rehtona hguorht dedivid si ecaps eno erehw ecaps fo noitairav a sa yltcirts devresbo eb nac emiT gniredisnoc eslaF

 Posts: 3133
 Joined: Wed Feb 10, 2010 2:04 pm
Re: Why Philosophers Should Care About Computational Complexity
"T"ruth is a placebo
Imp

 Posts: 1872
 Joined: Wed Jul 08, 2015 1:53 am
 Location: Saskatoon, SK, Canada
Re: Why Philosophers Should Care About Computational Complexity
If this is an appeal to philosophers, shouldn't you first define "Computational Complexity" at least?

 Posts: 2894
 Joined: Sun Mar 26, 2017 6:38 pm
Re: Why Philosophers Should Care About Computational Complexity
Scott Mayers wrote: ↑Sat Jun 08, 2019 7:29 pm If this is an appeal to philosophers, shouldn't you first define "Computational Complexity" at least?
Scott M, between Scott A's essay and a flurry of Wikipedia references, I was able to grasp a working definition of what computational complexity is. (This was no easy task for me, a novice at best.)
Scott A, I wonder if the response to your thread can be reduced to a statement that computational complexity theorists can predict the amount of resources a philosopher will need to work a philosophical problem to its theoretical conclusion.
Accordingly, if the conclusion of a philosophical problem were not theoretical, one could determine beforehand whether it is desirable or not to expend the resources necessary to reach a conclusion.
However, if philosophical questions cannot be answered with a definitive conclusion, should the amount of resources needed to reach a conclusion be of any concern to philosophers?
Further, if the joy of the journey is more important to philosophers than the fruit of the finish, then your thread would more aptly be worded, “Why should philosophers care about computational complexity?”

 Posts: 492
 Joined: Sun May 12, 2019 6:28 pm
Re: Why Philosophers Should Care About Computational Complexity
Scott Mayers wrote: ↑Sat Jun 08, 2019 7:29 pm If this is an appeal to philosophers, shouldn't you first define "Computational Complexity" at least?
Oversimplified. Computational complexity asks two questions:
1. Can this question be answered in principle and how? (produce an algorithm)
2. How many resources would it take to answer it in practice? (How long would the algorithm run?)
To the question of "What is the meaning of life?" I think the algorithm is currently running. I do not believe it will ever halt.
Re: Why Philosophers Should Care About Computational Complexity
Scott Mayers wrote: ↑Sat Jun 08, 2019 7:29 pm If this is an appeal to philosophers, shouldn't you first define "Computational Complexity" at least?
commonsense wrote: ↑Sun Jun 09, 2019 2:40 pm Scott M, between Scott A's essay and a flurry of Wikipedia references, I was able to grasp a working definition of what computational complexity is.
Among all the tasks that are computable, some can be computed more efficiently than others. What do we mean by efficiency? In this case it means the growth rate of the resources consumed by a computation, as a function of the length of its input. That's the key idea.
Let me illustrate with four examples.
* Constant time. Example: Determining whether a number is odd or even.
If I give you a number, like 3425, and ask you if it's even, you just look at the low-order digit, the 5: if that's even the entire number is even, and if it's odd the number is odd. If I give you the number 48594859043590438583 you do exactly the same. No matter how big the given number is, it always takes you exactly the same amount of computing work to determine whether the number is even or odd. So this problem can be done in "constant time", as they say.
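The parity check described above can be sketched in a few lines of Python (a hypothetical illustration, not from the post itself):

```python
def is_even(n: int) -> bool:
    """Parity check that inspects only the low-order digit.

    The work is the same however many digits n has: constant time, O(1).
    """
    return abs(n) % 10 % 2 == 0  # isolate the last digit, test its parity

print(is_even(3425))                  # False: last digit 5 is odd
print(is_even(48594859043590438583))  # False: last digit 3 is odd
print(is_even(3424))                  # True: last digit 4 is even
```

The running time does not depend on the size of the input at all, which is exactly what "constant time" means.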
* Linear time. Example: Counting characters in a string.
To count the characters in "dfsljg" we iterate through each character and increment a counter for each character we see. If the length is n then we have to count n characters. This is "linear time."
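The character count can likewise be sketched in Python (my own minimal illustration):

```python
def count_chars(s: str) -> int:
    """Count characters by walking the string once.

    One counter increment per character: for input of length n this
    does n steps, i.e. linear time, O(n).
    """
    count = 0
    for _ in s:  # a single pass over the input
        count += 1
    return count

print(count_chars("dfsljg"))  # 6
```

Doubling the input length doubles the work, the defining feature of linear time.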
* Polynomial time. Example: Multiplying two n-digit numbers.
We all know the old grade-school algorithm. I hear Common Core doesn't teach that anymore and has a lot of bizarre new methods. I hear bad stories. But I'm assuming most readers learned multiplication the old-fashioned way.
If it's a 3-digit number by a 3-digit number, you have a total of 9 multiplications plus the addition at the end. If it's 4x4 you have 16 multiplications. In general you always have n^2 multiplications plus a big addition at the end. n^2 is a polynomial, and the addition at the end doesn't add much work. So multiplication is "polynomial time."
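The n^2 count of single-digit multiplications can be made concrete with a sketch of the grade-school algorithm (a hypothetical illustration, written for clarity rather than speed):

```python
def grade_school_multiply(a: str, b: str):
    """Multiply two numbers given as digit strings, the grade-school way.

    Every digit of a meets every digit of b, so two n-digit inputs cost
    n * n single-digit multiplications, plus the final addition: O(n^2).
    Returns (product, number_of_single_digit_multiplications).
    """
    result = 0
    ops = 0
    for i, da in enumerate(reversed(a)):      # i = place value of digit da
        for j, db in enumerate(reversed(b)):  # j = place value of digit db
            result += int(da) * int(db) * 10 ** (i + j)
            ops += 1
    return result, ops

print(grade_school_multiply("123", "456"))      # (56088, 9): 3x3 digits, 9 ops
print(grade_school_multiply("1234", "5678"))    # (7006652, 16): 4x4 digits, 16 ops
```

The `ops` counter exists only to exhibit the n^2 growth the text describes.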
Since n is a polynomial of first degree, we see that linear time is a subset of polynomial time.
In general, an algorithm that's polynomial or better is regarded as efficient; and anything worse is inefficient.
* Exponential time. Example: Brute-force guessing of an n-character password.
Suppose for simplicity a password must consist of n decimal digits. If the password length is 1, then we have to make up to 10 guesses, averaging around 5.
If the passwords are two digits long, we have to make 100 guesses in the worst case and around 50 in the average case.
In other words, for each extra character of input, the amount of work to guess the password goes up by a factor of 10. So the formula for the amount of guesswork is around 10^n. This is called "exponential time", and it is regarded as bad: the resource usage goes up so fast that even though we can solve the problem in theory, we haven't got enough resources to solve it in practice.
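A brute-force attacker can be sketched directly (my own toy illustration; the password and digit alphabet are made up):

```python
from itertools import product

def brute_force_guesses(password: str) -> int:
    """Count the guesses a brute-force attacker needs for an n-digit password.

    There are 10**n candidate passwords of length n, so the worst case
    multiplies by 10 with every extra digit: exponential time.
    """
    n = len(password)
    for guesses, attempt in enumerate(product("0123456789", repeat=n), start=1):
        if "".join(attempt) == password:
            return guesses
    raise ValueError("password must be a string of decimal digits")

print(brute_force_guesses("7"))   # 8: the eighth guess in 0..9
print(brute_force_guesses("99"))  # 100: the 2-digit worst case
```

Going from 2 digits to 10 digits takes the worst case from 100 guesses to 10 billion, which is the practical meaning of "exponential."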
That's complexity theory. My examples all involve the time domain, but we can also analyze algorithms in terms of how much space (or memory) they require. Complexity theory builds out all these different complexity classes and figures out which problems are in which class.
Why is computational complexity important?
* One reason is that performing computations on a physical computer consumes resources and takes time. We always want to be able to go faster, so it's important to find more efficient algorithms.
* P = NP. You may have heard of this famous problem, one of the Clay Millennium problems that pays a million dollars to the first person to prove it true or false, as the case may be.
P is the class of problems that can be solved in polynomial time by a deterministic Turing machine; NP is the class that can be solved in polynomial time by a nondeterministic TM. A nondeterministic TM essentially takes all possible paths at once. Deterministic and nondeterministic machines can solve exactly the same set of problems in principle, but the nondeterministic ones seem able to solve SOME problems more efficiently. We think. Nobody knows for sure.
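One way to get a feel for the P-versus-NP asymmetry (my own illustration, not from the post) is subset sum, a standard NP-complete problem: checking a proposed answer is cheap, but the only known general way to find one is to search an exponentially large space of candidates:

```python
from itertools import combinations

def verify(nums, target, subset) -> bool:
    """Verifying a proposed certificate is cheap: just sum and compare."""
    return sum(subset) == target

def find_subset(nums, target):
    """Finding a certificate by brute force may try all 2**n subsets."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return list(combo)
    return None  # no subset sums to the target

nums = [3, 34, 4, 12, 5, 2]
cert = find_subset(nums, 9)
print(cert, verify(nums, 9, cert))  # [4, 5] True
```

Verification is one addition per element (polynomial), while the search loop can visit 2^n subsets; whether that search can always be collapsed to polynomial time is exactly the P = NP question.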
* Quantum computers will break Internet security and cryptocurrencies. You may have heard or read this. Essentially all Internet and cryptocurrency security rests on public-key cryptography, much of which is based on the ancient mathematical problem of factoring integers into primes.
Since factoring on conventional computers is hard (believed to be not quite exponential, but still not polynomial), it takes an attacker too much time to decode an encrypted message.
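To see why classical factoring is painful, here is the naive approach, trial division (a sketch of my own; real attacks use far better algorithms such as the number field sieve, but even those are super-polynomial):

```python
def trial_division(n: int):
    """Factor n into primes by trying divisors up to sqrt(n).

    If n has d digits then n is roughly 10**d, so the loop can take
    around 10**(d/2) steps: exponential in the LENGTH of the input,
    which is why this is hopeless against RSA-sized numbers.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:  # whatever remains is itself prime
        factors.append(n)
    return factors

print(trial_division(1155))  # [3, 5, 7, 11]
print(trial_division(97))    # [97]: 97 is prime
```

The key point is that "hard" is measured against the number of digits of the input, not its magnitude.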
On the other hand, if we use a quantum computer, there's an algorithm called Shor's algorithm that can factor an arbitrary integer in polynomial time. This is truly an amazing fact.
If we eventually develop practical quantum computers (much farther off than the hype would have you believe IMO), then public key cryptography is defeated and all of Internet security is compromised.
* Going beyond all that, Scott Aaronson believes that computational complexity has deep philosophical import. That's what his article's about.

 Posts: 1872
 Joined: Wed Jul 08, 2015 1:53 am
 Location: Saskatoon, SK, Canada
Re: Why Philosophers Should Care About Computational Complexity
wtf wrote: ↑Wed Jun 12, 2019 10:20 pm Among all the tasks that are computable, some can be computed more efficiently than others. What do we mean by efficiency? In this case it means the growth rate of the resources consumed by a computation, as a function of the length of its input. That's the key idea. …
Thank you. I'm already aware of these concepts in computer science, but I had never seen them specified/defined as "computational complexity". I express the particular issue with respect to problems on an architectural level. One example I used at skepticforum.com (and possibly here back then) is how and why a computer program can't suffice as a test for the Monty Hall problem, as some think it can. So it CAN be a relevant potential part of discussion for 'natural philosophy' as to any limitations we may expect of a computer. However, beyond those of us who are already interested in the depths of computer science's relations with logic, I'm not sure this topic can be 'sold' through his paper without appealing to clear definitions and SAMPLES (as you at least provided here, for example) for the lay person within the GENERAL field of philosophy.

 Posts: 1872
 Joined: Wed Jul 08, 2015 1:53 am
 Location: Saskatoon, SK, Canada
Re: Why Philosophers Should Care About Computational Complexity
Also, if I'm confused, then what does this mean for the average person lacking a background in logic and/or computing?