Skepdick wrote: ↑Thu Feb 04, 2021 2:04 pm
I am not sure it qualifies as a scientific model, even if some scientists may use it.
There are models and MODELS. You have things that some scientists use but it is not especially rigorous or accepted widely. You also have models that are the model in a certain discipline.
Isn't that the criterion for all scientific models? Utility.
Sure, but there are degrees. Here we are in a philosophy forum, where we can look at the level of accuracy of a metaphor or model. We are not, here, generating research, where utility is the focus. We are trying to look at whether assertions hold, and to what degree, in which circumstances, etc.
I am not arguing semantics though - I am pointing out how they are functionally identical for the purposes of prediction.
OK, I disagree that they are functionally identical. That was part of the motivation for various assertions in my post.
Iwannaplato wrote: ↑Thu Feb 04, 2021 1:24 pm
It would be absurd to say we are not programmed like computers and this MUST mean one cannot learn, then.
If it's absurd then don't say it.
Iwannaplato wrote: ↑Thu Feb 04, 2021 1:24 pm
Of course it ain't just neurons. Glial cells are involved in processing not just as structural glue.
Whatever the resolution of your ontology, it's not as complex as factorial interconnection.
It's more complex than any computer. I don't know anything about factorial interconnection. I am comparing computers and minds, also learning and being programmed. For example, there need be no programmer for an incredible amount of human learning. We learn much more outside of conscious teaching than we do from teaching.
Iwannaplato wrote: ↑Thu Feb 04, 2021 1:24 pm
So right off we have two different systems...
Observe that "system" is an old word from Greek, long predating computers.
Obviously. That's what it means for a model to be incomplete.
Fine, but again: I was responding to the claim that if you think programming is not the same as learning, then you think people cannot learn.
Furthermore observe what you are DOING.
You are using your brain to understand your brain. That's self-application/recursion.
Welcome back to the land of computation.
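The self-application point can be sketched in a few lines of Python. (The function `describe` is invented purely for illustration; it is not from either poster.)

```python
def describe(f):
    # Report on whatever callable we are handed.
    return f"{f.__name__} examines a callable"

# Self-application: the examining tool is turned on itself,
# just as a brain is used to understand brains.
result = describe(describe)
print(result)  # -> "describe examines a callable"
```

The function takes a function as input, so nothing stops it from taking itself; that is the recursion being pointed at.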
You've got the cart before the horse. Just because we have managed to make things that do things that we do does not mean that therefore brains are computers.
Horses are bicycles because we can travel on them? Just because things that came after can do what things that were there before do, it does not mean the things that came before ARE the things that came after. Dolls aren't humans.
Oh, you talk to a doll, you have a doll family. See, humans are dolls.
Iwannaplato wrote: ↑Thu Feb 04, 2021 1:24 pm
Programming, if correct in the grammar, vocabulary, and order, will make the computer work, period, barring hardware problems. Not so with a child, let alone children.
You aren't using the "correct" grammar and vocabulary for your programming to work. You don't understand that child's language.
If you look later I talk about how vastly more complex learning is and depends on NONPROGRAMMED learning.
Iwannaplato wrote: ↑Thu Feb 04, 2021 1:24 pm
It sets in motion a causal chain not dependent on how the computer is feeling, what happened this morning, how much sleep the computer gets, who I remind the computer of, what happened with other computers, the computer's experience with other programmers
Yeah! Computers can behave non-deterministically at runtime! I don't understand why the computer is throwing a tantrum because I don't understand what happened to it in the last 24 hours to get into that state.
That's what debugging is for! To arrive at the precise sequence of events which caused the observed behaviour.
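The point about state-dependent behaviour can be shown with a toy sketch. (The `Service` class and its failure threshold are invented here, not anything from the thread.) Identical requests produce different outcomes depending on what the program has been through; debugging is reconstructing that history.

```python
class Service:
    """Toy service whose behaviour depends on its accumulated history."""

    def __init__(self):
        self.failures = 0  # state left behind by past events

    def handle(self, request):
        if self.failures >= 3:
            # "What happened in the last 24 hours" now determines
            # the response, not the current input.
            raise RuntimeError("circuit open: too many recent failures")
        if request == "bad":
            self.failures += 1
            return "error"
        return "ok"

s = Service()
for r in ["bad", "bad", "bad"]:
    s.handle(r)          # three failures accumulate silently
# The next call raises even though the request itself is fine:
# s.handle("good")  # RuntimeError
```

Same input, different behaviour: the difference lives entirely in the state the earlier events left behind.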
If you solve the problem with programming, then your model fits. If the problem was caused by non-programming issues and non-programmed learning, then your model does not hold. And most of our learning is not language based.
Iwannaplato wrote: ↑Thu Feb 04, 2021 1:24 pm
and man I could go on for hours with more categories.
And I could go on for hours mapping them to computational concepts.
with false generalizations.
Yep! Different hardware is different. Making "the same" software run on different hardware is called porting.
https://en.wikipedia.org/wiki/Porting
You can think of "Porting" as "translating your ideas in a language the other person can understand"
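That translation metaphor can be sketched directly. (The `greet_*` functions and platform names are made up for illustration only.) One idea is re-expressed in whichever convention the target understands:

```python
# Toy sketch of porting: one idea ("greet the user"), re-expressed
# per target platform's conventions.
def greet_posix(name):
    return f"hello, {name}\n"      # Unix-style line ending

def greet_windows(name):
    return f"hello, {name}\r\n"    # Windows-style line ending

def greet(name, platform="posix"):
    # The port picks the implementation; the idea stays the same.
    impl = {"posix": greet_posix, "windows": greet_windows}
    return impl[platform](name)
```

The high-level behaviour is unchanged; only its expression is adapted to the target.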
Though this does not take into account all the non-verbal learning (by that I mean any experiencing) that a child goes through.
What you seem to me to be doing is showing that you are happy to model the same way over and over, coming up with metaphors from computing for human behavior and processes. I know you have that tendency already. Are they actually covering the same ground, and well? That's up in the air.
Say you have a six-month-old child, and one day my programming does not function well. There is nothing in computers at all remotely comparable to: the child is autistic. Or the other parent is sexually abusing the child. Or the child has synaesthesia. Or the child is an impatient child. Or the child has gone through a hormonal change in its development. Or has, through some developmental shift, started to focus more on sounds than images - and not through anything I programmed, or the child's teachers, and not through any language interactions with that child. Now you may be able to come up with something in computers that we can create a metaphor out of that somehow connects with these, but in the computer world there is nothing remotely as important as these things are in the real world, and it would be facile to make the comparison. Maybe someday, when computers have strong experiential learning and strong developmental processes - something that is like puberty (and pain, sexuality, pleasure, and all the subtle processes that lead to Piaget-type changes/stages/foci) - then the model will be stronger.
IOW, once we have reverse-engineered ourselves and then created from that something much better than we have so far, and certain qualitative thresholds have been passed. And, IOW, one step further: these things we are trying to make like ourselves (but with vastly more calculating power) will have reached a level where their learning is like our learning in all ways, NOT just those that are like programming.
To say that something is "much more complicated" is pretty vacuous.
It's vague but not vacuous. And here we have machines that learn in some of the ways we do, and those ways can be analogous, in us, to programming; but we are more complicated and learn in ways that are not like programming and do not involve language. We are more complicated in the ways we learn (and develop our hardware), and this means that programming and human learning are not the same. So, it can be useful to use the model/analogy, but in a philosophy forum one can disagree with their utter equivalence without being absurd. Whereas, sure, if I went to a lab and told some AI guys that using the model was dumb, that would be absurd.
I can see that it's "much more complicated" than the vocabulary I have to talk about it. Which is how I know I don't understand it.
The limits of my language are the limits of my world. --Wittgenstein
Not mine. There are lots of things I have no words for that I experience. I might be able to point in their direction at best with words. And it is certainly not the case for babies, who are learning incredibly. The limits of their language are NOT the limits of their world. And that points to there being a difference between programming and human learning.
Except the programming they arrived with at birth, right? Else you are insisting that "learning how to classify things" is some sort of magic.
I wouldn't call it programming. That's using something less to describe something more. If you created a new word, larger than programming, to cover things like genetic communication, experiential learning AND programming, and also the changes via development, OK, I would be happy with that word. But when you keep using programming to cover things that are not programming, and that no programmer on earth is remotely capable of doing, I think you are misusing the word.
We could call it writing or communicating too.
I have a computational model...
Sure, I really did get that, that's the issue.
I don't know, but IF we live in a computer simulation then there is a programmer.
One level up, at least. But with the programming model there is a programmer. That's an enormous difference, not a quantitative one. If you presume one, well, OK. In fact, then the model fits nicely with intelligent design; intention is present. If not, then it is problematic not quantitatively, but qualitatively.
But if you add God into the mix, then the programming model becomes much better. Because then all my complaints about the inadequacy of programming can be taken care of. Since God, a vastly more talented programmer, using things that are prelingual to 'program' can be programming along with parents and teachers and the environment and pets and friends and artificial and natural things the child encounters.
Of course, then there is no reason to use a verb made up for the language instructions used to get computers to do things we want them to do.
We could use a broader verb.
And I want to emphasize the context. Here in a philosophical forum, not an AI lab, saying learning is programming, dismissing an objection as absurd, and saying the terms are the same, period, is problematic. That it can be used to metaphorically describe much learning, fine. That one can stretch it to cover things never done by a human programmer, not remotely, and involving non-linguistic processes of learning, despite the programmer using language - hm, that's stretching, and at the least it's a worthy discussion, and disagreeing is not absurd. Obviously I think mine is the stronger position. I think a word that covers a much more limited area, programming, is being used as a metaphor for something with processes not covered by programming, at least now. That can be very useful in certain contexts, but sure, get down into ontology and it is speculative at best, and works potentially only as a kind of theism, or at least a 13th-floor-type simulated-universe model.
How do we program for consciousness, pain, pleasure, desire?
'Things' deeply embroiled with learning in us. Oh, sure, maybe some day, those can be programmed by some programmer. Or maybe not.
But I think someone disagreeing now, before all that, has some grounds to say programming and learning are not the same. Or that brains are not computers (not now at least, and arguably not at all, since computers are things we are now trying to make like brains, and we haven't succeeded in many core categories).