For those of you interested in this sort of thing, J. R. Lucas wrote "Minds, Machines and Gödel" and The Freedom of the Will several decades ago, in which he argued that Gödel's Incompleteness Theorems would apply to any physical, deterministic system. Any such system has a Gödel sentence, which is essentially the proposition "This proposition is not provable within this system", and which is true but not provable (if it were provable, it would be false; and if it were false, it would be provable). Mechanistic systems cannot recognize their Gödel sentence as true, because they can only "comprehend" truth as a matter of provability. This is not the case for human beings: we can see that Gödel-type sentences are true, even though they are unprovable within their respective systems. Therefore, human beings cannot be explained in purely physical, deterministic terms. Nor could any merely physical system, such as a computer, duplicate the functions of the human mind, since it would not be able to recognize its Gödel sentence as true.
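For readers who want the logical skeleton spelled out, here is a minimal sketch, assuming a consistent system S strong enough for arithmetic; the notation (Prov_S is the usual provability predicate) is my own gloss on the argument, not Lucas's wording:

\begin{align*}
  &G_S \leftrightarrow \neg\mathrm{Prov}_S(\ulcorner G_S \urcorner)
    &&\text{(the G\"odel sentence asserts its own unprovability)}\\
  &\text{If } S \vdash G_S, \text{ then } S \vdash \mathrm{Prov}_S(\ulcorner G_S \urcorner), \text{ hence } S \vdash \neg G_S
    &&\text{(so a consistent } S \text{ cannot prove } G_S\text{)}\\
  &\text{Since } S \nvdash G_S, \ \neg\mathrm{Prov}_S(\ulcorner G_S \urcorner) \text{ holds, hence } G_S \text{ is true}
    &&\text{(true but unprovable within } S\text{)}
\end{align*}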
At his website, Lucas has most of his contributions to this debate, and the online journal Etica e Politica published several of the more important essays by both Lucas and his detractors. Interesting stuff.
Discuss this post at the Quodlibeta Forum
Sunday, June 06, 2010
4 comments:
I think that argument risks conflating "proof" with certain types of logic, & so as an argument I believe it will ultimately fail.
For example, let's suppose we have System X and a model of understanding that uses purely inductive reasoning. Its Gödel sentence, "This proposition is not provable within this system", is unprovable within that system so long as we stick with inductive reasoning. But as soon as we introduce other types of logic it becomes provable, & I think that would ultimately render the Gödel sentence as "This proposition is not provable within this system by inductive reasoning." But it is provable by other means, so the "unprovability" would be defeated by being contextualized, put within a certain scope as it were.
Take care & God bless
Anne / WF
This is a potentially good argument against the computer/mind analogy, but not against physicalism.
Just because something is material, it doesn't follow that it must operate on mechanistic principles.
But (playing devil's advocate here) if the mind isn't a computer then how does it perform mathematical operations?
I personally think that the mind/computer analogy is rather dubious anyway, on Darwinian grounds. After all, a computer is pretty much by definition a product of intelligent design, whereas a brain almost certainly is not. There's also an equivocation between multiple senses of the word "information" that's ubiquitous in the AI literature (i.e. between the sense in which we normally use it in conversation, and the alternative, observer-relative sense we talk about in computing).
Weekend Fisher: You say that once we introduce other types of logic, Gödel sentences become provable. But when you introduce other types of logic, you're changing the system; and the question is whether that system's Gödel sentence is provable within that system (the answer being no). Of course you can then take another step back and make a new system that could prove that Gödel sentence -- but then that system has a Gödel sentence of its own which is not provable within it.
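To put that regress in the same notation as the sketch above (again my own illustration, not Lucas's): if T is the original system and G_T its Gödel sentence, adding G_T as an axiom yields a new system T', and the incompleteness theorem applies again to T':

\[
  T' = T \cup \{ G_T \}, \qquad
  T' \vdash G_T, \qquad
  G_{T'} \leftrightarrow \neg\mathrm{Prov}_{T'}(\ulcorner G_{T'} \urcorner), \qquad
  T' \nvdash G_{T'}
\]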
Matko: Lucas only focuses his argument on physicalistic determinism. He rejects other types of determinism on other grounds, and he considers quantum indeterminacy to be a refutation of physicalistic determinism. So you're right, and you're not really disagreeing with Lucas.
Perplexed: Lucas is only arguing that the human mind is not mechanistic, insofar as we can see that our Gödel sentence is true even though it is not provable. When it comes to doing mathematics, the comparison to computer systems holds. It might seem insignificant that there is this one thing people can do that a computer couldn't, but it's enough to show that the former cannot be reduced to the latter.