After invoking Maude's Law, I'll answer this one because it is interesting and doesn't invoke any deities.
Forrest wrote:
Slamlander wrote:
Deterministic, in the software context, means code that is absolutely predictable, like ones and zeros. Quantum mechanics doesn't enter into it and is irrelevant. ALL digital systems have no choice but to be fully deterministic. Therefore, we cannot use them to build AIs. BTW, this was also the context of our previous discussion along these lines.
Analog neural nets do something that digital neural nets cannot: they admit infinite variation between any two values. Digital systems, because they are digital systems, have a finite and fixed number of values within the same range.
This is precisely what I mean by us using different notions of "deterministic": analog systems, in classical (pre-quantum) models, were still considered fully deterministic, in the sense that physics and philosophy (e.g. questions regarding free will, consciousness, etc.) use.
Other than the fact that I am still convinced that such mislaid concepts of determinism are still bending a knee towards distant Rome (those spouting non-determinism were once burnt at the stake as agents of Satan, the Lord of Chaos; please take that for the slightly-red herring it is).
Were I to write a decent paper on the subject, the core statement would be:
Quote:
Digital systems are mathematically discontinuous by definition. There is no way to implement a true function f(x) = y in a digital system such that y can take all possible values; there is only an approximation that yields strictly determined y values at strictly determined step-values of x, and intermediate values of x cannot yield anything other than those strictly determined y values.
Of course, a proper paper would require loads of mathematical proof of the above, and as I said, I don't have the time to write one. I would hope that I would not have to prove that discontinuous functions are not functions by definition, as one learns that in first-semester Calculus.
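As a minimal sketch of the quantization point above (in Python, with a purely hypothetical step size chosen for illustration): a digital implementation of a continuous function can only ever produce a finite, pre-determined set of output values, no matter how many intermediate inputs you feed it.

```python
# Hypothetical fixed-point step size (2^-3), purely for illustration.
STEP = 0.125

def f_digital(x):
    """Evaluate f(x) = x / 3, then quantize the result to the output grid."""
    y = x / 3
    return round(y / STEP) * STEP

# Sample 1001 intermediate x values in [0, 1]; the outputs collapse
# onto a small, strictly determined set of grid points.
outputs = {f_digital(i / 1000) for i in range(1001)}
print(sorted(outputs))  # far fewer distinct y values than inputs
```

Here 1001 distinct inputs produce only four distinct outputs, which is the sense in which the y values are "strictly determined" by the step grid rather than by the underlying continuous function.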
There is also the further statement:
Quote:
Such discontinuous functions are fully deterministic by their nature, because their output is strictly limited to the pre-determined results.
This is a more difficult proof, but it is possible; it would take even more work than the previous statement. Note the use of determinism here: it is not quite the same as the term used in the quantum world, and is indeed irrelevant to it. This would be a proof against the possibility of using digital systems to build an Artificial Intelligence. Essentially, most, if not all, AI workers realised this in the late '80s, and that's why work in the field has fallen off so rapidly. Hopes are now being pinned on work in neural nets, begun largely at MIT, as a possible source of a new computing model that would allow us to build an AI.
Forrest wrote:
I believe I asked this question before, but I'll ask again: in your sense of "deterministic", assuming quantum theory to be false (that is, assuming that the world is fully deterministic in the physical, philosophical sense), would weather patterns be a deterministic system or not?
Weather patterns are chaotic and therefore barely deterministic. They are like Three-card Monte, where the observer becomes sufficiently confused by the motions that they lose track of the queen. However, weather systems are analog systems, not digital ones. Indeed, one could argue against the efficacy of computational climate models on the basis of the arguments above, and I have indeed made such arguments. No amount of processing power will negate them.
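The "chaotic yet deterministic" combination can be sketched with the logistic map, a standard toy stand-in for weather-like dynamics (this is an illustrative analogy, not a claim about real climate models): two nearly identical starting states diverge wildly, yet re-running from the exact same state always reproduces the exact same trajectory.

```python
def logistic(x, steps, r=4.0):
    """Iterate the logistic map x -> r*x*(1-x), chaotic at r = 4."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = logistic(0.2, 50)
b = logistic(0.2 + 1e-10, 50)  # a tiny perturbation of the start state...
print(abs(a - b))               # ...ends up in a completely different state

# But the map itself is strictly deterministic: identical starts,
# identical trajectories, every time.
print(logistic(0.2, 50) == logistic(0.2, 50))
```

This is the Three-card Monte effect in miniature: the observer (or forecaster) loses track, even though nothing non-deterministic has happened.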
Forrest wrote:
Also, do you distinguish your sense of "indeterminism" from <A HREF="http://en.wikipedia.org/wiki/Nonlinear">nonlinearity</A>, or is the subject of the foregoing wiki link the sort of thing you're talking about?
That is an interesting question (THE interesting question). Frankly, I wasn't considering it, as the foregoing can attest. I would observe, however, that one cannot have any linear or non-linear effects when all the base functions are discontinuous. With proper analog neural nets, however, it is quite possible to have both, and indeed some of the neural-net effects appear to be non-linear; therein lies the hope for eventual Artificial Intelligence.
I might point out that there will probably be some point where quantum effects enter into the effort. One of the new computational models is the quantum computer. No, it isn't science fiction: early work in that area does exist (at IBM's Watson Labs and Hitachi Data Systems) and is showing considerable promise.