In that article, much is made of the fact that humans take longer to determine that "candy" and "candle" are different words than to determine that "jacket" and "candle" are different words. But computers show the same effect: on this computer, a program comparing "candle" and "jacket" for equality 100 million times took 0.8 seconds, while "candle" and "candy" took 1.6 seconds. There are no "curving mouse trajectories" to investigate here, but some microprocessors are designed to use speculative execution: if a condition looks likely to be true, and the first steps of the likely path would do no harm if it turns out to be false, the processor can start down that path before it knows for sure.
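A minimal sketch of that kind of test (not the exact program I timed, but the same idea, assuming a plain strcmp() loop) looks like this; the comparison can stop at the first character that differs, so the pair that agrees for longer takes longer:

    /* Timing sketch: compare two string pairs 100 million times each and
       report elapsed CPU time for each pair. */
    #include <stdio.h>
    #include <string.h>
    #include <time.h>

    static double time_compares(const char *a, const char *b) {
        volatile int sink = 0;             /* keep the loop from being optimized away */
        clock_t start = clock();
        for (long i = 0; i < 100000000L; i++)
            sink += (strcmp(a, b) == 0);
        return (double)(clock() - start) / CLOCKS_PER_SEC;
    }

    int main(void) {
        /* "candle"/"jacket" differ at the very first character, so each
           comparison exits immediately; "candle"/"candy" agree for four
           characters before differing, so each comparison does more work. */
        printf("candle vs jacket: %.2f s\n", time_compares("candle", "jacket"));
        printf("candle vs candy:  %.2f s\n", time_compares("candle", "candy"));
        return 0;
    }

The absolute numbers will vary by machine, compiler, and C library; the interesting part is the ratio between the two pairs. (An aggressive optimizer may fold the comparisons away entirely; compiling without optimization, or reading the strings from the command line, keeps the measurement honest.)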
From the scienceblog.com article:
"For decades, the cognitive and neural sciences have treated mental processes as though they involved passing discrete packets of information in a strictly feed-forward fashion from one cognitive module to the next or in a string of individuated binary symbols—like a digital computer," said Spivey.Nobody really thinks that human brains are actually binary computers. However, work like Turing's shows us that there is a general computational model (the so-called Universal Turing machine), and there is no more powerful model. Turing's machine isn't actually a binary computer, but finite binary computers are capable of emulating finite Turing machines. Of course, binary computers are also capable of simulating limited real-world processes. For example, binary computers have been used to simulate the weather, the H-bomb, biological reactions, the large-scale organization of the universe, and even quantum computers. The breadth and depth of these simulations will only increase in the future. As they are physical objects that behave according to natural laws, why would brains be impossible to simulate on a computer?
Binary computers are of course composed of physical objects, little pieces of metal, semiconductor, and insulator arranged in various ways to form the basic building blocks: resistors, capacitors, and transistors. These are all analog devices: a capacitor can store a continuously variable amount of charge, and a transistor can be turned off, fully on, or anywhere in between.
"Logic gates" (a class of electronic component that is intended to be treated as a binary device) can be formed by combining transistors and resistors, and computer memory can be made by combining transistors and capacitors. When designers combine the building blocks into a larger device, such as a memory chip or a microprocessor, they are very careful to combine them in ways that let the users of the chips treat them as purely digital, but you don't have to.
In fact, one researcher took a kind of reprogrammable chip (an FPGA) and used "evolution" to create a circuit that could perform a specific task. Like the microprocessor in your computer, these chips are designed to be used as digital devices, but in his experiment evolution found a different, non-digital way to use the same building blocks:
When he looked at the final circuit, [Adrian] Thompson found the input signal routed through a complex assortment of feedback loops. He believes that these probably create modified and time-delayed versions of the signal that interfere with the original signal in a way that enables the circuit to discriminate between the two tones. "But really, I don't have the faintest idea how it works," he says.

(More of Adrian Thompson's work is here.)

One thing is certain: the FPGA is working in an analogue manner. Up until the final version, the circuits were producing analogue waveforms, not the neat digital outputs of 0 volts and 5 volts. Thompson says the feedback loops in the final circuit are unlikely to sustain the 0 and 1 logic levels of a digital circuit. "Evolution has been free to explore the full repertoire of behaviours available from the silicon resources," says Thompson.
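The overall shape of such an experiment is easy to sketch, even though all the interesting behavior lives inside the fitness measurement on the real silicon. In the sketch below, every concrete detail (configuration size, population size, mutation rate, and especially the stand-in fitness function) is invented; in the real experiment, scoring a candidate meant programming it onto the FPGA and measuring how well it told the two tones apart.

    #include <stdio.h>
    #include <stdlib.h>

    #define BITS 1800     /* made-up size of one candidate configuration */
    #define POP  50       /* made-up population size */
    #define GENS 100

    /* Stand-in for "program the chip and measure its behaviour"; in the real
       experiment this is where the analog physics of the silicon comes in. */
    static double fitness(const unsigned char *cfg) {
        int ones = 0;
        for (int i = 0; i < BITS; i++) ones += cfg[i];
        return -abs(ones - BITS / 3);     /* arbitrary toy objective */
    }

    int main(void) {
        static unsigned char pop[POP][BITS];
        srand(1);
        for (int i = 0; i < POP; i++)
            for (int j = 0; j < BITS; j++) pop[i][j] = rand() & 1;

        for (int g = 0; g < GENS; g++) {
            /* find the best candidate of this generation */
            int best = 0;
            for (int i = 1; i < POP; i++)
                if (fitness(pop[i]) > fitness(pop[best])) best = i;

            /* next generation: mutated copies of the winner */
            for (int i = 0; i < POP; i++) {
                if (i == best) continue;
                for (int j = 0; j < BITS; j++) {
                    pop[i][j] = pop[best][j];
                    if (rand() % 1000 < 5) pop[i][j] ^= 1;   /* ~0.5% mutation */
                }
            }
            if (g % 20 == 0)
                printf("gen %3d  best fitness %.0f\n", g, fitness(pop[best]));
        }
        return 0;
    }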
I'm in the camp that firmly believes there's nothing magical about what the meat in our skulls does; it can probably be done by digital computers, and if not, it can be done by "freeing" our computers to operate at least partially in the analog realm. When someone says "Clearly computers can never ..." (fill in the blank), I suspect it's more because of our lack of facility at writing computer software than because of some limitation inherent in the transistor or in the binary representation of integers.
If we do manage to create machine intelligence, I don't think it's likely to pass the Turing Test, though. Lots of things about the way humans act are products of our long evolution. If we design intelligence from the ground up, there's no reason to deliberately add things like the ability to become addicted to chemical substances or gambling; if we evolve intelligence the way Adrian Thompson evolved his FPGAs, we'll get an intelligence that is quirky and probably prone to mental illness (just like we are), but in ways that only underscore how alien those metal minds would be.
This is a revised and expanded version of a message I posted last year on MetaFilter.