This is a useful summary comparison between digital computers and the brain: "Reported estimates of how much data the brain holds in long-term memory range from 100 megabytes to 10 exabytes—in terms of Thriller on MP3, that’s either one album or 100 billion albums. This range alone should give you an immediate sense of how seriously to take the estimates." Here's the good stuff:
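The quoted ratio is easy to check. A minimal sketch, assuming decimal units (1 MB = 10^6 bytes, 1 EB = 10^18 bytes) and treating one MP3 album as roughly 100 MB, which is what the Thriller comparison implies:

```python
# Sanity check on the quoted range: 10 exabytes vs. 100 megabytes.
# Assumes decimal units and ~100 MB per MP3 album.
low = 100 * 10**6        # 100 MB in bytes (one album)
high = 10 * 10**18       # 10 EB in bytes
ratio = high // low
print(ratio)             # 100_000_000_000, i.e. 100 billion albums
```

Eleven orders of magnitude between the low and high estimates, exactly as the quote says.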
The fundamental difference between analog and digital information is that analog information is continuous and digital information is made of discrete chunks. Digital computers work by manipulating bits: ones and zeroes. And operations on these bits occur in discrete steps. With each step, transistors representing bits switch on or off. Jiggle a particular atom on a transistor this way or that, and it will have no effect on the computation, because with each step the transistor’s status is rounded up or down to a one or a zero. Any drift is swiftly corrected.
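That rounding-each-step mechanism can be sketched in a few lines. This is a toy model, not a circuit simulation: each step, an analog perturbation ("jiggled atoms") nudges a bit's value, and a threshold function snaps it back to a clean 0 or 1, so the noise never accumulates.

```python
import random

def restore(voltage, threshold=0.5):
    """Round an analog level back to a clean digital 0 or 1."""
    return 1.0 if voltage >= threshold else 0.0

bit = 1.0
for _ in range(1000):
    bit += random.uniform(-0.3, 0.3)  # small analog drift each step
    bit = restore(bit)                # snapped back to 0 or 1 each step
print(bit)  # still 1.0: sub-threshold noise can never flip the bit
```

Because the drift (at most 0.3) stays below the 0.5 threshold, a thousand steps of jiggling leave the bit exactly where it started, which is the point of the paragraph above.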
On a neuron, however, jiggle an atom this way or that, and the strength of a synapse might change. People like to describe the signals between neurons as digital, because a neuron either fires or it doesn’t, sending a one or a zero to its neighbors in the form of a sharp electrical spike or lack of one. But there may be meaningful variation in the size of these spikes and in the possibility that nearby neurons will spike in response. The particular arrangement of the chemical messengers in a synapse, or the exact positioning of the two neurons, or the precise timing between two spikes—these all can have an impact on how one neuron reacts to another and whether a message is passed along.
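The contrast with the transistor sketch can be made concrete with a standard textbook leaky integrate-and-fire model (not anything from the quoted article; the parameter values here are illustrative, not physiological). A tiny change in synaptic strength, or in the spacing between incoming spikes, flips whether the downstream neuron fires at all.

```python
def fires(weight, spike_times, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire toy: True if the membrane potential
    ever crosses threshold given incoming spikes of a given strength."""
    v = 0.0
    for t in range(max(spike_times) + 1):
        v *= leak                  # potential decays ("leaks") each step
        if t in spike_times:
            v += weight            # each incoming spike adds its strength
        if v >= threshold:
            return True
    return False

print(fires(0.37, [0, 1, 2]))  # True: three closely spaced spikes sum past threshold
print(fires(0.36, [0, 1, 2]))  # False: a slightly weaker synapse, no output spike
print(fires(0.37, [0, 2, 4]))  # False: same strength, wider spacing, potential leaks away
```

Unlike the transistor, there is no step that rounds the membrane potential back to a canonical value, so an atom-scale nudge to synaptic strength or spike timing can change the outcome.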
Plus, synaptic strength is not all that matters in brain function. There are myriad other factors and processes, both outside neurons and inside neurons: network structure, the behavior of support cells, cell shape, protein synthesis, ion channeling, vesicle formation. How do you calculate how many bits are communicated in one molecule’s bumping against another? How many computational “operations” is that? “The complexity of the brain is much higher at the biochemical level” than models of neural networks would have you believe, according to Terrence Sejnowski, the head of the Salk Institute’s Computational Neurobiology Laboratory. “The problem is that we don’t know enough about the brain to interpret the relevant measurement or metric at that level.”
There's more at the link.