The power of the human brain.

by metacognizant

My dad is a highly skilled computer geek. He manages what’s called the “performance lab” at an esteemed computer company. Many big-name companies, such as EA Games, Microsoft, Google, and Norton Antivirus, among others, use his company’s product, which he is in part responsible for creating in its current form. The “performance lab” that my dad runs exists to seek out ways to make their product run faster, and this can occasionally include improving the computers that run my dad’s company’s product. Because of this, my dad was recently asked by his company to attend an extremely large conference on supercomputing, in order to glean some ideas on how to speed up their not-so-super computers.

At this supercomputing conference, some of the most powerful supercomputers in the world were on display. Computing speed is measured in floating point operations per second (hereafter, FLOPS); a floating point operation is a single calculation done by a computer. It’s more technical than this in practice, but for purposes of this blog, consider a floating point operation something like 1.5 + 2.5 = 4 (for an actual floating point operation, see here).

Most of us are familiar with the prefixes mega, giga, and tera. Hard drives nowadays can hold terabytes of information, but I’m sure that most of us can remember when the max they could hold was megabytes. Measuring FLOPS uses these same prefixes. A megaFLOPS is a million floating point operations per second, a gigaFLOPS is a billion, and a teraFLOPS is a trillion. Now, the supercomputers today, the ones my dad saw at this conference, run at petaFLOPS speeds: that’s right, over one quadrillion floating point operations per second! What a feat of modern computing, engineering, and civilization! That this many calculations can be done each second should boggle anyone’s mind.
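To make those prefixes concrete, here is a minimal sketch in Python of how the units scale. The 1.5-petaFLOPS machine is an assumed figure of my own for illustration, not a number from the conference:

```python
# Each prefix is a factor of 1,000 over the previous one.
MEGA = 10**6    # megaFLOPS: one million floating point operations per second
GIGA = 10**9    # gigaFLOPS: one billion
TERA = 10**12   # teraFLOPS: one trillion
PETA = 10**15   # petaFLOPS: one quadrillion

# A hypothetical 1.5-petaFLOPS supercomputer (an assumed figure).
machine = 1.5 * PETA

print(f"{machine / PETA:g} petaFLOPS")      # 1.5 petaFLOPS
print(f"{machine / TERA:,.0f} teraFLOPS")   # 1,500 teraFLOPS
print(f"{machine / GIGA:,.0f} gigaFLOPS")   # 1,500,000 gigaFLOPS
```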

But the mind might have something to say about that. Nature doesn’t like to be outdone.

Neurons are complex. I’ll include a good diagram of a neuron here before I continue:

[Diagram: a neuron, showing the cell body (soma), dendrites, axon, and synapse]

Synaptic connections are the bridges between two neurons (two individual brain cells), and these connections are rather complex in and of themselves. When an “action potential” (an electrical charge that has built up to a certain level and is capable of making a neuron release neurotransmitters) travels down a neuron’s “axon” (a long part of the neuron that begins at the cell body [the main part of the cell, see previous image] and ends at the cell’s synapse [see previous image]), its charge releases neurotransmitters stored in “vesicles” (small containers of neurotransmitters) out of the neuron and across the “synaptic cleft” toward the “dendrites” of the “post-synaptic neuron.” What happens when a neuron “fires” is displayed below:

[Diagram: neurotransmission at the synapse, in four steps]
And neurons are even more complex than this. The above diagram depicts neurotransmission in four steps, but properly speaking there are at least seven. In addition, there are 15 major types of neurotransmitters, and more than 50 types in total. Moreover, each neurotransmitter has its own receptor, which (barring special exceptions) only it can activate. Further, neurotransmitters are either excitatory or inhibitory. If a neurotransmitter is excitatory, then when it binds to its receptor on the post-synaptic neuron it allows an influx of positively charged ions; if it is inhibitory, binding to its receptor allows an influx of negatively charged ions. These charges travel down the dendrites of the post-synaptic neuron to the soma, or cell body. In the soma, all of the positive and negative charges are summed. If the overall charge reaches the “threshold” (the charge level at which the neuron must fire), the neuron sends an action potential down its axon. If the overall charge does not reach the threshold, the neuron does not fire.

Each neuron, then, is constantly calculating in its soma; it’s tallying up charges in order to decide whether or not to fire an action potential. And it gets even crazier. In addition to constantly summing positive and negative charges, it gives priority to charges that arrive at places on its dendrites closest to the cell body (farther-out charges lose a little of their punch on the way in), and it gives priority to charges that arrive close together, either at the same spot on its body or at the same time (what neuroscientists call spatial and temporal summation). So we have calculations over two kinds of input, positive and negative charges, with immense amounts of prioritizing layered on top.
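To get a feel for the tallying described above, here is a minimal sketch of a heavily simplified “integrate-and-fire” style neuron in Python. The threshold, decay factor, and input values are assumptions of mine chosen purely for illustration; a real neuron is vastly more complicated:

```python
# A toy "integrate-and-fire" neuron. All numbers are illustrative assumptions.
THRESHOLD = 1.0                # charge level at which the neuron must fire
DECAY_PER_UNIT_DISTANCE = 0.9  # farther-out inputs lose some of their punch

def membrane_charge(inputs):
    """Sum excitatory (+) and inhibitory (-) inputs at the soma.

    Each input is (charge, distance_from_soma): charge is positive for
    excitatory neurotransmitters and negative for inhibitory ones, and
    distance attenuates it, giving closer synapses priority.
    """
    return sum(charge * DECAY_PER_UNIT_DISTANCE ** distance
               for charge, distance in inputs)

def fires(inputs):
    """Fire an action potential only if the summed charge reaches threshold."""
    return membrane_charge(inputs) >= THRESHOLD

# Three excitatory inputs and one inhibitory input, at varying distances.
synaptic_inputs = [(0.6, 1), (0.5, 2), (0.4, 5), (-0.3, 1)]
print(membrane_charge(synaptic_inputs), fires(synaptic_inputs))
# ~0.911, False: this particular mix of inputs falls short of threshold
```

Real neurons also integrate inputs over time, so charges arriving together reinforce one another; the point of the sketch is just that each neuron performs a weighted calculation rather than acting as a simple on/off switch.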

Moreover, it’s doing these calculations at a ridiculous rate. Each neuron makes between 5,000 and 200,000 synaptic connections with other neurons! (Here is the source for that statistic, as it can be a hard one to find.) And on top of that, each neuron fires between 1 and 1,000 times per second, with the average firing rate being between 300 and 400 times per second! Combined with the fact that each neuron is weighing two different kinds of input and prioritizing among them, this rate of calculation should, indeed, be mind-blowing!
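Putting those two figures together gives a rough back-of-envelope sense of the traffic a single neuron handles. This is my own illustrative arithmetic using the ranges quoted above, not a rigorous measure of “operations” in the computing sense:

```python
# Back-of-envelope: synaptic events per second for one neuron,
# using the ranges quoted above. Illustrative arithmetic only.
synapses_low, synapses_high = 5_000, 200_000
firing_low, firing_high = 300, 400   # average firing rate, per second

low = synapses_low * firing_low      # 1,500,000 events/sec
high = synapses_high * firing_high   # 80,000,000 events/sec
print(f"{low:,} to {high:,} synaptic events per second, per neuron")
```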

It is for these reasons and more that one expert claimed that the most powerful supercomputer does not have the information-processing capacity of a single neuron (Jonscher, cited in Rita Carter 2002, p. 118). Admittedly, the most powerful supercomputer of 2002 is not as powerful as the most powerful supercomputers now, though it was nearly as strong; and the neuron may still have more information-processing power than the most powerful supercomputer today, but I’m not qualified to make that judgment. Wow. [Addendum: upon questioning, I cannot determine the reason that Jonscher made this statement. That does not by itself make the statement invalid. While the complexities of each individual neuron’s communication with others are enormous, I cannot mathematically quantify them, and every attempt I’ve seen to do so falls short of a supercomputer’s power (although, admittedly, I have only seen simplistic models of a neuron). Nevertheless, Jonscher has a Ph.D. from Harvard and specializes in economics related to artificial intelligence, and from reading his works you can see that he has his neuroscience down pat and a masterful ability with statistics (as any Ph.D. in economics should). Additionally, his statement was endorsed by Rita Carter, author of Exploring Consciousness and Mapping the Mind. Because of this, I’m inclined to trust his statement, but I cannot give an account of why he makes it. Accept it at your own discretion.]

But it doesn’t stop there. The brain is even more amazing than that. The average brain has more than 100 billion neurons, according to the Scientific American Book of the Brain. 100 billion neurons, and every single one of them with more information-processing power than some of the most powerful supercomputers in the world!
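If you scale the low-end per-neuron estimate from above by that neuron count, you get a crude figure for the whole brain. Synaptic events are not floating point operations, so this is an apples-to-oranges comparison offered only for a sense of scale, with all numbers taken from the figures quoted in this post:

```python
# Crude scale comparison: brain-wide synaptic events/sec vs. a petaFLOPS
# machine. Synaptic events are NOT floating point operations; this is
# illustrative arithmetic only.
neurons = 100e9                # ~100 billion neurons
events_per_neuron = 1.5e6      # low-end per-neuron estimate from above
peta = 1e15                    # a 1-petaFLOPS machine

brain_events = neurons * events_per_neuron
print(f"{brain_events:.1e} synaptic events/sec")             # 1.5e+17
print(f"~{brain_events / peta:.0f}x a 1-petaFLOPS machine")  # ~150x
```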

But it gets better. A study cited in Kolb and Whishaw’s An Introduction to Brain and Behavior, 2nd edition, looked at the possible arrangements of neuronal connections among 11 different areas in the brain’s frontal lobe. They calculated 39.9 million possible arrangements, and the actual arrangement turned out to be optimal. That blows my mind. Out of all the possible arrangements of those areas, the brain makes no waste or excess: connection times are cut to a minimum and no extra axons are needed. So not only does each neuron have more information-processing power than some of the most powerful supercomputers, but we have over 100 billion of them in our brains, and they’re arranged optimally for maximum efficiency of information-processing!
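Where does 39.9 million come from? I’d note (my own observation, not something stated in Kolb and Whishaw) that it matches the number of possible orderings of 11 items, 11 factorial, which is what you would count if you enumerated every way of arranging connections among 11 areas:

```python
from math import factorial

# 11 brain areas can be ordered in 11! ways; this matches the
# "39.9 million possible arrangements" quoted above.
print(factorial(11))  # 39916800, i.e., about 39.9 million
```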

And this mind-blowing process has to occur for us to have our minds blown when we take a step back and look at the process. We truly are fearfully and wonderfully made.
