Biological limits to information processing in the human brain
P. Cochrane, C. S. Winter, and A. Hardwick

The human brain is a product of Darwinian evolution and as such it has evolved from a set of underlying structures that constrain its ultimate potential. A combination of the physical size of the dendrites, axons and the associated blood vessels, and therefore their related signal space, limits the amount of information the brain can effectively store and process. By analysing the inter-relationship of these key constraints we have shown that:
Thus we contend that the human brain is at, or near, the capability limits that a neuron-based system allows. This implies that our future evolutionary potential is limited and that, as a species, Homo sapiens may be near the pinnacle of intelligence achievable with current cellular carbon technology.
As one example of a simple structural question that invites analysis, consider the connectivity of the neurons in the cortex medulla. The degree of connectivity is often taken to be an essential feature of a complex intelligent system. But it is not obvious why the brain has evolved the particular connectivity that it possesses - a binary connectivity would be sufficient to access all the data, and full interconnectivity would give faster processing. In this paper we show that, when the combination of processing time at synapses, transmission speed on the axons and density of the components are considered as a whole, the particular connectivity observed is near optimal for processing performance. However, as a model, it appears to be neither a definitive guide for designing other systems, nor a prerequisite of an intelligent system. A further conclusion we draw is more radical. Looking at the brain size and structure in whales and dolphins, we see man as representing a probable limit of evolvable, efficient brain power using current cellular technology.
Biological organisms function as information processors; they take in information about the environment, process it, and then use the information to locate the energy sources necessary for survival. They are driven, therefore, by entropy processing. The more efficiently organisms process and extract information from the environment, the more successfully they, and their offspring, can continue their existence. These organisms are perpetuated at the expense of less efficient entropy-engines. The last two billion years of information processing have been driven by carbon-based molecular systems, which have evolved through a combination of random mutation and selection. Homo sapiens arose through this molecular-based Darwinian evolution. The future evolution of Homo sapiens depends on understanding that living creatures are information (and subsequently order) processors; that is, consumers of entropy rather than just energy. This implies that systems that are more efficient at information processing may one day supplant Homo sapiens. Indeed, for task-specific applications this is axiomatic, exemplified by the difficulties even the world chess champion, Kasparov, had with Deep Blue. At one stage it was believed that such open-ended games as chess would always be beyond the limitations of computers, but no longer!
These observations lead to a further interesting conclusion. If the ability of an organism to process information about its environment is a driving force behind evolution - that is, if there is evolutionary pressure to evolve better brains to survive - then genetic engineering and other biological options will not help if our brain is inherently limited by its architecture and operational modes. The next step in 'evolution' would then be to appropriate silicon as the intelligence medium and/or derive a collective consciousness through the networking of our wetware. Future evolution would then be driven by mechanisms and forces radically different from those of nature. With (or without?) our help, Darwinian evolution from carbon to silicon could lead to the generation of new species based on a carbon-silicon mix.
The limits of neuronal control
There are a number of related mechanisms that obviously interact to limit brain size:
These factors are now addressed in two sections dealing with thermal and density/transmission/processing limitations. Inertial shock is not dealt with in detail as it appears to be the least important of the six factors listed.
Thermal size limits to the human brain
Passive Versus Active Cooling
Blood Temperature, Flow Rate and Volume
When the volume of a human body is compared to those of larger mammals, such as mammoths and whales, it can be seen that large mammalian hearts can pump far greater volumes of blood than the human equivalent. It is unlikely, therefore, that the rate of blood flow required from the heart is a serious limit. In a human, blood usually flows faster in the wider blood vessels than in the narrower ones. To ensure that the speed never exceeds the limit for a mammal, we assume it to be constant throughout. This puts our initial model on the optimistic side in terms of forming an estimate and, for determining the importance of thermal restrictions, it certainly yields an upper bound.
To estimate the extra pressure needed when the level in the branching hierarchy is increased from n to n+1 in a homogeneous brain model, one can simply consider putting together two half-volume brains (of radius Rn) and adding the necessary extra blood vessels (Fig 1). This can then be reshaped to resemble an n-level brain but of double the volume. The new brain radius, Rn+1, will therefore be Rn+1 = 2^(1/3) Rn. Note that this does not represent the neural structure of a human brain, which is highly inhomogeneous; the model is used only to calculate the blood supply requirements. Neither does it represent the growth of blood vessels in a growing foetus, where extra branches are made at the smallest level and then all the blood vessels increase in size; the resulting structure, however, is the same, which is all that matters for the calculation's results to be valid.
The point where the main arteries of the two n-level brains are joined is now in the middle of the (n+1)-level brain, so an extra artery of length R is needed to take the blood to that point (Fig 1). Similarly an extra vein will be needed. The radius of the new tubes will need to be rn+1 = 2^(1/2) rn, where rn is the radius of the largest tubes in an n-level brain (Fig 2), in order to supply the required flow of blood.
The pressure difference across this extra length of blood vessel consists of the separation loss at the point of branching and the viscous drag in the tube.
A separation loss is the pressure needed to overcome an obstruction to fluid flow such as a junction. Taking the rather extreme case of a tee-junction, the loss will be 1.8ρv^2, where ρ is the blood density. The blood speed will be about 0.1 m/s or less, so the separation loss will typically be less than 18 Pa for each level of branching. A human heart produces around 100 mmHg = 13 kPa, so the separation losses are negligible.
The viscous drag (qn) can be calculated from the dimensions of the tube, the blood speed (v) and the viscosity (η) by approximating the flow to laminar flow in a straight cylinder and using Poiseuille's formula. The result is qn+1 = 8ηv Rn+1 / rn+1^2.
For each successive level in the branching hierarchy, this drag term decreases by a factor of 2^(2/3), so the total drag through all the levels of branching cannot exceed q0 / (1 - 2^(-2/3)) ≈ 2.7 q0.
The quantity q0 is the pressure across the smallest blood vessels, the capillaries, which is about 20 mmHg = 2.7 kPa - so the total viscous drag does not exceed about 7.2 kPa regardless of the size of the brain. There is therefore little need for a greater blood pressure just because a brain is increased in size.
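As a minimal numeric sketch of this bound, using the figures quoted above: the drag term added at each successive branching level shrinks by a factor of 2^(2/3), so the total over all levels is a convergent geometric series.

```python
# Sketch of the viscous-drag bound derived above (values from the text).
q0 = 2.7e3                      # pressure across the capillaries, Pa (~20 mmHg)
ratio = 2 ** (-2 / 3)           # drag contribution ratio between successive levels

bound = q0 / (1 - ratio)        # limit of the series over arbitrarily many levels
partial = sum(q0 * ratio ** n for n in range(40))  # finite sum approaches the bound

print(f"total viscous drag bound: {bound / 1e3:.1f} kPa")  # ~7.3 kPa
```

The bound sits comfortably under the ~13 kPa the heart supplies, whatever the brain size.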
Blood Vessel Volume
Doubling the brain radius increases its volume eight times but it also necessitates an extra artery and an extra vein to take the blood into and out of the centre of that conglomeration (Fig 4).
Let Rn, Vn, Bn and rn be the brain radius, the brain volume, the total blood volume and the radius of the largest blood vessels in a brain with n levels of branching. The doubling of the radius when eight similar units are combined implies Rn+1 = 2Rn and Vn+1 = 8Vn. The extra tubes needed to carry away the blood from the eight-way join to the outside of the brain need a cross-sectional area 8 times that of the vessels in the previous level in the hierarchy, so rn+1 = 2^(3/2) rn. Therefore
Rn = 2^n R0, Vn = 2^(3n) V0 and rn = 2^(3n/2) r0.
The total blood vessel volume in a brain with n levels of eightfold branching is eight times that in a brain with n-1 levels, plus the volume of the extra blood vessel which is needed, i.e.

Bn = 8Bn-1 + π rn^2 Rn
Recursively substituting in for Bn-1, Bn-2 etc. (taking B0 = π r0^2 R0) shows that Bn = 8^n (2^(n+1) - 1) π r0^2 R0.
The ratio of blood volume to brain volume, Hn, is given by:
Hn = Bn/Vn = (2^(n+1) - 1) π r0^2 R0 / V0.
When the number of levels is increased by one this ratio increases by a factor:
Hn+1 / Hn = (2^(n+2) - 1) / (2^(n+1) - 1)
As n gets large, this factor tends rapidly to 2, i.e. the ratio of blood vessel volume to brain volume doubles with each doubling of the brain radius.
In a normal-size human brain, about a fourteenth of the volume is taken up with blood, so the radius could only be doubled three times before the blood supply uses up half the cranial space. This limits the increase in processing brain volume to about 250 times its present value. This is the dominant thermal limit.
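This limit can be sketched numerically from the derived result that the blood-to-brain volume ratio roughly doubles with each doubling of the radius, starting from the quoted 1/14 fraction:

```python
# Sketch of the blood-fraction limit above (starting fraction from the text).
H0 = 1 / 14                            # present blood fraction of brain volume
for n in range(4):
    blood = H0 * 2 ** n                # blood fraction after n radius doublings
    processing = 8 ** n * (1 - blood)  # usable volume relative to today's brain
    print(f"{n} doublings: blood {blood:.0%}, processing volume x{processing:.0f}")
```

At three doublings the blood exceeds half the cranial space, with a processing volume roughly 220 times the present value - of the same order as the ~250-fold figure above.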
Interim Conclusions 1
Signal processing limits
Definitions and Background
To calculate intelligence, we treat the brain as a control system rather than a computational system. The efficiency of the brain is then measured by the minimum time needed to process a signal. This means comparing the incoming signal with all possible related information (anything less is partial processing). Thus, in theory, information pulses must be able to interact with any point of memory, or synapse. The transit time for a pulse to fan out to all possible neurons is effectively the time taken to traverse the extremities of the brain (in any direction).
For ease of analysis we make the assumption that all 10^14 synapses function as memory elements, remembering some state information about past actions. The sampling time is taken to be limited by the width of the ionic pulse travelling along a neuron. The brain effectively runs asynchronously at a bit rate of about 100 bit/s. Although not used here (since we are not comparing systems), the input bit rate is the sum over all the sensory nerves. This is dominated by the eye, which has about 127M rods and cones concentrated down to about 1M neurons, and thus gives rise to an input bit rate of about 100 Mbit/s. The rest (sound, tactile, taste and smell) add up to no more than one tenth of this figure.
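The input-rate arithmetic above can be checked directly (figures from the text):

```python
# Quick check of the quoted sensory input bit rate.
optic_neurons = 1e6        # ~127M rods and cones concentrate down to ~1M neurons
rate_per_neuron = 100      # bit/s per neuron, set by the ~10 ms ionic pulse width
visual_rate = optic_neurons * rate_per_neuron

other_senses = visual_rate / 10   # sound, tactile, taste and smell: at most a tenth
print(f"visual: {visual_rate / 1e6:.0f} Mbit/s, "
      f"total <= {(visual_rate + other_senses) / 1e6:.0f} Mbit/s")
```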
The task of any intelligent control or processing system is not computational arithmetic - it is memory comparison. Our brain has to compare the current inputs with as many memory states as possible to obtain the best possible reaction to the environment. The limit to intelligence therefore lies in an ability to correlate all synaptic outputs in a minimal time.
Transmission speed-size relationship
The diameter of a nerve cell determines the maximum conduction speed of an ionic pulse. To find the transmission speed limits, the first step is to analyse the distribution of charge along the axon from a local injection of current, and the consequent decay length of the pulse:
l = [rm / (ri + ro)]^0.5 .........(1)

where: l = the consequent decay length of the pulse
rm = the membrane resistance
ri = the internal axial resistance
ro = the axial resistance of the external medium
and ro << ri
Both rm and ri scale with changes in the radius, R, but whereas the membrane resistance is a surface property, decreasing as 1/(2πR), ri scales as 1/(πR^2). Thus doubling the radius halves the membrane resistance, rm, but quarters the internal resistance, increasing the propagation distance by a factor of about 1.4. The trade-off for this increase is that the volume has increased fourfold.
This local potential depolarises the membrane past threshold further down the axon, causing the action potential to advance. The further the local depolarisation extends, the faster the propagation, so quadrupling the diameter of an axon doubles its speed. This is used to great effect in the giant squid, where the nerve axons have diameters as large as 1 mm and achieve propagation speeds of 20 m/s at 20 °C. Typical values of l are 2 mm in skeletal muscle and about 100 µm in small axons.
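The scaling argument above can be sketched directly: with rm falling as 1/(2πR) and ri as 1/(πR^2), the decay length l = sqrt(rm/ri) (neglecting ro) grows as sqrt(R). The unit constants below are placeholders, not physiological values.

```python
# Sketch of the decay-length scaling derived above.
import math

def decay_length(R, rm0=1.0, ri0=1.0):
    """Relative decay length for axon radius R (unit constants are placeholders)."""
    rm = rm0 / (2 * math.pi * R)     # membrane resistance per unit length
    ri = ri0 / (math.pi * R ** 2)    # internal axial resistance per unit length
    return math.sqrt(rm / ri)

ratio = decay_length(2.0) / decay_length(1.0)
print(f"doubling the radius scales the decay length by {ratio:.2f}")  # ~1.41
```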
To improve the speed further it is necessary to find a way to simultaneously lower the membrane capacitance (Cm), which also slows speed, reduce ri, and increase rm. The evolutionary route to achieving this was to myelinate the axon. By adding layers of insulating membrane around the axon, with gaps at regular intervals, it is possible to force the pulse to hop from gap to gap. The insulating membrane layer has two effects: rm increases linearly with the number of layers, and Cm falls since each layer acts as a capacitance in series. The net effect of myelination is a tenfold speed increase at the same diameter as an unmyelinated axon. This principle is used by sensory neurons, which achieve conduction speeds similar to the squid axon but with axons only 1-5 µm in diameter. However, increased myelination still only doubles l for each fourfold increase in volume. Thus myelination gives a one-off speed increase and subsequently allows closer packing of the elements.
The thermal studies show that this limit is even harder: as we increase the brain size the amount of plumbing must increase. Doubling the brain size doubles the plumbing content, increasing the plumbing from around 7% of the volume to some 14% and reducing the efficiency gain to 15%. At quadruple the volume the efficiency gain remains 15%, and at eightfold the volume it actually falls by 12%, as fewer elements can now be packed in!
The small myelinated neurons of the 'white matter' (1-5 µm) manage about 10 m/s. They fill about 90% of the brain space, compared with about 10% for the unmyelinated (grey) nerves, which have a transmission speed of about 1 m/s. The net effect is that the brain cavity (approx. 10 cm) can be traversed in about 10-20 ms, which is comparable with the width of the ionic pulse. Myelination has an associated energy cost, which suggests that the balance of myelination in the brain has been selected to optimise speed and minimise energy expenditure. Also, the optimal size of a control system is one in which the processing is completed in one clock cycle. The degree of myelination is clearly tuned to the pulse width.
A fully interconnected system of M memory elements requires:

N = M^2/2 connecting paths (for M >> 100)
In comparison, a binary system requires:
N = M interconnections
The latter approach then gives P = log2N processing steps, and the former a single step.
More generally, for an interconnectivity of I, the number of processing steps:
P = log(M)/log(I)
and the number of paths:
N = I*M/2.
The paths (axons) are larger than the memory elements (synapses), and since N increases with I, the volume must increase with N. The trade-off is now between the time taken to transit the extra volume (proportional to N^(1/3)) and the number of processing steps, P.
Consider the following extreme cases, with synaptic processing time 2 ms, transmission speed 10 m/s, and 10^15 synapses,
Clearly the system as a whole will be matched when the transit and processing times are equal. Interestingly, the resulting figures are more or less equivalent to the pulse width, showing a degree of universal optimisation.
The synapse processing time of about 2 ms allows us to predict the number of synapses in the path to be about 4-5 (to give a matched delay). This gives an interconnectivity of the order of 10^4. Hence, using this analysis, we can say that the brain has an interconnectivity of 10^4 not because this is ideal for some algorithmic reason, but because it is the ideal trade-off, for this particular combination of cellular technologies, between processing and transmission speeds. When modelling the brain we should therefore be wary of placing too much importance on interconnectivity arguments.
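The matched-delay estimate above can be sketched numerically, taking this section's figures as assumptions: M = 10^15 synapses, 2 ms synaptic processing time, and a path of P = log(M)/log(I) synapses matched against the ~10 ms transit time.

```python
# Sketch of the matched-delay interconnectivity estimate (figures from the text).
import math

M = 1e15                  # synapses (figure used in this section)
t_syn = 2e-3              # synaptic processing time, s
t_transit = 10e-3         # ~10 cm brain cavity traversed at ~10 m/s

print(f"matched steps: {t_transit / t_syn:.0f}")
for P in (4, 5):          # the 4-5 synapses per path quoted above
    I = M ** (1 / P)      # interconnectivity giving a P-step path
    print(f"P = {P}: interconnectivity ~ 10^{math.log10(I):.1f}")
```

The result lies between 10^3 and 10^4, consistent with the order of magnitude quoted above; the broad performance plateau makes the estimate insensitive to the exact figures assumed.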
Fig 5 shows the overall consequence, for information processing capability per unit time, of changing the interconnectivity and/or increasing the size of the myelinated neurons to obtain more speed. Note the broad performance plateau, with the human brain lying about 20-30% below the optimum, the optimal processing ability corresponding to a brain about twice the current volume. Increases in size give little performance gain but would cause immense inertial shock problems. Note also that the flatness of the plateau permits quite wide errors in assigning values to the various model parameters without significantly changing the conclusions.
Figure 5: Information processing vs axon speed and interconnectivity.
Interim Conclusions 2
Limitations to biological enhancements
This appears to be a severely difficult undertaking. In a similar manner, drug-based enhancement may marginally improve the use of inefficiently arranged or used sub-components, but it can never realise significant enhancements, for the same reasons. There is a role for drugs and genetic engineering, but it is solely to reach the ceiling, not to raise it fundamentally.
It is reckoned that we use less than 10% of our mental capacity. Is that because we have just built layer upon layer of cells on top of disused applications, with new applications piled high? For example, do we still have our "swinging from a tree and using our tail for balancing" algorithm somewhere toward the inner core - and many more? And are these now overwritten by our "stand up and walk" and/or "what the heck is quantum mechanics" reasoning/intelligence algorithms? It could be that our brain now resembles the layering we see in huge software programmes for silicon brains. Do we ever throw away pre-programmed abilities and learning algorithms?
In contrast to our biological brain, the advancement of the silicon brain (computer) is exponential, and will continue to be so for some considerable time. Given that we are effectively in mental stasis, and silicon systems are not, we may soon see an extension of our present richness of male and female minds to include machines. It may even be that mankind already has a new and genetically different competitor!