www.mitchellhowe.com


The Singularity Q&A

Q: How could neurohacking improve intelligence?

A: I'll open with another standard ethical disclaimer:  Yes, much of the research that might find effective ways to boost the intelligence of existing brains would probably involve distasteful experiments on primates or field trials involving children.  This is actually the main drawback of neurohacking -- the reason it has been, and is likely to remain, a rather stagnant field, even though existing technology could probably allow for some useful breakthroughs if more research were permitted and funded.

If the word neurohacking conjures up a cluttered mental image, perhaps involving a brain surgeon with a soldering iron, you've got the right idea -- not because soldering irons need be involved, but because successful neurohacking is likely to be fairly complicated -- even messy.  As with neural interfacing, the basic obstacle to improving the function of a mature brain is Algernon's Law, the idea that any simple tricks able to improve our intelligence without serious side-effects would probably have been found and implemented by evolution a long time ago.  Hence, scientists should not expect to find any one drug or procedure that will safely improve intelligence, although it is possible that the side-effects of an otherwise miracle treatment could be effectively compensated for in a modern setting.

This is the primary reason I place little stock in the so-called nootropic drugs and supplements currently on the market.  Many of these are FDA-approved medications proven safe and effective for treating certain conditions unrelated to mental function but rumored to have positive side-effects in this area.  As with herbal extracts and supplements, there is simply too little conclusive research to lend more than anecdotal support to any of these claims.  (And although this is a purely subjective impression on my part, those who peddle or offer testimonials on nootropics often seem to me to be, well, a few amino acids shy of a protein.  Would you buy a miracle diet from a morbidly obese salesperson?)

One area of officially sanctioned study that actually has produced some encouraging results is the treatment of age-related mental degradation.  Turning decades of conventional wisdom on its head, scientists have learned that mature brains are capable of growing new neurons, and do so regularly.  As Alzheimer's and related ailments are strongly correlated with an overall reduction in the number and efficiency of neurons, a great deal of effort is going into discovering the mechanisms and signals a healthy brain uses to foster neural growth.  Ultimately, such an understanding may allow people to stay mentally sharp for longer -- and perhaps even make modest improvements to their mental performance.

But really (although this may not technically increase intelligence), effective brain power could be boosted significantly just by making more consistent use of the neurons we already have.  We all enjoy moments of brilliance: those rare times when we seem to top the thinking of ten people put together, cajoling a solution out of a hitherto unsolvable problem -- perhaps with metaphorical steam coming out of our ears.  Why is it we can't think like this all the time?  Why do our mental "muscles" seem limited to short sprints?

Evolution provides the answer.  Counterintuitively, concentrating too long on any one task could be deadly to one's genes in a primitive setting.  Too much time building or improving a shelter could mean not enough time hunting.  Too much time gathering firewood could mean missing out on opportunities to woo the opposite sex back at the fire.  In fact, concentrating too intently on anything could waste energy, reduce alertness, and make one an easy target for a predator or a competitor.  But since intense concentration does produce better answers than non-stop "multi-tasking," the evolutionary compromise seems to be to permit concentration, but only in short bursts, and only when the situation requires it.
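
For readers who like their evolutionary hand-waving made concrete, the little sketch below is a toy numerical illustration of that compromise, not a biological model: the quality curve, the hazard rate, and the idea of scoring "minutes of uninterrupted focus" are all invented for the example.  It only shows how multiplying the diminishing returns of longer focus by a steadily compounding background risk reliably favors short bursts.

    # Toy illustration (invented numbers, not biology): how long should an
    # ancestral thinker stay absorbed in one task before the risks outweigh
    # the gains?
    import math

    def work_quality(t):
        """Diminishing returns: longer focus yields better work, ever more slowly."""
        return 1.0 - math.exp(-t / 20.0)      # curve invented for the example

    def survival_probability(t, hazard=0.03):
        """Chance of not being ambushed or outcompeted while absorbed for t minutes."""
        return math.exp(-hazard * t)          # constant hazard rate, also invented

    def expected_payoff(t):
        return work_quality(t) * survival_probability(t)

    best = max(range(1, 121), key=expected_payoff)
    print(f"Best focus duration under these made-up numbers: {best} minutes")
    print(f"Payoff at {best} min: {expected_payoff(best):.3f}  "
          f"vs. at 120 min: {expected_payoff(120):.3f}")

With these particular made-up numbers the optimum lands around twenty minutes, and two straight hours of absorption scores worse than almost any short burst; the exact figures mean nothing, but the shape of the trade-off is the point.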

Hence, for every "normal" person, procrastination and attention deficit are not merely inconvenient habits, but deep-running survival instincts that one cannot expect to overcome through willpower alone;  subduing them is as difficult as dieting, and for much the same reason.  To the would-be neurohacker, the question of how brain function changes under "pressure" to allow for intense concentration is therefore very important.  Similarly, it would be useful to know how the brain decides when concentration is warranted in the first place.  If either one of these mechanisms could be understood, it might be possible to mimic it on occasions when the natural inclination would be for constant distraction and delay.  Even if it were not possible to use a pill or other kind of "switch" to enable a long, sustained period of concentration, it could be useful just to have some kind of indicator for times when the mind begins to wander or think in ways counterproductive to one's current goals.
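
If it helps to picture what such an indicator might do, here is a purely hypothetical sketch; nothing in it corresponds to any real device or study.  It assumes some outside source of periodic "focus scores" (self-ratings, keystroke statistics, some future sensor; the essay specifies none) and merely raises a flag when a short moving average of those scores sags below a threshold.

    # Hypothetical sketch of the "wandering mind" indicator described above.
    # The focus scores are assumed to come from somewhere else entirely;
    # this only shows the trivial bookkeeping of noticing a downward drift.
    from collections import deque

    def focus_alarm(scores, window=5, threshold=0.5):
        """Yield (index, average) whenever recent focus dips below the threshold."""
        recent = deque(maxlen=window)
        for i, score in enumerate(scores):
            recent.append(score)
            if len(recent) == window:
                avg = sum(recent) / window
                if avg < threshold:
                    yield i, avg

    # Made-up example data: attention trailing off over a work session.
    session = [0.9, 0.8, 0.9, 0.7, 0.6, 0.5, 0.4, 0.4, 0.3, 0.6, 0.8]
    for i, avg in focus_alarm(session):
        print(f"Minute {i}: average focus {avg:.2f}; time to take a break or refocus.")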

One might say that neurohacking, in part, hopes to be to concentration what liposuction is to weight loss.  With this in mind, an attempt to "rewire" oneself without the use of any chemical or physical aids probably does not, technically, fall under the umbrella of "neurohacking."  But people who are very aggressive in their efforts to reprogram their minds from the inside will sometimes refer to it as such;  I am inclined to approve of this usage, if only because the results, when successful, can be as shocking and inspiring as the loss of 150 lbs through willpower alone.  In a future response, I will suggest a general regimen of study subjects likely to be especially useful to anyone determined not to wait until the Singularity to actually improve the way they think.




©2002 by Mitchell Howe