
As a young teenager in the late 1970s, I spent a good many hours in my damp, poorly lit basement battling my older brother in "Pong" on our family's television console. At the time, I couldn't in my wildest imagination have envisioned that Pong's successors, such as Super Mario Bros., Grand Theft Auto and Madden NFL, would someday rival and eventually surpass, in terms of economic clout and output, that other diversionary stalwart of my youth: the movie industry.

It would have been even more difficult for me to fathom that the game’s descendants would also be utilized by U.S. military planners to help train the soldier of the 21st century. Yet that is precisely what the Defense Department is now doing. It invests millions of dollars annually and sends thousands of U.S. soldiers to train in simulated environments in order to better prepare them for the chaos and confusion they will actually face in real combat.

The technology is now so advanced that it gets soldiers' hearts pounding and their bodies sweating. In fact, some soldiers who have since experienced actual combat have said that the simulation felt more like what they expected combat to be than the real thing did. Imagine that.

I have been increasingly reminded of the video gaming industry's startling advances as I contemplate the progress researchers in the cognitive sciences continue to make with neural chips. In late 2004, it was widely reported that a paralyzed man from Rhode Island had a small device implanted in his brain that allowed him to send an email just by thinking. The implant, produced by Massachusetts-based Cyberkinetics, is called the BrainGate Neural Interface System, and it has received FDA approval for clinical trials. In its simplest sense, the technology translates the electrical impulses generated by the brain into commands that can operate a computer.
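For the technically curious, here is a toy sketch, in Python, of the basic idea behind that kind of translation: mapping recorded neural firing rates to a simple computer command. Every channel name, firing rate and threshold below is an invented assumption for illustration, not a description of how BrainGate actually works.

```python
# Toy sketch: turn recorded neural activity into a simple computer command.
# Channel names, firing rates and the threshold are invented for illustration.

def decode_cursor_command(firing_rates: dict) -> str:
    """Map per-channel firing rates (spikes per second) to a cursor command."""
    # Assume one hypothetical recording channel is tuned to each direction.
    tuned_channels = {"up": "ch_03", "down": "ch_11", "left": "ch_07", "right": "ch_21"}

    # Pick the direction whose tuned channel is firing fastest.
    direction = max(tuned_channels, key=lambda d: firing_rates.get(tuned_channels[d], 0.0))

    # Only act if the winning channel clears an assumed noise floor of 20 spikes/sec.
    if firing_rates.get(tuned_channels[direction], 0.0) > 20.0:
        return direction
    return "hold"


# A single reading from a hypothetical recording session.
sample = {"ch_03": 12.0, "ch_11": 8.5, "ch_07": 31.2, "ch_21": 14.0}
print(decode_cursor_command(sample))  # -> "left"
```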

To be sure, it is an impressive technology, but it is still a fairly crude device. So too, though, was Pong when it first came out. And like that first video game, neural technology is only going to get better. Just as Pong was fueled by exponential advances in computer processing power, data storage and graphics software to the point where its descendants could become legitimate war-training tools, so too will neural technology be fueled by exponential advances in information technology, biotechnology, nanotechnology and the cognitive sciences to the point where it can become a viable and effective tool for our country's most senior decision-makers.

Today, there are more than 55,000 neuroscientists around the world working feverishly to increase our understanding of the human brain and improve its functions. In addition to accessing extensive databases in their quest to build better models of the brain, they are also adding ever more knowledge to those databases.

Furthermore, a variety of new tools are being developed and added to their arsenal. These, in turn, are further deepening our understanding of how the human mind operates. Supercomputers like Blue Gene/L are being dedicated to reverse-engineering the brain; fMRI technology is providing detailed maps of certain regions of the brain; and, as Ray Kurzweil points out in his book The Singularity Is Near, the resolution of non-invasive brain-scanning devices is doubling every twelve months.

On top of this impressive array of hardware, software and human capital, an equally impressive amount of money is being invested in pursuit of a deeper understanding of the human brain. For instance, the National Institutes of Health is investing millions in the cognitive sciences, and it is now public knowledge that DARPA has funded groups to build sensing electrodes into a helmet capable of picking up brain signals without surgical implants. Such advances portend the day when fighter pilots can operate a plane by thought alone.

But if DARPA is investing in a technology to create better fighter jocks, what else is it working on? Moreover, where are all these advances headed?

One possibility is the development of neural tools to aid our decision-makers in making better and faster decisions. To do this, one would need many things, including 1) the ability to access all available information that could possibly affect the decision; 2) a means to process that information and discern recognizable patterns in it; and 3) an algorithm capable of calculating all possible outcomes and assessing the desirability and probability of each. In short, it would seem to be an exercise of almost unfathomable complexity.
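To make that third requirement a little more concrete, here is a minimal sketch, in Python, of how an algorithm might weigh hypothetical outcomes by their probability and desirability. The options, outcomes and numbers are all invented for illustration; a real decision aid would be vastly more sophisticated.

```python
# Minimal sketch of requirement 3 above: score hypothetical courses of action
# by the probability and desirability of their possible outcomes. Every option,
# outcome and number here is an invented assumption, not real analysis.

from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    probability: float   # chance this outcome follows the action (0.0 to 1.0)
    desirability: float  # subjective score from -1.0 (worst) to +1.0 (best)

def expected_utility(outcomes):
    """Weight each outcome's desirability by its probability and sum the results."""
    return sum(o.probability * o.desirability for o in outcomes)

# Two hypothetical courses of action in a crisis, each with possible outcomes.
options = {
    "open negotiations": [
        Outcome("tensions ease", 0.6, 0.8),
        Outcome("talks stall", 0.4, -0.2),
    ],
    "impose sanctions": [
        Outcome("adversary backs down", 0.3, 0.9),
        Outcome("conflict escalates", 0.7, -0.7),
    ],
}

# Rank the options; the human decision-maker still makes the final call.
for name, outs in sorted(options.items(), key=lambda kv: expected_utility(kv[1]), reverse=True):
    print(f"{name}: expected utility {expected_utility(outs):+.2f}")
```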

But is it, in relative terms, any more complex than asking those first developers of the game of Pong to contemplate constructing a video game that feels as real as actual combat?

I don't believe so. As proof, I would encourage you to watch this video of Claudia Mitchell controlling her robotic prosthetic arm by thought alone. (The arm is controlled by a tiny computer chip that deciphers the signals from her nervous system into electronic bits and sends them to the robotic arm, where they are, in turn, translated into the action her brain was "thinking.")

Therefore, the question we need to ask ourselves is not simply whether our decision-makers should use such cognitive tools to make better decisions in the future, but whether they should be mandated to do so, even if that includes the possibility of having a neural chip implanted in their brains.

The idea might sound positively ludicrous, but I am certain that, in 1978, the notion of training soldiers to fight more effectively with video games would have sounded equally preposterous.

The advances in the cognitive sciences, as well as the exponential trends facilitating the field's development, are real. The time to begin contemplating such seemingly radical notions is now. Given the accelerating pace of scientific advancement, there is a realistic chance such advances could be upon us within a decade's time.

Therefore, to jump-start the debate, I would pose the following resolution to all forward-thinking think tanks and debate clubs: RESOLVED: By 2016, the President of the United States should be required to utilize state-of-the-art neural technology during all periods of international crisis.

Jack Uldrich is a writer, futurist, public speaker and host of jumpthecurve.net. He is the author of seven books, including Jump the Curve and The Next Big Thing Is Really Small: How Nanotechnology Will Change the Future of Your Business. He is also a frequent speaker on future trends, innovation, change management and executive leadership to a variety of businesses, industries and non-profit organizations and associations.