Brain-Machine Interfaces and Collective Minds

ISBN: 978-1-60785-268-1

edited by Tim Lenoir


The first practical steps toward augmenting human capability through a close coupling of man and machine have their origins in Ivan Sutherland’s work at MIT and the University of Utah, and in the work of the generation of students Sutherland and his colleague David Evans trained at Utah. Having launched the field of interactive computer-aided design with his dissertation project, Sketchpad, Sutherland pursued between 1965 and 1968 an ambitious project to create what he called “the ultimate display,” an augmented-reality system in which computer-generated images of all sorts could be overlaid on scenes viewed through a head-mounted camera display system. Among the visionary suggestions Sutherland made in this early work was that interaction with the computer need not be based on keyboard or joystick linkages but could be controlled through computer-based sensing of the positions of almost any of our body muscles; going further, he noted that while gestural control through hands and arms was an obvious choice, machines to sense and interpret eye-motion data could and would be built. “An interesting experiment,” he claimed, “will be to make the display presentation depend on where we look.” Sutherland’s work inspired Scott Fisher, Brenda Laurel, and Jaron Lanier, the inventors of the dataglove and of the first virtual reality and telepresence systems at the NASA Ames Research Center, as well as Tom Furness at Wright-Patterson Air Force Base in Ohio, who developed his own version of the ultimate display: a quasi “Darth Vader helmet” based on eye and gesture tracking, integrated into a virtual cockpit. Furness was trying to solve the problem of how humans interact with very complex machines, particularly the new high-tech F-16, F-14, and F-18 fighter planes, which had become so complicated that the amount of information a fighter pilot had to assimilate from the cockpit's instruments and command communications was overwhelming.
Furness’s solution was a cockpit that fed 3-D sensory information directly to the pilot, who could then fly by nodding and pointing his way through a simulated landscape below.


Vernon B. Mountcastle
The Columnar Organization of the Neocortex
Jonathan C. Horton and Daniel L. Adams
The Cortical Column: A Structure Without a Function
John A. Bargh, Tanya L. Chartrand
The Unbearable Automaticity of Being
Miguel A. L. Nicolelis, Asif A. Ghazanfar, Barbara M. Faggin, Scott Votaw, Laura M. O. Oliveira
Reconstructing the Engram: Simultaneous, Multisite, Many Single Neuron Recordings
John K. Chapin, Karen A. Moxon, Ronald S. Markowitz, Miguel A. L. Nicolelis
Real-time Control of a Robot Arm Using Simultaneously Recorded Neurons in the Motor Cortex
Jose M. Carmena, Mikhail A. Lebedev, Roy E. Crist, Joseph E. O'Doherty, David M. Santucci, Dragan F. Dimitrov, Parag G. Patil, Craig S. Henriquez, Miguel A. L. Nicolelis
Learning to Control a Brain–Machine Interface for Reaching and Grasping by Primates
Benjamin Blankertz, Michael Tangermann, Carmen Vidaurre, Siamac Fazli, Claudia Sannelli, Stefan Haufe, Cecilia Maeder, Lenny Ramsey, Irene Sturm, Gabriel Curio, Klaus-Robert Müller
The Berlin Brain–Computer Interface: Non-Medical Uses of BCI Technology
Karl Deisseroth, Guoping Feng, Ania K. Majewska, Gero Miesenböck, Alice Ting, Mark J. Schnitzer
Next-Generation Optical Technologies for Illuminating Genetically Targeted Brain Circuits
Edward S. Boyden
A History of Optogenetics: The Development of Tools for Controlling Brain Circuits with Light
Olaf Sporns, Giulio Tononi, Rolf Kötter
The Human Connectome: A Structural Description of the Human Brain


A 'Frozen' PDF Version of this Living Book

Download a 'frozen' PDF version of this book as it appeared on 7th October 2011