Neuroengineering/AffectiveTurn



The Affective Turn: Emotional Branding, Neuromarketing and the New, New Media


A number of critical theorists, including Deleuze and Guattari, Brian Massumi, Bernard Stiegler, Patricia Clough, and more recently Hardt and Negri, have observed that under globalization capitalism has shifted its focus from production and consumption to the economic circulation of pre-individual bodily capacities, or affects, in the domain of biopolitical control. At the very moment these scholars were urging us to pay heed to this affective turn, marketing theorists and “mad men” were becoming sensitized to the same radicalizing shift. At the end of the 1990s major marketing gurus such as Marc Gobé pointed out to their colleagues that the world was clearly moving from an industrially driven economy toward a people-driven economy that puts the consumer in the seat of power; and, as we are all becoming painfully aware, over the past fifty years the economic base has shifted from production to consumption. Marketers have embraced the challenges of this new reality with new strategies. Gobé pointed out that what used to be straightforward functional ideas, such as computers, have morphed from “technology equipment” into larger, consumer-focused concepts such as “lifestyle entertainment,” while food is no longer about cooking or chores but about home/lifestyle design and “sensory experiences.” Even before neuroscience entered the marketing scene, then, Gobé and his colleagues were coming to terms with what the theorists named above have called the immaterial labor of affect. Gobé called the visionary approach he offered to address the new capitalist realities “Emotional Branding,” but what he had in mind was more than simple emotion and closer to what we call affect. “By emotional,” Gobé meant, “how a brand engages consumers on the level of the senses and emotions; how a brand comes to life for people and forges a deeper, lasting connection... It focuses on the most compelling aspect of the human character; the desire to transcend material satisfaction, and experience emotional fulfillment. A brand is uniquely situated to achieve this because it can tap into the aspirational drives which underlie human motivation.” (Gobé, 2001)

Nearly every academic discipline, from art history and visual studies to critical theory and recently even the bastions of economics, has been moved in one way or another to get on the affective bandwagon. Recent neuroscience points to an entirely new set of constructs underlying economic decision-making. The standard economic theory of constrained utility maximization is most naturally interpreted either as the result of learning based on consumption experiences, or as careful deliberation, a balancing of the costs and benefits of different options, as might characterize complex decisions like planning for retirement, buying a house, or hammering out a contract. While not denying that deliberation is part of human decision making, neuroscience points out two generic inadequacies of this approach: its inability to handle the crucial roles of automatic and emotional processing. (Camerer, Loewenstein, and Prelec, 2005)

A body of empirical research spanning the past fifteen years, too large to discuss here, has documented the range and extent of complex psychological functions that can transpire automatically, triggered by environmental events and without an intervening act of conscious will or subsequent conscious guidance (Bargh, 1999; 2000; Hassin, 2005). First, much of the brain implements “automatic” processes, which are faster than conscious deliberations and which occur with little or no awareness or feeling of effort (John Bargh et al., 1996; Bargh and Tanya Chartrand, 1999). Because people have little or no introspective access to these processes, or volitional control over them, and because these processes evolved to solve problems of evolutionary importance rather than to respect logical dicta, the behavior these processes generate need not follow normative axioms of inference and choice. Second, our behavior is strongly influenced by finely tuned affective (emotion) systems whose basic design is common to humans and many animals (Joseph LeDoux 1996; Jaak Panksepp 1998; Edmund Rolls 1999). These systems are essential for daily functioning, and when they are damaged or perturbed by brain injury, stress, imbalances in neurotransmitters, or the “heat of the moment,” the logical-deliberative system—even if completely intact—cannot regulate behavior appropriately. Human behavior thus requires a fluid interaction between controlled and automatic processes, and between cognitive and affective systems. A number of studies by Damasio and his colleagues have shown that deliberative action cannot take place in the absence of affective systems (Damasio, 1994). However, many behaviors that emerge from this interplay are routinely and falsely interpreted as being the product of cognitive deliberation alone (George Wolford, Michael Miller, and Michael Gazzaniga 2000). These results suggest that introspective accounts of the basis for choice should be taken with a grain of salt.
Because automatic processes run “off-line,” below consciousness, we have far more introspective access to controlled than to automatic processes. Since we see only the top of the automatic iceberg, we naturally tend to exaggerate the importance of control. Taking these findings on board, a growing vanguard of “neuroeconomists” argues that economic theory ought to take the findings of neuroscience and neuromarketing seriously (Perrachione and Perrachione, 2008).

But even in advance of engineering solutions for building neurochips and neuro-coprocessors, a burgeoning “adfotainment-industrial complex” is emerging that marries an applied science of affect with media and brand analysis. Among the most successful entrants in this field is MindSign Neuromarketing, a San Diego firm that engages media and game companies to fine-tune their products through the company’s techniques of “neurocinema”: the real-time monitoring of the brain’s reaction to movies, using fMRI, eye-tracking, galvanic skin response, and other scanning techniques to monitor regions such as the amygdala while test subjects watch a movie or play a game. MindSign examines subject brain response “to your ad, game, speech, or film. We look at how well and how often it engages the areas for attention/emotion/memory/and personal meaning (importance).” MindSign cofounder Philip Carlsen said in an NPR interview that he foresees a future where directors send their dailies (raw footage fresh from the set) to the MRI lab for optimization. “You can actually make your movie more activating,” he said, “based on subjects’ brains. We can show you how your product is affecting the consumer brain even before the consumer is able to say anything about it.” The leaders in this adfotainment-industrial complex are not building on pseudoscience but have close connections to major neuroscience labs and employ some of the leading researchers of the neuroscience of affect on their teams. NeuroFocus, located in Berkeley, California, was founded by the UC Berkeley-trained engineer A. K. Pradeep, and its team of scientists includes Robert T. Knight, the director of the Helen Wills Neuroscience Institute at UC Berkeley. NeuroFocus was recently acquired by the powerful Nielsen Company.

I want to consider the convergence of these powerful tools of neuro-analysis and media in light of what some theorists have considered the potential of our increasing symbiosis with media technology for reconfiguring the human. Our new collective minds are deeply rooted in an emerging corporeal axiomatic, the domain identified by Felix Guattari as the machinic unconscious and elaborated by Patricia Clough as a “teletechnological machinic unconscious” (Clough, 2000)—a wide range of media ecologies, material practices, social apparatuses for encoding and enforcing ways of behaving through routines, patterns of movement and gestures, as well as haptic and even neurological patterning/re-patterning that facilitate specific behaviors and modes of action (Guattari, 2009). In this model technological media are conjoined with unconscious and preconscious cognitive activity to constitute subjects in particular, medium-specific directions.

The affective domain is being reshaped by electronic media. Core elements of the domain of affect are unconscious social signals, primarily consisting of body language, facial expressions, and tone of voice. These social signals are not just a complement to conscious language; they form a separate communication network that influences behavior and can provide a window into our intentions, goals, and values. Much contemporary research in cognitive science and social psychology is reaffirming that humans are intensely social animals and that our behavior is much more a function of our social networks than anyone had previously imagined. The social circuits formed by the back-and-forth pattern of unconscious signaling between people shape much of our behavior in families, work groups, and larger organizations (Pentland, 2007b). By paying careful attention to the patterns of signaling within a social network, Pentland and others are demonstrating that it is possible to harvest tacit knowledge that is spread across the network’s individuals. While our hominid ancestors communicated face-to-face through voice, face, and hand gestures, our communications today are increasingly electronically mediated, our social groups dispersed and distributed. But this does not mean that affect has disappeared or somehow been stripped away. On the contrary, as the “glue” of social life, affect is present in the electronic social signals that link us together. The domain of affect is embedded within and deeply intertwined with these pervasive computing networks. The question is: as we become more socially interlinked than ever through electronic media, can the domain of affect be accessed, measured, perhaps understood, and possibly manipulated, for better or worse?

A number of researchers are developing systems to access, record, and map the domain of affect, including a suite of applications by Sony Interaction Laboratory director Jun Rekimoto (Rekimoto, 2006; 2007a; 2007b; 2010), such as the Affect Phone and a LifeLogging system coupled with augmented reality, as well as a multiperson awareness medium for connecting distant friends and family developed by Pattie Maes’s group at MIT. For the past five years Sandy Pentland and his students at the MIT Media Lab have been working on what they call a socioscope for accessing the affective domain in order to make new social networked media smarter by analyzing prosody, gesture, and social context. The socioscope consists of three main parts: “smart” phones programmed to keep track of their owners’ locations and their proximity to other people by sensing cell tower and Bluetooth IDs; electronic badges that record the wearers’ locations, ambient audio, and upper body movement via a two-dimensional accelerometer, along with a body-worn microphone and camera to record the wearers’ context; and software that extracts audio “signals,” specifically the exact timing of individuals’ vocalizations and the amount of modulation (in both pitch and amplitude) of those vocalizations. Unlike most speech or gesture research, the goal is to measure and classify speaker interaction rather than trying to puzzle out the speakers’ meanings or intentions.
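The audio side of such a pipeline can be sketched in a few lines of Python. This is a minimal illustration, not the Media Lab’s actual code: it gates frames by energy to detect when someone is vocalizing, estimates pitch by autocorrelation, and reports the timing and modulation features described above. The function names and the 0.5 energy threshold are assumptions made for the sketch.

```python
import numpy as np

def frame_signal(x, frame_len, hop):
    """Split a mono signal into overlapping frames."""
    n = 1 + max(0, (len(x) - frame_len) // hop)
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n)])

def autocorr_pitch(frame, sr, fmin=75.0, fmax=400.0):
    """Crude pitch estimate: location of the autocorrelation peak."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    if hi >= len(ac) or ac[0] <= 0:
        return 0.0
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

def social_signal_features(x, sr, frame_ms=32, hop_ms=16):
    """Timing and modulation features of one person's vocalizations."""
    frames = frame_signal(x, int(sr * frame_ms / 1000), int(sr * hop_ms / 1000))
    energy = np.sqrt((frames ** 2).mean(axis=1))   # per-frame RMS amplitude
    voiced = energy > 0.5 * energy.mean()          # crude voice-activity gate
    pitches = np.array([autocorr_pitch(f, sr) for f in frames[voiced]])
    return {
        "speaking_fraction": float(voiced.mean()),  # timing of vocalization
        "amplitude_modulation": float(energy[voiced].std()) if voiced.any() else 0.0,
        "pitch_modulation": float(pitches.std()) if pitches.size else 0.0,
    }
```

A production system would replace the crude energy gate with a proper voice-activity detector and smooth the pitch track, but the output, how much someone talks and how much their voice varies, is the kind of “honest signal” the socioscope works from.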

One implementation of this technology is the Serendipity system, which runs on Bluetooth-enabled mobile phones and is built on BlueAware, an application that scans for other Bluetooth devices in the user’s proximity (Eagle, 2005). When Serendipity discovers a new device nearby, it automatically sends a message to a social gateway server with the discovered device’s ID. If the server finds a match between the two users’ profiles, it sends a customized picture message to each user, introducing them to one another. The phone extracts the social signaling features as a background process so that it can provide feedback to the user about how that person sounded and build a profile of the interactions the user had with the other person. The power of this system is that it can be used to create, verify, and better characterize relationships in online social network systems such as Facebook, MySpace, and LinkedIn. A commercial application of this technology is Citysense, which acquires millions of data points to analyze aggregate human behavior and develop a live map of city activity; it learns where each user likes to spend time and processes the movements of other users with similar patterns. Citysense displays not only “where is everyone right now” on the user’s PDA but “where is everyone like me right now” (Sense Networks, 2008).
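Serendipity’s match-and-introduce step can be illustrated with a small sketch. This is not BlueAware’s or the gateway server’s actual code; the `SocialGateway` class, the Jaccard similarity over declared interests, and the 0.3 threshold are hypothetical stand-ins for whatever profile-matching logic the real server uses.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    user: str
    interests: set
    opted_in: bool = True

class SocialGateway:
    """Hypothetical stand-in for Serendipity's gateway server: it receives
    Bluetooth discovery events and decides whether to introduce two users."""

    def __init__(self, threshold=0.3):
        self.profiles = {}    # Bluetooth device ID -> Profile
        self.threshold = threshold
        self.outbox = []      # (recipient, person-introduced) messages "sent"

    def register(self, device_id, profile):
        self.profiles[device_id] = profile

    @staticmethod
    def similarity(a, b):
        # Jaccard overlap of declared interests (an illustrative metric only).
        union = a.interests | b.interests
        return len(a.interests & b.interests) / len(union) if union else 0.0

    def on_discovery(self, scanner_id, discovered_id):
        a, b = self.profiles.get(scanner_id), self.profiles.get(discovered_id)
        if not (a and b and a.opted_in and b.opted_in):
            return False
        if self.similarity(a, b) >= self.threshold:
            # The real system sends a customized picture message to each phone.
            self.outbox.append((a.user, b.user))
            self.outbox.append((b.user, a.user))
            return True
        return False
```

The essential design point survives the simplification: the phone only reports proximity; the decision to connect two strangers is made server-side, against profiles the users have registered in advance.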

This technology has a number of implications for quantifying the machinic unconscious of social signals. Enabling machines to know social context will enhance many forms of socially aware communication, and indeed the idea is to overcome some of the major drawbacks in our current use of computationally mediated forms of communication. For example, having a quantifiable model of social context will permit the mapping of group structures and information flows, the identification of enabling nodes and bottlenecks, and feedback on group interactions: Did you sound forceful during a negotiation? Did you sound interested when you were talking to your spouse? Did you sound like a good team member during the teleconference?
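At its simplest, this kind of feedback reduces to statistics over who vocalized when. A toy sketch, assuming each speaker’s turns arrive as (start, end) intervals in seconds (both function names are invented for illustration):

```python
def speaking_share(turns):
    """Fraction of total talk time per speaker.
    turns: dict mapping speaker -> list of (start, end) times in seconds."""
    totals = {s: sum(end - start for start, end in iv) for s, iv in turns.items()}
    grand = sum(totals.values()) or 1.0
    return {s: t / grand for s, t in totals.items()}

def interruptions(turns):
    """Count how often each speaker starts talking while someone else
    is still mid-turn (a crude proxy for interruption)."""
    events = [(start, end, s) for s, iv in turns.items() for start, end in iv]
    counts = {s: 0 for s in turns}
    for start, _, s in events:
        for other_start, other_end, other in events:
            if other != s and other_start < start < other_end:
                counts[s] += 1
    return counts
```

Whether a lopsided speaking share or a high interruption count reads as “forceful” or as “bad team member” is, of course, exactly the interpretive step the raw signals do not settle.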


I want to close these reflections by pointing to two newly introduced technologies that build upon some of the same data-mining techniques for creating profiles discussed in Pentland’s Citysense program. Of these final two, the less invasive technology I want to highlight is Streetline, a San Francisco-based tech firm that realizes many of the innovations first experimented with in Cooltown and incorporates low-power mesh technologies first developed in the MOTES project at Berkeley in the late 1990s. Streetline was selected as the winner of the IBM Global Entrepreneurship Program’s SmartCamp 2010 for developing the free Parker app, which not only shows you where parking meters are located but also shows you which meters are available. Forget circling a five-block radius waiting for a spot to appear: with this app (available for iPhone and Android) you can pinpoint and snag that elusive space. Streetline captures data using self-powered motes, sensors mounted in the ground at each parking space, which can detect whether or not a space is vacant. The Parker app uses your smartphone’s location sensors to know where you are and highlight local parking spots. It also uses the large screen (in your car, for instance) to display a dynamic map of the nearest spots rather than just a list of street addresses. The parking meter data from the sensors is transmitted across ultra-low-power mesh networks to Streetline servers, which build a real-time picture of which parking meters are vacant. This information can be shared with drivers through the Parker app, and also with city officials, operators, and policy managers. The app goes even further: once you park, it uses this information to provide walking directions back to your vehicle, and it can record how much time you have on the meter and alert you when time is getting short. This is a truly cool app.
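The client-side query Parker answers, find the nearest vacant space in the latest sensor snapshot, can be sketched as follows. This is an illustrative reconstruction, not Streetline’s code; the meter-record fields and the use of a haversine distance are assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_vacant_spot(user_lat, user_lon, meters):
    """Return the closest meter record whose sensor currently reads vacant.
    `meters` is a snapshot of sensor readings; the field names are invented."""
    vacant = [m for m in meters if m["vacant"]]
    if not vacant:
        return None
    return min(vacant, key=lambda m: haversine_m(user_lat, user_lon, m["lat"], m["lon"]))
```

The hard part of the real system is not this lookup but keeping the snapshot fresh: occupancy events must travel hop by hop across the ultra-low-power mesh before the servers can answer the query honestly.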
But this app sits on a spectrum of technologies that use cell-phone data to track and trace your location. A more disturbing surveillance use of new media technology combined with data-mining and profiling tools comes from Immersive Labs of New York, which uses webcams embedded in billboards and display systems in public areas, such as Times Square, an airport, or a theme park, to grab footage of passers-by for facial recognition tools that measure the impact of an ad running on the screen. In this application artificial intelligence software makes existing digital signs smarter, sequences ads, and pushes media to persons in front of the screen. Immersive Labs software makes real-time decisions on which ads to display based on the current weather and on the gender, age, crowd size, and attention time of the audience. The technology can adapt to multiple environments and ads on a single screen and works with both individuals and large groups. Using a standard webcam connected to any existing digital screen, the system determines age, gender, and attention time and automatically schedules targeted advertising content. The software calculates the probability of success for each advertisement and makes real-time decisions about which ad should play next. The analytics report on ad performance and demographics (e.g., gender, age, distance, attention time, dwell time, gazes). The company claims not to store the images of individuals it has analyzed but to discard them immediately after the interaction—we’re not so sure. (More: Concluding Thoughts)
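The real-time ad decision described above amounts to scoring each ad against the detected audience attributes and playing the highest scorer. A deliberately crude sketch, with an invented linear scoring model standing in for Immersive Labs’ proprietary probability-of-success model:

```python
def choose_ad(ads, audience):
    """Pick the ad with the highest predicted payoff for this audience.
    Each ad carries an invented linear model: a base rate plus a weight
    for each (attribute, value) pair it cares about."""
    def score(ad):
        s = ad.get("base", 0.0)
        for key, value in audience.items():
            # Attributes the ad has no opinion about contribute nothing.
            s += ad.get("weights", {}).get((key, value), 0.0)
        return s
    return max(ads, key=score)
```

However the scoring is actually implemented, the structure is the point: the passer-by’s inferred gender, age, and attention time become features in an optimization they never consented to join.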


References

Bargh, John A., and Tanya L. Chartrand. "The Unbearable Automaticity of Being." American Psychologist 54, no. 7 (1999): 462-79.

Bargh, John A., Mark Chen, and Lara Burrows. "Automaticity of Social Behavior: Direct Effects of Trait Construct and Stereotype Activation on Action." Journal of Personality and Social Psychology 71, no. 2 (1996): 230-44.

Bargh, John A., and Melissa J. Ferguson. "Beyond Behaviorism: On the Automaticity of Higher Mental Processes." Psychological Bulletin 126, no. 6 (2000): 925-45.

Camerer, Colin, George Loewenstein, and Drazen Prelec. "Neuroeconomics: How Neuroscience Can Inform Economics." Journal of Economic Literature 43, no. 1 (2005): 9-64.

Camerer, Colin F. "Neuroeconomics: Opening the Gray Box." Neuron 60, no. 3 (2008): 416-19.

Clough, Patricia Ticineto. Autoaffection: Unconscious Thought in the Age of Teletechnology. Minneapolis: University of Minnesota Press, 2000.

Damasio, Antonio. Descartes' Error: Emotion, Reason, and the Human Brain. New York: G.P. Putnam, 1994.

———. The Feeling of What Happens: Body and Emotion in the Making of Consciousness. New York: Harcourt Brace, 1999.

———. "Fundamental Feelings." Nature 413 (25 October 2001): 781.

Damasio, Antonio R. "Emotion and the Human Brain." Annals of the New York Academy of Sciences 935, no. 1 (2001): 101-06.

Eagle, Nathan, and Alexander Pentland. "Social Serendipity: Mobilizing Social Software." IEEE Pervasive Computing 4, no. 2 (2005): 28-34.

Gobé, Marc. Emotional Branding: The New Paradigm for Connecting Brands to People. New York: Allworth Press, 2001.

Guattari, Felix. "Beyond the Psychoanalytical Unconscious." In Chaosophy: Texts and Interviews 1972-1977. Los Angeles: Semiotext(e), 2009.

Hassin, Ran R., James S. Uleman, and John A. Bargh, eds. The New Unconscious. Oxford: Oxford University Press, 2005.

LeDoux, Joseph. The Emotional Brain: The Mysterious Underpinnings of Emotional Life. New York: Simon & Schuster, 1996.

Panksepp, Jaak. Affective Neuroscience: The Foundations of Human and Animal Emotions. New York: Oxford University Press, 1998.

Pentland, Alexander. "Automatic Mapping and Modeling of Human Networks." Physica A 378 (2007a): 59-67.

———. "On the Collective Nature of Human Intelligence." Adaptive Behavior 15, no. 2 (2007b): 189-98.

———. "Reality Mining of Mobile Communications: Toward a New Deal on Data." In The Global Information Technology Report 2008-2009, 75-80: World Economic Forum, 2009.

———. "Socially Aware Computation and Communication." Computer 38, no. 3 (2005): 33-40.

Pentland, Alexander, and Nathan Eagle. "Reality Mining: Sensing Complex Social Systems." Personal and Ubiquitous Computing 10, no. 4 (2006): 255-68.

Perrachione, Tyler K., and John R. Perrachione. "Brains and Brands: Developing Mutually Informative Research in Neuroscience and Marketing." Journal of Consumer Behaviour 7, no. 4-5 (2008): 303-18.

Rolls, Edmund. The Brain and Emotion. New York: Oxford University Press, 1999.

Sense Networks. http://www.sensenetworks.com/.

Wolford, George, Michael B. Miller, and Michael Gazzaniga. "The Left Hemisphere's Role in Hypothesis Formation." The Journal of Neuroscience 20, no. 6 (2000): RC 1-4.
