Sunday, May 06, 2012

Building The Pentagon's ‘Like Me’ Weapon

Activists in Pakistan burn a US national flag (Copyright: Getty Images)


The Pentagon wants to understand the science behind what makes people violent. The question is: what does it plan to do with that knowledge?


By Sharon Weinberger
2 May 2012
Courtesy Of "The BBC"


In February this year, the US government was forced into full damage-limitation mode. News that US troops in Afghanistan had sent copies of the Koran to be incinerated sparked a wave of protests that left 36 people dead and more than 200 injured. Despite an apology from President Barack Obama and assurances that the burning was accidental, the public relations offensive launched to counter the damage done to the military’s reputation and stem the violence showed little sign of success.
Now imagine that instead of employing public relations experts to advise on the best strategy, US officials had a device that could advise them what to say, generating a story based on a scientific understanding of the brain’s inner workings to soothe tempers and calm the mood of the population. It sounds like something from a science fiction blockbuster, but is in fact the premise behind the Pentagon’s growing interest in the neurobiology of political violence, a relatively new field that combines neuroscience with more traditional social science-based approaches to understanding human behaviour.
One programme, started last year by the Pentagon’s Defense Advanced Research Projects Agency (Darpa), even looks at finding ways to generate versions of events that could be used in attempts to persuade people not to support the enemy. Known as Narrative Networks, it seeks to “understand how narratives influence human thoughts and behaviour, then apply those findings to a security context in order to address security challenges such as radicalization, violent social mobilization, insurgency and terrorism, and conflict prevention and resolution,” says William Casebeer, the Darpa official leading the work.
The idea is straightforward: scientists have long known that narratives exert a powerful force on the human mind, helping to shape people’s concept of individual and group identities, and even motivating them to commit violent acts. Some bloggers and people posting on Twitter have suggested the Pentagon is seeking to elevate brainwashing to a science. “Darpa looking to master propaganda via Narrative Networks,” read the headline of a report on the science news website Phys.org, for example, alongside countless similar blog posts and tweets.
Those involved in the research disagree. “None of the work we are doing, nor anyone else I know in the Narrative Networks group, is about increasing the ability of soldiers or sailors to kill people or to brainwash people,” says Paul Zak, a professor at Claremont Graduate University in Claremont, California, who specializes in neuroeconomics, and whose work has been funded by the Darpa program.
Zak and others see this type of research being used in the shaping of messages that shows the US military in the best possible light, such as by highlighting its humanitarian work abroad. “Is there a way to hold events that might publicise things like healthcare, public health factors, [or] tooth brushing for children and you could give away half a million toothbrushes,” he says. “There could be things that help countries understand that most of the time what we want to do is get along with everybody.”
Zak’s work involves trying to understand how listening to stories affects the brain’s natural release of oxytocin, sometimes called the trust hormone. “Why are we grabbed by some stories and not others?” he says. “It just seems like a great question to ask.”
To test his theories, Zak uses an experiment in which university students watch a short video featuring a father describing his son’s battle with brain cancer. After the viewing, Zak measures oxytocin levels in the participants’ blood, as well as their willingness to give the money they have earned from taking part in the experiment to charity. “Our hypothesis is that this connection system that human beings have, which utilizes oxytocin, is activated by these same kinds of narratives, these same kinds of stories,” he says.
But stories aren’t the only way to increase trust. Zak has also experimented with having subjects spray oxytocin into their noses, but it’s not an approach that would have practical applications for the military, he cautions. The government is not looking to “just spray oxytocin into the crowds,” he says. “That, first of all, would be highly unethical and illegal, and it wouldn’t work anyway. You have to get a lot into the brain.”
War stories
While Zak is focusing on oxytocin, other researchers working with Darpa’s support are trying to understand the parts of the brain responsible for values and ideals. Emory University professor Greg Berns, a neuroeconomist, recently conducted an experiment that involved paying people to give up their fundamental ideals and beliefs. Participants were placed in a functional magnetic resonance imaging (fMRI) scanner while statements based on answers they had previously given on a questionnaire were presented on a screen. The statements related either to core beliefs, such as views on gay marriage, sex with children and the sterilization of people with genetic conditions, or to less fundamental matters, such as a preference for PCs or Macs.
The volunteers were then offered up to $100 to sign statements disavowing their previous views. Perhaps unsurprisingly, more were willing to take money to change position on mundane matters, such as whether they were a cat person or a dog person, than to shift their stance on fundamental ones, such as whether they would accept money for sex. More interestingly, Berns found that fundamental values, such as those concerning sex and belief in God, triggered activity in a part of the brain called the left temporoparietal junction, while more everyday belief statements stimulated activity in the entirely separate left and right inferior parietal lobes.
These findings, Berns suggests, mean there is a biological basis for ethnic conflict. “Many of the conflicts that we currently face internationally are ultimately about control of biology,” he says. People may say they are fighting for ideas, but what they are really fighting for, according to Berns, are values connected to survival, such as reproductive rights. “Things like religion are placeholders for that; what we’re seeing is a very Darwinian struggle for limited resources,” he says.
Berns, like the other researchers involved, says the Darpa program is about finding ways to stop people from fighting, not about controlling them. “It’s not about brainwashing people,” he says. “We’re not in the business of reading people’s minds, or implanting thoughts. By understanding the biology of what causes people to go to war, we might begin to understand how to mitigate it.”
Whether creating better narratives can help reduce conflict is still an open question, however. Neuroscientists at the Massachusetts Institute of Technology (MIT), in Cambridge, Massachusetts, have been studying the effect of stories and dialogue on those involved in the Arab-Israeli conflict, and in particular how stories affect sympathy for others.
“I think there’s a perception out there that if someone commits these horrible atrocities to another group that they must be sociopaths, they must be psychopaths that lack empathy for other people,” says Emile Bruneau, a post-doctoral fellow at the Saxe Lab at MIT, which is not funded by the Darpa programme. “But, I think it might be very different, that they might actually be highly empathic people, but their empathy is highly regulated so that it’s applied strongly to in-group members but not at all to out-group members.”
In a study published last year, Bruneau and his colleagues looked at what happens in the brain when Jewish Israelis and Arabs read stories intended to evoke sympathy for members of each other’s group. Participants read about children suffering physical or emotional pain, such as cutting themselves with a knife or losing a parent. Brain scans carried out with fMRI machines showed these stories elicited similar patterns of activation in the medial prefrontal cortex, the brain region associated with sympathy, whether subjects read about members of their own group or about “the enemy”. Interestingly, reading the same stories about the suffering of South Americans triggered a noticeably different response in this brain region and in others involved in thinking about others’ emotions. “The most poetic interpretation of that is these are the brain regions where the opposite of love is not hate, but indifference,” says Bruneau.
In a separate study, Bruneau and colleagues asked Israelis and Palestinians to write about the difficulties they faced because of the ongoing conflict. The accounts were then read by members of the opposing group, and feelings such as empathy, trust and warmth were measured using a survey. The researchers found the attitudes of the Palestinians towards the Israelis improved more when they were allowed to tell their stories, rather than listening, whereas Israelis' attitudes about Palestinians improved more after they listened to Palestinians describing their experiences.
The MIT research could hold some lessons for the US government, which spends over a billion dollars a year on trying to convince foreign audiences of its point of view, whether via radio broadcasting, or through the Pentagon’s foreign language news sites. “It’s interesting that we spend a lot of money as a country on the Voice of America [radio station],” Bruneau says, “when this research is starting to show that what might be most effective would be the ear of America.”
Line of defence
Beyond the question of better storytelling lies a more fundamental one: will such research actually help the Pentagon convince people that the US military is really there to help them? Tom Pyszczynski, a social psychologist at the University of Colorado who studies terrorism, says it is not clear that understanding the neuroscience of violence, while an interesting scientific endeavor, will on its own lead to solutions to terrorism.
“We need to understand those things, no doubt about it, but, in terms of promoting peace I’m not sure that knowing where in the brain the anger that leads to violence is happening is going to help us discourage war,” says Pyszczynski, who has been studying the effects of the recent Arab Spring uprisings on attitudes towards the West. “We’re not going to be able to go in and zap people’s amygdalae or anesthetize them or do whatever,” he says. “We’re going to need to change the way they interpret things that happen and we’re going to need to stop doing things that people interpret as insulting or challenging to their way of life.”
For Pyszczynski, the potential for such work also raises an interesting ethical question reminiscent of the issues addressed in A Clockwork Orange, both the 1971 film and the book on which it was based. “If you could somehow reliably change people’s minds so that they didn’t want to kill anymore, should that be done?” he asks. “Well, you’re impinging on their freedom in a way, but on the other hand you’re saving a lot of lives.”
But shaping public relations campaigns – and people’s minds – isn’t necessarily the only military application for such research. David Matsumoto, a professor of psychology and director of the Culture and Emotion Research Laboratory at San Francisco State University, is being funded by another Pentagon initiative, called Minerva, to conduct scientific research on the role of emotions in inciting political violence. Matsumoto and his colleagues are studying language and facial expressions used by political leaders to see if those can be used to predict future violence.
“I think that one of the most logical direct applications of this kind of finding and this line of research [is] to develop sensors that can watch, either monitor the words that are being spoken and/or the non-verbal behaviors that are expressive of those emotions,” he says of the Pentagon’s interest in his work. “I think the development of sensors like that ... would be sort of an early warning signal or system [to detect violence].”
Of course, some might question whether the vision of a machine that spits out story lines at the flip of a switch, or provides an early-warning “emotion” sensor for war, is blue-sky dreaming. But Read Montague, a neuroscientist at the Virginia Tech Carilion Research Institute in Roanoke, Virginia, sees the possibility of technology that could come into play in cases like the Koran-burning protests in Afghanistan.
“I see a device coming that’s going to make suggestions to you, like, a, this situation is getting tense, and, b, here are things you need to do now, I’ll help you as you start talking,” says Montague, who is part of the Darpa Narrative Networks project. “That could be really useful.”
Montague points out that people also once doubted that a computer could beat a chess master, but as technology advanced, computers eventually became good enough to outmanoeuvre even the best players. Of course, the idea of a Deep Blue-style computer that taps the mind’s biology to generate stories sounds less like a feel-good storytelling machine than a military weapon designed to manipulate people’s mental state. “It’s a weapon,” says Montague, “but it’s a defensive weapon.”


