It’s a common statement among psychiatrists and people who read too many self-help books that you can’t control your emotions, but you can control how you react to them. Most of us don’t look at that and think, “Well, yeah, but you don’t have a microchip in your brain! You could totally control your emotions with a microchip in your brain!” This is why DARPA recruits the “outside-the-box” thinkers.

This week the Defense Advanced Research Projects Agency, or DARPA, awarded two large contracts to Massachusetts General Hospital and the University of California, San Francisco, to create electrical brain implants capable of treating seven psychiatric conditions, including addiction, depression, and borderline personality disorder.

“Imagine if I have an addiction to alcohol and I have a craving,” says Jose Carmena, a professor at the University of California, Berkeley, who is involved in the UCSF-led project. “We could detect that feeling and then stimulate inside the brain to stop it from happening.”


If you’re wondering why brain implants are of interest to the military, mental health issues are a serious concern among returning veterans: suicide rates among current and former military personnel are three to four times the national average. So if these implants operate as advertised, they could save lives. The fact that DARPA’s response to trauma and human suffering is to try cracking open your skull like a walnut and shoving implants in your brain to zap it into compliance is a bit more disturbing, though.

This is preliminary groundwork, but you don’t have to go very far to find troubling implications. First of all, we have no idea how the brain works, and sticking what amounts to a bug zapper in your noggin may not be the best idea, long term. Secondly, these brain implants are designed to be controlled remotely, if necessary. Gee, that couldn’t possibly end badly.

There is an ethics panel overseeing the research, since as we all know ethics panels in the military have absolute power and are always listened to. And it’s supposedly intended only as a last resort. That said, if we wake up in the Matrix and don’t have any superpowers, DARPA, we’re going to be upset.

This development has me… excited? Fearful, sure. But this is kind of groundbreaking and insanely hopeful technology. The potential to abuse it is MASSIVE, but that doesn’t make it any less potent and potentially game-changing when it comes to severe mental health issues.

It’s undeniably exciting on one level! I am excited, actually. I’m just worried because the potential for abuse is SO enormous and SO prevalent that I’m not sure there shouldn’t be a stronger legal framework here, not to mention more general neurology research.

That’s pretty much how I feel, too. On the off chance that the system can be set up in such a way that it won’t be abused, it’s a very compelling idea. @Dan, is there any legal framework at all? This project sounds too young for there to be many practical ideas on how to regulate it yet. More neurological research would certainly be better, but I’d assume that this kind of project does necessarily include such research.

Honestly, I often feel that neural implants may be necessary if we’re going to end some of the world’s evils. Gets into sci-fi territory a bit, probably, but stuff like lie detection/awareness of others’ motives (practical telepathy via the implant), emotional regulation (a switch on the implant that’s tripped whenever someone’s brain tells them to murder or rape or whatever), and instant information downloads (imagine if everyone could immediately know everything about all political candidates before voting for them) really seem like the only ways we’re ever going to put an end to some of our shit. The potential for abuse by both the controller and the user would be there, but maybe it could be minimized with a strong legal, democratic framework built around it. It seems like solving one batch of problems by creating another batch, but maybe the new batch would be smaller and less severe.

As far as I can tell, if the patient consents, you can do damn near anything. You may have to demonstrate that the patient understood what you were doing, and that you did it competently and safely, but beyond that, anything goes.

Such a slippery slope. While most of us will agree that suppressing mental health problems is a good thing, where is the line drawn on behaviors to be restricted? Drug use? Overeating? Texting while drunk? Voting?

As someone who worked at a psychiatric hospital for 4 years, I am excited by these developments. In the long term, treating patients this way could save people and the government a lot of money. A Haldol Decanoate shot, an anti-psychotic given monthly to patients with severe mental illness, costs $1,600.

I am afraid of the government’s future ambitions, but I am more afraid of pharmaceutical companies.

It’s not just about cost, either; effectiveness would also be a big selling point. After all, unless someone’s actually committed to a mental hospital, there’s no guarantee that they’ll take their meds indefinitely; whereas this kind of implant (assuming it doesn’t need to be recharged or otherwise maintained) doesn’t seem like something a user would be able to discontinue.

However, I’d be surprised if the pharmaceutical companies didn’t find some way to weasel their way into this enterprise.

“Cracking open your skull like a walnut and shoving implants in your brain”: I think they’d be a little more subtle than that. More like the thing stuffed up Arnold Schwarzenegger’s nose in Total Recall.

If they were to actually perform large-scale evaluations of the population, you’d be shocked at how many people show signs and traits that would classify them as having some form of “mental illness.” The biggest problem is: who decides?

As for those with substance addictions, this would be an even easier way out than the drugs themselves. Dig into the willpower well. The only thing stopping you is you.