Charlatans have long claimed to move objects by using their minds. Michael McLoughlin is helping turn the trick into a reality.
As program manager and chief engineer at the Johns Hopkins University Applied Physics Laboratory, McLoughlin led the development of “modular prosthetic limb” technologies, which give amputees lifelike control of prosthetic limbs.
The initiative has been backed by roughly $120 million from the Defense Advanced Research Projects Agency, or DARPA, and leverages new neurological sensor technology and robotics.
STAT spoke with McLoughlin at Aspen Ideas Festival’s Spotlight Health program. This transcript has been edited for length and clarity.
I just saw a guy who was moving his mechanical arm and hand just like I would move mine.
Yeah, that’s Johnny Matheny. It’s amazing, isn’t it? He had a tumor and a number of surgeries, and the cancer just kept moving up his arm, and they finally decided if they don’t take the arm it’s going to get into his torso. So they amputated just above the elbow. This was roughly 10 years ago.
He’s using your lab’s breakthrough technology? How does that work?
Right. The way you typically control a conventional prosthetic arm is something called myoelectrics. You have sensors on your biceps, triceps, and if you do various contractions of those muscles, you can switch modes and control the arm. Typically you can do something very simple, like open and close, or operate a hook. But it takes a lot of practice to do it well and it’s limited in capability.
In our approach, Dr. Albert Chi at Johns Hopkins essentially took the nerves that control your fingers and attached them to portions of Johnny’s biceps and triceps. Now, when he wants to flex his wrist, the signal that used to go to his wrist muscle goes someplace on his biceps instead. He thinks “flex my wrist” and he gets a little contraction in his biceps, and we put electrodes on the skin’s surface to measure that contraction. So he thinks “flex my wrist,” and we can take that signal, send it to the prosthetic arm, and say “flex your wrist.”
It’s very natural, and he can go through five different grasps, rotate and flex the wrist, bend the elbow. He can pick up that cup without crushing it, shake your hand without crushing it.
So what’s the next frontier for this technology?
The thing we’re really interested in, and the real game-changer here, is the ability to make that connection between the brain and the machine. The prosthetic arm is a robotic arm, basically. And if you think about people with disabilities, they have all kinds of things they’d like to do, from typing on a computer to driving a car. If we could change the way that people with disabilities interact with machines, we could greatly improve their quality of life.
How do you make that work technologically?
If you think about moving your arm — just visualize it — you get all these signals firing in your brain. So for someone with a spinal injury, or ALS, or to some extent stroke, you can pick up those signals. And we can do the exact same thing we do with Johnny. We interpret those signals and make the arm move. “60 Minutes” did a piece on this, and our first participant, Jan, would say, “I’m just thinking about moving my arm,” and it would move.
How do you pick up the signals from her brain?
We can put electrodes in the brain. And of the hundreds of millions of neurons that Jan would use to move her arm, we can see a few hundred of them. We can do some pretty remarkable things with that.
Now think about what could happen if we could see thousands, hundreds of thousands, or millions of neurons. Billions, even. What would that mean for our understanding of what’s happening in the brain? Right now brain science is very indirect. I can make very coarse measurements and infer things that are happening. It’s very difficult to see what’s happening in there.
I see us as being on the verge of exploration of the brain that’s starting to go the same way astronomy went. There’s some really remarkable research going on, looking at the next generation of toolsets that’ll give us over the next five to 10 years much greater visibility into how the brain works. So this starts to open up tremendous windows.
How far has Jan been able to take this technology?
Not as far as we’d hoped, at least at the moment. The biggest risk in all of this is infection. She had these pedestals that were glued to the skull, which connected to the electrodes. But the skin started to retract a little, and doctors were concerned about infection, so the electrodes had to come out. The barrier in all this is that it’s great until you’ve got to cut a hole in your head and stick in some electrodes. That tends to be a deal-breaker for most people.
So the goal is to see through the skull and pick up those neurons’ signals?
Right. But I really believe that there are some things going on now that in the next year or two will produce some real breakthroughs in this area of noninvasive brain-computer interfaces.
How close are we to breakthroughs like that?
Give me about another six months or so and it’ll be a good time to talk.
In the meantime, you’ll probably continue to hear the critics talking about the improvements that could’ve been made in basic prosthetics if your funding had gone to that instead.
If you look at the history of the program, it’s about 30 organizations and $120 million. Jan’s big moment was when she fed herself chocolate. I say it cost us $100 million to get her to eat that chocolate bar. But when Johnny said he wanted to try controlling his arm, that probably cost us a few hundred thousand, because we had already developed the technology. DARPA did the big initial push to show this would work and to develop the platforms. So we’ve solved the technology problem. It’s more of a business problem now.
So why don’t we see more of these on the market?
It’s because there aren’t many people like Johnny out there. There are about 18,000 upper-extremity amputees in the US every year. Most are partial hand. You’ll never get to the volumes that’ll drive the cost to where we want it to go, so we’re making the technology available to lots of other people: people in wheelchairs, people with ALS, spinal cord injuries, stroke. When that happens, you start getting into the numbers that will drive the cost of the technology down. The hardware in Johnny’s arm is the same thing we can put on someone’s wheelchair. Or a robot. It’s a platform. I hope over the next five years we’ll get to the point where it will really start to scale.
When you got into this, were you prepared for how mind-blowing it would be?
Not really. Watching the reactions, I’ve literally seen people who, the first time they move the arm, just burst into tears. And I’m still amazed by it. I’ll be across the way from Johnny and will look up and see him moving that thing and I’m like, “Wow. That is just amazing.”