Science Can Link Your Brain to a Computer. Are You Ready?

I do think we are at an inflection point for brain-computer interfaces, what I would call a Kitty Hawk moment. And I think just as we saw after the Wright brothers sent that plane into the air, we're gonna see an explosion of advanced development of these technologies. Over the past few years, the prospect of linking our brains directly with computers has given rise to the beginnings of a new industry. Dozens of startups are popping up, creating products to help people with mental and physical disabilities, revolutionize the way we interact with technology, and even alter the functioning of our own brains. We essentially open the door to the access of our minds by technology.

So we're talking about the changing of foundations of what it means to be human. How did we get here? How well does all this stuff actually work? And if this technology lives up to its transformational potential, are we ready for it? So a brain-computer interface is a device that interfaces the brain with the outside world. It could be a computer, it could be a machine.

And these brain-computer interfaces, or BCIs, could be inserted inside the brain. So those are invasive BCIs, or they could be wearables that you would wear on the top of your head. BCIs can record the activity of the brain, the firing of individual neurons. Obviously, if you take information out of the brain, you can in principle decode what that means. And just like there was a transition in technology from the PC to the smartphone, it is possible, I would say likely, that the next transition will be from a device that you have in your pocket to a device that you have in your head. If you've heard of one brain-computer interface, it's probably this one.

All right, welcome to the Neuralink product demo. I'm really excited to show you what we've got. I think it's gonna blow your mind. When Elon speaks, we all listen, that's just life in the 21st century.

But Neuralink and its competitors didn't come out of nowhere. They're built on decades of work by neuroscientists attempting to understand and interpret what's going on in the brain. One of those neuroscientists, and a key player in the development of BCIs, is this guy, Dr. Leigh Hochberg. I'm the director of the BrainGate pilot clinical trials. The BrainGate pilot clinical trials are seeking to determine the safety and feasibility of allowing somebody with paralysis to control an external device simply by thinking. The origins of this field can be traced easily back to the late 1950s, early 1960s.

Dr. Robert Heath has implanted electrodes, which cause no pain, in the brains of these monkeys. Researchers were placing single electrodes into the brains of animals in a research laboratory, asking the fairly straightforward question of, "How does that neuron in that part of the brain relate to the control of voluntary movement?" And then in the late 1990s, here at Brown University, John Donoghue, professor of neuroscience, did an experiment in the research laboratory, which was to ask a non-human primate to play a video game.

And the researchers begin to build a map between the brain activity and the movement of the hand. After building that map, that program would be able to predict where the animal's hand actually was in space. You could then play a small trick on the animal, which is to disconnect the joystick. And at that moment, the joystick is no longer controlling the cursor on the screen, the neural activity itself is controlling the cursor movement. So we begin to describe that as a cursor that's moving under neural power alone. And that was really a breakthrough moment.
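The "map" described here can be illustrated with a minimal linear decoder: fit a least-squares mapping from firing rates to joystick position, then drop the joystick and drive the cursor from neural activity alone. This is a toy sketch on simulated data, not BrainGate's actual algorithm (real systems use more sophisticated decoders such as Kalman filters).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated training data: firing rates of 96 neurons over 500 time steps,
# and the 2-D hand position recorded from the joystick at each step.
n_neurons, n_steps = 96, 500
true_weights = rng.normal(size=(n_neurons, 2))
rates = rng.poisson(lam=5.0, size=(n_steps, n_neurons)).astype(float)
hand_xy = rates @ true_weights + rng.normal(scale=2.0, size=(n_steps, 2))

# "Build a map" between brain activity and hand movement:
# a least-squares fit predicting position from firing rates.
W, *_ = np.linalg.lstsq(rates, hand_xy, rcond=None)

# "Disconnect the joystick": the cursor now moves under neural power alone,
# predicted purely from new neural activity.
new_rates = rng.poisson(lam=5.0, size=(1, n_neurons)).astype(float)
cursor_xy = new_rates @ W
print(cursor_xy.shape)  # (1, 2): an x, y cursor position
```

The design choice worth noting is that nothing here is specific to joysticks: once the mapping from neural activity to intended movement is learned, any output device (cursor, prosthetic hand, robot arm) can be substituted on the other end.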

Within a few years, researchers were trying out the same thing in humans. So in the spring of 2004, the BrainGate clinical trials were launched. The technology begins with a sensor, or an array as we call it, that's placed in the brain. The sensor itself is about four by four millimeters, the size of a baby aspirin. And when tapped into the top of the brain, it has the ability to record from individual neurons.

So that when somebody thinks about moving their own hand, we can allow them to control a cursor on a computer screen or some other assistive technology. What do you want to do first? [Study Participant] I'd like to draw a circle. You're gonna draw a circle? Excellent. That incredible first participant was able to think about opening and closing a prosthetic hand.

He said out loud what everybody else was thinking. The third participant was an extraordinary woman who had had a stroke in the brainstem. She could feel everything, she could hear everything, but she couldn't move and she couldn't speak. And she said she wanted to take a drink. So we brought in a robotic arm. She was thinking about reaching and grasping that thermos of coffee.

As she did that, the robot arm moved out, grabbed that thermos of coffee. And then she took a sip of cinnamon latte and that was the first time in nearly 15 years that she was able to do so solely of her own volition. Over the 17 years of the BrainGate trials, the technology has come a long way, particularly as advances in machine learning have made strides helping to decode data recorded from the brain. Recently, they were even able to have a paralyzed man imagine the motion of handwriting and translate that into text on a screen with 94% accuracy. But clinical trials in university labs, they're really just a first step.

At some point there needs to be an off-ramp to a company who is then gonna focus on developing, and as an academic this is a word that I don't use that often, a product. It needs to be submitted for regulatory approval and then it needs to be marketed and supported and made available. When we first started this company, people almost kind of took this as just kind of almost crazy science fiction.

And I think we're getting past the believability factor. We're starting to see what was originally kind of research technology take flight in the clinical domain. And I think we're gonna see many, many more of these type of technologies. Here in St. Louis, a company called Neurolutions recently got one of the first FDA approvals ever for a BCI.

In this case, a noninvasive EEG headset that reads the electrical activity of the brain. The measurements aren't as precise as with an invasive system, like the one BrainGate uses, but they work for Neurolutions' purpose: helping stroke victims to reconnect their mind with a paralyzed limb. All right, so we're just gonna test the headset real quick. All connections good. Nice and relaxed. My name's Mark Forrest.

I lived in St. Louis all my life and turns out I had a stroke and I was pretty much paralyzed on my right side. A lot of people told me after six months that I was pretty much done and I was gonna have hardly any movement. When somebody has a stroke, a part of their brain dies. For instance, the right side is injured, the left side can't move. Now, but if you talk to these patients, they can imagine moving.

They can try to move. They just can't actually execute that movement. We know that 90% of those motor fibers that control opening and closing that hand are on that side where the injury occurred. What we found is that there's actually 10% on the healthy side, on the other side of the brain that controls thought of, "I want to move my hand."

And so what we did is, we created a brain-computer interface that really has three parts to it: a wearable headset, a robotic exoskeleton, and a tablet that walks them through how to use the system. All right, let's see. So let's do 30 seconds rest and then I'm gonna hit Next and we'll see what control you'll get. At the beginning of the program it tells you not to move your hand, but to think about moving your hand, and that's what you have to do.

Now just imagine. It's picking up that intention to move on the uninjured side of the brain. It's converting that intention to a movement of the exoskeleton. As the person continues to use it over time, it's essentially leading to a rewiring of the brain.

There's this kind of neuroscience notion of what fires together, wires together. So when they're generating those brain signals to move and they're getting that feedback from that wearable, that's leading to new connections forming in the brain that allows, eventually, that uninjured side of the brain to take over control of the paralyzed limb. My movement was just this, that's all I could do. Look at how much difference I've made - progressed, using that machine. I have arm movement. I have finger movement. I have wrist movement.
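The "fires together, wires together" idea mentioned above is the classic Hebbian learning rule: a connection between two units strengthens whenever both are active at the same time. The toy sketch below is an illustration of that principle only, not a model of Neurolutions' therapy.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hebbian rule: the weight between two units grows when they fire together.
n_units = 5
weights = np.zeros((n_units, n_units))
learning_rate = 0.1

for _ in range(100):
    # Units 0 and 1 often fire together; the rest fire independently.
    activity = (rng.random(n_units) < 0.2).astype(float)
    if rng.random() < 0.5:
        activity[0] = activity[1] = 1.0
    # Hebbian update: weight change proportional to co-activity.
    weights += learning_rate * np.outer(activity, activity)
np.fill_diagonal(weights, 0.0)

# The connection between the frequently co-active pair ends up strongest,
# analogous to repeated intention-plus-feedback strengthening new pathways.
print(weights[0, 1], weights[0, 2])
```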

It got me motivated for, I actually built my own fishing boat. My eldest son said, it's gonna sink. It's not gonna float. My best friend told me it was gonna sink and not gonna float. Well, I've proved them all wrong. The lord works in mysterious ways, [expletive deleted] happens and there's nothing you can do about it. So you just make it work.

Companies like Neurolutions are bringing BCIs into the realm of healthcare, but that's kind of just the tip of the iceberg. Tech companies are starting to get in on the action too and they've got a whole different set of uses in mind for these technologies. At the end of the day, the only thing that a computer or a smartphone does is connect you to the 'net. Well, with a BCI, you'll be able to do this much faster and easier.

The leaders of Google and Facebook, they expect in the future, to be able to just communicate directly with BCI. In the future we want to get to an input where we can just think something and it happens. And this also has stimulated the generation of many, many new companies, neurotech companies, probably more than 100 over the last year or so. One of those companies is French startup NextMind, which is making a small wearable that can detect the firing of your neurons, much like the Neurolutions headset. And you can actually buy one from them right now for about 400 bucks. We are a neurotech company using a device that you can put on your head directly, and that can allow you to control interfaces directly with your mind.

Back of my head, like right here, like the dome part of the head? We enlisted some Quicktake producers to take NextMind's interface for a test drive. Comb the electrodes through your hair to touch the scalp. Super cool. Okay. We decode, directly from your visual cortex at the back of the head, the fluctuations of attention that you have in the visual cortex when you're focusing on something. So when you have a bunch of objects on the screen, those small triangles, we call them neurotags, basically we can not only know that you are looking at one of them, but also discriminate, very quickly, whether you wanna push this one or this one. Focusing on the center disc.

Oh, my gosh. Oh, wow, okay. It's like the Force, you know? Freaky. The current product is aimed at developers who want to build their own mind-controlled apps, but it does come with a few short demos, like a game that's controlled part manually, part mentally.

Lead the little square guy to the end of the level. Help him out using your mind to blow up enemies and trigger mechanisms. This is quite fun.

I blew up an enemy square. I don't know how I feel about staring at enemies to blow them up, this is promoting some kind of violence. It's like your brain thinks that you're able to do it and it just does it. So it feels, you know, like the future.

Bloomberg's beta testers had a pretty good time with NextMind's device overall, but there was some disappointment that it wasn't more, well, mind-read-y. At the moment, you can push a button by staring at it, and that's about it. It's like cool, but it's, I just don't really know how practical it is. Honestly, my brain kinda hurts using this. I'm a little afraid about being that plugged into a video game.

That like, literally, if my attention goes off of it, it would break the game, right? But of course, it's early days for the company and they ultimately see this as a first step towards new kinds of virtual interfaces. The thing we have in mind, really, is to create this very immersive experience between your brain and the virtual reality around you. So I am very convinced that in a few years from now, we're gonna have those cool, like augmented reality glasses, and there's gonna be sensors on them. And those sensors are gonna be able to capture some of your brain data and allow you to basically not have to move your hands in any way.

Just like be in direct connection with the virtual objects surrounding you. NextMind sits at the consumer-grade end of the emerging BCI market, but it's just one of many nascent companies in a field with a wide range of different approaches, intended applications, and funding levels. At the other end of the spectrum are companies like Elon Musk's Neuralink, spending many millions of dollars in an attempt to develop truly game-changing neural hardware. Another such company is Los Angeles-based Kernel. With over $100 million in funding, founder Bryan Johnson is setting out to read brain activity in ways we haven't quite seen before.

The intuition people oftentimes associate with interfaces is the ability to control things. That is, change a channel on TV, skip a song, control a drone. We believe the most promising potential of these interfaces is measurement.

When we have a wearable on our wrist, we are aware of our heartbeat and our steps and our sleep stages. If you went to the cardiologist, you get your blood pressure and EKG, cholesterol levels. One of the only things that we cannot consistently measure is our brains.

We decided that the path forward is to build new hardware. Kernel pitches its first product, a $50,000 headset called Flow, as a giant leap forward for the field. Most headsets on the market right now use EEG technology to get a fairly general idea of your brain activity. The Flow offers a much more detailed look inside your skull. The way it works is we shine light into the brain.

That light goes into the brain. Some of it bounces back and then we use that information to reconstruct the activity in the brain in terms of blood oxygenation. It's called spectroscopy. There has been neuroimaging going on for decades.

For example, fMRI. They provide great images of the brain, they've just been going on in very large systems. They're a few million dollars and they are room-sized. So what we've done is we've shrunken a room-sized piece of equipment into a helmet. And so now if you say a person on average gets their brain scanned, say, one time every five years, you can now imagine a scenario with our technology, where a person is scanning their brain one or multiple times a day.
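The "shine light in, measure what bounces back" reconstruction described above is near-infrared spectroscopy; the standard textbook version is the modified Beer-Lambert law, which converts light-attenuation changes at two wavelengths into concentration changes of oxygenated and deoxygenated hemoglobin. The sketch below uses illustrative placeholder coefficients, not Kernel's calibration:

```python
import numpy as np

# Modified Beer-Lambert law: delta_OD = (extinction * pathlength) @ delta_conc.
# Two wavelengths let us solve for two chromophores:
# HbO2 (oxygenated) and HbR (deoxygenated hemoglobin).
wavelengths = [760, 850]          # nm, a typical NIRS wavelength pair
ext = np.array([[1.5, 3.8],       # rows: wavelength; cols: [HbO2, HbR]
                [2.5, 1.8]])      # placeholder extinction coefficients
pathlength = 6.0                  # assumed effective optical path, cm

# Measured change in optical density at each wavelength (simulated reading).
delta_od = np.array([0.012, 0.009])

# Invert the 2x2 system to recover each chromophore's concentration change.
delta_conc = np.linalg.solve(ext * pathlength, delta_od)
hbo2, hbr = delta_conc
print(f"dHbO2={hbo2:.5f}, dHbR={hbr:.5f}")
```

The reason two wavelengths are needed is that HbO2 and HbR absorb differently at each; one measurement alone can't separate the two unknowns.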

Johnson imagines a deluge of big brain data coming from these devices, helping science to tackle all manner of hard questions about the workings of the mind. It's a bit of a gamble though, because we don't know exactly what all that information will tell us. You know, it'll be TBD. We've built the technology out in front of the science, so the science does need to be developed. The next step for us is to get the first systems in the hands of some of the top researchers in the entire world looking at health and wellness of the brain, PTSD, concussion, meditation, the effects of caffeine on the brain, psychedelics.

And so it opens up all kinds of possibilities for mental health and wellness, for performance, for learning, for basically anything that we do with our minds. As varied as the current crop of BCIs is, most focus on the same thing: reading the activity of the brain and translating that into useful information. But there's another frontier in our attempts to access the brain that comes with more troubling implications: not just reading from the brain, but actually writing to, or directly influencing it. In a sense, this is already being done, with devices like deep brain stimulators, which send electrical impulses into the brain to treat movement disorders like Parkinson's. But here at Columbia University, neurobiologist Rafael Yuste is proving that it's possible to go much further, to actually change the brain's perception of reality.

So what we've done with mice is, we've used a laser to measure the activity of the visual cortex while the mouse is looking at an image and we train the mouse to lick whenever he sees that image. That's the first step. We can also use a second laser to activate neurons at will, a little bit as if we were playing the piano with the brain. So we actually activate the neurons that correspond to a particular image, and the mouse behaves exactly the same as if he were seeing that image. So the mouse licks, even though we're not showing him the image, we're just activating these neurons.

So in a way we're sort of taking control of his perception and making him believe that he's seeing something that is not there. Now, what can be done in animals today could be done in humans tomorrow. And we became concerned that neurotechnology should be regulated. To that end, Yuste and a group of fellow neuroscientists put together a list of guidelines that they hope will curb the dystopian potential of the BCI. So we're proposing five new human rights, neuro-rights: the right to our mental privacy; the right to our own identity; the right to our own decision-making, to prevent interference that could change our behavior; the fourth one has to do with cognitive augmentation, to prevent a situation where people that have more resources in certain countries will become augmented and become like a super race of humans versus the rest of the people (there are already companies, like Neuralink, whose purpose is to mentally augment humans); and the final right, protection against bias in artificial intelligence.

It would be terrible if we augment humans, but actually bring into their brains a lot of biases that are carried out by algorithms. Given all those potential problems, do you think neurotechnology ends up being a net positive or a negative? Oh, I'm very bullish about BCIs and neurotechnology. I think this is gonna be a tremendous force of progress. Neurotechnologies are always neutral. You can use them for good or for bad. We always have the duty when we come up with new technology, to use it responsibly and put the guard rails so that this technology is used to help humanity.

There's still a lot about the human mind that we don't understand. And even the leaders of the field of neurotechnology are certainly a long way from cracking its code. Still, they've been able to leverage what we do know into the beginnings of a meaningful mind-machine link. And it's clear the applications will be incredibly varied, and for some, life changing.

As to what the future holds... I think that we're really gonna see kind of an integration of kind of our brains and, you know, machines for, I think, really kind of a profound alteration of our human evolution, quite frankly. Is it possible to create, you know, virtual memories? Absolutely, it is.

Is it possible to have brain-to-brain communication? That's been, you know, demonstrated by some researchers down at Duke. Can you upload and download information into the brain? I think that's a theoretic possibility. Whether, again, that's in 20 years or 100 years, I think those are possibilities going forward. And just as in the last century we saw an explosive transition from the Wright brothers' airplane to kind of an F-22 Raptor, which would have been unimaginable 100 years ago, we're gonna see that same evolution with brain-computer interfaces.

2021-07-19 02:06
