A game where you compete in relaxation. The players’ brainwaves control a ball on a table, and the more relaxed player scores a goal against their opponent.
2006-11-15: Slow but steady progress
Hitachi has successfully tested a brain-machine interface that allows users to turn power switches on and off with their mind. Relying on optical topography, a neuroimaging technique that uses near-infrared light to map blood concentration in the brain, the system can recognize the changes in brain blood flow associated with mental activity and translate those changes into voltage signals for controlling external devices. In the experiments, test subjects were able to activate the power switch of a model train by performing mental arithmetic and reciting items from memory.
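In control terms the trick is a one-bit classifier: mental effort raises local blood oxygenation, and a baseline-corrected, thresholded NIRS signal becomes the switch. A toy sketch; the signal, window, and threshold values are all invented:

```python
import numpy as np

def nirs_switch(oxy_hb, baseline_window=50, threshold=0.15):
    """Flag 'switch on' wherever baseline-corrected oxy-Hb exceeds a threshold."""
    baseline = oxy_hb[:baseline_window].mean()
    return (oxy_hb - baseline) > threshold

rng = np.random.default_rng(3)
signal = np.concatenate([
    rng.normal(0.0, 0.02, 100),   # rest
    rng.normal(0.3, 0.02, 100),   # mental arithmetic raises oxy-Hb
])
print(nirs_switch(signal).any())  # True: the model train powers on
```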
2007-05-29: National Neurotechnology Initiative
We’ve learned more about the brain in the last 5 years than in the previous 50. Lynch is working on a proposal for a 5-year National Neurotechnology Initiative with a budget of $200 million a year. It would identify projects to fund, such as the development of a “brain interface” device that would route signals from the muscles and sensory organs; technology that would allow nerves to control prosthetic devices; and a brain-simulation project that would replicate the way the brain works.
2007-08-25: Brainloop, Google Earth controlled by a brain-computer interface.
2008-02-20: EEG startup.
Emotiv has created technologies that allow machines to take both conscious and non-conscious inputs directly from your mind.
I think I have future shock with this one.
Between this and haptic interfaces… woah.
2008-10-23: An update on the neuro cyborgs
He inserts a 4 mm² array of 100 neural probes into the “arm knob” of the M1 motor cortex. With a random sample of neural signaling from that region and some Kalman filtering, patients can control an on-screen cursor immediately (unlike biofeedback or sensory remapping, which require training). Motor intent can be decoded from an average of just 24 neurons. When connected to a robot hand and asked to “make a fist,” the patient exclaimed “holy shit” as it worked on the first try. Open questions going into the experiments: Do the neurons stay active (other work indicates that the motor cortex reorganizes within minutes of decoupled sensory input)? Can thinking still activate the motor neurons? The test patients had been in sensory deprivation for 2-9 years prior. Will there be scarring and degradation over time? 1 patient is 3 years in. What are the neural plasticity effects?
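The decoding step is a textbook Kalman filter: model the cursor kinematics as a linear dynamical system, treat firing rates as noisy linear observations of the state, and recursively estimate it. A minimal sketch, assuming a linear tuning model fit offline; every parameter here is illustrative, not from the study:

```python
import numpy as np

# State: cursor kinematics [x, y, vx, vy]; observation: binned firing
# rates of n_neurons cells, assumed linearly tuned to the state.
n_neurons = 24            # the article's average sample size
dt = 0.05                 # 50 ms decoding bins (assumption)

A = np.eye(4)             # state transition: position integrates velocity
A[0, 2] = A[1, 3] = dt
W = np.diag([0.0, 0.0, 1e-3, 1e-3])   # process noise on velocity

rng = np.random.default_rng(0)
H = rng.normal(size=(n_neurons, 4))   # tuning model; in practice fit to data
Q = 0.5 * np.eye(n_neurons)           # observation (spiking) noise

def decode_step(rates, x, P):
    """One Kalman predict/update step given a vector of firing rates."""
    x_pred = A @ x
    P_pred = A @ P @ A.T + W
    S = H @ P_pred @ H.T + Q
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (rates - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

# Simulated use: a true cursor drifts; we decode it from fake spike rates.
x_true = np.array([0.0, 0.0, 1.0, 0.5])
x, P = np.zeros(4), np.eye(4)
for _ in range(100):
    x_true = A @ x_true
    rates = H @ x_true + rng.normal(scale=0.5, size=n_neurons)
    x, P = decode_step(rates, x, P)
print("decoded cursor position:", x[:2])
```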
2012-07-01: Brain in a vat is here!
The first real-time brain-scanning speller will allow people in an apparent vegetative state to communicate.
2012-12-21: HCI chocolate
Researchers described the brain-computer interface that allowed Ms. Scheuermann to move an arm, turn and bend a wrist, and close a hand for the first time in 9 years. Less than 1 year after she told the research team, “I’m going to feed myself chocolate before this is over,” Ms. Scheuermann savored its taste and announced as they applauded her feat, “1 small nibble for a woman, 1 giant bite for BCI.”
2013-03-01: Brain to brain communication
Even though the animals were on different continents, with the resulting noisy transmission and signal delays, they could still communicate. This tells us that we could create a workable network of animal brains distributed across many different locations.
2013-03-17: Hive mind privacy. One of the most interesting arguments for privacy in our (near) hive mind: it cuts down on the quadratic communication overhead. Even our own brain isn’t fully connected; in fact, it’s quite sparse.
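The overhead claim is simple arithmetic: full pairwise connectivity among n minds needs n(n-1)/2 channels, while a sparse network with a fixed number of neighbors grows only linearly. A throwaway sketch (the neighbor count k is an arbitrary illustration):

```python
def full_links(n):
    """Channels in a fully connected network of n nodes."""
    return n * (n - 1) // 2

def sparse_links(n, k=100):
    """Channels if each node talks to only k neighbors (illustrative)."""
    return n * k // 2

for n in (10, 1_000, 1_000_000):
    print(f"{n:>9} nodes: full={full_links(n):>16,}  sparse={sparse_links(n):>12,}")
# A million fully connected minds would need ~5e11 channels; sparse, ~5e7.
```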
2014-03-04: I had somehow missed this from 2 years ago. Mary Lou Jepsen asks: could future devices read images from our brains? In her estimation, it should be possible to increase resolution 1000x in the next few years.
2014-04-27: Vegetative patients may be aware
a significant proportion of patients who were classified as vegetative in recent years have been misdiagnosed – Owen estimates perhaps 20%. Schiff, who weighs up the extent of misdiagnosis a different way, goes further. Based on recent studies, he says 40% of patients thought to be vegetative are, when examined more closely, partly aware. Among this group of supposedly vegetative patients are those who are revealed by scanners to be able to communicate and should be diagnosed as locked-in, if they are fully conscious, or minimally conscious, if their abilities wax and wane. But Schiff believes the remainder will have to be defined another way altogether, since being aware does not necessarily mean being able to use mental imagery. Nor does being aware enough to follow a command mean possessing the ability to communicate.
Another story:
For 12 years, Scott had remained silent, locked inside his body, quietly watching the world go by. Now, the fMRI had revealed a person: a living, breathing soul who had a life, attitudes, beliefs, memories and experiences, and who had the sense of being somebody who was alive and in the world – no matter how strange and limited that world had become.
On many occasions in the months that followed, we conversed with Scott in the scanner. He expressed himself, speaking to us through this magical connection we had made between his mind and our machine. Somehow, Scott came back to life. He was able to tell us that he knew who he was; he knew where he was; and he knew how much time had passed since his accident. And thankfully, he confirmed that he wasn’t in any pain.
Neuroethics, and the criteria by which you are declared brain dead, are in for an upheaval.
After a major injury, some patients are in such serious condition that doctors deliberately place them in an artificial coma to protect their body and brain so they can recover. That could be a mistake. Based on experiments in cats, an extreme deep coma may actually be more protective. “Indeed, an organ or muscle that remains inactive for a long time eventually atrophies. It is plausible that the same applies to a brain kept for an extended period in a state corresponding to a flat EEG. An inactive brain coming out of a prolonged coma may be in worse shape than a brain that has had minimal activity. Research on the effects of extreme deep coma during which the hippocampus is active is absolutely vital for the benefit of patients.”
2014-09-11: Brain coupling
…intriguing new possibilities for computer-assisted communication of brain states between individuals. The brain-to-brain method may be used to augment this mutual coupling of the brains, and may have a positive impact on human social behavior.
2015-07-10: Rat onemind.
Brainet uses signals from arrays of electrodes implanted in the brains of multiple rodents to merge their collective brain activity and jointly control a virtual avatar arm, or even perform sophisticated computations, including image pattern recognition and weather forecasting.
2015-09-26: Unaided paraplegic walking
A novel brain-computer-interface has allowed a paraplegic man to walk for a short distance, unaided by an exoskeleton or other types of robotic support.
2016-06-01: Remote controlled insects. This is an improvement over the robo cockroach:
The rapid pace of miniaturization is swiftly blurring the line between the technological base we’ve created and the technological base that created us. Extreme miniaturization and advanced neural interfaces have enabled us to explore the remote control of insects in free flight via implantable, radio-equipped miniature neural stimulating systems.
2016-08-04: Neural Dust
UC Berkeley researchers are developing “Neural Dust,” tiny wireless sensors for implanting in the brain, muscles, and intestines that could someday be used to control prosthetics or serve as “electroceuticals” to treat epilepsy or fire up the immune system. So far, they’ve tested a 3-millimeter-long version of the device in rats. “I think the long-term prospects for neural dust are not only within nerves and the brain, but much broader. Having access to in-body telemetry has never been possible because there has been no way to put something supertiny superdeep. But now I can take a speck of nothing and park it next to a nerve or organ, your GI tract or a muscle, and read out the data.”
2016-09-11: Do we really want to fuse our brains together?
If a rat can teach herself to use a completely new sensory modality – something the species has never experienced throughout the course of its evolutionary history – is there any cause to believe our own brains will prove any less capable of integrating novel forms of input?
2016-10-04: CCortex
Artificial Development is building CCortex, a massive spiking neural network simulation of the human cortex and peripheral systems. Upon completion, CCortex will represent up to 20 billion neurons and 20 trillion connections, achieving a level of complexity that rivals the mammalian brain, and making it the largest, most biologically realistic neural network ever built. The system is up to 10,000 times larger than any previous attempt to replicate primary characteristics of human intelligence.
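For a sense of scale, even storing the connectivity is a data-center-sized problem. A back-of-the-envelope, assuming (hypothetically) one 4-byte weight per synapse:

```python
neurons = 20e9            # 20 billion neurons (from the announcement)
synapses = 20e12          # 20 trillion connections
bytes_per_synapse = 4     # assumption: one float32 weight each

print(f"{synapses * bytes_per_synapse / 1e12:.0f} TB for weights alone")
# -> 80 TB, before neuron state, spike buffers, or connectivity indices.
```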
2017-03-23: Our Future Cyborg Brains
2017-09-05: 100x smaller Antennas
Antennas 100x smaller could lead to tiny brain implants, micro-medical devices, or phones you can wear on your finger. The antennas are expected to have sizes comparable to the acoustic wavelength, leading to orders-of-magnitude size reductions compared to state-of-the-art compact antennas. These miniaturized ME antennas also have drastically enhanced antenna gain at small size, owing to their acoustically actuated magnetoelectric (ME) receiving/transmitting mechanism at RF frequencies.
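The size argument is easy to sanity-check: a resonant antenna scales with the wavelength, and sound in the magnetoelectric film travels roughly five orders of magnitude slower than light, so the acoustic wavelength at the same RF frequency is that much shorter. A quick calculation (the 2.45 GHz frequency and acoustic velocity are illustrative values, not from the paper):

```python
c = 3.0e8            # speed of light, m/s
v_acoustic = 5.0e3   # typical acoustic velocity in a thin film, m/s (assumption)
f = 2.45e9           # illustrative RF frequency, Hz

lambda_em = c / f
lambda_ac = v_acoustic / f
print(f"EM wavelength:       {lambda_em * 100:.1f} cm")    # ~12.2 cm
print(f"Acoustic wavelength: {lambda_ac * 1e6:.1f} µm")    # ~2.0 µm
print(f"Ratio:               {lambda_em / lambda_ac:,.0f}x")
```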
2018-02-27: EEG image reconstruction:
The new technique “could provide a means of communication for people who are unable to verbally communicate. It could also have forensic uses for law enforcement in gathering eyewitness information on potential suspects, rather than relying on verbal descriptions provided to a sketch artist.”
2018-05-14: Tetraplegics win race
But what about letting patients actively participate with AI in improving performance? To test that idea, researchers ran a “mutual learning” experiment between computer and human, with 2 severely impaired (tetraplegic) participants with chronic spinal cord injury. The goal: win a live virtual racing game at an international event. After training for several months, on Oct. 8, 2016, the 2 pilots competed in Cybathlon in Zurich, Switzerland, the first international para-Olympics for disabled individuals in control of bionic assistive technology. 1 of those pilots won the gold medal and the other held the tournament record.
2018-09-11: DARPA Neurotechnology
DARPA is funding development of high-resolution brain interfaces. At the same time, 2 companies have breakthrough technology for higher-resolution brain interfaces: Elon Musk’s Neuralink and Mary Lou Jepsen’s Openwater red-light scanner.
2019-02-09: 75% Thought to Speech
A system that translates thought into intelligible speech: devices monitor brain activity, and artificial intelligence reconstructs the words a person hears. The breakthrough harnesses the power of speech synthesizers and artificial intelligence, and could lead to new ways for computers to communicate directly with the brain. The DNN-vocoder combination achieved the best performance (75% accuracy), 67% higher than the baseline system (linear regression onto an auditory spectrogram).
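The baseline is worth seeing concretely: linear regression maps neural features straight onto an auditory spectrogram, which a vocoder then inverts to audio; the DNN-vocoder pipeline that beat it by 67% replaces this map with learned networks. A minimal numpy sketch of the baseline only; shapes and data are placeholders:

```python
import numpy as np

rng = np.random.default_rng(1)
T, n_elec, n_freq = 5000, 128, 32    # time bins, electrodes, spectrogram bands

X = rng.normal(size=(T, n_elec))     # neural features (placeholder data)
S = rng.normal(size=(T, n_freq))     # target auditory spectrogram

# Ridge-regularized least squares: find W so that S ≈ X @ W.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_elec), X.T @ S)

S_hat = X @ W   # reconstructed spectrogram; a vocoder would turn this into audio
```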

2019-04-24: 43% Thought to speech
An implanted brain-computer interface coupled with deep-learning algorithms can translate thought into computerized speech. The researchers asked native English speakers on Amazon’s Mechanical Turk crowdsourcing marketplace to transcribe the sentences they heard. Listeners transcribed the sentences accurately 43% of the time when given a set of 25 possible words to choose from, and 21% of the time when given 50 words. Although the accuracy rate remains low, it would be good enough to make a meaningful difference to a “locked-in” person, who is almost completely paralyzed and unable to speak.

2019-05-02: HCI Superpowers
The new documentary I Am Human chronicles how neurotechnology could restore sight, retrain the body, and treat diseases—then make us all more than human.
2019-08-01: Facebook has a 76% system:
Here, human participants listened to questions and responded aloud with answers while we used high-density electrocorticography (ECoG) recordings to detect when they heard or said an utterance and to then decode the utterance’s identity. Because certain answers were only plausible responses to certain questions, we could dynamically update the prior probabilities of each answer using the decoded question likelihoods as context. We decode produced and perceived utterances with accuracy rates as high as 61% and 76%, respectively (chance is 7% and 20%). Contextual integration of decoded question likelihoods significantly improves answer decoding. These results demonstrate real-time decoding of speech in an interactive, conversational setting, which has important implications for patients who are unable to communicate.
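The contextual integration is plain Bayes: the decoded question distribution induces a prior over answers (only some answers are plausible for each question), which is multiplied by the answer decoder’s likelihoods. A hedged sketch; the question/answer sets and all probabilities here are invented:

```python
import numpy as np

questions = ["How is your room?", "How are you feeling?"]
answers = ["bright", "dark", "good", "bad", "fine"]

# Which answers plausibly follow which question (assumed mapping).
plausible = {0: {"bright", "dark"}, 1: {"good", "bad", "fine"}}

p_question = np.array([0.8, 0.2])    # decoded question likelihoods
p_answer_given_q = np.array([
    [1 / len(plausible[q]) if a in plausible[q] else 0.0 for a in answers]
    for q in range(len(questions))
])

prior = p_question @ p_answer_given_q                 # context-derived prior
likelihood = np.array([0.3, 0.25, 0.2, 0.15, 0.1])    # raw answer decoder output

posterior = prior * likelihood
posterior /= posterior.sum()
print(dict(zip(answers, posterior.round(3))))   # mass shifts to "bright"/"dark"
```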
2019-10-30: Brain-to-Brain communication for group problem-solving
The interface combines electroencephalography (EEG) to record brain signals and transcranial magnetic stimulation (TMS) to deliver information noninvasively to the brain. The interface allows 3 human subjects to collaborate and solve a task using direct brain-to-brain communication. 2 of the 3 subjects are designated as “Senders” whose brain signals are decoded using real-time EEG data analysis. The decoding process extracts each Sender’s decision about whether to rotate a block in a Tetris-like game before it is dropped to fill a line. The Senders’ decisions are transmitted via the Internet to the brain of a third subject, the “Receiver,” who cannot see the game screen. The Senders’ decisions are delivered to the Receiver’s brain via magnetic stimulation of the occipital cortex. The Receiver integrates the information received from the 2 Senders and uses an EEG interface to make a decision about either turning the block or keeping it in the same orientation. A second round of the game provides an additional chance for the Senders to evaluate the Receiver’s decision and send feedback to the Receiver’s brain, and for the Receiver to rectify a possible incorrect decision made in the first round.
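As a protocol it’s a tiny pipeline: decode two Senders, stimulate the Receiver once per Sender, decode the Receiver’s choice. A toy sketch of that flow with every neural step stubbed out; all function bodies are placeholders, not the study’s decoders:

```python
import random

def decode_sender_eeg(sender_id):
    """Stub for the real-time EEG classifier: True = 'rotate the block'."""
    return random.random() < 0.5

def stimulate_receiver(decision):
    """Stub for the TMS pulse: a phosphene in the Receiver's visual
    field encodes 'rotate'; no phosphene encodes 'keep'."""
    print("phosphene" if decision else "no phosphene")

def decode_receiver_eeg(sender_inputs):
    """Stub for the Receiver's EEG choice. As a placeholder rule,
    rotate if either Sender signaled rotate."""
    return any(sender_inputs)

sender_decisions = [decode_sender_eeg(i) for i in (1, 2)]
for d in sender_decisions:
    stimulate_receiver(d)

print("rotate" if decode_receiver_eeg(sender_decisions) else "keep")
# A second round would let the Senders send corrective feedback.
```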
2021-05-14: 94% Thought to text
Using an implant, a paralyzed individual achieved typing speeds of 90 characters per minute with 94.1% raw accuracy online, and greater than 99% accuracy offline with a general-purpose autocorrect. Despite working with a relatively small amount of data (only 242 sentences’ worth of characters), the system worked remarkably well. The lag between the thought and a character appearing on screen was ~500ms, and the participant was able to produce 90 characters per minute, easily topping the previous record for implant-driven typing, which was ~25 characters per minute.
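The offline jump from 94% to over 99% came from a general-purpose language model layered on top of the raw character decoder. Even a crude dictionary autocorrect shows the principle: snap each decoded word to its nearest vocabulary entry. The vocabulary and decoder output below are made up:

```python
from difflib import get_close_matches

vocab = ["hello", "world", "typing", "with", "thought"]

def autocorrect(decoded_words, vocabulary):
    """Snap each raw decoded word to its closest dictionary entry."""
    corrected = []
    for w in decoded_words:
        match = get_close_matches(w, vocabulary, n=1, cutoff=0.0)
        corrected.append(match[0] if match else w)
    return corrected

print(autocorrect(["helto", "worxd"], vocab))   # -> ['hello', 'world']
```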
2022-04-15: EEG electrodes are terrible sensors. In-ear ones may fix that, allowing continuous readings, and perhaps writing too.
But while the immediate uses of NextSense’s earbuds are medical, Berent hopes to eventually build a mass-market brain monitor that, if enough people start using it, can generate enormous quantities of day-to-day brain performance data. The catch, of course, is that since no one has ever done that, it’s not yet obvious what most people would get out of the information. That’s also what’s exciting. “We don’t necessarily know what we would learn, because we’ve never had access to that type of data.”
Berent and his team envision a multipurpose device that can stream music and phone calls like AirPods; boost local sound like a hearing aid; and monitor your brain to provide a window into your moods, attention, sleep patterns, and periods of depression. He also hopes to zero in on a few sizes that would fit a vast majority of people, to dispense with all the ear-scanning.
Far along on the NextSense road map is something unproven, and kind of wild. If AI can decode tons of brain data, the next step would be to change those patterns, perhaps by doing something as simple as playing a well-timed sound. “It’s almost a transformative moment in history,” says Berent, fascinated by the prospect of using audio to nudge someone into a deeper sleep state. “It’s so convenient, it doesn’t bother you. People are wearing stuff in the ear typically anyway, right?”

2023-01-24: Faster speech to text
Our BCI decoded speech at 62 words per minute, which is 3.4x faster than the prior record for any kind of BCI and begins to approach the speed of natural conversation (160 words per minute). We highlight 2 aspects of the neural code for speech that are encouraging for speech BCIs: spatially intermixed tuning to speech articulators that makes accurate decoding possible from only a small region of cortex, and a detailed articulatory representation of phonemes that persists years after paralysis. These results show a feasible path forward for using intracortical speech BCIs to restore rapid communication to people with paralysis who can no longer speak.
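The reported pipeline is roughly: neural activity → recurrent network emitting phoneme probabilities → language model assembling words. A minimal CTC-style greedy decode over invented phoneme posteriors, just to show the collapsing step (the phoneme set and data are placeholders):

```python
import numpy as np

phonemes = ["_", "HH", "AH", "L", "OW"]   # "_" = CTC blank (illustrative set)

rng = np.random.default_rng(2)
logits = rng.normal(size=(12, len(phonemes)))  # fake per-bin RNN outputs
best = logits.argmax(axis=1)                   # greedy choice per time bin

# CTC collapse: merge repeated symbols, then drop blanks.
decoded, prev = [], None
for idx in best:
    if idx != prev and phonemes[idx] != "_":
        decoded.append(phonemes[idx])
    prev = idx

print(decoded)   # a language model would then map phoneme strings to words
```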