Tag: science

Computational REST

REpresentational State Transfer (REST) guided the creation and expansion of the modern web. The reforms introduced with REST permitted the web to achieve its goal as an internet-scale distributed hypermedia system. Yet the web has now seen the introduction of a vast sea of shared and interdependent services. Despite the expressive power of REST, these new services have not consistently realized its anticipated benefits. To better understand the unwritten axioms necessary to realize these benefits, we survey the history and evolution of the web’s infrastructure – including Apache HTTP Server, Firefox, and Squid. We also recount our experiences developing such systems and the challenges we faced due to the lack of thorough design guidance. We then critically examine these new services from the vast sea – including Service-oriented architectures, RESTful Web Services, and AJAX – to glean previously undocumented lessons about how these services are constructed and why they do not consistently realize the benefits expected from REST. Based on these lessons, this dissertation presents a new architectural style called Computational REST (CREST). This style recasts the web from a model where content is the fundamental measure of exchange to one where computational exchange is the primary mechanism. This crucial shift yields a number of new axioms and constraints that provide new ways of thinking about the construction of web applications. We show that this new style pinpoints, in many cases, the root cause of the apparent dissonance between style and implementation in critical portions of the web’s infrastructure. CREST also explains emerging web architectures (such as mashups) and points to novel computational structures. Furthermore, CREST provides the design guidance necessary to create web applications not seen before.
These applications are characterized by the presence of recombinant services which rely upon fine-grained computational exchange to permit rapid evolution.

Justin has some big plans for his dissertation.

Brain Communication

A game where you compete in relaxation. The players’ brainwaves control a ball on a table, and the more relaxed player scores a goal against the opponent.

2006-11-15: Slow but steady progress

Hitachi has successfully tested a brain-machine interface that allows users to turn power switches on and off with their mind. Relying on optical topography, a neuroimaging technique that uses near-infrared light to map blood concentration in the brain, the system can recognize the changes in brain blood flow associated with mental activity and translate those changes into voltage signals for controlling external devices. In the experiments, test subjects were able to activate the power switch of a model train by performing mental arithmetic and reciting items from memory.

2007-05-29: National Neurotechnology Initiative

We’ve learned more about the brain in the last 5 years than we did in the last 50 years. Lynch is working on a proposal for a 5-year National Neurotechnology Initiative with a budget of $200 million a year. It would identify projects to fund, such as the development of a “brain interface” device that would route signals from the muscles and sensory organs; technology that would allow nerves to control prosthetic devices; and a brain-simulation project that would replicate the way the brain works.

2007-08-25: Brainloop, Google Earth controlled by a brain computer interface.

2008-02-20: EEG startup.

Emotiv has created technologies that allow machines to take both conscious and non-conscious inputs directly from your mind.

I think I have future shock with this one.

Between this and haptic interfaces… woah.

2008-10-23: An update on the neuro cyborgs

He inserts a 4 sq. mm array of 100 neural probes into the M1 arm knob of the cortex. With a random sample of neural signaling from that region of the brain, and some Kalman filtering, patients can instantly control the cursor on screen (unlike biofeedback or sensory remapping, which require training). They can deduce motor intent from a sample of 24 neurons on average. When connected to a robot hand and asked to “make a fist,” the patient exclaimed “holy shit” as it worked on the first try. Prior to the experiments, open questions included: Do the neurons stay active (other work indicates that the motor cortex reorganizes within minutes of decoupled sensory input)? Can thinking still activate the motor neurons? The test patients had been in sensory deprivation for 2-9 years prior. Will there be scarring and degradation over time? 1 patient is 3 years in. What are the neural plasticity effects?
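The decoding step described above can be sketched with a textbook Kalman filter: treat intended 2-D cursor velocity as hidden state and binned firing rates as noisy linear observations of it. Everything below (the tuning matrix, noise levels, and simulated data) is invented for illustration, not taken from the actual experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 24 neurons whose firing rates are a noisy linear
# function of intended 2-D cursor velocity (H is an invented tuning matrix).
n_neurons = 24
H = rng.normal(size=(n_neurons, 2))   # rates ≈ H @ velocity
A = np.eye(2)                         # velocity modeled as roughly constant
Q = 0.01 * np.eye(2)                  # process noise
R = 0.5 * np.eye(n_neurons)           # observation (spiking) noise

def decode(rates):
    """Kalman-filter a (T, n_neurons) array of binned rates into velocities."""
    x, P = np.zeros(2), np.eye(2)
    estimates = []
    for z in rates:
        x, P = A @ x, A @ P @ A.T + Q        # predict
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + K @ (z - H @ x)              # correct with the innovation
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)

# Simulate a patient intending to move the cursor up and to the right.
true_v = np.array([1.0, 0.5])
rates = true_v @ H.T + rng.normal(scale=0.5, size=(200, n_neurons))
est = decode(rates)
```

After a couple hundred bins the estimate settles near the intended velocity, which matches the article’s point about why no user training is needed: the filter, not the patient, does the adapting.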

2012-07-01: Brain in a vat is here!

The first real-time brain-scanning speller will allow people in an apparent vegetative state to communicate.

2012-12-21: HCI chocolate

Researchers described the brain-computer interface that allowed Ms. Scheuermann to move an arm, turn and bend a wrist, and close a hand for the first time in 9 years. Less than 1 year after she told the research team, “I’m going to feed myself chocolate before this is over,” Ms. Scheuermann savored its taste and announced as they applauded her feat, “1 small nibble for a woman, 1 giant bite for BCI.”

2013-03-01: Brain to brain communication

Even though the animals were on different continents, with the resulting noisy transmission and signal delays, they could still communicate. This tells us that we could create a workable network of animal brains distributed across many different locations.

2013-03-17: Hive mind privacy. One of the most interesting arguments for privacy in our (near) hive mind: to cut down on the quadratic communication overhead. Even our brain isn’t fully connected, rather sparsely in fact.
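The quadratic overhead is easy to make concrete: fully connecting n brains takes n(n-1)/2 links, while a sparse network with k neighbors per node takes only about nk/2. A quick sketch (the function names are mine):

```python
# Every pair of n nodes linked: n(n-1)/2 links, i.e. O(n^2).
def full_links(n):
    return n * (n - 1) // 2

# Each node keeps only k neighbors: about n*k/2 links, i.e. O(n).
def sparse_links(n, k):
    return n * k // 2

# A million brains: ~500 billion links fully connected vs 5 million sparse.
print(full_links(1_000_000), sparse_links(1_000_000, k=10))
```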

2014-03-04: I had somehow missed this 2 years ago. In the estimation of Mary Lou Jepsen (“Could future devices read images from our brains?”), it should be possible to increase resolution 1000x in the next few years.

2014-04-27: Vegetative patients may be aware

a significant proportion of patients who were classified as vegetative in recent years have been misdiagnosed – Owen estimates perhaps 20%. Schiff, who weighs up the extent of misdiagnosis a different way, goes further. Based on recent studies, he says 40% of patients thought to be vegetative are, when examined more closely, partly aware. Among this group of supposedly vegetative patients are those who are revealed by scanners to be able to communicate and should be diagnosed as locked-in, if they are fully conscious, or minimally conscious, if their abilities wax and wane. But Schiff believes the remainder will have to be defined another way altogether, since being aware does not necessarily mean being able to use mental imagery. Nor does being aware enough to follow a command mean possessing the ability to communicate.

Another story:

For 12 years, Scott had remained silent, locked inside his body, quietly watching the world go by. Now, the fMRI had revealed a person: a living, breathing soul who had a life, attitudes, beliefs, memories and experiences, and who had the sense of being somebody who was alive and in the world – no matter how strange and limited that world had become.

On many occasions in the months that followed, we conversed with Scott in the scanner. He expressed himself, speaking to us through this magical connection we had made between his mind and our machine. Somehow, Scott came back to life. He was able to tell us that he knew who he was; he knew where he was; and he knew how much time had passed since his accident. And thankfully, he confirmed that he wasn’t in any pain.

Neuroethics, and how you are declared brain dead, are in for an upheaval.

After a major injury, some patients are in such serious condition that doctors deliberately place them in an artificial coma to protect their body and brain so they can recover. That could be a mistake. An extreme deep coma — based on the experiment on the cats — may actually be more protective. “Indeed, an organ or muscle that remains inactive for a long time eventually atrophies. It is plausible that the same applies to a brain kept for an extended period in a state corresponding to a flat EEG. An inactive brain coming out of a prolonged coma may be in worse shape than a brain that has had minimal activity. Research on the effects of extreme deep coma during which the hippocampus is active is absolutely vital for the benefit of patients.”

2014-09-11: Brain coupling

intriguing new possibilities for computer-assisted communication of brain states between individuals. The brain-to-brain method may be used to augment this mutual coupling of the brains, and may have a positive impact on human social behavior

2015-07-10: Rat onemind.

Brainet uses signals from an array of electrodes implanted in the brains of multiple rodents in experiments to merge their collective brain activity and jointly control a virtual avatar arm or even perform sophisticated computations — including image pattern recognition and even weather forecasting.

2015-09-26: Unaided paraplegic walking

A novel brain-computer-interface has allowed a paraplegic man to walk for a short distance, unaided by an exoskeleton or other types of robotic support.

2016-06-01: Remote controlled insects. This is an improvement over the robo cockroach:

The rapid pace of miniaturization is swiftly blurring the line between the technological base we’ve created and the technological base that created us. Extreme miniaturization and advanced neural interfaces have enabled us to explore the remote control of insects in free flight via implantable radio-equipped miniature neural stimulating systems.

2016-08-04: Neural Dust

UC Berkeley researchers are developing “Neural Dust,” tiny wireless sensors for implanting in the brain, muscles, and intestines that could someday be used to control prosthetics or as “electroceuticals” to treat epilepsy or fire up the immune system. So far, they’ve tested a 3-millimeter-long version of the device in rats. “I think the long-term prospects for neural dust are not only within nerves and the brain, but much broader. Having access to in-body telemetry has never been possible because there has been no way to put something supertiny superdeep. But now I can take a speck of nothing and park it next to a nerve or organ, your GI tract or a muscle, and read out the data.”

2016-09-11: Do we really want to fuse our brains together?

If a rat can teach herself to use a completely new sensory modality – something the species has never experienced throughout the course of its evolutionary history – is there any cause to believe our own brains will prove any less capable of integrating novel forms of input?

2016-10-04: CCortex

Artificial Development is building CCortex, a massive spiking neural network simulation of the human cortex and peripheral systems. Upon completion, CCortex will represent up to 20 billion neurons and 20 trillion connections, achieving a level of complexity that rivals the mammalian brain, and making it the largest, most biologically realistic neural network ever built. The system is up to 10,000 times larger than any previous attempt to replicate primary characteristics of human intelligence.

2017-03-23: Our Future Cyborg Brains

2017-09-05: 100x smaller Antennas

Antennas 100x smaller could lead to tiny brain implants, micro–medical devices, or phones you can wear on your finger. The antennas are expected to have sizes comparable to the acoustic wavelength, thus leading to orders of magnitude reduced antenna size compared to state-of-the-art compact antennas. These miniaturized ME antennas have drastically enhanced antenna gain at small size owing to the acoustically actuated ME effect based receiving/transmitting mechanisms at RF frequencies.

2018-02-27: EEG image reconstruction:

The new technique “could provide a means of communication for people who are unable to verbally communicate. It could also have forensic uses for law enforcement in gathering eyewitness information on potential suspects, rather than relying on verbal descriptions provided to a sketch artist.”

2018-05-14: Tetraplegics win race

But what about letting patients actively participate with AI in improving performance? To test that idea, researchers ran a “mutual learning” experiment between computer and humans — 2 severely impaired (tetraplegic) participants with chronic spinal cord injury. The goal: win a live virtual racing game at an international event. After training for several months, on Oct. 8, 2016, the 2 pilots participated in the Cybathlon in Zurich, Switzerland — the first international para-Olympics for disabled individuals in control of bionic assistive technology. 1 of those pilots won the gold medal and the other held the tournament record.

2018-09-11: DARPA Neurotechnology

DARPA is funding development of high resolution brain interfaces. At the same time there are 2 companies who have breakthrough technology for higher resolution brain interfaces. The 2 companies are Elon Musk’s Neuralink and Mary Lou Jepsen’s Openwater red light scanner.

2019-02-09: 75% Thought to Speech

A system that translates thought into intelligible speech. Devices monitor brain activity and Artificial Intelligence reconstructs the words a person hears. This breakthrough harnesses the power of speech synthesizers and artificial intelligence. It could lead to new ways for computers to communicate directly with the brain. The DNN-vocoder combination achieved the best performance (75% accuracy), which is 67% higher than the baseline system (Linear regression with auditory spectrogram).

2019-04-24: 43% Thought to speech

An implanted brain-computer interface (above) coupled with deep-learning algorithms can translate thought into computerized speech. The researchers asked native English speakers on Amazon’s Mechanical Turk crowdsourcing marketplace to transcribe the sentences they heard. The listeners accurately heard the sentences 43% of the time when given a set of 25 possible words to choose from, and 21% of the time when given 50 words. Although the accuracy rate remains low, it would be good enough to make a meaningful difference to a “locked-in” person, who is almost completely paralyzed and unable to speak.

2019-05-02: HCI Superpowers

The new documentary I Am Human chronicles how neurotechnology could restore sight, retrain the body, and treat diseases—then make us all more than human.

2019-08-01: Facebook has a 76% system:

Here, human participants listened to questions and responded aloud with answers while we used high-density electrocorticography (ECoG) recordings to detect when they heard or said an utterance and to then decode the utterance’s identity. Because certain answers were only plausible responses to certain questions, we could dynamically update the prior probabilities of each answer using the decoded question likelihoods as context. We decode produced and perceived utterances with accuracy rates as high as 61% and 76%, respectively (chance is 7% and 20%). Contextual integration of decoded question likelihoods significantly improves answer decoding. These results demonstrate real-time decoding of speech in an interactive, conversational setting, which has important implications for patients who are unable to communicate.
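The “contextual integration” described here is a plain Bayes update: the decoded question likelihoods reshape the prior over answers before the answer decoder’s own evidence is applied. A toy sketch, with questions, answers, and probabilities all invented rather than taken from the paper:

```python
# Invented example vocabulary; the real system decoded these from ECoG.
questions = ["How is your room?", "How do you feel?"]
answers = ["bright", "dark", "good", "bad", "tired"]

# Context: which answers are plausible replies to which question.
plausible = {
    "How is your room?": {"bright", "dark"},
    "How do you feel?": {"good", "bad", "tired"},
}

def answer_posterior(q_likelihood, a_likelihood):
    """Fold decoded question likelihoods into the prior over answers."""
    posterior = {}
    for a in answers:
        # Prior: marginalize answer plausibility over the question likelihoods.
        prior = sum(
            q_likelihood[q] / len(plausible[q])
            for q in questions if a in plausible[q]
        )
        posterior[a] = prior * a_likelihood[a]
    z = sum(posterior.values())
    return {a: p / z for a, p in posterior.items()}

# The answer decoder alone prefers "bright", but the question decoder is
# confident the patient heard "How do you feel?", so context flips the result.
q_like = {"How is your room?": 0.1, "How do you feel?": 0.9}
a_like = {"bright": 0.30, "dark": 0.10, "good": 0.25, "bad": 0.20, "tired": 0.15}
post = answer_posterior(q_like, a_like)
```

With context folded in, “good” wins even though “bright” had the highest raw answer likelihood, which is exactly why the combined accuracy beats either decoder alone.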

2019-10-30: Brain-to-Brain communication for group problem-solving

The interface combines electroencephalography (EEG) to record brain signals and transcranial magnetic stimulation (TMS) to deliver information noninvasively to the brain. The interface allows 3 human subjects to collaborate and solve a task using direct brain-to-brain communication. 2 of the 3 subjects are designated as “Senders” whose brain signals are decoded using real-time EEG data analysis. The decoding process extracts each Sender’s decision about whether to rotate a block in a Tetris-like game before it is dropped to fill a line. The Senders’ decisions are transmitted via the Internet to the brain of a third subject, the “Receiver,” who cannot see the game screen. The Senders’ decisions are delivered to the Receiver’s brain via magnetic stimulation of the occipital cortex. The Receiver integrates the information received from the 2 Senders and uses an EEG interface to make a decision about either turning the block or keeping it in the same orientation. A second round of the game provides an additional chance for the Senders to evaluate the Receiver’s decision and send feedback to the Receiver’s brain, and for the Receiver to rectify a possible incorrect decision made in the first round.

2021-05-14: 94% Thought to text

Using an implant, a paralyzed individual achieved typing speeds of 90 characters per minute with 94.1% raw accuracy online, and greater than 99% accuracy offline with a general-purpose autocorrect. Despite working with a relatively small amount of data (only 242 sentences’ worth of characters), the system worked remarkably well. The lag between the thought and a character appearing on screen was ~500ms, and the participant was able to produce 90 characters per minute, easily topping the previous record for implant-driven typing, which was ~25 characters per minute.

2022-04-15: EEG are terrible sensors. In-ear may fix that, and allow for continuous readings, and perhaps writing too.

But while the immediate uses of NextSense’s earbuds are medical, Berent hopes to eventually build a mass-market brain monitor that, if enough people start using it, can generate enormous quantities of day-to-day brain performance data. The catch, of course, is that since no one has ever done that, it’s not yet obvious what most people would get out of the information. That’s also what’s exciting. “We don’t necessarily know what we would learn because we’ve never had access to that type of data”.

Berent and his team envision a multipurpose device that can stream music and phone calls like AirPods; boost local sound like a hearing aid; and monitor your brain to provide a window into your moods, attention, sleep patterns, and periods of depression. He also hopes to zero in on a few sizes that would fit a vast majority of people, to dispense with all the ear-scanning.

Far along on the NextSense road map is something unproven, and kind of wild. If AI can decode tons of brain data, the next step would be to then change those patterns—perhaps by doing something as simple as playing a well-timed sound. “It’s almost a transformative moment in history,” he says, fascinated by the prospect of using audio to nudge someone into a deeper sleep state. “It’s so convenient, it doesn’t bother you. People are wearing stuff in the ear typically anyway, right?”

2023-01-24: Faster speech to text

Our BCI decoded speech at 62 words per minute, which is 3.4x faster than the prior record for any kind of BCI and begins to approach the speed of natural conversation (160 words per minute). We highlight 2 aspects of the neural code for speech that are encouraging for speech BCIs: spatially intermixed tuning to speech articulators that makes accurate decoding possible from only a small region of cortex, and a detailed articulatory representation of phonemes that persists years after paralysis. These results show a feasible path forward for using intracortical speech BCIs to restore rapid communication to people with paralysis who can no longer speak.

Robot Exploration

A fleet of 100 robotic submarines could in 5 years’ time be roaming the vast unexplored stretches of the world’s seafloors and helping unlock their mysteries. “The pace of exploration in the ocean is going a little too slowly”. Only 5% of the ocean floor has been explored in detail, which means there may be numerous new species and geothermal processes waiting to be discovered.

NOAA plans to map the ocean floors with unmanned vehicles.

2010-10-25: Antarctica Ocean UAV. Such a baby step. We should have fleets of fully autonomous ocean robots by now, mapping the sea floors.

Gavia, a bullet-shaped robot developed by the University of British Columbia, is currently in Antarctica on a mission to explore heretofore uncharted areas of the ocean.

2013-04-21: Some speculation

excavating the past will mean deploying teams of remote-sensing robotic machines semi-autonomously flying, crawling, gridding, scanning, squeezing, and non-destructively burrowing their way into lost rooms and buried cities, perhaps even translating ancient languages along the way.

2018-11-20: Robot Wreck Discovery

The wreckage of the ARA San Juan (S-42) was found by Ocean Infinity. Ocean Infinity used 5 Autonomous Underwater Vehicles (AUVs) to carry out the search. Ocean Infinity’s ocean search capability is the most advanced in the world. Their AUVs are capable of operating in depths from 5 meters to 6000 meters and covering vast areas of the seabed at unparalleled speed. The AUVs are not tethered, allowing them to go deeper and collect higher quality data. They are equipped with side scan sonar, a multi-beam echo-sounder, an HD camera, and synthetic aperture sonar. Ocean Infinity is able to deploy 2 work class ROVs and heavy lifting equipment capable of retrieving objects weighing up to 45T from 6000 meters.

Complex Adaptive Intelligence Community

We must transform the Intelligence Community into a community that dynamically reinvents itself by continuously learning and adapting as the national security environment changes. These changes include allowing our officers more autonomy in the context of improved tradecraft and information sharing. In addition, several new technologies will facilitate this transformation. 2 examples are self-organizing knowledge websites, known as wikis, and information-sharing websites, known as blogs.

Deep Brain Stimulation

Electrical brain stimulation sometimes rouses people from deep coma and is bound to wreak havoc with ethics and brain-death determinations. Terri Schiavo was nothing.

For someone left for dead 12 years ago, Candice Ivey seems to be doing pretty well. She’s still got her homecoming queen looks and A-student smarts. She has earned a college degree and holds a job as a recreational therapist in a retirement community. She has, however, lost her ballerina grace and now walks a bit like her feet are asleep. She slurs her words a little, too, which sometimes leads to trouble. “One time I got pulled over. The cop looked at me and said, ‘What have you been drinking?’ I said, ‘Nothing.’ He said, ‘Get out here and walk the line.’ I was staggering all over the place. He said, ‘All right, blow into this.’ Of course I blew a 0, and he had to let me go.”

2008-09-15: Wireheads

Soon after insertion of the nVPL electrode, the patient noted that stimulation also produced erotic sensations. This pleasurable response was heightened by continuous stimulation at 75% maximal amplitude, frequently augmented by short bursts at maximal amplitude. Though sexual arousal was prominent, no orgasm occurred with these brief increases in stimulation intensity. Despite several episodes of paroxysmal atrial tachycardia and development of adverse behavioral and neurological symptoms during maximal stimulation, compulsive use of the stimulator developed. At its most frequent, the patient self-stimulated throughout the day, neglecting personal hygiene and family commitments. A chronic ulceration developed at the tip of the finger used to adjust the amplitude dial and she frequently tampered with the device in an effort to increase the stimulation amplitude.

2013-06-25: Consider: brain computer interfaces. Without this, this poor guy would have a pretty miserable life.

2015-06-14: Neurophilic implants

But with our injectable electronics, it’s as if it’s not there at all. They are 1 million times more flexible than any state-of-the-art flexible electronics and have subcellular feature sizes. They’re what I call ‘neurophilic’ — they actually like to interact with neurons.

2015-11-09: Self-experimentation

Last year, Kennedy, a 67-year-old neurologist and inventor, did something unprecedented in the annals of self-experimentation. He paid a surgeon in Central America $25K to implant electrodes into his brain in order to establish a connection between his motor cortex and a computer.

2016-05-14: Brainjacking

A group of neurosurgeons round up a set of dire, terrifying warnings about the way that neural implants are vulnerable to networked attacks. Most of the article turns on deep brain stimulation devices, which can be used to stimulate or suppress activity in different parts of the brain, already used to treat some forms of mental illness, chronic pain and other disorders. The researchers round up a whole dystopia’s worth of potential attacks on these implants, including tampering with the victim’s reward system “to exert substantial control over a patient’s behavior”; pain attacks that induce “severe pain in these patients”; and attacks on impulse control that could induce “Mania, hypersexuality, and pathological gambling.”

2021-07-06: Perhaps everyone could lead better lives with a bit of DBS.

Why is Deep Brain Stimulation so transformative – not just eliminating OCD symptoms, but increasing self-confidence and openness to the world? And how can we make sense of self-confidence in the context of electrically induced changes in the brain? It could be that changes in the brain and an increase in self-confidence are both needed to set the sick person right. Understanding the effects of DBS on the brain might therefore be only a part of the explanation of how DBS changes the person.

It is the whole person who responds to DBS, and not only the parts of their brain where the electrodes are implanted. DBS changes many aspects of how a person engages with the world. Their social interactions, tendency to reflect and ruminate, mood, interests and, more generally, their self-confidence in life. Even for those without a pathology, the experience of over- and under-confidence can be common throughout life. Think of going into an interview where your dream job is at stake. In this kind of situation, many might experience a lack of self-confidence. Overconfidence on the job, on the other hand, can lead to precipitous calculations and risks. Too much self-confidence can tip over into impulsive acts that appear pathological; too little self-confidence can lead to anxiety and lack of trust in oneself and the world.

Biohybrids

The so-called biohybrid system sports a power pack and computer all contained within the prosthesis and uses sensors to allow more realistic movements than static, strap-on devices. The first systems have noninvasive sensors attached to the prostheses. In 2 years scientists will implant sensors into study volunteers’ nervous systems.

the state of the art of medical implants / prosthetics