You've heard of Neuralink, and you're probably aware of the advances in prosthetic limbs.
Well, this is the next generation: a bidirectional interface that closes an external recurrent processing loop with the brain.
It can be used for "anything", from controlling limbs to playing video games to storing information in external memory.
One of the key players in this technology is Prof. Eberhard "Eb" Fetz from the University of Washington.
Here's his home page, and a little about him:
wanprc.uw.edu
Eb has a PhD in physics from MIT. He was the first to show that the brain can consciously control the activity of single neurons (in the late 1960s).
The basic idea of the bidirectional interface is straightforward: you have a recording electrode and a stimulator in the same circuit.
It's what's in between that matters.
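To make the loop concrete, here is a minimal sketch in Python. Everything in it is an assumption for illustration - the function names, the toy firing pattern, and the simple threshold-and-gain decoder are stand-ins, not Fetz's actual implementation:

```python
# Minimal sketch of a bidirectional closed loop:
# recording electrode -> "what's in between" -> stimulator.
# All names and numbers here are illustrative assumptions.

def record_spikes(t):
    """Stand-in for the recording electrode: returns a firing rate in Hz."""
    return 40.0 if t % 2 == 0 else 10.0  # toy alternating pattern

def decode(rate, gain=0.5, threshold=20.0):
    """The stage in between: map recorded activity to a stimulation command."""
    return max(0.0, (rate - threshold) * gain)

def stimulate(current_ua):
    """Stand-in for the stimulator; here it just passes the command through."""
    return current_ua

def closed_loop(steps):
    """One pass around the loop per time step."""
    return [stimulate(decode(record_spikes(t))) for t in range(steps)]

print(closed_loop(4))  # -> [10.0, 0.0, 10.0, 0.0]
```

The point of the sketch is that the recording and stimulation ends are fixed hardware; all the interesting behavior lives in `decode`, which is exactly the stage the next paragraphs replace with a neural network.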
If you're into machine learning, you know about "hidden layers". Well, the bidirectional brain-computer interface (BBCI) has hidden layers between the transducer and the stimulator. Basically, it's hooking an external AI directly into your brain.
What Prof. Fetz has shown is that single (artificial) neurons in the external hidden layer(s) can be consciously controlled just like real neurons.
This is a remarkable achievement. It lends a whole new meaning to "expanded consciousness".
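The hidden-layer idea above can be sketched as a tiny network standing in for the BBCI's processing stage. The layer sizes and random weights below are made up for illustration; this is not the actual device, only the shape of the idea:

```python
import math
import random

# Illustrative only: a one-hidden-layer network between the recording
# electrode and the stimulator. Sizes and weights are arbitrary assumptions.

random.seed(0)
N_IN, N_HIDDEN = 3, 4
W1 = [[random.uniform(-1, 1) for _ in range(N_HIDDEN)] for _ in range(N_IN)]
W2 = [random.uniform(-1, 1) for _ in range(N_HIDDEN)]

def forward(rates):
    """Recorded firing rates -> hidden activations -> stimulation command."""
    hidden = [math.tanh(sum(r * W1[i][j] for i, r in enumerate(rates)))
              for j in range(N_HIDDEN)]
    command = sum(h * w for h, w in zip(hidden, W2))
    return hidden, command

# Toy firing rates (Hz) from three recorded channels:
hidden, command = forward([30.0, 12.0, 55.0])
# A user could learn to drive one hidden unit, say hidden[0], up or down --
# the analogue of Fetz's subjects learning to drive single cortical neurons.
print(len(hidden))
```

The claim in the text maps onto this sketch as: the brain learns, through feedback, to push the activation of one `hidden[j]` where it wants it, even though that "neuron" exists only in the external device.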
The only missing piece right now is that the brain cannot yet store real memories in the external device. (That's because we don't yet know what memories really are - we know they involve changes in synaptic weights, but we don't know how they're programmed; we don't know what an "engram" is or how it's accessed.)
So far, this device works only in real time - and so far, it's only our brains learning how the device works and how to control it.
But it's a very short step to the converse situation: the external device learning how our brain works and how to use it.
The future is merging AI with our own brains. Today you can only talk to AI, with ChatGPT or whatever your favorite AI is - but tomorrow you'll be able to think with it, and it with you.
Eb Fetz receives Aspen Brain Forum Prize for innovation in neurotechnology - Washington National Primate Research Center
Eberhard Fetz, WaNPRC core staff scientist, received the first Aspen Brain Forum Prize in Neurotechnology. The prize was awarded to Fetz for “work that has broad application and impact in translating basic research into effective therapeutics within the area of neural prosthetics.”