– Hands –
Hands is one of the first experiences we are building. It is a BCI (brain-computer interface) experience designed to research, explore, and understand how a BCI can work with the brain's plasticity (its ability to change) to help people with neurological conditions. The system may also prove useful for other conditions and for general well-being. Scroll down this page for more detailed information.
The CORE section in the menu bar goes into more detail about the team and what we need to finish the first prototype and its 6 related BCI experiences. It also has information about the budget and fundraising strategy for the first prototype.
– Hands –
The Hands experience will initially integrate BCI with an onscreen representation of a person’s hands.
We will determine which brainwaves are related to intentional hand or finger movement, and program our system to respond to this intent to act.
The onscreen hands will respond to the brainwaves of the person wearing the BCI headset.
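One common way to detect movement intent from brainwaves (not necessarily the approach we will ship) is to watch for event-related desynchronization in the sensorimotor mu band, roughly 8–13 Hz: when a person intends to move a hand, mu-band power over the motor cortex typically drops below its resting level. A minimal sketch of that idea, with the sample rate, band edges, and threshold all hypothetical:

```python
import numpy as np

SAMPLE_RATE = 256          # Hz; hypothetical headset sampling rate
MU_BAND = (8.0, 13.0)      # sensorimotor mu rhythm, in Hz

def band_power(samples, fs, band):
    """Total power of `samples` within the frequency `band`, via an FFT."""
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].sum()

def movement_intent(window, baseline_power, drop_ratio=0.5):
    """Flag intent when mu power falls well below the resting baseline
    (event-related desynchronization)."""
    return band_power(window, SAMPLE_RATE, MU_BAND) < drop_ratio * baseline_power
```

In practice the baseline would be measured while the participant rests, and each incoming one-second window of EEG would be tested against it to decide whether the onscreen hands should move.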
The person we know with ALS is starting to lose the ability to move his fingers.
We want to determine if we can slow down or stop the developing paralysis.
We will explore this by seeing whether a BCI-oriented system like this can provide a training regimen that triggers positive changes in his brain: a rewiring of sorts.
* In a subsequent development phase, we will build a motorized exoskeleton system that will allow the patient to use the brain-computer interface to move their fingers and hands as part of the training regimen.
> click here to see a video of a BCI system that uses a motorized exoskeleton system for stroke patients.
Below is a screenshot of a 3D model of a set of virtual hands. They will respond to the brainwave activity of our participants.
This concept will be fleshed out further in a future update.
To start, we are going to build 5 different kinds of BCI experiences related to the first prototype, including the Hands experience.
Once the 5 are ready, we will introduce them to 3 different people: one older adult male with ALS/Lou Gehrig's Disease, one older adult male with Muscular Dystrophy, and one young girl with severe epilepsy. We will expand access to more people over time.
– Hands App – Test 1. (Visual interface is an early stage test only. Future version will look much better.)
A. In Unity 3D authoring, we have been able to move a 3D model of a hand/arm up and down based on specific brainwave activity. This has been done in Demo mode only; Demo mode simulates brainwave activity and triggers responses. We currently have an issue within the authoring environment with making a live connection between the brainwave hardware and Unity 3D. We expect to solve this problem within the next 3 weeks, or sooner.
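The Demo mode described above boils down to two pieces: a generator that fakes a brainwave-activity level, and a mapping from that level to the hand's vertical position. The sketch below illustrates the idea outside Unity (all names are hypothetical; our actual demo runs as a Unity script):

```python
import math

def simulated_activity(t):
    """Demo mode: a smooth, fake 'brainwave activity' level in [0, 1],
    oscillating slowly (0.2 Hz) over time t in seconds."""
    return 0.5 + 0.5 * math.sin(2 * math.pi * 0.2 * t)

def hand_height(activity, lo=-1.0, hi=1.0):
    """Map an activity level in [0, 1] onto the hand's vertical position."""
    activity = min(1.0, max(0.0, activity))   # clamp out-of-range readings
    return lo + activity * (hi - lo)

# Each rendered frame: height = hand_height(simulated_activity(time_now))
```

Replacing `simulated_activity` with a reading from the live headset is the step that is currently blocked by the hardware connection issue; the mapping side stays the same.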
B. Later, in the Alpha stage of the Hands app, the patient will have access to the virtual bones and muscles of the hand and fingers. We will then integrate the BCI headset and software so that the patient can move 1 or more fingers based on brainwave activity.
One of the first people who will test our device is a person with ALS/Lou Gehrig's Disease. He faces various significant challenges, including a continuing decline in the full use of his fingers. We are exploring how this decline can affect brain plasticity.
C. MES – Motorized Exoskeleton System.
Partway through the Beta stage, we will integrate a Motorized Exoskeleton System into the Hands demo. We speculate that this will increase the potential for relearning and brain plasticity.
– more info about this will be posted here soon.
Check out the video at the link below of the BCI-related work done by Dr. Eric C. Leuthardt, a neurological surgeon and biomedical engineer. Leuthardt's BCI system, combined with a motorized exoskeleton device, has helped Rick Arnold, a stroke victim, regain the use of his hands.
> BCI – Brain-computer interface reverses paralysis in stroke victims – Reuters News : Dr. Leuthardt video link
Additional resources are below: