TL;DR: I am researching and trying to find a solution to help my mum, who has GBS, a debilitating disease that paralyses your entire body, leaving you with only your eyes for communication. Read the intro for more.
I like problems; solving them is one of the things in life I feel brings out the best in me… however, THIS is up there with the biggest challenges I have faced, and I am way out of my depth.
This post was written over a few weeks, with some changes in mum’s situation and continual learning about how this may apply to it.
“Impossible is not a fact, it’s a perception of your current point of view and where you exist in time. All things can be overcome, maybe not now, but always in time. As humans, we have proven this throughout our history.
Keep trying and never give up; the time may be right.”
As anyone reading this may be aware, my mum was recently diagnosed with GBS [Guillain-Barré syndrome], a paralysing disease that attacks the peripheral nervous system and, in the most severe cases, renders you entirely paralysed with only the use of your eyes, almost certainly on ventilator support for some of the time in ICU.
My mum is currently paralysed from the neck down, with a little head and face (lips) movement and the use of her eyes. She is a lot better than a couple of weeks ago and seems settled, if you could ever be in her condition. She has one of the most severe forms of the disease, the ‘axonal’ variant, in which the nerve damage affects the signals passing from the brain to the muscles. She is currently ventilated via a tracheostomy to keep her breathing supported, but is now breathing for the most part on her own.
The ability of the medical staff to communicate effectively with a person that has the condition is badly hampered. Both the patient and the medical staff need a channel to communicate that is simple, effective and designed for this setting.
I explain more about my personal experience with GBS in another post; for one, it is not over, and I am not experiencing it as a patient but as a concerned family member looking on.
So, how can I help?
I am a tech guy, not a medical guy (I can’t even watch Animal Hospital)… however, I have read every research paper on GBS I could find, tracked down every organisation related to the problem, and have a pretty good handle on it – and there is nothing but time with this one.
The only way I can help is to find a way through technology to help her regain her power to communicate and that is what I will do, but first I need to understand all the tech and terminology involved.
Assistive technology
This is the broad category of technology that assists people who have some form of impairment in daily life.
This is a massive category and one I have not delved into before, as I have never had the need. The closest I have come in recent times is voice control and home automation with Google Home and Alexa.
Due to my mum’s condition, I’ll need to capture input via the eyes using some form of tablet computer with an eye tracker or webcam, and this is where I start my research.
The relevant subcategory here is AAC, also known as augmentative and alternative communication – this encompasses communication devices, systems, and tools that replace or support natural speech.
Modern computing input methods
Input methods revolve heavily around the hands, from the standard keyboard and mouse to the fingers for mobile and tablet devices.
Some elements of voice control are now evident in things like Alexa and Google Home; these are more of a novelty than real control, but they do offer hope for those with disabilities who can speak. I use Google Home to turn on lights, heaters and music daily (when in Sydney).
Software is still mainly developed for the standard hand input methods, and support for eye trackers or ‘eye gaze’ technology is few and far between, though it can now be found in several games.
Eye-tracking comes in two parts: ‘eye tracking’, which is similar to a mouse moving across the screen, and ‘gaze interaction’, where the eye focuses on an element for a given timeframe (dwell time), which is similar to the ‘click’ of the mouse.
The input device I need is an eye tracker, which is very much like a mouse in that it reflects the area the user wants to interact with; in this case, instead of a cursor, there is a gaze point – X Y coordinates within an area of the screen.
The usual mouse click action is replaced by a pause (dwell) on the area the user wants to interact with for a set period.
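To make the dwell idea concrete, here is a minimal sketch (in Python, with made-up thresholds – not taken from any real eye-tracker SDK) of how a dwell ‘click’ can be detected from a stream of gaze points:

```python
import math

# Minimal sketch of dwell-time "clicking". The radius and dwell time
# are illustrative values, not from any specific product.
class DwellClicker:
    def __init__(self, radius_px=40, dwell_ms=800):
        self.radius_px = radius_px   # gaze must stay within this radius...
        self.dwell_ms = dwell_ms     # ...for this long to count as a click
        self.anchor = None           # (x, y) where the current dwell started
        self.start_ms = None

    def update(self, x, y, now_ms):
        """Feed one gaze sample; return the dwell point when a click fires."""
        if self.anchor is None or math.dist(self.anchor, (x, y)) > self.radius_px:
            self.anchor = (x, y)     # gaze moved away: restart the dwell timer
            self.start_ms = now_ms
            return None
        if now_ms - self.start_ms >= self.dwell_ms:
            clicked = self.anchor
            self.anchor, self.start_ms = None, None  # reset after the click
            return clicked
        return None
```

The two numbers – the radius and the dwell time – are exactly the trade-offs the commercial software exposes as settings: too short a dwell and everything gets clicked by accident, too long and it becomes exhausting.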
The complication with this is head position: when it moves (lying down, turning from one side to the other), the calibration is affected and must be redone so the device can find the ‘position in space’ of the eyes.
How does eye-tracking technology work?
There are three methods of eye tracking from what I understand: camera only (webcam), infrared light and camera combinations, and glasses (built into VR headsets in some cases).
The only two methods I am interested in are the webcam option and the IR-and-camera option, as the others are way too expensive and not suitable for my mum’s condition.
Webcam only:
This option relies on head and eye movement mapping, taking pictures at a high frame rate and then using an algorithm to determine where the person is looking based on what is on the screen.
It doesn’t use IR light, so it can be too inaccurate to serve as a useful input method for navigating the screen.
I did test a few software options on my built-in laptop camera, but these were a little frustrating. I can see where this technology would be useful in a marketing sense, but not currently as an input method.
IR Light and Camera:
This method uses a separate hardware device that fixes to the bottom of your screen, laptop or tablet device.
These devices use invisible infrared light to illuminate the eyes, where (in Tobii products at least) they create a 3D model of the eyes from the light reflected off the retina and the cornea. The best way to understand this: when you look at your cat or dog through a mobile phone camera, the eyes are often lit up – this is what the IR light does to help determine your eye position.
The devices need to know where your eyes are relative to where the device is located, and where your eyes are looking. After the system gathers all of this information, it uses a complex algorithm to filter out the ‘noise’ and make smart judgements about where you are looking.
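As a toy illustration of that noise-filtering step – assuming nothing about any vendor’s actual algorithms – even a simple moving average over recent gaze samples smooths the jittery raw signal into something usable as a pointer:

```python
from collections import deque

# Illustrative noise filter for raw gaze samples: a plain moving average.
# Real trackers use far more sophisticated filtering, but the idea is the
# same - smooth the jittery raw signal before using it as a pointer.
class GazeSmoother:
    def __init__(self, window=5):
        self.xs = deque(maxlen=window)  # deque drops the oldest sample
        self.ys = deque(maxlen=window)  # automatically once full

    def smooth(self, x, y):
        """Add one raw sample and return the averaged (x, y) position."""
        self.xs.append(x)
        self.ys.append(y)
        return (sum(self.xs) / len(self.xs), sum(self.ys) / len(self.ys))
```

The window size is the usual trade-off: a larger window gives a steadier pointer but makes it lag behind fast eye movements.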
This is the type of device that appears in all of my research and appears to be the hardware of choice for people with motor impairment.
More detailed information on how these systems work can be found here
Pros and cons of both types can be found in a great article here
Who are the players in eye-tracking tech and what kind of prices?
Eye-tracking companies appear to be getting absorbed into the likes of Facebook (Oculus), Google and Apple in recent times… this is mainly due to their advancement in AR and VR and needing a more suitable input method and biosensors for use in these technologies.
Some companies who provide eye-tracking tech do not offer solutions for the assistive market, so I’ll leave them out of my research, but this is quite obviously a growing market and the devices are not cheap!
Prices are not readily available for most companies’ products; however, they range from £200 – £10,000+, and this is the hardware, not the software. The lower end are the gaming options, not intended for use in medical or research applications *apparently.
Tobii: By far the leader in terms of products, research and overall internet saturation. They have a massive range of eye trackers, tablets and software to go with them. They also provide products for gaming usage.
- Tobii PCEye 5 (with communicator software, plus pc control) – £1,501
- Tobii 5 Eye tracker (Gaming) – £183 (quite a bit of a price difference)
Here are some others in no particular order, each offers a slightly different option and use case.
- EyeTech: EyeTech Digital TM5 £2,500
- Eyegaze: £3,000 for the hardware and software package, and goes up.
- Think Smart Box: (was owned by Tobii, but the competition watchdog required a sale) – they make software, but also have a hardware-software solution.
- Eye Control: a hardware glasses-type solution; unsure if it is even sold, but it would solve the ICU issue based on a number of factors – the website won’t let me submit a form!
- Gazepoint: Hardware eye tracker, GP3 – £500
- Iris Bond (makers of Tallk*, with Samsung) – Hardware eye tracker – waiting on pricing.
There are probably a lot more… and I’ll update as I find them.
*Samsung has teamed up with a software/hardware provider on an app called Tallk – T[all]k – which purports to provide a hardware and software solution from *any Samsung tablet or phone… it is not yet available in any language other than Spanish, but I would love to test it. More info on the project can be found here, and the promo video here looks interesting.
What is important in eye-tracking technologies?
Latency:
How quickly the device takes the input (eye movement) and transfers it to the output device. If there were a delay between looking at something on the screen and the desired action, this could cause frustration in real-time communication, much like a delay on a telephone call.
Accuracy and precision:
How accurately the hardware and internal software algorithms pick up the eyes’ focal points on the screen. If these were wildly off, users would end up clicking on the wrong thing.
Calibration:
Every eye is different and every head position is different; calibration is key to accuracy. Head position can change with each use, and if it changes it will affect the calibration. The same goes for the user of the device: each person’s eyes are different and affect the accuracy of the gaze points.
A good video illustrating calibration (watch the first minute): https://www.youtube.com/watch?v=gYtF3j2Hdl4
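To illustrate what calibration is doing under the hood – a deliberately simplified model, not what any vendor actually ships – here is a least-squares fit of a per-axis scale and offset that maps raw tracker readings onto known on-screen target positions:

```python
# Toy illustration of calibration: fit a per-axis linear map
# (target ~ scale * raw + offset) from the tracker's raw gaze
# readings to known on-screen target positions. Real calibration
# fits a richer model of the eye, but the principle is the same.
def fit_axis(raw, target):
    """Least-squares fit of scale and offset for one screen axis."""
    n = len(raw)
    mean_r = sum(raw) / n
    mean_t = sum(target) / n
    cov = sum((r - mean_r) * (t - mean_t) for r, t in zip(raw, target))
    var = sum((r - mean_r) ** 2 for r in raw)
    scale = cov / var
    offset = mean_t - scale * mean_r
    return scale, offset

# During calibration the user looks at known points; afterwards the
# fitted map converts raw readings into screen coordinates.
raw_x = [0.10, 0.50, 0.90]      # raw tracker readings (normalised)
screen_x = [100, 960, 1820]     # known target x positions in pixels
scale, offset = fit_axis(raw_x, screen_x)
```

This also shows why a head-position change forces recalibration: moving the head changes the relationship between raw readings and screen positions, so the fitted scale and offset no longer apply.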
Lighting:
Indoor or outdoor usage affects the cameras and the projected IR light.
What are the software options available for eye-tracking technology?
In the assistive technology space, there are a few options for eye-tracker-enabled software, and they vary in price and functionality.
These are often provided by the hardware companies and come as an all-in-one solution; however, there are a few standalone options available.
During my research, I have come across several open-source solutions, built either by a university research team or by a concerned relative looking for a low-cost solution.
There are two types of software: emulation and purpose-built applications. The emulation type tries to enable the computer’s full functionality, whilst the purpose-built version enables only what’s needed in the context of the app.
Windows software is the most common and has the most functionality.
Grid 3 by Think Smart Box, the cost is about £550 and comes with a 60-day trial.
Communicator 5 by Tobii Dynavox, the cost is about £428 and comes with a 30-day trial.
OptiKey (open source) comes with a few options but is not as comprehensive as the two above; again, it does depend on the use case.
GazeTheWeb (open source) is a browser application for eye trackers by a German university research team, who just got funding for a more commercial product off the back of this research.
Talon is mouse emulation software for eye trackers (amongst other things).
There are probably a ton of other solutions on the market and I will evaluate them if the need arises as I move through the process.
The solutions vary a lot in what they can do, and each will suit a specific set of needs; for my current use case, it needs to be as simple as possible.
What does AAC software typically do?
Generally speaking, the common AAC software I have looked at presents a set of grids that represent language.
The grids have words, symbols or pictures representing common themes you would encounter day to day, and they have a speech engine. You gaze at a grid item for a period of time, it is activated, and it speaks for you.
Most of the solutions have different types of grids for different users’ needs, based on age or ability.
Often they also contain a letter board so the user can spell a word that is then ‘spoken’ by the system to whoever they are communicating with.
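A minimal sketch of that grid-plus-letter-board idea (the phrases and layout are made up for illustration, and a real system would hand the text to a speech engine rather than just return it):

```python
# Minimal sketch of an AAC-style grid. Each cell holds a ready-made
# phrase that is "spoken" when selected by dwell; the letter board
# lets the user spell out anything not covered by the grid.
GRID = [
    ["I am in pain", "I need suction", "Call the nurse"],
    ["Yes", "No", "Thank you"],
]

class LetterBoard:
    def __init__(self):
        self.message = []

    def select(self, letter):
        """Append one dwell-selected letter to the message being spelled."""
        self.message.append(letter)

    def speak(self):
        """Hand the spelled word to a speech engine (here: just return it)."""
        word = "".join(self.message)
        self.message.clear()
        return word
```

The design point is simply that the grid trades flexibility for speed – one dwell produces a whole phrase – while the letter board covers everything else at the cost of one dwell per letter, which is why both appear together in most AAC packages.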
There are also pointer emulation and browser control solutions that come with the software or can be purchased if needed.
Games and control of certain applications, such as Spotify and Facebook, are available in some software.
The use case for my mum’s situation
Mum can’t move most of her body; however, she now has more head movement and can move her eyes freely and blink.
She gets tired easily.
Current communication methods
Mum can only communicate with someone who is already with her; there is no way she can alert anyone of anything unless they are in her direct view. This is almost incomprehensible for someone who, for their entire life, has been able to alert others about something.
She can now move her mouth and lips a lot more allowing her to formulate these into words we can understand with lip reading.
At times when this is difficult, we use a laminated chart with letters and numbers; we run a finger along them to spell out what she is trying to say.
As you may imagine this is not an easy task for anyone… for example:
You hold a paper chart up in your hand, facing the patient, waiting for a response whilst also trying to point at a letter you are not looking at. When you get a response, you look down to find you have moved on… and you have to repeat. Then you forget the rest of the letters… it is not fun.
She is still in ICU/critical care on a ventilator (tracheostomy ventilated – via the throat, for those who don’t know).
Her breathing tubes are mainly out of the way so the eyes are visible; however, there is currently a feeding tube via the nose that could prove tricky.
She is mainly lying down with her upper torso slightly raised, turned either to the left, to the right, or flat on her back.
She is at times in a chair to help her body adjust to the way it should be; this is not a normal chair, but it does make you feel like you are sitting talking to her at a normal level. This is also better for getting a table positioned in front at a level where eye trackers are most happy.
This is a sterile hospital ICU ward with limited beds, secure access and set visiting times – 3 pm and 6 pm.
The beds are well spaced with a lot of equipment at the head end of the bed, including ventilators and monitors.
There are extendable tables that can be positioned over or near the bed on wheels.
There are power and extension cords, lighting is good with no direct sunlight from any windows.
Staff (nurses mainly) are open and happy to test new things as long as they are for the benefit of the patient and don’t obstruct care.
ICU wards are busy at times, quiet at others… no day has been the same since I started visiting (in both hospitals).
Requirements of the proposed solution
So what does the solution need to do?
- Alert people with audio based on a glance – outbound communication freedom.
- Have a simple way to pick a specific ‘thing’ that is an issue with a glance, such as pain level or breathing difficulty.
- Spell out any more complex issues that can’t be easily added to a dashboard with letters.
- Portable and lightweight
- Angle and height-adjustable.
- Easy to assemble
- Large screen for readability
- Unobtrusive and cable managed
I am not a rich man, but I’ll work out a way to pay for the right solution to this problem based on what is needed and when it is needed.
Ultimately the problem I am trying to solve is to help my mum communicate, and when she needs to ask for help she can, all using her eyes.
The environment I am looking to use this in is not what most of the technologies on the market are designed for – ICU/critical care patients. I find this a little unusual considering Covid and the use of ventilators in hospitals across the globe – how do these people communicate?
I now need to evaluate the various options, run some tests where possible and see how this works in real life.
I hope to find a solution that works and eases the burden of being locked in; even if only by some small amount, I’ll sleep better at night.
Part 2 – coming soon… Building & testing an eye-tracking solution [LINK]