How Intel Keeps Stephen Hawking Talking with Assistive Technology (Advertorial)
For two years, Intel has worked to upgrade Stephen Hawking’s computer system, a pioneering assistive technology project that will have far-reaching benefits for the disabled.
Professor Stephen Hawking is arguably as famous for his computerized voice as he is for his ground-breaking work on general relativity and black holes. Intel has been working with Hawking since 1997, helping to maintain and improve the assistive computer system that enables him to interact with the world.
As Hawking’s motor neurone disease has advanced, his ability to communicate has slowed to one word per minute.
Intel’s challenge was to keep Hawking talking.
Hawking’s computer system uses a rudimentary timed interface. A cursor automatically scans across an on-screen keyboard, and whenever the renowned physicist twitches his cheek, the movement is picked up by an infrared sensor. This stops the moving cursor and selects whatever key or option the cursor was highlighting at the time.
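To picture how that scanning-and-select loop works, here is a minimal sketch in Python. It is purely illustrative, not Intel’s code: the key list, dwell time and `sensor_triggered` stand-in are all assumptions made for the example.

```python
import time

# Illustrative stand-ins, not part of Hawking's actual system.
KEYS = list("ABCDEFGHIJKLMNOPQRSTUVWXYZ ")
DWELL_SECONDS = 0.5  # how long the cursor highlights each key


def sensor_triggered() -> bool:
    """Stand-in for the infrared sensor that detects a cheek movement."""
    return False  # replace with real sensor input


def scan_for_selection() -> str:
    """Highlight each key in turn; return the key highlighted when the sensor fires."""
    while True:
        for key in KEYS:
            print(f"Highlighting: {key}")
            deadline = time.time() + DWELL_SECONDS
            while time.time() < deadline:
                if sensor_triggered():
                    return key  # cursor stops; this key is selected
                time.sleep(0.01)
```

The cost of such an interface is obvious from the sketch: every selection can take several seconds of waiting, which is why reducing the number of selections matters so much.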
It’s not perfect, and if Intel’s engineers could build a new assistive system from scratch, it would probably look very different. It might use cutting-edge eye tracking technology or an electroencephalogram (EEG) approach, which translates brain activity into simple commands. The problem was this: Hawking didn’t want a completely new system.
“Stephen has used the same interface for decades,” explains Lama Nachman, principal engineer in User Experience Research at Intel Labs (seated next to Hawking in the photo above).
“He is very adamant about keeping it. So our task was to retain the familiar user experience, but make that experience more intuitive and powerful.”
It’s taken two years of trial and error, working closely with Hawking, to create an enhanced version of his original system. Nachman and her team have recoded the software from scratch, adding an array of new features.
One of the biggest improvements is a new word predictor. Just like a modern smartphone keyboard, the new system (based on the SwiftKey SDK) autocompletes words as they are typed. It can also predict the word Hawking is most likely to use next, dramatically reducing the number of ‘clicks’ it takes him to build words.
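The sketch below shows the general idea behind prefix completion and next-word prediction. It is not the SwiftKey SDK, which the real system uses; the tiny corpus, bigram model and function names are invented for illustration.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for a real language model's training data.
corpus = "the black hole the black hole radiates the universe expands".split()

# Bigram counts: which word most often follows each word.
next_word = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_word[prev][nxt] += 1

vocabulary = Counter(corpus)


def complete(prefix: str) -> list[str]:
    """Most frequent vocabulary words starting with the typed prefix."""
    matches = [w for w in vocabulary if w.startswith(prefix)]
    return sorted(matches, key=lambda w: -vocabulary[w])[:3]


def predict_next(word: str) -> list[str]:
    """Most likely next words after the one just selected."""
    return [w for w, _ in next_word[word].most_common(3)]


print(complete("bl"))         # ['black']  -> one click instead of five keystrokes
print(predict_next("black"))  # ['hole']   -> the whole next word in one click
```

Even this toy version shows why prediction helps: a word that would take five or six timed selections can be picked in one.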
As they were unable to redesign Hawking’s computer interface, Nachman and her team looked for improvements they could make elsewhere. They started by observing how Hawking used his system, tracking ‘interaction flows’ such as writing documents or emails, giving lectures, searching the web and reading PDFs. Once a flow had been identified, engineers then attempted to streamline it.
“If you’re using Microsoft Word, which Stephen uses a lot, there are a few sets of functions that you want to use most often — open a new document, save, edit, and so on,” explains Nachman.
“We added a lot of contextual menus to his system, so he can select one with a single click, rather than having to go to the mouse, then to the menu, then to select an option. We created a lot of these new contextual options throughout the system to speed up use.”
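One way to imagine those contextual menus is a simple map from the active application to its handful of most-used actions, so a single selection fires an action directly instead of navigating mouse, then menu, then option. The sketch below is a hypothetical illustration; the application names and actions are made up for the example, not taken from Hawking’s system.

```python
# Hypothetical context-sensitive shortcut menus: active application -> one-click actions.
CONTEXT_MENUS = {
    "word_processor": ["new document", "save", "edit", "read aloud"],
    "web_browser": ["search", "back", "open link", "bookmark"],
    "pdf_reader": ["next page", "previous page", "zoom", "close"],
}


def menu_for(active_app: str) -> list[str]:
    """Return the one-click options offered in the current context."""
    return CONTEXT_MENUS.get(active_app, [])


def run_action(active_app: str, choice_index: int) -> str:
    """Simulate selecting the Nth contextual option with a single 'click'."""
    options = menu_for(active_app)
    return options[choice_index] if 0 <= choice_index < len(options) else "no action"


print(menu_for("word_processor"))       # ['new document', 'save', 'edit', 'read aloud']
print(run_action("word_processor", 1))  # 'save'
```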
Professor Hawking has been using his new software for several months, while Nachman and her team have been debugging and fine-tuning it. It’s almost finished, and when it is, Intel plans to make the system available to the open source community.
It’s a move that will allow other people to take the platform and develop it further. Nachman and her team hope that their pioneering work with Hawking will go on to help people with similar disabilities and communication issues, advancing the assistive technology field.
Professor Hawking was diagnosed with motor neurone disease when he was 21 and wasn’t expected to live past the age of 25. With technology’s help, he has not only defied his illness, he has regained his independence and clung stubbornly to his identity, even if that has meant saying ‘no’ to some technology upgrades.