Eye Control – Inbuilt EyeGaze Access for Windows 10

Just yesterday Microsoft announced what is possibly their biggest step forward in functionality within their Ease of Access accessibility settings since Windows 7. Eye Control is an inbuilt feature that facilitates access to the Windows 10 OS using the low-cost eyegaze peripheral the Tracker 4C from Tobii. More about what you can actually do with Eye Control below, but first a little background on how this came about.

Steve Gleason and his son

Former American Football professional and MND (ALS) sufferer Steve Gleason (above) challenged Microsoft in 2014 to help people affected by this degenerative condition through the advancement of eye tracking technology. This initial contact led to the development of a prototype eyegaze-controlled wheelchair, receiving lots of publicity and generating increased awareness in the process. However, it was never likely to progress into a product that would be available to other people in a similar situation. What this project did achieve was to pique the interest of some of the considerable talent within Microsoft in the input technology itself and its applications, particularly for people with MND.

A combination of factors, felt on both sides of the Atlantic, has proved problematic when it comes to providing timely AT support to people diagnosed with MND. Eyegaze input is the only solution that will allow successful computer access as the condition progresses, eye movement being the only ability left in the final stages of the illness. Historically, however, the cost of the technology meant that insurance, government funding or private fundraising was the only means by which people could pay for eyegaze equipment. Usually this resulted in a significant delay which, due to the often aggressive nature of MND, meant valuable time was lost and the solution often arrived too late. This situation was recognised by Julius Sweetland, who back in 2015 led the development of Optikey, an open source computer access/AAC solution designed to work with low-cost eye trackers. Interestingly, some of the innovative features of Optikey seem to have made their way into Eye Control on Windows 10 (multi-key selection, called Shape Writing in Eye Control – see gif below).

Demo of Shape Writing in Eye Control – it works like swiping on a touch keyboard: dwell on the first letter of a word, glance at the subsequent letters, then dwell on the last letter and the word is entered.
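For readers curious about what is happening under the hood, here is a minimal sketch of the general idea behind shape writing: the letters the gaze passes over form a path, and candidate words are those that trace through that path in order, anchored by the dwelled first and last letters. This is an illustration only, not Microsoft’s or Optikey’s actual algorithm, and all names in it are made up.

```python
# Illustrative sketch only - not the actual Eye Control or Optikey algorithm.
# Candidate words must start and end on the dwelled letters and trace through
# the gaze path in order; a real system would then rank the candidates with a
# language model or word-frequency list.

def is_subsequence(word, path):
    """True if every letter of `word` appears in `path`, in order."""
    it = iter(path)
    return all(letter in it for letter in word)

def shape_write_candidates(gaze_path, dictionary):
    """`gaze_path` is the string of letters the gaze passed over, e.g. 'hgferqllko'."""
    first, last = gaze_path[0], gaze_path[-1]
    return [w for w in dictionary
            if w[0] == first and w[-1] == last and is_subsequence(w, gaze_path)]

words = ["hello", "help", "halo", "hero"]
print(shape_write_candidates("hgferqllko", words))  # ['hello', 'hero']
```

Note that the path above matches both “hello” and “hero”; in a real shape-writing keyboard that ambiguity is resolved by word prediction, which is part of why the technique is so much faster than letter-by-letter dwell typing.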

Since the initial Steve Gleason wheelchair hack there has been a steady stream of high-quality research papers from people at Microsoft on the subject of eyegaze input and MND solutions. This should have been a hint that something like Eye Control was on the horizon. Eyegaze input has promised to break into the mainstream several times over the last decade; however, with Eye Control and device support being included in the core Windows OS, it has never been this close.

For more background on the path to Eye Control, see this blog post from Microsoft: From Hack to Product, Microsoft Empowers People with Eye Control for Windows 10

Want to find out how to get early access to Eye Control, or get some more information on its functionality? Read this post from Tobii (be warned, there are still bugs): How to get started with Eye Control on Windows.

Hands-free Minecraft from Special Effect

Love it or hate it, the game of Minecraft has captured the imagination of over 100 million young, and not so young, people. It is available on multiple platforms – mobile devices (Pocket Edition), Raspberry Pi, computer, Xbox or PlayStation – and it looks and feels pretty much the same on all of them. For those of us old enough to remember, the blocky graphics will hold some level of nostalgia for the bygone 8-bit days, when mere blobs of colour and our imagination were enough to render Ghosts and Goblins vividly. This is almost certainly lost on the main cohort of Minecraft players, however, who would most probably be bored silly by the two-dimensional, repetitive and predictable video games of the ’80s and early ’90s. The reason Minecraft is such a success is that it has blended its retro styling with modern gameplay and a (mind-bogglingly massive) open world where no two visits are the same and there is room for self-expression and creativity. This latter quality has led it to become the first video game to be embraced by mainstream education, being used as a tool for teaching everything from history to health or empathy to economics.

It is, however, the former quality, the modern gameplay, that we are here to talk about. Unlike the aforementioned Ghosts and Goblins, Minecraft is played in a three-dimensional world using either the first-person perspective (you see through the character’s eyes) or the third-person perspective (as if a camera is hovering above and slightly behind the character). While undoubtedly offering a more immersive and realistic experience, this means controlling the character and playing the game is also much more complex and requires a high level of dexterity in both hands to be successful. For people without the required level of dexterity this means that not only is there a risk of social exclusion, being unable to participate in an activity so popular among their peers, but also the possibility of being excluded within an educational context.

Fortunately, UK-based charity Special Effect have recognised this need and are in the process of doing something about it. Special Effect are a charity dedicated to enabling those with access difficulties to play video games through custom access solutions. Since 2007 their interdisciplinary team of clinical and technical professionals (and of course gamers) have been responsible for a wide range of bespoke solutions based on individuals’ unique abilities and requirements. Take a look at this page for some more information on the work they do and to see what a life-enhancing service they provide. The problem with this approach, of course, is reach, which is why their upcoming work on Minecraft is so exciting. Based on Optikey, the open source eyegaze AAC/computer access solution by developer Julius Sweetland, Special Effect are in the final stages of developing an on-screen Minecraft keyboard that will work with low-cost eye trackers like the Tobii EyeX and the Tracker 4C (€109 and €159 respectively).

Minecraft on-screen keyboard

The inventory keyboard

Minecraft on-screen keyboards

The main Minecraft on-screen keyboard

Currently called ‘Minekey’, this solution will allow Minecraft to be played using a pointing device like a mouse or joystick, or even totally hands-free using an eyegaze device or head mouse. The availability of this application will ensure that Minecraft is now accessible to many of those who have previously been excluded. Special Effect were kind enough to let us trial a beta version of the software and, although I’m no Minecraft expert, it seemed to work great. The finished software will offer a choice of on-screen controls: one with smaller buttons and more functionality for expert eyegaze users (pictured above) and a simplified version with larger targets. Bill Donegan, Projects Manager with Special Effect, told us they hope to have it completed and available to download for free by the end of the year. I’m sure this is news that will excite many people out there who had written off Minecraft as something just not possible for them. Keep an eye on Special Effect or ATandMe for updates on its release.

Boardmaker Online now launched in Ireland

Tobii Dynavox have recently launched their new Boardmaker Online product in Ireland through SafeCare Technologies. It has all the functionality of previous versions of Boardmaker, except that now it’s web-based: you don’t need any disks and multiple users can access it from any PC.

Instructor showing students how to use Boardmaker Online

You can purchase a Personal, Professional or District account, and the amount you pay depends on the type of account, the number of “instructors” and how many years you want to sign up for. You can also get a discount for any old Boardmaker disks that you want to trade in.

You get all the symbols that have been available in past versions, as well as some new symbol sets, and any new ones created in the future will also be given to you. Because it’s web-based, you have access to previously created activities via the online community, and you can upload activities you create yourself to that community and share them with other people in your district or all over the world.

Because it’s no longer tied to one device, you can create activities on your PC and assign them to your “students”, who can use them in school and/or at home. You no longer need to have a user’s device in your possession to update their activities, and they don’t have to go without their device while you do this.

You (and the other instructors in your district if you have a district licence) can also assign the same activity to many students, and because different accessibility options are set up for different students, the activity is automatically made accessible to their individual needs. For example, you could create an activity and assign it to a student who uses eye gaze and to a student who uses switches, and that activity will show up on each student’s device in the format that’s accessible to them.

Picture shows how instructors can assign Boardmaker Online activities to multiple students

The results of students’ work can be tracked against IEP or educational goals, which then helps you decide what activities would be suitable to assign next. You can also track staff and student usage.

One limitation is that you can only create activities on a Windows PC or Mac. You can play activities on an iPad using the free app but not create them on it, and you can’t use Boardmaker Online to either create or play activities on an Android or Windows-based tablet.

The other point to mention is that because it’s a subscription-based product, the payment you have to make is recurring every year rather than being a one-off payment, which may not suit everyone.

However, with the new features it’s definitely worth getting the free 30-day trial and deciding for yourself if you’d like to trade in your old Boardmaker disks for the new online version!

Eye gaze controlled power wheelchair

Eye-controlled wheelchair, driven via a tablet mounted on the wheelchair

EyeTech Digital Systems has partnered with Quantum Rehab to bring eye-controlled wheelchairs to individuals who are unable to use hand controls. EyeTech’s eye tracking technology mounts directly to a tablet PC and allows the user to control the entire computer using eye movements. The system then mounts to the wheelchair. An eye control driving app gives the user the ability to drive hands-free: the driving controls are overlaid on the scene camera view, and simply looking at them activates the basic directions and movements of the chair.

Quantum Rehab® products include a range of rehab mobility technologies such as the Q6 Edge® 2.0 and the Quantum Series of power bases: www.QuantumRehab.com.

EyeTech Digital Systems products include eye tracking technology for a variety of markets such as medical, transportation, entertainment and augmentative communication: http://www.eyetechds.com/

Irish suppliers

LC Seating and MMS Medical.

GazeSpeak & Microsoft’s ongoing efforts to support people with Motor Neuron Disease (ALS)

Last Friday (February 17th) New Scientist published an article about a new app in development at Microsoft called GazeSpeak. Due to be released over the coming months on iOS, GazeSpeak aims to facilitate communication between a person with MND (known as ALS in the US; I will use both terms interchangeably) and another individual, perhaps their partner, carer or friend. Developed by Microsoft intern Xiaoyi Zhang, GazeSpeak differs from traditional approaches in a number of ways. Before getting into the details, however, it’s worth looking at the background. GazeSpeak didn’t come from nowhere; it’s actually one of the products of some heavyweight research into Augmentative and Alternative Communication (AAC) that has been taking place at Microsoft over the last few years. Since 2013, inspired by football legend and ALS sufferer Steve Gleason (read more here), Microsoft researchers and developers have brought the weight of their considerable collective intellect to bear on increasing the ease and efficiency of communication for people with MND.

Last year Microsoft Research published a paper called “AACrobat: Using Mobile Devices to Lower Communication Barriers and Provide Autonomy with Gaze-Based AAC” (abstract and pdf download at previous link), which proposed a companion app to allow an AAC user’s communication partner to assist (in a non-intrusive way) in the communication process. Take a look at the video below for a more detailed explanation.

This is an entirely new approach to increasing the efficiency of AAC, and one that, I suggest, could only have come from a large mainstream tech organisation with over thirty years’ experience facilitating communication and collaboration.

Another Microsoft research paper published last year (with some of the same authors as the previous paper), called “Exploring the Design Space of AAC Awareness Displays”, looks at the importance of a communication partner’s “awareness of the subtle, social, and contextual cues that are necessary for people to naturally communicate in person”. Their research focused on creating a display that would allow the person with ALS to express things like humour, frustration and affection – emotions difficult to express with text alone. Yes, they proposed the use of emoji, a proven and effective way of overcoming a similar difficulty in remote or non-face-to-face interactions; however, they went much further and also looked at solutions like avatars, skins and even coloured LED arrays. This, like the one above, is an academic paper and as such not an easy read, but the ideas and solutions being proposed by these researchers are practical and will hopefully filter through to end users of future AAC solutions.

That brings us back to GazeSpeak, the first fruits of the Microsoft/Steve Gleason partnership to reach the general public. Like the AACrobat solution outlined above, GazeSpeak gives the communication partner a tool rather than focusing on tech for the person with MND. As the image below illustrates, the communication partner would have GazeSpeak installed on their phone and, with the app running, they would hold the device up to the person with MND as if they were photographing them. They suggest a sticker with four grids of letters is placed on the back of the smartphone, facing the speaker. The app then tracks the person’s eyes – up, down, left or right – and each direction means the letter they are selecting is contained in the grid in that direction (see photo below).

Man looking right while another person holds up a smartphone with GazeSpeak installed

Similar to how the old T9 predictive text worked, GazeSpeak selects the appropriate letter from each group and predicts the word based on the most common English words. So the app is using AI in the form of machine vision to track the eyes, and also to make the word prediction. In the New Scientist article they mention that the user will be able to add their own commonly used words and people/place names, which one assumes would prioritise them within the prediction list. In the future perhaps some capacity for learning could be added to further increase efficiency. After using this system for a while the speaker may not even need to see the sticker with the letters; they could write words from muscle memory. At this stage a simple QR code leading to the app download would allow them to communicate with complete strangers using just their eyes and no personal technology.
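To make the T9 comparison concrete, here is a minimal sketch of this style of decoding. The letter groupings and the tiny frequency-ranked vocabulary below are assumptions made for illustration, not GazeSpeak’s actual design: each glance direction picks one of four letter groups, and the direction sequence is matched against a dictionary and ranked by word frequency.

```python
# Illustrative sketch of GazeSpeak-style decoding - not the real app's code.
# Each eye direction (up / down / left / right) selects one of four letter
# groups on the sticker; the sequence of directions is decoded against a
# dictionary, T9-style, and ranked by word frequency.

GROUPS = {
    "up":    set("abcdef"),
    "down":  set("ghijkl"),
    "left":  set("mnopqrs"),
    "right": set("tuvwxyz"),
}

def letter_to_direction(letter):
    for direction, letters in GROUPS.items():
        if letter in letters:
            return direction
    raise ValueError(f"no group for letter {letter!r}")

def word_signature(word):
    """The glance sequence a speaker would produce to spell `word`."""
    return tuple(letter_to_direction(c) for c in word)

def predict(directions, vocabulary):
    """Return vocabulary words matching the glance sequence, most frequent first.

    `vocabulary` maps word -> relative frequency (a stand-in for a real
    language model).
    """
    directions = tuple(directions)
    matches = [w for w in vocabulary if word_signature(w) == directions]
    return sorted(matches, key=vocabulary.get, reverse=True)

vocab = {"cat": 0.8, "bay": 0.3, "dog": 0.5}
# "cat" and "bay" share the same glance sequence (up, up, right); the
# ambiguity is resolved by frequency, just as T9 did with its keypad groups.
print(predict(["up", "up", "right"], vocab))  # ['cat', 'bay']
```

Adding a user’s own names and commonly used words, as the article describes, would in a scheme like this simply mean inserting them into the vocabulary with a boosted frequency.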

2016 – Technology Trends and Assistive Technology (AT) Highlights

As we approach the end of 2016 it’s an appropriate time to look back and take stock of the year from an AT perspective. A lot happened in 2016, not all of it good. Socially, humanity seems to have regressed over the past year. Maybe this short-term, inward-looking, protectionist sentiment has been brewing for longer, but 2016 brought the opportunity to express it politically; you know the rest. While society steps and looks back, technology continues to leap and bound forward, and 2016 has seen massive progress in many areas, particularly those associated with Artificial Intelligence (AI) and Smart Homes. This is the first in a series of posts examining some technology trends of 2016 and how they affect the field of Assistive Technology. The links will become active as the posts are added. If I’m missing something please add it to the comments section.

Dawn of the Personal Digital Assistants

Game Accessibility

Inbuilt Accessibility – AT in mainstream technology 

Software of the Year – The Grid 3

Open Source AT Hardware and Software

The Big Life Fix

So although 2016 is unlikely to be looked on kindly by future historians… you know why; it has been a great year for Assistive Technology, perhaps one of promise rather than realisation, however. One major technology trend of 2016 missing from this series of posts is Virtual (or Augmented) Reality. While VR was everywhere this year, with products coming from Sony, Samsung, Oculus and Microsoft, its usefulness beyond gaming is only beginning to be explored (particularly within education).

So what are the goals for next year? Well, harnessing some of these innovations in a way that makes them accessible and usable by people with disabilities at an affordable price. If in 2017 we can start putting some of this tech into the hands of those who stand to benefit most from its use, then next year will be even better.

Tobii Power Clinic Online Webinars

Tobii Dynavox, the AAC arm of industry-leading eye gaze/tracking solutions manufacturer Tobii Group, are currently producing a series of webinars on their range of products. Although they have completed eight webinars to date, they are all recorded and available on YouTube and the Power Clinic Online page. Taken together these are a fantastic resource for anybody interested in eyegaze input. While some are product-specific (for example “Getting Started With Communicator”), others are more general and will have content of interest to both therapists and technicians working in the area (Amazing Calibrations for Eyegaze, AAC and Autism – Strategies for challenging behaviour & EyeGaze Strategies for Reading!).

Tobii users, especially those new to the technology, will be particularly interested in Eyegaze Windows Control for beginners! and Advanced Windows Control (eyegaze). If you’d rather the live experience, make sure to sign up for next week’s webinar, AAC AND MUSIC (PERFECT SYMPHONY), where you will be able to ask questions and contribute to the discussion.

We have created a playlist of all the Tobii webinars to date on our YouTube channel if that makes it easier for you to access and share them (click here).

Assistive Technology (AT) in the era of the Digital Schoolbag

Child using a tablet computer to study biology, zooming in on the mid-section of a human skeleton on screen

Increasingly, schools are opting for what is sometimes termed a digital schoolbag. This involves the purchase of an electronic device, usually an iPad, with a package of digital textbooks pre-installed. Digital textbooks are undoubtedly a step in the right direction in terms of accessibility and are indeed essential for many students with disabilities. There are students, however, who may need to use a different platform (hardware and/or operating system – OS) because of compatibility issues with their Assistive Technology. Currently the most popular platform being adopted by schools is Apple iOS, with parents being directed to purchase an iPad from a contracted supplier. Many readers of this article will be well aware of all the great inbuilt accessibility features within iOS; however, if you are a user of eye gaze or speech recognition (for access), it does not currently support your chosen AT.

It is understandable why, from a school’s perspective, having all students use identical, standardised devices would be preferable, and there are plenty of reasons why Apple iOS would be the obvious choice. There is a concern, however, that the small minority who may need to use other platforms because of access difficulties could be put at a disadvantage or perhaps not be able to participate fully in all activities. One of the leading school suppliers has assured us that the textbooks can be accessed on Windows, iOS and Android, and as these textbooks are sourced from the same few publishers, one can assume this applies to all suppliers. It is therefore up to the schools to ensure all lessons utilising technology are identical whenever possible, and equivalent when not, regardless of the device/platform being used.

Parents, particularly those whose children use Assistive Technology, should not feel pressured by schools to purchase technology that isn’t the optimum for their child’s needs. If a therapist or AT specialist has recommended a particular solution that differs from what is being suggested by the school, the priority should obviously be the student’s needs. When it comes to AT it is the school’s responsibility to accommodate the different needs of its students, just as it was before the digital schoolbag. The use of technology within our schools is to be embraced, but it is important that schools ensure that the curriculum is open and in no part dependent on one particular platform or device. That would just see us swapping one form of inequality for another, and that’s not progress.

If anyone would like advice on what technologies are available to support access, literacy and productivity on any platform they should feel free to contact us here in the National Assistive Technology Service in Sandymount, Dublin.

Accessible Gaming & Playing Agar.io with your Eyes

We in Enable Ireland Assistive Technology Training Service have long recognised the importance of gaming to many young and not so young assistive technology users. It’s a difficult area for a number of reasons. Firstly, games (and we are talking about video games here) are designed to be challenging: if they are too easy they’re not fun, but if they’re too difficult the player will also lose interest. Successful games manage to get the balance just right. Of course, when it comes to physical dexterity as well as the other skills required for gaming (strategy, spatial awareness, timing), this often involves game designers taking a one-size-fits-all approach which frequently doesn’t include people with physical, sensory or cognitive difficulties. There are two methods of getting around this which, when taken together, ensure a game can be accessed and enjoyed by a much broader range of people: difficulty levels (not a new concept) and accessibility features (sometimes called assists). Difficulty levels are self-explanatory and have been a feature of good games for decades. Accessibility features might include the ability to remap buttons (useful for one-handed gamers), automate certain controls, subtitles, high contrast and magnification.

Another challenge faced when creating an alternative access solution to allow someone to successfully play a video game is that you need a pretty good understanding of the activity – how to play the game. This is where we often have difficulty, and I’d imagine other non-specialist services (general AT services rather than game accessibility specialists like SpecialEffect or OneSwitch.org.uk) also run into problems. We simply do not have the time required to familiarise ourselves with the games or keep up to date with new releases (which would allow us to better match a person with an appropriate game for their range of ability). We try to compensate for this by enlisting the help of volunteers (often from Enable Ireland’s IT department, with whom we share office space), interns and transition year students.

It’s often the younger transition year students who bring us some of the best suggestions, and last week was no exception. After we demonstrated some eyegaze technology to Patrick, a transition year student visiting from Ardscoil Ris, Dublin 9, he suggested we take a look at a browser-based game called agar.io. I implore you, do not click that link if you have work to get done today. This game is equal parts addictive and infuriating, but in terms of playability and simplicity it’s also very accessible, with simple controls and a clear objective. The idea is that, using your mouse, you control a little (at first) coloured circular blob – think of it as a cell – and the aim of the game is to eat other little coloured cells and grow. The fun part is that other players from every corner of the globe are also controlling cells and growing; if they are bigger than you they move a little more slowly but can eat you! Apart from the mouse there are two other controls: the spacebar allows you to split your cell (which can be used as an aggressive or defensive strategy) and the “W” key allows you to shed some weight. We set up the game to be played with a Tobii EyeX (€119) and IRIS software (€99). IRIS allows you to emulate the mouse action with your eyes and set up two onscreen buttons (called interactors) that can also be activated using your gaze; the video below should make this clearer.

Big thanks to Patrick for suggesting we take a look at Agar.io and helping us set it up for eyegaze control. I’ll leave the final words to him: “I found playing Agar.io with gaze software really fun. I think you have just as much control with your eyes as with your mouse. If an interactor was placed in the corner of the screen to perform the function of the spacebar (splits the cell in half) it would be beneficial. I believe it would be a very entertaining game for people who can only control their eyes, not their arms.”
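To give a flavour of what Patrick’s suggestion involves under the hood, here is a rough sketch of gaze-driven game control with a dwell region mapped to the spacebar. It is not how IRIS actually works and does not use its API: the gaze source is an imaginary callable standing in for whatever the eye tracker SDK provides, and the pyautogui library is used here to emulate the mouse and keyboard.

```python
# Hypothetical sketch of gaze-driven control for a browser game like agar.io.
# This is NOT the IRIS software or its API; pyautogui emulates the mouse and
# keyboard, and the gaze source is left as a callable the caller must supply.

import time
import pyautogui

CORNER = 150        # size (px) of the dwell regions in the screen corners
DWELL_TIME = 0.8    # seconds the gaze must rest in a region to trigger it

def region(x, y, screen_w, screen_h):
    """Map a gaze point to a named corner region, or None."""
    if x < CORNER and y > screen_h - CORNER:
        return "split"   # bottom-left corner stands in for the spacebar
    if x > screen_w - CORNER and y > screen_h - CORNER:
        return "feed"    # bottom-right corner stands in for the W key
    return None

def run(get_gaze_point):
    """`get_gaze_point` is an imaginary callable returning the current
    on-screen gaze coordinates; a real setup would read these from the
    eye tracker's SDK."""
    screen_w, screen_h = pyautogui.size()
    dwell_region, dwell_start = None, 0.0
    while True:
        x, y = get_gaze_point()
        pyautogui.moveTo(x, y)              # steer the cell by looking
        current = region(x, y, screen_w, screen_h)
        if current != dwell_region:         # gaze moved to a new region
            dwell_region, dwell_start = current, time.time()
        elif current and time.time() - dwell_start >= DWELL_TIME:
            pyautogui.press("space" if current == "split" else "w")
            dwell_region = None             # require a fresh dwell to re-trigger
        time.sleep(0.02)
```

In practice IRIS handles all of this for you; the point of the sketch is simply that an interactor is just a screen region plus a dwell timer plus a simulated key press.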

Assistive Technology Webinars

 

webinar graphic

Are you looking for free expert training and advice in assistive technology?

Then consider signing up for a webinar. There are lots of webinars available within various areas of assistive technology. Some have a charge, but many are freely available for anyone to take part in.

A webinar is a live meeting that takes place over the web. The meeting can be a presentation, discussion, demonstration or instructional session. Participants can view documents and applications via their computers, while joining in the discussion by audio or via a live Q&A text area.

Many assistive technology suppliers and organisations are using webinars as a way to share information. Below is a list of a few providers whose webinars you can register for, or whose archived sessions you can listen to.

Inclusive Technology

http://www.inclusive.co.uk/events/webinars

The Great Lakes ADA Center

http://www.ada-audio.org/Webinar/AccessibleTechnology/Schedule/#fy2015Session6

ATIA Online Professional Development

http://www.atia.org/i4a/member_directory/feResultsListing.cfm?directory_id=8&viewAll=1

Don Johnston Incorporated

http://donjohnston.com/webinars/#.VecAe_lViko

AbleNet University Live Webinars

https://www.ablenetinc.com/resources/live_webinars/

Iowa Assistive Technology Professional Development Network

https://www.education.uiowa.edu/centers/icater/webinars