Just yesterday Microsoft announced what is possibly their biggest step forward in functionality within their Ease of Access accessibility settings since Windows 7. Eye Control is an inbuilt feature that facilitates access to the Windows 10 OS using the low cost eyegaze peripheral the Tracker 4C from Tobii. More about what you can actually do with Eye Control below, but first a little background on how this came about.
Former American Football professional and MND (ALS) sufferer Steve Gleason (above) challenged Microsoft in 2014 to help people affected by this degenerative condition through the advancement of eye tracking technology. This initial contact led to the development of a prototype eye gaze controlled wheelchair, which received lots of publicity and generated increased awareness in the process. However, it was never likely to be progressed to a product that would be available to other people in a similar situation. What this project did achieve was to pique the interest of some of the considerable talent within Microsoft in the input technology itself and its applications, particularly for people with MND.
A combination of factors felt on both sides of the Atlantic has proved problematic when it comes to providing timely AT support to people diagnosed with MND. Eyegaze input is the only solution that will allow successful computer access as the condition progresses, eye movement being the only ability left in the final stages of the illness. Historically, however, the cost of the technology meant that insurance, government funding or private fundraising was the only means by which people could pay for eyegaze equipment. Usually this resulted in a significant delay which, due to the often aggressive nature of MND, meant valuable time was lost and the solution often arrived too late. This situation was recognised by Julius Sweetland, who back in 2015 led the development of Optikey, an Open Source computer access/AAC solution designed to work with low cost eye trackers. Interestingly, some of the innovative features of Optikey seem to have made it to Eye Control on Windows 10 (Multi-Key selection, called Shape Writing in Eye Control – see gif below).
Love it or hate it, the game of Minecraft has captured the imagination of over 100 million young, and not so young, people. It is available on multiple platforms; mobile device (Pocket Edition), Raspberry Pi, computer, Xbox or PlayStation, and it looks and feels pretty much the same on all. For those of us old enough to remember, the blocky graphics will hold some level of nostalgia for the bygone 8-bit days when mere blobs of colour and our imagination were enough to render Ghosts and Goblins vividly. This is almost certainly lost on the main cohort of Minecraft players, however, who would most probably be bored silly with the two dimensional, repetitive and predictable video games of the '80s and early '90s. The reason Minecraft is such a success is that it has blended its retro styling with modern gameplay and a (mind-bogglingly massive) open world where no two visits are the same and there is room for self-expression and creativity. This latter quality has led it to become the first video game to be embraced by mainstream education, being used as a tool for teaching everything from history to health or empathy to economics. It is however the former quality, the modern gameplay, that we are here to talk about. Unlike the aforementioned Ghosts and Goblins, Minecraft is played in a three dimensional world using either the first person perspective (you see through the character's eyes) or third person perspective (like a camera hovering above and slightly behind the character). While undoubtedly offering a more immersive and realistic experience, this means controlling the character and playing the game is also much more complex and requires a high level of dexterity in both hands to be successful. For people without the required level of dexterity this means that not only is there a risk of social exclusion, being unable to participate in an activity so popular among their peers, but also the possibility of being excluded within an educational context.
Fortunately UK based charity Special Effect have recognised this need and are in the process of doing something about it. Special Effect are a charity dedicated to enabling those with access difficulties to play video games through custom access solutions. Since 2007 their interdisciplinary team of clinical and technical professionals (and of course gamers) have been responsible for a wide range of bespoke solutions based on individuals' unique abilities and requirements. Take a look at this page for some more information on the work they do and to see what a life enhancing service they provide. The problem with this approach of course is reach, which is why their upcoming work on Minecraft is so exciting. Based on the Open Source eyegaze AAC/computer access solution Optikey by developer Julius Sweetland, Special Effect are in the final stages of developing an on-screen Minecraft keyboard that will work with low cost eye trackers like the Tobii Eye X and the Tracker 4C (€109 and €159 respectively).
The inventory keyboard
The main Minecraft on screen keyboard
Currently being called 'Minekey', this solution will allow Minecraft to be played using a pointing device like a mouse or joystick, or even totally hands free using an eyegaze device or headmouse. The availability of this application will ensure that Minecraft is now accessible to many of those who have previously been excluded. Special Effect were kind enough to let us trial a beta version of the software and although I'm no Minecraft expert it seemed to work great. The finished software will offer a choice of onscreen controls: one with smaller buttons and more functionality for expert eyegaze users (pictured above) and a more simplified version with larger targets. Bill Donegan, Projects Manager with Special Effect, told us they hope to have it completed and available to download for free by the end of the year. I'm sure this is news that will excite many people out there who had written off Minecraft as something just not possible for them. Keep an eye on Special Effect or ATandMe for updates on its release.
You may have heard about or seen photos of Enable Ireland's fantastic "No Limits" Garden at this year's Bloom festival. Some of you were probably even lucky enough to have actually visited it in the Phoenix Park over the course of the Bank Holiday weekend. In order to support visitors, but also to allow those who didn't get the chance to go to share in some of the experience, we put together a "No Limits" Bloom 2017 Grid. If you use the Grid (2 or 3) from Sensory Software, or you know someone who does, and you would like to learn more about the range of plants used in Enable Ireland's garden, you can download and install it by following the instructions below.
How do I install this Grid?
If you are using the Grid 3 you can download and install the Bloom 2017 Grid without leaving the application. From Grid explorer:
Click on the Menu Bar at the top of the screen
In the top left click the + sign (Add Grid Set)
A window will open (pictured below). In the bottom corner click on the Online Grids button (you will need to be connected to the Internet).
If you do not see the Bloom2017 Grid in the newest section you can either search for it (enter Bloom2017 in the search box at the top right) or look in the Interactive learning or Education Categories.
If you are using the Grid 2, or you want to install this Grid on a computer or device that is not connected to the Internet, then you can download the Grid set at the link below. You can then add it to the Grid as above, except select the Grid Set File tab and browse to where you have the Grid set saved.
Today, May 18th, is Global Accessibility Awareness Day and to mark the occasion Apple have produced a series of 7 videos (also available with audio description) highlighting how their products are being used in innovative ways by people with disabilities. All the videos are available in a playlist here and I guarantee you, if you haven't seen them and you are interested in accessibility and AT, it'll be the best 15 minutes you have spent today! Okay, the cynical among you will point out that this is self-promotion by Apple, a marketing exercise. Certainly on one level it is: they are a company and, like any company, their very existence depends on generating profit for their shareholders. These videos promote more than Apple however; they promote independence, creativity and inclusion through technology. Viewed in this light, these videos will illustrate to people with disabilities how far technology has moved on in recent years and make them aware of the potential benefits to their own lives. Hopefully the knock-on effect of this increased awareness will be increased demand. Demand these technologies, people, it's your right!
As far as a favorite video from this series goes, everyone will have their own. In terms of the technology on show, to me Todd “The Quadfather” below was possibly the most interesting.
This video showcases Apple’s HomeKit range of associated products and how they can be integrated with Siri.
My overall favorite video however is Patrick, musician, DJ and cooking enthusiast. Patrick's video is an ode to independence and creativity. The technologies he illustrates are Logic Pro (Digital Audio Workstation software) with VoiceOver (Apple's inbuilt screen reader) and the object recognition app TapTapSee which, although it has been around for several years now, is still an amazing use of technology. It's Patrick's personality that makes the video though; this guy is going places, and I wouldn't be surprised if he had his own prime time TV show this time next year.
The FLipMouse (Finger- and Lip mouse) is a computer input device intended to offer an alternative for people with access difficulties that prevent them using a regular mouse, keyboard or touchscreen. It is designed and supported by the Assistive Technology group at the UAS Technikum Wien (Department of Embedded Systems) and funded by the City of Vienna (ToRaDes project and AsTeRICS Academy project). The device itself consists of a low force joystick (requiring minimal effort to operate) that can be controlled with the lips, a finger or a toe. The lips are probably the preferred access method, as the FlipMouse also allows sip and puff input.
Sip and puff is an access method that is not as common in Europe as it is in the US; however, it is an ideal way to increase the functionality of a joystick controlled by lip movement. See the above link to learn more about sip and puff, but to give a brief explanation: it uses a sensor that monitors the air pressure in a tube. A threshold can be set (depending on the user's ability) for high pressure (puff) and low pressure (sip). Once this threshold is passed, it can act as an input signal such as a mouse click, switch input or keyboard press, among other things. The FlipMouse also has two jack inputs for standard ability switches, as well as infrared in (for learning commands) and out (for controlling a TV or other environmental controls). All these features alone make the FlipMouse stand out against similar solutions; however, that's not what makes the FlipMouse special.
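The threshold logic described above is simple enough to sketch in a few lines. The following is an illustrative Python sketch, not the FlipMouse firmware's actual code: the names, threshold values and event mappings are all made up for demonstration. It classifies raw pressure readings against a high (puff) and low (sip) threshold, with a dead band in between, and emits one event per threshold crossing rather than one per sample.

```python
# Hypothetical sip-and-puff input logic. A pressure sensor is sampled
# and readings past a configurable high (puff) or low (sip) threshold
# become input events. Values assume a 10-bit ADC resting mid-scale.

NEUTRAL = 512          # sensor reading at rest (illustrative)
PUFF_THRESHOLD = 600   # readings above this count as a puff
SIP_THRESHOLD = 420    # readings below this count as a sip

def classify(reading):
    """Map one raw pressure reading to an input event (or None)."""
    if reading >= PUFF_THRESHOLD:
        return "left_click"    # puff -> e.g. a left mouse click
    if reading <= SIP_THRESHOLD:
        return "right_click"   # sip -> e.g. a right mouse click
    return None                # inside the dead band: no event

def events(samples):
    """Emit one event per threshold crossing, not per sample."""
    previous = None
    for reading in samples:
        current = classify(reading)
        if current is not None and current != previous:
            yield current
        previous = current

# A short stream: at rest, then a puff, back to rest, then a sip.
stream = [512, 515, 650, 660, 510, 400, 395, 512]
print(list(events(stream)))  # ['left_click', 'right_click']
```

In a real device the thresholds would be calibrated per user, which is exactly the kind of setting the FlipMouse exposes in its configuration software.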
The FlipMouse is the first of a new kind of assistive technology (AT) solution, not because of what it does but because of how it's made. It is completely Open Source, which means that everything you need to make this solution for yourself is freely available. The source code for the GUI (Graphical User Interface) used to configure the device, the code for the microcontroller (Teensy LC), a bill of materials listing all the components and design files for the enclosure are all available on their GitHub page. The quality of the documentation distinguishes it from previous Open Source AT devices. The IKEA-style assembly guide clearly outlines the steps required to put the device together, making the build not only as simple as some of the more advanced Lego kits available but also as enjoyable. That said, unlike Lego this project does require reasonable soldering skills and a steady hand; some parts are tricky enough to keep you interested. The process of constructing the device also gives much better insight into how it works, which is something that will undoubtedly come in handy should you need to troubleshoot problems at a later date. Although, as stated above, the AsTeRICS Academy provide a list of all components, a much better option in my opinion would be to purchase the construction kit, which contains everything you need to build your own FlipMouse, right down to the glue for the laser cut enclosure, all neatly packed into a little box (pictured below). The kit costs €150 and all details are available from the FlipMouse page on the AsTeRICS Academy site. Next week I will post some video demonstrations of the device and look at the GUI which allows you to program the FlipMouse as a computer input device, accessible game controller or remote control.
I can’t overstate how important a development the FlipMouse could be to the future of Assistive Technology. Giving communities the ability to build and support complex AT solutions locally not only makes them more affordable but also strengthens the connection between those who have a greater requirement for technology in their daily life and those with the creativity, passion and in-depth knowledge of emerging technologies, the makers. Here’s hoping the FlipMouse is the first of many projects to take this approach.
As we approach the end of 2016 it's an appropriate time to look back and take stock of the year from an AT perspective. A lot happened in 2016, not all good. Socially, humanity seems to have regressed over the past year. Maybe this short term, inward looking, protectionist sentiment has been brewing longer, but 2016 brought the opportunity to express it politically; you know the rest. While society steps back, technology continues to leap and bound forward, and 2016 has seen massive progress in many areas, particularly those associated with Artificial Intelligence (AI) and Smart Homes. This is the first in a series of posts examining some technology trends of 2016 and a look at how they affect the field of Assistive Technology. The links will become active as the posts are added. If I'm missing something please add it to the comments section.
So although 2016 is unlikely to be looked on kindly by future historians (you know why), it has been a great year for Assistive Technology, perhaps one of promise rather than realisation however. One major technology trend of 2016 missing from this series of posts is Virtual (or Augmented) Reality. While VR was everywhere this year, with products coming from Sony, Samsung, Oculus and Microsoft, its usefulness beyond gaming is only beginning to be explored (particularly within education).
So what are the goals for next year? Well, harnessing some of these innovations in ways that make them accessible and usable by people with disabilities at an affordable price. If in 2017 we can start putting some of this tech into the hands of those who stand to benefit most from its use, then next year will be even better.
With iOS Apple have firmly established themselves as the mobile device brand of choice for those with alternative access needs. The extensive accessibility features, wide range of AT apps and third party hardware, as well as iOS' familiarity, ease of use and security, all make it a choice hard to look beyond. Yet this is exactly what many people do: 1.3 billion Android devices were shipped in 2015, that's 55% of all computing devices, mobile or otherwise. A large majority of these would be budget smartphones or tablets purchased in developing markets, where the price tag associated with Apple products could be considered prohibitive. There are however reasons other than cost to choose Android, and thankfully Google have been quietly working away to give you even more. One in particular, currently in beta testing (click here to apply), is Voice Access. As its name suggests, this new accessibility feature (and that is what it is being developed as, immediately distinguishing it from previous speech recognition apps) allows complete access to your device through voice alone. I'll let Google describe it: "Voice Access is an accessibility service that lets you control your Android device with your voice. Using spoken commands, you can activate on-screen controls, launch apps, navigate your device, and edit text. Voice Access can be helpful for users for whom using a touch screen is difficult." It certainly sounds promising and, if these aspirations can be realised, it will be very welcome indeed. Voice control of mobile devices is something we are frequently asked about in Enable Ireland's Assistive Technology Training Service. I'll post more on Voice Access after I've had the opportunity to test it a bit more. In the meantime take a look at the video below to whet your appetite.
Another alternative access option now available to Android users is a third party application developed and promoted by CREA, with the support of Fundación Vodafone España, called EVA Facial Mouse. EVA Facial Mouse has been created by the same people who brought us Enable Viacam for Windows and Linux and seems to be a mobile version of that popular and effective camera input system. EVA uses a combination of the front facing camera and face recognition to allow the user to position the cursor and click on icons without having to touch the device. See the video below for more on EVA (Spanish with subtitles).
Reviews of EVA on Google Play are mostly positive, with many of the negative reviews most probably explained by device-specific incompatibilities. This remains the primary difficulty associated with the use and support of Android based devices as Assistive Technology. Not all Android devices are created equal, and how they handle apps can vary significantly depending on the resources they have available (CPU/RAM) and how Android features (pointing device compatibility in this case) are implemented. That said, on the right device both new access options mentioned above could mean greatly improved access efficiency for two separate user groups who have up until now had to rely primarily on switch access. Next week I will release a post reviewing current Android phones and follow that up with a couple of in-depth reviews of the above apps and their compatibility with selected Android devices and other third party AT apps like ClickToPhone.
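The basic idea behind camera-based pointer control of the kind EVA Facial Mouse provides can be sketched in a few lines. The following Python sketch is illustrative only and is not EVA's actual code: the names, gain and dead zone values are assumptions. The face detection itself (the hard part, which EVA handles on-device) is out of scope; the sketch only shows how a detected face centre, compared against a calibrated rest position, becomes a proportional cursor movement, with a small dead zone to filter out involuntary head tremor.

```python
# Illustrative camera-to-cursor mapping in the style of a face mouse.
# Each frame, a face detector reports the face centre in image pixels;
# the offset from a calibrated rest position (beyond a dead zone)
# becomes a proportional cursor movement. All values are made up.

DEAD_ZONE = 5    # pixels of face movement to ignore (tremor filter)
GAIN = 0.8       # cursor pixels moved per pixel of face movement

def cursor_delta(face_centre, rest_centre):
    """Turn a face-centre offset into a (dx, dy) cursor movement."""
    dx = face_centre[0] - rest_centre[0]
    dy = face_centre[1] - rest_centre[1]
    # Suppress movement inside the dead zone, axis by axis.
    dx = 0 if abs(dx) < DEAD_ZONE else dx
    dy = 0 if abs(dy) < DEAD_ZONE else dy
    return (round(dx * GAIN), round(dy * GAIN))

# Face at rest: no movement. Face moved 20 px right and 3 px down:
# horizontal movement passes the dead zone, vertical does not.
print(cursor_delta((320, 240), (320, 240)))  # (0, 0)
print(cursor_delta((340, 243), (320, 240)))  # (16, 0)
```

Tuning the gain and dead zone to the individual user is what makes or breaks this kind of access method, which is why apps like EVA expose exactly these settings.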
Increasingly schools are opting for what is sometimes termed a digital schoolbag. This involves the purchase of an electronic device, usually an iPad, with a package of digital textbooks pre-installed. Digital textbooks are undoubtedly a step in the right direction in terms of accessibility and are indeed essential for many students with disabilities. There are students however who may need to use a different platform (hardware and/or operating system – OS) because of compatibility issues with their Assistive Technology. Currently the most popular platform being adopted by schools is Apple iOS, with parents being directed to purchase an iPad from a contracted supplier. Many readers of this article will be well aware of all the great inbuilt accessibility features within iOS; however, if you are a user of eye gaze or speech recognition (for access), iOS does not currently support your chosen AT.
It is understandable why, from a school's perspective, having all students using identical standardised devices would be preferable, and there are plenty of reasons why Apple iOS would be the obvious choice. There is a concern however that the small minority who may need to use other platforms because of access difficulties could be put at a disadvantage, or perhaps not be able to participate fully in all activities. One of the leading school suppliers has assured us that the textbooks can be accessed on Windows, iOS and Android, and as these textbooks are sourced from the same few publishers one can assume this applies for all suppliers. It is therefore up to the schools to ensure all lessons utilising technology are identical whenever possible, and equivalent when not, regardless of the device/platform being used. Parents, particularly those whose children use Assistive Technology, should not feel pressured by schools to purchase technology that isn't the optimum for their child's needs. If a therapist or AT specialist has recommended a particular solution that differs from what is being suggested by the school, the priority should obviously be the student's needs. When it comes to AT it is the school's responsibility to accommodate the different needs of its students, just as it was before the digital schoolbag. The use of technology within our schools is to be embraced, but it is important that schools ensure that the curriculum is open and in no part dependent on one particular platform or device. That would just see us swapping one form of inequality for another, and that's not progress.
If anyone would like advice on what technologies are available to support access, literacy and productivity on any platform they should feel free to contact us here in the National Assistive Technology Service in Sandymount, Dublin.
APPLY BY 15/09/2016
EDF and the company Oracle are pleased to announce a scholarship of €8,000 for a student with a disability enrolled in a higher education programme in the field of Information and Communication Technologies (ICT) in the academic year 2016-2017. It will be awarded based on a project or thesis that will be conducted during the academic year. The project or thesis should take into account the needs of persons with disabilities in terms of accessibility to ICT, and/or an innovative solution to enhance their access. How can you apply? Find more information on EDF's website.
Applications to be sent by 15 September 2016.
If you have any questions, please write an email to: firstname.lastname@example.org.
DFI is a member of the European Disability Forum. The European Disability Forum (EDF) is the umbrella organisation representing 80 million persons with disabilities in Europe.
EDF have partnered with Oracle and have announced an e-Accessibility scholarship