Hands-free Minecraft from Special Effect

Love it or hate it, Minecraft has captured the imagination of over 100 million young, and not-so-young, people. It is available on multiple platforms – mobile devices (Pocket Edition), Raspberry Pi, computer, Xbox and PlayStation – and it looks and feels much the same on all of them. For those of us old enough to remember, the blocky graphics will hold some level of nostalgia for the bygone 8-bit days, when mere blobs of colour and our imagination were enough to render Ghosts and Goblins vividly. This is almost certainly lost on the main cohort of Minecraft players, however, who would most probably be bored silly by the two-dimensional, repetitive and predictable video games of the '80s and early '90s.

The reason Minecraft is such a success is that it has blended its retro styling with modern gameplay and a (mind-bogglingly massive) open world where no two visits are the same and there is room for self-expression and creativity. This latter quality has led it to become the first video game to be embraced by mainstream education, being used as a tool for teaching everything from history and health to empathy and economics. It is, however, the former quality, the modern gameplay, that we are here to talk about.

Unlike the aforementioned Ghosts and Goblins, Minecraft is played in a three-dimensional world using either the first-person perspective (you see through the character's eyes) or the third-person perspective (as if a camera were hovering above and slightly behind the character). While undoubtedly offering a more immersive and realistic experience, this means that controlling the character and playing the game is much more complex and requires a high level of dexterity in both hands to be successful. For people without the required level of dexterity this carries not only a risk of social exclusion, being unable to participate in an activity so popular among their peers, but also the possibility of exclusion within an educational context.

Fortunately, UK-based charity Special Effect have recognised this need and are in the process of doing something about it. Special Effect are a charity dedicated to enabling those with access difficulties to play video games through custom access solutions. Since 2007 their interdisciplinary team of clinical and technical professionals (and of course gamers) have been responsible for a wide range of bespoke solutions based on individuals' unique abilities and requirements. Take a look at this page for some more information on the work they do and to see what a life-enhancing service they provide. The problem with this approach, of course, is reach, which is why their upcoming work on Minecraft is so exciting. Based on Optikey, the open source eyegaze AAC/computer access solution by developer Julius Sweetland, Special Effect are in the final stages of developing an on-screen Minecraft keyboard that will work with low-cost eye trackers like the Tobii Eye X and the Tracker 4C (€109 and €159 respectively).

minecraft on screen keyboard

The inventory keyboard

Minecraft on screen keyboards

The main Minecraft on screen keyboard

Currently called 'Minekey', this solution will allow Minecraft to be played using a pointing device like a mouse or joystick, or even totally hands-free using an eyegaze device or headmouse. Its availability will ensure that Minecraft is now accessible to many of those who have previously been excluded. Special Effect were kind enough to let us trial a beta version of the software and, although I'm no Minecraft expert, it seemed to work great. The finished software will offer a choice of on-screen controls: one with smaller buttons and more functionality for expert eyegaze users (pictured above) and a more simplified version with larger targets. Bill Donegan, Projects Manager with Special Effect, told us they hope to have it completed and available to download for free by the end of the year. I'm sure this is news that will excite many people out there who had written off Minecraft as something just not possible for them. Keep an eye on Special Effect or ATandMe for updates on its release.

Mefacilyta Desktop app

Mefacilyta Desktop

In this podcast, Sarah Boland, together with David Deane and Áine Walsh, talks about the training they hosted on 21st June 2017 on the Mefacilyta Desktop app in St John of God in Stillorgan.

Mefacilyta Desktop is a new Android app developed by Vodafone Foundation Spain in conjunction with St John of God, which can be individually tailored to support people with intellectual disabilities to learn how to carry out their everyday activities independently.

Vodafone symbol with person pointing to letter M Mefacilyta app

GazeSpeak & Microsoft’s ongoing efforts to support people with Motor Neuron Disease (ALS)

Last Friday (February 17th) New Scientist published an article about a new app in development at Microsoft called GazeSpeak. Due to be released on iOS over the coming months, GazeSpeak aims to facilitate communication between a person with MND (known as ALS in the US; I will use both terms interchangeably) and another individual, perhaps their partner, carer or friend. Developed by Microsoft intern Xiaoyi Zhang, GazeSpeak differs from traditional approaches in a number of ways. Before getting into the details, however, it's worth looking at the background. GazeSpeak didn't come from nowhere; it's actually one of the products of some heavyweight research into Augmentative and Alternative Communication (AAC) that has been taking place at Microsoft over the last few years. Since 2013, inspired by football legend and ALS sufferer Steve Gleason (read more here), Microsoft researchers and developers have brought their considerable collective intellect to bear on increasing the ease and efficiency of communication for people with MND.

Last year Microsoft Research published a paper called "AACrobat: Using Mobile Devices to Lower Communication Barriers and Provide Autonomy with Gaze-Based AAC" (abstract and pdf download at previous link), which proposed a companion app to allow an AAC user's communication partner to assist (in a non-intrusive way) in the communication process. Take a look at the video below for a more detailed explanation.

This is an entirely new approach to increasing the efficiency of AAC, and one that, I suggest, could only have come from a large mainstream tech organisation with over thirty years' experience facilitating communication and collaboration.

Another Microsoft research paper published last year (with some of the same authors as the previous paper), called "Exploring the Design Space of AAC Awareness Displays", looks at the importance of a communication partner's "awareness of the subtle, social, and contextual cues that are necessary for people to naturally communicate in person". Their research focused on creating a display that would allow the person with ALS to express things like humour, frustration and affection, emotions that are difficult to express with text alone. Yes, they proposed the use of Emoji, a proven and effective way of overcoming a similar difficulty in remote or non-face-to-face interactions, but they went much further and also looked at solutions like avatars, skins and even coloured LED arrays. Like the paper above, this is an academic paper and as such not an easy read, but the ideas and solutions being proposed by these researchers are practical and will hopefully filter through to end users of future AAC solutions.

That brings us back to GazeSpeak, the first fruit of the Microsoft/Steve Gleason partnership to reach the general public. Like the AACrobat solution outlined above, GazeSpeak gives the communication partner a tool rather than focusing on tech for the person with MND. As the image below illustrates, the communication partner has GazeSpeak installed on their phone and, with the app running, holds the device up to the person with MND as if photographing them. A sticker with four grids of letters is placed on the back of the smartphone, facing the speaker. The app then tracks the person's eyes: up, down, left or right, each direction meaning the letter they are selecting is contained in the grid in that direction (see photo below).

man looking right, other person holding smartphone up with gazespeak installed

Similar to how the old T9 predictive text worked, GazeSpeak selects the appropriate letter group from each gaze and predicts the word based on the most common English words. So the app is using AI in the form of machine vision to track the eyes, and also to make the word prediction. In the New Scientist article they mention that the user will be able to add their own commonly used words and people/place names, which one assumes would prioritise them within the prediction list. In the future perhaps some capacity for learning could be added to further increase efficiency. After using this system for a while the speaker may not even need to see the sticker with letters; they could write words from muscle memory. At that stage a simple QR code leading to the app download would allow them to communicate with complete strangers using just their eyes and no personal technology.
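To get a feel for the idea, here is a minimal sketch of this kind of T9-style disambiguation over four gaze directions. The letter groupings, function names and the tiny frequency-ordered dictionary are all illustrative assumptions, not details of the actual app.

```python
# Illustrative sketch only: the letter-to-direction grouping below is
# an assumption, not GazeSpeak's actual layout.

# Four letter groups, one per gaze direction.
GROUPS = {
    "up":    set("abcdef"),
    "down":  set("ghijklm"),
    "left":  set("nopqrs"),
    "right": set("tuvwxyz"),
}

def group_of(letter):
    """Return the gaze direction whose letter group contains this letter."""
    for direction, letters in GROUPS.items():
        if letter in letters:
            return direction
    raise ValueError(f"unknown letter: {letter}")

def predict(gaze_sequence, dictionary):
    """Return dictionary words compatible with the sequence of gaze
    directions, in dictionary (i.e. frequency) order."""
    return [word for word in dictionary
            if len(word) == len(gaze_sequence)
            and all(group_of(c) == d for c, d in zip(word, gaze_sequence))]

# A tiny frequency-ordered word list for illustration.
words = ["the", "and", "cat", "act", "tea"]
print(predict(["right", "down", "up"], words))  # → ['the']
```

Each gaze narrows the next letter to one of four groups, so a short sequence of looks usually matches only a handful of dictionary words, which the app can then rank by frequency.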

2016 – Technology Trends and Assistive Technology (AT) Highlights

As we approach the end of 2016 it's an appropriate time to look back and take stock of the year from an AT perspective. A lot happened in 2016, not all of it good. Socially, humanity seems to have regressed over the past year. Maybe this short-term, inward-looking, protectionist sentiment had been brewing for longer, but 2016 brought the opportunity to express it politically; you know the rest. While society steps back, technology continues to leap and bound forward, and 2016 has seen massive progress in many areas, particularly those associated with Artificial Intelligence (AI) and Smart Homes. This is the first in a series of posts examining some technology trends of 2016 and how they affect the field of Assistive Technology. The links will become active as the posts are added. If I'm missing something please add it to the comments section.

Dawn of the Personal Digital Assistants

Game Accessibility

Inbuilt Accessibility – AT in mainstream technology 

Software of the Year – The Grid 3

Open Source AT Hardware and Software

The Big Life Fix

So although 2016 is unlikely to be looked on kindly by future historians (you know why), it has been a great year for Assistive Technology, perhaps one of promise rather than realisation. One major technology trend of 2016 missing from this series of posts is Virtual (and Augmented) Reality. While VR was everywhere this year, with products coming from Sony, Samsung, Oculus and Microsoft, its usefulness beyond gaming is only beginning to be explored (particularly within education).

So what are the goals for next year? Well harnessing some of these innovations in a way where they can be made accessible and usable by people with disabilities at an affordable price. If in 2017 we can start putting some of this tech into the hands of those who stand to benefit most from its use, then next year will be even better.

Accessible Apps, Games and Toys

range of children's toys

Enable Ireland’s National Assistive Technology Service has gathered together information on a range of accessible toys. It includes a variety of accessible games, apps, and toys. These are not recommendations but simply a selection of items which may be of interest, particularly at times such as Christmas and birthdays, when presents are high on the list of priorities.

Available here https://goo.gl/QpcD1z

Tap Tap See for iOS

The very cleverly named Tap Tap See is an app (you may have noticed, I like apps a lot!) which allows you to identify objects by simply taking a picture. Once you've taken the picture, the app searches through a huge database of objects and brand names to find a match for your picture. The app then tells you what it sees.

I tend to use it when I need quick information, such as the flavour of a tin of soup or the colour of a piece of clothing, so it's not an app which can give a lot of detail – but the detail it can give can be remarkably accurate.

It does also take a little time to get used to where exactly to point the camera, especially if you're blind from birth (as I am), but the app is free to use, so you don't need to worry about the number of pictures you take.

The app also has a handy feature which allows you to use it to identify photos in your library, which I really like if I want to put a photo on Facebook but can't remember which one I want to use.

So, all in all, I'd really recommend having a play with this app. Have fun!

Hands-free access for Android

With iOS, Apple have firmly established themselves as the mobile device brand of choice for those with alternative access needs. The extensive accessibility features, wide range of AT apps and third-party hardware, as well as iOS's familiarity, ease of use and security, all make it a choice that is hard to look beyond. Yet this is exactly what many people do: 1.3 billion Android devices were shipped in 2015, that's 55% of all computing devices, mobile or otherwise. A large majority of these would be budget smartphones or tablets purchased in developing markets, where the price tag associated with Apple products could be considered prohibitive. There are, however, reasons other than cost to choose Android, and thankfully Google have been quietly working away to give you even more.

One in particular, currently in beta testing (click here to apply), is Voice Access. As its name suggests, this new accessibility feature (and that is what it is being developed as, immediately distinguishing it from previous speech recognition apps) allows complete access to your device through voice alone. I'll let Google describe it: "Voice Access is an accessibility service that lets you control your Android device with your voice. Using spoken commands, you can activate on-screen controls, launch apps, navigate your device, and edit text. Voice Access can be helpful for users for whom using a touch screen is difficult." It certainly sounds promising and, if these aspirations can be realised, it will be very welcome indeed. Voice control of mobile devices is something we are frequently asked about in Enable Ireland's Assistive Technology Training Service. I'll post more on Voice Access after I've had the opportunity to test it a bit more. In the meantime take a look at the video below to whet your appetite.

Another alternative access option now available to Android users is a third-party application developed and promoted by CREA, with the support of Fundación Vodafone España, called EVA Facial Mouse. EVA Facial Mouse has been created by the same people who brought us Enable Viacam for Windows and Linux, and seems to be a mobile version of that popular and effective camera input system. EVA uses a combination of the front-facing camera and face recognition to allow the user to position the cursor and click on icons without having to touch the device. See the video below for more on EVA (Spanish with subtitles).

Reviews of EVA on Google Play are mostly positive, with many of the negative reviews most probably explained by device-specific incompatibilities. This remains the primary difficulty associated with the use and support of Android-based devices as Assistive Technology. Not all Android devices are created equal, and how they handle apps can vary significantly depending on the resources they have available (CPU/RAM) and how Android features (pointing device compatibility in this case) are implemented. That said, on the right device both of the new access options mentioned above could mean greatly improved access efficiency for two separate user groups who have until now had to rely primarily on switch access. Next week I will release a post reviewing current Android phones and follow that up with a couple of in-depth reviews of the above apps and their compatibility with selected Android devices and other third-party AT apps like ClickToPhone.

CALL Scotland webinars

Users hand on computer keyboard with webinar graphic in the foreground
Webinars are a convenient way to view live presentations delivered to your computer or tablet over the web. During a presentation, as well as listening to the presenter, you can view documents and programs and, most importantly, ask questions about the live presentation.
CALL Scotland are hosting a range of free assistive technology related webinars that look interesting.

Some upcoming webinars

Widgit Software Apps for Supporting Language and Literacy
Wednesday, 04 May 2016

Creating Accessible Documents in Word
Wednesday, 11 May 2016

Exploring the Creativity iPad App Wheel
Wednesday, 18 May 2016

There is also a selection of archived webinar videos.
Click for more information

Wheelmap.org

find wheelchair accessible places

Barriers in public environments constantly prevent mobility-impaired people from free movement and participation. A narrow doorway here, a step there – that's all it takes. Wheelmap looks like a very worthwhile project. It is an open and free online map of wheelchair-accessible places, which empowers users to share and access information on the wheelchair accessibility of public places. Anyone can participate by tagging places.

Map of accessible places in Dublin

Places that are not yet marked have a grey tag and can be quickly and easily marked by everyone. The crowdsourced information is free, easy to understand and can be shared with everyone.

Logged in users can upload photos to places or write comments to further describe the wheelchair accessibility of a place! This additional information makes it easy for mobility-impaired users to determine whether they can access the place or not.
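The tagging follows a simple traffic-light scheme. As a rough sketch (the colour values below are assumptions, apart from the grey tag for unmarked places mentioned above, and the function name is made up for illustration), here is how a place's crowdsourced tag could be turned into a map marker:

```python
# Sketch of mapping a crowdsourced accessibility tag to a marker colour.
# The specific colours for yes/limited/no are assumptions; grey for
# unmarked places is described in the text above.

TAG_COLOURS = {
    "yes":     "green",   # fully wheelchair accessible
    "limited": "yellow",  # partially accessible
    "no":      "red",     # not accessible
}

def marker_colour(tags):
    """Return the map-marker colour for a place's tag dictionary.
    Places with no accessibility tag fall back to grey (unmarked)."""
    return TAG_COLOURS.get(tags.get("wheelchair"), "grey")

print(marker_colour({"name": "Cafe", "wheelchair": "yes"}))  # → green
print(marker_colour({"name": "Shop"}))                       # → grey
```

Because unmarked places default to grey rather than being hidden, the map itself shows users exactly where their contribution is still needed.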

Get tagging!

App Review: Be My Eyes for iPhone

 

So, first of all, I need to nail my colours to the mast here, so to speak: I’m a huge Apple fan. This is mainly because, since 2009, all of Apple’s products have come with built-in screenreading technology, which enables someone who is blind – such as myself – to interact with an iPhone completely independently.

 

In the last seven years, many, many apps have been developed for the specific use of blind users. I use a lot of these, which I might talk about in future posts, but today I'd like to mention one in particular – Be My Eyes (www.bemyeyes.org), an app which allows blind people to "borrow" the eyes of a sighted volunteer through a live video chat system.

This app is very simple to use, is free on iOS (an Android version is still in the works), and means that, for me, I'm not always relying on the same people to help me.

Its uses are endless – because blind people might have scaled mountains and crossed the South Pole, but we still can't read the expiry date on a packet of ham without help.

Since I discovered Be My Eyes three days ago, I’ve used it for everything from the trivial – making sure my outfit matched when I was going on a night out – to the more important – not mixing up cough syrup with another medicine.

 

For me, as for most people, independence is all about choices: I can struggle for the sake of pride, or I can seek a little help. Be My Eyes allows me to ask for that help without feeling self-conscious or like I’m asking the same people repeatedly.

 

So, whether you’re sighted and fancy a little volunteering , or you have a visual impairment and need to know when your milk is about to go off, then this is a really handy little app.

 

If you've used this app, or have any other app recommendations, it'd be great to hear your thoughts!

Note: DO NOT GIVE OUT PERSONAL INFORMATION OVER THE APP