Today, May 18th, is Global Accessibility Awareness Day, and to mark the occasion Apple have produced a series of 7 videos (also available with audio description) highlighting how their products are being used in innovative ways by people with disabilities. All the videos are available in a playlist here, and I guarantee that if you haven’t seen them and you are interested in accessibility and AT, they’ll be the best 15 minutes you spend today! Okay, the cynical among you will point out that this is self-promotion by Apple, a marketing exercise. On one level of course it is: they are a company, and like any company their very existence depends on generating profit for their shareholders. These videos promote more than Apple, however; they promote independence, creativity and inclusion through technology. Viewed in this light, these videos illustrate to people with disabilities how far technology has moved on in recent years and make them aware of the potential benefits to their own lives. Hopefully the knock-on effect of this increased awareness will be increased demand. Demand these technologies, people, it’s your right!
As far as a favorite video from this series goes, everyone will have their own. In terms of the technology on show, Todd “The Quadfather” (below) was possibly the most interesting to me.
This video showcases Apple’s HomeKit range of associated products and how they can be integrated with Siri.
My overall favorite video, however, is Patrick: musician, DJ and cooking enthusiast. Patrick’s video is an ode to independence and creativity. The technologies he demonstrates are Logic Pro (digital audio workstation software) with VoiceOver (Apple’s inbuilt screen-reader) and the object recognition app TapTapSee, which, although it has been around for several years now, is still an amazing use of technology. It’s Patrick’s personality that makes the video though; this guy is going places, and I wouldn’t be surprised if he had his own prime time TV show this time next year.
Last Friday (February 17th) New Scientist published an article about a new app in development at Microsoft called GazeSpeak. Due to be released over the coming months on iOS, GazeSpeak aims to facilitate communication between a person with MND (known as ALS in the US; I will use both terms interchangeably) and another individual, perhaps their partner, carer or friend. Developed by Microsoft intern Xiaoyi Zhang, GazeSpeak differs from traditional approaches in a number of ways. Before getting into the details, however, it’s worth looking at the background. GazeSpeak didn’t come from nowhere; it’s actually one of the products of some heavyweight research into Augmentative and Alternative Communication (AAC) that has been taking place at Microsoft over the last few years. Since 2013, inspired by football legend and ALS sufferer Steve Gleason (read more here), Microsoft researchers and developers have brought their considerable collective intellect to bear on increasing the ease and efficiency of communication for people with MND.
This is an entirely new approach to increasing the efficiency of AAC, and one that, I suggest, could only have come from a large mainstream tech organisation with over thirty years’ experience facilitating communication and collaboration.
Another Microsoft research paper published last year (with some of the same authors as the previous paper), called “Exploring the Design Space of AAC Awareness Displays”, looks at the importance of a communication partner’s “awareness of the subtle, social, and contextual cues that are necessary for people to naturally communicate in person”. Their research focused on creating a display that would allow the person with ALS to express things like humor, frustration and affection: emotions difficult to express with text alone. Yes, they proposed the use of emoji, a proven and effective way of overcoming a similar difficulty in remote or non-face-to-face interactions, but they went much further and also looked at solutions like avatars, skins and even coloured LED arrays. This, like the paper above, is an academic publication and as such not an easy read, but the ideas and solutions being proposed by these researchers are practical and will hopefully filter through to end users of future AAC solutions.
That brings us back to GazeSpeak, the first fruit of the Microsoft/Steve Gleason partnership to reach the general public. Like the AACrobat solution outlined above, GazeSpeak gives the communication partner a tool rather than focusing on tech for the person with MND. As the image below illustrates, the communication partner would have GazeSpeak installed on their phone, and with the app running they would hold their device up to the person with MND as if they were photographing them. A sticker with four grids of letters is placed on the back of the smartphone, facing the speaker. The app then tracks the person’s eyes: up, down, left or right, each direction indicating that the letter they are selecting is contained in the grid in that direction (see photo below).
Similar to how the old T9 predictive text system worked, GazeSpeak selects the appropriate letter group from each eye movement and predicts the word based on the most common English words. So the app is using AI in the form of machine vision to track the eyes, and also to make the word prediction. The New Scientist article mentions that the user will be able to add their own commonly used words and people/place names, which one assumes will prioritize them within the prediction list. In the future, perhaps some capacity for learning could be added to further increase efficiency. After using this system for a while, the speaker may not even need to see the sticker with the letters; they could write words from muscle memory. At that stage a simple QR code leading to the app download would allow them to communicate with complete strangers using just their eyes and no personal technology.
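To make the T9-style disambiguation concrete, here is a minimal Python sketch of the idea. The four-way letter grouping, word list and frequencies are invented for illustration; GazeSpeak’s actual grouping and language model will certainly differ.

```python
# Illustrative sketch of T9-style disambiguation from gaze directions.
# The grouping, words and frequencies below are made up for this example.

GROUPS = {
    "up": set("abcdef"),
    "right": set("ghijkl"),
    "down": set("mnopqr"),
    "left": set("stuvwxyz"),
}

# Tiny stand-in frequency list; a real system would use a large corpus
# plus the user's own commonly used words and names.
WORD_FREQ = {"the": 100, "to": 90, "see": 40, "tea": 20, "dog": 30}

def group_of(letter):
    """Return the gaze direction whose letter grid contains this letter."""
    for direction, letters in GROUPS.items():
        if letter in letters:
            return direction
    return None

def predict(gaze_sequence):
    """Return dictionary words matching the sequence of gaze directions,
    most frequent first (like T9 ranking words for a key-press sequence)."""
    matches = [
        w for w in WORD_FREQ
        if len(w) == len(gaze_sequence)
        and all(group_of(c) == d for c, d in zip(w, gaze_sequence))
    ]
    return sorted(matches, key=WORD_FREQ.get, reverse=True)

print(predict(["left", "up", "up"]))  # ['see', 'tea']
```

Both “see” and “tea” fit the same three gaze directions, so the app’s job is to rank the candidates; adding the user’s own words would simply boost their position in this list.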
There is of course some crossover between the different AT highlights of 2016 I have included here. An overall theme running through all of this year’s highlights is the mainstreaming of AT. Apple, Google and Microsoft have all made significant progress in the areas previously mentioned: natural language understanding and smart homes. This has led to easier access to computing devices and, through them, the ability to automate and remotely control devices and services that assist us with daily living tasks around the house. However, these developments are aimed at the mainstream market, with advantages to AT users being a welcome additional benefit. What I want to look at here are the features they are including in their mainstream products specifically aimed at people with disabilities, with the goal of making their products more inclusive. Apple have always been strong in this area and have led the way for the last five years. 2016 saw them continue this fine work with new features such as Dwell within macOS and Touch Accommodations in iOS 10, as well as many refinements of existing features. Along with Siri, Apple have also brought Switch Control to Apple TV, either using a dedicated Bluetooth switch or through a connected iOS device in a method they are calling Platform Switching. Platform Switching, which also came out this year with iOS 10, “allows you to use a single device to operate any other devices you have synced with your iCloud account. So you can control your Mac directly from your iPhone or iPad, without having to set up your switches on each new device” (the devices need to be on the same WiFi network). The video below from Apple really encapsulates how far they have come in this area and how important this approach is.
Not to be outdone, Microsoft bookended 2016 with some great features in the area of literacy support, an area they had perhaps neglected for a while. They more than made up for this last January with the announcement of Learning Tools for OneNote. I’m not going to go into the details of what Learning Tools offers, as I have covered it in a previous post. All I’ll say is that it is free, it works with OneNote (also free, and a great note-taking and organisation support in its own right) and is potentially all many students would need by way of literacy support (obviously some students may need additional supports). Then in the fourth quarter of the year they updated their OCR app Office Lens for iOS to provide the Immersive Reader (text to speech) directly within the app.
Finally, Google, who would probably have the weakest record of the big three in terms of providing inbuilt accessibility features (to be fair, they have always followed a different approach, which has proved equally effective), really hit a home run with their Voice Access solution, which was made available for beta testing this year. Again, I have discussed this in a previous post here, where you can read about it in more detail. Having tested it, I can confirm that it gives complete voice access to all of an Android device’s features as well as any third-party apps I tested. Using a combination of direct voice commands (Open Gmail, Swipe left, Go Home etc.) and a system of numbering buttons and links, even obscure apps can be operated. The idea of using numbers for navigation, while not new, is extremely appropriate in this case: numbers are easily recognised regardless of voice quality or regional accent. Providing alternative access and supports to mainstream operating systems is the cornerstone of recent advances in AT. As the previous video from Apple showed, access to smartphones or computers gives access to a vast range of services and activities. For example, inbuilt accessibility features like Apple’s Switch Control or Google’s Voice Access open up a range of mainstream smart home and security devices and services to people with alternative access needs, where before they would have had to spend a lot more for a specialist solution that would probably have been inferior.
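The numbered-overlay idea is simple enough to sketch in a few lines of Python. This is not Google’s Voice Access code, just an illustration of the principle: label every actionable control with a number, then resolve a recognised spoken number back to the control it marks. The control names here are hypothetical.

```python
# Illustrative sketch (not Voice Access itself) of numbered navigation:
# each on-screen control gets a number label, and a spoken number is
# mapped back to the control. Numbers are used because they are
# recognised reliably regardless of voice quality or accent.

def label_controls(names):
    """Assign 1-based number labels to a list of on-screen controls."""
    return {str(i + 1): name for i, name in enumerate(names)}

def resolve_command(spoken, labels):
    """Map a recognised spoken number back to the control it labels."""
    return labels.get(spoken)

# Hypothetical screen with three controls.
labels = label_controls(["Compose", "Search", "Settings"])
print(resolve_command("2", labels))  # Search
```

In the real feature the labels are drawn over the live UI, but the lookup from spoken number to target control is the essence of why even obscure, unlabelled apps can be operated.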
As we approach the end of 2016 it’s an appropriate time to look back and take stock of the year from an AT perspective. A lot happened in 2016, not all of it good. Socially, humanity seems to have regressed over the past year. Maybe this short-term, inward-looking protectionist sentiment had been brewing for longer, but 2016 brought the opportunity to express it politically; you know the rest. While society steps back and looks inward, technology continues to leap forward, and 2016 has seen massive progress in many areas, particularly those associated with Artificial Intelligence (AI) and Smart Homes. This is the first in a series of posts examining some technology trends of 2016 and how they affect the field of Assistive Technology. The links will become active as the posts are added. If I’m missing something, please add it to the comments section.
So although 2016 is unlikely to be looked on kindly by future historians (you know why), it has been a great year for Assistive Technology, perhaps one of promise rather than realisation. One major technology trend of 2016 missing from this series of posts is Virtual (or Augmented) Reality. While VR was everywhere this year, with products coming from Sony, Samsung, Oculus and Microsoft, its usefulness beyond gaming is only beginning to be explored (particularly within education).
So what are the goals for next year? Well, harnessing some of these innovations in a way that makes them accessible and usable by people with disabilities at an affordable price. If in 2017 we can start putting some of this tech into the hands of those who stand to benefit most from its use, then next year will be even better.
Microsoft announced earlier this week that they are building on the success of their much-acclaimed literacy support suite for OneNote, Learning Tools, by making some of its features available within other products. First though, if you haven’t come across Learning Tools for OneNote, take a look at the video below for an outline of what it offers. Take it away, Jeff…
As you can see from the video, by offering Text To Speech (TTS) with highlighting, easy-to-read fonts on distraction-free, high-visibility backgrounds, as well as comprehension supports, Learning Tools could be very useful to those who need a little assistance with text-based content. Learning Tools was originally only available for the version of OneNote which comes bundled with Office 2013 and 2016. However, earlier this week Microsoft announced that they are bringing some features to other apps, the most interesting and potentially useful of these being Office Lens and Word. Office Lens is already a very useful multi-platform app with powerful optical character recognition (OCR) capabilities which allow you to photograph a document and have it converted to editable text. Now, with the addition of the Immersive Reader functionality of Learning Tools, you can photograph a document, export it to Immersive Reader and immediately use the tools mentioned above to support your understanding of the text. For the moment this feature is only available on Office Lens for iOS, but my understanding is that it is their intention to gradually roll it out to other platforms.
Within Word even more functionality is offered through the new Editor feature. This includes dictionary supports, such as synonyms for suggested corrections of misspelled words that can be read aloud with TTS, and additional support for commonly confused words. I’ll leave it to Jeff again for a full review of the new features (video below).
The very cleverly named TapTapSee is an app (you may have noticed, I like apps a lot!) which allows you to identify objects by simply taking a picture. Once you’ve taken the picture, the app searches through a huge database of objects and brand names to find a match for your picture. The app then tells you what it sees.
I tend to use it when I need quick information, such as the flavour of a tin of soup or the colour of a piece of clothing, so it’s not an app which can give a lot of detail – but the detail it can give can be remarkably accurate.
It does also take a little time to get used to where exactly to point the camera, especially if you’re blind from birth (as I am), but the app is free to use, so you don’t need to worry about the number of pictures you take.
The app also has a handy feature which allows you to use it to identify photos in your library, which I really like if I want to put a photo on Facebook but can’t remember which one I want to use.
So, all in all, I’d really recommend having a play with this app. Have fun!
With iOS, Apple have firmly established themselves as the mobile device brand of choice for those with alternative access needs. The extensive accessibility features, wide range of AT apps and third-party hardware, as well as iOS’ familiarity, ease of use and security, all make it a choice hard to look beyond. Yet this is exactly what many people do: 1.3 billion Android devices were shipped in 2015, that’s 55% of all computing devices, mobile or otherwise. A large majority of these would be budget smartphones or tablets purchased in developing markets, where the price tag associated with Apple products could be considered prohibitive. There are, however, reasons other than cost to choose Android, and thankfully Google have been quietly working away to give you even more. One in particular, currently in beta testing (click here to apply), is Voice Access. As its name suggests, this new accessibility feature (and that is what it is being developed as, immediately distinguishing it from previous speech recognition apps) allows complete access to your device through voice alone. I’ll let Google describe it: “Voice Access is an accessibility service that lets you control your Android device with your voice. Using spoken commands, you can activate on-screen controls, launch apps, navigate your device, and edit text. Voice Access can be helpful for users for whom using a touch screen is difficult.” It certainly sounds promising, and if these aspirations can be realised it will be very welcome indeed. Voice control of mobile devices is something we are frequently asked about in Enable Ireland’s Assistive Technology Training Service. I’ll post more on Voice Access after I’ve had the opportunity to test it a bit more. In the meantime, take a look at the video below to whet your appetite.
Another alternative access option now available to Android users is a third-party application developed and promoted by CREA with the support of Fundación Vodafone España called EVA Facial Mouse. EVA Facial Mouse has been created by the same people who brought us Enable Viacam for Windows and Linux and seems to be a mobile version of that popular and effective camera input system. EVA uses a combination of the front-facing camera and face recognition to allow the user to position the cursor and click on icons without having to touch the device. See the video below for more on EVA (Spanish with subtitles).
Reviews of EVA on Google Play are mostly positive, with many of the negative reviews most probably explained by device-specific incompatibilities. This remains the primary difficulty associated with the use and support of Android-based devices as Assistive Technology. Not all Android devices are created equal, and how they handle apps can vary significantly depending on the resources they have available (CPU/RAM) and how Android features (pointing device compatibility in this case) are implemented. That said, on the right device both new access options mentioned above could mean greatly improved access efficiency for two separate user groups who until now have had to rely primarily on switch access. Next week I will release a post reviewing current Android phones and follow that up with a couple of in-depth reviews of the above apps and their compatibility with selected Android devices and other third-party AT apps like ClickToPhone.
So, first of all, I need to nail my colours to the mast here, so to speak: I’m a huge Apple fan. This is mainly because, since 2009, all of Apple’s products have come with built-in screenreading technology, which enables someone who is blind – such as myself – to interact with an iPhone completely independently.
In the last seven years, many, many apps have been developed for the specific use of blind users. I use a lot of these, which I might talk about in future posts, but today I’d like to mention one in particular – Be My Eyes:
Be My Eyes is an app which allows blind people to “borrow” the eyes of a sighted volunteer through a live video chat system.
This app is very simple to use, is free on iOS (an Android version is still in the works), and means that I’m not always relying on the same people to help me.
Its uses are endless, because blind people might have scaled mountains and crossed the South Pole, but we still can’t read the expiry date on a packet of ham without help.
Since I discovered Be My Eyes three days ago, I’ve used it for everything from the trivial – making sure my outfit matched when I was going on a night out – to the more important – not mixing up cough syrup with another medicine.
For me, as for most people, independence is all about choices: I can struggle for the sake of pride, or I can seek a little help. Be My Eyes allows me to ask for that help without feeling self-conscious or like I’m asking the same people repeatedly.
So, whether you’re sighted and fancy a little volunteering, or you have a visual impairment and need to know when your milk is about to go off, this is a really handy little app.
If you’ve used this app, or have any other app recommendations, it’d be great to hear your thoughts!
Note: DO NOT GIVE OUT PERSONAL INFORMATION OVER THE APP
Having a mobile device that is accessible for you will mean it is more usable, reliable and efficient, and will help eliminate frustration in using the device. This upcoming webinar on Accessibility Features in iOS 9 and Android-based Devices will be of interest to anyone who needs to learn what’s new in the recent updates of mobile operating systems.
It’s hosted by AbleNet University Live Webinars on 16th February at 8pm GMT and lasts 1 hour.
We are frequently contacted in Enable Ireland’s AT service by people asking us to recommend accessible phones. It might be for a grandparent or parent, someone with cognitive or access difficulties, or just people who like their technology simple and functional. The brand we usually recommend for simple accessible phones is Doro (http://www.doro.ie). They have a range of phones on their website (pictured below), mostly standard or feature phones (rather than smartphones) based on the classic clamshell and candy bar designs. Most mobile providers (I can confirm Vodafone and 3Mobile) offer at least one Doro as a prepay option; however, to get access to the full range you may need to buy an unlocked SIM-free device from an online retailer like Amazon. The smartphone in the picture below is not listed on the Irish Doro website but can be purchased unlocked online.
Customise an Android
Another option would be to customise an Android phone. There is a range of sub-€100 smartphones available from all mobile carriers, and while it is true that you get what you pay for with smartphones, the quality gap between budget and premium has never been smaller. Use Google to search for reviews and lists of the best smartphones in your price range, and check the specs and reviews on GSMArena for the model you are interested in (don’t be too put off by the user-submitted comments on this site, which are almost always negative). Finally, and most importantly, go into a bricks-and-mortar shop and get hands-on with the device.
Many older people (and younger people too!) have difficulty with touchscreens, but you can make an Android phone easier to use by installing a launcher app. A launcher app sits over the normal Android user interface, hiding as much of the complexity as you think appropriate. As your granddad (or whoever the user is) gets used to the device, you can give him other options like the Internet or the camera. Below are some Android launchers available on the Google Play store; many are specifically designed for older users and users with low vision. There are often free versions available that are either supported by advertising or have limited functionality. It’s great to be able to test an app like this before parting with money, but be wary that versions supported by advertising may lead to confusion, inappropriate content or even malware if an ad is clicked, and so may not be appropriate for some users.
Wiser – Simple Launcher
Wiser – Simple Launcher (pictured above) looks like a very nicely designed interface that is simple yet manages to avoid the Fisher-Price-type design trap that some others fall into.
Microsoft’s Windows Phone offers a very user-friendly and customisable home screen. The tiles of the interface can easily be removed or enlarged, offering easy access to only what is required. The main reason the Windows platform is less popular on mobile devices is the lack of available apps (this is changing), but this may not be an issue for this user group. If you want to take advantage of the nice Windows UI yet still use an Android-based phone, you could try Launcher 8. Of course, you could just get the real thing and go for a Microsoft Lumia phone (pictured above).
Another difficulty faced by some users of touchscreen smartphones is using the on-screen keyboard. Fortunately there are a number of Android keyboard apps that offer assistance in this regard also.
If the user is a “hunt and peck” style typist, Thick Buttons might work best. It enlarges the keys it thinks will most probably be used next. This makes the keyboard easier to use while offering subtle assistance to those unfamiliar with the QWERTY layout or those with literacy difficulties.
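The core idea of enlarging the most probable next keys can be sketched very simply. This is not Thick Buttons’ actual algorithm, just an illustration of one way a keyboard could rank likely next letters, using a toy word list; a real keyboard would use a large dictionary and typing statistics.

```python
# Sketch of next-key prediction for an assistive keyboard: count which
# letters continue the typed prefix across a word list, then rank them.
# Keys high in the ranking would be drawn larger on screen.
# The word list is a toy example, not a real dictionary.

from collections import Counter

WORDS = ["the", "then", "there", "this", "that", "thick", "button"]

def next_letter_ranking(prefix):
    """Letters most likely to follow `prefix`, most likely first."""
    counts = Counter(
        w[len(prefix)]
        for w in WORDS
        if w.startswith(prefix) and len(w) > len(prefix)
    )
    return [letter for letter, _ in counts.most_common()]

print(next_letter_ranking("th"))  # ['e', 'i', 'a']
```

After typing “th”, the letters e, i and a are the only continuations in this toy list, so those three keys would be enlarged, with “e” biggest.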
Many users who find touchscreens difficult find that using a stylus helps a great deal. If you are going to try this, choose a device with a large screen like a Galaxy Note or a less expensive Asus. These large phones are often referred to as… wait for it… Phablets https://en.wikipedia.org/wiki/Phablet (just to prove I didn’t make that up).
Voice Commands and Speech Recognition
Speech recognition may not be an option for all users, or it might be a bridge too far for some technophobes. If the initial fear is overcome, however, it is a very natural way of interacting with your device, and it is becoming more accurate every day. Android, iOS and Windows all offer digital personal assistants (Google Now, Siri and Cortana) that allow you to access many phone functions with voice alone. There are also apps that offer increased functionality in this area. Do a search for speech recognition on your app store of choice, but stick with popular downloads and have a read through the user comments before installing.