With just an Android phone, a deaf person or someone who is
hard of hearing can have a conversation with anyone. Live Transcribe is an app that types captions
accurately in the language that’s being spoken. It’s powered by Google’s speech
recognition technology and there are 70 languages to choose from.
Live Transcribe is easy to use anywhere you have a Wi-Fi or mobile network connection, and it's free to download.
The video below demonstrates how the app can be used.
According to Dr. Mohammad Obiedat, Professor at Gallaudet University:
“Live Transcribe gives me a more flexible and efficient way to communicate with hearing people. I just love it, it really changed the way I solve my communication problem.”
And what’s next?
Google are currently working on the Live Relay project which
aims to make phone calls easier for individuals who are deaf or non-speaking.
Live Relay uses on-device speech recognition and text-to-speech conversion to allow the phone to listen and speak on the users’ behalf while they type. By offering instant responses and predictive writing suggestions, Smart Reply and Smart Compose will help make typing fast enough to hold phone calls without any significant delays. Follow @googleaccess for updates.
The good: the captioning accuracy is excellent.
The not so good: No
Works really well, a valuable tool for individuals who are deaf or hard of hearing.
One aspect of modern technological life that might help us keep some faith in humanity is the comprehensive set of assistive technologies that are built into, or free to download for, mobile computing devices. Accessibility features, as they are loosely called, are a range of tools designed to support non-standard users of the technology. If you can’t see the screen very well you can magnify text and icons (1) or use high contrast (2). If you can’t see the screen at all you can have the content read back to you using a screen reader (3). There are options to support touch input (4, 5) and options to use devices hands-free (6). Finally, there are also some supports for deaf and hard of hearing (HoH) people, like the ability to switch to mono audio, or visual or haptic alternatives to audio-based information.
With their mobile operating system iOS, Apple do accessibility REALLY well, and this is reflected in the numbers. In the 2018 WebAIM Survey of Users with Low Vision there were over three times as many iOS users as Android users. That is almost the exact reverse of the general population (roughly 3 to 1 in favour of Android). For those with motor difficulties the difference was less pronounced, but iOS was still favoured.
So what are Apple doing right? Well obviously, first and foremost, the credit would have to go to their developers and designers for producing such innovative and well implemented tools. But Google and other Android developers are also producing some great AT, often highlighting some noticeable gaps in iOS accessibility. Voice Access, EVA Facial Mouse and basic pointing device support are some examples, although these are gaps that will soon be filled if reports of coming features to iOS 13 are to be believed.
Rather than being just about the tools, it is as much, if not more, about awareness of those tools: where to find them and how they work. On every Apple mobile device you go to Settings > General > Accessibility and you will find Vision (1, 2, 3), Interaction (4, 5, 6) and Hearing settings. I’m deliberately not naming these settings here so that you can play a little game with yourself and see if you know what they are. I suspect most readers of this blog will get 6 from 6, which should help make my point. You can check your answers at the bottom of the post 🙂

This was always the problem with Android devices. Where Apple iOS accessibility is like a tool belt, Android accessibility is like a big bag. There is probably more in there, but you have to find it first. This isn’t Google’s fault; they make great accessibility features. It’s more a result of the open nature of Android. Apple make their own hardware and iOS is designed specifically for that hardware. It’s much more locked down. Android is an open operating system, and as such it depends on the hardware manufacturer how accessibility is implemented. This has been slowly improving in recent years, but Google’s move to bundle all their accessibility features into the Android Accessibility Suite last year meant a huge leap forward in Android accessibility.
What’s in Android Accessibility Suite?
Accessibility Menu
Use this large on-screen menu to control gestures, hardware buttons, navigation and more. It is a similar idea to AssistiveTouch on iOS. If you are a Samsung Android user, it is similar to (though not as good, in my opinion, as) the Assistant Menu already built in.
Select to Speak
Select something on your screen or point your camera at an image to hear text spoken. This is a great feature for people with low vision or a literacy difficulty. It will read the text on screen when required without being always on like a screen reader. A similar feature was available inbuilt in Samsung devices before inexplicably disappearing with the last Android update. The “point your camera at an image to hear text spoken” claim had me intrigued. Optical Character Recognition like that found in Office Lens or SeeingAI built into the regular camera could be extremely useful. Unfortunately I have been unable to get this feature to work on my Samsung Galaxy A8. Even when selecting a headline in a newspaper I’m told “no text found at that location”.
Switch Access
Interact with your Android device using one or more switches or a keyboard instead of the touch screen. Switch Access on Android has always been the poor cousin of Switch Control on iOS but is improving all the time.
TalkBack Screen Reader
Get spoken, audible and vibration feedback as you use your device. Google’s mobile screen reader has been around for a while and, like Switch Access, is apparently improving, but I’ve yet to meet anybody who actually uses it full time.
So to summarise: as well as adding features that may have been missing on your particular “flavour” of Android, this suite standardises the accessibility experience and makes it more visible. Another exciting aspect of these features being bundled in this way is their availability for media boxes. Android is a hugely popular OS for TV and entertainment, but what is true of mobile device manufacturers is doubly so of Android box manufacturers, where it is still very much the Wild West. If you are in the market for an Android box and accessibility is important, make sure it’s running Android version 6 or later so you can install this suite and take advantage of these features.
Claro MagX is an app that converts your iPhone, iPad or Android device into a visual magnifier. It basically makes small items, such as small text in a book or newspaper, bigger. Just hold your phone up to whatever you want to magnify.
As the app can use the device’s built-in flash, it can be used in a dimly lit area. Advanced visual features include full-colour, two-colour and greyscale modes. The app features 16 levels of magnification, plus high-contrast and colour viewing options to make the text easier on your eyes. There is also a freeze mode: tap the viewfinder to freeze the image for closer viewing, and tap the screen again to release the freeze.
Tamas and Peter from route4u.org called in last week to tell us about their accessible route finding service. Based on Open Street Maps, Route4u allows users to plan routes that are appropriate to their level and method of mobility. Available on iOS, Android and as a web app at route4u.org/maps, Route4u is the best accessible route planning solution I have seen. Where a service like Mobility Mojo gives detailed accessibility information on destinations (business, public buildings), route4u concentrates more on the journey, making them complementary services. When first setting up the app you will be given the option to select either pram, active wheelchair, electronic wheelchair, handbike or walking (left screenshot below). You can further configure your settings later in the accessibility menu selecting curb heights and maximum slopes etc. (right screenshot below)
Further configure your settings in Accessibility
You are first asked to select your mobility method
This is great, but so far nothing really groundbreaking; we have seen services like this before. Forward-thinking cities with deep pockets like London and Ontario have had similar accessibility features built into their public transport route planners for the last decade. That is a lot easier to achieve, however, because you are dealing with a finite number of route options. Where Route4u is breaking new ground is that it facilitates this level of planning throughout an entire city. It does this by using the technology built into smartphones to provide crowdsourced data that constantly updates the maps. If you are using a wheelchair or scooter, the sensors on your smartphone can measure the level of vibration experienced on a journey. This data is sent back to Route4u, who use it to estimate the comfort experienced on that journey, giving other users access to even more information on which to base their route choice. The user doesn’t have to do anything; they are helping to improve the service simply by using it.

Users can also more proactively improve the service by marking obstacles they encounter on their journey. An obstacle can be marked as temporary or permanent. Temporary obstacles, like road works or those ubiquitous sandwich boards that litter our pavements, will remain on the map, helping to inform the accessibility of the route, until another user confirms they have been removed and enters that information.
Example of obstacle added by user
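Route4u's actual comfort algorithm isn't public, but the general idea of turning raw vibration data into a comfort rating can be sketched roughly like this. This is a hypothetical illustration in Python: the function name, the 3 m/s² roughness ceiling and the scoring formula are my own assumptions, not Route4u's.

```python
import math

def comfort_score(accel_z_samples, g=9.81):
    """Crude comfort estimate from vertical accelerometer samples (m/s^2).

    Uses the root-mean-square deviation from gravity as a roughness
    measure and maps it onto a 0 (very rough) to 1 (perfectly smooth) scale.
    """
    n = len(accel_z_samples)
    rms = math.sqrt(sum((a - g) ** 2 for a in accel_z_samples) / n)
    # Assumption: anything above 3 m/s^2 RMS is maximally uncomfortable.
    return max(0.0, 1.0 - rms / 3.0)

smooth_path = [9.8, 9.82, 9.79, 9.81]   # barely any vibration
cobbled_path = [7.5, 12.0, 8.0, 11.5]   # strong, jolting vibration

print(comfort_score(smooth_path) > comfort_score(cobbled_path))  # True
```

In reality the app would also need to filter out vibration caused by handling the phone and normalise for the mobility device in use, but the principle of aggregating many such scores per street segment is the same.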
If you connect Route4u to your Facebook account you get access to a points-based reward system. This allows you to compete with friends and have your own league table. In Budapest, where they are already well established, they have linked with sponsors who allow you to cash in points for more tangible rewards like a free breakfast or refreshment. These gamification features should help encourage users less inclined towards altruism to participate, and that is key: once established, Route4u relies on its users to keep information up to date. This type of service based on crowdsourced data is a proven model, particularly in the route planning sphere. It’s a bit of a catch-22, however, as a service needs to be useful first to attract users.

It is early days for Route4u in Dublin, and Tamas and Peter acknowledge that a lot of work needs to be done before promoting the service here. Over the next few months their team will begin mapping Dublin city centre; this way, when they launch, there will be the foundation of an accessible route finding service which people can use, update and build upon. While Route4u has obvious benefits for end users with mobility difficulties, there is another beneficiary of the kind of data this service will generate. Tamas and Peter were also keen to point out how this information could be used by local authorities to identify where infrastructure improvements are most needed and where investment will yield the most return. In the long run this will help Dublin and her residents tackle the accessibility problem from both sides, making it a truly smart solution.
We all know what it’s like being in school when the sun is shining outside and all you can think about is being out there! Or when you’re trying to get your homework done and all you can think about is who’s posting what on Snapchat or Instagram? Or have you ever found yourself managing to get a study block done and then taking a well-deserved 5-minute break to take a peek at social media, only to emerge from your phone half an hour later and way behind on your study schedule? Well, the following free apps are for you! In fact, they’re for anyone who wants to use their time on their computer or smartphone more productively, whether you’re a student or not.
Stay Focused is a free Google Chrome extension that helps you stay focused on your work by stopping you from looking at time-wasting websites (e.g. Instagram, Snapchat, Facebook, Twitter). You set a certain amount of time in the day that you’re allowed to look at those distracting websites, and once your allotted time for the day has been used up, it blocks you out of them. End of distractions! You can also choose to have a complete block on the websites that are your major culprits for time-wasting.
This one works in a similar way to Stay Focused but it’s for the Mozilla Firefox browser instead of Chrome. You can specify up to six sets of sites to block, with different times and days for each set (e.g. you could have Twitter blocked from 9am to 5pm and Facebook blocked for all but 10 minutes in every hour).
This is one of many apps that use the timing principle behind the Pomodoro Technique (i.e. you work for 25 minutes, then take a 5-minute break, and after four of these sessions you can take a longer break of 15 to 30 minutes). This Google Chrome extension helps you concentrate on your work by blocking a list of websites for the amount of time you’ve set, and once your working period is over, it’ll unblock those sites to give you a break from work before you hit those books again!
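For the curious, the timing cycle behind the Pomodoro Technique is simple enough to sketch in a few lines of Python. This is just a toy illustration of the schedule described above; the function name is mine, and the durations are the standard ones quoted in the paragraph.

```python
# Durations in minutes, per the classic Pomodoro Technique.
WORK, SHORT_BREAK, LONG_BREAK = 25, 5, 15
SESSIONS_BEFORE_LONG_BREAK = 4

def pomodoro_schedule(total_sessions):
    """Return the sequence of (phase, minutes) for a study session."""
    schedule = []
    for session in range(1, total_sessions + 1):
        schedule.append(("work", WORK))
        # Every fourth work session earns the longer break.
        if session % SESSIONS_BEFORE_LONG_BREAK == 0:
            schedule.append(("long break", LONG_BREAK))
        else:
            schedule.append(("short break", SHORT_BREAK))
    return schedule

for phase, minutes in pomodoro_schedule(4):
    print(f"{phase}: {minutes} min")
```

The apps above essentially run this loop for you, blocking the distracting sites during each "work" phase and unblocking them during the breaks.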
Offtime is an app for iOS and Android smartphones that not only lets you block calls, texts and notifications when you’re trying to work, but can also track your phone and app usage so you can identify what distracts you most. You can set different profiles, like School, Family and Me Time, and when you’ve finished your work it gives you an activity log with a list of everything that happened while you were working, so you don’t have to worry about missing out on anything.
So, with these apps you’ll be able to maximise your study time and even better, you’ll be able to look at all your favourite websites and apps guilt-free on your breaks!
In this podcast, Sarah Boland, together with David Deane and Áine Walsh, talks about the training they hosted on 21st June 2017 on the Mefacilyta Desktop app in St John of God in Stillorgan.
Mefacilyta Desktop is a new Android app developed by Vodafone Foundation Spain in conjunction with St John of God, which can be individually tailored to support people with intellectual disabilities to learn how to carry out their everyday activities independently.
The FLipMouse (Finger- and Lip mouse) is a computer input device intended to offer an alternative for people with access difficulties that prevent them from using a regular mouse, keyboard or touchscreen. It is designed and supported by the Assistive Technology group at the UAS Technikum Wien (Department of Embedded Systems) and funded by the City of Vienna (ToRaDes project and AsTeRICS Academy project). The device itself consists of a low-force (requiring minimal effort to operate) joystick that can be controlled with the lips, a finger or a toe. The lips are probably the preferred access method, as the FLipMouse also allows sip and puff input.
Sip and puff is an access method which is not as common in Europe as it is in the US; however, it is an ideal way to increase the functionality of a joystick controlled by lip movement. See the above link to learn more about sip and puff, but to give a brief explanation: it uses a sensor that monitors the air pressure in a tube. A threshold can be set (depending on the user’s ability) for high pressure (puff) and low pressure (sip). Once this threshold is passed it can act as an input signal, like a mouse click, switch input or key press, among other things. The FLipMouse also has two jack inputs for standard ability switches, as well as infrared in (for learning commands) and out (for controlling a TV or other environmental controls). All these features alone make the FLipMouse stand out against similar solutions; however, that’s not what makes the FLipMouse special.
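The threshold idea described above is easy to illustrate. The real FLipMouse firmware runs in C on its microcontroller, so the Python below is only a hypothetical sketch of the logic: the raw ADC values, threshold numbers and event names are all made up for illustration, and each threshold would be tuned to the individual user's ability.

```python
# Illustrative values only: a raw pressure reading near NEUTRAL means the
# user is neither sipping nor puffing; readings outside the thresholds
# fire an input event.
NEUTRAL = 512          # sensor reading at rest
PUFF_THRESHOLD = 650   # above this -> "puff" event
SIP_THRESHOLD = 400    # below this -> "sip" event

def classify(pressure):
    """Map one raw pressure sample to an input event (or None)."""
    if pressure > PUFF_THRESHOLD:
        return "puff"   # could be mapped to e.g. a left click
    if pressure < SIP_THRESHOLD:
        return "sip"    # could be mapped to e.g. a right click
    return None         # within the deadband: no event

samples = [510, 700, 515, 380, 512]
print([classify(p) for p in samples])  # [None, 'puff', None, 'sip', None]
```

The gap between the two thresholds acts as a deadband, so normal breathing near the mouthpiece doesn't trigger accidental clicks.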
The FLipMouse is the first of a new kind of assistive technology (AT) solution, not because of what it does but because of how it’s made. It is completely open source, which means that everything you need to make this solution yourself is freely available. The source code for the GUI (graphical user interface) used to configure the device, the code for the microcontroller (Teensy LC), the bill of materials listing all the components and the design files for the enclosure are all available on their GitHub page. The quality of the documentation distinguishes it from previous open source AT devices. The IKEA-style assembly guide clearly outlines the steps required to put the device together, making the build not only as simple as some of the more advanced Lego kits available but also as enjoyable. That said, unlike Lego, this project does require reasonable soldering skills and a steady hand; some parts are tricky enough to keep you interested. The process of constructing the device also gives a much better insight into how it works, which will undoubtedly come in handy should you need to troubleshoot problems at a later date.

Although, as stated above, AsTeRICS Academy provide a list of all components, a much better option in my opinion would be to purchase the construction kit, which contains everything you need to build your own FLipMouse, right down to the glue for the laser-cut enclosure, all neatly packed into a little box (pictured below). The kit costs €150 and all details are available from the FLipMouse page on the AsTeRICS Academy site. Next week I will post some video demonstrations of the device and look at the GUI which allows you to program the FLipMouse as a computer input device, accessible game controller or remote control.
I can’t overstate how important a development the FlipMouse could be to the future of Assistive Technology. Giving communities the ability to build and support complex AT solutions locally not only makes them more affordable but also strengthens the connection between those who have a greater requirement for technology in their daily life and those with the creativity, passion and in-depth knowledge of emerging technologies, the makers. Here’s hoping the FlipMouse is the first of many projects to take this approach.
There is of course some crossover between the different AT highlights of 2016 I have included here. An overall theme running through all the highlights this year is the mainstreaming of AT. Apple, Google and Microsoft have all made significant progress in the areas previously mentioned: natural language understanding and smart homes. This has led to easier access to computing devices and, through them, the ability to automate and remotely control devices and services that assist us with daily living tasks around the house. However, these developments are aimed at the mainstream market, with advantages to AT users being a welcome additional benefit. What I want to look at here are the features they are including in their mainstream products specifically aimed at people with disabilities, with the goal of making their products more inclusive. Apple have always been strong in this area and have led the way now for the last five years. 2016 saw them continue this fine work with new features such as Dwell within macOS and Touch Accommodations in iOS 10, as well as many other refinements of already existing features. Apple have also brought Siri and Switch Control to Apple TV, the latter using either a dedicated Bluetooth switch or a connected iOS device in a method they are calling Platform Switching. Platform Switching, which also came out this year with iOS 10, “allows you to use a single device to operate any other devices you have synced with your iCloud account. So you can control your Mac directly from your iPhone or iPad, without having to set up your switches on each new device” (the devices need to be on the same Wi-Fi network). The video below from Apple really encapsulates how far they have come in this area and how important this approach is.
Not to be outdone, Microsoft bookended 2016 with some great features in the area of literacy support, an area they had perhaps neglected for a while. They more than made up for this last January with the announcement of Learning Tools for OneNote. I’m not going to go into details of what Learning Tools offers, as I have covered it in a previous post. All I’ll say is that it is free, it works with OneNote (also free, and a great note-taking and organisation support in its own right) and is potentially all many students would need by way of literacy support (obviously some students may need additional supports). Then in the fourth quarter of the year they updated their OCR app Office Lens for iOS to provide the Immersive Reader (text to speech) directly within the app.
Finally, Google, who would probably have the weakest record of the big three in terms of providing inbuilt accessibility features (to be fair, they always followed a different approach, which proved to be equally effective), really hit a home run with their Voice Access solution, which was made available for beta testing this year. Again, I have discussed this in a previous post here, where you can read about it in more detail. Having tested it I can confirm that it gives complete voice access to all Android device features, as well as any third party apps I tested. Using a combination of direct voice commands (“Open Gmail”, “Swipe left”, “Go home” etc.) and a system of numbering buttons and links, even obscure apps can be operated. The idea of using numbers for navigation, while not new, is extremely appropriate in this case: numbers are easily recognised regardless of voice quality or regional accent. Providing alternative access and supports to mainstream operating systems is the cornerstone of recent advances in AT. As the previous video from Apple showed, access to smartphones or computers gives access to a vast range of services and activities. For example, inbuilt accessibility features like Apple’s Switch Control or Google’s Voice Access open up a range of mainstream smart home and security devices and services to people with alternative access needs, where before they would have had to spend a lot more for a specialist solution that would probably have been inferior.
As we approach the end of 2016 it’s an appropriate time to look back and take stock of the year from an AT perspective. A lot happened in 2016, not all of it good. Socially, humanity seems to have regressed over the past year. Maybe this short-term, inward-looking protectionist sentiment has been brewing for longer, but 2016 brought the opportunity to express it politically; you know the rest. While society steps and looks back, technology continues to leap and bound forward, and 2016 has seen massive progress in many areas, particularly those associated with Artificial Intelligence (AI) and smart homes. This is the first in a series of posts examining some technology trends of 2016 and how they affect the field of Assistive Technology. The links will become active as the posts are added. If I’m missing something, please add it to the comments section.
So although 2016 is unlikely to be looked on kindly by future historians… you know why; it has been a great year for Assistive Technology, perhaps one of promise rather than realisation, however. One major technology trend of 2016 missing from this series of posts is Virtual (or Augmented) Reality. While VR was everywhere this year, with products coming from Sony, Samsung, Oculus and Microsoft, its usefulness beyond gaming is only beginning to be explored (particularly within education).
So what are the goals for next year? Well harnessing some of these innovations in a way where they can be made accessible and usable by people with disabilities at an affordable price. If in 2017 we can start putting some of this tech into the hands of those who stand to benefit most from its use, then next year will be even better.
With iOS, Apple have firmly established themselves as the mobile device brand of choice for those with alternative access needs. The extensive accessibility features, wide range of AT apps and third party hardware, as well as iOS’ familiarity, ease of use and security, all make it a choice hard to look beyond. Yet this is exactly what many people do: 1.3 billion Android devices were shipped in 2015, that’s 55% of all computing devices, mobile or otherwise. A large majority of these would be budget smartphones or tablets purchased in developing markets, where the price tag associated with Apple products could be considered prohibitive. There are, however, reasons other than cost to choose Android, and thankfully Google have been quietly working away to give you even more. One in particular, which is currently in beta testing (click here to apply), is Voice Access. As its name suggests, this new accessibility feature (and that is what it is being developed as, immediately distinguishing it from previous speech recognition apps) allows complete access to your device through voice alone. I’ll let Google describe it: “Voice Access is an accessibility service that lets you control your Android device with your voice. Using spoken commands, you can activate on-screen controls, launch apps, navigate your device, and edit text. Voice Access can be helpful for users for whom using a touch screen is difficult.” It certainly sounds promising and, if these aspirations can be realised, will be very welcome indeed. Voice control of mobile devices is something we are frequently asked about in Enable Ireland’s Assistive Technology Training Service. I’ll post more on Voice Access after I’ve had the opportunity to test it a bit more. In the meantime, take a look at the video below to whet your appetite.
Another alternative access option now available to Android users is a third party application developed and promoted by CREA, with the support of Fundación Vodafone España, called EVA Facial Mouse. EVA Facial Mouse has been created by the same people who brought us Enable Viacam for Windows and Linux and seems to be a mobile version of that popular and effective camera input system. EVA uses a combination of the front-facing camera and face recognition to allow the user to position the cursor and click on icons without having to touch the device. See the video below for more on EVA (Spanish with subtitles).
Reviews of EVA on Google Play are mostly positive, with many of the negative reviews most probably explained by device-specific incompatibilities. This remains the primary difficulty associated with the use and support of Android-based devices as Assistive Technology. Not all Android devices are created equal, and how they handle apps can vary significantly depending on the resources they have available (CPU/RAM) and how Android features (pointing device compatibility in this case) are implemented. That said, on the right device both new access options mentioned above could mean greatly improved access efficiency for two separate user groups who have up until now had to rely primarily on switch access. Next week I will post a review of current Android phones and follow that up with a couple of in-depth reviews of the above apps and their compatibility with selected Android devices and other third party AT apps like ClickToPhone.