One aspect of modern technological life that might help us keep some faith in humanity is the comprehensive set of assistive technologies built into, or free to download for, mobile computing devices. Accessibility features, as they are loosely called, are a range of tools designed to support non-standard users of the technology. If you can’t see the screen very well you can magnify text and icons (1) or use high contrast (2). If you can’t see the screen at all you can have the content read back to you using a screen reader (3). There are options to support touch input (4, 5) and options to use devices hands free (6). Finally, there are also some supports for deaf and hard of hearing (HoH) people, like the ability to switch to mono audio, or visual or haptic alternatives to audio-based information.
In their mobile operating system, iOS, Apple do accessibility REALLY well and this is reflected in the numbers. In the 2018 WebAIM Survey of Low Vision Users there were over three times as many iOS users as Android users. That is almost the exact reverse of the general population (3 to 1 in favour of Android). For those with motor difficulties the difference was less significant, but iOS was still favoured.
So what are Apple doing right? Well obviously, first and foremost, the credit has to go to their developers and designers for producing such innovative and well-implemented tools. But Google and other Android developers are also producing some great AT, often highlighting some noticeable gaps in iOS accessibility. Voice Access, EVA Facial Mouse and basic pointing device support are some examples, although these are gaps that will soon be filled if reports of features coming to iOS 13 are to be believed.
Rather than being just about the tools, it is as much, if not more, about awareness of those tools: where to find them and how they work. On every Apple mobile device you go to Settings > General > Accessibility and you will have Vision (1, 2, 3), Interaction (4, 5, 6) and Hearing settings. I’m deliberately not naming these settings here so that you can play a little game with yourself and see if you know what they are. I suspect most readers of this blog will get 6 from 6, which should help make my point. You can check your answers at the bottom of the post 🙂

This was always the problem with Android devices. Where Apple iOS accessibility is like a tool belt, Android accessibility is like a big bag. There is probably more in there but you have to find it first. This isn’t Google’s fault; they make great accessibility features. It’s more a result of the open nature of Android. Apple make their own hardware and iOS is designed specifically for that hardware. It’s much more locked down. Android is an open operating system and as such it depends on the hardware manufacturer how accessibility is implemented. This has been slowly improving in recent years, but Google’s move to bundle all their accessibility features into the Android Accessibility Suite last year meant a huge leap forward in Android accessibility.
What’s in Android Accessibility Suite?
Accessibility Menu

Use this large on-screen menu to control gestures, hardware buttons, navigation, and more. A similar idea to AssistiveTouch on iOS. If you are a Samsung Android user it is similar to (though, in my opinion, not as good as) the Assistant Menu already built in.
Select to Speak
Select something on your screen or point your camera at an image to hear text spoken. This is a great feature for people with low vision or a literacy difficulty. It will read the text on screen when required without being always on like a screen reader. A similar feature was built into Samsung devices before inexplicably disappearing with the last Android update. The “point your camera at an image to hear text spoken” claim had me intrigued. Optical Character Recognition like that found in Office Lens or SeeingAI, built into the regular camera, could be extremely useful. Unfortunately I have been unable to get this feature to work on my Samsung Galaxy A8. Even when selecting a headline in a newspaper I’m told “no text found at that location”.
Switch Access

Interact with your Android device using one or more switches or a keyboard instead of the touch screen. Switch Access on Android has always been the poor cousin of Switch Control on iOS but is improving all the time.
TalkBack Screen Reader
Get spoken, audible, and vibration feedback as you use your device. Google’s mobile screen reader has been around for a while and, like Switch Access, it is apparently improving, but I’ve yet to meet anybody who actually uses it full time.
So to summarise: as well as adding features that may have been missing on your particular “flavour” of Android, this suite standardises the accessibility experience and makes it more visible. Another exciting aspect of these features being bundled in this way is their availability for media boxes. Android is a hugely popular OS for TV and entertainment, but what is true of mobile device manufacturers is doubly so of Android box manufacturers, where it is still very much the Wild West. If you are in the market for an Android box and accessibility is important to you, make sure it’s running Android version 6 or later so you can install this suite and take advantage of these features.
Over the course of history there have always been single-named women who have influenced our lives and culture: Cleopatra, Maggie, Madonna, and now it’s the turn of Alexa! I have been curious and intrigued by the potential benefits of technological assistants with regard to my disability, so I was very excited when Enable Ireland gave me an opportunity to try out Alexa in the form of the Amazon Echo.
How easy is it to get the Echo up and running?
The initial setup of the Amazon Echo is very simple to carry out. You need to download the Amazon Alexa app to your smartphone (get used to downloading apps on your phone); the app will search for the device and then connect to it through the device’s own Wi-Fi signal; you then connect the device to your home broadband, and hey presto, within a few minutes your Amazon Echo is up and running.
What can Alexa do on its own?
The initial benefits of the Amazon Echo for a person with a disability are very
limited. You can ask Alexa what the weather will be like, what time it is, to
set reminders, and some other quirky less useful questions: “Alexa, tell
me a joke”, “What’s the capital of Finland?”, or more randomly
“Alexa, beatbox for me”.
Through the Alexa app you can enable other skills to assist you in your daily activities. If you are into music you can add your Spotify profile to Alexa; this is very simple to do if you can use a smartphone. Alexa will then play
your playlists through its impressive speakers. This is very handy, even for
someone who is not into music much, as it means I don’t need to listen to music
through my basic phone speakers nor do I have to call someone to change a CD in
my stereo. It is great for podcasts as well, though as Alexa sometimes has
difficulty understanding people you might be better off setting up a playlist
through your Spotify app first if any of your favourite podcasts have quirky
names like my favourite Arsenal podcast Arsecast by Arseblog!
If you have a vision impairment, have difficulty holding a book, or just like audiobooks, you can quickly add your Audible account too, tilt back in your chair and listen to your favourite book or a new release. It can also update you with the latest news, traffic, and weather for your area.
If you have trouble with your memory because of a head injury, or you just have a head like a sieve as I do, the reminders and timers could be very useful. I normally add reminders to my phone as I can’t write them down, but being able to simply call them out is useful, as sometimes I go to add them to my phone and get distracted by Twitter and the like. The timers are useful if you’re cooking and the chicken needs just five minutes more.
What can Alexa do using IoT – the Internet of Things?
For someone with a physical disability this is where it really sparked my interest. I struggle with some aspects of technology and with physically controlling my environment, so I thought I would benefit from Alexa.

Smart WeMo Plug
Firstly I decided to set up the lamp in my sitting room. In order to use Alexa to switch on your light you either need a smart plug, or you need smart bulbs and a Wi-Fi hub. Enable Ireland had also provided me with a WeMo smart plug in this instance. The setup for the WeMo smart plug was very similar to the initial setup of the Amazon Echo: download the app, connect to the device’s own Wi-Fi, and connect the device to your home broadband.
Once you have that done you can control the lamp directly from your smartphone alone if you want; in order to connect it to the Alexa you need to go back to the Alexa app and pair the Alexa with the WeMo smart plug from there.
Overall it is a very simple system and process, and once you have it up and running all you have to do is say “Alexa, turn on the lamp”. This was a complete success, and over the time I had the devices this is the one that proved most simple to use and most consistent. It was lovely, if I was on my own for a little while coming toward evening, to be able to give that simple command and “Let there be light!”
The other devices I had to connect to the Echo were related to the TV. I use an Amazon Fire Stick to play games on my TV and also to watch Netflix. I knew from watching YouTube videos that you could pair your Amazon Echo with your Fire Stick and use Alexa to open Netflix and play your movies and shows.
Unfortunately this was not so easy to carry out. It seemed simple at first: get your Alexa device to scan your Wi-Fi for compatible devices and, when you see the Fire Stick, click connect. Unfortunately this is where I ran into some problems. In order to get the Alexa to carry out these procedures I had to enable its TV skills through the app. I had done something similar to set up my Spotify account so I wasn’t too worried at first. Frustratingly, when I went into the app to enable that TV skill the screen went blank and gave me no option to enable it. After numerous attempts and searches on the internet for a solution, I eventually contacted Amazon’s online support and, having gone through three advisors, found the answer: enable it through my laptop and my Amazon account on the desktop site. Phew!
The result of all that is that I can come into the sitting room in the morning, with the TV turned off, and ask Alexa to open Netflix. If you know the name of the movie or show you want to watch you can ask Alexa to open it directly. You can play, pause, and fast forward or rewind whatever you are watching. This has been very helpful for me, as the remote for my Fire Stick is tiny and the buttons are incredibly difficult to press. If you are a movie buff and have difficulties using small remotes then this solution is probably worth all the hassle it took to set it up in the first place!
In the package from Enable Ireland there was also a Logitech Harmony Hub. At first I had no idea what it was; I had never heard of it before. A bit of Googling revealed that it is a universal remote control. A bit of YouTubing revealed that it could be paired with Alexa to turn on and control a whole host of electronic devices including your TV, stereo system, or Sky box.
This is a complex setup. You set up the Harmony Hub much the same way as you do the other devices. So again that means you need to download another app to connect it to your Wi-Fi (I hope you have enough space on your smartphone!). Once it is set up and ready to go you need to use the Alexa app to enable the Harmony Hub skill so Alexa can communicate with the Harmony Hub. Now use the Harmony app to scan for smart devices that may be on your Wi-Fi already, like a smart TV. If you have something that is not smart, like my Sky box, you simply search in the app for the product and add it to your list of devices. Right, now that you have your devices listed and the Hub and Alexa can talk to one another, what can you tell them to do?
Using the Harmony app you can set up a range of
“activities”. These are relatively easy to set up as you follow a step by step
process through the app. Quite quickly I had it set up so that I could tell
Alexa to turn on the TV, it would turn on the TV and set it to the Sky TV
extension immediately. I also set it up so I could increase and decrease the
volume of the TV and I could change the ordinary terrestrial channels on the TV.
I have seen that you can change channels on your Sky box and set “favourite
channels” to tune to quickly but, frustratingly, while I can do that through
the Harmony app on my phone I haven’t been able to do that using Alexa despite
numerous and persistent attempts. Apparently, it is possible if you set an “activity”
for each individual channel but life is too short!
If you are technically proficient enough and you have a big enough budget there is a whole host of other devices you could use with the Alexa to smarten up your home, whether it is to control your heating or even to unlock your door!
Are there Privacy Issues?
There are some concerns about privacy and the Alexa. Some of
the stories surrounding this issue I’m sure have been exaggerated for headlines
but there is a basis to some of the concern too with Amazon admitting that
staff listen to people’s interactions with Alexa (I think they’ll get a laugh
from some of my frustrated interactions where Alexa was called everything under
the sun while I tried in vain to control the Sky box via Alexa).
I know from my experience with the Alexa that there have been some strange happenings. During conversations in the same room as the Alexa, the blue light that indicates Alexa is listening has come on. On another occasion Alexa piped up with search results that were not asked for in the middle of a conversation. Nothing too sinister I’m sure, but something I’m personally not too comfortable with.
It’s up to you whether you’re willing to trade that sense of personal privacy for the benefits Alexa provides.
I was very excited to try out the Amazon Echo and Alexa. I
felt this was my opportunity to finally make up my mind on whether to purchase
one or not, a decision I had been debating over for some time.
Alexa promises so much to help me with my physical disability. Overall, in this respect, it did live up to expectations. It was frustrating that I couldn’t manage to set it up to operate my Sky box, but I was able to set it up to use most of the functions on my TV, and the Alexa in conjunction with the WeMo plug gave the most satisfying and consistent function of switching my sitting room lamp on and off. If I were to purchase an Echo I would consider investing further in other devices that could do what the WeMo plug did.
The other aspects of the Echo were less beneficial to me as
they didn’t involve improving my access to my physical environment. That does
not take away from the fact that they could be hugely beneficial for someone
with a different disability such as a sensory disability: reminders, timers,
your Spotify, and your Audiobooks through Alexa would simplify so many parts of
a person’s life.
For someone with a high level of disability, or someone who has difficulty using a smartphone, the setup process for the Echo itself may be a little complex. The setup process for some of the “activities” on the Harmony Hub would take the most seasoned of smartphone users to the point where they just give up (i.e. me 🙂).
The initial cost of the Amazon Echo is very affordable.
However, if someone with a disability wishes to use the Echo and Alexa to its
full potential to make their lives more independent then they will need to
spend a lot more. A quick Google search suggested that a Wi-Fi plug similar to the WeMo plug costs around €22, while a Harmony Hub is available for approximately €120. So if you’re hoping to live in a completely smart home it’s going to be difficult if your sole source of income is your Disability Allowance.
All that being said, that decision I have been debating over
for some time, have I made it? Well, in a sense I have. I am fortunate to be
able to use my mobile phone without much difficulty so in the short term I
think I will get a Harmony Hub which will allow me to carry out most of what
Alexa has been doing for me on this trial but through my phone and without the
worry of Amazon employees listening in on me. In the medium to long term I’m sure
I’ll revisit Alexa or even the Google equivalent!
Until now, people with significant physical disabilities could only operate an iPad or iPhone by switch control. With AMAneo BTi it is possible for the first time to operate an iPad or iPhone directly with any mouse or assistive mouse including a trackball, joystick, head mouse or thumb mouse, and even a wheelchair joystick. The AMAneo BTi also has some very useful built-in features such as tremor filter, dwell click and 2 jack plugs for external switches.
Simply connect the AMAneo BTi to your iPad or iPhone via Bluetooth and the pointer will automatically appear on your device’s screen, with no additional app required. This allows the user to navigate around the screen and interact with a mouse to connect with friends, browse the internet, and more.
Tamas and Peter from route4u.org called in last week to tell us about their accessible route-finding service. Based on OpenStreetMap, Route4u allows users to plan routes that are appropriate to their level and method of mobility. Available on iOS, Android and as a web app at route4u.org/maps, Route4u is the best accessible route planning solution I have seen. Where a service like Mobility Mojo gives detailed accessibility information on destinations (businesses, public buildings), Route4u concentrates more on the journey, making them complementary services. When first setting up the app you will be given the option to select either pram, active wheelchair, electric wheelchair, handbike or walking (left screenshot below). You can further configure your settings later in the accessibility menu, selecting curb heights, maximum slopes etc. (right screenshot below).
You are first asked to select your mobility method
Further configure your settings in Accessibility
This is great, but so far nothing really groundbreaking; we have seen services like this before. Forward-thinking cities with deep pockets like London and Ontario have had similar accessibility features built into their public transport route planners for the last decade. That is a lot easier to achieve, however, because you are dealing with a finite number of route options. Where Route4u is breaking new ground is that it facilitates this level of planning throughout an entire city. It does this by using the technology built into smartphones to provide crowdsourced data that constantly updates the maps. If you are using a wheelchair or scooter, the sensors on your smartphone can measure the level of vibration experienced on a journey. This data is sent back to Route4u, who use it to estimate the comfort experienced on that journey, giving other users access to even more information on which to base their route choice. The user doesn’t have to do anything; they are helping to improve the service by simply using it.

Users can also more proactively improve the service by marking obstacles they encounter on their journey. The obstacle can be marked as temporary or permanent. Temporary obstacles like road works, or those ubiquitous sandwich boards that litter our pavements, will remain on the map, helping to inform the accessibility of the route, until another user confirms they have been removed and enters that information.
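As an aside, the vibration-to-comfort idea described above is simple enough to sketch. The snippet below is purely my own illustration of the principle, not Route4u’s actual algorithm; the function name and the smooth/rough thresholds are invented for the example.

```python
import math

def comfort_score(accel_samples, smooth_threshold=0.5, rough_threshold=2.0):
    """Estimate ride comfort for a route segment from vertical
    accelerometer readings (m/s^2, gravity already removed).
    Returns a score from 0.0 (rough) to 1.0 (smooth).
    Hypothetical illustration, not Route4u's real algorithm."""
    if not accel_samples:
        return None
    # Root-mean-square vibration over the segment
    rms = math.sqrt(sum(a * a for a in accel_samples) / len(accel_samples))
    if rms <= smooth_threshold:
        return 1.0
    if rms >= rough_threshold:
        return 0.0
    # Linear interpolation between the two thresholds
    return 1.0 - (rms - smooth_threshold) / (rough_threshold - smooth_threshold)

smooth = comfort_score([0.1, 0.2, 0.15, 0.1])   # gentle vibration: smooth ride
rough = comfort_score([2.5, 3.0, 2.8, 3.1])     # heavy vibration: rough ride
```

Aggregating scores like this from many journeys over the same stretch of pavement is what would let the map reflect real-world surface quality without any explicit user input.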
Example of obstacle added by user
If you connect Route4u to your Facebook account you get access to a points-based reward system. This allows you to compete with friends and have your own league table. In Budapest, where they are already well established, they have linked with sponsors who allow you to cash in points for more tangible rewards like a free breakfast or refreshment. These gamification features should help encourage users less inclined towards altruism to participate, and that is key. Route4u, when established, relies on its users to keep information up to date. This type of service based on crowdsourced data is a proven model, particularly in the route planning sphere. It’s a bit of a catch-22, however, as a service needs to be useful first to attract users. It is early days for Route4u in Dublin, and Tamas and Peter acknowledge that a lot of work needs to be done before promoting the service here. Over the next few months their team will begin mapping Dublin city centre; this way, when they launch, there will be the foundation of an accessible route-finding service which people can use, update and build upon.

While Route4u has obvious benefits for end users with mobility difficulties, there is another beneficiary of the kind of data this service will generate. Tamas and Peter were also keen to point out how this information could be used by local authorities to identify where infrastructure improvements are most needed and where investment will yield the most return. In the long run this will help Dublin and her residents tackle the accessibility problem from both sides, making it a truly smart solution.
We all know what it’s like being in school when the sun is shining outside and all you can think about is being out there! Or when you’re trying to get your homework done and all you can think about is who’s posting what on Snapchat or Instagram? Or have you ever found yourself managing to get a study block done and then taking a well-deserved 5-minute break to take a peek at social media, only to emerge from your phone a half an hour later and way behind on your study schedule? Well, the following free apps are for you! In fact, they’re for anyone who wants to use their time on their computer or smartphone more productively, whether you’re a student or not.
Stay Focused is a free Google Chrome extension that helps you to stay focused on your work by stopping you from looking at time-wasting websites (e.g. Instagram, Snapchat, Facebook, Twitter). You set a certain amount of time in the day that you’re allowed to look at those distracting websites, and once your allotted time for the day has been used up, it blocks you out of them. End of distractions! You can also choose to have a complete block on the websites that are your major culprits for time-wasting.
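The daily-allowance idea described above is easy to model. This is my own toy sketch of the mechanic, not Stay Focused’s actual implementation; the class name and the example sites are invented.

```python
import datetime

class TimeBudget:
    """Toy model of a daily-allowance blocker: listed sites share
    one daily budget of minutes; once it is spent, access is
    refused until the next day. Illustration only."""

    def __init__(self, daily_minutes, blocked_sites):
        self.daily_minutes = daily_minutes
        self.blocked_sites = set(blocked_sites)
        self.used = 0.0
        self.day = datetime.date.today()

    def _roll_over(self):
        # A new calendar day resets the budget
        today = datetime.date.today()
        if today != self.day:
            self.day = today
            self.used = 0.0

    def record_visit(self, site, minutes):
        if site in self.blocked_sites:
            self._roll_over()
            self.used += minutes

    def allowed(self, site):
        """Unlisted sites are always allowed; listed sites are
        allowed only while some budget remains."""
        if site not in self.blocked_sites:
            return True
        self._roll_over()
        return self.used < self.daily_minutes

budget = TimeBudget(10, ["twitter.com", "facebook.com"])
budget.record_visit("twitter.com", 10)   # uses up the whole allowance
```

After that visit, `budget.allowed("twitter.com")` is `False` until the date rolls over, while sites not on the list remain reachable.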
This one works in a similar way to Stay Focused but it’s for the Mozilla Firefox browser instead of Chrome. You can specify up to six sets of sites to block, with different times and days for each set (e.g. you could have Twitter blocked from 9am to 5pm and Facebook blocked for all but 10 minutes in every hour).
This is one of many apps that use the timing principle behind the Pomodoro Technique (i.e. you work for 25 minutes, then take a 5-minute break, and after four of these sessions you take a longer break of 15–30 minutes). This Google Chrome extension helps you to concentrate on your work by blocking a list of websites for the amount of time you’ve set, and once your working period is over it’ll unblock those sites to give you a break from work before you hit those books again!
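For the curious, the Pomodoro cycle described above reduces to a few lines of scheduling logic. This sketch is just an illustration of the technique itself, not the extension’s code; the function name is my own.

```python
def pomodoro_schedule(sessions, work=25, short_break=5, long_break=15):
    """Return a list of (phase, minutes) tuples for the classic
    Pomodoro cycle: a break after every work session, with a
    long break after every fourth one."""
    plan = []
    for i in range(1, sessions + 1):
        plan.append(("work", work))
        if i % 4 == 0:
            plan.append(("long break", long_break))
        else:
            plan.append(("short break", short_break))
    return plan

# Four sessions: three short breaks, then one long break
plan = pomodoro_schedule(4)
```

A blocker app essentially walks through such a plan, enforcing its site blacklist during each "work" phase and lifting it during the breaks.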
Offtime is an app for iOS and Android smartphones that not only lets you block calls, texts and notifications when you’re trying to work, but can also track your phone and app usage so you can identify what distracts you most. You can set different profiles, like School, Family and Me Time, and when you’ve finished your work it gives you an activity log listing everything that happened while you were working, so you don’t have to worry about missing out on anything.
So, with these apps you’ll be able to maximise your study time and even better, you’ll be able to look at all your favourite websites and apps guilt-free on your breaks!
When we use technology we need to be able to position it so that it is easy to use. We need to be able to operate the controls and have it positioned so that we can see it without eyestrain. Sometimes it’s useful to mount a device, as our hands may be tied up doing something else; the device may be too heavy, or we may have a limited ability to reach, grasp, or hold the device.
Some of the most common items we use are mobile phones and tablets. There are various mounting options available. The suitability of a mount depends on various factors such as the fixing clamp, where you intend to mount the device, weight of device to be mounted, the reach and adjustability of the mount etc.
Mounting systems are generally composed of (i) a fixing clamp to mount either to a flat table-top surface or to circular tubing, (ii) an adjustable arm, usually no longer than 500mm, and (iii) some kind of attachment or cradle to hold the device.
Below are two mounting systems which may offer you some good solutions. RAM Mounts are a mainstream supplier of mounts for electronic devices in cars, bikes and trucks. Rehadapt, on the other hand, have a range of mounting products to serve clients with “special needs”. Both systems consist of a fixing clamp, an adjustable arm and a cradle to hold the device.
In choosing a fixing clamp you need to consider the surface you are fixing it to: do you need to remove the clamp often? Will the clamp be secure? Below are two clamps; their site offers more options.
Light 3D wheelchair mount with one tube and one joint with screw. The L3D-WC 2AK is a two pole version of this. You can state the required wheelchair clamp on order. Combine with any cradle with REHAdapt’s Spigot Link System (SLS).
Onto the end of REHAdapt’s Spigot Link System there are various options. Some universal cradles are below.
Adjustable tablet holder for mounting any tablet from 7″ to 13″ to a wheelchair with Rehadapt’s Universal Device Socket (UDS). The tablet mount works with Apple iPad and Samsung Galaxy tablets, with and without protective cases.
The good: It looks good. Lots of component options and excellent for mounting onto a wheelchair.
Today, May 18th, is Global Accessibility Awareness Day and to mark the occasion Apple have produced a series of 7 videos (also available with audio description) highlighting how their products are being used in innovative ways by people with disabilities. All the videos are available in a playlist here and I guarantee you, if you haven’t seen them and you are interested in accessibility and AT, it’ll be the best 15 minutes you have spent today! Okay, the cynical among you will point out this is self-promotion by Apple, a marketing exercise. Certainly on one level it is; they are a company and, like any company, their very existence depends on generating profit for their shareholders. These videos promote more than Apple, however; they promote independence, creativity and inclusion through technology. Viewed in this light, these videos will illustrate to people with disabilities how far technology has moved on in recent years and make them aware of the potential benefits to their own lives. Hopefully the knock-on effect of this increased awareness will be increased demand. Demand these technologies, people, it’s your right!
As far as a favorite video from this series goes, everyone will have their own. In terms of the technology on show, to me Todd “The Quadfather” below was possibly the most interesting.
This video showcases Apple’s HomeKit range of associated products and how they can be integrated with Siri.
My overall favorite video however is Patrick, musician, DJ and cooking enthusiast. Patrick’s video is an ode to independence and creativity. The technologies he illustrates are Logic Pro (digital audio workstation software) with VoiceOver (Apple’s inbuilt screen reader) and the object recognition app TapTapSee, which, although it has been around for several years now, is still an amazing use of technology. It’s Patrick’s personality that makes the video though; this guy is going places. I wouldn’t be surprised if he had his own prime time TV show this time next year.
Last Friday (February 17th) New Scientist published an article about a new app in development at Microsoft called GazeSpeak. Due to be released over the coming months on iOS, GazeSpeak aims to facilitate communication between a person with MND (known as ALS in the US; I will use both terms interchangeably) and another individual, perhaps their partner, carer or friend. Developed by Microsoft intern Xiaoyi Zhang, GazeSpeak differs from traditional approaches in a number of ways. Before getting into the details, however, it’s worth looking at the background. GazeSpeak didn’t come from nowhere; it’s actually one of the products of some heavyweight research into Augmentative and Alternative Communication (AAC) that has been taking place at Microsoft over the last few years. Since 2013, inspired by football legend and ALS sufferer Steve Gleason (read more here), Microsoft researchers and developers have brought the weight of their considerable collective intellect to bear on increasing the ease and efficiency of communication for people with MND.
This is an entirely new approach to increasing the efficiency of AAC and one that, I suggest, could only have come from a large mainstream tech organisation with over thirty years’ experience facilitating communication and collaboration.
Another Microsoft research paper published last year (with some of the same authors as the previous paper), called “Exploring the Design Space of AAC Awareness Displays”, looks at the importance of a communication partner’s “awareness of the subtle, social, and contextual cues that are necessary for people to naturally communicate in person”. Their research focused on creating a display that would allow the person with ALS to express things like humor, frustration, affection etc., emotions difficult to express with text alone. Yes, they proposed the use of emoji, which are a proven and effective way a similar difficulty is overcome in remote or non-face-to-face interactions; however, they went much further and also looked at solutions like avatars, skins and even coloured LED arrays. This, like the other paper above, is an academic paper and as such not an easy read, but the ideas and solutions being proposed by these researchers are practical and will hopefully filter through to end users of future AAC solutions.
That brings us back to GazeSpeak, the first fruit of the Microsoft/Steve Gleason partnership to reach the general public. Like the AACrobat solution outlined above, GazeSpeak gives the communication partner a tool rather than focusing on tech for the person with MND. As the image below illustrates, the communication partner has GazeSpeak installed on their phone and, with the app running, holds their device up to the person with MND as if they were photographing them. They suggest a sticker with four grids of letters is placed on the back of the smartphone facing the speaker. The app then tracks the person’s eyes: up, down, left or right, each direction meaning the letter they are selecting is contained in the grid in that direction (see photo below).
Similar to how the old T9 predictive text worked, GazeSpeak selects the appropriate letter from each group and predicts the word based on the most common English words. So the app is using AI in the form of machine vision to track the eyes and also to make the word prediction. In the New Scientist article they mention that the user will be able to add their own commonly used words and people/place names, which one assumes would prioritise them within the prediction list. In the future perhaps some capacity for learning could be added to further increase efficiency. After using this system for a while the speaker may not even need to see the sticker with the letters; they could write words from muscle memory. At this stage a simple QR code leading to the app download would allow them to communicate with complete strangers using just their eyes and no personal technology.
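To make the T9-style idea concrete, here is a small sketch of grouped-letter word prediction: each eye gesture picks one of four letter groups, and a word list resolves the ambiguous sequence. The four-way grouping and the toy dictionary below are my own inventions for illustration; GazeSpeak’s real grids and vocabulary will differ.

```python
# Hypothetical four-way letter grouping (one group per gaze direction);
# GazeSpeak's real grid layout may differ.
GROUPS = {
    "up": set("abcdefg"),
    "down": set("hijklm"),
    "left": set("nopqrs"),
    "right": set("tuvwxyz"),
}

# Toy dictionary, ordered most common first
WORDS = ["hello", "help", "hat", "tea", "the", "cat"]

def predict(gestures, words=WORDS):
    """Return dictionary words whose letters match the gesture
    sequence group-by-group, keeping the word list's frequency order."""
    return [w for w in words
            if len(w) == len(gestures)
            and all(ch in GROUPS[g] for ch, g in zip(w, gestures))]
```

For example, the gesture sequence right, down, up matches “the” (t is in the right group, h in the down group, e in the up group), which is exactly the kind of disambiguation T9 performed on a phone keypad.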
UPDATE (August 2018): GazeSpeak has been released for iOS and is now called SwipeSpeak. Download here. For more information on how it works or to participate in further development have a look at their GitHub page here.
There is of course some crossover between the different AT highlights of 2016 I have included here. An overall theme running through all of this year’s highlights is the mainstreaming of AT. Apple, Google and Microsoft have all made significant progress in the areas previously mentioned: natural language understanding and smart homes. This has led to easier access to computing devices and, through them, the ability to automate and remotely control devices and services that assist us with daily living tasks around the house. However, these developments are aimed at the mainstream market, with the advantages to AT users being a welcome additional benefit. What I want to look at here are the features these companies are including in their mainstream products specifically aimed at people with disabilities, with the goal of making those products more inclusive. Apple have always been strong in this area and have led the way for the last five years. 2016 saw them continue this fine work with new features such as Dwell within macOS and Touch Accommodations in iOS 10, as well as many other refinements of existing features. Apple have also brought Siri and Switch Control to Apple TV, the latter usable either with a dedicated Bluetooth switch or through a connected iOS device using a method they are calling Platform Switching. Platform Switching, which also arrived this year with iOS 10, “allows you to use a single device to operate any other devices you have synced with your iCloud account. So you can control your Mac directly from your iPhone or iPad, without having to set up your switches on each new device” (the devices need to be on the same WiFi network). The video below from Apple really encapsulates how far they have come in this area and how important this approach is.
Not to be outdone, Microsoft bookended 2016 with some great features in the area of literacy support, an area they had perhaps neglected for a while. They more than made up for this last January with the announcement of Learning Tools for OneNote. I’m not going to go into the details of what Learning Tools offers as I have covered it in a previous post. All I’ll say is that it is free, it works with OneNote (also free, and a great note-taking and organisation support in its own right) and is potentially all many students would need by way of literacy support (obviously some students may need additional supports). Then in the fourth quarter of the year they updated their OCR app Office Lens for iOS to provide the Immersive Reader (text to speech) directly within the app.
Finally, Google, who would probably have the weakest record of the big three in terms of providing built-in accessibility features (to be fair, they have always followed a different approach which has proved equally effective), really hit a home run with Voice Access, which was made available for beta testing this year. Again, I have discussed this in a previous post here, where you can read about it in more detail. Having tested it, I can confirm that it gives complete voice access to all of an Android device’s features, as well as to any third party apps I tested. Using a combination of direct voice commands (Open Gmail, Swipe left, Go Home etc.) and a system of numbering buttons and links, even obscure apps can be operated. The idea of using numbers for navigation, while not new, is extremely appropriate in this case: numbers are easily recognised regardless of voice quality or regional accent. Providing alternative access and supports to mainstream operating systems is the cornerstone of recent advances in AT. As the previous video from Apple showed, access to smartphones or computers gives access to a vast range of services and activities. For example, built-in accessibility features like Apple’s Switch Control or Google’s Voice Access open up a range of mainstream smart home and security devices and services to people with alternative access needs, where before they would have had to spend a lot more on a specialist solution that would probably have been inferior.
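The numbered-overlay idea is simple enough to sketch. The snippet below is a rough illustration of the concept only, not Voice Access itself: the element names and commands are invented, and the real app reads the Android accessibility tree rather than a hard-coded list.

```python
def build_overlay(elements):
    """Assign a spoken number label to each actionable on-screen element."""
    return {str(i + 1): name for i, name in enumerate(elements)}

def handle_command(command, overlay):
    """Resolve a spoken command to an action description."""
    if command in overlay:                      # a bare number taps that element
        return f"tap {overlay[command]}"
    if command.lower().startswith("open "):     # direct command, e.g. "Open Gmail"
        return f"launch {command[5:]}"
    return "unrecognised"

screen = ["Compose", "Search", "Menu"]
overlay = build_overlay(screen)
print(handle_command("2", overlay))             # saying "two" taps Search
print(handle_command("Open Gmail", overlay))    # direct commands still work
```

Because the labels are short digits, recognition stays reliable even when the user’s speech is quiet, dysarthric or heavily accented, which is exactly why the numbering scheme suits this use case.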
As we approach the end of 2016 it’s an appropriate time to look back and take stock of the year from an AT perspective. A lot happened in 2016, not all of it good. Socially, humanity seems to have regressed over the past year. Maybe this short-term, inward-looking protectionist sentiment had been brewing for longer, but 2016 brought the opportunity to express it politically; you know the rest. While society steps back and looks inward, technology continues to leap and bound forward, and 2016 has seen massive progress in many areas, particularly those associated with Artificial Intelligence (AI) and Smart Homes. This is the first in a series of posts examining some technology trends of 2016 and how they affect the field of Assistive Technology. The links will become active as the posts are added. If I’m missing something please add it to the comments section.
So although 2016 is unlikely to be looked on kindly by future historians (you know why), it has been a great year for Assistive Technology, perhaps one of promise rather than realisation. One major technology trend of 2016 missing from this series of posts is Virtual (and Augmented) Reality. While VR was everywhere this year, with products coming from Sony, Samsung, Oculus and Microsoft, its usefulness beyond gaming is only beginning to be explored (particularly within education).
So what are the goals for next year? Harnessing some of these innovations in a way that makes them accessible and usable by people with disabilities at an affordable price. If in 2017 we can start putting some of this tech into the hands of those who stand to benefit most from its use, then next year will be even better.