Mobile Device Accessibility: iOS and the Android Accessibility Suite

One aspect of modern technological life that might help us keep some faith in humanity is the range of comprehensive assistive technologies built into, or free to download for, mobile computing devices. Accessibility features, as they are loosely called, are a range of tools designed to support non-standard users of the technology. If you can’t see the screen very well you can magnify text and icons (1) or use high contrast (2). If you can’t see the screen at all you can have the content read back to you using a screen reader (3). There are options to support touch input (4, 5) and options to use devices hands-free (6). Finally, there are also some supports for deaf and hard of hearing (HoH) people, like the ability to switch to mono audio, or visual or haptic alternatives to audio-based information.

With their mobile operating system iOS, Apple do accessibility REALLY well, and this is reflected in the numbers. In the 2018 WebAIM Survey of Users with Low Vision there were over three times as many iOS users as Android users. That is almost the exact reverse of the general population, where Android leads by roughly three to one. For those with motor difficulties the gap was smaller, but iOS was still favoured.

So what are Apple doing right? Well obviously, first and foremost, the credit has to go to their developers and designers for producing such innovative and well implemented tools. But Google and other Android developers are also producing some great AT, often highlighting noticeable gaps in iOS accessibility. Voice Access, EVA Facial Mouse and basic pointing device support are some examples, although these are gaps that will soon be filled if reports of features coming in iOS 13 are to be believed.

Rather than being just about the tools, it is as much, if not more, about awareness of those tools: where to find them and how they work. On every Apple mobile device you go to Settings > General > Accessibility and you will find Vision (1, 2, 3), Interaction (4, 5, 6) and Hearing settings. I’m deliberately not naming these settings here so that you can play a little game with yourself and see if you know what they are. I suspect most readers of this blog will get 6 from 6, which should help make my point. You can check your answers at the bottom of the post 🙂

This was always the problem with Android devices. Where Apple iOS accessibility is like a tool belt, Android accessibility is like a big bag: there is probably more in there, but you have to find it first. This isn’t Google’s fault; they make great accessibility features. It’s more a result of the open nature of Android. Apple make their own hardware and iOS is designed specifically for that hardware, so it’s much more locked down. Android is an open operating system, and as such how accessibility is implemented depends on the hardware manufacturer. This has been slowly improving in recent years, but Google’s move to bundle all their accessibility features into the Android Accessibility Suite last year was a huge leap forward for Android accessibility.

What’s in Android Accessibility Suite?

Accessibility Menu

Image: the Android Accessibility Suite Accessibility Menu, an on-screen menu with large colourful buttons for features like power, lock screen and volume.
The accessibility button highlighted in the bottom corner launches whichever Accessibility Suite tool you have active. If you have more than one active, a long press will allow you to switch between tools.

Use this large on-screen menu to control gestures, hardware buttons, navigation, and more. It’s a similar idea to Assistive Touch on iOS. If you are a Samsung Android user, it is similar to (but in my opinion not as good as) the Assistant Menu already built in.

Select to Speak

Image: the Select to Speak tool active on a webpage, showing a large red button to stop speech, an arrow at the left to extend the menu, and a pause button.

Select something on your screen or point your camera at an image to hear text spoken. This is a great feature for people with low vision or a literacy difficulty: it will read the text on screen when required, without being always on like a screen reader. A similar feature was built into Samsung devices before inexplicably disappearing with the last Android update. The “point your camera at an image to hear text spoken” claim had me intrigued. Optical character recognition like that found in Office Lens or Seeing AI, built into the regular camera, could be extremely useful. Unfortunately I have been unable to get this feature to work on my Samsung Galaxy A8. Even when selecting a headline in a newspaper I’m told there is “no text found at that location”.

Switch Access

Image: a cartoon hand activating a Blue2 switch, beside an Android phone home screen with the message icon highlighted.

Interact with your Android device using one or more switches or a keyboard instead of the touch screen. Switch Access on Android has always been the poor cousin to Switch Control on iOS but is improving all the time.

TalkBack Screen Reader

Get spoken, audible, and vibration feedback as you use your device. Google’s mobile screen reader has been around for a while and, like Switch Access, it is apparently improving, although I’ve yet to meet anybody who actually uses it full time.

So to summarise, as well as adding features that may have been missing on your particular “flavour” of Android, this suite standardises the accessibility experience and makes it more visible. Another exciting aspect of these features being bundled in this way is their availability for media boxes. Android is a hugely popular OS for TV and entertainment, but what is true of mobile device manufacturers is doubly so of Android box manufacturers, where it is still very much the Wild West. If you are in the market for an Android box and accessibility is important, make sure it’s running Android version 6 or later so you can install this suite and take advantage of these features.

Could you name the Apple iOS features?

  1. Zoom
  2. Display Accommodations or Increase Contrast   
  3. VoiceOver
  4. Assistive Touch
  5. Touch Accommodations
  6. Switch Control

Pen Reader for students with dyslexia

A few weeks ago, we attended a demonstration of the C-Pen Reader, a tool designed with students with dyslexia in mind. This device, with its pen shape and an OLED screen with text-to-speech output, assists those with literacy difficulties to read.

It is a simple-to-use device: the reader runs the tip of the pen over the word or words that they wish to hear spoken aloud. Using realistic speech synthesiser software, the student can hear the text read aloud through the inbuilt speaker or through earphones.

There is also the option, when a single word is scanned, to hear an Oxford English Dictionary definition of the word, and to have the word magnified on screen, which is useful for those who may have visual impairments.

Text can also be scanned and saved to the pen’s internal storage and transferred to a PC or Mac for later use, a handy option for those who may not have access to a scanner.

A separate version of the pen, the ExamReader, is available with limited functionality (no internal storage or dictionary features) so that it meets the criteria for State Exams, but the standard pen can also be turned into an ExamReader by choosing a locked mode.

The C-Pen also works in French and Spanish, while the ExamReader can additionally read German and Italian. Voice notes can also be recorded on the C-Pen.

While there are many options available, both hardware and app based, for those with literacy difficulties or those using English as a second language who require text-to-speech functionality, the C-Pen has its niche in the education sector, where mobile phone apps or bulky hardware may make text to speech difficult or impossible to use. The C-Pen is a user-friendly option that is easily transportable and can be personalised.

More information can be found at www.readerpen.com, and schools and colleges can arrange a free trial for their students. The C-Pen costs €225 ex VAT.

New Windows 10 accessibility updates

Microsoft has been making huge strides in accessibility with each successive update to Windows, investing heavily in improving the user experience for people with disabilities. The improvements to its Ease of Access features include eye tracking, Narrator, low vision features, and reading and writing supports.

 

Eye Control

Eye Control delivers exciting new updates and tools. For users who can’t use a mouse or keyboard to control their computer, Eye Control presents a convenient entry point to a Windows computer using eye-tracking technology. Having access to a computer via Eye Control gives individuals a way to communicate, the ability to stay in the workforce, and so much more!

What began as a hack project during a One Week Hackathon has become a product for the Windows team. Microsoft has introduced Eye Control, which empowers people with disabilities to use a compatible eye tracker, such as a Tobii Eye Tracker, to operate an on-screen mouse, keyboard, and text-to-speech in Windows 10 using only their eyes.

Image: a demo of shape writing with Eye Control. It works like swiping on a touch keyboard: dwell on the first letter of a word, glance at the subsequent letters, dwell on the last letter, and the word is entered.

 

Microsoft Learning Tools

Image: the new Learning Tools capabilities within Microsoft Edge.

Microsoft Learning Tools are a set of features designed to make it easier for people with learning differences like dyslexia to read. In this update, a user can now simultaneously highlight and listen to text in web pages and PDF documents to aid reading and increase focus.

Now, with the addition of the Immersive Reader functionality of Learning Tools, you can photograph a document, export it to Immersive Reader and immediately use the tools to support your understanding of the text.

https://youtu.be/L1vq4Ma0lt4

 

Narrator

Narrator will include the ability to use artificial intelligence to generate descriptions for images that lack alternative text; for websites or apps that don’t have alt text built in, this feature will provide a description of the image. Narrator will now also include the ability to send commands from a keyboard, touch or Braille display and get feedback about what the command does without invoking the command. There will also be some Braille improvements: Narrator users can type and read using different Braille translations, and can now perform Braille input for application shortcuts and modifier keys.

https://support.microsoft.com/en-ie/help/22798

Desktop Magnifier

Desktop Magnifier is also getting an option to smooth fonts and images, along with mouse wheel scrolling to zoom in and out. It is now possible to use Magnifier with Narrator, so you can zoom in on text and have it read aloud.

https://support.microsoft.com/en-ie/help/11542/windows-use-magnifier

 

Dictation on the Desktop

Windows Speech Recognition already allowed people to speak into their microphone and have their speech converted into text on the screen. In the Windows 10 update, a person can now use dictation to convert spoken words into text anywhere on their PC.

To start dictating, select a text field and press the Windows logo key + H to open the dictation toolbar. Then say whatever’s on your mind.

As well as dictating text, you can also use voice commands to do basic editing or to input punctuation. (English only)

 

Colour filters

If it’s hard to see what’s on the screen, you can apply a color filter. Color filters change the color palette on the screen and can help you distinguish between things that differ only by color.

To change your color filter, select Start > Settings > Ease of Access > Color & high contrast. Under Choose a filter, select a color filter from the menu. Try each filter to see which one suits you best.

 

Read the full Microsoft blog on the accessibility updates in the Windows 10 Fall Creators Update.

Fair play to Microsoft for investing so heavily in developing their Ease of Access features.

Global Accessibility Awareness Day – Apple Accessibility – Designed for everyone Videos

Today, May 18th, is Global Accessibility Awareness Day and, to mark the occasion, Apple have produced a series of seven videos (also available with audio description) highlighting how their products are being used in innovative ways by people with disabilities. All the videos are available in a playlist here and I guarantee you, if you haven’t seen them and you are interested in accessibility and AT, it’ll be the best 15 minutes you have spent today! Okay, the cynical among you will point out that this is self-promotion by Apple, a marketing exercise. Certainly on one level of course it is; they are a company and, like any company, their very existence depends on generating profit for their shareholders. These videos promote more than Apple, however: they promote independence, creativity and inclusion through technology. Viewed in this light, these videos will illustrate to people with disabilities how far technology has moved on in recent years and make them aware of the potential benefits to their own lives. Hopefully the knock-on effect of this increased awareness will be increased demand. Demand these technologies people, it’s your right!

As far as a favourite video from this series goes, everyone will have their own. In terms of the technology on show, to me Todd “The Quadfather” below was possibly the most interesting.

This video showcases Apple’s HomeKit range of associated products and how they can be integrated with Siri.

My overall favourite video, however, is Patrick: musician, DJ and cooking enthusiast. Patrick’s video is an ode to independence and creativity. The technologies he illustrates are Logic Pro (digital audio workstation software) with VoiceOver (Apple’s inbuilt screen reader) and the object recogniser app TapTapSee, which, although it has been around for several years now, is still an amazing use of technology. It’s Patrick’s personality that makes the video though: this guy is going places, and I wouldn’t be surprised if he had his own prime time TV show this time next year.

AT Surveys in the UK and Ireland: What do they tell us? And what do we still need to learn?


In May 2016 the accessibility team responsible for the GOV.UK domain posted a survey looking for information about the types of Assistive Technology (AT) people visiting the site were using. GOV.UK is the central online hub in the UK for all government services and information and, as such, it takes accessibility very seriously. The survey, which was open for six weeks, was answered by over 700 people and produced some interesting results. You can read a post on their blog with all the results here.

Around the same time here in Ireland, Enable Ireland and the Disability Federation of Ireland conducted their own online AT user survey which also had some interesting findings. You can read more about that here.

Online Accessibility Vs Personal AT Use

Before comparing the results we must first highlight that we are not comparing like with like here. The GOV.UK survey was at heart about accessibility: the information we learn about AT is in the context of accessing a website, whereas the Irish survey was seeking to find out about the range of AT which people use and their experience in securing it through public or private funding. This will obviously skew the GOV.UK sample towards software solutions (they don’t appear to have asked about hardware like alternative input devices) that support computer access, literacy and visual impairment. Taking this into account, what does stand out is the number of “premium” AT solutions identified as being used in the UK. 29% of those responding to this survey used a screen reader, and of these just under 40% identified JAWS as their solution of choice. VoiceOver was next (and by far the most popular choice for mobile users), followed by just 12% using the free, open source NVDA. (See graph below.)

Image: results of the screen reader question, with JAWS at the top at almost 40%, followed by VoiceOver, then NVDA. Others further down include SuperNova, Window-Eyes and TalkBack.

About 30% of respondents use magnification software, and almost 70% identified high-end proprietary solutions like ZoomText (54%), SuperNova (11%) and MAGic (4%). Within magnification, the lack of a credible open source alternative could help to explain the result. There is a similar situation within literacy support software, with Read&Write from Texthelp accounting for almost 70% of the total. Finally, it’s no surprise that various iterations of Dragon NaturallySpeaking from Nuance accounted for almost 90% of speech recognition software.

Do these results tell us that there is a thriving market for high-end proprietary AT software in the UK? Maybe not: it’s far more likely that the people responding to this survey were professionals working in a corporate, enterprise-type environment, which might favour proprietary over open source or inbuilt solutions.

Cost of AT

In terms of the cost of AT, the surprise result of the Irish survey was that 62% of AT solutions cost less than €1,000, and the GOV.UK results seem to be similar. JAWS and possibly some versions of ZoomText would cost in excess of €1,000; however, all other solutions identified here would come in below that.

It’s interesting to see both the UK and Ireland attempting to gather current data on AT use. What these surveys highlight most of all is the need for more comprehensive data gathering to enable us to plan for the future, across the life span: from early childhood to old age. Assistive technology is a tool for all, but still far too few people who could benefit from it are aware of it or know how to apply for funding for it.

Achieving an inclusive digital society


The European Parliament has approved the directive on making the websites and mobile apps of public sector bodies more accessible. This means that people with disabilities – especially persons with vision or hearing impairments – will have better access to the websites and mobile applications of public services.
The updated version of the directive was adopted by the Council in July 2016. The directive will soon enter into force, and Member States will have 21 months to transpose it into national legislation.
The rules encoded in the directive reflect the Commission’s ongoing work to build a social and inclusive European Union, where all Europeans can take full part in the digital economy and society.
The text of the Directive covers the websites and mobile apps of public sector bodies, with a limited number of exceptions (e.g. broadcasters, livestreaming). This is a crucial milestone in achieving an inclusive digital society in which people with disabilities and other users have access to online services and information on an equal footing with other people.
Member States shall ensure that public sector bodies take the necessary measures to make their websites and mobile applications more accessible by making them perceivable, operable, understandable and robust.

  • For example, for someone who is blind, this will mean that public sector websites and mobile applications will have text alternatives for non-text content, i.e. short equivalents for images, including icons, buttons and graphics, and descriptions of data represented in charts, diagrams and illustrations.
  • Or, for someone with dexterity issues, all functionality that is available with a mouse must also be available from the keyboard. This will help people using alternative keyboards and people using voice recognition. Content will also have to be well organised, which will help users to orient themselves and to navigate effectively. (A short code sketch illustrating both points follows this list.)
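
To make those two bullet points a little more concrete, here is a minimal, hypothetical sketch in TypeScript (assuming a browser DOM environment) of the kind of thing they ask of a developer: an informative image is given a text alternative through its alt attribute, and a custom control that reacts to mouse clicks is also made focusable and operable from the keyboard. The element, image path and handler names are invented for illustration and are not taken from the directive or any real public sector site.

```typescript
// A minimal sketch only: assumes a browser DOM environment (TypeScript compiled
// with the "dom" library). All names and paths below are hypothetical.

// 1. Text alternative for non-text content: an informative image is given an
//    alt attribute so that screen readers (NVDA, JAWS, VoiceOver, etc.) can
//    announce an equivalent description.
const chart = document.createElement("img");
chart.src = "/images/example-chart.png"; // hypothetical image path
chart.alt = "Bar chart showing the number of applications received each month";
document.body.appendChild(chart);

// 2. Keyboard operability: a custom control that reacts to mouse clicks is also
//    made reachable with the Tab key and operable with Enter or the space bar,
//    which helps keyboard, switch and voice recognition users.
const applyButton = document.createElement("div");
applyButton.textContent = "Apply online";
applyButton.setAttribute("role", "button"); // expose it to assistive technology as a button
applyButton.tabIndex = 0;                   // make it reachable from the keyboard

function activate(): void {
  // Hypothetical action: a real service might open an application form here.
  console.log("Application form opened");
}

applyButton.addEventListener("click", activate);
applyButton.addEventListener("keydown", (event: KeyboardEvent) => {
  if (event.key === "Enter" || event.key === " ") {
    event.preventDefault(); // stop the page scrolling when space is pressed
    activate();
  }
});
document.body.appendChild(applyButton);
```

In practice a native button element gives keyboard activation for free; the sketch uses a plain div only to show what has to be added when a custom control is built from scratch.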

The Old with the New

Finger on Braille print
My name is Christina, I’m twenty-five, and I’ve been blind since birth. Being born three months early can mess with a person’s retinas.
To say that technology is important to me would be a massive understatement – I honestly wouldn’t have been able to manage in mainstream education without it.

However, my favourite and most useful technological advance isn’t new – it’s actually over 200 years old. It’s Braille.

In case you’re wondering, Braille is a system of reading and writing used by many blind people the world over. It’s made up of various combinations of a six-dot cell (think of the number six on a dice).

For me, Braille is my ink. Braille, despite its age, has been built into new technology just like many other adaptations; for example, I’ve gone from using a Perkins Brailler, which is basically a typewriter with only six keys, to a Braille display, which converts the information on a computer screen into Braille (I’m not an engineer, so I don’t understand how that’s possible). You can even turn on a setting on an iPhone which allows you to type in Braille – that’s pretty good for a system that’s been around since 1809.

I’ve used Braille for everything since I was five – library books came through the door in big bags, like pizza delivery bags; they even had children’s magazines, which became teenage magazines. It didn’t matter that the title wasn’t exactly the same – the content was what mattered.

All through college, especially because I studied languages, Braille helped me hugely to learn spelling and grammar. If I want to remember something, I find the physical act of putting pen to paper, so to speak, helps me to memorise.

So to sum up, Braille is more important to me than all modern technology – because for me it’s part of every piece of modern technology.

Free and open source software

 


Here is a nice guide to free and open source software (FOSS) put together by Jisc in the UK.

Many FOSS tools can benefit learners and those with (or without!) a disability. There are thousands of tools available.

In the guide the tools have been grouped by type so that they can be matched to specific purposes or needs: for example, audio tools to enable you to record and/or listen to material, or display enhancement tools if you need help with displaying or working with text and graphics.

Before downloading any free and open source software we recommend keeping your computer secure using antivirus software.

JISC blog on FOSS

For other useful resources from Jisc, see their blog page.

Screen reader for the Irish Language

Abair.ie logo

For many years, screen reading solutions (both for synthetic speech and refreshable Braille) have been available in English and in other major world languages. These solutions have allowed blind people to operate a PC, and thus benefit from all facilities that can be accessed on a PC, at home, in education and in the workplace.

Such solutions have not been available in Irish until now!

The ABAIR team are now in the testing phase of a major new development in accessibility to the Irish language for people with a visual impairment.

The NVDA screen reader has been enabled for Irish. NVDA, a high-quality, free and open source screen reader (www.nvaccess.org), now supports Irish in a number of ways, and this Irish solution is in its final testing phase. Very importantly, the entire software solution is available free of charge.

NVDA now supports the Irish language in the following ways:

  1. The synthetic speech sounds like a native Irish speaker.
  2. A PC user with a Braille Display unit can display and read Irish text in Braille.
  3. All NVDA’s menus, dialog boxes and messages can be displayed in Irish.

***Testers Wanted!***

ABAIR is actively looking for anyone who would like to test this system. Feedback from users is essential to continue to develop and improve all aspects of this Irish solution. Please contact ABAIR at abair@tcd.ie if you would like to try out the system or if interested in getting involved in this research.

ABAIR is an initiative in Trinity College, Dublin, funded by An Roinn Ealaíon, Oidhreachta agus Gaeltachta, and by the NCBI-ABAIR project.

Roadshow for Assistive Technology and Low Vision Aids

Low vision aid

NCBI will be hosting its annual technology roadshow from the 22nd to the 24th of October in three locations around Ireland. Suppliers of assistive technology and low vision aids, as well as low-tech aids for independent living, will exhibit products that can help people with vision impairments in the home, school and work environments. It’s a good opportunity to see what is available.

Wednesday 22nd October – Wexford

Venue: Talbot Hotel (Slaney Suite), 11.00 AM – 3.00 PM

Thursday 23rd October – Athlone

Venue: Shamrock Lodge Hotel, 11.00 AM – 3.00 PM

Friday 24th October – Dublin

Venue: NCBI – Drumcondra (Training Centre), 10.00 AM – 2.00 PM

Exhibitors:

Ash Technologies: http://www.ashlowvision.com

Enhanced Vision: http://www.enhancedvision.co.uk

Rehan Electronics: http://www.rehanelectronics.com

Sight and Sound: http://www.sightandsound.co.uk

VisionAid Technologies: http://www.visionaid.co.uk

Ultracane: http://www.ultracane.com