Creative use of technology during the Covid-19 pandemic

Featured

Posted on March 25th 2020 by Siobhan Long

Using technology to support people with disabilities, their families and those who support them during the Covid-19 pandemic

Some initial suggestions

Note: This is an evolving ideas post which we encourage you to contribute to: together we can be creative in how we use technology to support people with disabilities who may be feeling isolated and worried, and we can also consider innovative ways of remote working to benefit all.

This is already a very worrying time for people with disabilities, who are constantly reminded that they are in a high-risk group when it comes to Covid-19. With schools and services shut down, how can we use technology to facilitate communication, prevent people feeling isolated and maybe provide some kind of distraction?

Disclaimer: By means of this website, Enable Ireland provides information concerning accessing and using technology. Every reasonable effort has been made to ensure that the information provided by this website is reasonably comprehensive, accurate and clear. However, the information provided on or via this website may not necessarily be completely comprehensive or accurate, and, for this reason, it is provided on an “AS IS” and “AS AVAILABLE” basis. Each individual or organisation accessing and relying on the information shared should carry out their own review of the suggestions we make from a legal and regulatory point of view. If you think you may have noticed any error or omission, please let us know by contacting Siobhan Long at: slong@enableireland.ie. It is our policy to correct errors or omissions as soon as any error or omission has been established to our satisfaction.

WhatsApp or Viber Groups

This is something most of us use and find very useful. Disability services could set up a group/groups and use them as a way to keep communication open while people are at home.

WhatsApp is very accessible as it allows people to contribute to a group chat using recorded video, audio or text. It’s a good way to share jokes and funny stories and keep morale up. It supports individual calls and group video and audio calls of up to 4 participants.

Advantages

  • Accessible (to many)
  • Familiar

Disadvantages

  • Needs a smartphone, computer or tablet.
  • Only supports real-time audio or video calls with up to 4 participants
  • Your privacy is not guaranteed using these forums

Echo Dot or Echo Show

For some people, speech is the easiest way to access technology. The Amazon Alexa-powered devices can be a very intuitive way of getting information and entertainment (music, radio, audiobooks, adventure games). They also support a feature called “Drop-in”. When setting up a device you can add friends or contacts who also have Echo devices and allow them to “Drop-in”. This could provide a good means of keeping contact with people who may not be comfortable enough with technology to use a smartphone or WhatsApp. It works basically like an intercom: the person being dropped in on does not have to do anything other than answer; no buttons to press or commands are needed. It’s like talking to them as if they were in the room with you. The Echo Show (only £50 on Amazon at the moment) also has a screen and camera. We are not sure if you can Drop-in with video or if you need to use a video calling service. (Maybe someone reading this already knows the answer?)

Advantages

  • Very easy to use natural speech interface.
  • Lots of entertainment options
  • Can open communication channels in a natural way with minimal user input

Disadvantages

  • GDPR/Privacy/Consent considerations are an issue as you may not receive the privacy you expect

Video Conferencing

MICROSOFT TEAMS

Microsoft Teams is a hub for teamwork in Office 365. It is currently free to download and use during the Covid-19 pandemic. It is likely to be most useful to staff, initially at least, as there is a degree of learning and familiarisation involved. Here’s an introductory video illustrating how Teams works.

SKYPE

Skype should be familiar to most people, being the original voice and video calling service. Perhaps not as popular as it once was, it is still used by many. Once someone is set up and signed in, it should be easy enough to navigate. Skype is keyboard accessible, which allows us to use alternative input methods or create a simplified interface using software like the Grid 3. Unfortunately, Skype no longer supports games like checkers and chess, but it is still a good option, especially if people are already using it.

ZOOM

Currently free, the video conferencing tool Zoom is a great way of bringing larger groups together via video. It supports all the main platforms (Windows, iOS, Android and macOS). It’s quite an easy app to use, and is free to install and use for meetings of up to 40 minutes. This could be used to bring everyone together at a certain time every day, and would probably be the best way of simulating the atmosphere people are familiar with from the services they normally attend. When hosting a meeting, you can select ‘share screen only’ to ensure that there is no potential for making any changes to attendees’ own devices. Without selecting this feature, it would be possible to remotely access devices, and this is something that would require written/recorded consent.

Note: Corporate IT departments may have concerns about this solution, as they may not have any prior agreement with the provider. So, for service providers, it is best to check with their IT and Data Protection Officer before considering it.

Advantages

  • Free and relatively easy to use
  • Supports large group video calls
  • Great casting tool

Disadvantages

  • GDPR concerns given your privacy is not guaranteed
  • Requires a computer or mobile device
  • Will be new and unfamiliar to most (all)

FaceBook Groups

Enable Ireland’s Communications Department has created guidelines for designated staff authorised to start Facebook Groups for the purpose of communicating with clients and their families. These guidelines offer some dos and don’ts in regard to moderating these groups, and suggest the appropriate privacy settings that need to be applied. You can request a copy of this document from our Communications Department.

communication@enableireland.ie

Set up an Internet Radio Station

There are services that allow you to create an online radio station (for example https://radio.co/). This would be a great way of keeping people in touch with news and entertainment custom-made for a specific audience. Rotate DJs between services, have chats, play music, share the news. Bit of a mad idea, but it could be fun for everyone. If a live radio channel is a bit of a stretch, we could maybe produce a daily podcast. Get people to record introductions to songs on their phones and send us the audio. Record thoughts, news and jokes, and we can try to put it all together and send out a link for everyone to listen. Video could also be used, shared as private links on YouTube.

Advantages

  • Accessible to (almost) all as listeners
  • Offers opportunity to be a producer as well as consumer of news/entertainment
  • All content curated by service users

Disadvantages

  • Totally new to us, not sure of the requirements for setting it up but happy to hear from others more familiar, and happy to try it out.

Watch Together

YouTube is very popular, and Watch2Gether supports synchronised watching of YouTube videos with real-time chat.

https://www.watch2gether.com/?lang=en

Online Games

There are lots of games available online that allow you to invite friends to play remotely. Why not curate and manage a range? This is well suited to Draughts, Battleship, Ludo, Scrabble and Chess, although younger players might be more interested in Fortnite.

Advantages

  • Many of these games will be familiar to people already
  • Great distraction; Start a league!

Disadvantages

  • Many of the sites that offer these games are funded by advertising and can be difficult to navigate (auto-playing videos, links to products, flashing ads designed to trick people into clicking on them). This is not an insurmountable problem, but it would be a good bit of work identifying appropriate platforms. iOS might be better.

Virtual photo walks

This is a lovely idea we came across. The original uses Google Hangouts but any video conferencing app would work.

Books – reading, looking and listening audio books

Story Weaver https://storyweaver.org.in/ is an open platform for the creation and distribution of books aimed at children under 16. Although a lot of the content has been created by and for other cultures and languages, with almost 20,000 books currently in the catalogue there should be plenty of interest there. The real potential with this site, however, is creating your own richly illustrated books with their easy-to-use web app.

Audio books are hugely popular; they are accessible and can be consumed while completing other activities. Audible (free for 30 days and linked to Echo/Amazon/Kindle) is the big name with the largest catalogue.

Bookshare Ireland is available for people with visual or print disabilities. You can also download Audio Books or eBooks from your local library https://www.librariesireland.ie/elibrary/eaudiobooks.

Do you have a nice voice? Or rather, has anybody else ever told you that you have a nice voice? If so, and you have a good quality microphone, why not volunteer for https://librivox.org/. The LibriVox project has been creating high-quality audiobooks from public domain literature for a number of years. There is a huge selection to download and listen to, as well as instructions on how to begin creating your own.


Webinars

AbilityNet

 https://abilitynet.org.uk/free-resources/webinars  have a webinar in conjunction with the UK Stroke Association next week. They are also planning weekly webinars (Tuesdays and Wednesdays) over the next month. https://abilitynet.org.uk/news-blogs/abilitynet-live-free-events-about-technology-and-disability

AHEAD

 https://www.ahead.ie/conference2020  have moved their conference online, with a series of webinars over the next 10 weeks, starting this afternoon and tomorrow. They also have an archive of past webinars https://www.ahead.ie/Digital-Accessibility-Webinar-Series

AbleNet

 https://www.ablenetinc.com/resources/live_webinars have some webinars scheduled over the coming weeks, but also have access to a large bank of recorded webinars at https://www.youtube.com/channel/UCnqbFTy0VIQ6fVxXY2HiOJw/videos

Perkins Learning

has some prerecorded webinars https://www.perkinselearning.org/videos/webinar/assistive-technology

Call Scotland

also have scheduled and archived webinars available https://www.callscotland.org.uk/professional-learning/webinars/

Pacer

have cancelled a lot of their webinars for April/May https://www.pacer.org/workshops/ but they have an extensive list of archived webinars – https://www.pacer.org/webinars/archive-listing.asp

Shane Hastings Giveback Directory of free products / services available during COVID-19

Education (26)

Business Resources (9)

Health & Wellbeing (17)

Sports (7)

Entertainment (6)

Music (8)

Technology (7)

As mentioned, this is just for starters: if we all think creatively we can harness technology in many ways to support service users and staff through this difficult time. Please contact us with your suggestions and we’ll add them to this document. Thanks!

And check out Enable Ireland’s National Assistive Technology Training Service’s free online content on Assistive Technology for Creative Expression: on enableirelandat.com

Stay safe and well, and please share/respond/add your own suggestions/ideas. We’re all better together :) Or as we say in Ireland, Ní neart go cur le chéile.

Siobhan, Karl, Juliann, Sean and Shirley: The Enable Ireland AT Team

Eyegaze for Musical Expression

Background – What is eyegaze?

Eyegaze is an alternative way of accessing a computer, using eye movements to control the mouse. It is achieved through a combination of hardware and software. The hardware is a USB peripheral called an eye tracker, which is positioned underneath the computer monitor and contains a camera and infrared lights. The user sits between 500 and 1000 mm from the monitor (600 mm is usually about right), where the camera has a clear view of their eyes. The infrared lights highlight the user’s pupils (think of red eye in photographs where a flash has been used) and create reflections on the user’s eyeballs. After a calibration process, where the user looks at a dot moving around the screen, the software can accurately tell where the user is looking based on the reflections and the movements of the pupils.

For computer access the user will also need some method of clicking. Three methods are usually used. Dwell is the most common: the click is automated, so if the user holds their gaze (dwells) on a button or icon for more than a specified time duration, usually somewhere from 0.5 to 1.5 seconds, a click is sent. A slight variation of this is used in some software designed for eyegaze, where the button itself is activated after the user dwells on it; the main difference is that this second method offers the ability to set different dwell times for different buttons. The other input methods are less common: using an external switch as a mouse click, or using a deliberate blink (longer than a normal blink, to prevent accidental clicks) as a mouse click.
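The dwell method described above can be sketched in a few lines. This is an illustrative Python sketch only, not code from any real eye tracker's software: the `DwellClicker` class and its `update` method are assumed names.

```python
# A minimal sketch of dwell-click logic, assuming a frame-by-frame update loop
# that reports which on-screen target (if any) the gaze is currently over.

class DwellClicker:
    def __init__(self, dwell_time=1.0):
        self.dwell_time = dwell_time      # typical settings run from 0.5 to 1.5 s
        self.current_target = None
        self.gaze_start = None

    def update(self, target, now):
        """Feed in the target under the gaze each frame; returns True when a click fires."""
        if target != self.current_target:
            # Gaze has moved to a new target: restart the dwell timer.
            self.current_target = target
            self.gaze_start = now
            return False
        if target is not None and now - self.gaze_start >= self.dwell_time:
            # Dwell complete: fire one click, then reset so it does not repeat.
            self.gaze_start = now
            return True
        return False
```

The per-button dwell times mentioned above would simply mean storing a different `dwell_time` for each target rather than one global value.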

Eye Tracker Devices

  • Tobii Tracker 4C https://gaming.tobii.com/tobii-eye-tracker-4c/ – This is a great option for those wanting to use eyegaze for activities like music and gaming but have other AT as their main access method. It is every bit as good as the two much more expensive “AT” eye trackers below and costs in the region of €170.
  • Tobii PC Eye Plus and Mini https://www.tobiidynavox.com/products/devices/ – The PC Eye Mini and PC Eye Plus are probably the most popular AT eye trackers. The mini will work well on a monitor up to 19”, the Plus also contains a high quality microphone array to support speech recognition, it also has a switch input port. The Plus will work on screens up to 28”.
  • EyeTech TM5 https://eyetechds.com/eye-tracking-products/tm5-mini-eye-tracker/. The EyeTech TM5 is quite similar to the Tobii PC Eye Mini. One key difference that might influence the choice of this eye tracker is that it supports a slightly closer user position.

Challenges associated with playing music using eye movement

There are a number of difficulties we might encounter when playing music using eye movements, but all can be overcome with practice and by using some common music production tools and techniques. Eyegaze as an input method is quite restrictive. You only have one point of direct access, so you can think of it like playing a piano with one finger. To compound this difficulty, and to extend the piano analogy, because your eyes are also your input you cannot queue up your next note the way a one-fingered piano player might. Eyegaze in itself is just eye pointing; using it as an access method will require some input (click), either a switch or a dwell (an automatic click after a specific time duration, usually somewhere from 0.5 to 1.5 seconds). If you are using dwell for input, this adds a layer of difficulty when it comes to timing. You could set the dwell to be really fast (like 0.1 seconds), but then you may run into accidental activations, for example playing a note as you pass over it on the way to your intended note. Some of the specialist eyegaze software instruments like EyeHarp, EyePlayMusic and ii-music overcome this by using a circular, clock-style interface. This allows them to set the onscreen buttons to instant activation, because with the radial layout each note can be reached directly from the centre without passing over another note. With a radial design, when our eyes are in the central position all notes are an equal distance away and can be accessed in the most efficient way, but we are still left with the “one finger piano” restriction: no chords, and only the option of playing at a slower tempo.

Using mainstream music production tools like sequencers, arpeggiators or chord mode can overcome this limitation and allow us to create much more complex music using eyegaze. A sequencer allows you to pre-program accompanying notes to play along with. An arpeggio, sometimes referred to as a broken chord, is the notes of a chord played consecutively rather than simultaneously. Arpeggios are used a lot in electronic music; by playing arpeggios, the slower input is offset by the additional life and movement the arpeggio provides. Chord mode is something that can be set up in many digital audio workstations: you can map one note to automatically play the accompanying notes required to make it a chord. Live looping could also be used. In looping we would record a section being played live, then loop it back and play other notes over it. Other effects, like delay, reverb and many more besides, will also allow us to make interesting music.
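Chord mode and arpeggiation can both be thought of as simple note transformations. The Python sketch below is purely illustrative (the function names, and the choice of a major triad as the default chord, are our own assumptions, not taken from any particular DAW), using MIDI note numbers where middle C is 60.

```python
# Illustrative sketch: one gaze-selected note expands into a chord (chord mode),
# and an arpeggiator then spreads that chord's notes out over time.

def chord_mode(root, intervals=(0, 4, 7)):
    """Chord mode: one input note triggers a full chord (default: major triad)."""
    return [root + i for i in intervals]

def arpeggiate(chord, steps=8):
    """Arpeggiator: cycle through the chord's notes consecutively, not together."""
    return [chord[i % len(chord)] for i in range(steps)]

c_major = chord_mode(60)           # a single selection yields the notes of C major
pattern = arpeggiate(c_major, 6)   # which the arpeggiator plays one after another
```

In this way a single, slow eyegaze selection can still produce several notes of movement, which is exactly the offset described above.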

Expression is another difficulty when playing music using eye tracking. By expression we mean how an accomplished musician can play the same note in different ways to make it more expressive. Velocity is a common means of expression; you can think of this as how fast or hard a note is struck. Velocity can affect volume and other qualities of the instrument’s sound. Another common means of expression is provided by pedals, like those on an organ or piano. Using eyegaze we really only have the ability to turn a note on or off. Some of the software, however, breaks note areas up into sections, each one giving an increased velocity (see photo below).
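The velocity-zone idea can be sketched as a simple mapping from gaze position to MIDI velocity. This is a hypothetical illustration of the technique, not code from any of the instruments mentioned; the function name and the zone-to-velocity mapping are assumptions.

```python
# Sketch: a note's on-screen area is divided into horizontal bands, and the band
# the gaze lands in selects the MIDI velocity (1-127), louder toward the bottom.

def gaze_to_velocity(y, button_top, button_height, zones=4):
    """Map the gaze's vertical position within a note button to a MIDI velocity."""
    offset = min(max(y - button_top, 0), button_height - 1)   # clamp inside the button
    zone = int(offset * zones / button_height)                # band index 0 .. zones-1
    return int((zone + 1) * 127 / zones)                      # velocity for that band
```

With four zones, gazing near the top of a note would play it softly and gazing near the bottom would play it at full velocity.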

Software for playing music with Eyegaze

  • Eye Harp http://theeyeharp.org/ One of the first software instruments made specifically for eyegaze, the EyeHarp remains one of the best options. This software was originally developed as a college project (I’m guessing he got a first!) and, rather than let it die, developer Zacharias Vamvakousis made it available free and open source. After a few years with no updates, the news is that there are some big updates on the way. We are looking forward to seeing what they have in store for us.
[Animated gif: the EyeHarp performance screen, a clock-type circular interface divided into sections, with the eyes at the centre of the circle.]

Another option for eyegaze music production is using software like the Grid 3 or Iris to create an eyegaze-accessible interface for a mainstream digital audio workstation. The demo below is done using Ableton Live; however, any software that offers keyboard mapping or keyboard shortcuts (so any quality software) could be used in the same way.
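The keyboard-mapping approach boils down to a lookup table: each cell in the eyegaze grid sends one keystroke that the DAW has been configured to respond to. The Python sketch below is a hypothetical illustration; the cell names, key assignments and the `send_key` callback are all made up for the example, not part of Grid 3 or Ableton Live.

```python
# Sketch: each eyegaze grid cell is tied to a keyboard shortcut. When the user
# dwells on a cell, the mapped key is sent to the digital audio workstation.

GRID_CELLS = {
    "Play clip 1": "1",
    "Play clip 2": "2",
    "Stop all":    "0",
    "Record loop": "r",
}

def on_cell_selected(cell, send_key):
    """Look up the selected cell's shortcut and send it via the send_key callback."""
    key = GRID_CELLS.get(cell)
    if key is not None:
        send_key(key)   # in practice, the grid software's own "send keystroke" action
    return key
```

This is why any software with keyboard shortcuts works: the grid software only ever needs to emit keystrokes, and the DAW's own key mapping does the rest.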

Skyle – out of the blue

Anybody working with Assistive Technology (AT) knows how useful Apple iOS devices are. Over the years Apple have gradually built in a comprehensive and well-designed range of AT supports that go a long way to accommodating every access need. This is no small feat. In 2009 VoiceOver transformed what was essentially a smooth, featureless square of glass with almost no tactile information into the preferred computing device for blind people. In 2019 Voice Control and the improvements made to Assistive Touch filled two of the last big gaps in the area of “hands free” control of iOS. All this great work is not completely altruistic, however, as it has resulted in Apple mobile devices cementing their place as the preeminent platform in the area of disability and AT. It is because of this that it has always been somewhat of a mystery why there has never been a commercial eye tracking option available for either iOS or macOS. Perhaps not so much iOS, as we will see, but one would certainly have thought an eyegaze solution for the Apple desktop OS could be a viable product.

There are a few technical reasons why iOS has never supported eyegaze. Firstly, up until the newer generations of eyegaze peripherals, eyegaze needed a computer with a decent spec to work well. iPads are mobile devices, and Apple originally made no apologies for sacrificing performance for more important mobile features like reducing weight and thickness and increasing battery life. As eye trackers evolved and got more sophisticated, they began to process more of the massive amount of gaze data they take in themselves. So rather than passing large amounts of raw data straight through to the computer via USB 3 or FireWire, they process the data first, which means less work for the computer, and a connection with less bandwidth can be used. Therefore, in theory, an iPad Pro could support something like a Tobii PC Eye Mini, but in practice there was still one major barrier: iOS did not support any pointing device, let alone eye tracking devices. That was until last September’s iOS update. iOS 13, or iPadOS, saw upgrades to the Assistive Touch accessibility feature that allowed it to support access to the operating system using a pointing device.

iPad Pro 12″ with Skyle eye tracker and case

It is through Assistive Touch that the recently announced Skyle for iPad Pro is possible. Skyle, “the world’s first eye tracker for iPad Pro”, was announced by German company EyeV https://eyev.de/ (who, I admit, I had not previously heard of). Last week it appeared as a product on Inclusive Technology for £2,000 (ex VAT). There is very little information on the manufacturer’s website about Skyle, so at this stage all we know is based on the Inclusive Technology product description (which is pretty good, thankfully). The lack of information about this product significantly tempers my initial excitement on hearing that there is finally an eye tracking solution for iOS. There are no videos on YouTube (or Inclusive Technology), and no user reviews anywhere. I understand it is a new product, but it is odd for a product to be on the market before anybody has had the opportunity of using it and posting a review. I hope I am wrong, but alarm bells are ringing. We’ve waited 10 years for eye tracking on iOS, why rush now?

Leaving my suspicion behind, there are some details on Inclusive Technology which will be of interest to potential customers. If you have used a pointing device through Assistive Touch on iPadOS, you will have a good idea of the user experience. Under Cursor in the Assistive Touch settings you can change the size and colour of the mouse cursor. You will need to use the Dwell feature to automate clicks, and the Assistive Touch menu will give you access to all the other gestures needed to operate the iPad. Anyone who works with people who use eye tracking for computer access will know that accuracy varies significantly from person to person. Designed for touch, targets in iPadOS (icons, menus) are not tiny; they are, however, smaller than a cell in the most detailed grid used by a highly accurate eyegaze user. Unlike a Windows-based eyegaze solution, there are no additional supports, for example a grid overlay or zooming, to help users with small targets. Although many users will not have the accuracy to control the iPad itself with this device (switch between apps, change settings), it could be a good solution within an AAC app (where cell sizes can be configured to suit user accuracy) or a way of interacting with one of the many cause-and-effect apps and games. Again, however, if you have a particular app or activity in mind, please don’t assume it will work: try before you buy. It should be noted here that Inclusive Technology are offering a 28-day returns policy on this product.

There is a switch input jack, which offers an alternative to Dwell for clicking, or could be set to another action (show the Assistive Touch menu, maybe). I assume you could also use the switch with iOS Switch Control, which might be a workaround for those who are not accurate enough to access smaller targets with the eyegaze device. It supports 5- and 9-point calibration to improve accuracy. I would like to see a 2-point calibration option, as 5 points can be a stretch for some early eyegaze users. It would also be nice if you could change the standard calibration dot to something more likely to engage a child (a cartoon dog, perhaps).

Technical specs are difficult to compare even between eye trackers on the same platform (Tobii v EyeTech, for example), so I’m not sure what value there would be in comparing this device with other Windows-based eye trackers. That said, some specs that give us an indication of who this device may be appropriate for are sample rate and operating distance. Judging by the sample rate (given as 18 Hz, max 30 Hz), the Skyle captures less than half the frames per second of its two main Windows-based competitors (Tobii 30 FPS, TM5 42 FPS). However, even 15 FPS should be more than enough for accurate mouse control. The operating distance (how far the device is from the user) for Skyle is 55 to 65 cm, which is about average for an eyegaze device. However, the fact that it offers a range of only 10 cm (the Tobii range is 45 cm to 85 cm, so 40 cm), as well as the photo below showing the positioning guide, both indicate that this is not a solution for someone with even a moderate amount of head movement, as the track box (the area where eyes can be successfully tracked) seems to be very small.

[Screenshot: the positioning guide in the Skyle app, a letterbox view of a person’s eyes. It seems to indicate that only a couple of centimetres of movement is possible before going out of view.]
Does the user have to keep their position within this narrow area or does Skyle use facial recognition to adjust to the user’s position? If it’s the former this solution will not be appropriate for users with even a moderate amount of head movement.

In summary, if you are a highly accurate eyegaze user with good head control and you don’t wear glasses, Skyle could offer you efficient and direct hands-free access to your iPad Pro. It seems expensive at €2,500, especially if you don’t already own a compatible iPad (add at least another €1,000 for an iPad Pro 12″). If you have been waiting for an eyegaze solution for iOS (as I know many people have), I would encourage you to wait a little longer. When the opportunity arises, try Skyle for yourself. By that time, there may be other options available.

If any of the assumptions made here are incorrect, or if there is any more information available on Skyle, please let us know and we will update this post.

My Computer My Way: Find how to make your device easier to use

Logo for My Computer My Way

My Computer My Way is a free online guide to the accessibility features of computers, tablets and mobile phones. The aim is to provide you with the details you need to make whatever device you’re using easier to use, whether via built-in accessibility features, browser extensions or apps that you can install.

It’s been around for quite a number of years, and having revisited the site recently I am glad to see it has been updated to cover current operating system features. So whether you need help with Android Pie, Windows 10 or iOS 12, this useful guide has been updated to include the new built-in accessibility features.

The accessibility features are divided into four categories:

  • Vision: options include features to help you see and use applications more clearly.
  • Hearing: accessibility features and information for people who are deaf or hard of hearing.
  • Motor: ways to make your keyboard, mouse and mobile device easier to use.
  • Cognitive: computer adjustments that will make reading, writing and using the internet easier.

Further information

https://mcmw.abilitynet.org.uk/

The good: provides details on just about every built-in accessibility feature for your device.

The not so good: There is a limited amount of information on apps or applications that might also provide useful features.

The verdict:  A useful tool for individuals who have limited or no access to an assistive technology service and need help to find solutions on their own.

Webcam Face trackers

User at a laptop using a webcam face tracker

Webcam Face trackers allow full control of mouse functions without the use of hands. They can be used to access a computer (Windows, Mac), as well as a tablet or smartphone (Android only at present).

Primary users of these technologies are people with motor impairments.  There are various options for hands-free control of your mouse on a computer screen such as reflective dot trackers, lip and chin joysticks, speech recognition or even eye trackers.  Webcam Face trackers are another possible option for hands-free control of your computer or phone. 

Although this approach may not be as accurate as other hands-free options, such as wearable sensors, you don’t have to wear a sensor or reflective dot. As you move your head, the motion is translated to mouse cursor movement via the webcam. However, you do have to maintain a direct line of sight to the computer, and performance is dependent on lighting conditions.

Basic pointing device support on an Android tablet or phone is possible with EVA Facial Mouse.  This is available through Google Play.  It will allow access to functions of the mobile device by means of tracking the user’s face, captured through the frontal camera. 

At the time of writing, a webcam face tracker is not available on iOS devices.  However, it is possible to use Switch Control with head gestures to act as switches.  For example look left for select, look right for home.

All 5 Webcam Face Trackers listed below have options for mouse dwell, click and drag lock.

There are two free Windows webcam face trackers, Camera Mouse and Enable Viacam. Both work quite well. For the paid options, SmyleMouse also tracks facial expressions and has the option of clicking with a smile, while ViVo offers integration with leading speech recognition programs.

As there are trial versions of most of the options below, it’s best to try them all to get a feel for them and see which one works best for you.

Webcam hands-free mouse options to consider are:

SmyleMouse $499


ViVo Mouse $430


Camera Mouse free


Enable Viacam free

iTracker for Mac $35

The good:  You don’t have to wear a sensor or reflective dot and they are battery-free.

The not so good: They are not as accurate as other methods of hands-free options.

The verdict:  If you don’t need very fine cursor control and don’t want to wear a sensor on your head, then webcam face trackers are a good option for hands-free control.

Mobile Device Accessibility: iOS and the Android Accessibility Suite

One aspect of modern technological life that might help us to keep some faith in humanity is the comprehensive range of assistive technologies that are built into, or free to download for, mobile computing devices. Accessibility features, as they are loosely called, are a range of tools designed to support non-standard users of the technology. If you can’t see the screen very well you can magnify text and icons (1) or use high contrast (2). If you can’t see the screen at all you can have the content read back to you using a screen-reader (3). There are options to support touch input (4, 5) and options to use devices hands free (6). Finally, there are also some supports for deaf and hard of hearing (HoH) people, like the ability to switch to mono audio, or visual or haptic alternatives to audio-based information.

With their mobile operating system iOS, Apple do accessibility REALLY well, and this is reflected in the numbers. In the 2018 WebAIM Survey of Low Vision users there were over three times as many iOS users as Android users. That is almost the exact reverse of the general population (3 to 1 in favour of Android). For those with motor difficulties the difference was less significant, but iOS was still favoured.

So what are Apple doing right? Well obviously, first and foremost, the credit would have to go to their developers and designers for producing such innovative and well implemented tools. But Google and other Android developers are also producing some great AT, often highlighting some noticeable gaps in iOS accessibility. Voice Access, EVA Facial Mouse and basic pointing device support are some examples, although these are gaps that will soon be filled if reports of coming features to iOS 13 are to be believed.

Rather than being just about the tools, it is as much, if not more, about awareness of those tools: where to find them and how they work. On every Apple mobile device you go to Settings > General > Accessibility and you will find Vision (1, 2, 3), Interaction (4, 5, 6) and Hearing settings. I’m deliberately not naming these settings here so that you can play a little game with yourself and see if you know what they are. I suspect most readers of this blog will get 6 from 6, which should help make my point. You can check your answers at the bottom of the post 🙂

This was always the problem with Android devices. Where Apple iOS accessibility is like a tool belt, Android accessibility is like a big bag: there is probably more in there, but you have to find it first. This isn’t Google’s fault; they make great accessibility features. It’s more a result of the open nature of Android. Apple make their own hardware and iOS is designed specifically for that hardware, so it’s much more locked down. Android is an open operating system, and as such it depends on the hardware manufacturer how accessibility is implemented. This has been slowly improving in recent years, but Google’s move to bundle all their accessibility features into the Android Accessibility Suite last year meant a huge leap forward in Android accessibility.

What’s in Android Accessibility Suite?

Accessibility Menu

Android OS Accessibility Suite Assistant Menu. An onscreen menu with large colourful buttons for features like, power, lock screen, volume
The figure highlighted in the bottom corner launches whatever Accessibility Suite tools you have active. If you have more than one, a long press will allow you to switch between tools.

Use this large on-screen menu to control gestures, hardware buttons, navigation, and more. It’s a similar idea to Assistive Touch on iOS. If you are a Samsung Android user, it is similar to (but, in my opinion, not as good as) the Assistant Menu already built in.

Select to Speak

The select to speak tool when active on a webpage. large red button to stop speech. Arrow at left to extend menu, pause button

Select something on your screen or point your camera at an image to hear text spoken. This is a great feature for people with low vision or a literacy difficulty: it will read the text on screen when required, without being always on like a screen reader. A similar feature was built into Samsung devices before inexplicably disappearing with the last Android update. The “point your camera at an image to hear text spoken” claim had me intrigued. Optical Character Recognition like that found in Office Lens or SeeingAI, built into the regular camera, could be extremely useful. Unfortunately I have been unable to get this feature to work on my Samsung Galaxy A8. Even when selecting a headline in a newspaper I’m told “no text found at that location”.

Switch Access

cartoon hand activating a Blue2 switch. Android phone desktop with message icon highlighted

Interact with your Android device using one or more switches or a keyboard instead of the touch screen. Switch Access on Android has always been the poor cousin to Switch Control on iOS but is improving all the time.
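For the technically curious, the core idea behind switch scanning is simple: the system highlights each on-screen option in turn, and a single switch press selects whatever is highlighted at that moment. Below is a toy sketch of linear scanning in Python; the item names and the `linear_scan` helper are made up for illustration and are not part of the Android Switch Access implementation (which also offers row-column scanning and other refinements).

```python
import itertools
import time

items = ["Messages", "Phone", "Camera", "Settings"]

def linear_scan(options, press_at, interval=0.0):
    """Highlight each option in turn, wrapping around the list;
    a 'switch press' at highlight step press_at selects the
    currently highlighted option."""
    for step, option in enumerate(itertools.cycle(options)):
        if step == press_at:
            return option
        time.sleep(interval)  # a real UI pauses here for the scan interval

print(linear_scan(items, press_at=2))  # "Camera"
```

The scan interval is the key access setting in practice: too fast and the user can’t time their press, too slow and every selection becomes tedious.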

TalkBack Screen Reader

Get spoken, audible, and vibration feedback as you use your device. Google’s mobile screen reader has been around for a while and, like Switch Access, is apparently improving, but I’ve yet to meet anybody who actually uses it full time.

So to summarise: as well as adding features that may have been missing on your particular “flavour” of Android, this suite standardises the accessibility experience and makes it more visible. Another exciting aspect of these features being bundled in this way is their availability for media boxes. Android is a hugely popular OS for TV and entertainment, but what is true of mobile device manufacturers is doubly so of Android box manufacturers, where it is still very much the Wild West. If you are in the market for an Android box and accessibility is important to you, make sure it’s running Android version 6 or later so you can install this suite and take advantage of these features.

Could you name the Apple iOS features?

  1. Zoom
  2. Display Accommodations or Increase Contrast   
  3. VoiceOver
  4. Assistive Touch
  5. Touch Accommodations
  6. Switch Control

Hands free reflective dot trackers

user using a refective dot tracker to control their computer

If you have a physical limitation that makes it difficult or impossible to use a traditional mouse with your hands, a hands-free mouse can be critical to accessing a computer comfortably and efficiently. A hands-free mouse allows you to perform computer mouse functions without using your hands, and there are various options for hands-free cursor control, such as wearable sensors, eye trackers or even speech recognition. One other possible group of devices is reflective dot trackers. You wear a small reflective dot (often placed as a sticker on the forehead or glasses), and a special sensor unit mounted on or near your computer tracks the motion of the dot to control the mouse cursor as you move. There is no wired connection between you and the device, and the wearable reflective dot is smaller and less conspicuous than some of the other wearable sensor options.
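Under the hood, these trackers convert the dot’s movement in the camera image into cursor movement on screen, typically with a gain (sensitivity) setting and some smoothing to damp jitter. Here is a minimal sketch of that mapping; the `dot_to_cursor` function and its default values are illustrative assumptions, not taken from any manufacturer’s software.

```python
def dot_to_cursor(dx, dy, gain=8.0, smoothing=0.6, prev=(0.0, 0.0)):
    """Map the reflective dot's frame-to-frame displacement in the
    camera image (dx, dy, in image pixels) to a cursor movement on
    screen. The gain scales small head movements up to screen
    distances; blending with the previous movement (an exponential
    moving average) damps sensor jitter for finer control."""
    mx = smoothing * prev[0] + (1.0 - smoothing) * dx * gain
    my = smoothing * prev[1] + (1.0 - smoothing) * dy * gain
    return mx, my

# a small nod to the right moves the cursor smoothly right
move = dot_to_cursor(2.0, 0.0)
```

It is exactly this gain-plus-smoothing trade-off that users adjust in the driver software when a tracker feels too twitchy or too sluggish.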

These products can replace a traditional mouse for computing platforms such as Windows, Mac OS X, and Linux. And some will work with platforms like Android and Chrome OS as well.

Some reflective dot tracker options to consider are as follows:

TrackerPro $995

HeadMouse Nano £888.00

SmartNAV 4:AT €465.00

AccuPoint $1,995.00

The good: If you are OK with wearing the reflective dot, you can independently control a mouse cursor without needing someone to help put on a wearable sensor. There is also less chance of something not working than with other hands-free options such as eye gaze or voice recognition.

The not so good: They do require a line of sight to the computer, and can be sensitive to lighting conditions.

The verdict:  If you need or want the ability to make very fine, high-resolution movements of the mouse cursor, similar to what is possible with a traditional mouse, then reflective dot trackers are a good option.

Control your mobile phone, PC or TV with your wheelchair joystick

Have you ever considered controlling your computer or mobile devices with your wheelchair joystick?

As well as basic wheelchair functions such as driving, the CJSM2-BT also enables control of a computer or mobile devices, making the integration of environmental controls possible. The same controls the user drives the powered wheelchair with, typically a joystick, can also be used to control an appliance within their environment.

For example, for chairs with R-net controls you can replace the old joystick with a CJSM2-BT, as seen in the video below. This R-net joystick module has Infra-Red (IR) capabilities included. IR technology is widely used to remotely control household devices such as TVs, DVD players and multi-media systems, as well as some home-automation equipment. Individual IR commands can be learned from an appliance’s remote handset and stored in the CJSM2.
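Conceptually, this learn-and-replay approach just records each remote’s command once and looks it up later. The sketch below illustrates the idea in Python; the `learn`/`replay` helpers and the raw code value are made-up placeholders, not the CJSM2’s actual storage format or a real appliance’s IR code.

```python
# Store each learned IR command against an (appliance, command) key
# so it can be replayed on demand from the wheelchair controls.
ir_codes = {}

def learn(appliance, command, raw_code):
    """Record the code captured from the appliance's own remote."""
    ir_codes[(appliance, command)] = raw_code

def replay(appliance, command):
    """Look up and return the stored code, or None if never learned."""
    return ir_codes.get((appliance, command))

learn("TV", "power", "0x20DF10EF")
print(replay("TV", "power"))  # 0x20DF10EF
```

The practical upshot for users is that one joystick menu can stand in for a drawer full of remote handsets.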

Integrated Bluetooth technology is also an option, to enable control of computers, Android tablets, iPads, iPhones and other smart devices from a powered wheelchair. To switch between the devices, the user simply navigates the menu and selects the device they wish to control. The R-net’s CJSM2 can easily replace an existing R-net joystick module, with no system re-configuration or programming required.

As well as Curtiss-Wright’s R-net controls, other wheelchair controller manufacturers have Bluetooth mouse options too, including Dynamic Controls with their LiNX controller and Curtis Instruments’ Quantum Q-Logic controller.

My Experience with Voice Recognition

Man wearing headset using Dragon naturally speaking

Dragon’s voice recognition software enables people to control their device, create content, browse and write an email, update spreadsheets, surf the Web and create documents using only their voice.

A demo of correcting, formatting and proofreading using Dragon NaturallySpeaking 13.

Further information

The good

Dragon NS provides a means of voice-to-text production not only in word processing applications, but also for controlling your computer’s operations. This, for me, is the main advantage over other voice-to-text programmes, which are often “in app”, such as the microphone in the Pages app.

Dragon NS versus other voice-to-text software – my take on it!

  1. Dragon NaturallySpeaking for the PC is much more powerful than the built-in voice recognition software on Android or on the iPhone (Siri), i.e. fewer inaccuracies and greater time efficiency. It can dramatically cut down the time it takes to create emails, Word documents and other correspondence on your PC.
  2. It Learns. Dragon NaturallySpeaking actually improves through use. It learns about how you speak, how you sound, what words you use and it creates a database called a voice profile. This voice profile matures over time and allows Dragon NaturallySpeaking to become very accurate with regular use.
    1.    Dragon NaturallySpeaking on the PC has “regional accent modelling”. This makes the program far more accurate than basic mobile device speech recognition which uses a generic accent model.
    2.    Dragon NaturallySpeaking adapts to your specific vocabulary. Siri or the Google android speech recognition application do not do this, they run off a generic limited vocabulary.
  3. Amount Processed. Free speech software on your phone can only process 30-second chunks of speech. Dragon speech recognition on the PC is continuous for as long as you can talk and doesn’t need a continuous internet connection.

The not so good

Good flow of speech is important, even if just for short passages. Dragon NS writes everything you say, even inflections of speech such as “mmmm” and “eh”. If a user tends to use these inflections in speech, it will type them. Continuously deleting them can be time-consuming and frustrating, and training oneself not to use these inflections can be very tricky.
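To see why those stray “mmmm”s and “eh”s are such a nuisance, consider what cleaning them up after the fact would involve. The sketch below shows a simple filler-word filter in Python; the `strip_fillers` helper and its filler list are hypothetical, purely to illustrate the problem, and are not a feature of Dragon itself.

```python
import re

# common spoken fillers that dictation may transcribe literally
FILLERS = {"mmmm", "mmm", "eh", "em", "um", "uh"}

def strip_fillers(transcript: str) -> str:
    """Remove filler words from a dictated transcript, ignoring
    case and trailing punctuation when matching."""
    kept = [w for w in transcript.split()
            if re.sub(r"[.,!?]", "", w).lower() not in FILLERS]
    return " ".join(kept)

print(strip_fillers("eh I would mmmm like to um reschedule"))
# I would like to reschedule
```

Even a filter like this can’t recover punctuation or flow, which is why training oneself out of the habit, hard as it is, beats cleaning up afterwards.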

The user needs to be very cognitively able to command the system with their voice, planning out the actions and remembering specific commands.

The verdict

Fantastic software for the right client, especially if for any reason direct access is not an option. Even if a form of direct access is an option for the client, Dragon NS is still a nice option for long passages of text production. For the wrong client, this software would be more of a hindrance and a frustration than a help.

Tobii buys SmartBox – What might this mean for computer access and AAC?

Big news (in the AT world anyway) may have arrived in your mailbox early last week. It was announced that leading AAC and computer access manufacturer Tobii has purchased SmartBox AT (Sensory Software), developers of The Grid 3 and Look2Learn. As well as producing these very popular software titles, SmartBox were also a leading supplier of a range of AAC and computer access hardware, including their own GridPad and PowerPad ranges. Basically (in this part of the world at least) they were the two big guns in this area of AT, between them accounting for maybe 90% of the market. An analogy using soft drink companies would be that this is like Coca-Cola buying Pepsi.

Before examining what this takeover (or amalgamation?) means for their customers going forward, it is worth looking back at what each company has historically done well. This way we can hopefully see a more optimistic future for AT users, rather than the future offered by what might be considered a potential monopoly.

Sensory Software began life in 2000 in the spare bedroom of founder Paul Hawes. Paul had previously worked for AbilityNet and had 13 years’ experience working in the area of AT. Early software like GridKeys and The Grid had been very well received and the company continued to grow. In 2006 they set up Smartbox to concentrate on complete AAC systems, while sister company Sensory Software concentrated on developing software. In 2015 both arms of the company joined back together under the SmartBox label. By this time their main product, the Grid 3, had established itself as a firm favourite with Speech and Language Therapists (SLTs) for the wide range of communication systems it supported, and with Occupational Therapists and AT professionals for its versatility in providing alternative input options to Windows and other software. Many companies would have been satisfied with providing the best product on the market; however, there were a couple of other areas where SmartBox also excelled. They may not have been the first AT software developers to harness the potential resources of their end users (they also may have been, I would need to research that further), but they were certainly the most successful. They succeeded in creating a strong community around the Grid 2 & 3, with a significant proportion of the online grids available to download being user generated. Their training and support were also second to none. Regular high-quality training events were offered throughout Ireland and the UK, and whether by email, phone or the chat feature on their website, their support was always top quality. Their staff clearly knew their product inside out, responses were timely and they were always a pleasure to deal with.

Tobii have been around since 2001. The Swedish firm actually started with eye gaze: three entrepreneurs – John Elvesjö, Mårten Skogö and Henrik Eskilsson – recognised the potential of eye tracking as an input method for people with disabilities. In 2005 they released the MyTobii P10, the world’s first computer with built-in eye tracking (and I’ve no doubt there are still a few P10 devices in use). What stood out about the P10 was the build quality of the hardware; it was built like a tank. While Tobii could be fairly criticised for under-specifying their all-in-one devices in terms of processor and memory, the build quality of their hardware is always top class. Over the years Tobii have grown considerably, acquiring Viking Software AS (2007), Assistive Technology Inc. (2008) and DynaVox Systems LLC (2014), and becoming a global brand with offices around the world. As mentioned above, Tobii’s main strength is that they make good hardware. In my opinion they make the best eye trackers, and have consistently done so for the last 10 years. Their AAC software has also come on considerably since the DynaVox acquisition. While Communicator always seemed to be a pale imitation of the Grid (apologies if I’m being unfair, but certainly true in terms of its versatility and ease of use for computer access), it has steadily been improving. Their newer Snap + Core First AAC software has been a huge success, and for users just looking for a communication solution it would be an attractive option over the more expensive (although much fuller featured) Grid 3. Alongside Snap + Core they have also brought out a “Pathways” companion app. This app is designed to guide parents, caregivers and communication partners in best practices for engaging Snap + Core First users. It supports the achievement of communication goals through video examples, lesson plans, an interactive goals grid for tracking progress, and a suite of supporting digital and printable materials. A really useful resource, it will help to empower parents and prove invaluable to those not lucky enough to have regular input from an SLT.

To sum things up: we had two great companies, both with outstanding products. I have recommended the combination of the Grid software and a Tobii eye tracker more times than I can remember. The hope is that Tobii can keep the Grid on track and incorporate the outstanding support and communication that were always an integral part of SmartBox’s operation. With the addition of their hardware expertise and recent research-driven progress in the area of AAC, there should be a lot to look forward to in the future.

If you are a Grid user and you have any questions or concerns about this news, true to form, the communication lines are open. There is some information at this link and at the bottom of the page you can submit your question.