My Computer My Way is a free online guide to the accessibility features of computers, tablets and mobile phones. The aim is to show you how to make whatever device you’re using easier to use via built-in accessibility features, browser extensions or apps that you can install.
It has been around for quite a number of years, and having revisited the site recently I am glad to see it has been updated for current operating systems. So whether you need help now with Android Pie, Windows 10 or iOS 12, this useful guide includes the new built-in accessibility features.
The guide’s accessibility features are divided into four categories:
- features to help you see and use applications more clearly
- accessibility features and information for people who are deaf or hard of hearing
- ways to make your keyboard, mouse and mobile device easier to use
- computer adjustments that will make reading, writing and using the internet easier
Webcam face trackers allow full control of mouse functions without the use of your hands. They can be used to access a computer (Windows, Mac), as well as a tablet or smartphone (Android only at present).
Primary users of these technologies are people with motor
impairments. There are various options
for hands-free control of your mouse on a computer screen such as reflective
dot trackers, lip and chin joysticks, speech recognition or even eye
trackers. Webcam Face trackers are another possible option for hands-free
control of your computer or phone.
Although this approach may not be as accurate as other hands-free options, such as wearable sensors, you don’t have to wear a sensor or reflective dot. As you move your head, the webcam tracks the motion and translates it into mouse cursor movement. However, you do have to maintain a direct line of sight to the computer, and performance is dependent on lighting conditions.
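The basic idea can be sketched in a few lines of Python. This is purely illustrative (the function name, gain and deadzone values are my own, not taken from any of the products mentioned): the tracker compares the detected face position against a neutral rest position and turns the displacement into cursor movement.

```python
# Illustrative sketch of the core mapping a webcam face tracker performs:
# the detected face position is compared against a neutral "rest" position,
# scaled by a gain, and small jitters inside a deadzone are ignored so the
# cursor doesn't tremble. All names and values here are hypothetical.

def face_to_cursor(face_xy, rest_xy, cursor_xy, gain=8.0, deadzone=2.0):
    """Translate face displacement (pixels in the webcam image) into a
    new cursor position. Movement inside the deadzone is ignored."""
    dx = face_xy[0] - rest_xy[0]
    dy = face_xy[1] - rest_xy[1]
    if abs(dx) < deadzone:
        dx = 0.0
    if abs(dy) < deadzone:
        dy = 0.0
    return (cursor_xy[0] + gain * dx, cursor_xy[1] + gain * dy)
```

In a real tracker this mapping runs on every camera frame, with the rest position recalibrated whenever the user recentres their head.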
Basic pointing device support on an Android tablet or phone is possible with EVA Facial Mouse, available through Google Play. It allows access to the functions of the mobile device by tracking the user’s face, captured through the front camera.
At the time of writing, a webcam face tracker is not available on iOS devices. However, it is possible to use Switch Control with head gestures acting as switches: for example, look left to trigger one switch action and look right to trigger another.
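As a rough illustration of how two head-gesture switches can drive an interface, here is a minimal Python sketch of step scanning (the class and method names are my own invention, not Apple’s API): one switch moves the highlight to the next item, the other activates the highlighted item.

```python
# Hypothetical sketch of two-switch step scanning, the pattern used when
# head gestures act as switches: one gesture advances the highlight
# through the on-screen items, the other selects the highlighted item.

class StepScanner:
    def __init__(self, items):
        self.items = items
        self.index = 0  # currently highlighted item

    def next(self):
        """Switch 1 (e.g. look right): advance the highlight, wrapping around."""
        self.index = (self.index + 1) % len(self.items)
        return self.items[self.index]

    def select(self):
        """Switch 2 (e.g. look left): activate the highlighted item."""
        return self.items[self.index]
```

Single-switch users get the same result with automatic scanning, where the highlight advances on a timer and the one switch only selects.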
All five webcam face trackers listed below have options for mouse dwell, click and drag lock.
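Mouse dwell in particular is simple to picture: if the cursor stays still for long enough, a click fires. Below is a hypothetical Python sketch of that logic (the thresholds and names are illustrative, not taken from any of these products).

```python
# Illustrative dwell-click logic: feed in cursor positions over time; if
# the cursor stays within a small radius for the dwell time, a click fires.
import math

class DwellClicker:
    def __init__(self, radius=10.0, dwell_time=1.0):
        self.radius = radius          # max movement (px) still counted as dwelling
        self.dwell_time = dwell_time  # seconds of stillness before a click
        self.anchor = None            # (x, y) where the current dwell started
        self.start = None             # timestamp when the current dwell started

    def update(self, x, y, t):
        """Feed the cursor position at time t; return True when a click fires."""
        if self.anchor is None or math.hypot(x - self.anchor[0], y - self.anchor[1]) > self.radius:
            self.anchor, self.start = (x, y), t   # moved too far: restart the timer
            return False
        if t - self.start >= self.dwell_time:
            self.anchor, self.start = (x, y), t   # fire once, then reset
            return True
        return False
```

Drag lock builds on the same idea: one dwell presses the button down, and a second dwell releases it.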
There are two free Windows webcam face trackers, Camera Mouse and Enable Viacam, and both work quite well. Among the paid options, SmyleMouse also tracks facial expressions and offers the option of clicking with a smile, while ViVo offers integration with leading speech recognition software.
As there are trial versions for most of these options, it’s best to try them all to get a feel for them and see which one works best for you.
One aspect of modern technological life that might help us keep some faith in humanity is the comprehensive assistive technology that is built into, or free to download for, mobile computing devices. Accessibility features, as they are loosely called, are a range of tools designed to support non-standard users of the technology. If you can’t see the screen very well you can magnify text and icons (1) or use high contrast (2). If you can’t see the screen at all you can have the content read back to you using a screen reader (3). There are options to support touch input (4, 5) and options to use devices hands-free (6). Finally, there are also some supports for deaf and hard of hearing (HoH) people, like the ability to switch to mono audio, or visual or haptic alternatives to audio-based information.
With their mobile operating system iOS, Apple do accessibility REALLY well, and this is reflected in the numbers. In the 2018 WebAIM survey of low vision users there were over three times as many iOS users as Android users. That is almost the exact reverse of the general population (roughly 3 to 1 in favour of Android). For those with motor difficulties the difference was less pronounced, but iOS was still favoured.
So what are Apple doing right? Well obviously, first and foremost, the credit has to go to their developers and designers for producing such innovative and well implemented tools. But Google and other Android developers are also producing some great AT, often highlighting noticeable gaps in iOS accessibility. Voice Access, EVA Facial Mouse and basic pointing device support are some examples, although these are gaps that will soon be filled if reports of features coming to iOS 13 are to be believed.
Rather than being just about the tools, it is as much, if not more, about awareness of those tools: where to find them and how they work. On every Apple mobile device you go to Settings > General > Accessibility and you will find Vision (1, 2, 3), Interaction (4, 5, 6) and Hearing settings. I’m deliberately not naming these settings here so that you can play a little game with yourself and see if you know what they are. I suspect most readers of this blog will get 6 from 6, which should help make my point. You can check your answers at the bottom of the post 🙂

This was always the problem with Android devices. Where Apple iOS accessibility is like a tool belt, Android accessibility is like a big bag: there is probably more in there, but you have to find it first. This isn’t Google’s fault; they make great accessibility features. It’s more a result of the open nature of Android. Apple make their own hardware and iOS is designed specifically for that hardware, so it’s much more locked down. Android is an open operating system, and as such it depends on the hardware manufacturer how accessibility is implemented. This has been slowly improving in recent years, but Google’s move to bundle all their accessibility features into the Android Accessibility Suite last year meant a huge leap forward in Android accessibility.
What’s in Android Accessibility Suite?
Accessibility Menu
Use this large on-screen menu to control gestures, hardware buttons, navigation, and more. It is a similar idea to AssistiveTouch on iOS. If you are a Samsung Android user, it is similar to (though not as good as, in my opinion) the Assistant Menu already built in.
Select to Speak
Select something on your screen or point your camera at an image to hear text spoken. This is a great feature for people with low vision or a literacy difficulty: it will read the text on screen when required, without being always on like a screen reader. A similar feature was built into Samsung devices before inexplicably disappearing with the last Android update. The “point your camera at an image to hear text spoken” claim had me intrigued. Optical character recognition like that found in Office Lens or Seeing AI, built into the regular camera, could be extremely useful. Unfortunately I have been unable to get this feature to work on my Samsung Galaxy A8. Even when selecting a headline in a newspaper I’m told “no text found at that location”.
Switch Access
Interact with your Android device using one or more switches or a keyboard instead of the touch screen. Switch Access on Android has always been the poor cousin of Switch Control on iOS, but it is improving all the time.
TalkBack Screen Reader
Get spoken, audible, and vibration feedback as you use your device. Google’s mobile screen reader has been around for a while and, like Switch Access, it is apparently improving, though I’ve yet to meet anybody who actually uses it full time.
So to summarise: as well as adding features that may have been missing on your particular “flavour” of Android, this suite standardises the accessibility experience and makes it more visible. Another exciting aspect of these features being bundled in this way is their availability for media boxes. Android is a hugely popular OS for TV and entertainment, but what is true of mobile device manufacturers is doubly so of Android box manufacturers, where it is still very much the Wild West. If you are in the market for an Android box and accessibility is important, make sure it’s running Android version 6 or later so you can install this suite and take advantage of these features.
If you have a physical limitation that makes it difficult or
impossible to use a traditional mouse with your hands, a hands-free mouse can
be critical to accessing a computer comfortably and efficiently. A hands-free
mouse allows you to perform computer mouse functions without using your hands. There
are various options for hands free control of your mouse on a computer screen
such as wearable sensors, eye trackers or even speech recognition. One other possible group of devices are
reflective dot trackers. You wear a small reflective dot (often placed as a
sticker on the forehead or glasses), and a special sensor unit mounted on or
near your computer tracks the motion of the dot to control the mouse cursor as
you move. There is no wired connection
between you and the device. The wearable reflective dot is smaller and
less conspicuous than some of the other wearable sensor options.
These products can replace a traditional mouse for computing
platforms such as Windows, Mac OS X, and Linux. And some will work with
platforms like Android and Chrome OS as well.
Some reflective dot tracker options to consider are as follows:
The good: If you are OK with wearing the reflective dot, you can independently control a mouse cursor without needing someone to assist with putting on a wearable sensor. There is also less chance of something not working than with other hands-free options such as eye gaze or voice recognition.
The not so good: It does require a line of sight to the computer, and can be sensitive to lighting conditions.
The verdict: If you need or want the ability to make very fine, high-resolution movements of the mouse cursor, similar to what is possible with a traditional mouse, then reflective dot trackers are a good option.
Have you ever considered controlling your computer or mobile devices with your wheelchair joystick?
As well as controlling basic wheelchair functions such as driving, the CJSM2-BT enables control of a computer or mobile devices, making the integration of environmental controls possible. The same control the user drives the power wheelchair with, typically a joystick, can also be used to control an appliance within their environment.
For example, for chairs with R-net controls you can replace the old joystick with a CJSM2-BT, as seen in the video below. This R-net joystick module has Infra-Red (IR) capabilities included. IR technology is widely used to remotely control household devices such as TVs, DVD players and multi-media systems, as well as some home-automation equipment. Individual IR commands can be learned from an appliance’s remote handset and stored in the CJSM2.
Integrated Bluetooth technology is also an option, to enable control of computers, Android tablets, iPads, iPhones and other smart devices from a powered wheelchair. To switch between the devices, the user simply navigates the menu and selects the device they wish to control. The R-net’s CJSM2 can easily replace an existing R-net joystick module, with no system re-configuration or programming required.
As well as Curtiss-Wright’s R-net controls, other wheelchair controller manufacturers have Bluetooth mouse options too, including Dynamic Controls with their LiNX controller and Curtis Instruments’ Quantum Q-Logic controller.
Dragon NS provides a means of voice-to-text production, not only in word processing applications but also for controlling your computer operations. For me this is the main advantage over other voice-to-text programmes, which are often “in app”, such as the microphone in the Pages app.
Dragon NS versus other voice-to-text software – my take on it!
Dragon NaturallySpeaking for the PC is much more powerful than the built-in voice recognition software on Android or the iPhone (Siri), i.e. fewer inaccuracies and greater time efficiency. It can dramatically cut down the time it takes to create emails, Word documents and other correspondence on your PC.
It Learns. Dragon NaturallySpeaking actually improves through use. It learns about how you speak, how you sound, what words you use and it creates a database called a voice profile. This voice profile matures over time and allows Dragon NaturallySpeaking to become very accurate with regular use.
Dragon NaturallySpeaking on the PC has “regional accent modelling”. This makes the program far more accurate than basic mobile device speech recognition which uses a generic accent model.
Dragon NaturallySpeaking adapts to your specific vocabulary. Siri and the Google Android speech recognition application do not do this; they run off a generic, limited vocabulary.
Amount processed. Free speech software on your phone can only process 30-second chunks of speech. Dragon speech recognition on the PC is continuous for as long as you can talk and doesn’t need a continuous internet connection.
The not so good
Good flow of speech is important, even if just for short passages. Dragon NS writes everything you say; even inflections of speech such as “mmmm” and “eh”. If a user tends to use these inflections in speech, it will type these inflections. Continuously deleting them can be time consuming and frustrating. Training oneself not to use these inflections can be very tricky.
The user needs to be very cognitively able to command the system with their voice, planning out the actions and remembering specific commands.
Fantastic software for the right client, especially if for any reason direct access is not an option. Even if a form of direct access is an option for the client, Dragon NS is still a nice option for long passages of text production. For the wrong client, this software would be more of a hindrance and a frustration than a help.
Big news (in the AT world anyway) may have arrived in your mail box early last week. It was announced that leading AAC and Computer Access manufacturer Tobii purchased SmartBox AT (Sensory Software), developers of The Grid 3 and Look2Learn. As well as producing these very popular software titles, SmartBox were also a leading supplier of a range of AAC and Computer Access hardware, including their own GridPad and PowerPad ranges. Basically (in this part of the world at least) they were the two big guns in this area of AT, between them accounting for maybe 90% of the market. An analogy using soft drink companies would be that this is like Coca-Cola buying Pepsi.
Before examining what this takeover (or amalgamation?) means to their customers going forward it is worth looking back at what each company has historically done well. This way we can hopefully provide a more optimistic future for AT users rather than the future offered by what might be considered a potential monopoly.
Sensory Software began life in 2000 in the spare bedroom of founder Paul Hawes. Paul had previously worked for AbilityNet and had 13 years’ experience in the area of AT. Early software like GridKeys and The Grid was very well received and the company continued to grow. In 2006 they set up Smartbox to concentrate on complete AAC systems while sister company Sensory Software concentrated on developing software. In 2015 both arms of the company joined back together under the SmartBox label. By this time their main product, the Grid 3, had established itself as a firm favourite with Speech and Language Therapists (SLTs) for the wide range of communication systems it supported, and with Occupational Therapists and AT professionals for its versatility in providing alternative input options to Windows and other software. Many companies would have been satisfied with providing the best product on the market; however, there were a couple of other areas where SmartBox also excelled. They may not have been the first AT software developers to harness the potential resources of their end users (they also may have been, I would need to research that further) but they were certainly the most successful. They succeeded in creating a strong community around the Grid 2 and 3, with a significant proportion of the online grids available to download being user generated. Their training and support was also second to none. Regular high quality training events were offered throughout Ireland and the UK. Whether by email, phone or the chat feature on their website, their support was always top quality too. Their staff clearly knew their product inside out, responses were timely and they were always a pleasure to deal with.
Tobii have been around since 2001. The Swedish firm actually started with eyegaze: three entrepreneurs, John Elvesjö, Mårten Skogö and Henrik Eskilsson, recognised the potential of eye tracking as an input method for people with disabilities. In 2005 they released the MyTobii P10, the world’s first computer with built-in eye tracking (and I’ve no doubt there are still a few P10 devices in use). What stood out about the P10 was the build quality of the hardware; it was built like a tank. While Tobii could be fairly criticised for under-specifying their all-in-one devices in terms of processor and memory, the build quality of their hardware is always top class. Over the years Tobii have grown considerably, acquiring Viking Software AS (2007), Assistive Technology Inc. (2008) and DynaVox Systems LLC (2014). They have grown into a global brand with offices around the world. As mentioned above, Tobii’s main strength is that they make good hardware. In my opinion they make the best eye trackers and have consistently done so for the last 10 years. Their AAC software has also come on considerably since the DynaVox acquisition. While Communicator always seemed to be a pale imitation of the Grid (apologies if I’m being unfair, but certainly true in terms of its versatility and ease of use for computer access), it has steadily been improving. Their newer Snap + Core First AAC software has been a huge success, and for users just looking for a communication solution it would be an attractive option over the more expensive (although much fuller featured) Grid 3. Alongside Snap + Core they have also brought out a “Pathways” companion app. This app is designed to guide parents, caregivers and communication partners in best practices for engaging Snap + Core First users. It supports the achievement of communication goals through video examples, lesson plans, an interactive goals grid for tracking progress, and a suite of supporting digital and printable materials.
A really useful resource which will help to empower parents and prove invaluable to those not lucky enough to have regular input from an SLT.
To sum things up: we had two great companies, both with outstanding products. I have recommended the combination of the Grid software and a Tobii eye tracker more times than I can remember. The hope is that Tobii can keep the Grid on track and incorporate the outstanding support and communication that was always an integral part of SmartBox’s operation. With the addition of their hardware expertise and recent research-driven progress in the area of AAC, there should be a lot to look forward to in the future.
It is easy to assume that a wheelchair can only be used for driving. However, wheelchair manufacturers have developed their products in recent years with the wider needs of the user in mind, such as the need to interact with a mobile phone, PC or even a TV. As well as the basic chair functions such as driving or controlling the actuators, these electronic systems can also enable control of a computer or portable devices, so the integration of environmental controls is possible on most power wheelchairs. The same controls that the user drives the power wheelchair with, typically a joystick, can also be used to control an appliance within their environment. Another benefit of integrating control of other devices within the wheelchair joystick is that it may help to ensure the user maintains a good posture while operating other devices.
For example, for chairs with R-net controls you can replace the old joystick with a CJSM2-BT, as seen in the picture here. This R-net joystick module has Infra-Red (IR) capabilities included. IR technology is widely used to remotely control household devices such as TVs, DVD players and multi-media systems, as well as some home-automation equipment. Individual IR commands can be learned from an appliance’s remote handset and stored in the CJSM2. Integrated Bluetooth technology is also an option, enabling control of computers, Android tablets, iPads, iPhones and other smart devices from a powered wheelchair. To switch between devices, the user simply navigates the menu and selects the device they wish to control. The R-net CJSM2 can easily replace the existing R-net joystick module, with no system re-configuration or programming required.
Although not all power wheelchairs can be fitted with Bluetooth mouse-enabled joysticks, there are some good alternatives that may still work. The BJoy ring is a sensor that can be fitted to most wheelchair joysticks where deflections of the joystick can be translated to mouse movements picked up on a Bluetooth mouse receiver placed on a tablet or PC.
The good: Users can do many daily tasks using one device
The not so good: This capability is only available on high spec wheelchair systems.
The verdict: Using a Bluetooth-enabled wheelchair joystick can help the user maintain a good posture while operating their other devices.
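For the curious, the core of what these joystick-to-mouse adapters do can be sketched in a few lines of Python. This is an illustrative model only (the function name and values are assumptions, not vendor code): deflection on each axis is mapped to cursor velocity, with a deadzone so the joystick’s neutral position doesn’t drift the pointer.

```python
# Hypothetical sketch of a joystick-to-mouse mapping like the one a
# Bluetooth adapter performs: normalised deflection (-1..1 per axis)
# becomes cursor velocity, with a deadzone around neutral.

def joystick_to_velocity(x, y, max_speed=600.0, deadzone=0.15):
    """Map joystick deflection to cursor velocity in pixels per second."""
    def axis(v):
        if abs(v) < deadzone:
            return 0.0  # inside the deadzone: no movement
        # rescale so speed ramps smoothly from zero just outside the deadzone
        sign = 1.0 if v > 0 else -1.0
        return sign * (abs(v) - deadzone) / (1.0 - deadzone) * max_speed
    return (axis(x), axis(y))
```

The deadzone matters on a wheelchair in particular, because the same joystick also drives the chair and a slightly off-centre rest position is common.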
Microsoft has been making huge strides in accessibility with each successive update to Windows, investing in updates that improve the user experience for people with disabilities. The improvements to the Ease of Access features include eye tracking, Narrator, low vision features, and reading and writing improvements.
Eye Control delivers exciting new updates and tools. For users who can’t use a mouse or keyboard to control their computer, Eye Control provides a convenient entry point to a Windows computer using eye-tracking technology. Having access to a computer via Eye Control gives individuals a way to communicate, the ability to stay in the workforce, and much more!
What began as a hack project during a One Week Hackathon, has become a product concept for the Windows team. Microsoft has introduced Eye Control, which empowers people with disabilities to use a compatible eye tracker, such as a Tobii Eye Tracker, to operate an on-screen mouse, keyboard, and text-to-speech in Windows 10 using only their eyes.
Microsoft Learning Tools
Microsoft Learning Tools are a set of features designed to make it easier for people with learning differences, like dyslexia, to read. In this update, a user can now simultaneously highlight and listen to text in web pages and PDF documents within Microsoft Edge, aiding reading and increasing focus.
Now, with the addition of the Immersive Reader functionality of Learning Tools, you can photograph a document, export it to Immersive Reader and immediately use the tools to support your understanding of the text.
Narrator will include the ability to use artificial intelligence to generate descriptions for images that lack alternative text. For websites or apps that don’t have alt-text built in, this feature will provide descriptions of an image. Narrator will now also include the ability to send commands from a keyboard, touch or braille display and get feedback about what the command does without invoking the command. Also, there will be some Braille improvements – Narrator users can type and read using different braille translations. Users can now perform braille input for application shortcuts and modifier keys.
Desktop Magnifier is also getting an option to smooth fonts and images, along with mouse wheel scrolling to zoom in and out. It is now possible to use Magnifier with Narrator, so you can zoom in on text and have it read aloud.
Dictation
This feature already allowed people to speak into their microphone and, using Windows Speech Recognition, convert their speech into text on the screen. With the Windows 10 update, a person can now use dictation to convert spoken words into text anywhere on their PC.
To start dictating, select a text field and press the Windows logo key + H to open the dictation toolbar. Then say whatever’s on your mind.
As well as dictating text, you can also use voice commands to do basic editing or to input punctuation. (English only)
Color filters
If it’s hard to see what’s on the screen, you can apply a color filter. Color filters change the color palette on the screen and can help you distinguish between things that differ only by color.
To change your color filter, select Start > Settings > Ease of Access > Color & high contrast. Under “Choose a filter”, select a color filter from the menu. Try each filter to see which one suits you best.
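Under the hood, a color filter is essentially a matrix applied to every pixel’s RGB value. The Python sketch below shows the idea using the standard Rec. 601 greyscale weights; note that the matrices Windows actually uses for its color-blindness filters are not reproduced here, so treat this purely as an illustration of the technique.

```python
# Illustrative colour filter: multiply each pixel's (r, g, b) by a 3x3
# matrix. This greyscale matrix uses the Rec. 601 luminance weights;
# real colour-blindness filters use different (unpublished here) matrices.

GRAYSCALE = [
    [0.299, 0.587, 0.114],
    [0.299, 0.587, 0.114],
    [0.299, 0.587, 0.114],
]

def apply_filter(rgb, matrix):
    """Apply a 3x3 colour matrix to one (r, g, b) pixel."""
    return tuple(
        round(sum(matrix[row][col] * rgb[col] for col in range(3)))
        for row in range(3)
    )
```

Because every output channel gets the same weighted sum, any input colour collapses to a shade of grey, which is exactly what the greyscale filter option does on screen.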
Just yesterday Microsoft announced what is possibly their biggest step forward in functionality within their Ease of Access accessibility settings since Windows 7. Eye Control is a built-in feature that facilitates access to the Windows 10 OS using the low-cost eyegaze peripheral, the Tobii Eye Tracker 4C. More about what you can actually do with Eye Control below, but first a little background on how this came about.
Former American football professional and MND (ALS) sufferer Steve Gleason challenged Microsoft in 2014 to help people affected by this degenerative condition through the advancement of eye tracking technology. This initial contact led to the development of a prototype eyegaze-controlled wheelchair, receiving lots of publicity and generating increased awareness in the process. However, it was never likely to progress to a product that would be available to other people in a similar situation. What this project did achieve was to pique the interest of some of the considerable talent within Microsoft in the input technology itself and its application, particularly for people with MND.
A combination of factors on both sides of the Atlantic has proved problematic when it comes to providing timely AT support to people diagnosed with MND. Eyegaze input is the only solution that allows successful computer access as the condition progresses, eye movement being the only ability left in the final stages of the illness. Historically, however, the cost of the technology meant that insurance, government funding or private fundraising was the only means by which people could pay for eyegaze equipment. Usually this resulted in a significant delay which, due to the often aggressive nature of MND, meant valuable time was lost and the solution often arrived too late. This situation was recognised by Julius Sweetland, who led the development of OptiKey, an open source computer access/AAC solution designed to work with low-cost eye trackers, back in 2015. Interestingly, some of the innovative features of OptiKey seem to have made it to Eye Control on Windows 10 (multi-key selection, called Shape Writing in Eye Control).