Today, May 18th, is Global Accessibility Awareness Day, and to mark the occasion Apple have produced a series of 7 videos (also available with audio description) highlighting how their products are being used in innovative ways by people with disabilities. All the videos are available in a playlist here, and I guarantee you, if you haven’t seen them and you are interested in accessibility and AT, they’ll be the best 15 minutes you spend today! Okay, the cynical among you will point out that this is self-promotion by Apple, a marketing exercise. On one level of course it is: they are a company, and like any company their very existence depends on generating profit for their shareholders. These videos promote more than Apple, however; they promote independence, creativity and inclusion through technology. Viewed in this light, they illustrate to people with disabilities how far technology has moved on in recent years and make them aware of the potential benefits to their own lives. Hopefully the knock-on effect of this increased awareness will be increased demand. Demand these technologies, people; it’s your right!
As far as a favorite video from this series goes, everyone will have their own. In terms of the technology on show, the video featuring Todd “The Quadfather” (below) was possibly the most interesting to me.
This video showcases Apple’s HomeKit range of connected home products and how they can be integrated with Siri.
My overall favorite video, however, is Patrick: musician, DJ and cooking enthusiast. Patrick’s video is an ode to independence and creativity. The technologies he illustrates are Logic Pro (Digital Audio Workstation software) with VoiceOver (Apple’s built-in screen reader) and the object recognition app TapTapSee, which, although it has been around for several years now, is still an amazing use of technology. It’s Patrick’s personality that makes the video though; this guy is going places, and I wouldn’t be surprised if he had his own prime-time TV show this time next year.
This conference may be of interest. Aimed at disability professionals involved in post-education and the workplace, ATEC is a one-day event that allows you to listen to and meet with experts, solution providers, and other like-minded people.
40 organizations from the world of assistive technology will be attending, along with 150 professionals from the workplace and post-education.
Choose to attend 4 seminars from a range of 16, and gain a CPD certificate for each session attended.
Lunchtime speed dating sessions – 3 back-to-back 15-minute presentations.
Two parallel panel debate sessions covering the themes of Education and the Workplace.
Network with disability professionals from post education and the workplace.
The FLipMouse (Finger- and Lip mouse) is a computer input device intended to offer an alternative for people with access difficulties that prevent them from using a regular mouse, keyboard or touchscreen. It is designed and supported by the Assistive Technology group at the UAS Technikum Wien (Department of Embedded Systems) and funded by the City of Vienna (ToRaDes project and AsTeRICS Academy project). The device itself consists of a low-force (requires minimal effort to operate) joystick that can be controlled with the lips, a finger or a toe. The lips are probably the preferred access method, as the FLipMouse also allows sip and puff input.
Sip and puff is an access method that is not as common in Europe as it is in the US; however, it is an ideal way to increase the functionality of a joystick controlled by lip movement. See the above link to learn more about sip and puff, but to give a brief explanation: a sensor monitors the air pressure in a tube, and thresholds can be set (depending on the user’s ability) for high pressure (puff) and low pressure (sip). Once a threshold is passed it can act as an input signal: a mouse click, switch input or key press, among other things. The FLipMouse also has two jack inputs for standard ability switches, as well as infrared in (for learning commands) and out (for controlling a TV or other environmental controls). These features alone make the FLipMouse stand out against similar solutions; however, that’s not what makes it special.
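To make the threshold idea concrete, here is a minimal sketch of how sip/puff detection might work in software. The sensor values, thresholds and event names are purely illustrative assumptions, not taken from the actual FLipMouse firmware:

```python
# Illustrative sip/puff threshold detection. All values are
# hypothetical: a real device would calibrate them per user.

NEUTRAL = 512          # sensor reading at rest (mid-scale)
PUFF_THRESHOLD = 650   # readings above this count as a "puff"
SIP_THRESHOLD = 380    # readings below this count as a "sip"

def classify(reading):
    """Map a raw pressure reading to an input event, or None
    if the reading stays within the neutral dead zone."""
    if reading >= PUFF_THRESHOLD:
        return "puff"   # could be mapped to e.g. a left click
    if reading <= SIP_THRESHOLD:
        return "sip"    # could be mapped to e.g. a switch press
    return None

# Example: classifying a short stream of sensor readings
events = [classify(r) for r in [512, 700, 510, 300, 515]]
# events == [None, "puff", None, "sip", None]
```

The dead zone between the two thresholds is what lets the user breathe normally without triggering input; widening or narrowing it is how a solution like this is tuned to an individual’s ability.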
The FLipMouse is the first of a new kind of assistive technology (AT) solution, not because of what it does but because of how it’s made. It is completely Open Source, which means that everything you need to make this solution yourself is freely available. The source code for the GUI (Graphical User Interface) used to configure the device, the code for the microcontroller (Teensy LC), a bill of materials listing all the components and the design files for the enclosure are all available on their GitHub page. The quality of the documentation distinguishes it from previous Open Source AT devices. The IKEA-style assembly guide clearly outlines the steps required to put the device together, making the build not only as simple as some of the more advanced Lego kits available but also as enjoyable. That said, unlike Lego this project does require reasonable soldering skills and a steady hand; some parts are tricky enough to keep you interested. The process of constructing the device also gives much better insight into how it works, which will undoubtedly come in handy should you need to troubleshoot problems at a later date. Although, as stated above, the AsTeRICS Academy provide a list of all components, a much better option in my opinion would be to purchase the construction kit, which contains everything you need to build your own FLipMouse, right down to the glue for the laser-cut enclosure, all neatly packed into a little box (pictured below). The kit costs €150 and all details are available from the FLipMouse page on the AsTeRICS Academy site. Next week I will post some video demonstrations of the device and look at the GUI, which allows you to program the FLipMouse as a computer input device, accessible game controller or remote control.
I can’t overstate how important a development the FLipMouse could be to the future of Assistive Technology. Giving communities the ability to build and support complex AT solutions locally not only makes them more affordable but also strengthens the connection between those who have a greater requirement for technology in their daily lives and those with the creativity, passion and in-depth knowledge of emerging technologies: the makers. Here’s hoping the FLipMouse is the first of many projects to take this approach.
To use technology effectively, it needs to be in an optimal position. Whether it’s a computer display, a tablet, or even the chair we sit on, the position of the items we use matters for ease of use and comfort. For someone with a physical disability this is even more important, as their ability to reach, grasp, hold or interact with physical objects may be limited. Mounting can improve both the view of a device and the ability to manipulate its controls.
There is now a range of mounting solutions available from mounting arms to modular mounting kits.
When mounting assistive technology we need to consider a number of things to ensure the technology can be used effectively in a range of environments and contexts, meeting the lifestyle needs of the user.
Some very useful documentation is now available. It is designed for service providers and others involved with attaching one piece of assistive technology, such as a communication device, to another, such as a wheelchair, and it helps ensure all relevant aspects have been considered so that the best solution is reached.
This documentation, the Mounting Assistive Technology Documentation (MAT-DOC), is available for general use at http://mat-doc.org/
MAT-DOC also includes Best Practice Guidelines, developed by a team of people who are all actively involved with mounting assistive technology.
These guidelines are intended to promote and facilitate independence and participation, not to act as a mechanism for finding barriers to the provision of equipment.
EyeTech Digital Systems has partnered with Quantum Rehab to bring eye-controlled wheelchairs to individuals who are unable to use hand controls. EyeTech’s eye tracking technology mounts directly to a tablet PC and allows the user to control the entire computer using eye movements. The system then mounts to the wheelchair. An eye control driving app gives the user the ability to drive hands-free: the driving controls are overlaid on the scene camera, and simply looking at a control activates the basic directions and movements of the chair.
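“Simply looking at a control activates it” is usually implemented with a dwell time: the command fires only after the gaze has rested on the same on-screen region for a set duration, so stray glances don’t move the chair. The sketch below shows the general idea; the region names, timing and class are my own illustration, not EyeTech’s actual API:

```python
# Illustrative dwell-activation logic: a control fires only after
# gaze has rested on it continuously for the dwell time.
# Region names and timings are hypothetical.

class DwellSelector:
    def __init__(self, dwell_time=1.0):
        self.dwell_time = dwell_time  # seconds of continuous gaze
        self.current = None           # region currently gazed at
        self.start = None             # when gaze entered that region

    def update(self, region, now):
        """Feed the region under the gaze point at time `now`.
        Returns the region name when its dwell time is reached,
        else None."""
        if region != self.current:
            # Gaze moved to a new region (or off all regions):
            # restart the dwell timer.
            self.current, self.start = region, now
            return None
        if region is not None and now - self.start >= self.dwell_time:
            self.start = now  # reset so the command repeats rather than spams
            return region
        return None

# Example: gaze rests on "forward" for one second, then looks away.
selector = DwellSelector(dwell_time=1.0)
for t, region in [(0.0, "forward"), (0.5, "forward"),
                  (1.0, "forward"), (1.1, None)]:
    fired = selector.update(region, t)
    if fired:
        print(fired)  # prints "forward" once, at t=1.0
```

A safety-critical system like a wheelchair would of course add much more on top of this (confirmation, an emergency stop, loss-of-gaze handling), but the dwell timer is the core interaction.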
Quantum Rehab® products include a range of rehab mobility technologies such as the Q6 Edge® 2.0 and the Quantum Series of power bases: www.QuantumRehab.com
EyeTech Digital Systems products include eye tracking technology for a variety of markets such as medical, transportation, entertainment, and augmentative communication: http://www.eyetechds.com/
Tobii Gaming, the division of Swedish technology firm Tobii responsible for mainstream (and therefore low-cost) eye trackers, have released a new peripheral called the Tracker 4C (pictured below).
Before getting into the details of this new device, I first want to highlight that although this eye tracker can be used as a computer access solution for someone with a disability (it already works with Optikey and Project IRIS), it is not being marketed as such. What this means in practice is that it may not provide the reliability of Tobii’s much costlier Assistive Technology (AT) eye trackers, such as the Tobii PC Eye Mini. So if eye tracking is your only means of communication or computer access and you have the funds, I would recommend spending that extra money. That said, many people don’t have the funds, or perhaps they have other more robust means of computer access and just want to use eye tracking for specific tasks like creating music or gaming. For those people the Tracker 4C is really good news, as it packs a lot into the €159 price tag and overcomes many of the weaknesses of its predecessor, the Tobii Eye X. The big improvement over the Eye X is the inclusion of the EyeChip. The EyeChip, previously only included in Tobii’s much more expensive range of AT eye trackers, takes care of most of the data processing before sending it on to the computer. The result is that much less data is passed from the eye tracker to the computer (100 KB/s compared to 20 MB/s) and the CPU (Central Processing Unit) load is much lower (1% compared to 10%). This allows it to work over an older USB 2 connection and means most computers (even budget ones) should have no problem running this device, unlike the Eye X, which required a high-end PC.
All this must have come at some compromise in performance, right? Wrong. The Tracker 4C actually beats the Eye X in almost every category. Frequency has risen from 70Hz to 90Hz, a slightly longer operating distance of 0.95m is possible, and the maximum screen size has increased by 3” to 30”. This last stat could be the deciding factor that convinces Tobii PC Eye Mini users to buy the Tracker 4C as a secondary device, as the Mini only works with a maximum screen size of 19”. The Tracker 4C also offers head tracking, but as I haven’t tested the device I’m unsure of how this works or whether it is something that could be utilised as AT. Watch this space: the Tracker 4C is on our shopping list and I’ll post an update as soon as we get to test whether it’s as impressive in real life as it seems on paper.
The table below compares specs for all Tobii’s current range of consumer eye trackers. In some areas where information was not available I have added a question mark and if appropriate a speculation. I am open to correction.
Last Friday (February 17th) New Scientist published an article about a new app in development at Microsoft called GazeSpeak. Due to be released in the coming months on iOS, GazeSpeak aims to facilitate communication between a person with MND (known as ALS in the US; I will use both terms interchangeably) and another individual, perhaps their partner, carer or friend. Developed by Microsoft intern Xiaoyi Zhang, GazeSpeak differs from traditional approaches in a number of ways. Before getting into the details, however, it’s worth looking at the background: GazeSpeak didn’t come from nowhere. It’s actually one of the products of some heavyweight research into Augmentative and Alternative Communication (AAC) that has been taking place at Microsoft over the last few years. Since 2013, inspired by football legend and ALS sufferer Steve Gleason (read more here), Microsoft researchers and developers have brought the weight of their considerable collective intellect to bear on increasing the ease and efficiency of communication for people with MND.
This is an entirely new approach to increasing the efficiency of AAC, and one that, I suggest, could only have come from a large mainstream tech organisation with over thirty years’ experience facilitating communication and collaboration.
Another Microsoft research paper published last year (with some of the same authors as the previous paper), called “Exploring the Design Space of AAC Awareness Displays”, looks at the importance of a communication partner’s “awareness of the subtle, social, and contextual cues that are necessary for people to naturally communicate in person”. Their research focused on creating a display that would allow the person with ALS to express things like humor, frustration and affection: emotions difficult to express with text alone. Yes, they proposed the use of Emoji, a proven and effective way a similar difficulty is overcome in remote or non-face-to-face interactions, but they went much further and also looked at solutions like avatars, skins and even coloured LED arrays. This, like the paper above, is academic and as such not an easy read, but the ideas and solutions being proposed by these researchers are practical and will hopefully filter through to end users of future AAC solutions.
That brings us back to GazeSpeak, the first fruit of the Microsoft/Steve Gleason partnership to reach the general public. Like the AACrobat solution outlined above, GazeSpeak gives the communication partner a tool rather than focusing on tech for the person with MND. As the image below illustrates, the communication partner has GazeSpeak installed on their phone and, with the app running, holds the device up to the person with MND as if photographing them. A sticker with four grids of letters is placed on the back of the smartphone, facing the speaker. The app then tracks the person’s eyes: up, down, left or right, where each direction means the letter being selected is contained in the grid in that direction (see photo below).
Similar to how the old T9 predictive text worked, GazeSpeak selects the appropriate letter group with each eye movement and predicts the word based on the most common English words. So the app is using AI both to track the eyes (machine vision) and to make the word prediction. In the New Scientist article they mention that the user will be able to add their own commonly used words and people/place names, which one assumes would prioritize them within the prediction list. In the future, perhaps some capacity for learning could be added to further increase efficiency. After using this system for a while the speaker may not even need to see the sticker with letters; they could write words from muscle memory. At that stage a simple QR code leading to the app download would allow them to communicate with complete strangers using just their eyes and no personal technology.
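The T9-style disambiguation described above can be sketched in a few lines: each eye direction picks a group of letters, and candidate words are those whose letters fall in the chosen groups, in order. The letter groupings and the tiny word list below are my own illustration; GazeSpeak’s actual groups and dictionary will differ:

```python
# Illustrative GazeSpeak-style prediction: each eye direction
# selects one of four letter groups, and candidates are matched
# against a frequency-ordered word list. Groupings are hypothetical.

GROUPS = {
    "up":    set("abcdef"),
    "down":  set("ghijkl"),
    "left":  set("mnopqr"),
    "right": set("stuvwxyz"),
}

# A tiny stand-in for a frequency-ordered English word list.
WORD_LIST = ["the", "to", "hello", "help", "water", "yes", "no"]

def matches(word, directions):
    """True if the word's letters fall in the chosen groups, in order."""
    return len(word) == len(directions) and all(
        ch in GROUPS[d] for ch, d in zip(word, directions)
    )

def predict(directions):
    """Return candidate words for a sequence of eye directions,
    in frequency order."""
    return [w for w in WORD_LIST if matches(w, directions)]

# "help" would be entered as: h -> down, e -> up, l -> down, p -> left
print(predict(["down", "up", "down", "left"]))  # ['help']
```

With a full dictionary one direction sequence usually matches several words, which is why frequency ranking, and the personal word list the article mentions, matter so much for speed.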
Here in Enable Ireland AT service we have been investigating using the Office Mix plugin for PowerPoint to create more engaging and accessible eLearning content. We are still at the early stages and haven’t done any thorough user testing yet, but so far it shows some real promise.
From the end user’s perspective it offers a number of advantages over the standard YouTube-style hosted video. Each slide is marked out, allowing the user to easily skip forward or back to different sections. So you can skip ahead if you are comfortable with a particular area of the presentation or, more importantly, revisit parts that may not have been clear. The table of contents button makes this even easier by expanding thumbnail views of all the slides, which link directly to the relevant sections of the video. There is also the ability to speed up or slow down the narration. Apart from the obvious comic value, this is actually a very useful accessibility feature for people who may be watching a presentation made in a language not native to them, or by someone with a strong regional accent. On the flip side it’s also a good way to save time: the equivalent of speed reading.
From the content creator’s perspective it is extremely user friendly. Most of us are already familiar with PowerPoint, and these additional tools sit comfortably within that application. You can easily record your microphone or camera and add the result to a presentation you may have already created. Another feature is “Inking”: the ability to write on slides and highlight areas with different colour inks. You can also add live web pages, YouTube videos (although this feature did not work in my test), questions and polls. Finally, the analytics will give you very good insight into which areas of your presentation might need more clarification, as you can see if someone chooses to look at a slide a number of times. You can also see if slides were skipped or questions answered incorrectly.
Below is a nice post outlining some ways to create inclusive content using Office Mix and Sway, Microsoft’s other new(ish) web-based presentation platform. Below that is a much more detailed introduction to Office Mix, using… yes, you guessed it, Office Mix.
It’s hard to beat the quality of mounting equipment offered by specialist suppliers such as Daessy or Rehadapt, or even mainstream suppliers like RAM Mounts. These mounting systems are designed to keep your hardware safe, are made to last and look good.
However, these mounting solutions also tend to be expensive and may be well beyond the budget of a user who just requires a second mount to take here and there with them.
There are many options for low-cost mounts that still provide the function of holding your phone or tablet so you can use it effectively.
Many low-cost mounts can be found on Amazon and eBay, or even bought from supermarket chains such as Aldi or Lidl, so it’s worth keeping an eye out, as some of these products sell for as little as a few euro.
An example is a mount recently bought from Lidl for €4. It comprises a spring-loaded cradle, a gooseneck and a spring-loaded clamp. Although it will not take excessive pressure, it works quite well for holding a phone at eye level for light touchscreen use.
Sharon’s Shortcuts is a new educational resource for people who primarily use keyboard shortcuts to access a computer. The site contains different sections covering common tasks carried out using a PC. All the keyboard shortcuts mentioned in this site are standard, Windows shortcuts that anyone can use.
While it’s easy to find plenty of mouse-based tutorials and step-by-step instructions for using a PC, this unique website gives step-by-step instructions for using a PC without the mouse, making it a useful resource for screen reader users.
Sharon has over 10 years’ experience supporting people with a vision impairment and also provides one-to-one tutoring sessions for specific IT skills, getting to grips with work-based systems, or a program of study towards a qualification like ECDL.