A hands-free mouse allows you to perform computer mouse
functions without using your hands. There are various options for hands-free
control of the mouse cursor on a computer screen, such as reflective dot trackers,
wearable sensors, speech recognition or even eye trackers. One other
possible group of devices is lip and chin joysticks.
These products are designed specifically for users with
physical disabilities. They are typically USB Plug and Play, which means they
will work with any computer platform that supports USB mice, including Windows,
Mac OS X, Linux, and Android. All can be customized using the built-in mouse
settings in the operating system, while some will also include setup software
for further customization.
To activate the mouse buttons, the IntegraMouse+, Jouse3,
and QuadJoy incorporate a sip/puff switch into the joystick, so that a sip
action clicks one mouse button and a puff action clicks the other. Other
devices use switches: the BJOY Chin has two circular switch pads, one on either
side of the joystick, which can be pressed using the chin or cheek, while the
TetraMouse has a second joystick devoted to button actions, right next
to the joystick for cursor control. Low-cost options are the Tobias mouse and the
FLipMouse. These are open-source hardware
and software projects with documented instructions on how to build and 3D
print them. The user moves the cursor by
using a mouthpiece. The right mouse button is operated by pushing the
mouthpiece towards the case, while the left mouse button is emulated by a sensor that
recognises when the user sucks air through it.
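As a rough illustration of the mouthpiece logic just described, the sketch below maps two sensor readings to emulated mouse buttons. The sensor names and threshold values are hypothetical and are not taken from the actual FLipMouse firmware.

```python
# Illustrative sketch only: mapping mouthpiece sensor readings to
# emulated mouse buttons. Thresholds and units are made up.

PUSH_THRESHOLD = 600      # force reading: mouthpiece pushed towards the case
AIRFLOW_THRESHOLD = 200   # airflow reading: user sucking air through the sensor

def mouse_buttons(push_force, airflow):
    """Return the set of emulated buttons for one sensor sample."""
    buttons = set()
    if push_force > PUSH_THRESHOLD:
        buttons.add("right")   # push towards the case -> right click
    if airflow > AIRFLOW_THRESHOLD:
        buttons.add("left")    # sucking air through the sensor -> left click
    return buttons
```

A real device would calibrate these thresholds per user and debounce the readings so a single action does not trigger repeated clicks.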
Some lip and chin joystick options to consider include the IntegraMouse+, Jouse3, QuadJoy, BJOY Chin, and TetraMouse.
I have always been a bit of a gamer. From Tetris on the original Game Boy to Sonic on the SEGA Mega Drive, I was always keen to pass the time rapidly instructing a cartoon character to bounce from one side of the screen to the other. Since I acquired my disability in 1999, though, I felt that large parts of this world were no longer accessible to me. With limited use of my arms and no use of my fingers, I felt consoles were out of the question. That changed recently when Xbox brought out their new accessible controller.
I had tried several different games on the PlayStation and the Xbox. My nephew had a PlayStation and I had been able to use the left stick and some of the buttons on the ordinary controller, but despite telling him not to use the trigger buttons, which were inaccessible to me, I still got hammered several times by him on FIFA.
This new accessible controller seemed as though it would provide me with the opportunity to have the full experience of console gaming again, but who is going to buy an Xbox One and an accessible controller just to see if they can use it or not? Thankfully Enable Ireland came to my rescue and allowed me to borrow their console and controller for a month.
Xbox Adaptive Controller (XAC)
The controller is simple to use and simple to set up. I needed some help to physically
plug some aids in and out of the controller, but apart from that it was a
straightforward process.
The controller is designed for people of all abilities. The variety of configurations
is as wide as the number of disabilities of the people it is geared towards.
For some games I used just the accessible controller, with the coloured plug-in
switches that Enable Ireland provided alongside the console.
For other, more complicated games, I used the Co-Pilot feature. Co-Pilot allows you
to use the ordinary controller as best you can while using the accessible
controller's switches for any bits or buttons on the ordinary controller that you
cannot manage yourself.
My setup for Forza, the car racing game, was the simplest of
all. I took 4 of the aid switches and plugged them into the accessible
controller, one was plugged into RT for the accelerator, one was plugged into
LT for the brake, and the remaining two were plugged into the left and right
of the d-pad. I placed the RT switch under my elbow to continuously accelerate, which
then meant my hands only had to focus on the three remaining buttons for
steering and braking. That was a huge success, and meant I did not need any
assistance throughout any of the gameplay on that particular game. Though that
does not mean I was a great driver!
For FIFA I used the Co-Pilot feature. I used the ordinary controller as I had done
previously with my nephew, steering my player with the left stick while
passing, tackling, shooting, etc with the usual A, B, X, and Y buttons.
I then used the Xbox Adaptive Controller for the sprint and switch-player options.
I simply plugged in the switches into the RT and LT ports on the accessible
controller and played normally on the ordinary controller while occasionally
tapping the switches to change player or holding them down
with my elbow to sprint.
A very successful and intelligent solution which resulted in a 5-1 victory for
me over my nephew! His face was a picture 🙂
Ryse, GTA & Battlefield
Each of these I played with a similar set up to FIFA (pictured above). I used the Co-Pilot feature, the ordinary controller in conjunction with the accessible controller with four switches plugged into the RT, LT, RB, and LB ports.
These games were a bit more intricate in their controls in
comparison to the others and a little more difficult to use as a result. The
accessible controller meant though that it was possible for me to at least give
it a go.
This control setup was good and meant that I
actually completed the story mode of Ryse, on easy.
I could play the vast majority of GTA and Battlefield without any difficulty,
but there were certain issues. To use the character’s “special abilities”
in GTA you had to press down on both the left and right sticks. I think you
could set that up but that would require two more switches which I didn’t have.
Also, on occasion, while I had all the right buttons the scenario in the game
was so complex that it involved pressing a number of buttons and steering at
least one, if not both, sticks at the same time. It was almost equivalent to
playing some musical instrument. On one mission I did have to fall back on some
assistance from my nephew.
While it is still not quite the same as gaming prior to my disability, the Xbox
Adaptive Controller has reopened the prospect of gaming properly on a regular
basis and owning a console of my own again. This was a world that I thought had
long left me behind but thanks to Microsoft and Xbox I’m
right back in the game!
Love it or hate it, the game of Minecraft has captured the imagination of over 100 million young and not-so-young people. It is available on multiple platforms: mobile devices (Pocket Edition), Raspberry Pi, computer, Xbox or PlayStation, and it looks and feels pretty much the same on all. For those of us old enough to remember, the blocky graphics will hold some level of nostalgia for the bygone 8-bit days, when mere blobs of colour and our imagination were enough to render Ghosts and Goblins vividly. This is almost certainly lost on the main cohort of Minecraft players, however, who would most probably be bored silly by the two-dimensional, repetitive and predictable video games of the 80s and early 90s.

The reason Minecraft is such a success is that it has blended its retro styling with modern gameplay and a (mind-bogglingly massive) open world where no two visits are the same and there is room for self-expression and creativity. This latter quality has led it to become the first video game to be embraced by mainstream education, being used as a tool for teaching everything from history to health, or empathy to economics. It is however the former quality, the modern gameplay, that we are here to talk about.

Unlike the aforementioned Ghosts and Goblins, Minecraft is played in a three-dimensional world using either the first-person perspective (you see through the character's eyes) or the third-person perspective (as if a camera were hovering above and slightly behind the character). While undoubtedly offering a more immersive and realistic experience, this means controlling the character and playing the game is also much more complex and requires a high level of dexterity in both hands to be successful. For people without the required level of dexterity this means that not only is there a risk of social exclusion, being unable to participate in an activity so popular among their peers, but also the possibility of being excluded within an educational context.
Fortunately UK-based charity Special Effect have recognised this need and are in the process of doing something about it. Special Effect are a charity dedicated to enabling those with access difficulties to play video games through custom access solutions. Since 2007 their interdisciplinary team of clinical and technical professionals (and of course gamers) have been responsible for a wide range of bespoke solutions based on individuals' unique abilities and requirements. Take a look at this page for some more information on the work they do and to see what a life-enhancing service they provide. The problem with this approach of course is reach, which is why their upcoming work on Minecraft is so exciting. Based on Optikey, the Open Source eyegaze AAC/computer access solution by developer Julius Sweetland, Special Effect are in the final stages of developing an on-screen Minecraft keyboard that will work with low-cost eye trackers like the Tobii Eye X and the Tracker 4C (€109 and €159 respectively).
The inventory keyboard
The main Minecraft on screen keyboard
Currently called 'Minekey', this solution will allow Minecraft to be played using a pointing device like a mouse or joystick, or even totally hands-free using an eyegaze device or headmouse. The availability of this application will ensure that Minecraft is now accessible to many of those who have previously been excluded. Special Effect were kind enough to let us trial a beta version of the software and, although I'm no Minecraft expert, it seemed to work great. The finished software will offer a choice of onscreen controls: one with smaller buttons and more functionality for expert eyegaze users (pictured above) and a more simplified version with larger targets. Bill Donegan, Projects Manager with Special Effect, told us they hope to have it completed and available to download for free by the end of the year. I'm sure this is news that will excite many people out there who had written off Minecraft as something just not possible for them. Keep an eye on Special Effect or ATandMe for updates on its release.
The FLipMouse (Finger- and Lip mouse) is a computer input device intended to offer an alternative for people with access difficulties that prevent them from using a regular mouse, keyboard or touchscreen. It is designed and supported by the Assistive Technology group at the UAS Technikum Wien (Department of Embedded Systems) and funded by the City of Vienna (ToRaDes project and AsTeRICS Academy project). The device itself consists of a low-force (requires minimal effort to operate) joystick that can be controlled with the lips, a finger or a toe. The lips are probably the preferred access method, as the FLipMouse also allows sip and puff input.
Sip and puff is an access method which is not as common in Europe as it is in the US; however, it is an ideal way to increase the functionality of a joystick controlled by lip movement. See the above link to learn more about sip and puff, but to give a brief explanation: it uses a sensor that monitors the air pressure in a tube. A threshold can be set (depending on the user's ability) for high pressure (puff) and low pressure (sip). Once this threshold is passed it can act as an input signal such as a mouse click, switch input or key press, among other things. The FLipMouse also has two jack inputs for standard ability switches, as well as infrared in (for learning commands) and out (for controlling a TV or other environmental controls). All these features alone make the FLipMouse stand out against similar solutions; however, that's not what makes the FLipMouse special.
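The sip/puff thresholding described above can be sketched as follows. The pressure units and threshold values are invented for illustration; a real device would calibrate them to the individual user's ability.

```python
# Hedged sketch of sip/puff classification: a pressure reading is
# compared against two calibrated thresholds around the ambient baseline.
# All numbers here are hypothetical.

AMBIENT = 1000          # baseline reading with no sip or puff
PUFF_THRESHOLD = 1050   # readings above this count as a puff
SIP_THRESHOLD = 950     # readings below this count as a sip

def classify(pressure):
    """Classify one pressure sample as 'puff', 'sip', or neither."""
    if pressure > PUFF_THRESHOLD:
        return "puff"    # e.g. emulate one mouse button
    if pressure < SIP_THRESHOLD:
        return "sip"     # e.g. emulate the other button
    return None          # within the deadband around ambient: no input
```

The deadband between the two thresholds is what stops normal breathing from registering as input.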
The FLipMouse is the first of a new kind of assistive technology (AT) solution, not because of what it does but because of how it's made. It is completely Open Source, which means that everything you need to make this solution yourself is freely available. The source code for the GUI (Graphical User Interface) used to configure the device, the code for the microcontroller (TeensyLC), the bill of materials listing all the components, and the design files for the enclosure are all available on their GitHub page. The quality of the documentation distinguishes it from previous Open Source AT devices. The IKEA-style assembly guide clearly outlines the steps required to put the device together, making the build not only as simple as some of the more advanced Lego kits available but also as enjoyable. That said, unlike Lego this project does require reasonable soldering skills and a steady hand; some parts are tricky enough to keep you interested. The process of constructing the device also gives much better insight into how it works, which is something that will undoubtedly come in handy should you need to troubleshoot problems at a later date. Although, as stated above, Asterics Academy provide a list of all components, a much better option in my opinion would be to purchase the construction kit, which contains everything you need to build your own FLipMouse, right down to the glue for the laser-cut enclosure, all neatly packed into a little box (pictured below). The kit costs €150 and all details are available from the FLipMouse page on the Asterics Academy site. Next week I will post some video demonstrations of the device and look at the GUI, which allows you to program the FLipMouse as a computer input device, accessible game controller or remote control.
I can’t overstate how important a development the FlipMouse could be to the future of Assistive Technology. Giving communities the ability to build and support complex AT solutions locally not only makes them more affordable but also strengthens the connection between those who have a greater requirement for technology in their daily life and those with the creativity, passion and in-depth knowledge of emerging technologies, the makers. Here’s hoping the FlipMouse is the first of many projects to take this approach.
As we approach the end of 2016 it's an appropriate time to look back and take stock of the year from an AT perspective. A lot happened in 2016, not all of it good. Socially, humanity seems to have regressed over the past year. Maybe this short-term, inward-looking protectionist sentiment has been brewing longer, but 2016 brought the opportunity to express it politically; you know the rest. While society steps and looks back, technology continues to leap and bound forward, and 2016 has seen massive progress in many areas, particularly those associated with Artificial Intelligence (AI) and Smart Homes. This is the first in a series of posts examining some technology trends of 2016 and how they affect the field of Assistive Technology. The links will become active as the posts are added. If I'm missing something, please add it to the comments section.
So although 2016 is unlikely to be looked on kindly by future historians (you know why), it has been a great year for Assistive Technology, perhaps one of promise rather than realisation. One major technology trend of 2016 missing from this series of posts is Virtual (or Augmented) Reality. While VR was everywhere this year, with products coming from Sony, Samsung, Oculus and Microsoft, its usefulness beyond gaming is only beginning to be explored (particularly within Education).
So what are the goals for next year? Well, harnessing some of these innovations in a way that makes them accessible and usable by people with disabilities at an affordable price. If in 2017 we can start putting some of this tech into the hands of those who stand to benefit most from its use, then next year will be even better.
One of the more dubious advantages of working in a long-running Assistive Technology service is access to an ever-growing supply of obsolete hardware. While much of it is worthless junk now, considering the technological progress in the field over the last 10 years, there are some real gems to be rediscovered. These were innovative solutions of their time, grounded in strong research, and while seemingly made obsolete by newer technology they actually still have much to offer. The LOMAK keyboard is certainly one of these, and as it is possibly the only piece of AT on permanent display at New York's Museum of Modern Art, I'm obviously not alone in thinking this.
The LOMAK (Light Operated Mouse And Keyboard) was invented by New Zealander Mike Watling and first came on the market in 2005 after a number of years of research. It allowed hands-free computer access through the innovative use of a laser pointer and light-sensitive keyboard and mouse controls. To make the light-sensitive keyboard and mouse (I'll call it an input device from here), Watling used an array of photoresistors, one for each keyboard key, mouse action and setting. This amounted to a whopping 122 photoresistors, and possibly the most electronically complex input device ever marketed. Although complex, the idea behind the LOMAK is quite straightforward. Photoresistors change their resistance depending on the amount of light they pick up. Once you figure out roughly how much shining a laser pen on the resistor changes its value, you have a good idea of where to set your threshold. You can then use the photoresistor as a straightforward momentary switch, like a keyboard key, that activates once the resistance goes above or below that threshold. If you are like me you will want to see inside this thing, so here it is (below); a thing of beauty, I'm sure you'll agree.
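The photoresistor-as-switch idea can be sketched in a few lines. On the actual LOMAK this logic lived in hardware, and on an Arduino it would be analogue pin reads; here readings are passed in directly, and the threshold is a made-up figure.

```python
# Sketch of the LOMAK concept: each photoresistor acts as a momentary
# switch that 'presses' while the laser raises its reading past a
# threshold. The threshold value is hypothetical.

LASER_THRESHOLD = 800   # reading above which we assume the laser is on the sensor

def key_pressed(reading, threshold=LASER_THRESHOLD):
    """One photoresistor = one key: pressed while the reading exceeds the threshold."""
    return reading > threshold

def scan(readings, keys):
    """Scan an array of photoresistor readings and return the currently active keys."""
    return [k for k, r in zip(keys, readings) if key_pressed(r)]
```

Scanning an array like this is how one laser pointer can drive over a hundred independent inputs: only the sensor the beam currently rests on crosses its threshold.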
So why aren't more people using LOMAK keyboards today? Well, eye tracking technology was just starting to become a realistic possibility for AT users, with devices like the Tobii P10 hitting the market. Eye tracking just made more sense for computer access: it allows a neater, more mobile solution and is a more direct input method. What has given the whole concept behind the LOMAK a new lease of life is the availability of cheap, user-friendly prototyping platforms like Arduino.
This was the basis of one of the project proposals we made available to the final-year students of the BSc (Honours) Creative Media Technologies course in IADT. Over the last few years Enable Ireland AT service have worked with IADT lecturer Conor Brennan to provide students with a selection of project briefs that both fit with their learning and skills while also fulfilling a need recognised through our work supporting AT users and professionals in the area. This particular brief was to create a MIDI interface based on the same concept as the LOMAK that would allow someone to perform and compose music using only head movements. There are solutions available that use eye tracking to achieve this, for example the fantastic EyeHarp, and more recently Ruud van der Wel of My Breath My Music released his Eye Play Music software. However these solutions all require a computer; we wanted something more in keeping with current trends in mainstream electronic music, which seems to be moving back to a more hardware-based performance. Thankfully a particularly talented student by the name of Rudolf Triebel took on the challenge of designing and building what we are now calling the MILO (Musical Interface using Laser Operation) (previously called LOMI, Light Operated MIDI Interface, which I think is much better..:). Rudolf exceeded our expectations and created the prototype you can see in the (badly filmed, sorry) video below. He has also created a tutorial, including wiring diagram, code and bill of materials, and put it up on Instructables to allow the project to be replicated and improved by others.
If you would like to see and maybe have a go on the MILO prototype (in its spanking new laser-cut enclosure), Conor Brennan of IADT will be showing and demonstrating it at the 25th EAN Conference, which takes place in University College Dublin from Sunday 29th to Tuesday 31st May.
Keep an eye on electroat.com where I hope to add a few more detailed posts on building, modifying and increasing the functionality of Rudolf’s design. I will also look into the possibility of using the same concept for building a hands free video game controller.
We in Enable Ireland Assistive Technology Training Service have long recognised the importance of gaming to many young and not-so-young assistive technology users. It's a difficult area for a number of reasons. Firstly, games (and we are talking about video games here) are designed to be challenging: if they are too easy they're not fun, but if too difficult the player will also lose interest. Successful games manage to get the balance just right. Of course, when it comes to physical dexterity as well as other skills required for gaming (strategy, spatial awareness, timing), this often involves game designers taking a one-size-fits-all approach which frequently doesn't include people with physical, sensory or cognitive difficulties. There are two methods of getting around this which, when taken together, ensure a game can be accessed and enjoyed by a much broader range of people: difficulty levels (not a new concept) and accessibility features (sometimes called assists). Difficulty levels are self-explanatory and have been a feature of good games for decades. Accessibility features might include the ability to remap buttons (useful for one-handed gamers), automate certain controls, subtitles, high contrast and magnification.
Another challenge faced when creating an alternative access solution to allow someone to successfully play a video game is that you need a pretty good understanding of the activity, that is, how to play the game. This is where we often have difficulty, and I'd imagine other non-specialist services (general AT services rather than game accessibility specialists like SpecialEffect or OneSwitch.org.uk) also run into problems. We simply do not have the time required to familiarise ourselves with the games or keep up to date with new releases (which would allow us to better match a person with an appropriate game for their range of ability). We try to compensate for this by enlisting the help of volunteers (often from Enable Ireland's IT department, with whom we share office space), interns and transition year students. It's often the younger transition year students who bring us some of the best suggestions, and last week was no exception. After we demonstrated some eyegaze technology to Patrick, a transition year student visiting from Ardscoil Ris, Dublin 9, he suggested we take a look at a browser-based game called agar.io. I implore you not to click that link if you have work to get done today. This game is equal parts addictive and infuriating, but in terms of playability and simplicity it's also very accessible, with simple controls and a clear objective. Using your mouse you control a little (at first) coloured circular blob; think of it as a cell, and the aim of the game is to eat other little coloured cells and grow. The fun part is that other players from every corner of the globe are also controlling cells and growing; if they are bigger than you they move a little slower, but they can eat you! Apart from the mouse there are two other buttons: the spacebar allows you to split your cell (can be used as an aggressive or defensive strategy) and the "W" key allows you to shed some weight.
We set up the game to be played with a Tobii EyeX (€119) and IRIS software (€99). IRIS allows you to emulate mouse actions with your eyes and set up two onscreen buttons (called interactors) that can also be activated using your gaze; the video below should make this clearer.
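To give a feel for how such interactors work, here is a hypothetical sketch: a gaze point landing on a defined screen region is translated into one of agar.io's two key commands. The region coordinates and data layout are invented for illustration; they are not IRIS's actual API.

```python
# Hypothetical sketch of onscreen interactors for agar.io: dwelling the
# gaze on a region triggers the key that region represents. Coordinates
# are (left, top, right, bottom) in pixels and are made up.

INTERACTORS = [
    {"name": "split", "region": (0, 0, 200, 100), "key": "space"},  # split your cell
    {"name": "eject", "region": (0, 100, 200, 200), "key": "w"},    # shed some weight
]

def key_for_gaze(x, y, interactors=INTERACTORS):
    """Return the key to emulate when the gaze point lands on an interactor."""
    for it in interactors:
        left, top, right, bottom = it["region"]
        if left <= x < right and top <= y < bottom:
            return it["key"]
    return None   # elsewhere on screen the gaze simply moves the mouse cursor
```

Placing the interactors at the screen edges, as Patrick suggests below, keeps them out of the play area while still within a quick glance.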
Big thanks to Patrick for suggesting we take a look at agar.io and for helping us set it up for eyegaze control. I'll leave the final words to him. "I found playing agar.io with gaze software really fun. I think you have just as much control with your eyes as with your mouse. If an interactor was placed in the corner of the screen to perform the function of the spacebar (splits the cell in half) it would be beneficial. I believe it would be a very entertaining game for people who can only control their eyes, not their arms."
Finding accessible toys may at first seem a difficult task. However, there are various options, from toys that are switch adapted to toys that are accessible by the nature of their design. The following information has been prepared by Enable Ireland's National Assistive Technology Service to show some of the options and resources that you might want to consider.
The toys shown are not necessarily recommendations but simply a selection of items which may be of interest, particularly at times such as Christmas and birthdays, when presents are high on the list of priorities.
Game Accessibility Guidelines is a web resource created to help developers create games which are more inclusive of disabled gamers. Although launched back in 2012 through the collaboration of a group of game developers, accessibility specialists and academics, it has recently received an award for promoting accessibility for persons with disabilities. The web resource invites constant feedback from gamers and developers in order to keep the guidelines inclusive and up to date.
Examples of how to make games more inclusive include giving gamers a choice of difficulty level, letting them configure which button does what on the controller, and providing support for assistive technologies such as switches and screen readers.
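As a toy illustration of one of these options, button remapping, the sketch below swaps the actions assigned to two buttons, e.g. so a one-handed player can move frequently used actions to buttons they can reach. The button names and default layout are made up for the example.

```python
# Minimal sketch of controller button remapping. The layout below is a
# hypothetical default; 'swaps' lists pairs of buttons to exchange.

DEFAULT_LAYOUT = {"A": "jump", "B": "crouch", "X": "reload", "Y": "interact"}

def remap(layout, swaps):
    """Return a new layout with each given pair of buttons swapped,
    leaving the original layout untouched."""
    new_layout = dict(layout)
    for a, b in swaps:
        new_layout[a], new_layout[b] = layout[b], layout[a]
    return new_layout
```

Returning a fresh dictionary rather than mutating the default means the player can always reset to the standard layout.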
Making games more inclusive is important. It can be a great contributor to the quality of life for a person with a disability as well as making economic sense within a growing industry.