AT for Creative Expression and Leisure

As we all figure out how best to cope with the Covid-19 pandemic and the social distancing that comes with it, we thought that many of you might be interested in learning about Assistive Technologies for Creative Expression and Leisure: Music, Photography and Gaming. Some of these may come in very handy as we all try to stay connected with one another during these trying times.

We are making our AT for Creative Expression and Leisure courses free for everyone to access over the next few months. These 4 short courses look at some ways that technology can assist people with disabilities in engaging in creative pursuits and leisure activities. We have included the Introduction course below. This should be of interest to everybody and helps frame the subsequent content. The remaining 3 courses are available on our Learning Portal at enableirelandAT.ie.

You will need to create an account to access these courses but once you have your account you can self-enrol for free. Creating an account is easy. All you need is access to email to confirm your account. There is a video at the bottom of this post which will guide you through the process of creating an account. You don’t need to look at the second half of the video as these courses do not require an enrolment key.

Please let us know how you get on, and feel free to post your queries and comments at the bottom of this page. We’d love to hear what your own experiences are, and if there is content that you think we should add to these courses.

Introduction 

Below we have embedded the Introduction course. It’s too small to use as is, but you can make it full screen by clicking the third blue button from the left at the bottom, or click here to open it in a new tab/window.

We hope that after completing this short introduction you are inspired to learn more. If so there are links to the other 3 courses below and also the video showing you how to create your account on our Learning Portal.

Art & Photography

Abstract painting with blue as the dominant colour and distinct brush or palette knife strokes.

In this short course we suggest some technologies that will enable people with disabilities to access, engage with and create art through media like painting or drawing, photography, video or animation.

Enrol in Art & Photography

Leisure & Gaming

Two children using an Apple PowerBook; the boy has his hands in the air, smiling and celebrating.

Leisure and gaming can sometimes be overlooked when considering the needs of an individual, but they can be an important part of a young person’s development and help enable inclusion in society. This module looks at how we can make leisure time and gaming more inclusive for a wide range of abilities. There are now many options for accessible toys, game consoles and switch adapted toys. The module covers a sample of these options with some suggested links for further reading.

Enrol in the Leisure & Gaming Course

Music: Listen, Create, Share

Screenshot of the EyeHarp eyegaze music software: a clock-like radial interface with the user’s eyes shown in a letterbox image at the centre.

Music is an accessible means of creative expression for all abilities. Even the act of passively listening to music engages the brain in the creative process. In this short course we will look at some mainstream and specialist hardware and software that can help facilitate creative musical expression.

Enrol in the Music: Listen, Create, Share Course

Creating an account on enableirelandAT.ie

Eyegaze for Musical Expression

Background – What is eyegaze?

Eyegaze is an alternative way of accessing a computer, using eye movements to control the mouse. It is achieved through a combination of hardware and software. The hardware is a USB peripheral called an eye tracker, which is positioned underneath the computer monitor and contains a camera and infrared lights. The user is positioned between 500 and 1000 mm from the monitor (600 mm is usually about right), where the camera has a clear view of their eyes. The infrared lights highlight the user’s pupils (think of red eye in photographs where a flash has been used) and create reflections on the user’s eyeballs. After a calibration process, where the user looks at a dot moving around the screen, the software can accurately tell where the user is looking based on the reflections and the movements of the pupils.

For computer access the user will also need some method of clicking. There are three methods usually used. Dwell is the most common: the click is automated, so if the user holds their gaze (dwells) on a button or icon for more than a specified time, usually somewhere from 0.5 to 1.5 seconds, a click is sent. A slight variation of this is used in some software designed specifically for eyegaze, where the button itself is activated after the user dwells on it; the main difference is that this second method offers the ability to set different dwell times for different buttons. The other input methods are less common: one is to use an external switch as a mouse click, the other is to use a deliberate blink (longer than a normal blink, to prevent accidental clicks) as a mouse click.
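For anyone curious about what dwell clicking involves under the hood, here is a minimal sketch of the logic. The get_gaze and send_click callbacks are assumptions standing in for whatever an eye tracker SDK and operating system provide; this is an illustration only, not any vendor’s actual implementation.

```python
# Minimal sketch of dwell-click logic. get_gaze() is assumed to return the
# current (x, y) gaze point in screen pixels; send_click(x, y) performs a click.
import math
import time

DWELL_TIME = 1.0      # seconds the gaze must stay still before a click fires
DWELL_RADIUS = 40     # pixels of jitter tolerated around the dwell point

def dwell_loop(get_gaze, send_click):
    anchor = None         # where the current dwell started
    dwell_start = None    # when the current dwell started
    while True:
        x, y = get_gaze()
        if anchor is None or math.hypot(x - anchor[0], y - anchor[1]) > DWELL_RADIUS:
            # Gaze has moved to a new area: restart the dwell timer there.
            anchor = (x, y)
            dwell_start = time.time()
        elif time.time() - dwell_start >= DWELL_TIME:
            # Gaze has been held within the radius long enough: click once.
            send_click(*anchor)
            anchor, dwell_start = None, None   # require a fresh dwell before the next click
        time.sleep(0.02)   # poll at roughly 50 Hz
```

You can see from the sketch why a short dwell time speeds things up but also makes accidental clicks more likely: the timer only needs the gaze to sit still for that long, wherever it happens to be resting.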

Eye Tracker Devices

  • Tobii Tracker 4C https://gaming.tobii.com/tobii-eye-tracker-4c/ – This is a great option for those wanting to use eyegaze for activities like music and gaming but who have other AT as their main access method. It is every bit as good as the two much more expensive “AT” eye trackers below and costs in the region of €170.
  • Tobii PC Eye Plus and Mini https://www.tobiidynavox.com/products/devices/ – The PC Eye Mini and PC Eye Plus are probably the most popular AT eye trackers. The Mini will work well on a monitor up to 19”, while the Plus works on screens up to 28” and also contains a high-quality microphone array to support speech recognition, as well as a switch input port.
  • EyeTech TM5 https://eyetechds.com/eye-tracking-products/tm5-mini-eye-tracker/ – The EyeTech TM5 is quite similar to the Tobii PC Eye Mini. One key difference that might influence the choice of this eye tracker is that it supports a slightly closer user position.

Challenges associated with playing music using eye movement

There are a number of difficulties we might encounter when playing music using eye movements, but all can be overcome with practice and by using some common music production tools and techniques. Eyegaze as an input method is quite restrictive. You only have one point of direct access, so you can think of it like playing a piano with one finger. To compound this difficulty, and to extend the piano analogy, because your eyes are also your input you cannot queue up your next note the way a one-fingered piano player might. Eyegaze in itself is just eye pointing; using it as an access method requires some form of click, either a switch or a dwell (an automatic click after a specific time duration, usually somewhere from 0.5 to 1.5 seconds). If you are using dwell for input, this adds a layer of difficulty when it comes to timing. You could set the dwell to be really fast (like 0.1 seconds) but you may then run into accidental activations, for example playing a note as you pass over it on the way to your intended note.

Some of the specialist eyegaze software instruments like EyeHarp, EyePlayMusic and ii-music overcome this by using a circular, clock-style interface. This allows them to set the onscreen buttons to instant activation, and because of the radial layout each note can be directly accessed from the centre without passing over another note. With the radial design, if our eyes are in a central position all notes are an equal distance from us and can be accessed in the most efficient way, but we are still left with the “one-finger piano” restriction. This means no chords and only the option of playing at a slower tempo.

Using mainstream music production techniques like sequencers, arpeggiators or chord mode can overcome this limitation and allow us to create much more complex music using eyegaze. A sequencer allows you to pre-program accompanying notes with which to play along. An arpeggio is sometimes referred to as a broken chord: it is the notes of a chord played consecutively rather than simultaneously. Arpeggios are used a lot in electronic music; by playing arpeggios the slower input is offset by the additional life and movement the arpeggio provides. Chord mode is something that can be set up in many digital audio workstations: you can map one note to automatically play the accompanying notes required to make it a chord. Live looping could also be used. In looping we would record a section being played live, then loop it back and play other notes over it. Other effects, like delay, reverb and many more besides, will also allow us to make interesting music.
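As a rough illustration of the chord mode and arpeggiator ideas described above, here is a minimal sketch. It assumes a send_note() helper (for example one built on a MIDI library such as mido) and is not tied to any particular DAW; note numbers are standard MIDI pitches.

```python
# Sketch of "chord mode" and a simple arpeggiator, assuming a non-blocking
# send_note(note, velocity, duration) helper is provided elsewhere.
import time

# Chord mode: one incoming note is expanded to a major triad (root, +4, +7),
# sent back-to-back so the notes sound together.
def play_chord(send_note, root, velocity=100, duration=0.5):
    for interval in (0, 4, 7):
        send_note(root + interval, velocity, duration)

# Arpeggiator: the same triad played as consecutive notes (a "broken chord"),
# adding movement without needing fast eye input.
def play_arpeggio(send_note, root, velocity=100, note_length=0.15):
    for interval in (0, 4, 7, 12):        # root, third, fifth, octave
        send_note(root + interval, velocity, note_length)
        time.sleep(note_length)           # step to the next note
```

In practice a single dwell selection on a note button could call either function, so one slow eye movement produces a full chord or a run of notes.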

Expression is another difficulty when playing music using eye tracking. By expression we mean the way an accomplished musician can play the same note in different ways. Velocity is a common means of expression; you can think of this as how fast or hard a note is struck. Velocity can affect volume and other qualities of the instrument’s sound. Another common means of expression is provided by pedals, like those on an organ or piano. Using eyegaze we really only have the ability to turn a note on or off. Some of the software, however, breaks note areas up into sections, each one giving an increased velocity (see photo below).

Software for playing music with Eyegaze

  • Eye Harp http://theeyeharp.org/ – One of the first software instruments made specifically for eyegaze, the EyeHarp remains one of the best options. This software was originally developed as a college project (I’m guessing he got a first!) and, rather than let it die, developer Zacharias Vamvakousis made it available free and open source. After a few years with no updates, the news is that there are some big updates on the way. We are looking forward to seeing what they have in store for us.
Animated GIF showing the EyeHarp performance screen: a clock-type circular interface divided into sections, with the user’s eyes at the centre of the circle.

Another option for eyegaze music production is to use software like the Grid 3 or Iris to create an eyegaze-accessible interface for a mainstream digital audio workstation. The demo below was done using Ableton Live; however, any software that offers keyboard mapping or keyboard shortcuts (so any quality software) could be used in the same way.
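To make the idea concrete, here is a minimal sketch of what a grid cell ultimately does: it sends a keystroke that the DAW has been key-mapped to. The clip names and key assignments below are assumptions for illustration; in Ableton Live you would assign your own keys using Key Map mode, and in a real setup the keystroke would be sent by the Grid 3 cell itself rather than by Python.

```python
# Illustration of keyboard-mapped control of a DAW: each "cell" just sends a
# keystroke that the DAW has been mapped to (e.g. keys 1-4 launching clips).
from pynput.keyboard import Controller

keyboard = Controller()

CLIP_KEYS = {"drums": "1", "bass": "2", "chords": "3", "melody": "4"}  # assumed mapping

def trigger_clip(name):
    """Send the keystroke the DAW has been key-mapped to for this clip."""
    keyboard.tap(CLIP_KEYS[name])

# e.g. a dwell selection on the "drums" cell would call:
# trigger_clip("drums")
```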

Skyle – out of the blue

Anybody working with Assistive Technology (AT) knows how useful Apple iOS devices are. Over the years Apple have gradually built in a comprehensive and well-designed range of AT supports that go a long way to accommodating every access need. This is no small feat. In 2009 VoiceOver transformed what was essentially a smooth, featureless square of glass with almost no tactile information into the preferred computing device for blind people. In 2019 Voice Control and the improvements made to Assistive Touch filled two of the last big gaps in the area of “hands free” control of iOS. All this great work is not completely altruistic, however, as it has resulted in Apple mobile devices cementing their place as the preeminent platform in the area of disability and AT. It is because of this that it has always been somewhat of a mystery why there has never been a commercial eye tracking option available for either iOS or macOS. Perhaps not so much iOS, as we will see, but one would certainly have thought an eyegaze solution for the Apple desktop OS could be a viable product.

There are a few technical reasons why iOS has never supported eyegaze. Firstly, up until the newer generations of eyegaze peripherals, eyegaze needed a computer with a decent spec to work well. iPads are mobile devices, and Apple originally made no apologies for sacrificing performance for more important mobile features like reducing weight and thickness and increasing battery life. As eye trackers evolved and got more sophisticated, they began to process more of the massive amount of gaze data they take in themselves. So rather than passing large amounts of raw data straight through to the computer via USB 3 or FireWire, they process the data first, meaning less work for the computer and that a connection with less bandwidth can be used. Therefore, in theory, an iPad Pro could support something like a Tobii PC Eye Mini, but in practice there was still one major barrier: iOS did not support any pointing device, let alone eye tracking devices. That was until last September’s iOS update. iOS 13, or iPadOS, saw upgrades to the Assistive Touch accessibility feature that allowed it to support access to the operating system using a pointing device.

iPad Pro 12" in black case with Skyle eye tracker
iPad Pro 12″ with Skyle eye tracker and case

It is through Assistive Touch that the recently announced Skyle for iPad Pro is possible. Billed as “the world’s first eye tracker for iPad Pro”, Skyle comes from German company EyeV https://eyev.de/ (who I admit I had not previously heard of). Last week it appeared as a product on Inclusive Technology for £2000 (ex VAT). There is very little information on the manufacturer’s website about Skyle, so at this stage all we know is based on the Inclusive Technology product description (which, thankfully, is pretty good). The lack of information about this product (other than the aforementioned description) significantly tempers my initial excitement on hearing that there is finally an eye tracking solution for iOS. There are no videos on YouTube (or Inclusive Technology) and no user reviews anywhere. I understand it is a new product, but it is odd for a product to be on the market before anybody has had the opportunity of using it and posting a review. I hope I am wrong but alarm bells are ringing. We’ve waited 10 years for eye tracking on iOS, why rush now?

Leaving my suspicion behind, there are some details on Inclusive Technology which will be of interest to potential customers. If you have used a pointing device through Assistive Touch on iPadOS you will have a good idea of the user experience. Under Cursor in the Assistive Touch settings you can change the size and colour of the mouse cursor. You will need to use the Dwell feature to automate clicks, and the Assistive Touch menu will give you access to all the other gestures needed to operate the iPad. Anyone who works with people who use eye tracking for computer access will know that accuracy varies significantly from person to person. Designed for touch, targets in iPadOS (icons, menus) are not tiny; they are, however, smaller than a cell in the most detailed grid used by a highly accurate eyegaze user. Unlike a Windows-based eyegaze solution, there are no additional supports, for example a grid overlay or zooming, to help users with small targets. Although many users will not have the accuracy to control the iPad with this device (switch between apps, change settings), it could be a good solution within an AAC app (where cell sizes can be configured to suit user accuracy) or a way of interacting with one of the many cause-and-effect apps and games. Again, however, if you have a particular app or activity in mind please don’t assume it will work: try before you buy. It should be noted here that Inclusive Technology are offering a 28-day returns policy on this product.

There is a switch input jack which will offer an alternative to Dwell for clicking, or could be set to another action (show the Assistive Touch menu, maybe). I assume you could also use the switch with iOS Switch Control, which might be a workaround for those who are not accurate enough to access smaller targets with the eyegaze device. It supports 5- and 9-point calibration to improve accuracy. I would like to see a 2-point calibration option, as 5 points can be a stretch for some early eyegaze users. It would also be nice if you could change the standard calibration dot to something more likely to engage a child (a cartoon dog, perhaps).

Technical specs are difficult to compare between eye trackers on the same platform (Tobii v EyeTech, for example), so I’m not sure what value there would be in comparing this device with other Windows-based eye trackers. That said, some specs that will give us an indication of who this device may be appropriate for are sample rate and operating distance. Judging by the sample rate (given as 18 Hz, max 30 Hz), the Skyle captures considerably fewer frames per second than its two main Windows-based competitors (Tobii 30 FPS, TM5 42 FPS). However, even 15 FPS should be more than enough for accurate mouse control. The operating distance (how far the device is from the user) for Skyle is 55 to 65 cm, which is about average for an eyegaze device. However, the fact that it only offers a range of 10 cm (the Tobii range is 45 cm to 85 cm, so 40 cm), together with the photo below showing the positioning guide, indicates that this is not a solution for someone with even a moderate amount of head movement, as the track box (the area where the eyes can be successfully tracked) seems to be very small.

The positioning guide in the Skyle app: a letterbox view of a person’s eyes, which seems to indicate that only a couple of centimetres of movement is possible before going out of view.
Does the user have to keep their position within this narrow area, or does Skyle use facial recognition to adjust to the user’s position? If it’s the former, this solution will not be appropriate for users with even a moderate amount of head movement.

In summary, if you are a highly accurate eyegaze user with good head control and you don’t wear glasses, Skyle could offer you efficient and direct hands-free access to your iPad Pro. It seems expensive at €2500, especially if you don’t already own a compatible iPad (add at least another €1000 for an iPad Pro 12″). If you have been waiting for an eyegaze solution for iOS (as I know many people have), I would encourage you to wait a little longer. When the opportunity arises, try Skyle for yourself. By that time, there may be other options available.

If any of the assumptions made here are incorrect, or if there is any more information available on Skyle, please let us know and we will update this post.

Getting started with an eye-gaze device

Introducing an eye-gaze device to an individual who is non-verbal can open up a world of possibility for them; it can allow them to communicate, engage with games and play, as well as allowing them to access and control their environment.

When working with children who have the potential to use eye gaze, it can be difficult to find fun and motivating ways to encourage them to engage with the device. Introducing communication-based programs too early can be too demanding and may ultimately lead to failure using the device.

Smartbox Technologies have developed a program called Look to Learn and describe it as a motivating and fun way to get started with eye gaze technology. Every activity has been developed in consultation with teachers and therapists to improve access and choice-making skills. The software consists of 40 specially created activities that allow therapists, families and teachers to easily develop basic eye-gaze interaction with the child. A companion workbook is also available to download (free) from the Smartbox website; it helps track and document the child’s progress as they move through the program and the activities increase in complexity.

Look to Learn is available to download from Smartbox on https://thinksmartbox.com/downloads/look_to_learn/ and starts at £360.

Accessible Photography – Photo Editing with Adobe Lightroom & the Grid 3

Some time back, when I was finishing up a photography shoot, I met a gentleman who told me that his photography career had been cut short by a stroke a few years earlier. This was back in 2011, and options were a lot more limited in terms of cameras, software and accessibility in general. Earlier in the year, as part of my Foundations in AT course, it was suggested that I incorporate my photography background into my project. Now, in 2019, there are a lot more options for accessibility in photography, from camera mounts to Wi-Fi connectivity between camera and PC, phone or tablet. However, taking the photo is only half the work for a photographer.

Film photographers have to develop their photos; digital photographers have to edit theirs. Adobe Lightroom is an industry-standard program for editing photos. It is also very shortcut friendly. As a result, I was able to make it work with Grid 3 to enable basic editing such as converting to black and white and adjusting colour balance, brightness, contrast and exposure. Cropping and converting an image from portrait to landscape, and vice versa, could also be achieved via the Grid. Although I only had a short time to create this grid, it can easily be expanded on, adding access to other modules (such as Export, Slideshow, Book, Print, etc.) and to features like slideshow templates, print setup, exporting with previous settings or emailing a photo. While the functionality of this grid is minimal, there is plenty of room for expansion.

Download the Lightroom Grid here or directly through the Grid application (search for Adobe or Lightroom).

Below is a demonstration of the Lightroom Grid.

Tobii buys SmartBox – What might this mean for computer access and AAC?

Big news (in the AT world anyway) may have arrived in your mailbox early last week. It was announced that leading AAC and computer access manufacturer Tobii has purchased SmartBox AT (Sensory Software), developers of the Grid 3 and Look to Learn. As well as producing these very popular software titles, SmartBox were also a leading supplier of a range of AAC and computer access hardware, including their own GridPad and PowerPad ranges. Basically (in this part of the world at least) they were the two big guns in this area of AT, between them accounting for maybe 90% of the market. An analogy using soft drink companies would be that this is like Coca-Cola buying Pepsi.

Before examining what this takeover (or amalgamation?) means for their customers going forward, it is worth looking back at what each company has historically done well. That way we can hopefully paint a more optimistic picture of the future for AT users than the one suggested by what might be considered a potential monopoly.

Sensory Software began life in 2000 in the spare bedroom of founder Paul Hawes. Paul had previously worked for AbilityNet and had 13 years’ experience working in the area of AT. Early software like GridKeys and The Grid had been very well received and the company continued to grow. In 2006 they set up Smartbox to concentrate on complete AAC systems while sister company Sensory Software concentrated on developing software. In 2015 both arms of the company joined back together under the SmartBox label. By this time their main product, the Grid 3, had established itself as a firm favourite with Speech and Language Therapists (SLTs) for the wide range of communication systems it supported, and with Occupational Therapists and AT professionals for its versatility in providing alternative input options for Windows and other software. Many companies would have been satisfied with providing the best product on the market; however, there were a couple of other areas where SmartBox also excelled. They may not have been the first AT software developers to harness the potential resources of their end users (they also may have been, I would need to research that further) but they were certainly the most successful. They succeeded in creating a strong community around the Grid 2 and 3, with a significant proportion of the online grids available to download being user generated. Their training and support were also second to none. Regular high-quality training events were offered throughout Ireland and the UK. Whether by email, phone or the chat feature on their website, their support was always top quality too. Their staff clearly knew their product inside out, responses were timely and they were always a pleasure to deal with.

Tobii have been around since 2001. The Swedish firm actually started with eyegaze: three entrepreneurs, John Elvesjö, Mårten Skogö and Henrik Eskilsson, recognised the potential of eye tracking as an input method for people with disabilities. In 2005 they released the MyTobii P10, the world’s first computer with built-in eye tracking (and I’ve no doubt there are still a few P10 devices in use). What stood out about the P10 was the build quality of the hardware; it was built like a tank. While Tobii could fairly be criticised for under-specifying their all-in-one devices in terms of processor and memory, the build quality of their hardware is always top class. Over the years Tobii have grown considerably, acquiring Viking Software AS (2007), Assistive Technology Inc. (2008) and DynaVox Systems LLC (2014). They have grown into a global brand with offices around the world. As mentioned above, Tobii’s main strength is that they make good hardware. In my opinion they make the best eye trackers and have consistently done so for the last 10 years. Their AAC software has also come on considerably since the DynaVox acquisition. While Communicator always seemed to be a pale imitation of the Grid (apologies if I’m being unfair, but certainly true in terms of its versatility and ease of use for computer access), it has steadily been improving. Their newer Snap + Core First AAC software has been a huge success and, for users just looking for a communication solution, would be an attractive option over the more expensive (although much fuller featured) Grid 3. Alongside Snap + Core they have also brought out a “Pathways” companion app. This app is designed to guide parents, caregivers and communication partners in best practices for engaging Snap + Core First users. It supports the achievement of communication goals through video examples, lesson plans, an interactive goals grid for tracking progress, and a suite of supporting digital and printable materials. A really useful resource which will help to empower parents and prove invaluable to those not lucky enough to have regular input from an SLT.

To sum things up: we had two great companies, both with outstanding products. I have recommended the combination of the Grid software and a Tobii eye tracker more times than I can remember. The hope is that Tobii can keep the Grid on track and incorporate the outstanding support and communication that was always an integral part of SmartBox’s operation. With the addition of their hardware expertise and recent research-driven progress in the area of AAC, there should be a lot to look forward to in the future.

If you are a Grid user and you have any questions or concerns about this news, true to form, the communication lines are open. There is some information at this link and at the bottom of the page you can submit your question.

‘Eye-Touch’ – an eye-controlled musical instrument

Last week we were visited in Enable Ireland, Sandymount, by two of the most experienced practitioners working in the area of assistive music technology. Dr Tim Anderson http://www.inclusivemusic.org.uk/ and Elin Skogdal (SKUG) dropped by to talk about the new eyegaze music software they have been developing and to share some tips with the musicians from Enable Ireland Adult Services. Tim Anderson has been developing accessible music systems for the last 25 years. E-Scape, which he developed, is the only MIDI composition and performance software designed from the ground up for users of alternative input methods (switch, joystick and now eyegaze). Tim also works as an accessible music consultant for schools and councils. Elin Skogdal is a musician and educator based at the SKUG Centre. She has been using Assistive Music Technology in music education since 2001 and was one of those responsible for establishing the SKUG Centre. The SKUG Centre is located in Tromsø, Northern Norway. SKUG stands for “Performing Music Together Without Borders”, and the aim of the Centre is to provide opportunities for people who can’t use conventional instruments to play and learn music. SKUG is part of the mainstream art school of Tromsø (Tromsø Kulturskole), which provides opportunities for SKUG students to collaborate with other music and dance students and teachers. SKUG have students at all levels and ages, from young children to university students. If you would like to know more about Elin’s work at SKUG, click here to read a blog post from Apollo Ensemble.

Following the visit and workshop, they sent us some more detailed information about the exciting new eyegaze music software they are currently developing, Eye-Touch. We have included this in the paragraphs below. If you are interested in getting involved in their very user-led development process, you can contact us here (comments below) and we will put you in touch with Tim and Elin.

‘Eye-touch’ (Funded by ‘NAV Hjelpemidler og tilrettelegging’ in 2017, and Stiftelsen Sophie’s Minde in 2018) is a software instrument being developed by the SKUG centre (Part of ‘Kulturskolen i Tromsø’), in collaboration with Dr. Tim Anderson, which enables people to learn and play music using only their eyes. It includes a built-in library of songs called ‘Play-screens’, with graphical buttons which play when you activate them.
Buttons are laid out on screen to suit the song and the player’s abilities, and can be of any size and colour, or show a picture. When you look at a button (using an eye-gaze tracking system such as Tobii or Rolltalk) it plays its musical content. You can also play buttons in other ways to utilise the screen’s attractive look: you can touch a touch-screen or smartboard, press switches or PC keys, or hit keys on a MIDI instrument.
The music within each button can either be musical notes played on a synthesised instrument, or an audio sample of any recorded sound, for example animal noises or sound effects. Sound samples can also be recordings of people’s voices speaking or singing words or phrases. So a child in a class group could play vocal phrases to lead the singing (‘call’), with the other children then answering by singing the ‘response’.


Pictured above, a pupil in Finland is trying out a screen with just three buttons, containing musical phrases plus a sound effect of a roaring bear (popular with young players!). She had been using the system for just a few minutes and was already successfully playing the song, which proved very enjoyable and motivating for her.

SKUG’s experience with their previous prototype system has led to the incorporation of some innovative playing features which distinguish it from other eyegaze music systems and have been shown to enable people to play who couldn’t otherwise. These features provide an easy entry level, and we have found that they enable new users to start playing immediately and gain motivation. These support features can also be changed or removed by teachers to suit each player’s abilities and, most importantly, can evolve as a player practises and improves. One feature is to have the buttons in a sequence which can only be played in the right order, so the player can ‘look over’ other buttons to get to the next ‘correct’ button.
Here are two examples: The Play-screen below has buttons each containing a single note, arranged as a keyboard with colouring matching the Figurenotes scheme. A player with enough ability could learn a melody and play it by moving between the buttons in the empty space below. But by putting the buttons into a sequence order, the player is able to learn and play the melody far more easily – they can look over buttons to get to the next ‘correct’ button (note) of the song, without playing the buttons in between.
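As an illustration of this sequence-order idea (and not the actual Eye-Touch code), here is a minimal sketch of how such a "sequence lock" might work: buttons hold the notes of a melody, but only the next button in the sequence will sound, so the player can safely look across the others.

```python
# Sketch of a sequence-locked play screen: only the next "correct" button plays.
class SequencedPlayScreen:
    def __init__(self, notes, play_note):
        self.notes = notes          # the melody, as an ordered list of notes
        self.play_note = play_note  # callback that actually makes the sound
        self.position = 0           # index of the next "correct" button

    def button_activated(self, index):
        """Called when the player's gaze activates button `index`."""
        if index == self.position:
            self.play_note(self.notes[index])   # correct button: play it
            self.position += 1                  # advance to the next note
        # any other button is silently ignored ("looked over")

# e.g. a three-note screen:
# screen = SequencedPlayScreen(["C4", "E4", "G4"], play_note=print)
# screen.button_activated(1)  # ignored
# screen.button_activated(0)  # plays C4
```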

Screenshot from Eye-Touch.
As well as illustrating a general theme, the facility to add pictures gives us many more possibilities. The Play-screen below left has buttons which show pictures and play sounds and music relating to J.S. Bach’s life story. The buttons could be played freely, but in this case they have been put into a sequence order to illustrate his life chronologically. As before, a player can move through the buttons to play them in order, even though they are close together. But we may want to make them even bigger, and make the player’s job even easier, by setting the screen to display only the ‘next’ button in the sequence (below right). The other buttons are hidden, and the player only sees the button which is next to play, and can then move onto it.

A Bach lesson can be split into stages to make it more accessible: a Play-screen featuring images representing the life of the composer J.S. Bach, where each picture plays some music from that period.

There is also an accompanying text to tell the story which, if desired, can be displayed on screen via a built-in ‘song-sheet’. Teachers can also make their own Play-screens by putting their own music into buttons, either by playing live on a MIDI keyboard or by recording their own sound samples. To further personalise a Play-screen for a pupil, people can also organise and edit all the visual aspects, including adding their own pictures.
The Eye-Touch software is also very easy to install and operate – we have found it quick and easy to install it on school pupils’ eye-gaze tablets, and it worked for them straight away.
In January 2018 the SKUG team started a project to further develop Eye-Touch to expand the ways of playing, the creating and editing facilities for teachers, and the range of songs provided in the library.


Eye Control – Inbuilt EyeGaze Access for Windows 10

Just yesterday Microsoft announced what is possibly their biggest step forward in functionality within their Ease of Access accessibility settings since Windows 7. Eye Control is an inbuilt feature to facilitate access to the Windows 10 OS using the low-cost eyegaze peripheral the Tracker 4C from Tobii. More about what you can actually do with Eye Control below, but first a little background on how this came about.

Steve Gleason and his son

Former American Football professional and MND (ALS) sufferer Steve Gleason (above) challenged Microsoft in 2014 to help people affected by this degenerative condition through the advancement of eye tracking technology. This initial contact led to the development of a prototype eyegaze-controlled wheelchair, receiving lots of publicity and generating increased awareness in the process. However, it was never likely to be progressed to a product that would be available to other people in a similar situation. What this project did achieve was to pique the interest of some of the considerable talent within Microsoft in the input technology itself and its applications, particularly for people with MND.

A combination of factors, felt on both sides of the Atlantic, has proved problematic when it comes to providing timely AT support to people diagnosed with MND. Eyegaze input is the only solution that will allow successful computer access as the condition progresses, eye movement being the only ability left in the final stages of the illness. However, historically the cost of the technology meant that insurance, government funding or private fundraising was the only means by which people could pay for eyegaze equipment. Usually this resulted in a significant delay which, due to the often aggressive nature of MND, meant valuable time was lost and the solution often arrived too late. This situation was recognised by Julius Sweetland, who back in 2015 led the development of Optikey, an open-source computer access/AAC solution designed to work with low-cost eye trackers. Interestingly, some of the innovative features of Optikey seem to have made it to Eye Control on Windows 10 (Multi-Key selection, called Shape Writing in Eye Control – see gif below).

Demo of Shape Writing in Eye Control. It works like swiping on a touch keyboard: dwell on the first letter of a word, glance at subsequent letters and dwell on the last letter, and the word is entered.
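For anyone curious, here is a rough sketch of the kind of matching that makes this possible: the letters gathered along the gaze path are compared against a word list. This is an illustration only, not Optikey’s or Eye Control’s actual algorithm.

```python
# Sketch of shape-writing candidate matching: a word is a candidate if it
# starts and ends like the gaze path and its letters appear, in order,
# somewhere within the path.
def matches_path(word, path):
    if not word or not path or word[0] != path[0] or word[-1] != path[-1]:
        return False
    it = iter(path)
    # `letter in it` consumes the iterator, so letters must appear in order
    return all(letter in it for letter in word)

def suggest(path, dictionary):
    return [w for w in dictionary if matches_path(w, path)]

# e.g. a gaze path that passed over h-e-l-k-l-o:
# suggest("helklo", ["hello", "help", "halo"])   # -> ["hello"]
```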

Since the initial Steve Gleason wheelchair hack there has been a steady stream of high-quality research papers coming from people at Microsoft on the subject of eyegaze input and MND solutions. This should have been a hint that something like Eye Control was on the horizon. Eyegaze input has promised to break into the mainstream several times over the last decade; however, with Eye Control and device support being included in the core Windows OS, it has never been this close.

For more background on the path to Eye Control, see this blog post from Microsoft: From Hack to Product, Microsoft Empowers People with Eye Control for Windows 10

Want to find out how to get early access to Eye Control, or get some more information on the functionality? Read this post from Tobii (be warned, there are still bugs): How to get started with Eye Control on Windows.

Hands-free Minecraft from Special Effect

Love it or hate it, the game of Minecraft has captured the imagination of over 100 million young, and not so young, people. It is available on multiple platforms: mobile devices (Pocket Edition), Raspberry Pi, computer, Xbox or PlayStation, and it looks and feels pretty much the same on all. For those of us old enough to remember, the blocky graphics will hold some level of nostalgia for the bygone 8-bit days, when mere blobs of colour and our imagination were enough to render Ghosts and Goblins vividly. This is almost certainly lost on the main cohort of Minecraft players, however, who would most probably be bored silly by the two-dimensional, repetitive and predictable video games of the ’80s and early ’90s. The reason Minecraft is such a success is that it has blended its retro styling with modern gameplay and a (mind-bogglingly massive) open world where no two visits are the same and there is room for self-expression and creativity. This latter quality has led it to become the first video game to be embraced by mainstream education, being used as a tool for teaching everything from history to health or empathy to economics.

It is, however, the former quality, the modern gameplay, that we are here to talk about. Unlike the aforementioned Ghosts and Goblins, Minecraft is played in a three-dimensional world using either the first-person perspective (you see through the character’s eyes) or the third-person perspective (as if a camera is hovering above and slightly behind the character). While undoubtedly offering a more immersive and realistic experience, this means controlling the character and playing the game is also much more complex and requires a high level of dexterity in both hands to be successful. For people without the required level of dexterity this means that not only is there a risk of social exclusion, being unable to participate in an activity so popular among their peers, but also the possibility of being excluded within an educational context.

Fortunately, UK-based charity Special Effect have recognised this need and are in the process of doing something about it. Special Effect are a charity dedicated to enabling those with access difficulties to play video games through custom access solutions. Since 2007 their interdisciplinary team of clinical and technical professionals (and of course gamers) have been responsible for a wide range of bespoke solutions based on individuals’ unique abilities and requirements. Take a look at this page for some more information on the work they do and to see what a life-enhancing service they provide. The problem with this approach, of course, is reach, which is why their upcoming work on Minecraft is so exciting. Based on the open-source eyegaze AAC/computer access solution Optikey by developer Julius Sweetland, Special Effect are in the final stages of developing an on-screen Minecraft keyboard that will work with low-cost eye trackers like the Tobii Eye X and the Tracker 4C (€109 and €159 respectively).

Minecraft on-screen keyboard

The inventory keyboard

Minecraft on-screen keyboards

The main Minecraft on screen keyboard

Currently being called ‘Minekey’, this solution will allow Minecraft to be played using a pointing device like a mouse or joystick, or even totally hands-free using an eyegaze device or headmouse. The availability of this application will ensure that Minecraft is now accessible to many of those who have previously been excluded. Special Effect were kind enough to let us trial a beta version of the software and, although I’m no Minecraft expert, it seemed to work great. The finished software will offer a choice of onscreen controls: one with smaller buttons and more functionality for expert eyegaze users (pictured above) and a more simplified version with larger targets. Bill Donegan, Projects Manager with Special Effect, told us they hope to have it completed and available to download for free by the end of the year. I’m sure this is news that will excite many people out there who had written off Minecraft as something just not possible for them. Keep an eye on Special Effect or ATandMe for updates on its release.
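One detail worth noting about game control, as opposed to the keyboard-shortcut examples earlier in this post, is that movement keys in a game like Minecraft need to be held down rather than tapped. Here is a rough sketch of how an on-screen button could latch and release a held key; it is an assumption about the general approach, not Minekey’s actual code, and the key names are simply Minecraft’s defaults.

```python
# Sketch of a "latching" game control: the first activation holds the key
# down (e.g. 'w' to walk forward), the next activation releases it.
from pynput.keyboard import Controller

keyboard = Controller()
held = set()

def toggle_hold(key):
    """Latch or release a held key such as 'w' (walk forward)."""
    if key in held:
        keyboard.release(key)
        held.discard(key)
    else:
        keyboard.press(key)
        held.add(key)

# e.g. activating the "forward" cell once starts walking; activating it again stops:
# toggle_hold("w")
```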

Boardmaker Online now launched in Ireland

Tobii Dynavox have recently launched their new Boardmaker Online product in Ireland through SafeCare Technologies. It has all the functionality of previous versions of Boardmaker, except that now it’s web-based you don’t need any disks and multiple users can access it from any PC.

Instructor showing students how to use Boardmaker Online

You can purchase a Personal, Professional or District account, and the amount you pay depends on the type of account, the number of “instructors” and how many years you want to sign up for. You can also get a discount for any old Boardmaker disks that you want to trade in.

You get all the symbols that have been available in past versions, as well as some new symbol sets, and any new ones created in the future will also be included. Because it’s web-based, you have access to previously created activities via the online community, and you can upload activities you create yourself to that community and share them with other people in your district or all over the world.

Because it’s no longer tied to one device, you can create activities on your PC and assign them to your “students”, who can use them in school and/or at home. You no longer need to have a user’s device in your possession to update their activities, and they don’t need to go without their device while you do this.

You (and the other instructors in your district if you have a district licence) can also assign the same activity to many students and by having different accessibility options set up for different students, the activity is automatically accessible for their individual needs. For example, you could create an activity and assign it to a student who uses eye gaze and to a student who uses switches and that activity will show up on their device in the format that’s accessible for them.

Picture shows how instructors can assign Boardmaker Online activities to multiple students

The results of students’ work can be tracked against IEP or educational goals which then helps you decide what activities would be suitable to assign next. You can also track staff and student usage.

One limitation is that you can only create activities on a Windows PC or Mac. You can play activities on an iPad using the free app but not create them on it, and you can’t use Boardmaker Online to either create or play activities on an Android or Windows-based tablet.

The other point to mention is that because it’s a subscription-based product, the payment you have to make is recurring every year rather than being a one-off payment, which may not suit everyone.

However, with the new features it’s definitely worth getting the free 30-day trial and deciding for yourself if you’d like to trade in your old Boardmaker disks for the new online version!