Beyond Boundaries: How Interactive and Immersive Media are being used to support people with autism

This is the first in a two-part post about Enable Ireland's Immersive Media Beyond Boundaries Garden project. If you want to try the apps for yourself you can get them from Google Play here, or you can find links and some more information on our website here. This first post (Part 1) gives a brief background to Virtual Reality and related technologies and looks at some of the research into their potential in the area of autism. Part 2 will outline how we put our Beyond Boundaries and SecretGarden apps together, and how we hope to incorporate this technology into future training and use it to support clients of our service.

Background: VR, AR, Mixed Media, 360 Video?

Virtual Reality, usually referred to by the acronym VR, is one of those technologies that is perpetually "the next big thing". If you grew up watching movies like Tron and The Lawnmower Man (giving away my age here), VR is probably filed away in your brain somewhere between hoverboards (that actually hover) and teleportation. When the concept of a technology has been part of popular culture so far in advance of its realisation, it can hinder rather than promote its development. The trajectory VR's evolution has taken, however, is much closer to that of a technology like Speech Recognition than to hoverboards. VR, as with Speech Recognition, saw a great deal of progress in the latter part of the 1980s. With both technologies, this progress, although important, was almost nullified by the hype surrounding, and subsequent commercialisation of, a technology that clearly wasn't ready for public consumption. The reality of what VR could offer at the time left people disillusioned with the technology.

Before I talk about how VR is being used in the area of autism, it's worth clarifying what exactly is meant by some of the terms in use. As with any emerging technology, there is still quite a lot of confusion about what is meant by Virtual Reality and its associated technologies: Augmented Reality (AR), Mixed Reality, Immersive Media and 360 Video. First, let's look at the video below, which explains what VR and AR are and how they differ.

So what is Mixed Reality? In short, Mixed Reality is a combination of VR and AR, in theory offering the best of both. It is also closely associated with Microsoft and other Windows-aligned hardware manufacturers. Have a look at the short video below.

360 degree Video and Photography are less interactive than the technologies discussed above. The viewer is also restricted in terms of movement: they can only view the scene from the position where the camera was placed. Movement can be simulated to some extent, however, through the use of hotspots or menus, allowing the viewer to navigate between different scenes. More traditional film techniques, like fading between scenes, can also be used, as in the video below. 360 degree video can be either flat or in stereo. Stereo video, or 3D video, is captured with a camera that has two lenses about the same distance apart as a person's eyes. Each eye then gets a slightly different view, which our brain stitches together into a 3D image.

Finally, Immersive Media is frequently used as an umbrella term for all the technologies discussed above, but would more correctly refer to the less interactive 360 Video and Photography.

Immersive Media and Autism

Since the early days of the technology, people have proposed that VR may offer potential as a therapeutic or training tool within the area of neurodiversity. Dorothy Strickland of North Carolina State University's short paper "Two Case Studies Using Virtual Reality As A Learning Tool For Autistic Children" (Journal of Autism and Developmental Disorders, Vol. 26, No. 6, 1996) is generally accepted as the first documented use of VR as a tool to increase the capabilities of someone with a disability. In this early study (which you can read at the link above) VR was used as a means to teach the children how to safely cross the street. While VR technology itself has clearly moved on, its use in this area (for the reasons outlined above) until recently had not, and there is still a great deal about this paper that is relevant today, in particular regarding the children's acceptance of the headset (which would have been chunkier and more uncomfortable than today's) and their understanding of the 3D world presented by it.

Stepping forward almost a quarter of a century, we are riding the peak of the second wave of commercial VR. Thanks largely to developments driven by the rapid evolution of mobile devices in the early years of this decade, VR is becoming more accessible and less disappointing than it was first time around. With the new generation of headsets and their ability to render sharp and detailed 3D environments has come a renewed interest in the use of VR in the area of autism. At a recent CTD Institute webinar on this very subject (Virtual Reality and Assistive Technology), Jaclyn Wickham (@JacWickham), a teacher turned technologist and founder of AcclimateVR, outlined some of the reasons why VR could be an appropriate technology to provide training for some people on the autistic spectrum. These included the ability to create a safe and controlled environment where tasks can be practised and repeated; the emphasis the VR experience places on the visual and auditory senses (with the ability, presumably, to configure and control both); the scope for creating an individualised experience; and the many non-verbal interaction possibilities. Anecdotally this all makes complete sense, but we are in the early days and much of the research is still being conducted.

A leading researcher in this area is Dr Nigel Newbutt (@Newbutt), who in June of this year published a short but enlightening update on his progress working with children from Mendip School in the UK. Having seen him present at the Doctrid V conference in 2017, I can assure you that progress in this area is being made, but even he acknowledges more work is needed: "Our research suggests that head-mounted displays might be a suitable space in which to develop specific interventions and opportunities; to practice some skills people with autism might struggle with in the real world. We're seeking further funding to address this important question – one that has eluded this field to date." (Full interview here: From apps to robots and VR: How technology is helping treat autism)

The commercial offerings in the area of VR and Autism (Floreo and AcclimateVR) tend to concentrate on providing a virtual space where basic life skills can be practiced. Another use is as a form of exposure therapy where immersive video and audio of environments and situations are used as a means of preparing someone for the real life experience. You can see examples of both in action at the links above.

Within Enable Ireland AT service, our own VR journey was spurred on by a visit and demonstration from James Corbett (@JamesCorbett) of SimVirtua. James could be considered a real pioneer in this area; in fact he had met with us almost 10 years ago to show us some work he was doing with non-immersive virtual environments (without headsets) in schools. SimVirtua had worked on a mindfulness VR app called MindMyths, and it was this idea of providing a retreat or sanctuary using immersive video that inspired us when it came to working on the Bloom Beyond Boundaries Garden project.

In the second part of this post (coming soon) I’ll give some background to what we hoped to achieve with the Beyond Boundaries garden project and some technical information on how we put it together.

Tobii buys SmartBox – What might this mean for computer access and AAC?

Big news (in the AT world anyway) may have arrived in your mailbox early last week: it was announced that leading AAC and Computer Access manufacturer Tobii had purchased SmartBox AT (Sensory Software), developers of the Grid 3 and Look2Learn. As well as producing these very popular software titles, SmartBox were also a leading supplier of a range of AAC and Computer Access hardware, including their own GridPad and PowerPad ranges. Basically (in this part of the world at least) they were the two big guns in this area of AT, between them accounting for maybe 90% of the market. An analogy using soft drink companies would be that this is like Coca-Cola buying Pepsi.

Before examining what this takeover (or amalgamation?) means for their customers going forward, it is worth looking back at what each company has historically done well. That way we can hopefully sketch a more optimistic future for AT users than the one suggested by what might be considered a potential monopoly.

Sensory Software began life in 2000 in the spare bedroom of founder Paul Hawes. Paul had previously worked for AbilityNet and had 13 years' experience in the area of AT. Early software like GridKeys and The Grid was very well received and the company continued to grow. In 2006 they set up Smartbox to concentrate on complete AAC systems, while sister company Sensory Software concentrated on developing software. In 2015 both arms of the company joined back together under the SmartBox label. By this time their main product, the Grid 3, had established itself as a firm favourite with Speech and Language Therapists (SLTs) for the wide range of communication systems it supported, and with Occupational Therapists and AT professionals for its versatility in providing alternative input options for Windows and other software. Many companies would have been satisfied with providing the best product on the market, but there were a couple of other areas where SmartBox also excelled. They may not have been the first AT software developers to harness the potential resources of their end users (they may well have been; I would need to research that further), but they were certainly the most successful. They created a strong community around the Grid 2 and 3, with a significant proportion of the online grids available to download being user generated. Their training and support were also second to none. Regular high quality training events were offered throughout Ireland and the UK, and whether by email, phone or the chat feature on their website, their support was always top quality. Their staff clearly knew their product inside out, responses were timely and they were always a pleasure to deal with.

Tobii have been around since 2001. The Swedish firm actually started with eyegaze: three entrepreneurs, John Elvesjö, Mårten Skogö and Henrik Eskilsson, recognised the potential of eye tracking as an input method for people with disabilities. In 2005 they released the MyTobii P10, the world's first computer with built-in eye tracking (and I've no doubt there are still a few P10 devices in use). What stood out about the P10 was the build quality of the hardware; it was built like a tank. While Tobii could be fairly criticised for under-specifying their all-in-one devices in terms of processor and memory, the build quality of their hardware is always top class. Over the years Tobii have grown considerably, acquiring Viking Software AS (2007), Assistive Technology Inc. (2008) and DynaVox Systems LLC (2014), and have become a global brand with offices around the world. As mentioned above, Tobii's main strength is that they make good hardware. In my opinion they make the best eye trackers and have consistently done so for the last 10 years. Their AAC software has also come on considerably since the DynaVox acquisition. While Communicator always seemed a pale imitation of the Grid (apologies if I'm being unfair, but that is certainly true in terms of its versatility and ease of use for computer access), it has been steadily improving. Their newer Snap + Core First AAC software has been a huge success, and for users just looking for a communication solution it would be an attractive option over the more expensive (although much fuller featured) Grid 3. Alongside Snap + Core they have also brought out a "Pathways" companion app, designed to guide parents, caregivers and communication partners in best practices for engaging Snap + Core First users. It supports the achievement of communication goals through video examples, lesson plans, an interactive goals grid for tracking progress, and a suite of supporting digital and printable materials: a really useful resource which will help to empower parents and prove invaluable to those not lucky enough to have regular input from an SLT.

To sum things up: we had two great companies, both with outstanding products. I have recommended the combination of the Grid software and a Tobii eye tracker more times than I can remember. The hope is that Tobii can keep the Grid on track and incorporate the outstanding support and communication that were always an integral part of SmartBox's operation. With the addition of Tobii's hardware expertise and their recent research-driven progress in the area of AAC, there should be a lot to look forward to in the future.

If you are a Grid user and you have any questions or concerns about this news, true to form, the communication lines are open. There is some information at this link and at the bottom of the page you can submit your question.

‘Eye-Touch’ – an eye-controlled musical instrument

Last week we were visited in Enable Ireland, Sandymount, by two of the most experienced practitioners working in the area of assistive music technology. Dr Tim Anderson http://www.inclusivemusic.org.uk/ and Elin Skogdal (SKUG) dropped by to talk about the new eyegaze music software they have been developing and to share some tips with the musicians from Enable Ireland Adult Services. Tim Anderson has been developing accessible music systems for the last 25 years. E-Scape, which he developed, is the only MIDI composition and performance software designed from the ground up for users of alternative input methods (switch, joystick and now eyegaze). Tim also works as an accessible music consultant for schools and councils. Elin Skogdal is a musician and educator based at the SKUG Centre; she has been using Assistive Music Technology in music education since 2001 and was one of those responsible for establishing the Centre. The SKUG Centre is located in Tromsø, Northern Norway. SKUG stands for "Performing Music Together Without Borders", and the aim of the Centre is to provide opportunities for people who can't use conventional instruments to play and learn music. SKUG is part of the mainstream art school of Tromsø (Tromsø Kulturskole), which provides opportunities for SKUG students to collaborate with other music and dance students and teachers. SKUG have students at all levels and ages, from young children to university students. If you would like to know more about Elin's work at SKUG, click here to read a blog post from Apollo Ensemble.

Following the visit and workshop they sent us some more detailed information about Eye-Touch, the exciting new eyegaze music software they are currently developing. We have included this in the paragraphs below. If you are interested in getting involved in their very user-led development process you can contact us here (comments below) and we will put you in touch with Tim and Elin.

‘Eye-Touch’ (funded by ‘NAV Hjelpemidler og tilrettelegging’ in 2017 and Stiftelsen Sophie’s Minde in 2018) is a software instrument being developed by the SKUG Centre (part of ‘Kulturskolen i Tromsø’), in collaboration with Dr. Tim Anderson, which enables people to learn and play music using only their eyes. It includes a built-in library of songs called ‘Play-screens’, with graphical buttons which play when you activate them.
Buttons are laid out on screen to suit the song and the player’s abilities, and can be of any size and colour, or show a picture. When you look at a button (using an eye-gaze tracking system such as Tobii or Rolltalk) it plays its musical content. You can also play buttons in other ways to utilise the screen’s attractive look: you can touch a touch-screen or smartboard, press switches or PC keys, or hit keys on a MIDI instrument.
The music within each button can either be musical notes played on a synthesised instrument, or an audio sample of any recorded sound, for example animal noises or sound effects. Sound samples can also be recordings of people’s voices speaking or singing words or phrases. So a child in a class group could play vocal phrases to lead the singing (‘call’), with the other children then answering by singing the ‘response’.

Pictured above, a pupil in Finland is trying out a screen with just three buttons, containing musical phrases plus a sound effect of a roaring bear (popular with young players!). She had been using the system for just a few minutes and was already successfully playing the song, which proved very enjoyable and motivating for her.
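The core mechanic described above, looking at a button in order to play it, is typically implemented with a dwell timer. Here is a minimal sketch of that idea; it is not Eye-Touch's actual code, and the class name and 0.8 second threshold are illustrative assumptions:

```python
# A minimal sketch (not Eye-Touch's code) of gaze "dwell" activation:
# a button fires once the gaze point reported by the eye tracker has
# rested inside it for a set dwell time.

DWELL_TIME = 0.8  # seconds the gaze must rest on a button before it sounds

class GazeButton:
    def __init__(self, x, y, width, height, sound):
        self.x, self.y, self.width, self.height = x, y, width, height
        self.sound = sound            # a note, chord or recorded sample
        self.entered_at = None        # when the gaze entered the button

    def contains(self, gx, gy):
        return (self.x <= gx <= self.x + self.width
                and self.y <= gy <= self.y + self.height)

    def update(self, gx, gy, now):
        """Feed in each gaze sample; returns True when the button fires."""
        if not self.contains(gx, gy):
            self.entered_at = None    # gaze left the button: reset the timer
            return False
        if self.entered_at is None:
            self.entered_at = now     # gaze just arrived: start the timer
        if now - self.entered_at >= DWELL_TIME:
            self.entered_at = None    # reset so the button can be replayed
            return True
        return False
```

Each time the tracker reports a new gaze position, every button is updated, and any button whose dwell completes plays its musical content.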

SKUG’s experience with their previous prototype system has led to the incorporation of some innovative playing features which distinguish Eye-Touch from other eyegaze music systems, and which have been shown to enable people to play who couldn’t otherwise. These features provide an easy entry level, and we have found that they enable new users to start playing immediately and gain motivation. The support features can also be changed or removed by teachers to suit each player’s abilities and, most importantly, can evolve as a player practises and improves. One such feature is to put the buttons in a sequence which can only be played in the right order, so the player can ‘look over’ other buttons to get to the next ‘correct’ button.
Here are two examples. The Play-screen below has buttons each containing a single note, arranged as a keyboard with colouring matching the Figurenotes scheme. A player with enough ability could learn a melody and play it by moving between the buttons via the empty space below them. But by putting the buttons into a sequence order, the player is able to learn and play the melody far more easily: they can look over buttons to get to the next ‘correct’ button (note) of the song, without playing the buttons in between.

Screenshot from Eye-Touch: a keyboard-style Play-screen
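The sequence-order feature just described can be sketched as a simple gate in front of the buttons. Again this is illustrative only, reusing the dwell-activated button idea from the earlier sketch:

```python
# Illustrative sketch of sequence-order playing: only the next note of the
# song will sound, so the player can safely "look over" every other button.

class SequencedPlayScreen:
    def __init__(self, order):
        self.order = order        # button ids in song order
        self.position = 0         # index of the next "correct" button

    def on_dwell(self, button_id):
        """Called when any button's dwell completes; returns what to play."""
        if self.position >= len(self.order):
            return None                        # song finished
        if button_id != self.order[self.position]:
            return None                        # looked over: stays silent
        self.position += 1
        return button_id                       # the correct note: play it

screen = SequencedPlayScreen(["C", "C", "G", "G", "A", "A", "G"])
print(screen.on_dwell("G"))   # None: not the next note in the sequence
print(screen.on_dwell("C"))   # 'C': correct note, the song advances
```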
As well as illustrating a general theme, the facility to add pictures gives us many more possibilities. The Play-screen below left has buttons which show pictures and play sounds and music relating to J.S. Bach’s life story. The buttons could be played freely, but in this case they have been put into a sequence order to illustrate his life chronologically. As before, a player can move through the buttons to play them in order, even though they are close together. But we may want to make the buttons even bigger, and make the player’s job even easier, by setting the screen to display only the ‘next’ button in the sequence (below right). The other buttons are hidden, so the player only sees the button which is next to play, and can then move onto it.

Play-screen featuring images representing the life of the composer J.S. Bach; each picture plays some music from that period. The lesson can be split into stages to make it more accessible.

There is also an accompanying text to tell the story which, if desired, can be displayed on screen via a built-in ‘song-sheet’. Teachers can also make their own Play-screens by putting their own music into buttons, either by playing live on a MIDI keyboard or by recording their own sound samples. To further personalise a Play-screen for a pupil, people can also organise and edit all the visual aspects, including adding their own pictures.
The Eye-Touch software is also very easy to install and operate – we have found it quick and easy to install it on school pupils’ eye-gaze tablets, and it worked for them straight away.
In January 2018 the SKUG team started a project to further develop Eye-Touch to expand the ways of playing, the creating and editing facilities for teachers, and the range of songs provided in the library.

Route4U – Accessible route planning

Tamas and Peter from route4u.org called in last week to tell us about their accessible route planning service. Based on OpenStreetMap, Route4u allows users to plan routes that are appropriate to their level and method of mobility. Available on iOS, Android and as a web app at route4u.org/maps, Route4u is the best accessible route planning solution I have seen. Where a service like Mobility Mojo gives detailed accessibility information on destinations (businesses, public buildings), Route4u concentrates more on the journey, making them complementary services. When first setting up the app you will be given the option to select pram, active wheelchair, electronic wheelchair, handbike or walking (first screenshot below). You can further configure your settings later in the accessibility menu, selecting curb heights, maximum slopes and so on (second screenshot below).

You are first asked to select your mobility method

Further configure your settings in the accessibility menu (maximum slope, curb height etc.)
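Route4u haven't published how these settings feed into their routing, but a minimal sketch, with a made-up segment model and invented limits, shows how a mobility profile might be used to filter out stretches of pavement a user cannot traverse:

```python
# Illustrative only: Route4u's real algorithm is not public. This shows how
# mobility-profile settings like those in the screenshots above (maximum
# slope, curb height) might filter map segments. All numbers are made up.

PROFILES = {
    "pram":                  {"max_slope_pct": 10.0, "max_curb_cm": 5.0},
    "active wheelchair":     {"max_slope_pct": 8.0,  "max_curb_cm": 4.0},
    "electronic wheelchair": {"max_slope_pct": 6.0,  "max_curb_cm": 2.0},
}

def passable(segment, profile):
    """segment: dict with 'slope_pct' and 'curb_cm' taken from map data."""
    limits = PROFILES[profile]
    return (segment["slope_pct"] <= limits["max_slope_pct"]
            and segment["curb_cm"] <= limits["max_curb_cm"])

# A route planner would then only search over the passable segments:
segments = [
    {"slope_pct": 3.0, "curb_cm": 0.0},   # dropped curb, gentle slope
    {"slope_pct": 4.0, "curb_cm": 9.0},   # high curb: blocks wheelchairs
]
print([passable(s, "active wheelchair") for s in segments])  # [True, False]
```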

This is great, but so far nothing really groundbreaking; we have seen services like this before. Forward thinking cities with deep pockets like London and Ontario have had similar accessibility features built into their public transport route planners for the last decade. That is a lot easier to achieve, however, because you are dealing with a finite number of route options. Where Route4u is breaking new ground is in facilitating this level of planning throughout an entire city. It does this by using the technology built into smartphones to provide crowdsourced data that constantly updates the maps. If you are using a wheelchair or scooter, the sensors on your smartphone can measure the level of vibration experienced on a journey. This data is sent back to Route4u, who use it to estimate the comfort of that journey, giving other users even more information on which to base their route choice. The user doesn’t have to do anything; they help to improve the service simply by using it. Users can also improve the service more proactively by marking obstacles they encounter on their journey. An obstacle can be marked as temporary or permanent. Temporary obstacles, like road works or those ubiquitous sandwich boards that litter our pavements, will remain on the map, helping to inform the accessibility of the route, until another user confirms they have been removed and enters that information.

Example of an obstacle added by a user: a curb that may not be accessible to a wheelchair

Example of an obstacle added by a user: a gate which would not be accessible to a wheelchair
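How might the vibration measurement described above work? Here is a rough sketch, with invented numbers and field names, of turning accelerometer samples into a comfort estimate that could be reported back to the service:

```python
# A rough sketch of the crowdsourced comfort idea: while a wheelchair user
# travels, the phone's accelerometer is sampled; the vibration level for
# each stretch of the journey is summarised and sent back with its
# location to update the map. All names and thresholds are hypothetical.

import math

def vibration_rms(samples):
    """Root mean square of accelerometer magnitudes (m/s^2, gravity removed)."""
    return math.sqrt(sum(a * a for a in samples) / len(samples))

def comfort_score(samples, very_rough=5.0):
    """Map vibration to a rough 0.0 (rough) to 1.0 (smooth) estimate."""
    return max(0.0, 1.0 - vibration_rms(samples) / very_rough)

window = [0.2, 0.4, 0.3, 1.8, 2.1, 0.5]   # one stretch of pavement
report = {"lat": 53.34, "lon": -6.26, "comfort": round(comfort_score(window), 2)}
print(report)  # sent back to the service to refresh the route's rating
```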

If you connect Route4u to your Facebook account you get access to a points-based reward system. This allows you to compete with friends and have your own league table. In Budapest, where they are already well established, they have linked with sponsors who allow you to cash in points for more tangible rewards like a free breakfast or refreshment. These gamification features should help encourage users less inclined towards altruism to participate, and that is key: once established, Route4u relies on its users to keep information up to date. This type of service based on crowdsourced data is a proven model, particularly in the route planning sphere. It’s a bit of a catch-22, however, as a service needs to be useful first in order to attract users. It is early days for Route4u in Dublin, and Tamas and Peter acknowledge that a lot of work needs to be done before promoting the service here. Over the next few months their team will begin mapping Dublin city centre; this way, when they launch, there will be the foundation of an accessible route planning service which people can use, update and build upon. While Route4u has obvious benefits for end users with mobility difficulties, there is another beneficiary of the kind of data this service will generate. Tamas and Peter were also keen to point out how this information could be used by local authorities to identify where infrastructure improvements are most needed and where investment will yield the most return. In the long run this will help Dublin and her residents tackle the accessibility problem from both sides, making it a truly smart solution.

Area that has been mapped, with routes shown in blue, green and red

Legend showing levels of accessibility

Digital Textbooks – Better but still not good enough

It’s that time of year again. The days are getting shorter and there is a definite nip in the evening air. After two or three months of carefree holidays, children and young adults all over the country are getting ready for another academic year. Although it is more years than I care to mention since my school days, I share the sense of foreboding felt by some of these young people as summer draws to a close. It’s not the approach of double maths on a Monday morning or a state exam on the horizon that I dread. As an AT Technician working in Enable Ireland, it is the inevitable queries from parents and therapists about digital textbooks that are the cause of my anxiety. Can we get textbooks in digital format? How? Will they be compatible with the technology being recommended? If they are workbooks, how will they fill in the answers? These are some of the very pertinent, and for the most part frustratingly unanswerable, questions that come in at this time of year. In the remainder of this post I’ll try to clarify the current situation; just don’t expect all the answers... sorry.

Can you get textbooks in digital format?

In April 2016 the Irish Educational Publishers’ Association (IEPA), who represent 95% of Irish educational publishing houses, agreed a centralised special needs policy on making texts available in digital format. This is progress, although limited, as you will soon see. Their policy (which you can read here) falls short of committing to supply a digital version of a textbook to everyone who needs one: “The publisher will make every effort to accommodate the request but cannot guarantee the availability of a particular title, or a title in a specific format. The format of the title remains at the discretion of the publisher.” Reading into this a little, I think it’s safe to assume that all the commonly used titles will be available but anything a bit out of the ordinary will not.

How do I get digital versions of school textbooks?

Up until last year this was a tough one: each publisher had different requirements and there was little information publicly available. Thankfully the IEPA have made some efforts to standardise the process, which is also outlined on the page linked above. “The request must be submitted by a parent, or teacher, of the named student, accompanied by acceptable proof of medical condition. Files, in pdf, text files or eBook access are then provided to the student in question.” Obviously it’s not ideal that “proof of medical condition” needs to be submitted, but it is perhaps understandable from the publishers’ perspective that there are some restrictions.

Will the digital textbooks be compatible with the technology being recommended?

This is the question that keeps me up at night (well, this and the new season of Game of Thrones) because there are so many variables. We would need to know the format the textbooks will be supplied in, and the IEPA are very non-committal in this regard. Statements like “Files, in pdf, text files or eBook..” and “The format of the title remains at the discretion of the publisher” make it quite clear that they refuse to be pinned down. This really needs to be looked at. It is not in the publishers’ interests to commit to a specific format. It is, however, in the students’ interests, particularly students with access or literacy difficulties that require the use of Assistive Technology. This is something the Department of Education needs to enforce, as is the case in other jurisdictions. The only advice I can give here is to contact the publishers to find out what format the textbooks will be supplied in, then contact us at Enable Ireland AT Service.

If they are workbooks, how will they fill in the answers?

That depends on the format; see the previous answer (sorry).

If you are looking for more on this subject, you can read last year’s rant, AT in the Era of the Digital Schoolbag, here.

Eye Control – Inbuilt EyeGaze Access for Windows 10

Just yesterday Microsoft announced what is possibly their biggest step forward in Ease of Access functionality since Windows 7. Eye Control is an inbuilt Windows 10 feature that facilitates access to the OS using a low cost eyegaze peripheral, the Tracker 4C from Tobii. There is more about what you can actually do with Eye Control below, but first a little background on how it came about.

Steve Gleason and his son

Former American Football professional and MND (ALS) sufferer Steve Gleason (above) challenged Microsoft in 2014 to help people affected by this degenerative condition through the advancement of eye tracking technology. This initial contact led to the development of a prototype eyegaze-controlled wheelchair, which received lots of publicity and generated increased awareness in the process. It was, however, never likely to be developed into a product available to other people in a similar situation. What the project did achieve was to pique the interest of some of the considerable talent within Microsoft in the input technology itself and its applications, particularly for people with MND.

A combination of factors on both sides of the Atlantic has made it difficult to provide timely AT support to people diagnosed with MND. Eyegaze input is the only solution that allows successful computer access as the condition progresses, eye movement often being the only movement left in the final stages of the illness. Historically, however, the cost of the technology meant that insurance, government funding or private fundraising was the only means by which people could pay for eyegaze equipment. Usually this resulted in a significant delay which, given the often aggressive nature of MND, meant valuable time was lost and the solution frequently arrived too late. This situation was recognised by Julius Sweetland, who back in 2015 led the development of Optikey, an Open Source computer access/AAC solution designed to work with low cost eye trackers. Interestingly, some of the innovative features of Optikey seem to have made it into Eye Control on Windows 10 (multi-key selection, called Shape Writing in Eye Control; see the gif below).

Demo of shape writing in Eye Control. It works like swiping on a touch keyboard: dwell on the first letter of a word, glance at the subsequent letters, then dwell on the last letter and the word is entered.
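As a toy illustration of the idea (a simplification, not Eye Control's or Optikey's actual matcher), a shape-writing system can shortlist dictionary words that start and end on the dwelled letters and whose middle letters appear, in order, along the glanced path:

```python
# Toy shape-writing matcher: the user dwells on the first and last letters
# and merely glances across the ones between, so we look for dictionary
# words whose middle letters appear, in order, in the glanced sequence.

def is_subsequence(needle, haystack):
    it = iter(haystack)
    return all(ch in it for ch in needle)   # consumes 'it' as it matches

def candidates(first, last, glanced, dictionary):
    """glanced: letters the gaze passed over between the two dwells."""
    return [w for w in dictionary
            if w[0] == first and w[-1] == last
            and is_subsequence(w[1:-1], glanced)]

print(candidates("h", "o", "ell", ["hello", "halo", "hero", "house"]))
# ['hello']  (a real system would also rank candidates by path similarity)
```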

Since the initial Steve Gleason wheelchair hack there has been a steady stream of high quality research papers from people at Microsoft on the subject of eyegaze input and MND solutions. This should have been a hint that something like Eye Control was on the horizon. Eyegaze input has promised to break into the mainstream several times over the last decade, but with Eye Control and device support now included in the core Windows OS, it has never been this close.

For more background on the path to Eye Control, see this blog post from Microsoft: From Hack to Product, Microsoft Empowers People with Eye Control for Windows 10

If you want early access to Eye Control, or some more information on its functionality, read this post from Tobii (be warned, there are still bugs): How to get started with Eye Control on Windows.

Hands-free Minecraft from Special Effect

Love it or hate it, the game of Minecraft has captured the imagination of over 100 million young, and not so young, people. It is available on multiple platforms (mobile devices via the Pocket Edition, Raspberry Pi, computer, Xbox and PlayStation) and it looks and feels pretty much the same on all of them. For those of us old enough to remember, the blocky graphics will hold some level of nostalgia for the bygone 8-bit days, when mere blobs of colour and our imagination were enough to render Ghosts and Goblins vividly. This is almost certainly lost on the main cohort of Minecraft players, however, who would most probably be bored silly by the two-dimensional, repetitive and predictable video games of the 80s and early 90s. The reason Minecraft is such a success is that it has blended its retro styling with modern gameplay and a (mind-bogglingly massive) open world where no two visits are the same and there is room for self-expression and creativity.

This latter quality has led it to become the first video game to be embraced by mainstream education, being used as a tool for teaching everything from history to health and from empathy to economics. It is, however, the former quality, the modern gameplay, that we are here to talk about. Unlike the aforementioned Ghosts and Goblins, Minecraft is played in a three-dimensional world using either the first person perspective (you see through the character's eyes) or the third person perspective (as if a camera were hovering above and slightly behind the character). While this undoubtedly offers a more immersive and realistic experience, it also means controlling the character and playing the game is much more complex, requiring a high level of dexterity in both hands to be successful. For people without the required level of dexterity this means not only a risk of social exclusion, being unable to participate in an activity so popular among their peers, but also the possibility of being excluded within an educational context.

Fortunately, UK based charity Special Effect have recognised this need and are in the process of doing something about it. Special Effect are a charity dedicated to enabling those with access difficulties to play video games through custom access solutions. Since 2007 their interdisciplinary team of clinical and technical professionals (and of course gamers) has been responsible for a wide range of bespoke solutions based on individuals' unique abilities and requirements. Take a look at this page for some more information on the work they do and to see what a life-enhancing service they provide. The problem with this approach of course is reach, which is why their upcoming work on Minecraft is so exciting. Based on Optikey, the Open Source eyegaze AAC/computer access solution by developer Julius Sweetland, Special Effect are in the final stages of developing an on-screen Minecraft keyboard that will work with low cost eye trackers like the Tobii EyeX and the Tracker 4C (€109 and €159 respectively).

The inventory keyboard

The main Minecraft on-screen keyboard

Currently called ‘Minekey’, this solution will allow Minecraft to be played using a pointing device like a mouse or joystick, or even totally hands-free using an eyegaze device or head mouse. The availability of this application will ensure that Minecraft is now accessible to many of those who have previously been excluded. Special Effect were kind enough to let us trial a beta version of the software and, although I’m no Minecraft expert, it seemed to work great. The finished software will offer a choice of on-screen controls: one with smaller buttons and more functionality for expert eyegaze users (pictured above) and a more simplified version with larger targets. Bill Donegan, Projects Manager with Special Effect, told us they hope to have it completed and available to download for free by the end of the year. I’m sure this news will excite many people out there who had written off Minecraft as something just not possible for them. Keep an eye on Special Effect or ATandMe for updates on its release.

Bloom 2017 ‘No Limits’ Grid Set

You may have heard about or seen photos of Enable Ireland’s fantastic “No Limits” garden at this year’s Bloom festival. Some of you were probably even lucky enough to have actually visited it in the Phoenix Park over the course of the bank holiday weekend. In order to support visitors, but also to allow those who didn’t get the chance to go to share in some of the experience, we put together a “No Limits” Bloom 2017 Grid set. If you use the Grid (2 or 3) from Sensory Software, or you know someone who does, and you would like to learn more about the range of plants used in Enable Ireland’s garden, you can download and install it by following the instructions below.

How do I install this Grid?

If you are using the Grid 3 you can download and install the Bloom 2017 Grid without leaving the application. From Grid explorer:

  • Click on the Menu Bar at the top of the screen
  • In the top left click the + sign (Add Grid Set)
  • A window will open (pictured below). In the bottom corner click on the Online Grids button (you will need to be connected to the Internet).

The Add Grid Set window in Grid 3

  • If you do not see the Bloom2017 Grid in the Newest section you can either search for it (enter Bloom2017 in the search box at the top right) or look in the Interactive Learning or Education categories.

If you are using the Grid 2, or you want to install this Grid set on a computer or device that is not connected to the Internet, you can download the Grid set at the link below. You can then add it to the Grid as above, except select the Grid Set File tab and browse to where you have the Grid set saved.

For Grid 2 users:

Download Bloom 2017 Grid here https://grids.sensorysoftware.com/en/k-43/bloom2017

Makers Making Change – Canada provides $750,000 to fund development of Open Source AT

Makers Making Change have a mission: to “connect makers to people with disabilities who need assistive technologies”. This is also our mission and something we’ve talked about before; it is likewise the goal of a number of other projects, including TOM Global and Enable Makeathon. Makers Making Change, which is run by Canadian NGO the Neil Squire Society and supported by Google.org, differs from previous projects sharing the same goal in a couple of ways. Firstly, their approach: they are currently concentrating their efforts on one particular project, the LipSync, and touring the North American continent holding events, called Buildathons, where groups of makers get together and build a quantity of these devices. This approach raises awareness about the project within the maker community while also ensuring they have plenty of devices in stock, ready to go out to anybody who needs them. Secondly, thanks to the recent promise from the Canadian government of funding to the tune of $750,000, they may be on the verge of bringing their mission into the mainstream.

Canada has always had a well-deserved reputation for being at the forefront of Assistive Technology and Accessibility. It is one of only a handful of nations the rest of the world looks to for best practice approaches in the area of disability. For that reason this funding, announced by Minister of Sport and Persons with Disabilities Carla Qualtrough, may have a positive effect even greater than its significant monetary value, and far beyond Canada’s borders. Minister Qualtrough stated the funding was “for the development of a network of groups and people with technical skills to support the identification, development, testing, dissemination and deployment of open source assistive technologies.” Specifying that it is Open Source assistive technologies they will be developing and disseminating means that any solutions identified can be reproduced by makers anywhere in the world. It is also interesting that the funding is to support the development of a network of groups and people rather than specific technologies; the goal here is sustainability. Neil Squire Society Executive Director Gary Birch said: “This funding is instrumental in enabling the Neil Squire Society to develop, and pilot across Canada, an innovative open source model to produce and deliver hardware-based assistive technologies to Canadians with disabilities.” Hopefully this forward thinking move by the Canadian Government will inspire some EU governments into promoting, and maybe even funding, similar projects over here.

What is the LipSync?

The LipSync is an Open Source sip-and-puff, low force joystick that can enable access to computers or mobile devices for people without the use of their hands. Sound familiar? If you are a regular reader of this blog you are probably thinking of the FlipMouse; they are similar devices. I haven’t used the LipSync, but from what I’ve read it offers slightly less functionality than the FlipMouse, which may actually make it more suitable for some users. Take a look at the video below.

If you want to know more about the LipSync, have a look at their project page on Hackaday.io, where you will find build instructions, a bill of materials, code and a user manual.
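To give a flavour of how a sip-and-puff device works, here is a minimal Python sketch of the principle. It is not the LipSync firmware (the real code is on the Hackaday.io page above); the thresholds and action names are invented for illustration:

```python
# Illustrative sketch of the sip-and-puff principle: small deviations from
# ambient mouthpiece pressure are read as deliberate sips or puffs and
# mapped to mouse actions. All thresholds and names here are invented.

SIP_THRESHOLD = -0.15    # normalised pressure below ambient = "sip"
PUFF_THRESHOLD = 0.15    # normalised pressure above ambient = "puff"

def interpret(pressure):
    """Map one normalised pressure reading to a mouse action name."""
    if pressure >= PUFF_THRESHOLD:
        return "left_click"       # a puff clicks
    if pressure <= SIP_THRESHOLD:
        return "right_click"      # a sip right-clicks
    return None                   # neutral: the joystick just moves the cursor

for reading in (0.02, 0.31, -0.22):
    print(interpret(reading))     # None, left_click, right_click
```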

If the idea of building or designing a technology that could enhance the life of someone with a disability or an older person appeals to you, either head down to your local maker space (Ireland, Global) or set a date in your diary for Ireland’s premier Maker Faire, Dublin Maker, which takes place in Merrion Square, Dublin 2, on Saturday July 22nd. We’ll be there showing the FlipMouse as well as some of our more weird and wonderful music projects. There will also be wild, exciting and inspiring demonstrations and projects from maker spaces, maker groups and Fab Labs from around the country and beyond. See here for a list of those taking part.

Accessibility Checker for Word Tutorial

The Accessibility Checker feature has been part of Microsoft Office for the last few iterations of the software package. It provides a fast and easy way to check whether the content you are producing is accessible to users of assistive technology. By making accessibility accessible Microsoft have left no room for excuses like “I didn’t know how…” or “I didn’t have time..”. You wouldn’t send a document to all your colleagues full of misspellings because you were in a hurry would you? The one criticism that could have been leveled at Microsoft was perhaps they didn’t provide enough support to new users of the tool. As I said above it’s easy to use but sometimes users need a little extra support, especially when you are introducing them to something that may be perceived as additional work. Thankfully Microsoft have filled that gap with a 6 part tutorial video which clearly explains why and how to get started using Accessibility Checker. Part 1 is a short introduction (embedded below) followed by a video on each important accessibility practice; Alternative Text, Heading Styles, Hyperlinks, File naming and Tables. Each video is accompanied by a short exercise to allow you put your new skill into practice immediately. The whole tutorial can be completed in under 20 minutes. This tutorial should be a requirement for anybody producing documents for circulation to the public. Have a look at the introduction video below.