As we all figure out how best to cope with the Covid-19 pandemic and the social distancing that comes with it, we figured that many of you might be interested in learning about Assistive Technologies for Creative Expression and Leisure: Music, Photography and Gaming. Some of these may come in very handy as we all try to stay connected with one another during these trying times.
We are making our AT for Creative Expression and Leisure courses free for everyone to access over the next few months. These 4 short courses look at some ways that technology can assist people with disabilities in engaging in creative pursuits and leisure activities. We have included the Introduction course below. This should be of interest to everybody and helps frame the subsequent content. The remaining 3 courses are available on our Learning Portal at enableirelandAT.ie.
You will need to create an account to access these courses but once you have your account you can self-enrol for free. Creating an account is easy. All you need is access to email to confirm your account. There is a video at the bottom of this post which will guide you through the process of creating an account. You don’t need to look at the second half of the video as these courses do not require an enrolment key.
Please let us know how you get on, and feel free to post your queries and comments at the bottom of this page. We’d love to hear what your own experiences are, and if there is content that you think we should add to these courses.
Below we have embedded the Introduction course. It’s too small to use as is, but you can make it full screen by clicking the third blue button from the left at the bottom, or click here to open in a new tab/window.
Leisure and gaming can sometimes be overlooked when considering the needs of an individual, but they can be an important part of a young person’s development and help enable inclusion in society. This module looks at how we can make leisure time and gaming more inclusive of a wide range of abilities. There are now many options for accessible toys, game consoles and switch adapted toys. The module covers a sample of these options with some suggested links for further reading.
Music is an accessible means of creative expression for all abilities. Even the act of passively listening to music engages the brain in the creative process. In this short course we will look at some mainstream and specialist hardware and software that can help facilitate creative musical expression.
Eye gaze is an alternative way of accessing a computer, using eye movements to control the
mouse. It is achieved through a combination of hardware and software. The hardware
is a USB peripheral called an eye tracker. The eye tracker is positioned
underneath the computer monitor. It contains a camera and Infrared lights. The
user is positioned between 500 and 1000 mm from the monitor (600mm is usually
about right) where the camera has a clear view of their eyes. The Infrared
lights highlight the user’s pupils (think of red eye in photographs where a
flash has been used) and create reflections on the user’s eyeballs. After a calibration
process where the user looks at a dot moving around the screen, the software
can accurately tell where the user is looking based on the reflections and
movements of the pupil. For computer access the user will also need some method
of clicking. There are 3 methods usually used. Dwell is the most common method.
This is where the click is automated. If the user holds their gaze (dwells) on
a button or icon for more than a specified time duration, usually somewhere
from 0.5 to 1.5 seconds, a click is sent. A slight variation of this is used in some
software designed for eyegaze where the button is activated after the user
dwells on it. The main difference here is that the second method offers us the
ability to select different dwell times for different buttons. The other input
methods are less common. The first would be to use an external switch as a
mouse click, the second would be to use a deliberate blink (longer than a
normal blink to prevent accidental clicks) as a mouse click.
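The dwell method described above is easy to sketch in code. The sketch below is a minimal illustration of the idea only; the 40-pixel jitter radius, 0.8-second default and class name are assumptions, not taken from any particular eyegaze product.

```python
import time

class DwellClicker:
    """Minimal dwell-click logic: if the gaze stays within a small radius
    of the same point for longer than dwell_time, emit one click there.
    Real eyegaze software layers gaze smoothing and per-button dwell
    times on top of this basic idea."""

    def __init__(self, dwell_time=0.8, radius=40):
        self.dwell_time = dwell_time  # seconds the gaze must be held
        self.radius = radius          # pixels of allowed gaze jitter
        self.anchor = None            # (x, y) where the current dwell began
        self.start = None             # timestamp of the dwell start
        self.clicked = False          # have we already clicked this dwell?

    def update(self, x, y, now=None):
        """Feed in one gaze sample; returns (x, y) when a click fires, else None."""
        now = time.monotonic() if now is None else now
        moved = (self.anchor is None or
                 (x - self.anchor[0]) ** 2 + (y - self.anchor[1]) ** 2
                 > self.radius ** 2)
        if moved:
            # Gaze left the dwell zone: restart the timer at the new point.
            self.anchor, self.start, self.clicked = (x, y), now, False
            return None
        if not self.clicked and now - self.start >= self.dwell_time:
            self.clicked = True       # one click per dwell, no repeats
            return self.anchor
        return None
```

Note how a single dwell produces exactly one click: this is why passing your gaze briefly over a button on the way to another one does not fire it, as long as the dwell time is longer than the pass.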
Eye Tracker Devices
Tobii Eye Tracker 4C https://gaming.tobii.com/tobii-eye-tracker-4c/
– This is a great option for those wanting to use eyegaze for activities like
music and gaming but have other AT as their main access method. It is every bit
as good as the two much more expensive “AT” eye trackers below and costs in the
region of €170.
PC Eye Plus and Mini https://www.tobiidynavox.com/products/devices/
– The PC Eye Mini and PC Eye Plus are probably the most popular AT eye
trackers. The Mini will work well on a monitor up to 19”, while the Plus will work on screens up to 28”. The Plus also contains a high-quality microphone array to support speech recognition, as well as a switch input port.
Challenges associated with playing music using eye movement
There are a number of difficulties we might encounter when playing music using eye
movements but all can be overcome with practice and by using some common music
production tools and techniques. Eye gaze as an input method is quite
restrictive. You only have one point of direct access, so you can think of it
like playing a piano with one finger. To compound this difficulty and expand
the piano analogy, because your eyes are also your input you cannot queue up
your next note like a one fingered piano player might. Eyegaze in itself is
just eye pointing, using it as an access method will require some input (click)
either a switch or a dwell (automatic click after a specific time duration, usually somewhere from 0.5 to 1.5 seconds). If you are using dwell for input then
this will add a layer of difficulty when it comes to timing. You could set the
dwell to be really fast (like 0.1 seconds) but you may run into accidental
activations in this case, for example playing a note as you are passing over it
on the way to your intended note. Some of the specialist eyegaze software
instruments like EyeHarp, EyePlayMusic and ii-music overcome this by using a
circular clock-style interface. This allows them to set the onscreen buttons to instant activation and, because of the radial layout, each note can be directly accessed from the centre without passing over another note. With the radial design, if our eyes are in a central position all notes are an equal distance from us and
can be accessed in the most efficient way but we are still left with the “one
finger piano” restriction. This means no chords and only the option of playing at
a slower tempo. Using mainstream music production tools like sequencers, arpeggiators or chord mode can overcome this limitation and allow us to create much more complex music using eyegaze. A sequencer would allow you to pre-program
accompanying notes with which to play along. An arpeggio is sometimes referred
to as a broken chord. It is the notes of a chord played consecutively rather
than simultaneously. Arpeggios are used a lot in electronic music. When playing arpeggios, the slower input is offset by the additional life and movement the arpeggio provides.
Chord mode is something that can be set up in many digital audio workstations. You
can map one note to automatically play the accompanying notes required to make
it a chord. Live looping could also be used. In looping we would record a
section being played live, then loop it back and play other notes over it. Other
effects like delay, reverb and many more besides, will also allow us to make our music more interesting. Expression is another difficulty when playing music using eye tracking. By expression we
mean how an accomplished musician can play the same note in different ways to
make it more expressive. Velocity is a common means of expression; you can think of this as how fast/hard a note is struck. Velocity can affect volume and other qualities of the instrument’s sound. Another common means of expression is provided by pedals like
those on an organ or piano. Using eyegaze we really only have the ability to
turn the note on or off. Some of the software however breaks note areas up into
sections, each one giving an increased velocity (see photo below).
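The velocity-zone idea can be sketched in a few lines of code. This is an illustration only: the zone count, velocity range and function name are assumptions, not taken from EyeHarp or any other eyegaze instrument.

```python
def gaze_velocity(y, button_top, button_height, zones=4,
                  min_velocity=40, max_velocity=127):
    """Map a gaze point's vertical position inside an on-screen note
    button to a MIDI velocity (0-127 in the MIDI standard). The button
    is split into `zones` horizontal bands; looking further down the
    button gives a harder (higher-velocity) note."""
    # Clamp the gaze to the button's bounds, then work out which band it is in.
    offset = min(max(y - button_top, 0), button_height - 1)
    zone = int(offset * zones / button_height)          # 0 .. zones-1
    step = (max_velocity - min_velocity) / (zones - 1)  # velocity per band
    return round(min_velocity + zone * step)
```

With four zones, a glance at the top of the button plays softly (velocity 40) while a glance at the bottom plays at full velocity (127), recovering a little of the expression that a pedal or a striking hand would normally provide.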
Software for playing music with Eyegaze
Eye Harp http://theeyeharp.org/ One of the first software
instruments made specifically for eyegaze, the EyeHarp remains one of the best
options. This software was originally developed as a college project (I’m guessing he got a first!) and rather than let it die, developer Zacharias Vamvakousis made it available free and open source. After a few years with no updates, the news
is that there are some big updates on the way. We are looking forward to seeing
what they have in store for us.
Apollo Ensemble – http://www.apolloensemble.co.uk/. Although this software can enable someone to play music using eyegaze, it can do so much more besides. In the right hands this application can coordinate an entire music group, all using different AT and alternative instruments.
E-scape http://www.inclusivemusic.org.uk/ developed by Dr Tim Anderson is eyegaze accessible software for composition and performance. We have posted about this wonderful software and the legend of accessible music, Tim Anderson before here.
EyePlayMusic https://mybreathmymusic.com/en/eyeplaymusic. Another great of accessible music, Ruud Van Der Wel of My Breath My Music, collaborated on this free eyegaze music app. Simple but effective, this could be a great starting point before moving on to the EyeHarp.
Another option for eyegaze music production is using software like the Grid 3 or Iris to create an eyegaze accessible interface for a mainstream digital audio workstation. The demo below is done using Ableton Live; however, any software that offers keyboard mapping or keyboard shortcuts (so any quality software) could be used in the same way.
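The chord mode and arpeggiator techniques mentioned earlier can also be sketched in code. The function names and the way the mapping is expressed here are illustrative assumptions, not any DAW’s actual API; the interval patterns themselves are standard music theory (MIDI note 60 is middle C).

```python
# Chord mode: one incoming note is expanded into the extra notes needed
# to make a chord, so a single eyegaze "press" produces a full chord.
CHORD_INTERVALS = {
    "major": (0, 4, 7),    # root, major third, perfect fifth
    "minor": (0, 3, 7),    # root, minor third, perfect fifth
    "major7": (0, 4, 7, 11),
}

def chord_mode(root_note, quality="major"):
    """Return the MIDI notes a single note-on would trigger in chord mode."""
    return [root_note + i for i in CHORD_INTERVALS[quality]]

def arpeggiate(notes, steps=8):
    """Simple up-pattern arpeggiator: cycle through the chord's notes one
    per step, playing the 'broken chord' described earlier."""
    return [notes[i % len(notes)] for i in range(steps)]
```

A single dwell on middle C in major chord mode would thus sound C, E and G together, and the arpeggiator can then animate those same notes over time, which is exactly how one slow input point can still produce lively, layered music.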
Last year I wrote a review of the Xbox adaptive controller. I detailed how it had opened up the world of gaming to many people with a disability after years of looking longingly at gamers who delved into another round of FIFA or Grand Theft Auto. By the time I was done I realised that only one barrier remained: the barrier of cost. Thankfully that is where Logitech has stepped in with their new gaming accessory kit to alleviate some of that financial pressure.
Taking a quick look back at the review of the Xbox adaptive controller, you’ll see that the controller connects with the Xbox; where it becomes adaptive is that it can be used with any form of adaptive devices that you may use depending on your disability. Most often those devices are a series of different pressure pads or buddy buttons. In my case I use the adaptive controller along with about 4 to 6 buddy buttons to act as the trigger buttons on the top of the normal Xbox controller, buttons I otherwise would never be able to access, restricting me in 90% of games available on the Xbox.
To Quote Brad Pitt in Seven “What’s in the Box?”
Before I even get as far as describing what is in the box, funnily enough I’m going to describe the box itself. Logitech seem to have taken all aspects of the adaptive nature of the product into account by making the packaging more accessible. The tape sealing the box shut has loops at the end so that somebody with limited use of their hands and weak grip can easily pull the box open. Inside there is a huge array of devices, each of which is packaged in a plastic bag (not one for the environmentalists) that is loose and slippy so the device can be easily slid out.
So that’s the box itself dealt with; now what is inside it? The box contains an array of 12 different pressure activation buttons (see photo below). These activation buttons vary in size and in response time and are designed to suit a variety of different disabilities. Logitech have also included two sheets of stickers that you can apply to each button you’re using; these stickers identify which button on the Xbox controller your activation pressure buttons represent.
Logitech has also taken into account the frustration that is involved when a button slips at the most crucial of points, by including a collection of Velcro stickers and two pads that can interconnect with one another. These sit across your lap and hold your buttons in place, making them more accessible to you when you need them most. Now you’re far less likely to have them slip from underneath your hand as you are about to shoot that last enemy in Fortnite or score the winning goal in FIFA.
It’s All About the Money: Cost
It’s very simple: if you are living on disability allowance alone, gaming is still very expensive. The consoles themselves are expensive, not to mention the price of the games.
Unfortunately, like most things, once you add in the word disability there is a further cost. The Xbox adaptive controller on its own is not very useful for most people with a disability, and that unit itself costs in the region of €80.
The adaptive controller must be combined with the activation pressure buttons most often used alongside it. This is where the price starts to go up very, very quickly.
Each buddy button can cost in the region of €60 to €80. When you consider that I need to use a minimum of 4 to 6 buddy buttons to use the adaptive controller to its full potential, you can see how the cost can rocket very quickly. That’s a potential cost of €480 to fully equip you with the buttons you need.
So taking that into account, the Logitech gaming accessory pack’s price of €99 is a complete bargain, with a variety of 12 different pressure buttons included within the pack. They are more lightweight and possibly will take less of a beating than some of the official ones, which appear to have a sturdier build, but it is a fantastic opportunity.
Have a look at the video below to learn more about the process that made this kit possible.
Even if you are not a gamer but use a number of pressure activation buttons or buddy buttons around the house in your day-to-day life, then the Logitech gaming accessory kit could be a solution for you.
Introducing an eye-gaze device to
an individual who is non-verbal can open up a world of possibility for them;
it can allow them to communicate, engage with games and play as well as
allowing them to access and control their environment.
When working with children who have the potential to use eye gaze, it can be difficult to find fun and motivating ways to encourage them to engage with the device. Introducing communication-based programs too early can be too demanding and may ultimately lead to failure using the device.
Smartbox Technologies have developed a program called Look to Learn and describe it as a motivating and fun way to get started with eye gaze technology. Every activity has been developed in consultation with teachers and therapists to improve access and choice-making skills. The software consists of 40 specially created activities that allow therapists, families and teachers to easily develop basic eye-gaze interaction with the child. A companion workbook is also available to download (free) from the Smartbox website and helps track and document the child’s progress as they move through the program and the complexity of the activities.
Since the year 2000 Enable Ireland’s Assistive Technology (AT) training service has run a Foundations in AT (5 ECTS) course certified by the Technological University Dublin (TUD). Those of you reading this post will most likely be familiar with AT and what a broad and rapidly evolving area it is. While overall the direction AT has taken over the last decade is positive and exciting, it has also become a more challenging area to work in. As a result, the importance and value of the Foundations in AT course has also increased, and this is both reflected in, and a direct result of, the calibre of course participants we’ve had in recent years. The wealth of experience brought by participants each year helps the course evolve and develop, filling in gaps and offering new directions for technology to support people in areas beyond primary needs such as communication, access and daily living. Last month we began a new effort on our part to share with a wider audience some of the excellent work produced by Foundations in AT course participants, with Shaun Neary’s post Accessible Photography – Photo Editing with Adobe Lightroom & the Grid 3. This month we will look at another area of creativity: music.
Alex Lucas enrolled in the 2018 Foundations in AT course. As soon as we learned about his background and experience, we knew that his involvement in the course was an opportunity for us to learn more about accessible music technology and practice. Alex is an academic (a PhD research student at Queen’s University Belfast), a maker, a musician, a developer and a product designer. Before returning to studies, he had gained 10 years’ experience working in mainstream music technology with big-name companies like Focusrite and Novation. At Queen’s he is currently researching “Supporting the Sustained Use of Bespoke Assistive Music Technology” and is part of the research group Performance Without Barriers. He also works with Drake Music Northern Ireland.
We could be accused of having underutilised Alex, but our suggestion for his project was to produce a resource that would act as an introduction for people new to the area of accessible music technology. Alex chose to focus on the mainstream Digital Audio Workstation (DAW) application Ableton Live and switch input. As well as the project document (download link below), he released 5 really excellent tutorial videos on YouTube, the first of which is embedded here.
Alex kindly agreed to contribute to this post, so we asked him why he chose to focus on Ableton, to tell us a bit more about his work in inclusive music, and a little about the research he is currently undertaking at Queen’s. Over to you, Alex…
There are many software applications available for computer-based music production. Ableton Live is arguably one of the most popular DAWs. When first released in 2001, Ableton Live set itself apart from other DAWs through a unique feature called Session View.
Session View is a mode of operation which can be thought of as a musical sketchbook providing composers with an intuitive way to create loop-based music; a feature which is particularly useful when creating electronic music. When combined with Ableton Live’s built-in virtual musical instruments and devices for creating and modifying musical ideas, we find ourselves with a rich toolset for composing music in inclusive settings.
How does this work with groups?
Music connects people; we see this often when conducting group-based inclusive music workshops, making work of this kind essential to Drake Music NI. There could be up to twelve participants of mixed abilities in a typical Drake workshop. As Access Music Tutors, we approach group workshops by first speaking to each participant in turn to identify their creative goals. One individual may have an interest in playing distorted synthesiser bass sounds, while another may prefer the softer sound of a real instrument such as a piano. Knowledge of an individual’s creative goals and their access requirements is used to select an appropriate device for the participant to use to control a virtual instrument within Ableton Live.
In addition to the Access Switches described in the videos mentioned above, Drake Music also uses commercially available assistive music technologies such as Soundbeam and Skoog, and mainstream MIDI controllers such as the Novation Launchpad. It’s possible to connect several of these devices to a single computer running Live.
Together, the group make core musical decisions, e.g. genre, tempo and musical key. The workshop will proceed in one of two ways: either we jam together, or we record each participant in turn, building up a composition gradually using overdubbing techniques.
OHMI – One-Handed Musical Instrument Trust
There are a handful of other organisations within the UK, working towards providing inclusion in music. One notable organisation is the One-Handed Musical Instrument Trust (https://www.ohmi.org.uk/). Many traditional musical instruments are designed in such a way that they place a fundamental requirement on the musician; they must have two fully functional hands. This assumption results in the exclusion of some individuals from learning a traditional musical instrument. Furthermore, in some cases, accomplished musicians are not able to return to their instrument after losing the function of a hand due to illness or an accident. OHMI aims to address this shortcoming by running an annual competition which invites instrument designers to adapt traditional musical instruments to be played by one hand only. Many fantastic designs are submitted to OHMI each year. I’m particularly impressed by David Nabb’s Toggle-Key Saxophone (https://www.unk.edu/academics/music/_files/toggle-key-system.pdf) which retains all of the functionality of a standard saxophone while being playable by one hand.
Whilst OHMI primarily focuses on the adaptation of traditional acoustic instruments for inclusion and accessibility, my research centres on the challenges faced by disabled musicians in the long-term use of custom-made digital musical instruments.
In partnership with a disabled musician named Eoin at Drake Music NI, we’ve been designing a digital musical instrument tailored to Eoin’s unique abilities. Eoin has a strong desire to play electric guitar, but as he cannot hold a guitar due to its physical characteristics, he has been unable to do so up until this point.
Using a motion sensor and an access switch, coupled with a Raspberry Pi embedded computer, Eoin is now able to play rudimentary guitar sounds using the movements of his right arm. We’ve tested several prototypes and are now in the process of assembling the instrument for Eoin to use both during Drake music workshops and at home.
As a musician, Eoin is the primary user of the device; however, we’ve also been considering Eoin’s primary carer, his father Peter, as a secondary user. We’ve designed a high-level interface for Peter to use, hopefully allowing him to easily set up the device for Eoin at home. We’re particularly interested in the longevity of the device, and whether or not it’s viable for Eoin and Peter to use independently. Obsolescence can be a problem for assistive technology in general. Our current assumption is that obsolescence may be an issue with custom-made accessible digital musical instruments, but we hope, through this research, to discover useful mitigation strategies.
Some time back, when I was finishing up a photography shoot,
I met a gentleman who had informed me that his photography career had been cut
short due to having a stroke a few years earlier. This was back in 2011, and
options were a lot more limited in terms of cameras, software and accessibility
in general. Earlier in the year, as part of my Foundations in AT course, it was
suggested to me to incorporate my photography background into my project. Now
in 2019, there are a lot more options for accessibility in photography, from mounts for the cameras to wi-fi connectivity between camera and PC/phone/tablet. However, taking the photo is only half the work for a photographer.
Film photographers have to develop their photos; digital photographers have to edit theirs. Adobe Lightroom is an industry-standard program for editing photos. It is also very shortcut friendly. As a result, I was able to make it work with Grid 3 to enable basic editing such as converting to black and white, and adjusting colour balance, brightness, contrast and exposure. Cropping and converting an image from portrait to landscape and vice versa could also be achieved via the Grid. Although I only had a short time to create this grid, it can be easily expanded on, adding access to other modules (such as Export, Slideshow, Book, Print, etc.) and other features like slideshow templates, print setup, exporting with previous settings or emailing a photo. While the functionality of this grid is minimal, there is plenty of room for expansion.
I have always been a bit of a gamer. From Tetris on the original Gameboy to Sonic on the SEGA Mega Drive, I was always keen to pass the time away rapidly instructing a cartoon character to bounce from one side of the screen to another. Since I acquired my disability in 1999, though, I felt that large parts of this world were no longer accessible to me. With limited use of my arms and no use of my fingers, I felt consoles were out of the question. That changed recently when Xbox brought out their new accessible controller.
I had tried to use several different games on the PlayStation and the Xbox. My nephew had a PlayStation, and I had been able to use the left stick and some of the buttons on the ordinary controller, but despite me telling him not to use the trigger buttons, which were inaccessible to me, I still got hammered several times by him on FIFA.
This new accessible controller seemed as though it would provide me with the opportunity to have the full experience of console gaming again, but who is going to buy an Xbox One and accessible controller just to see if they can use it or not? Thankfully Enable Ireland came to my rescue and they allowed me to borrow their console and controller for the period of a month.
Xbox Adaptive Controller (XAC)
The controller is simple to use and simple to set up. I needed some help to physically plug some aids in and out of the controller, but apart from that it was a straightforward process. The controller is set up for people of all abilities. The variety of configurations is as wide as the number of disabilities of the people it is geared towards.
For some games I used just the accessible controller with the coloured plug-in switches that Enable Ireland provided alongside the console.
For other more complicated games, I used the Co-Pilot feature, which allows you to use the ordinary controller as best you can while using the accessible controller switches for any bits or buttons on the ordinary controller that you cannot access.
My setup for Forza, the car racing game, was the simplest of
all. I took 4 of the aid switches and plugged them into the accessible
controller, one was plugged into RT for the accelerator, one was plugged into
LT for the brake, and the remaining two were plugged into the left and right of the d-pad. I placed the RT switch under my elbow to continuously accelerate, which
then meant my hands only had to focus on the three remaining buttons for
steering and braking. That was a huge success, and meant I did not need any
assistance throughout any of the gameplay on that particular game. Though that
does not mean I was a great driver!
For FIFA I used the Co-Pilot feature. I used the ordinary controller as I had done
previously with my nephew, steering my player with the left stick while
passing, tackling, shooting, etc with the usual A, B, X, and Y buttons.
I used the Xbox Accessible Controller then for the sprint and switch player options.
I simply plugged in the switches into the RT and LT ports on the accessible
controller and played normally on the ordinary controller while occasionally
tapping the switches to change player or holding them down
with my elbow to sprint.
A very successful and intelligent solution which resulted in a 5-1 victory for
me over my nephew! His face was a picture 🙂
Ryse, GTA & Battlefield
Each of these I played with a similar set up to FIFA (pictured above). I used the Co-Pilot feature, the ordinary controller in conjunction with the accessible controller with four switches plugged into the RT, LT, RB, and LB ports.
These games were a bit more intricate in their controls in
comparison to the others and a little more difficult to use as a result. The
accessible controller meant though that it was possible for me to at least give
it a go.
This controls setup was good and meant that I
actually completed the story mode of Ryse, on easy.
I could play the vast majority of GTA and Battlefield without any difficulty,
but there were certain issues. To use the character’s “special abilities”
in GTA you had to press down on both the left and right sticks. I think you
could set that up but that would require two more switches which I didn’t have.
Also, on occasion, while I had all the right buttons the scenario in the game
was so complex that it involved pressing a number of buttons and steering at
least one, if not both, sticks at the same time. It was almost equivalent to
playing some musical instrument. On one mission I did have to fall back on some
assistance from my nephew.
While it is still not quite the same as gaming prior to my disability, the Xbox
Accessible Controller has reopened the prospect of gaming properly on a regular
basis and owning a console of my own again. This was a world that I thought had
long left me behind but thanks to Microsoft and Xbox I’m
right back in the game!
A few weeks ago, Lee Ridley (a.k.a. Lost Voice Guy) became the first comedian to win Britain’s Got Talent, now in its 12th year. As well as outshining his competitors along the way, and winning with a clear margin, Lee was a favourite with both the judges and the public.
What also makes Lee’s win even more incredible is the fact that he is the first person with a disability to win the show. For a stand-up comedian, being able to connect with your audience is essential, and he did this with self-deprecating humour, fantastic delivery and some killer one-liners, all done through the use of Augmentative and Alternative Communication (AAC).
AAC provides a means of communication for those whose speech is not sufficient to communicate functionally in all environments and with all partners. Lee uses a combination of two devices to support his communication – an iPad with apps, and a dedicated device called a Lightwriter.
Lee has been on the comedy circuit since 2012, and has won prestigious prizes, including the BBC Radio New Comedy Awards in 2014. Below is an interview that Lee participated in, via email, with Karl O’Keeffe back in 2013, which gives some insights into his process and the unique challenges that using a synthesised voice can present.
Karl: You are the first person ever to do stand up comedy who uses a communication device, so you had nobody to learn from. What are the most important techniques and tricks you have learned so far that you wish someone had told you when you were starting?
Lee: I think one of the most important techniques that I have learnt is how to deal with timing. Obviously it’s pretty hard to know when to leave pauses for laughter and stuff, especially as I have to pre plan this. I can pause whenever I want but you have to be ready to pause when people laugh otherwise the start of the next bit gets lost or they don’t laugh as long. You sort of have to know when it’s coming so you’re ready for it. Obviously every audience is different so I’m never going to get it right every time. I think I’m getting better at anticipating when to pause though.
Karl: I see from your videos that you use both a Lightwriter and an iPad. Can you tell me which is better for stand up comedy?
Lee: I use my iPad for my stand up and I use my Lightwriter for day to day conversations. I just find that my iPad is slightly easier to understand. It is also easier to find my material on the iPad, and because it backs up to the cloud, it’s a bit more secure and means I can use any Apple device. It’s also a bit sexier than my Lightwriter.
Karl: Do you always use the same voice? Why is the voice important in your performance?
Lee: I use the same voice mostly yes. However I do use other voices in my act as well for comedy purposes. For example, I use a woman’s voice to do an impression of my mother. I think that my main voice is important to me because it has become ‘my’ voice. It’d be weird if I changed it now.
Karl: What app do you use on the iPad for communication?
Lee: I use Proloquo2go, which is a brilliant app. It is very complex but easy to use at the same time. It does everything that I need it to do really.
Karl: What is your favourite app on the iPad?
Lee: I tweet quite a lot so I tend to use Tweetbot all the time. I couldn’t get through long train journeys without the Spotify app either!
Karl: Do you use any other Assistive Technology (computer access etc.)?
Lee: No. I only use Proloquo2go on my iPad and iPhone and then my Lightwriter.
What is your vision of a sensory room? A room with a soft mat, bean bags, bubble tubes, fibre optic lighting? Switch everything on when someone wants to use the room?
Wouldn’t you like to see a little more thought as to how to control the room’s special lighting, music, and objects so that it can be more immersive? How about a projection of a motorbike on the wall while you feel the vibrations in your cushion? Or a picture of an animal while you hear its name, or its sound? Or even a projection of a beach in a blue-coloured lit environment while you feel a breeze? Well, next-generation sensory rooms are here.
The SHX system developed by BJ Live allows all the resources and solutions present in the room to act in coordination to create integral stimulation environments. A single control system allows integration of all the interactive and multimedia elements of the room.
The SHX system supports 2 projections as well as 4 vibroacoustic elements in the room. There is a range of scenes provided by the SHX control software: combinations of videos, images, sounds, lighting, vibration or effects that can be customised to the user.
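To make the idea of a scene more concrete, here is a hypothetical sketch of the kind of settings a scene bundles together and how it might be capped for a user who needs less stimulation. The field names and values are illustrative only; the real SHX software has its own editor and formats, which this does not claim to reproduce.

```python
# Hypothetical "beach" scene: one bundle of coordinated outputs.
beach_scene = {
    "projection": "beach_waves.mp4",                    # wall projection
    "lighting": {"colour": "blue", "brightness": 0.6},  # room lighting
    "sound": "gentle_surf.wav",                         # ambient audio
    "vibroacoustic": {"cushion": True, "intensity": 0.3},
    "fan": True,                                        # the 'breeze' effect
}

def adapt_for_user(scene, max_intensity):
    """Cap stimulation levels so the same scene suits a user who needs
    a calmer environment, without editing the original scene."""
    adapted = dict(scene)
    adapted["lighting"] = dict(scene["lighting"])
    adapted["vibroacoustic"] = dict(scene["vibroacoustic"])
    adapted["lighting"]["brightness"] = min(
        scene["lighting"]["brightness"], max_intensity)
    adapted["vibroacoustic"]["intensity"] = min(
        scene["vibroacoustic"]["intensity"], max_intensity)
    return adapted
```

The point of the sketch is the coordination: because all the elements live in one scene description, a single control action can adjust the whole room for each user rather than switching devices on one by one.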
The good: The system allows you to control the level of stimulation and the method of interaction to adapt the space for each user.
The not so good: Needs time to set up a sensory room installation.
The verdict: With a range of scenes provided by the SHX control software, combinations of videos, images, sounds, lighting, vibration or effects can be customised for any user, child or adult.
Enable Ireland’s National Assistive Technology Service has gathered together information on a range of accessible toys. It includes a variety of accessible games, apps, and toys. These are not recommendations but simply a selection of items which may be of interest, particularly at times such as Christmas and birthdays, when presents are high on the list of priorities.