As we all figure out how best to cope with the Covid-19 pandemic and the social distancing that comes with it, we figured that many of you might be interested in learning about Assistive Technologies for Creative Expression and Leisure: Music, Photography and Gaming. Some of these may come in very handy as we all try to stay connected with one another during these trying times.
We are making our AT for Creative Expression and Leisure courses free for everyone to access over the next few months. These four short courses look at some of the ways technology can assist people with disabilities in engaging in creative pursuits and leisure activities. We have included the Introduction course below. This should be of interest to everybody and helps frame the subsequent content. The remaining three courses are available on our Learning Portal at enableirelandAT.ie.
You will need to create an account to access these courses but once you have your account you can self-enrol for free. Creating an account is easy. All you need is access to email to confirm your account. There is a video at the bottom of this post which will guide you through the process of creating an account. You don’t need to look at the second half of the video as these courses do not require an enrolment key.
Please let us know how you get on, and feel free to post your queries and comments at the bottom of this page. We’d love to hear what your own experiences are, and if there is content that you think we should add to these courses.
Below we have embedded the Introduction course. It’s too small to use as is, but you can make it full screen by clicking the third blue button from the left at the bottom, or click here to open it in a new tab/window.
Leisure and gaming can sometimes be overlooked when considering the needs of an individual, but they can be an important part of a young person’s development and help enable inclusion in society. This module looks at how we can make leisure time and gaming more inclusive to a wide range of abilities. There are now many options for accessible toys, game consoles and switch adapted toys. The module covers a sample of these options with some suggested links for further reading.
Music is an accessible means of creative expression for all abilities. Even the act of passively listening to music engages the brain in the creative process. In this short course we will look at some mainstream and specialist hardware and software that can help facilitate creative musical expression.
Eyegaze is an alternative way of accessing a computer, using eye movements to control the
mouse. It is achieved through a combination of hardware and software. The hardware
is a USB peripheral called an eye tracker. The eye tracker is positioned
underneath the computer monitor. It contains a camera and Infrared lights. The
user is positioned between 500 and 1000 mm from the monitor (600mm is usually
about right) where the camera has a clear view of their eyes. The Infrared
lights highlight the user’s pupils (think of red eye in photographs where a
flash has been used) and create reflections on the user’s eyeballs. After a calibration
process where the user looks at a dot moving around the screen, the software
can accurately tell where the user is looking based on the reflections and
movements of the pupil. For computer access the user will also need some method
of clicking. There are three methods usually used. Dwell is the most common method.
This is where the click is automated. If the user holds their gaze (dwells) on
a button or icon for more than a specified time duration, usually somewhere
from 0.5 to 1.5 seconds, a click is sent. A slight variation of this is used in some
software designed for eyegaze where the button is activated after the user
dwells on it. The main difference here is that the second method offers us the
ability to select different dwell times for different buttons. The other input
methods are less common. The first would be to use an external switch as a
mouse click, the second would be to use a deliberate blink (longer than a
normal blink to prevent accidental clicks) as a mouse click.
Eye Tracker Devices
Tracker 4C https://gaming.tobii.com/tobii-eye-tracker-4c/
– This is a great option for those wanting to use eyegaze for activities like
music and gaming but who have other AT as their main access method. It is every bit
as good as the two much more expensive “AT” eye trackers below and costs in the
region of €170.
Eye Plus and Mini https://www.tobiidynavox.com/products/devices/
– The PC Eye Mini and PC Eye Plus are probably the most popular AT eye
trackers. The Mini will work well on monitors up to 19”, while the Plus will
work on screens up to 28”. The Plus also contains a high-quality microphone
array to support speech recognition, as well as a switch input port.
Challenges associated with playing music using eye movement
There are a number of difficulties we might encounter when playing music using eye
movements but all can be overcome with practice and by using some common music
production tools and techniques. Eye gaze as an input method is quite
restrictive. You only have one point of direct access, so you can think of it
like playing a piano with one finger. To compound this difficulty and expand
the piano analogy, because your eyes are also your input you cannot queue up
your next note like a one-fingered piano player might. Eyegaze in itself is
just eye pointing, using it as an access method will require some input (click)
either a switch or a dwell (an automatic click after a specified time duration,
usually somewhere from 0.5 to 1.5 seconds). If you are using dwell for input then
this will add a layer of difficulty when it comes to timing. You could set the
dwell to be really fast (like 0.1 seconds) but you may run into accidental
activations in this case, for example playing a note as you are passing over it
on the way to your intended note. Some of the specialist eyegaze software
instruments like EyeHarp, EyePlayMusic and ii-music overcome this by using a
circular clock-style interface. This allows them to set the onscreen buttons to
instant activation and because of the radial layout each note can be directly
accessed from the centre without passing over another note. With the radial design,
if our eyes are in a central position all notes are an equal distance from us and
can be accessed in the most efficient way but we are still left with the “one
finger piano” restriction. This means no chords and only the option of playing at
a slower tempo. Using mainstream music production tools like sequencers, arpeggiators
or chord mode can overcome this limitation and allow us to create much more
complex music using eyegaze. A sequencer would allow you to pre-program
accompanying notes with which to play along. An arpeggio is sometimes referred
to as a broken chord. It is the notes of a chord played consecutively rather
than simultaneously. Arpeggios are used a lot in electronic music. When playing
arpeggios, the slower input is offset by the additional life and movement the arpeggio provides.
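As a rough illustration of what an arpeggiator does with a single input, here is a small Python sketch. The note numbers are standard MIDI pitches (60 = middle C) but the function name and modes are our own, not taken from any particular DAW.

```python
import itertools

def arpeggiate(chord, steps, mode="up"):
    """Return `steps` MIDI notes cycling through `chord` one note at a time.

    mode "up" climbs the chord repeatedly; "updown" bounces between the
    bottom and top notes without repeating the turnaround notes.
    """
    if mode == "updown" and len(chord) > 2:
        # e.g. C E G becomes the repeating pattern C E G E
        pattern = chord + chord[-2:0:-1]
    else:
        pattern = chord
    cycle = itertools.cycle(pattern)
    return [next(cycle) for _ in range(steps)]
```

A single dwell on a "C major" button could then trigger a whole rhythmic pattern rather than one static note, which is exactly how the arpeggio compensates for slow eyegaze input.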
Chord mode is something that can be set up in many digital audio workstations. You
can map one note to automatically play the accompanying notes required to make
it a chord. Live looping could also be used. In looping we would record a
section being played live, then loop it back and play other notes over it. Other
effects like delay, reverb and many more besides, will also allow us to make our
music sound fuller and more interesting. Expression
is another difficulty when playing music using eye tracking. By expression we
mean how an accomplished musician can play the same note in different ways to
make it more expressive. Velocity is a common means of expression; you can
think of this as how fast or hard a note is struck. Velocity can affect volume and other qualities of the
instrument’s sound. Another common means of expression is provided by pedals like
those on an organ or piano. Using eyegaze we really only have the ability to
turn the note on or off. Some of the software however breaks note areas up into
sections, each one giving an increased velocity (see photo below).
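The velocity-zone idea described above can be sketched in a few lines: split each onscreen button into bands and map the band the gaze lands in to a MIDI velocity (0 to 127). This is an illustrative Python fragment with made-up names, not code from any of the eyegaze packages mentioned here.

```python
def gaze_velocity(y, top, height, velocities=(40, 80, 120)):
    """Map a gaze y-position inside a button to a MIDI velocity.

    The button is split into len(velocities) equal horizontal bands,
    assigned top to bottom, so looking lower plays the note "harder".
    Returns None if the gaze falls outside the button entirely.
    """
    if not top <= y < top + height:
        return None
    band = int((y - top) / height * len(velocities))
    return velocities[min(band, len(velocities) - 1)]
```

Even three bands like this give an eyegaze player a crude but usable soft/medium/loud control that a plain on/off button lacks.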
Software for playing music with Eyegaze
Eye Harp http://theeyeharp.org/ One of the first software
instruments made specifically for eyegaze, the EyeHarp remains one of the best
options. This software was originally developed as a college project (I’m
guessing he got a first!) and rather than let it die, developer Zacharias Vamvakousis made
it available free and open source. After a few years with no updates, the news
is that there are some big updates on the way. We are looking forward to seeing
what they have in store for us.
Apollo Ensemble – http://www.apolloensemble.co.uk/. Although this software can enable someone to play music using eyegaze, it can do so much more besides. In the right hands this application can coordinate an entire music group, all using different AT and alternative instruments.
E-scape http://www.inclusivemusic.org.uk/ developed by Dr Tim Anderson is eyegaze accessible software for composition and performance. We have posted about this wonderful software and the legend of accessible music, Tim Anderson before here.
EyePlayMusic https://mybreathmymusic.com/en/eyeplaymusic. Another great of accessible music, Ruud van der Wel of My Breath My Music collaborated on this free eyegaze music app. Simple but effective, this could be a great starting point before moving on to the EyeHarp.
Another option for eyegaze music production is using software like the Grid 3 or Iris to create an eyegaze accessible interface for a mainstream digital audio workstation. The demo below is done using Ableton Live; however, any software that offers keyboard mapping or keyboard shortcuts (so any quality software) could be used in the same way.
Since the year 2000 Enable Ireland’s Assistive Technology (AT) training service has run a Foundations in AT (5 ECTS) course certified by the Technological University Dublin (TUD). Those of you reading this post will most likely be familiar with AT and what a broad and rapidly evolving area it is. While overall the direction AT has taken over the last decade is positive and exciting, it has also become a more challenging area to work in. As a result, the importance and value of the Foundations in AT course has also increased, and this is both reflected in, and a direct result of, the calibre of course participants we’ve had in recent years. The wealth of experience brought by participants each year helps the course evolve and develop, filling in gaps and offering new directions for technology to support people in areas beyond primary needs such as communication, access and daily living. Last month we began what is a new effort on our part to share with a wider audience some of the excellent work produced by Foundations in AT course participants with Shaun Neary’s post Accessible Photography – Photo Editing with Adobe Lightroom & the Grid 3. This month we will look at another area of creativity, music.
Alex Lucas enrolled in the 2018 Foundations in AT course. As soon as we learned about his background and experience, we knew that his involvement in the course was an opportunity for us to learn more about accessible music technology and practice. Alex is an academic (PhD research student in Queen’s University Belfast), a maker, a musician, a developer and a product designer. Before returning to studies, he had gained 10 years’ experience working in mainstream music technology with big name companies like Focusrite and Novation. At Queen’s he is currently researching “Supporting the Sustained Use of Bespoke Assistive Music Technology” and is part of the research group Performance Without Barriers. He also works with Drake Music Northern Ireland.
We could be accused of having underutilised Alex, but our suggestion for his project was to produce a resource that would act as an introduction to people new to the area of accessible music technology. Alex chose to focus on the mainstream Digital Audio Workstation (DAW) application Ableton Live and Switch input. As well as the project document (download link below) he released 5 really excellent tutorial videos on YouTube, the first of which is embedded here.
Alex kindly agreed to contribute to this post so we asked him why he chose to focus on Ableton, to tell us a bit more about his work in inclusive music and a little about the research he is currently undertaking at Queens. Over to you Alex..
There are many software applications available for computer-based music production. Ableton Live is arguably one of the most popular DAWs. When first released in 2001, Ableton Live set itself apart from other DAWs through a unique feature called Session View.
Session View is a mode of operation which can be thought of as a musical sketchbook providing composers with an intuitive way to create loop-based music; a feature which is particularly useful when creating electronic music. When combined with Ableton Live’s built-in virtual musical instruments and devices for creating and modifying musical ideas, we find ourselves with a rich toolset for composing music in inclusive settings.
How does this work with groups?
Music connects people; we see this often when conducting group-based inclusive music workshops, making work of this kind essential to Drake Music NI. There could be up to twelve participants of mixed abilities in a typical Drake workshop. As Access Music Tutors, we approach group workshops by first speaking to each participant in turn to identify their creative goals. One individual may have an interest in playing distorted synthesiser bass sounds, while another may prefer the softer sound of a real instrument such as a piano. Knowledge of an individual’s creative goals and their access requirements is used to select an appropriate device for the participant to use to control a virtual instrument within Ableton Live.
In addition to the Access Switches described in the videos mentioned above, Drake Music also uses commercially available assistive music technologies such as Soundbeam and Skoog, and mainstream MIDI controllers such as the Novation Launchpad. It’s possible to connect several of these devices to a single computer running Live.
Together, the group makes core musical decisions, i.e. genre, tempo and musical key. The workshop then proceeds in one of two ways: either we jam together, or we record each participant in turn, building up a composition gradually using overdubbing techniques.
OHMI – One-Handed Musical Instrument Trust
There are a handful of other organisations within the UK, working towards providing inclusion in music. One notable organisation is the One-Handed Musical Instrument Trust (https://www.ohmi.org.uk/). Many traditional musical instruments are designed in such a way that they place a fundamental requirement on the musician; they must have two fully functional hands. This assumption results in the exclusion of some individuals from learning a traditional musical instrument. Furthermore, in some cases, accomplished musicians are not able to return to their instrument after losing the function of a hand due to illness or an accident. OHMI aims to address this shortcoming by running an annual competition which invites instrument designers to adapt traditional musical instruments to be played by one hand only. Many fantastic designs are submitted to OHMI each year. I’m particularly impressed by David Nabb’s Toggle-Key Saxophone (https://www.unk.edu/academics/music/_files/toggle-key-system.pdf) which retains all of the functionality of a standard saxophone while being playable by one hand.
Whilst OHMI primarily focuses on the adaptation of traditional acoustic instruments for inclusion and accessibility, my research centres on the challenges faced by disabled musicians in the long-term use of custom-made digital musical instruments.
In partnership with a disabled musician named Eoin at Drake Music NI, we’ve been designing a digital musical instrument tailored to Eoin’s unique abilities. Eoin has a strong desire to play electric guitar, but as he cannot hold a guitar due to its physical characteristics, he has been unable to do so until now.
Using a motion sensor and an access switch, coupled with a Raspberry Pi embedded computer, Eoin is now able to play rudimentary guitar sounds using the movements of his right arm. We’ve tested several prototypes and are now in the process of assembling the instrument for Eoin to use both during Drake music workshops and at home.
As a musician, Eoin is the primary user of the device; however, we’ve also been considering Eoin’s primary carer, his father Peter, as a secondary user. We’ve designed a high-level interface for Peter to use, hopefully allowing him to easily set up the device for Eoin to use at home. We’re particularly interested in the longevity of the device, and whether or not it’s viable for Eoin and Peter to use independently. Obsolescence can be a problem for assistive technology in general. Our current assumption is that obsolescence may be an issue with custom-made accessible digital musical instruments, but we hope, through this research, to discover useful mitigation strategies.
Last week we were visited in Enable Ireland, Sandymount, by two of the most experienced practitioners working in the area of assistive music technology. Dr Tim Anderson http://www.inclusivemusic.org.uk/ and Elin Skogdal (SKUG) dropped by to talk about the new eyegaze music software they have been developing and to share some tips with the musicians from Enable Ireland Adult’s Services. Tim Anderson has been developing accessible music systems for the last 25 years. E-Scape, which he developed, is the only MIDI composition and performance software designed from the ground up for users of alternative input methods (Switch, Joystick and now Eyegaze). Tim also works as an accessible music consultant for schools and councils. Elin Skogdal is a musician and educator based at the SKUG Centre. She has been using Assistive Music Technology in music education since 2001 and was one of those responsible for establishing the SKUG Centre. The SKUG Centre is located in Tromsø, Northern Norway. SKUG stands for “Performing Music Together Without Borders”, and the aim of the Centre is to provide opportunities for people who can’t use conventional instruments to play and learn music. SKUG is part of the mainstream art school of Tromsø (Tromsø Kulturskole), which provides opportunities for SKUG students to collaborate with other music and dance students and teachers. SKUG have students at all levels and ages – from young children to university students. If you would like to know more about Elin’s work at SKUG click here to read a blog post from Apollo Ensemble.
Following the visit and workshop they sent us some more detailed information about Eye-Touch, the exciting new eyegaze music software they are currently developing. We have included this in the paragraphs below. If you are interested in getting involved in their very user-led development process you can contact us here (comments below) and we will put you in touch with Tim and Elin.
‘Eye-touch’ (Funded by ‘NAV Hjelpemidler og tilrettelegging’ in 2017, and Stiftelsen Sophie’s Minde in 2018) is a software instrument being developed by the SKUG centre (Part of ‘Kulturskolen i Tromsø’), in collaboration with Dr. Tim Anderson, which enables people to learn and play music using only their eyes. It includes a built-in library of songs called ‘Play-screens’, with graphical buttons which play when you activate them.
Buttons are laid out on screen to suit the song and the player’s abilities, and can be of any size and colour, or show a picture. When you look at a button (using an eye-gaze tracking system such as Tobii or Rolltalk) it plays its musical content. You can also play buttons in other ways to utilise the screen’s attractive look: you can touch a touch-screen or smartboard, press switches or PC keys, or hit keys on a MIDI instrument.
The music within each button can either be musical notes played on a synthesised instrument, or an audio sample of any recorded sound, for example animal noises or sound effects. Sound samples can also be recordings of people’s voices speaking or singing words or phrases. So a child in a class group could play vocal phrases to lead the singing (‘call’), with the other children then answering by singing the ‘response’.
Pictured above, a pupil in Finland is trying out playing a screen with just three buttons, with musical phrases plus a sound effect of a roaring bear (popular with young players!). She has been using the system for just a few minutes, and was successfully playing the song, which proved very enjoyable and motivating for her.
SKUG’s experience from their previous prototype system has led to the incorporation of some innovative playing features, which distinguish it from other eyegaze music systems and have been shown to enable people to play who couldn’t otherwise. These features provide an easy entry level, and we have found that they enable new users to start playing immediately and gain motivation. These support features can also be changed or removed by teachers to suit each player’s abilities and, most importantly, can evolve as a player practises and improves. One feature is to have the buttons in a sequence which can only be played in the right order, so the player can ‘look over’ other buttons to get to the next ‘correct’ button.
Here are two examples: The Play-screen below has buttons each containing a single note, arranged as a keyboard with colouring matching the Figurenotes scheme. A player with enough ability could learn a melody and play it by moving between the buttons in the empty space below. But by putting the buttons into a sequence order, the player is able to learn and play the melody far more easily – they can look over buttons to get to the next ‘correct’ button (note) of the song, without playing the buttons in between.
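The sequence feature can be expressed in a few lines of logic: only the next button in the song order responds, so the gaze can pass over every other button harmlessly. The Python sketch below uses our own hypothetical names and is only an illustration of the behaviour described, not Eye-Touch’s actual implementation.

```python
class SequencedPlayscreen:
    """Buttons locked into a song order: looking at any button other than
    the next one in the sequence does nothing, so the player can 'look
    over' other buttons on the way to the right one."""

    def __init__(self, sequence):
        self.sequence = list(sequence)  # e.g. note names in melody order
        self.position = 0               # index of the next 'correct' button

    def activate(self, button):
        """Gaze landed on `button`; play it only if it is next in the song."""
        if self.position < len(self.sequence) and button == self.sequence[self.position]:
            self.position += 1
            return button   # play this note
        return None         # ignored: not the next note of the song
```

A teacher could loosen this support as the player improves, for example by removing the sequence lock so every button plays freely.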
As well as illustrating a general theme, the facility to add pictures gives us many more possibilities. The Play-screen below left has buttons which show pictures and play sounds and music relating to J.S. Bach’s life story. The buttons could be played freely, but in this case have been put into a sequence order to illustrate his life chronologically. As before, a player can move through the buttons to play them in order, even though they are close together. But we may want to make them even bigger, and make the player’s job even easier, by setting it to only display the ‘next’ button in the sequence (below right). So the other buttons are hidden, and the player only sees the button which is next to play, and can then move onto it.
There is also an accompanying text to tell the story which, if desired, can be displayed on screen via a built-in ‘song-sheet’. Teachers can also make their own Play-screens by putting their own music into buttons – by either playing live on a MIDI keyboard, or recording their own sound samples. To further personalise a Play-screen for a pupil, people can also organise and edit all the visual aspects, including adding their own pictures.
The Eye-Touch software is also very easy to install and operate – we have found it quick and easy to install it on school pupils’ eye-gaze tablets, and it worked for them straight away.
In January 2018 the SKUG team started a project to further develop Eye-Touch to expand the ways of playing, the creating and editing facilities for teachers, and the range of songs provided in the library.
Makers Making Change has a mission: to “connect makers to people with disabilities who need assistive technologies”. This is also our mission and something we’ve talked about before; it is also the goal of a number of other projects, including TOM Global and Enable Makeathon. Makers Making Change, which is being run by the Canadian NGO the Neil Squire Society and supported by Google.org, differs from previous projects sharing the same goal in a couple of ways. Firstly, their approach: they are currently concentrating their efforts on one particular project, the LipSync, and touring the North American continent holding events where groups of makers get together and build a quantity of these devices. These events are called Buildathons. This approach raises awareness about their project within the maker community while also ensuring they have plenty of devices in stock, ready to go out to anybody who needs them. Secondly, thanks to the recent promise from the Canadian government of funding to the tune of $750,000, they may be on the verge of bringing their mission into the mainstream.
Canada has always had a well-deserved reputation for being at the forefront of Assistive Technology and Accessibility. It is one of only a handful of nations the rest of the world looks to for best practice approaches in the area of disability. For that reason this funding, announced by Minister of Sport and Persons with Disabilities Carla Qualtrough, may have a positive effect even greater than its significant monetary value, and far beyond Canada’s borders. Minister Qualtrough stated the funding was “for the development of a network of groups and people with technical skills to support the identification, development, testing, dissemination and deployment of open source assistive technologies.” Specifying that it is open source assistive technologies they will be developing and disseminating means that any solutions identified will have the potential to be reproduced by makers anywhere in the world. It is also interesting that the funding is to support the development of a network of groups and people rather than specific technologies, the goal here being sustainability. Neil Squire Society Executive Director Gary Birch said: “This funding is instrumental in enabling the Neil Squire Society to develop, and pilot across Canada, an innovative open source model to produce and deliver hardware-based assistive technologies to Canadians with disabilities.” Hopefully this forward-thinking move by the Canadian Government will inspire some EU governments into promoting and maybe even funding similar projects over here.
If the idea of building or designing a technology that could enhance the life of someone with a disability or an older person appeals to you, either head down to your local maker space (Ireland, Global) or set a date in your diary for Ireland’s premier Maker Faire – Dublin Maker, which will take place in Merrion Square, Dublin 4 on Saturday July 22nd. We’ll be there showing the FlipMouse as well as some of our more weird and wonderful music projects. There will also be wild, exciting and inspiring demonstrations and projects from Maker Spaces/Groups and Fab Labs from around the country and beyond. See here for a list of those taking part.
One of the more dubious advantages of working in a long-running Assistive Technology service is access to an ever-growing supply of obsolete hardware. While much of it is worthless junk now, considering the technological progress in the field over the last 10 years, there are some real gems to be rediscovered. These were innovative solutions of their time, grounded in strong research, and while seemingly made obsolete by newer technology they actually still have much to offer. The LOMAK keyboard is certainly one of these and, it being possibly the only piece of AT on permanent display at New York’s Museum of Modern Art, I’m obviously not alone in thinking this.
The LOMAK (Light Operated Mouse And Keyboard) was invented by New Zealander Mike Watling and first came on the market in 2005 after a number of years of research. It allowed hands-free computer access through the innovative use of a laser pointer and light-sensitive keyboard and mouse controls. To make the light-sensitive keyboard and mouse (I’ll call it an input device from here) Watling used an array of photoresistors, one for each keyboard key, mouse action and setting. This amounted to a whopping 122 photoresistors and possibly the most electronically complex input device ever marketed. Although complex, the idea behind the LOMAK is quite straightforward. Photoresistors change their resistance depending on the amount of light they are picking up. Once you figure out roughly how much shining a laser pen on the resistor changes its value, you have a good idea of where to set your threshold. You can then use the photoresistor as a straightforward momentary switch, like a keyboard key, that activates once the resistance goes above/below a certain threshold. If you are like me you will want to see inside this thing, so here it is (below): a thing of beauty, I’m sure you’ll agree.
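The threshold idea is easy to sketch. The fragment below is an illustration in Python with made-up names; a real LOMAK-style build would run equivalent logic on a microcontroller reading raw ADC values from each photoresistor’s voltage divider, and whether the laser raises or lowers the reading depends on how that divider is wired. Here we assume the laser raises it.

```python
def calibrate_threshold(ambient_readings, laser_readings, margin=0.25):
    """Pick a switch threshold between typical ambient and laser-lit levels.

    `ambient_readings` are samples with no laser on the sensor;
    `laser_readings` are samples with the laser pointed at it.
    The threshold sits `margin` of the way up the gap, just above
    ambient noise, so room-light flicker doesn't trigger a key press.
    """
    ambient = max(ambient_readings)   # brightest it gets without the laser
    lit = min(laser_readings)         # dimmest it gets with the laser on it
    return ambient + (lit - ambient) * margin

def is_pressed(reading, threshold):
    """Treat the photoresistor like a momentary key: on above threshold."""
    return reading > threshold
```

Repeating this calibration per sensor is what would make 122 of them behave like 122 independent keys.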
So why aren’t more people using LOMAK keyboards today? Well, eye tracking technology was just starting to become a realistic possibility for AT users, with devices like the Tobii P10 hitting the market. Eye tracking just made more sense for computer access: it allows a neater, more mobile solution and is a more direct input method. What has given the whole concept behind the LOMAK a new lease of life is the availability of cheap, user-friendly prototyping platforms like Arduino.
This was the basis of one of the project proposals we made available to the final year students of the BSc (Honours) Creative Media Technologies course in IADT. Over the last few years Enable Ireland AT service has worked with IADT lecturer Conor Brennan to provide students with a selection of project briefs that both fit with their learning and skills while also fulfilling a need that has been recognised through our work supporting AT users and professionals in the area. This particular brief was to create a MIDI interface based on the same concept as the LOMAK that would allow someone to perform and compose music using only head movements. There are solutions available that use eye tracking to achieve this, for example the fantastic EyeHarp and, more recently, the Eye Play Music software released by Ruud van der Wel of My Breath My Music. However, these solutions all require a computer; we wanted something more in keeping with current trends in mainstream electronic music, which seems to be moving back to a more hardware-based performance. Thankfully a particularly talented student by the name of Rudolf Triebel took on the challenge of designing and building what we are now calling the MILO (Musical Interface using Laser Operation) (previously called LOMI, Light Operated MIDI Interface, which I think is much better..:). Rudolf exceeded our expectations and created the prototype you can see in the (badly filmed, sorry) video below. He has also created a tutorial including a wiring diagram, code and bill of materials and put it up on Instructables to allow the project to be replicated and improved by others.
If you would like to see and maybe have a go of the MILO prototype (in its spanking new laser-cut enclosure), Conor Brennan of IADT will be showing and demonstrating it at the 25th EAN Conference, which takes place in University College Dublin from Sunday 29th to Tuesday 31st May.
Keep an eye on electroat.com where I hope to add a few more detailed posts on building, modifying and increasing the functionality of Rudolf’s design. I will also look into the possibility of using the same concept for building a hands free video game controller.
On February 23rd Enable Ireland Assistive Technology Service and the Institute of Art Design and Technology (IADT) Dun Laoghaire will be holding an Accessible Music Technology and Practice Seminar. This is a rare opportunity for anybody interested in music and disability to hear a range of highly experienced speakers from Ireland, the UK and Norway who are involved in the design and development of accessible musical instruments/interfaces, the delivery of accessible musical education, or in supporting musicians and performers with disabilities. Places are limited so advance booking is essential.
For more details see electroAT.com/amtp where you will find the booking form and regular updates as the day approaches.
Who might be interested in attending?
Musicians or aspiring musicians with a disability.
Musicians or music therapists who work with people with disabilities.
Music teachers or community musicians interested in providing more inclusive classes and environments.
Therapists or anybody who works with or supports people with disabilities and would like to introduce music-based activities.
Product or software designers interested in creating alternative musical instruments and interfaces.
As you will see from the range of experience below, this promises to be a very interesting group, representing all areas from the design and development of accessible hardware and software to practice-based experience of working with musicians with disabilities.
Dr Tim Anderson (inclusivemusic.org.uk) has been involved in developing technology and software to enable people with disabilities to make music for the last 25 years. Over that time he has been Research and Development (R&D) manager and later Technology manager with Drake Music, and more recently an independent consultant to software developers as well as to schools, colleges, councils and individuals. Tim developed, sells and supports the E-Scape software system, which allows people to compose and play music unaided, whatever their physical ability or musical knowledge.
Elin Skogdal – SKUG Centre, Norway. The SKUG Centre provides education and support for musicians with disabilities. They also offer training, education, demonstrations, and advice for teachers and supporters and participate in the development of specialist hardware and software for access to music making.
Dr Brendan McCloskey (Ulster University, School of Creative Arts and Technologies) is a musician and designer working closely with performers with disabilities. Having worked extensively as a researcher and practitioner in the field of inclusive community music with Ulster University, Drake Music N. Ireland, Drake Music UK, Share Music Sweden and Stravaganza Production Company over the past 15 years, he has designed an innovative digital instrument for musicians with quadriplegic cerebral palsy. The instrument, called inGrid, was shortlisted for the Prix Ars Electronica one-handed musical instrument competition in November 2013, and selected as a finalist for the Margaret Guthman Prize run by Georgia Tech in Atlanta in January 2014. He will discuss the key innovations underpinning inGrid, and how they will be developed in the immediate future.
Brian Dillon (Unique Perspectives) is a designer and manufacturer of assistive technology solutions who, along with Ruud van der Wel (MyBreathMyMusic), has developed accessible music technologies such as the Quintet and the Magic Flute. The Quintet is an exciting device that enables people with disabilities to play music using switches. Easy to use and set up, it is suitable for teachers, therapists, parents and others who want to use music in an activity with children or adults. It can be used with a group of people or by a single individual. The Magic Flute is an electronic musical instrument that enables people whose only reliable movements are of the head and breath to play music. The flute is similar to a harmonica or slide whistle in that it requires no fingers to play, and both note and expression can be controlled using the mouth.
Grainne McHale and Graham McCarthy from SoundOUT, a Cork-based group that provides inclusive music-making and performance opportunities for young people with and without disabilities in Ireland.
Jason Noone is a music therapist active in clinical work, music therapy training/supervision and research. He is a member of the Music and Health Research Group at the University of Limerick. His clinical expertise is mainly in the area of developmental disability, and his research interests include sensory integration and music therapy, music technology for access and participation, and participatory action research.
Koichi Samuels – PhD candidate based at the Sonic Arts Research Centre (SARC), Queen’s University Belfast. He is researching inclusive music practices and interfaces with Drake Music Northern Ireland, a charity that aims to enable musicians with physical disabilities and learning difficulties to compose and perform their own music through music technology. Research interests include: inclusive music, DMIs, DIY/maker culture, critical design, HCI.