If you’re on Twitter you may have heard the sad news of Larry Tesler passing away recently. I, like many others I’m sure, hadn’t heard of Mr Tesler until his death. You will be familiar with his work, however: Larry Tesler was the inventor of Cut, Copy and Paste. Copy and Paste is an action we perform every day on a computer (some more than others perhaps.. guilty). Editing documents, moving text around, quoting people, downright plagiarism.. it’s a quick and useful way of repurposing text. If you are slow at typing, find spelling difficult or experience short term memory problems, it’s a godsend. On Windows you can just use the mouse: select the text, right click and choose your weapon of choice (Cut, Copy or Paste). Mac users may not have a right mouse button but I suspect they use this feature just as much. Taking it to the next level, let’s call them the serious amateurs, we have keyboard shortcuts: Shift + Delete for Cut (can’t say I use this much), Ctrl + C for Copy and Ctrl + V for Paste. Keyboard shortcuts are great, more productive and help prevent repetitive strain injury (RSI).
As much as I like this tool (technique?), the standard Copy/Paste has a major limitation which you will certainly have come across if you use it frequently: it only remembers the last item copied. Let’s say you want to copy a few phrases from a document. This means you need to switch between documents: copy, switch, paste, switch, copy… or you could use my preferred method, which is to open a Notepad doc, split the screen and paste into that. I do this to remove any style associated with the text but it’s still increasing the workload and thereby defeating the purpose!
There have long been third party tools called Clipboard Managers which allow you to take Copy and Paste to the next level. A clipboard manager will allow you to: Copy, Copy, Copy > Paste, Paste, Paste. Very handy. What most people don’t know, and the reason for this post, is that Windows 10 has a Clipboard Manager built in; you just need to enable it. Copy as normal (Ctrl + C) but instead of using Ctrl + V to paste, use Windows Key + V. The Windows Key is on the bottom row, to the left of the spacebar between Ctrl and Alt, and bears the Windows logo. Once enabled (doing it the first time will prompt you to enable the feature) you will be offered a window with your clipboard history (screenshot below; it works for screenshots too).
As we all figure out how best to cope with the Covid 19 pandemic and the social distancing that comes with it, we figured that many of you might be interested in learning about Assistive Technologies for Creative Expression and Leisure: Music, Photography and Gaming. Some of these may come in very handy as we all try to stay connected with one another during these trying times.
We are making our AT for Creative Expression and Leisure courses free for everyone to access over the next few months. These 4 short courses look at some ways that technology can assist people with disabilities in engaging in creative pursuits and leisure activities. We have included the Introduction course below. This should be of interest to everybody and helps frame the subsequent content. The remaining 3 courses are available on our Learning Portal at enableirelandAT.ie.
You will need to create an account to access these courses but once you have your account you can self-enrol for free. Creating an account is easy. All you need is access to email to confirm your account. There is a video at the bottom of this post which will guide you through the process of creating an account. You don’t need to look at the second half of the video as these courses do not require an enrolment key.
Please let us know how you get on, and feel free to post your queries and comments at the bottom of this page. We’d love to hear what your own experiences are, and if there is content that you think we should add to these courses.
Below we have embedded the Introduction course. It’s too small to use as is, but you can make it full screen by clicking the third blue button from the left at the bottom, or click here to open in a new tab/window.
Leisure and gaming can sometimes be overlooked when considering the needs of an individual, but they can be an important part of a young person’s development and help enable inclusion in society. This module looks at how we can make leisure time and gaming more inclusive to a wide range of abilities. There are now many options for accessible toys, game consoles and switch adapted toys. The module covers a sample of these options with some suggested links for further reading.
Music is an accessible means of creative expression for all abilities. Even the act of passively listening to music engages the brain in the creative process. In this short course we will look at some mainstream and specialist hardware and software that can help facilitate creative musical expression.
Eye gaze is an alternative way of accessing a computer, using eye movements to control the mouse. It is achieved through a combination of hardware and software. The hardware is a USB peripheral called an eye tracker, positioned underneath the computer monitor. It contains a camera and infrared lights. The user is positioned between 500 and 1000 mm from the monitor (600 mm is usually about right), where the camera has a clear view of their eyes. The infrared lights highlight the user’s pupils (think of red eye in photographs where a flash has been used) and create reflections on the user’s eyeballs. After a calibration process, where the user looks at a dot moving around the screen, the software can accurately tell where the user is looking based on the reflections and movements of the pupils. For computer access the user will also need some method of clicking. Three methods are usually used. Dwell is the most common: the click is automated, so if the user holds their gaze (dwells) on a button or icon for more than a specified duration, usually somewhere from 0.5 to 1.5 seconds, a click is sent. A slight variation of this is used in some software designed for eyegaze, where the button is activated after the user dwells on it. The main difference is that this second method offers the ability to select different dwell times for different buttons. The other input methods are less common: the first is to use an external switch as a mouse click, the second is to use a deliberate blink (longer than a normal blink, to prevent accidental clicks) as a mouse click.
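To make the dwell method concrete, here is a minimal sketch in Python of the logic a dwell clicker might use: fire a click only once the gaze has stayed within a small radius of one point for the full dwell duration, and restart the timer whenever the gaze moves away. The class and parameter names are my own illustration, not any vendor’s API.

```python
import math

class DwellClicker:
    """Sketch of a dwell clicker: emit a click when gaze stays
    within radius_px of a point for dwell_s seconds."""

    def __init__(self, dwell_s=0.8, radius_px=30):
        self.dwell_s = dwell_s
        self.radius_px = radius_px
        self._anchor = None   # (x, y) where the current dwell started
        self._start = None    # timestamp when the current dwell started

    def update(self, x, y, t):
        """Feed one gaze sample; return (x, y) to click, else None."""
        if self._anchor is None:
            self._anchor, self._start = (x, y), t
            return None
        ax, ay = self._anchor
        if math.hypot(x - ax, y - ay) > self.radius_px:
            # Gaze moved away: restart the dwell timer at the new point.
            self._anchor, self._start = (x, y), t
            return None
        if t - self._start >= self.dwell_s:
            # Dwell complete: click, then reset so we don't click again.
            self._anchor, self._start = None, None
            return (ax, ay)
        return None
```

In practice the tracker would feed this 30 or more samples per second; a shorter dwell is faster but riskier, which is exactly the accidental-activation trade-off discussed later in the music section.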
Eye Tracker Devices
Tobii Eye Tracker 4C https://gaming.tobii.com/tobii-eye-tracker-4c/
– This is a great option for those wanting to use eyegaze for activities like
music and gaming but who have other AT as their main access method. It is every bit
as good as the two much more expensive “AT” eye trackers below and costs in the
region of €170.
PC Eye Plus and PC Eye Mini https://www.tobiidynavox.com/products/devices/
– The PC Eye Mini and PC Eye Plus are probably the most popular AT eye
trackers. The Mini will work well on a monitor up to 19”; the Plus will work on
screens up to 28” and also contains a high quality microphone array to support
speech recognition, as well as a switch input port.
Challenges associated with playing music using eye movement
There are a number of difficulties we might encounter when playing music using eye
movements but all can be overcome with practice and by using some common music
production tools and techniques. Eye gaze as an input method is quite
restrictive. You only have one point of direct access, so you can think of it
like playing a piano with one finger. To compound this difficulty and expand
the piano analogy, because your eyes are also your input you cannot queue up
your next note like a one fingered piano player might. Eyegaze in itself is
just eye pointing; using it as an access method will require some input (click),
either a switch or a dwell (an automatic click after a specified duration,
usually somewhere from 0.5 to 1.5 seconds). If you are using dwell for input then
this will add a layer of difficulty when it comes to timing. You could set the
dwell to be really fast (like 0.1 seconds) but you may run into accidental
activations in this case, for example playing a note as you are passing over it
on the way to your intended note. Some of the specialist eyegaze software
instruments like EyeHarp, EyePlayMusic and ii-music overcome this by using a
circular clock style interface. This allows them to set the onscreen buttons to
instant activation and because of the radial layout each note can be directly
accessed from the centre without passing over another note. Using the radial design, if our eyes are in a central position all notes are an equal distance from us and can be accessed in the most efficient way, but we are still left with the “one finger piano” restriction. This means no chords and only the option of playing at a slower tempo. Using mainstream music production tools like sequencers, arpeggiators or chord mode can overcome this limitation and allow us to create much more complex music using eyegaze. A sequencer would allow you to pre-program
accompanying notes with which to play along. An arpeggio is sometimes referred
to as a broken chord. It is the notes of a chord played consecutively rather
than simultaneously. Arpeggios are used a lot in electronic music. By playing arpeggios
the slower input is offset by the additional life and movement provided by the arpeggio.
Chord mode is something that can be set up in many digital audio workstations. You
can map one note to automatically play the accompanying notes required to make
it a chord. Live looping could also be used: in looping we would record a section being played live, then loop it back and play other notes over it. Other effects, like delay, reverb and many more besides, will also allow us to make the music richer. Expression is another difficulty when playing music using eye tracking. By expression we
mean how an accomplished musician can play the same note in different ways to
make it more expressive. Velocity is a common means of expression; you can think of this as how fast/hard a note is struck. Velocity can affect volume and other qualities of the instrument’s sound. Another common means of expression is provided by pedals like
those on an organ or piano. Using eyegaze we really only have the ability to
turn the note on or off. Some of the software however breaks note areas up into
sections, each one giving an increased velocity (see photo below).
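As a rough sketch of the chord mode and arpeggiator ideas described above, the snippet below (in Python, using MIDI note numbers; the function names are mine, not any DAW’s API) maps one selected note to a full triad and then breaks that chord into a repeating arpeggio pattern, so a single eyegaze selection yields several notes.

```python
# Semitone offsets for a major triad: root, major third, perfect fifth.
MAJOR_TRIAD = (0, 4, 7)

def chord_mode(root_midi, intervals=MAJOR_TRIAD):
    """Chord mode: one selected note triggers all the notes of a chord."""
    return [root_midi + i for i in intervals]

def arpeggiate(chord, steps=8, mode="up"):
    """Arpeggiator: play the chord's notes one at a time (a 'broken chord')."""
    order = chord if mode == "up" else list(reversed(chord))
    return [order[i % len(order)] for i in range(steps)]

c_major = chord_mode(60)          # MIDI 60 = middle C -> [60, 64, 67]
pattern = arpeggiate(c_major, 8)  # cycles through the triad: 60, 64, 67, 60, ...
```

A sequencer or DAW would then schedule each note in `pattern` at the chosen tempo, which is how the slower one-note-at-a-time input can still produce lively, moving music.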
Software for playing music with Eyegaze
Eye Harp http://theeyeharp.org/ One of the first software instruments made specifically for eyegaze, the EyeHarp remains one of the best options. This software was originally developed as a college project (I’m guessing he got a first!) and rather than let it die, developer Zacharias Vamvakousis made it available free and open source. After a few years with no updates, the news is that there are some big updates on the way. We are looking forward to seeing what they have in store for us.
Apollo Ensemble – http://www.apolloensemble.co.uk/. Although this software can enable someone to play music using eyegaze, it can do so much more besides. In the right hands this application can coordinate an entire music group all using different AT and alternative instruments.
E-scape http://www.inclusivemusic.org.uk/ developed by Dr Tim Anderson is eyegaze accessible software for composition and performance. We have posted about this wonderful software and the legend of accessible music, Tim Anderson before here.
EyePlayMusic https://mybreathmymusic.com/en/eyeplaymusic. Another legend of accessible music, Ruud Van Der Wel of My Breath My Music, collaborated on this free eyegaze music app. Simple but effective, this could be a great starting point before moving on to the EyeHarp.
Another option for eyegaze music production is using software like the Grid 3 or Iris to create an eyegaze accessible interface for a mainstream digital audio workstation. The demo below is done using Ableton Live; however, any software that offers keyboard mapping or keyboard shortcuts (so any quality software) could be used in the same way.
Anybody working with Assistive Technology (AT) knows how useful Apple iOS devices are. Over the years they have gradually built in a comprehensive and well-designed range of AT supports that go a long way to accommodating every access need. This is no small feat. In 2009 VoiceOver transformed what was essentially a smooth featureless square of glass with almost no tactile information, into the preferred computing device for blind people. In 2019 Voice Control and the improvements made to Assistive Touch filled two of the last big gaps in the area of “hands free” control of iOS. All this great work is not completely altruistic however as it has resulted in Apple mobile devices cementing their place as the preeminent platform in the area of disability and AT. It is because of this that it has always been somewhat of a mystery why there has never been a commercial eye tracking option available for either iOS or MacOS. Perhaps not so much iOS as we will see but certainly one would have thought an eyegaze solution for the Apple desktop OS could be a viable product.
There are a few technical reasons why iOS has never
supported eyegaze. Firstly, up until the newer generations of eye gaze
peripherals, eye gaze needed a computer with a decent spec to work well. iPads
are mobile devices and Apple originally made no apologies for sacrificing
performance for more important mobile features like reducing weight, thickness
and increasing battery life. As eye trackers evolved and got more sophisticated,
they began to process more of the massive amount of gaze data they take in. So
rather than passing large amounts of raw data straight through to the computer
via USB 3 or Firewire they process the data first themselves. This means less
work for the computer, and a connection with less bandwidth can be used. Therefore,
in theory, an iPad Pro could support something like a Tobii PC Eye Mini but in practice,
there was still one major barrier. iOS did not support any pointing device, let
alone eye tracking devices. That was until last September’s iOS update. iOS 13
or iPadOS saw upgrades to the Assistive Touch accessibility feature that
allowed it to support access to the operating system using a pointing device.
It is through Assistive Touch that the recently announced Skyle for iPad Pro is possible. Skyle, billed as “the world’s first eye tracker for iPad Pro”, was recently announced by German company EyeV https://eyev.de/ (who I admit I have not previously heard of). Last week it appeared as a product on Inclusive Technology for £2000 (ex VAT). There is very little
information on the manufacturer website about Skyle so at this stage all we
know is based on the Inclusive Technology product description (which is pretty
good thankfully). The lack of information about this product (other than the
aforementioned) significantly tempers my initial excitement on hearing that
there is finally an eye tracking solution for iOS. There are no videos on YouTube
(or Inclusive Technology), no user reviews anywhere. I understand it is a new
product but it is odd for a product to be on the market before anybody has had
the opportunity of using it and posting a review. I hope I am wrong but alarm
bells are ringing. We’ve waited 10 years for eye tracking on iOS, why rush now?
Leaving my suspicion behind, there are some details on Inclusive Technology which will be of interest to potential customers. If you have used a pointing device through Assistive Touch on iPadOS you will have a good idea of the user experience. Under Cursor in the Assistive Touch settings you can change the size and colour of the mouse cursor. You will need to use the Dwell feature to automate clicks, and the Assistive Touch menu will give you access to all the other gestures needed to operate the iPad. Anyone who works with people who use eye tracking for computer access will know that accuracy varies significantly from person to person. Designed for touch, targets in iPadOS (icons, menus) are not tiny; they are, however, smaller than a cell in the most detailed grid used by a highly accurate eyegaze user. Unlike a Windows based eye gaze solution there are no additional supports, for example a grid overlay or zooming, to help users with small targets. Although many users will not have the accuracy to control the iPad with this device (switch between apps, change settings) it could be a good solution within an AAC app (where cell sizes can be configured to suit user accuracy) or a way of interacting with one of the many cause and effect apps and games. Again, however, if you have a particular app or activity in mind please don’t assume it will work; try before you buy. It should be noted here that Inclusive Technology are offering a 28-day returns policy on this product.
There is a switch input jack which will offer an alternative to Dwell for clicking, or could be set to another action (show the Assistive Touch menu, maybe). I assume you could also use the switch with iOS Switch Control, which might be a workaround for those who are not accurate enough to access smaller targets with the eye gaze device. It supports 5 and 9 point calibration to improve accuracy. I would like to see a 2 point calibration option, as 5 points can be a stretch for some early eyegaze users. It would also be nice if you could change the standard calibration dot to something more likely to engage a child (a cartoon dog perhaps).
Technical specs are difficult to compare even between eye trackers on the same platform (Tobii v EyeTech, for example) so I’m not sure what value there would be in comparing this device with other Windows based eye trackers. That said, some specs that will give us an indication of who this device may be appropriate for are sample rate and operating distance. Judging by the sample rate (given as 18 Hz, max 30 Hz), the Skyle captures around half the frames per second of its two main Windows based competitors (Tobii 30 FPS, TM5 42 FPS). However, even 15 FPS should be more than enough for accurate mouse control. The operating distance (how far the device is from the user) for Skyle is 55 to 65 cm, which is about average for an eyegaze device. However, only offering a range of 10 cm (the Tobii range is 45 cm to 85 cm, so 40 cm), as well as the photo below showing the positioning guide, both indicate that this is not a solution for someone with even a moderate amount of head movement, as the track box (the area where eyes can be successfully tracked) seems to be very small.
In summary: if you are a highly accurate eyegaze user with good head control and you don’t wear glasses, Skyle could offer you efficient and direct hands free access to your iPad Pro. It seems expensive at €2500, especially if you don’t already own a compatible iPad (add at least another €1000 for a 12” iPad Pro). If you have been waiting for an eyegaze solution for iOS (as I know many people have) I would encourage you to wait a little longer. When the opportunity arises, try Skyle for yourself. By that time, there may be other options available.
If any of the assumptions made here are incorrect, or if there is any more information available on Skyle, please let us know and we will update this post.
One aspect of modern technological life that might help us to keep some faith in humanity is the comprehensive assistive technology that is built into, or free to download for, mobile computing devices. Accessibility features, as they are loosely called, are a range of tools designed to support non-standard users of the technology. If you can’t see the screen very well you can magnify text and icons (1) or use high contrast (2). If you can’t see the screen at all you can have the content read back to you using a screen-reader (3). There are options to support touch input (4, 5) and options to use devices hands free (6). Finally, there are also some supports for deaf and hard of hearing (HoH) people, like the ability to switch to mono audio, or visual and haptic alternatives to audio based information.
In their mobile operating system iOS, Apple do accessibility REALLY well and this is reflected in the numbers. In the 2018 WebAIM Survey of Low Vision users there were over 3 times as many iOS users as Android users. That is almost the exact reverse of the general population (3 to 1 in favour of Android). For those with motor difficulties it was less significant, but iOS was still favoured.
So what are Apple doing right? Well obviously, first and foremost, the credit would have to go to their developers and designers for producing such innovative and well implemented tools. But Google and other Android developers are also producing some great AT, often highlighting some noticeable gaps in iOS accessibility. Voice Access, EVA Facial Mouse and basic pointing device support are some examples, although these are gaps that will soon be filled if reports of coming features to iOS 13 are to be believed.
Rather than being just about the tools, it is as much, if not more, about awareness of those tools: where to find them and how they work. In every Apple mobile device you go to Settings > General > Accessibility and you will have Vision (1, 2, 3), Interaction (4, 5, 6) and Hearing settings. I’m deliberately not naming these settings here so that you can play a little game with yourself and see if you know what they are. I suspect most readers of this blog will get 6 from 6, which should help make my point. You can check your answers at the bottom of the post 🙂 This was always the problem with Android devices. Where Apple iOS accessibility is like a tool belt, Android accessibility is like a big bag. There is probably more in there but you have to find it first. This isn’t Google’s fault; they make great accessibility features. It’s more a result of the open nature of Android. Apple make their own hardware and iOS is designed specifically for that hardware; it’s much more locked down. Android is an open operating system and as such it depends on the hardware manufacturer how accessibility is implemented. This has been slowly improving in recent years, but Google’s move to bundle all their accessibility features into the Android Accessibility Suite last year meant a huge leap forward in Android accessibility.
What’s in Android Accessibility Suite?
Accessibility Menu
Use this large on-screen menu to control gestures, hardware buttons, navigation, and more. A similar idea to Assistive Touch on iOS. If you are a Samsung Android user it is similar to (but not as good as, in my opinion) the Assistant Menu already built in.
Select to Speak
Select something on your screen or point your camera at an image to hear text spoken. This is a great feature for people with low vision or a literacy difficulty: it will read the text on screen when required, without being always on like a screen reader. A similar feature was built into Samsung devices before inexplicably disappearing with the last Android update. The “point your camera at an image to hear text spoken” claim had me intrigued. Optical Character Recognition like that found in Office Lens or SeeingAI, built into the regular camera, could be extremely useful. Unfortunately I have been unable to get this feature to work on my Samsung Galaxy A8. Even when selecting a headline in a newspaper I’m told “no text found at that location”.
Switch Access
Interact with your Android device using one or more switches or a keyboard instead of the touch screen. Switch Access on Android has always been the poor cousin to Switch Control on iOS but is improving all the time.
TalkBack Screen Reader
Get spoken, audible, and vibration feedback as you use your device. Google’s mobile screen reader has been around for a while and, like Switch Access, it is apparently improving, but I’ve yet to meet anybody who actually uses it full time.
So to summarise: as well as adding features that may have been missing on your particular “flavour” of Android, this suite standardises the accessibility experience and makes it more visible. Another exciting aspect of these features being bundled in this way is their availability for media boxes. Android is a hugely popular OS for TV and entertainment, but what is true of mobile device manufacturers is doubly so of Android box manufacturers, where it is still very much the Wild West. If you are in the market for an Android box and accessibility is important, make sure it’s running Android version 6 or later so you can install this suite and take advantage of these features.
Since the year 2000 Enable Ireland’s Assistive Technology (AT) training service has run a Foundations in AT (5 ECTS) course certified by the Technological University Dublin (TUD). Those of you reading this post will most likely be familiar with AT and what a broad and rapidly evolving area it is. While overall the direction AT has taken over the last decade is positive and exciting, it has also become a more challenging area to work in. As a result, the importance and value of the Foundations in AT course has also increased, and this is both reflected in, and a direct result of, the calibre of course participants we’ve had in recent years. The wealth of experience brought by participants each year helps the course evolve and develop, filling in gaps and offering new directions for technology to support people in areas beyond primary needs such as communication, access and daily living. Last month we began what is a new effort on our part to share with a wider audience some of the excellent work produced by Foundations in AT course participants, with Shaun Neary’s post Accessible Photography – Photo Editing with Adobe Lightroom & the Grid 3. This month we will look at another area of creativity: music.
Alex Lucas enrolled in the 2018 Foundations in AT course. As soon as we learned about his background and experience, we knew that his involvement in the course was an opportunity for us to learn more about accessible music technology and practice. Alex is an academic (PhD research student in Queen’s University Belfast), a maker, a musician, a developer and a product designer. Before returning to studies, he had gained 10 years’ experience working in mainstream music technology with big name companies like Focusrite and Novation. In Queens he is currently researching “Supporting the Sustained Use of Bespoke Assistive Music Technology” and is part of the Research Group: Performance Without Barriers. He also works with Drake Music Northern Ireland.
We could be accused of having underutilised Alex, but our suggestion for his project was to produce a resource that would act as an introduction to people new to the area of accessible music technology. Alex chose to focus on the mainstream Digital Audio Workstation (DAW) application Ableton Live and Switch input. As well as the project document (download link below) he released 5 really excellent tutorial videos on YouTube, the first of which is embedded here.
Alex kindly agreed to contribute to this post so we asked him why he chose to focus on Ableton, to tell us a bit more about his work in inclusive music and a little about the research he is currently undertaking at Queens. Over to you Alex..
There are many software applications available for computer-based music production. Ableton Live is arguably one of the most popular DAWs. When first released in 2001, Ableton Live set itself apart from other DAWs through a unique feature called Session View.
Session View is a mode of operation which can be thought of as a musical sketchbook providing composers with an intuitive way to create loop-based music; a feature which is particularly useful when creating electronic music. When combined with Ableton Live’s built-in virtual musical instruments and devices for creating and modifying musical ideas, we find ourselves with a rich toolset for composing music in inclusive settings.
How does this work with groups?
Music connects people; we see this often when conducting group-based inclusive music workshops, making work of this kind essential to Drake Music NI. There could be up to twelve participants of mixed abilities in a typical Drake workshop. As Access Music Tutors, we approach group workshops by first speaking to each participant in turn to identify their creative goals. One individual may have an interest in playing distorted synthesiser bass sounds, while another may prefer the softer sound of a real instrument such as a piano. Knowledge of an individual’s creative goals and their access requirements is used to select an appropriate device for the participant to use to control a virtual instrument within Ableton Live.
In addition to the Access Switches described in the videos mentioned above, Drake Music also uses commercially available assistive music technologies such as Soundbeam and Skoog, and mainstream MIDI controllers such as the Novation Launchpad. It’s possible to connect several of these devices to a single computer running Live.
Together, the group makes core musical decisions, i.e. genre, tempo and musical key. The workshop will proceed in one of two ways: either we jam together, or we record each participant in turn, building up a composition gradually using overdubbing techniques.
OHMI – One-Handed Musical Instrument Trust
There are a handful of other organisations within the UK, working towards providing inclusion in music. One notable organisation is the One-Handed Musical Instrument Trust (https://www.ohmi.org.uk/). Many traditional musical instruments are designed in such a way that they place a fundamental requirement on the musician; they must have two fully functional hands. This assumption results in the exclusion of some individuals from learning a traditional musical instrument. Furthermore, in some cases, accomplished musicians are not able to return to their instrument after losing the function of a hand due to illness or an accident. OHMI aims to address this shortcoming by running an annual competition which invites instrument designers to adapt traditional musical instruments to be played by one hand only. Many fantastic designs are submitted to OHMI each year. I’m particularly impressed by David Nabb’s Toggle-Key Saxophone (https://www.unk.edu/academics/music/_files/toggle-key-system.pdf) which retains all of the functionality of a standard saxophone while being playable by one hand.
Whilst OHMI primarily focuses on the adaptation of traditional acoustic instruments for inclusion and accessibility; my research centres on the challenges faced by disabled musicians in the long-term use of custom-made digital musical instruments.
In partnership with a disabled musician named Eoin at Drake Music NI, we’ve been designing a digital musical instrument tailored to Eoin’s unique abilities. Eoin has a strong desire to play electric guitar, but as he cannot hold a guitar, due to its physical characteristics, he has been unable to until this point.
Using a motion sensor and an access switch, coupled with a Raspberry Pi embedded computer, Eoin is now able to play rudimentary guitar sounds using the movements of his right arm. We’ve tested several prototypes and are now in the process of assembling the instrument for Eoin to use both during Drake music workshops and at home.
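By way of illustration only (this is my own sketch, not the actual design of Eoin’s instrument), mapping an arm angle from a motion sensor onto a small set of guitar notes might look something like the Python below, with the access switch then triggering whichever note is currently selected. The scale, range and function name are all assumptions for the example.

```python
# Hypothetical mapping: quantise an arm angle into one of five notes
# of an E minor pentatonic scale (MIDI numbers: E2, G2, A2, B2, D3).
E_MINOR_PENTATONIC = [40, 43, 45, 47, 50]

def angle_to_note(angle_deg, min_deg=0.0, max_deg=90.0,
                  scale=E_MINOR_PENTATONIC):
    """Map an arm angle in degrees onto a note from the scale."""
    # Clamp the reading, then scale it into an index over the notes.
    clamped = max(min_deg, min(max_deg, angle_deg))
    span = max_deg - min_deg
    index = int((clamped - min_deg) / span * (len(scale) - 1) + 0.5)
    return scale[index]
```

Quantising to a scale like this means every reachable position produces a musically useful note, so the precision demanded of the player’s movement stays low.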
As a musician, Eoin is the primary user of the device; however, we’ve also been considering Eoin’s primary carer, his father Peter, as a secondary user. We’ve designed a high-level interface for Peter to use, hopefully allowing him to easily set up the device for Eoin to use at home. We’re particularly interested in the longevity of the device, and whether or not it’s viable for Eoin and Peter to use independently. Obsolescence can be a problem for assistive technology in general. Our current assumption is that obsolescence may be an issue with custom-made accessible digital musical instruments, but we hope, through this research, to discover useful mitigation strategies.
Microsoft Ireland hosted a half-day workshop for second-level students on using
technology for additional support within education. This workshop came about
thanks to Tara O’Shea, Community Affairs Manager at Microsoft, and Stephen
Howell, Academic Program Manager. Tara has been a huge supporter of Enable
Ireland Assistive Technology Service over the last decade and has been the driving
force behind many of the successful projects we have collaborated on. Stephen
would be a very familiar face to anyone involved in the space where technology
and education meet, not just in Ireland but internationally.
The goal of
the workshop was to introduce some of the collaboration tools available to
students using Office365,
additional supports available to students with maths or language difficulties,
and alternative ways to produce and present content. Obviously, as
Microsoft was hosting, there was an emphasis on their tools; nevertheless Stephen
was quite open about how similar features are available on other platforms. We
(Enable Ireland AT) pride ourselves on providing independent recommendations;
the best solution for the user is the solution they use best. The practice of
schools forcing students down any particular route (Microsoft, Google or Apple)
is restrictive and causes difficulties if there are specific access or support
needs. Microsoft and Google, though, offer more browser-based tools, which means
users are free to use any device. I should also acknowledge that Microsoft
have really upped their game in the areas of Education and Accessibility over
the last few years.
Fostering collaboration is a cornerstone of modern education and promotes a vital real-world skill (teamwork) that will serve students throughout their lives. The screenshot below, from Facebook (Stephanie McKellop), illustrates a way that tools we may have considered more for remote collaboration can be used within a classroom or lecture hall.
When it comes to collaboration, Microsoft Teams is at the centre. Teams is a unified communications platform; basically it’s like WhatsApp or Facebook Messenger but with tonnes of additional features. Through Teams you can not only instant message, video/audio call or share desktops but you can also work on shared documents, whiteboards or mind maps. There are also plugins for many third-party apps and services, so if you already use a collaboration app or service there is probably an integration available. Stephen demonstrated how a tool like Teams could be used in a classroom session by setting up a class team and getting everyone to work on a short Sway presentation (we mentioned Sway in a previous post a couple of years ago; I don’t understand why everyone isn’t using it by now). Once everyone had completed their presentation they posted a link to the class message stream and Stephen showed it on the large screen. Okay, this exercise could have been done without Teams, but using the service made it so much easier and, more importantly, everything was recorded for students to revisit in their own time.
We’ve looked at Microsoft Learning Tools numerous times on this blog over the last
few years (read
this post if you want to know more about Learning Tools).
Thankfully, since its introduction as a plugin for OneNote in 2016 it has gone
from strength to strength. Features like Immersive
Reader are now standalone apps and have also found their way into
many other Office365 apps like Word and Outlook. Some other apps Stephen introduced
are listed below with a brief description. They are all free, so we encourage
you to download and try them yourselves.
Microsoft Math: If you are familiar with the language-learning app Duolingo, this app takes a similar approach to teaching Mathematics: short challenges with rewards and feedback. Gamifying Maths.
Lets you quickly and easily capture content from the web (pictures, text etc),
draw and annotate it and share with other apps.
Provides a blank canvas where you can collaborate with others and share with
Microsoft Translator: Useful for translations or
transcriptions. Stephen also showed how it can be a great way to practice pronunciation
when learning to speak a foreign language.
This week’s post was contributed by Wyn McCormack, co-author of the Factsheets on Dyslexia at Second Level. Wyn has been involved with the Dyslexia Association of Ireland for over 20 years and has designed and presented courses on dyslexia for parents, teachers and students. She has written extensively on the topic, including Lost for Words, a Practical Guide to Dyslexia at Second Level (3rd Ed. 2006) and Dyslexia, An Irish Perspective (3rd Ed. 2011), as well as being the co-author of the Factsheets on Dyslexia at Second Level in 2013 (updated 2014, 2015, 2016). She has been a presenter for SESS, the Special Education Support Service. She is a former Guidance Counsellor and Special Educational Needs teacher. Her three sons have dyslexia.
* * * *
In 2014 the Dyslexia Association of Ireland asked Mary Ball, an educational psychologist, and me to write the Factsheets on Dyslexia at Second Level to celebrate their 40th anniversary. The key objective of the Factsheets was to give teachers clear and concise information on dyslexia, how it affects students and how schools and teachers can help. With dyslexia affecting approximately one in ten people, there are many thousands of students with dyslexia in schools.
There are 18 Factsheets. The majority were intended for teachers and schools and cover topics such as teaching literacy, numeracy, foreign languages, Maths and Assistive Technology. Factsheet 16 is for parents on how they can help and Factsheet 17 is for students on study strategies.
I update the Factsheets annually in August and they are available for free download at www.dyslexiacourses.ie. After putting the work into writing them, I really wanted to get them widely used. In 2014 I had taken early retirement as a Guidance Counsellor and Special Education Teacher, so I set up Dyslexia Courses Ireland to offer schools, parents and students courses on dyslexia-friendly strategies and AT resources. I was then joined by Deirdre McElroy, a school colleague who had worked as a NEPS educational psychologist. The courses have been really well received. Since 2014 we have had just under 3,000 teachers, 540 parents and 480 students attend our courses. We run courses at central venues for teachers and also give presentations to the teaching staff within schools. At this stage we have been to schools in every county (outside of N. Ireland). In 2018, in the last week of August (which is the first week of the school year), we presented courses in 14 schools.
The course for students is a study skills workshop. Students with dyslexia may experience difficulties with organisation, reading, memory and learning, note-taking, writing and spelling. They may find it hard to show what they know in exams due to misreading questions and poorly structured answers. The workshop covers strategies that help the student to achieve and which also target their specific difficulties.
A key element of the teacher courses is that, while we share ideas with the teachers, we also ask them to recommend websites, apps and strategies that they are using in the classroom. As a result we have an extensive list of recommended websites, which the teachers have generously allowed us to share. We do this by sending out a newsletter twice a year to all schools as well as to those who attended our courses. The recommendations have grown so much that, while we originally had a single handout (Useful Websites/Apps covering keynotes, subject-specific resources, study skills, exam preparation, assistive technology and online tutorials), we have had to split it into one for teachers and one for students. Both are available under Downloads on the website.
While my favourite websites vary over time, some really helpful ones are as follows:
alison.com for on-line tutorials in Project Maths at Junior and Leaving Cert.
sparknotes.com and, in particular, their short videos of Shakespearian plays and the No Fear guides, where the Shakespearian words are on one side of the page with a modern English translation on the other.
studystack.com with flashcards and games when key facts have to be learnt.
The reason I am so involved is that my three sons are dyslexic and I realised much more needed to be done at second level. As I travelled with them on their journey through education, I also realised there was a reason why I could never tell left from right, and that I shared some dyslexic traits myself. These experiences have helped me appreciate the difficulties which many students with dyslexia face in school.
I hope the Factsheets contribute to greater awareness of dyslexia at second level and all the ways that teachers and schools can support these students.
Voice banking involves recording a list of sentences into a computer. When enough recordings have been captured, software chops them up into individual sounds: phonetic units. A synthetic voice can then be built from these phonetic units; this is called concatenative speech synthesis. The number of sentences needed to build a good-quality English-language synthetic voice using this process varies, but it is somewhere between 600 and 3,500. This will take at least 8 hours of recording in total. Most people break it up over a few weeks, which is recommended, as voice quality will deteriorate over the course of a long session. So 20 minutes to half an hour in the morning (when most people’s voices are clearer) would be a good approach. The more recordings made, the better the quality of the resulting voice.
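For the curious, the concatenative idea can be sketched in a few lines of Python. This is a deliberately toy illustration: real voice-banking systems store thousands of recorded units, pick between alternative recordings of the same sound, and smooth the joins between them. The phoneme labels and sample values below are made up purely to show the look-up-and-concatenate principle.

```python
# Toy illustration of concatenative synthesis: build an utterance by
# concatenating pre-recorded phonetic units. Each "unit" here stands in
# for a short recorded audio clip, represented as a list of samples.

unit_inventory = {
    "HH": [0.1, 0.2],    # recorded samples for the /h/ sound
    "AH": [0.3, 0.4],    # /ʌ/
    "L":  [0.5],         # /l/
    "OW": [0.6, 0.7],    # /oʊ/
}

def synthesise(phonemes):
    """Concatenate the stored units for a phoneme sequence."""
    samples = []
    for p in phonemes:
        if p not in unit_inventory:
            raise KeyError(f"no recording banked for unit {p!r}")
        samples.extend(unit_inventory[p])
    return samples

# "hello" reduced to a simplified phoneme sequence: HH AH L OW
audio = synthesise(["HH", "AH", "L", "OW"])
```

The `KeyError` branch hints at why so many sentences must be recorded: every sound the voice will ever need has to appear, in context, somewhere in the banked recordings.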
There are a number of services offering voice banking and we have listed some that we are aware of below. The technology used varies from service to service and this post isn’t intended to be a guide to which service may be appropriate to a particular user. Our advice would be to investigate all options before making a decision as this process will be a considerable investment of time and in some cases money.
A person might choose to bank their voice for a number of reasons. The most common would be a diagnosis of a progressive illness like Motor Neuron Disease (MND/ALS), or similar, that will result in the loss of speech. A voice is a very personal thing, and being able to keep this aspect of individuality and identity can be important. The MND Association have detailed information on Voice Banking on their website here. People unable to speak from birth can also take advantage of this technology. The VocalID service (although expensive) seems to offer good options in this regard. A family member could donate their voice by going through the voice banking process (or they could choose an appropriate donated voice). This synthetic voice could then be modified with filters modelled on the user’s own vocalisations. The result is a unique and personal voice with some of the regional qualities (accent, pronunciation) that reflect their background and heritage. Irish AAC users have historically had little choice when it came to selecting a voice, most grudgingly accepting the upper-class BBC-newsreader English voice that was ubiquitous in communication devices. In Ireland, where accents can vary significantly over such small geographical areas, how you speak is perhaps even more tied to your identity than in other countries. Hopefully in the near future we will be hearing AAC users communicating in Cork, Limerick and Dublin accents!
For research purposes I used the ModelTalker service to create a synthetic voice. I wanted to see how well it dealt with the Irish accent. The ModelTalker service is run out of the Nemours Speech Research Laboratory (SRL) in the Nemours Center for Pediatric Auditory and Speech Sciences (CPASS) at the Alfred I. duPont Hospital for Children in Wilmington, Delaware. It is not a commercial service, costing only a nominal $100 to download your voice once banked. They offer an Online Recorder that works directly in the Chrome browser, or you can download and install their MTVR app if you are using the Windows OS. The only investment you need to make to begin banking your voice is a decent-quality USB headset; I used the Andrea NC-181 (about €35). For the best quality they recommend you record about 1,600 sentences, but they can build a voice from 800. As this was just an experiment, I recorded the minimum 800. At the beginning of each session you go through a sound check. Consistency is an important factor contributing to the overall quality of the finished voice, which is why you need to keep using the same computer and microphone throughout the whole process, ideally in the same location. When you begin you will hear the first statement read out; you then record the statement yourself. A colour code gives you feedback on whether the recording was acceptable or not. Red means it wasn’t good enough to use and you should try again. Yellow is okay, could be better, and green means perfect, move on. I found the Irish accent resulted in a lot of yellow. Don’t let this worry you too much. A nice feature for Irish people who want to engage in this process is the ability to record custom sentences. They recommend that you at least record your own name. So many names and places in Ireland are anglicised versions of Irish that it would be worthwhile spending a bit of time on these custom sentences.
“Siobhán is from Drogheda” for example would be incomprehensible using most Text to Speech. At the end of each session you upload your completed sentences which are added to your inventory (if using the browser based recorder they are added as you go). When you feel you have enough completed you can request your voice. When the voice is ready you need to audition it, this process allows you to fine tune how it sounds. I made a screen recording of this process and I will add it to this post when I have edited it down to a manageable length.
Click play below to hear a sample of my synthesized voice. Yes, unfortunately I do kind of sound like that!
It appeared in the Cosmo room as if out of nowhere. Looking like a section of the international space station (one of the newer parts), it immediately grabs the attention of anybody who enters the room. Enable Ireland Children’s Services have been trialling a Sensory Pod over the last few months and both staff and clients are enthusiastic about it. I had a quick chat with Robert Byrne, creator of the Sensory Pod, while he was making some minor modifications based on feedback from our therapists.
In a previous job Robert Byrne spent a lot of time visiting manufacturers in Asia, which is when he first came across the idea of a capsule hotel. Due to population density, space in some Asian cities is at a premium. A capsule hotel consists of rooms that are only the size of the bed they contain: you have enough head room to sit up in bed but not enough to stand. In this corner of the world, with our open spaces and high ceilings, the thought of a night in such accommodation might cause us to break into a claustrophobic sweat; Robert, however, only saw an opportunity. Through a family member, Robert had experience of Autism. A common symptom reported by people with this form of neurodiversity is oversensitivity to stimuli: light, noise, touch and smells. It is this aspect of Autism that can actually prevent some people from engaging in everyday activities such as work and education. Robert noticed how successful the capsule hotel room was at shielding its occupant from such outside stimuli and realised it could be a very cost-effective way to provide a safe and comfortable space for schools and colleges.
He took the basic design of the capsule room and customised it to suit this new function. Along with his design team, he reinforced the plastic shell and mounted the pod in a steel frame, with an extra bed that can be pulled out alongside the Pod. This provides a comfortable area for a parent or caregiver to relax when the Pod is occupied. They added LED mood lighting, temperature control, audio and a 22” learning screen. The design is modular, allowing customisation to best suit individual clients’ needs; full details are on the Sensory Pod site.
It’s all very well having a good idea, but it takes a particular type of person to be able to see it through to a marketable product. The Sensory Pod have built an extensive portfolio designing and manufacturing sleep systems and safe spaces for some of the largest corporate companies across Europe and further afield. They played a key role in Dublin City University’s successful Autism Friendly Campus initiative. Students can apply for a smart card and book a time slot; using their card they can open the pod door and escape the hustle and bustle of campus life for an hour.