Lost Voice Guy – Winner of Britain’s Got Talent 2018

A few weeks ago, Lee Ridley (a.k.a. Lost Voice Guy) became the first comedian to win Britain’s Got Talent, now in its 12th year. A favourite with both the judges and the public, Lee outshone his competitors along the way and won by a clear margin.


What makes Lee’s win even more incredible is the fact that he is the first person with a disability to win the show. For a stand-up comedian, being able to connect with your audience is essential, and he did this with self-deprecating humour, fantastic delivery and some killer one-liners, all delivered through the use of Augmentative and Alternative Communication (AAC).


AAC provides a means of communication for those whose speech is not sufficient to communicate functionally in all environments and with all partners. Lee uses a combination of two devices to support his communication – an iPad with apps, and a dedicated device called a Lightwriter.


Lee has been on the comedy circuit since 2012, and has won prestigious prizes, including the BBC Radio New Comedy Award in 2014. Below is an interview that Lee gave, via email, to Karl O’Keeffe back in 2013, which offers some insight into his process and the unique challenges that using a synthesised voice can present.


Check out Lee’s other work on his YouTube channel (www.youtube.com/user/LostVoiceGuy) – be prepared to laugh your socks off!


Karl: You are the first person ever to do stand up comedy who uses a communication device, so you had nobody to learn from. What are the most important techniques and tricks you have learned so far that you wish someone had told you when you were starting?


Lee: I think one of the most important techniques that I have learnt is how to deal with timing. Obviously it’s pretty hard to know when to leave pauses for laughter and stuff, especially as I have to pre-plan this. I can pause whenever I want but you have to be ready to pause when people laugh otherwise the start of the next bit gets lost or they don’t laugh as long. You sort of have to know when it’s coming so you’re ready for it. Obviously every audience is different so I’m never going to get it right every time. I think I’m getting better at anticipating when to pause though.


Karl: I see from your videos that you use both a Lightwriter and an iPad. Can you tell me which is better for stand up comedy?


Lee: I use my iPad for my stand up and I use my Lightwriter for day to day conversations. I just find that my iPad is slightly easier to understand. It is also easier to find my material on the iPad and because it backs up to the cloud, it’s a bit more secure and means I can use any Apple device. It’s also a bit sexier than my Lightwriter.


Karl: Do you always use the same voice? Why is the voice important in your performance?


Lee: I use the same voice mostly yes. However I do use other voices in my act as well for comedy purposes. For example, I use a woman’s voice to do an impression of my mother. I think that my main voice is important to me because it has become ‘my’ voice. It’d be weird if I changed it now.


Karl: What app do you use on the iPad for communication?


Lee: I use Proloquo2go, which is a brilliant app. It is very complex but easy to use at the same time. It does everything that I need it to do really.


Karl: What is your favourite app on the iPad?


Lee: I tweet quite a lot so I tend to use Tweetbot all the time. I couldn’t get through long train journeys without the Spotify app either!


Karl: Do you use any other Assistive Technology (computer access etc.)?


Lee: No. I only use Proloquo2go on my iPad and iPhone and then my Lightwriter.


Bloom 2017 ‘No Limits’ Grid Set

You may have heard about or seen photos of Enable Ireland’s fantastic “No Limits” garden at this year’s Bloom festival. Some of you may even have been lucky enough to visit it in the Phoenix Park over the course of the Bank Holiday weekend. To support visitors, and to let those who didn’t get the chance to go share in some of the experience, we have put together a “No Limits” Bloom 2017 Grid set. If you use the Grid (2 or 3) from Sensory Software, or you know someone who does, and you would like to learn more about the range of plants used in Enable Ireland’s garden, you can download and install it by following the instructions below.

How do I install this Grid?

If you are using the Grid 3 you can download and install the Bloom 2017 Grid without leaving the application. From Grid explorer:

  • Click on the Menu Bar at the top of the screen
  • In the top left click the + sign (Add Grid Set)
  • A window will open (pictured below). In the bottom corner click on the Online Grids button (you will need to be connected to the Internet).

grid 3 screen shot

  • If you do not see the Bloom2017 Grid in the newest section you can either search for it (enter Bloom2017 in the search box at the top right) or look in the Interactive learning or Education Categories.

If you are using the Grid 2, or you want to install this Grid on a computer or device that is not connected to the Internet, you can download the Grid set at the link below. You can then add it to the Grid as above, except select the Grid Set File tab and browse to where you have saved the Grid set.

For Grid 2 users:

Download Bloom 2017 Grid here https://grids.sensorysoftware.com/en/k-43/bloom2017

Boardmaker Online now launched in Ireland

Tobii Dynavox have recently launched their new Boardmaker Online product in Ireland through SafeCare Technologies. It has all the functionalities of previous versions of Boardmaker, except now that it’s web-based you don’t need any disks and multiple users can access it from any PC.

Instructor showing students how to use Boardmaker Online

You can purchase a Personal, Professional or District account, and the amount you pay depends on the type of account, the number of “instructors” and how many years you want to sign up for. You can also get a discount for any old Boardmaker disks that you want to trade in.

You get all the symbols that have been available in past versions, as well as some new symbol sets, and any new ones created in the future will also be included. Because it’s web-based, you have access to previously created activities via the online community, and you can upload activities you create yourself to that community and share them with other people in your district or all over the world.

Because it’s no longer tied to one device, you can create activities on your PC and assign them to your “students” who can use them either in school and/or at home. You no longer need to have a user’s device in your possession to update their activities and they don’t need to have a period without their device while you do this.

You (and the other instructors in your district if you have a district licence) can also assign the same activity to many students and by having different accessibility options set up for different students, the activity is automatically accessible for their individual needs. For example, you could create an activity and assign it to a student who uses eye gaze and to a student who uses switches and that activity will show up on their device in the format that’s accessible for them.

Picture shows how instructors can assign Boardmaker Online activities to multiple students

The results of students’ work can be tracked against IEP or educational goals which then helps you decide what activities would be suitable to assign next. You can also track staff and student usage.

One limitation is that you can only create activities on a Windows PC or Mac. You can play activities on an iPad using the free app but not create them on it, and you can’t use Boardmaker Online to either create or play activities on an Android or Windows-based tablet.

The other point to mention is that because it’s a subscription-based product, the payment you have to make is recurring every year rather than being a one-off payment, which may not suit everyone.

However, with the new features it’s definitely worth getting the free 30-day trial and deciding for yourself if you’d like to trade in your old Boardmaker disks for the new online version!

GazeSpeak & Microsoft’s ongoing efforts to support people with Motor Neuron Disease (ALS)

Last Friday (February 17th) New Scientist published an article about a new app in development at Microsoft called GazeSpeak. Due for release on iOS over the coming months, GazeSpeak aims to facilitate communication between a person with MND (known as ALS in the US; I will use both terms interchangeably) and another individual, perhaps their partner, carer or friend. Developed by Microsoft intern Xiaoyi Zhang, GazeSpeak differs from traditional approaches in a number of ways. Before getting into the details, however, it’s worth looking at the background: GazeSpeak didn’t come from nowhere. It is actually one of the products of some heavyweight research into Augmentative and Alternative Communication (AAC) that has been taking place at Microsoft over the last few years. Since 2013, inspired by football legend and ALS sufferer Steve Gleason (read more here), Microsoft researchers and developers have brought their considerable collective intellect to bear on increasing the ease and efficiency of communication for people with MND.

Last year Microsoft Research published a paper called “AACrobat: Using Mobile Devices to Lower Communication Barriers and Provide Autonomy with Gaze-Based AAC” (abstract and PDF download at the previous link), which proposed a companion app to let an AAC user’s communication partner assist, in a non-intrusive way, in the communication process. Take a look at the video below for a more detailed explanation.

This is an entirely new approach to increasing the efficiency of AAC, and one that, I suggest, could only have come from a large mainstream tech organisation with over thirty years’ experience facilitating communication and collaboration.

Another Microsoft research paper published last year (with some of the same authors as the previous paper), called “Exploring the Design Space of AAC Awareness Displays”, looks at the importance of a communication partner’s “awareness of the subtle, social, and contextual cues that are necessary for people to naturally communicate in person”. Their research focused on creating a display that would allow the person with ALS to express things like humour, frustration and affection: emotions that are difficult to express with text alone. Yes, they proposed the use of emoji, a proven and effective way of overcoming a similar difficulty in remote or non-face-to-face interactions, but they went much further and also looked at solutions like avatars, skins and even coloured LED arrays. This, like the paper above, is an academic work and as such not an easy read, but the ideas and solutions being proposed by these researchers are practical and will hopefully filter through to end users of future AAC solutions.

That brings us back to GazeSpeak, the first fruit of the Microsoft/Steve Gleason partnership to reach the general public. Like the AACrobat solution outlined above, GazeSpeak gives the communication partner a tool, rather than focusing on tech for the person with MND. As the image below illustrates, the communication partner has GazeSpeak installed on their phone and, with the app running, holds the device up to the person with MND as if photographing them. A sticker with four grids of letters is placed on the back of the smartphone, facing the speaker. The app then tracks the person’s eyes: up, down, left or right, with each direction meaning the letter they are selecting is contained in the grid in that direction (see photo below).

man looking right, other person holding smartphone up with gazespeak installed

Similar to how the old T9 predictive text worked, GazeSpeak selects the appropriate letter group from each glance and predicts the word based on the most common English words. So the app is using AI in the form of machine vision to track the eyes, and also to make the word prediction. In the New Scientist article they mention that the user will be able to add their own commonly used words and people/place names, which one assumes would prioritise them within the prediction list. In future, perhaps some capacity for learning could be added to further increase efficiency. After using this system for a while the speaker may not even need to see the sticker with letters; they could write words from muscle memory. At that stage a simple QR code leading to the app download would allow them to communicate with complete strangers using just their eyes and no personal technology.

2016 – Technology Trends and Assistive Technology (AT) Highlights

As we approach the end of 2016 it’s an appropriate time to look back and take stock of the year from an AT perspective. A lot happened in 2016, not all of it good. Socially, humanity seems to have regressed over the past year. Maybe this short-term, inward-looking protectionist sentiment had been brewing for longer, but 2016 brought the opportunity to express it politically; you know the rest. While society steps back and looks inward, technology continues to leap and bound forward, and 2016 has seen massive progress in many areas, particularly those associated with Artificial Intelligence (AI) and Smart Homes. This is the first in a series of posts examining some technology trends of 2016 and looking at how they affect the field of Assistive Technology. The links will become active as the posts are added. If I’m missing something, please add it to the comments section.

Dawn of the Personal Digital Assistants

Game Accessibility

Inbuilt Accessibility – AT in mainstream technology 

Software of the Year – The Grid 3

Open Source AT Hardware and Software

The Big Life Fix

So although 2016 is unlikely to be looked on kindly by future historians (you know why), it has been a great year for Assistive Technology, perhaps one of promise rather than realisation. One major technology trend of 2016 missing from this series of posts is Virtual (or Augmented) Reality. While VR was everywhere this year, with products coming from Sony, Samsung, Oculus and Microsoft, its usefulness beyond gaming is only beginning to be explored (particularly within Education).

So what are the goals for next year? Harnessing some of these innovations so that they can be made accessible and usable by people with disabilities at an affordable price. If in 2017 we can start putting some of this tech into the hands of those who stand to benefit most from its use, then next year will be even better.

Tobii Power Clinic Online Webinars

Tobii Dynavox, the AAC arm of Tobii Group, the industry-leading eye gaze/tracking solutions manufacturer, are currently producing a series of webinars on their range of products. Although they have completed 8 webinars to date, they are all recorded and available on YouTube and the Power Clinic Online page. Taken together these are a fantastic resource for anybody interested in eye gaze input. While some are product specific (for example “Getting Started With Communicator“), others are more general and will have content of interest to both therapists and technicians working in the area (“Amazing Calibrations for Eyegaze”, “AAC and Autism – Strategies for challenging behaviour” and “EyeGaze Strategies for Reading!”).

Tobii users, especially those new to the technology, will be particularly interested in “Eyegaze Windows Control for beginners!” and “Advanced Windows Control (eyegaze)”. If you’d rather the live experience, make sure to sign up for next week’s webinar, “AAC and Music (Perfect Symphony)”, and you will be able to ask questions and contribute to the discussion.

We have created a playlist of all the Tobii webinars to date on our YouTube channel, if that makes it easier for you to access and share them (click here).

AAC Awareness Day – 24th August 2016

Hi Everyone

Just to make you aware that there’s a FREE AAC Awareness Day being held at the Central Remedial Clinic, Clontarf on 24th August 2016.

Liberator’s AAC Awareness Days are designed for anyone who works with or cares for a non-verbal individual – therapists, teachers, parents, carers etc. This is an excellent opportunity to learn more about:-

  • Natural Language Development in AAC
  • Language Acquisition through Motor Planning (LAMP) – The Centre for AAC and Autism
  • AAC Implementation Strategies
  • The Power of Core Vocabulary
  • Exploring EyeGaze
  • An Overview of Low-Tech AAC Products

If you would like to attend, please contact:-

Lauren Argent
AAC Consultant Support Executive
Tel: 00 44 1733 889 799
Fax: 00 44 1476 552 473


Smartbox Study day in Dublin

A Smartbox Live event showing the audience

Smartbox have some very useful assistive technology products within communication, environmental control and computer control. They have recently launched their Grid 3 software, which has continued to improve over the years, giving many people with disabilities an alternative way to control their computer, or even to communicate.
The Smartbox team will be taking to the road; bringing their developments in the world of AAC and assistive technology to venues across the UK and Ireland.

Next study day is:

Dublin – Friday, 6 November at Crowne Plaza Hotel
Everyone is welcome
Registration: http://www.eventbrite.co.uk/o/smartbox-assistive-technology-6787601933

Inclusive Technology AT Webinars

participant of webinar

Inclusive Technology have hosted a number of assistive technology webinars. If you missed them, they have been recorded and are available to watch at the following link: recordings here!

Although you won’t have access to the live Q&A, the recordings will still be very useful to watch.  All webinars are FREE – all you require is an Internet connection and a computer.

Webinars include: Using Technology to Support Reading and Writing, iPad Access for Physical Disabilities, Using Tobii Eye Gaze and many more.

Eye Pointing and Using an ETRAN Frame

Child looking at an Etran Frame

For some non-verbal children, choosing pictures or symbols with their hands is not an option, due to conditions that make it difficult for them to reach, point or grasp. For these children, we need to look at ways for them to select pictures or symbols, without undue effort or stress, so that they can communicate in an easy, timely manner.

Children naturally look at interesting pictures placed in front of them. They may not yet have the idea that looking intentionally at a picture causes a particular outcome (i.e. looking at the picture of the book results in a story being read to them), but they may gradually make this connection over time (cause and effect) if the outcome of their selection, whether intentional or not, is consistently carried out.

For children making either intentional or unintentional choices, eye pointing (selecting a picture or symbol by looking at it) can be a powerful way of communicating. While many children may start with eye pointing to items in their visual field, it may be necessary to formalise this system through the use of a consistent set-up. An Etran frame is one method of doing this.

An Etran frame is like a large square, see-through donut, with a space in the centre (see picture below). It is usually made from a plastic transparent material, such as Perspex. It sometimes has a base or is mounted on an arm, so that the facilitator does not need to hold it, and has both hands free to interact with the child and their selections. The frame should be placed in front of the child, with picture symbols attached, facing toward the child. The hole in the middle is used by the facilitator to see the child and follow their gaze, to interpret what picture or symbol the child is looking at.


The child should be positioned well, with the Etran frame square on, in front of them. If possible it should be mounted, to prevent it moving around.

The partner should be directly opposite the child and able to make eye contact through the hole in the centre of the frame.

Using the Etran frame

Plan to use 2-3 locations initially – with the top left and bottom right being the first positions to introduce pictures.

The bottom right will always be used as an “All Done”/”Stop” indicator. The vocabulary at the top will vary according to activity.

Etran frame

Make it fun

The child needs to get the idea right from the start that this is a good game. Use highly motivating objects, activities or toys and stick symbols directly onto the frame.

Attaching items to the frame

Blu-Tack or Velcro is really the simplest and best method. If you stick on picture symbols, make sure that you write what they are on the back, so that you can see which is which! (You cannot read the child’s eye movements from the front while also popping around to the child’s side of the frame, “sitting on his or her shoulder”, to see what they are pointing at.)

Child looking through Etran frame

How many objects or pictures?

Not too few (boring), not too many (overwhelming). Two, then quickly onto three is good. Move up gradually.

First steps

Model the activity so that the child knows what they’re supposed to do. Let the child see you sticking pictures/symbols to the frame.


When the child looks at the object for 1-2 seconds, use the selection frame around the symbol s/he looked at (for visual reinforcement), and say “you looked at this. It’s a ____. Will we use/do_____?” (for auditory reinforcement). The reason for introducing the selection frame is for the future, when the child may move onto a high tech device and the selection is indicated with a colour frame around their selection.


If the child doesn’t look at the symbols, say, “Where is the _____? Let’s look for the _____. You help me find it”.

Take your finger and slowly point to the item fixed in the top left position (from the child’s perspective – always start at this same position). Say “Is this it?” Move your finger slowly and smoothly along to the next item along the top of the frame, if a second picture is being used. Try to take the child’s eyes with you as you move your finger: “Is this it?” Keep doing this until you get to the one you want. Then say “Aha, we found it, look at the _____! Here it is!”

Take the selection frame and hold it around the symbol for 2-3 seconds, so that the child knows that this is the choice s/he has selected. For example, below shows the “Stop/All done” symbol selected.

Etran frame

Ideas for using the Etran frame

Book reading – Use a “Turn the page” symbol in conjunction with the “All done/Stop” symbol. A second symbol for “Read it again” or “new book” can be introduced later.

Blowing bubbles/listening to music/any activity the child enjoys – use a “more” or “again” symbol with the “All done/Stop” symbol

Choice making – two symbols plus the “All done/Stop” symbol are required for this, e.g. book vs. teddy, walk vs. car, visit Granny vs. visit shops, jack-in-the-box vs. popper, etc. Even choosing clothes, colours, toys and so on.

Playing games – Simon Says is a great game for group activities. Use a couple of actions such as “dance”, “blow raspberries”, “sing”, “wiggle your bum”, “hop” etc., so that the child can tell others what to do.

Use teddies/dolls for pretend play. Ask the child which s/he wants to play with first (“teddy” and “doll” symbols) and then choose what to do i.e. “have a drink”, “have a bath” “go to bed” etc. etc.