Supporting AAC Users

finger pointing on a communication board

In this time of uncertainty, having access to communication is more important than ever, so that we can all stay informed, ask questions and ease anxieties. We've rounded up some of what AAC suppliers are currently offering to support AAC users, their families and professionals. In addition to the offers below, all companies are offering phone and web support to users, and most are offering online webinars and training.

SafeCare Technologies are offering a 60-day free trial of Snap and Core, for both Windows https://www.tobiidynavox.com/en-GB/software/windows-software/snap/ and iOS https://www.tobiidynavox.com/en-GB/software/ipad-apps/Snap-1. There are also Coronavirus content pages that can be downloaded: https://www.mytobiidynavox.com/psc/snapcorefirst/63829. They are also still offering trials and loans of devices and ensuring that all equipment sent out is sanitised.

Smartbox has a few offers at present. As well as their usual 60-day free trial of the Grid 3 software, they are offering free copies of their software "Look to Read"; sign up at this link: https://thinksmartbox.com/news/look-to-read-donations/. They also have some fantastic Coronavirus resources available for their Super Core users, which are easily adapted for other users as well: https://thinksmartbox.com/news/coronavirus-super-core-resources/. Virtual visits and support are also available.

HelpKidzLearn have 14-day free trials of most of their software and apps and have introduced a new low-cost monthly licence option for their Games and Activities and ChooseItMaker: https://www.helpkidzlearn.com/updates/school-closure.html

Boardmaker has a 90-day trial of their Boardmaker Online; sign up here:

https://goboardmaker.com/blogs/news/boardmaker-online-working-from-home

Liberator are offering 2 months free access to the AAC Language Lab https://www.liberator.co.uk/blog/blog/2020/03/26/2-months-free-access-to-the-aac-language-lab/

Tobii Dynavox also have fantastic low-tech communication chart options for people who might be in hospital – available in symbol format https://download.mytobiidynavox.com/MyTobiiDynavox/Documentation/Coronavirus/CV-US/ICU%20Communication%20Board%20-%20ENGLISH%20%28US%29%20WITH%20SYMBOLS.pdf and text https://download.mytobiidynavox.com/MyTobiiDynavox/Documentation/Coronavirus/CV-US/ICU%20Communication%20Board%20-%20ENGLISH%20%28US%29.pdf

Autism Awareness 2020

April is Autism Awareness Month, and during this time, some suppliers offer discounts and special offers. Some extend for the entire month, others for just a few days, so get in quick if you want to take advantage!

Avaz is offering a 50% discount for the full month on the following products:

  • Avaz AAC: an award-winning communication app for users with speech difficulties arising from ASD, Cerebral Palsy, Down Syndrome, aphasia, apraxia, stroke and more
  • Avaz Reader: an education app that helps struggling readers become independent readers using research-backed strategies
  • Avaz FreeSpeech: an education app that makes learning English grammar fun and easy for children with special needs

For more information, please see: https://www.avazapp.com/blog/autism-acceptance-month-discount-april-1-to-april-30-2020/?utm_source=ticker

Assistiveware have a 50% off promotion from today (2nd April) to the 4th of April on the following apps for iOS and software for Mac:

  • Proloquo2Go: a symbol-based AAC system, completely customisable and designed for a range of fine-motor and visual skills. Available as an app and as software.
  • Proloquo4Text: a text-based AAC system with intuitive word and sentence prediction. Available as an app and as software.
  • Pictello: an app for creating visual stories and visual schedules, building literacy skills.
  • Keeble: an accessible iOS keyboard that can take the place of the standard keyboard and helps individuals with physical or visual impairments.

Please see https://www.assistiveware.com/blog/discount-celebrating-autism-acceptance-month for more information.

PECS (Picture Exchange Communication System from Pyramid Educational Consultants) are offering a 15% discount on products purchased from their Resources shop with the code "pecs20" until the end of June 2020: https://pecs-unitedkingdom.com/store/. They also have a discount on their range of apps, valid for the month of April 2020, including PECS® IV+, PECS® Phase III, iHear PECS®: Animals™, Wait4it™ and Working4™. Please see https://pecs-unitedkingdom.com/apps/ for further information.

Liberator have a 50% discount on the LAMP Words for Life app, available from the 1st to the 5th of April. LAMP (Language Acquisition through Motor Planning) is a therapeutic approach to using AAC with nonverbal individuals with autism. Please click on the link for more information: https://mailchi.mp/liberator.co.uk/lampsale2020-847749?e=6bb0526a2b

We will add more special promotions as we become aware of them!

Skyle – out of the blue

Anybody working with Assistive Technology (AT) knows how useful Apple iOS devices are. Over the years Apple have gradually built in a comprehensive and well-designed range of AT supports that go a long way towards accommodating every access need. This is no small feat. In 2009 VoiceOver transformed what was essentially a smooth, featureless square of glass with almost no tactile information into the preferred computing device for blind people. In 2019 Voice Control and the improvements made to Assistive Touch filled two of the last big gaps in the area of "hands-free" control of iOS. All this great work is not completely altruistic, however, as it has resulted in Apple mobile devices cementing their place as the preeminent platform in the area of disability and AT. It is because of this that it has always been somewhat of a mystery why there has never been a commercial eye tracking option available for either iOS or macOS. Perhaps not so much iOS, as we will see, but certainly one would have thought an eyegaze solution for the Apple desktop OS could be a viable product.

There are a few technical reasons why iOS has never supported eyegaze. Firstly, up until the newer generations of eye gaze peripherals, eye gaze needed a computer with a decent spec to work well. iPads are mobile devices, and Apple originally made no apologies for sacrificing performance for more important mobile features like reducing weight and thickness and increasing battery life. As eye trackers evolved and got more sophisticated, they began to process more of the massive amount of gaze data they take in. So rather than passing large amounts of raw data straight through to the computer via USB 3 or FireWire, they process the data first themselves. This means less work for the computer, and a connection with less bandwidth can be used. Therefore, in theory, an iPad Pro could support something like a Tobii PC Eye Mini, but in practice there was still one major barrier: iOS did not support any pointing device, let alone eye tracking devices. That was until last September's iOS update. iOS 13, or iPadOS, saw upgrades to the Assistive Touch accessibility feature that allowed it to support access to the operating system using a pointing device.

iPad Pro 12″ with Skyle eye tracker and case

It is through Assistive Touch that the recently announced Skyle for iPad Pro is possible. "Skyle is the world's first eye tracker for iPad Pro", announced by German company EyeV https://eyev.de/ (who I admit I have not previously heard of). Last week it appeared as a product on Inclusive Technology for £2,000 (ex VAT). There is very little information on the manufacturer's website about Skyle, so at this stage all we know is based on the Inclusive Technology product description (which is pretty good, thankfully). The lack of information about this product (other than the aforementioned) significantly tempers my initial excitement on hearing that there is finally an eye tracking solution for iOS. There are no videos on YouTube (or Inclusive Technology) and no user reviews anywhere. I understand it is a new product, but it is odd for a product to be on the market before anybody has had the opportunity of using it and posting a review. I hope I am wrong, but alarm bells are ringing. We've waited 10 years for eye tracking on iOS, so why rush now?

Leaving my suspicion behind, there are some details on Inclusive Technology which will be of interest to potential customers. If you have used a pointing device through Assistive Touch on iPadOS you will have a good idea of the user experience. Under Cursor in the Assistive Touch settings you can change the size and colour of the mouse cursor. You will need to use the Dwell feature to automate clicks, and the Assistive Touch menu will give you access to all the other gestures needed to operate the iPad. Anyone who works with people who use eye tracking for computer access will know that accuracy varies significantly from person to person. Designed for touch, targets in iPadOS (icons, menus) are not tiny; they are, however, smaller than a cell in the most detailed grid used by a highly accurate eyegaze user. Unlike a Windows-based eye gaze solution, there are no additional supports, for example a grid overlay or zooming, to help users with small targets. Although many users will not have the accuracy to control the iPad itself with this device (switch between apps, change settings), it could be a good solution within an AAC app (where cell sizes can be configured to suit user accuracy) or a way of interacting with one of the many cause-and-effect apps and games. Again, however, if you have a particular app or activity in mind please don't assume it will work; try before you buy. It should be noted here that Inclusive Technology are offering a 28-day returns policy on this product.
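For anyone unfamiliar with dwell selection, the idea is simple: a click fires when the gaze point stays within a small area for a set length of time. The sketch below is a toy illustration of that general technique in Python; it is not Apple's Assistive Touch or Skyle's actual implementation, and the radius and timing values are invented for the example.

```python
# Toy illustration of dwell selection, not Apple's Assistive Touch code:
# a "click" fires only when the gaze point stays inside a small radius
# for a set amount of time. Radius and timing values are invented.

import math

DWELL_RADIUS_PX = 40   # how far the cursor may wander and still count as "dwelling"
DWELL_TIME_S = 1.0     # how long the gaze must stay put before a click fires


def detect_dwell(samples, radius=DWELL_RADIUS_PX, dwell_time=DWELL_TIME_S):
    """samples: list of (timestamp_s, x, y) gaze points in screen pixels.
    Returns the (x, y) where the first dwell click fires, or None."""
    anchor = None  # (t, x, y) where the current dwell period started
    for t, x, y in samples:
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, x0, y0 = anchor
        if math.hypot(x - x0, y - y0) > radius:
            anchor = (t, x, y)      # gaze moved away: restart the dwell timer
        elif t - t0 >= dwell_time:
            return (x0, y0)         # gaze held long enough: click here
    return None


# Example: simulated 30 Hz gaze samples hovering around one point for ~1.2 s
gaze = [(i / 30, 500 + (i % 3), 300 - (i % 2)) for i in range(36)]
print(detect_dwell(gaze))   # -> (500, 300)
```

The trade-off is the one discussed above: a longer dwell time or larger radius makes selection more forgiving for users with less accuracy, at the cost of speed and larger effective targets.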

There is a switch input jack which will offer an alternative to Dwell for clicking, or could be set to another action (show the Assistive Touch menu, maybe). I assume you could also use the switch with iOS Switch Control, which might be a workaround for those who are not accurate enough to access smaller targets with the eye gaze device. It supports 5 and 9 point calibration to improve accuracy. I would like to see a 2 point calibration option, as 5 points can be a stretch for some early eyegaze users. It would also be nice if you could change the standard calibration dot to something more likely to engage a child (a cartoon dog, perhaps).

Technical specs are difficult to compare even between eye trackers on the same platform (Tobii v EyeTech, for example), so I'm not sure what value there would be in comparing this device with other Windows-based eye trackers. That said, two specs that will give us an indication of who this device may be appropriate for are sample rate and operating distance. Judging by the sample rate (given as 18Hz, max 30Hz), the Skyle captures roughly half the frames per second of its two main Windows-based competitors (Tobii: 30 FPS, TM5: 42 FPS). However, even 15 FPS should be more than enough for accurate mouse control. The operating distance (how far the device is from the user) for Skyle is 55 to 65 cm, which is about average for an eyegaze device. However, only offering a range of 10 cm (the Tobii range is 45 cm to 85 cm, so 40 cm), as well as the photo below showing the positioning guide, both indicate that this is not a solution for someone with even a moderate amount of head movement, as the track box (the area where eyes can be successfully tracked) seems to be very small.

The positioning guide in the Skyle app: a letterbox view of a person's eyes. It seems to indicate that only a couple of centimetres of movement is possible before going out of view.
Does the user have to keep their position within this narrow area, or does Skyle use facial recognition to adjust to the user's position? If it's the former, this solution will not be appropriate for users with even a moderate amount of head movement.

In summary, if you are a highly accurate eyegaze user with good head control and you don't wear glasses, Skyle could offer you efficient and direct hands-free access to your iPad Pro. It seems expensive at €2,500, especially if you don't already own a compatible iPad (add at least another €1,000 for an iPad Pro 12″). If you have been waiting for an eyegaze solution for iOS (as I know many people have), I would encourage you to wait a little longer. When the opportunity arises, try Skyle for yourself. By that time, there may be other options available.

If any of the assumptions made here are incorrect, or if there is any more information available on Skyle, please let us know and we will update this post.

AAC Awareness Month

October is AAC (Alternative and Augmentative Communication) Awareness Month. The goal is to raise awareness of AAC and to promote the many different ways in which people communicate using communication systems, both low and high tech. In order to celebrate this, some AAC companies offer discounts and special promotions. More should be announced in the coming weeks, and we will update this post to let you know!

Assistiveware will be offering a 50% discount on some of their most popular apps between the 14th and 16th October, including:

Proloquo2Go – a symbol-based AAC app, compatible with iPad, iPod, iPhone and the Apple Watch.

Proloquo4Text – a text-based AAC app, again available on the platforms mentioned above.

Keeble – A highly customisable keyboard for iPad, iPod and iPhone, with word prediction, accommodations for physical and visual difficulties, and a speak as you type feature.

Pictello – an app for creating visual stories and schedules for iPad, iPhone and iPod.

More information can be found here:

https://www.assistiveware.com/blog/save-the-date-aac-month-discount-2019

Liberator will also be offering a significant 50% discount between the 10th and 14th of October on two of their most popular apps:

LAMP Words for Life – an AAC app designed for those with autism, focusing on a motor planning approach to accessing vocabulary. Available on iPad, iPhone and iPod.

TouchChat AAC – a versatile app that uses both symbols and a keyboard to create messages, which can then be spoken aloud or shared through social media or email. Available on iPad, iPod and iPhone.

More information can be found at:

https://mailchi.mp/liberator/aacawarenessmonth-847477?e=2e9fcac5b6

Keep an eye on this post to see other discounts as they become available!

Voice Banking – ModelTalker

Voice banking involves recording a list of sentences into a computer. When enough recordings have been captured, software chops them up into individual sounds, or phonetic units. A synthetic voice can then be built out of these phonetic units; this is called concatenative speech synthesis. The number of sentences or statements needed to build a good quality English-language synthetic voice using this process varies, but it is somewhere between 600 and 3,500. This will take at least 8 hours of constant recording. Most people break it up over a few weeks, which is recommended, as voice quality will deteriorate over the course of a long session. So 20 minutes to half an hour in the morning (when most people's voices are clearer) would be a good approach. The more recordings made, the better the quality of the resulting voice will be.
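To make the concatenation step a little more concrete, here is a minimal Python sketch of the idea, assuming you already have a folder of recorded unit waveforms at a single sample rate. The folder path, unit naming scheme and sample rate are invented for illustration, and real systems (including the services listed below) also smooth the joins between units and adjust pitch and duration, none of which is shown here.

```python
# Minimal sketch of the concatenation step in concatenative synthesis.
# Assumes a (hypothetical) folder of pre-recorded unit recordings, one WAV
# file per diphone, all at the same sample rate. Real systems also smooth
# the joins and adjust pitch and duration; none of that is shown here.

import numpy as np
import soundfile as sf  # pip install soundfile

UNIT_DIR = "voice_bank/units"  # hypothetical path to the recorded units


def synthesise(diphones, unit_dir=UNIT_DIR, rate=22050):
    """diphones: e.g. ["_h", "he", "el", "ll", "lo", "o_"] for 'hello'."""
    chunks = []
    for d in diphones:
        audio, sr = sf.read(f"{unit_dir}/{d}.wav")
        assert sr == rate, "all units must share one sample rate"
        chunks.append(audio)
    return np.concatenate(chunks)


if __name__ == "__main__":
    wave = synthesise(["_h", "he", "el", "ll", "lo", "o_"])
    sf.write("hello.wav", wave, 22050)
```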

There are a number of services offering voice banking and we have listed some that we are aware of below. The technology used varies from service to service and this post isn’t intended to be a guide to which service may be appropriate to a particular user. Our advice would be to investigate all options before making a decision as this process will be a considerable investment of time and in some cases money.

A person might choose to bank their voice for a number of reasons. The most common reason would be a diagnosis of a progressive illness like Motor Neurone Disease (MND/ALS) or similar that will result in the loss of speech. A voice is a very personal thing, and being able to keep this aspect of individuality and identity can be important. The MND Association have detailed information on Voice Banking on their website here. People unable to speak from birth can also take advantage of this technology. The VocalID service (although expensive) seems to offer good options in this regard. A family member could donate their voice by going through the voice banking process (or the user could choose an appropriate donated voice). This synthetic voice could then be modified with filters modelled on the user's own vocalisations. The result is a unique and personal voice with some of the regional qualities (accent, pronunciation) that reflect their background and heritage. Irish AAC users have historically had little choice when it came to selecting a voice, most grudgingly accepting the upper-class BBC newsreader English voice that was ubiquitous in communication devices. In Ireland, where accents can vary significantly over such small geographical areas, how you speak is perhaps even more tied to your identity than in other countries. Hopefully in the near future we will be hearing AAC users communicating in Cork, Limerick and Dublin accents!

ModelTalker

For research purposes I used the ModelTalker service to create a synthetic voice. I wanted to see how well it dealt with the Irish accent. The ModelTalker service is run out of the Nemours Speech Research Laboratory (SRL) in the Nemours Center for Pediatric Auditory and Speech Sciences (CPASS) at the Alfred I. duPont Hospital for Children in Wilmington, Delaware. It is not a commercial service, costing only a nominal $100 to download your voice once banked. They offer an Online Recorder that works directly in the Chrome browser, or you can download and install their MTVR app if you are using Windows. The only investment you need to make to begin banking your voice is a decent quality USB headset. I used the Andrea NC-181 (about €35). For the best quality they recommend you record about 1600 sentences, but they can build a voice from 800. As this was just an experiment, I recorded the minimum 800. At the beginning of each session you go through a sound check. Consistency is an important factor contributing to the overall quality of the finished voice, which is why you need to keep using the same computer and microphone throughout the whole process, ideally in the same location. When you begin you will hear the first statement read out; you then record the statement yourself. A colour code gives you feedback on whether the recording was acceptable or not: red means it wasn't good enough to use and you should try again, yellow means okay but could be better, and green means perfect, move on. I found the Irish accent resulted in a lot of yellow. Don't let this worry you too much. A nice feature for Irish people who want to engage in this process is the ability to record custom sentences. They recommend that you at least record your own name. So many names and places in Ireland are anglicised versions of Irish that it would be worthwhile spending a bit of time on these custom sentences. "Siobhán is from Drogheda", for example, would be incomprehensible using most text-to-speech voices. At the end of each session you upload your completed sentences, which are added to your inventory (if using the browser-based recorder they are added as you go). When you feel you have enough completed you can request your voice. When the voice is ready you need to audition it; this process allows you to fine-tune how it sounds. I made a screen recording of this process and I will add it to this post when I have edited it down to a manageable length.
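ModelTalker does not publish the exact criteria behind its red/yellow/green feedback, so the toy check below is purely an illustration of the kind of thing an automated recording gate might look at (clipping and overall level). The thresholds are invented and should not be read as ModelTalker's actual rules.

```python
# Toy recording check, purely illustrative: NOT ModelTalker's actual
# red/yellow/green criteria, which are not published. It only flags the
# obvious problems a recorder might reject: clipping and a very quiet take.

import numpy as np


def rate_recording(samples):
    """samples: mono float audio scaled to the range [-1.0, 1.0]."""
    peak = np.max(np.abs(samples))
    rms = np.sqrt(np.mean(samples ** 2))
    if peak >= 0.99 or rms < 0.01:
        return "red"      # clipped, or so quiet it is probably unusable
    if rms < 0.05:
        return "yellow"   # usable, but on the quiet side
    return "green"


# Example with synthetic audio: a healthy-level 220 Hz tone rates "green"
t = np.linspace(0, 1, 22050)
print(rate_recording(0.3 * np.sin(2 * np.pi * 220 * t)))
```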

Click play below to hear a sample of my synthesised voice. Yes, unfortunately I do kind of sound like that 🙂

Speech synthesis is an area of technology that is progressing rapidly thanks to the interest of big multinationals like Google (listen to their DeepMind powered WaveNet Voices here) and Adobe (caused a stir and even concern in some quarters with project VoCo in 2016). Looking at the two previous examples it’s not hard to imagine that a high quality unique voice could be built from a short sample in the near future.

Voice Banking Services

More about Voice Banking

Good resource on Voice Banking from the MND Association: https://www.mndassociation.org/forprofessionals/aac-for-mnd/voice-banking/

Recent article from the Guardian about Voice Banking, focusing on VocalID: https://www.theguardian.com/news/2018/jan/23/voice-replacement-technology-adaptive-alternative-communication-vocalid

Tobii buys SmartBox – What might this mean for computer access and AAC?

Big news (in the AT world anyway) may have arrived in your mailbox early last week. It was announced that leading AAC and Computer Access manufacturer Tobii had purchased SmartBox AT (Sensory Software), developers of The Grid 3 and Look2Learn. As well as producing these very popular software titles, SmartBox were also a leading supplier of a range of AAC and Computer Access hardware, including their own GridPad and PowerPad ranges. Basically (in this part of the world at least) they were the two big guns in this area of AT, between them accounting for maybe 90% of the market. An analogy using soft drink companies would be that this is like Coca-Cola buying Pepsi.

Before examining what this takeover (or amalgamation?) means for their customers going forward, it is worth looking back at what each company has historically done well. This way we can hopefully paint a more optimistic picture of the future for AT users than the one suggested by what might be considered a potential monopoly.

Sensory Software began life in 2000 in the spare bedroom of founder Paul Hawes. Paul had previously worked for AbilityNet and had 13 years' experience in the area of AT. Early software like GridKeys and The Grid had been very well received and the company continued to grow. In 2006 they set up Smartbox to concentrate on complete AAC systems while sister company Sensory Software concentrated on developing software. In 2015 both arms of the company joined back together under the SmartBox label. By this time their main product, the Grid 3, had established itself as a firm favourite with Speech and Language Therapists (SLTs) for the wide range of communication systems it supported, and with Occupational Therapists and AT professionals for its versatility in providing alternative input options to Windows and other software. Many companies would have been satisfied with providing the best product on the market; however, there were a couple of other areas where SmartBox also excelled. They may not have been the first AT software developers to harness the potential resources of their end users (they may also have been; I would need to research that further) but they were certainly the most successful. They succeeded in creating a strong community around the Grid 2 and 3, with a significant proportion of the online grids available to download being user generated. Their training and support was also second to none. Regular high quality training events were offered throughout Ireland and the UK, and whether by email, phone or the chat feature on their website, their support was always top quality. Their staff clearly knew their product inside out, responses were timely and they were always a pleasure to deal with.

Tobii have been around since 2001. The Swedish firm actually started with eyegaze: three entrepreneurs – John Elvesjö, Mårten Skogö and Henrik Eskilsson – recognised the potential of eye tracking as an input method for people with disabilities. In 2005 they released the MyTobii P10, the world's first computer with built-in eye tracking (and I've no doubt there are a few P10 devices still in use). What stood out about the P10 was the build quality of the hardware; it was built like a tank. While Tobii could be fairly criticised for under-specifying their all-in-one devices in terms of processor and memory, the build quality of their hardware is always top class. Over the years Tobii have grown considerably, acquiring Viking Software AS (2007), Assistive Technology Inc. (2008) and DynaVox Systems LLC (2014). They have grown into a global brand with offices around the world. As mentioned above, Tobii's main strength is that they make good hardware. In my opinion they make the best eye trackers and have consistently done so for the last 10 years. Their AAC software has also come on considerably since the DynaVox acquisition. While Communicator always seemed to be a pale imitation of the Grid (apologies if I'm being unfair, but certainly true in terms of its versatility and ease of use for computer access), it has steadily been improving. Their newer Snap + Core First AAC software has been a huge success, and for users just looking for a communication solution it would be an attractive option over the more expensive (although much fuller featured) Grid 3. Alongside Snap + Core they have also brought out a "Pathways" companion app. This app is designed to guide parents, caregivers and communication partners in best practices for engaging Snap + Core First users. It supports the achievement of communication goals through video examples, lesson plans, an interactive goals grid for tracking progress, and a suite of supporting digital and printable materials. It is a really useful resource which will help to empower parents and prove invaluable to those not lucky enough to have regular input from an SLT.

To sum things up: we had two great companies, both with outstanding products. I have recommended the combination of the Grid software and a Tobii eye tracker more times than I can remember. The hope is that Tobii can keep the Grid on track and incorporate the outstanding support and communication that was always an integral part of SmartBox's operation. With the addition of Tobii's hardware expertise and recent research-driven progress in the area of AAC, there should be a lot to look forward to in the future.

If you are a Grid user and you have any questions or concerns about this news, true to form, the communication lines are open. There is some information at this link and at the bottom of the page you can submit your question.

Lost Voice Guy – Winner of Britain’s Got Talent 2018


A few weeks ago, Lee Ridley (a.k.a. Lost Voice Guy) became the first comedian to win Britain's Got Talent, now in its 12th year. As well as outshining his competitors along the way, Lee won with a clear margin and was a favourite with both the judges and the public.

 

What makes Lee's win even more incredible is the fact that he is the first person with a disability to win the show. For a stand-up comedian, being able to connect with your audience is essential, and he did this with self-deprecating humour, fantastic delivery and some killer one-liners, all done through the use of Alternative and Augmentative Communication (AAC).

 

AAC provides a means of communication for those whose speech is not sufficient to communicate functionally in all environments and with all partners. Lee uses a combination of two devices to support his communication – an iPad with apps, and a dedicated device called a Lightwriter.

 

Lee has been on the comedy circuit since 2012, and has won prestigious prizes, including the BBC Radio New Comedy Awards in 2014. Below is an interview that Lee participated in, via email, with Karl O’Keeffe back in 2013, which gives some insights into his process and the unique challenges that using a synthesised voice can present.

 

Check out Lee's other work on his YouTube channel (www.youtube.com/user/LostVoiceGuy) – be prepared to laugh your socks off!


 

Karl: You are the first person ever to do stand up comedy who uses a communication device, so you had nobody to learn from. What are the most important techniques and tricks you have learned so far that you wish someone had told you when you were starting?

 

Lee: I think one of the most important techniques that I have learnt is how to deal with timing. Obviously it’s pretty hard to know when to leave pauses for laughter and stuff, especially as I have to pre plan this. I can pause whenever I want but you have to be ready to pause when people laugh otherwise the start of the next bit gets lost or they don’t laugh as long. You sort of have to know when it’s coming so you’re ready for it. Obviously every audience is different so I’m never going to get it right every time.  I think I’m getting better at anticipating when to pause though.

 

Karl: I see from your videos that you use both a Lightwriter and an iPad. Can you tell me which is better for stand-up comedy?

 

Lee: I use my iPad for my stand up and I use my Lightwriter for day to day conversations. I just find that my iPad is slightly easier to understand. It is also easier to find my material on the iPad and because it backs up to the cloud, it's a bit more secure and means I can use any Apple device. It's also a bit sexier than my Lightwriter.

 

Karl: Do you always use the same voice? Why is the voice important in your performance?

 

Lee: I use the same voice mostly yes. However I do use other voices in my act as well for comedy purposes. For example, I use a woman’s voice to do an impression of my mother. I think that my main voice is important to me because it has become ‘my’ voice. It’d be weird if I changed it now.

 

Karl: What app do you use on the iPad for communication?

 

Lee: I use Proloquo2go, which is a brilliant app. It is very complex but easy to use at the same time. It does everything that I need it to do really.

 

Karl: What is your favourite app on the iPad?

 

Lee: I tweet quite a lot so I tend to use Tweetbot all the time. I couldn’t get through long train journeys with the Spotify app either!

 

Karl: Do you use any other Assistive Technology (computer access etc.)?

 

Lee: No. I only use Proloquo2go on my iPad and iPhone and then my Lightwriter.

 

Bloom 2017 ‘No Limits’ Grid Set

You may have heard about or seen photos of Enable Ireland's fantastic "No Limits" garden at this year's Bloom festival. Some of you were probably even lucky enough to have actually visited it in the Phoenix Park over the course of the Bank Holiday weekend. In order to support visitors, but also to allow those who didn't get the chance to go to share in some of the experience, we put together a "No Limits" Bloom 2017 Grid. If you use the Grid (2 or 3) from Sensory Software, or you know someone who does, and you would like to learn more about the range of plants used in Enable Ireland's garden, you can download and install it by following the instructions below.

How do I install this Grid?

If you are using the Grid 3 you can download and install the Bloom 2017 Grid without leaving the application. From Grid explorer:

  • Click on the Menu Bar at the top of the screen
  • In the top left click the + sign (Add Grid Set)
  • A window will open (pictured below). In the bottom corner click on the Online Grids button (you will need to be connected to the Internet).

grid 3 screen shot

  • If you do not see the Bloom2017 Grid in the newest section you can either search for it (enter Bloom2017 in the search box at the top right) or look in the Interactive learning or Education Categories.

If you are using the Grid 2, or you want to install this Grid on a computer or device that is not connected to the Internet, you can download the Grid set at the link below. You can then add it to the Grid as above, except select the Grid Set File tab and browse to where you have the Grid set saved.

For Grid 2 users:

Download Bloom 2017 Grid here https://grids.sensorysoftware.com/en/k-43/bloom2017

Boardmaker Online now launched in Ireland

Tobii Dynavox have recently launched their new Boardmaker Online product in Ireland through SafeCare Technologies. It has all the functionality of previous versions of Boardmaker, except that now it's web-based you don't need any disks and multiple users can access it from any PC.

Instructor showing students how to use Boardmaker Online

You can purchase a Personal, Professional or District account, and the amount you pay depends on the type of account, the number of "instructors" and how many years you want to sign up for. You can also get a discount for any old Boardmaker disks that you want to trade in.

You get all the symbols that have been available in past versions, as well as some new symbol sets, and any new ones that are created in the future will also be given to you. Because it's web-based, you have access to previously created activities via the online community, and you can upload activities you create yourself to that community and share them with other people in your district or all over the world.

Because it’s no longer tied to one device, you can create activities on your PC and assign them to your “students” who can use them either in school and/or at home. You no longer need to have a user’s device in your possession to update their activities and they don’t need to have a period without their device while you do this.

You (and the other instructors in your district if you have a district licence) can also assign the same activity to many students and by having different accessibility options set up for different students, the activity is automatically accessible for their individual needs. For example, you could create an activity and assign it to a student who uses eye gaze and to a student who uses switches and that activity will show up on their device in the format that’s accessible for them.

Picture shows how instructors can assign Boardmaker Online activities to multiple students

The results of students’ work can be tracked against IEP or educational goals which then helps you decide what activities would be suitable to assign next. You can also track staff and student usage.

One limitation is that you can only create activities on a Windows PC or Mac. You can play activities on an iPad using the free app but not create them on it, and you can’t use Boardmaker Online to either create or play activities on an Android or Windows-based tablet.

The other point to mention is that because it’s a subscription-based product, the payment you have to make is recurring every year rather than being a one-off payment, which may not suit everyone.

However, with the new features it’s definitely worth getting the free 30-day trial and deciding for yourself if you’d like to trade in your old Boardmaker disks for the new online version!

GazeSpeak & Microsoft’s ongoing efforts to support people with Motor Neuron Disease (ALS)

Last Friday (February 17th) New Scientist published an article about a new app in development at Microsoft called GazeSpeak. Due to be released over the coming months on iOS, GazeSpeak aims to facilitate communication between a person with MND (known as ALS in the US; I will use both terms interchangeably) and another individual, perhaps their partner, carer or friend. Developed by Microsoft intern Xiaoyi Zhang, GazeSpeak differs from traditional approaches in a number of ways. Before getting into the details, however, it's worth looking at the background: GazeSpeak didn't come from nowhere, it's actually one of the products of some heavyweight research into Augmentative and Alternative Communication (AAC) that has been taking place at Microsoft over the last few years. Since 2013, inspired by football legend and ALS sufferer Steve Gleason (read more here), Microsoft researchers and developers have brought the weight of their considerable collective intellect to bear on the subject of increasing the ease and efficiency of communication for people with MND.

Last year Microsoft Research published a paper called "AACrobat: Using Mobile Devices to Lower Communication Barriers and Provide Autonomy with Gaze-Based AAC" (abstract and PDF download at the previous link), which proposed a companion app to allow an AAC user's communication partner to assist (in a non-intrusive way) in the communication process. Take a look at the video below for a more detailed explanation.

This is an entirely new approach to increasing the efficiency of AAC, and one that, I suggest, could only have come from a large mainstream tech organisation with over thirty years' experience facilitating communication and collaboration.

Another Microsoft research paper published last year (with some of the same authors as the previous paper), called "Exploring the Design Space of AAC Awareness Displays", looks at the importance of a communication partner's "awareness of the subtle, social, and contextual cues that are necessary for people to naturally communicate in person". Their research focused on creating a display that would allow the person with ALS to express things like humour, frustration and affection, emotions that are difficult to express with text alone. Yes, they proposed the use of emoji, which are a proven and effective way a similar difficulty is overcome in remote or non face-to-face interactions; however, they went much further and also looked at solutions like avatars, skins and even coloured LED arrays. This, like the paper above, is an academic paper and as such not an easy read, but the ideas and solutions being proposed by these researchers are practical and will hopefully filter through to end users of future AAC solutions.

That brings us back to GazeSpeak, the first fruit of the Microsoft/Steve Gleason partnership to reach the general public. Like the AACrobat solution outlined above, GazeSpeak gives the communication partner a tool rather than focusing on tech for the person with MND. As the image below illustrates, the communication partner has GazeSpeak installed on their phone and, with the app running, they hold their device up to the person with MND as if they were photographing them. They suggest a sticker with four grids of letters is placed on the back of the smartphone, facing the speaker. The app then tracks the person's eyes: up, down, left or right, where each direction means the letter they are selecting is contained in the grid in that direction (see photo below).

man looking right, other person holding smartphone up with gazespeak installed

Similar to how the old T9 predictive text worked, GazeSpeak selects the appropriate letter from each group and predicts the word based on the most common English words. So the app is using AI in the form of machine vision to track the eyes, and also to make the word prediction. In the New Scientist article they mention that the user would be able to add their own commonly used words and people/place names, which one assumes would prioritise them within the prediction list. In the future perhaps some capacity for learning could be added to further increase efficiency. After using this system for a while the speaker may not even need to see the sticker with letters; they could write words from muscle memory. At this stage a simple QR code leading to the app download would allow them to communicate with complete strangers using just their eyes and no personal technology.
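To make the T9 comparison concrete, here is a rough Python sketch of how that kind of group-based decoding could work. The letter groupings and the tiny word list are invented for illustration and are not Microsoft's actual layout or lexicon: each eye movement only narrows the letter down to one of four groups, and the word is recovered by matching the group sequence against a frequency-ranked word list.

```python
# Rough sketch of the T9-style decoding described above: an assumption about
# how it could work, not Microsoft's actual GazeSpeak code. The letter
# groupings and the tiny word list are invented for illustration.

# Hypothetical split of the alphabet across the four sticker grids.
GROUPS = {
    "up":    set("abcdefg"),
    "right": set("hijklmn"),
    "down":  set("opqrstu"),
    "left":  set("vwxyz"),
}

# A real app would use a large frequency-ranked lexicon plus the user's own
# names and places; this short list is just a stand-in.
LEXICON = ["the", "cat", "car", "can", "hello", "water", "siobhan"]


def group_of(letter):
    """Return which grid (direction) a letter belongs to."""
    return next(d for d, letters in GROUPS.items() if letter in letters)


def predict(directions, lexicon=LEXICON):
    """directions: one gaze direction per letter, e.g. ["up", "up", "down"].
    Returns candidate words in lexicon (frequency) order."""
    return [w for w in lexicon
            if len(w) == len(directions)
            and all(group_of(ch) == d for ch, d in zip(w, directions))]


print(predict(["up", "up", "down"]))   # -> ['cat', 'car']
```

A real implementation would rank a much larger lexicon by frequency and push the user's own words and names towards the top of the candidate list, as the New Scientist article describes.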

UPDATE (August 2018): GazeSpeak has been released for iOS and is now called SwipeSpeak. Download here. For more information on how it works or to participate in further development have a look at their GitHub page here.