Boardmaker Online now launched in Ireland

Tobii Dynavox have recently launched their new Boardmaker Online product in Ireland through SafeCare Technologies. It has all the functionality of previous versions of Boardmaker, except that now it’s web-based: you don’t need any disks and multiple users can access it from any PC.

Instructor showing students how to use Boardmaker Online

You can purchase a Personal, Professional or District account, and the amount you pay depends on the type of account, the number of “instructors” and how many years you want to sign up for. You can also get a discount for any old Boardmaker disks you want to trade in.

You get all the symbols that were available in past versions, as well as some new symbol sets, and any new sets created in the future will also be included. Because it’s web-based, you have access to previously created activities via the online community, and you can upload activities you create yourself and share them with other people in your district or all over the world.

Because it’s no longer tied to one device, you can create activities on your PC and assign them to your “students” who can use them either in school and/or at home. You no longer need to have a user’s device in your possession to update their activities and they don’t need to have a period without their device while you do this.

You (and the other instructors in your district if you have a district licence) can also assign the same activity to many students, and because different accessibility options can be set up for each student, the activity is automatically accessible for their individual needs. For example, you could create an activity and assign it to a student who uses eye gaze and to a student who uses switches, and it will show up on each student’s device in the format that’s accessible for them.

Picture shows how instructors can assign Boardmaker Online activities to multiple students

The results of students’ work can be tracked against IEP or educational goals, which helps you decide what activities would be suitable to assign next. You can also track staff and student usage.

One limitation is that you can only create activities on a Windows PC or Mac. You can play activities on an iPad using the free app but not create them on it, and you can’t use Boardmaker Online to either create or play activities on an Android or Windows-based tablet.

The other point to mention is that because it’s a subscription-based product, the payment you have to make is recurring every year rather than being a one-off payment, which may not suit everyone.

However, with the new features it’s definitely worth getting the free 30-day trial and deciding for yourself if you’d like to trade in your old Boardmaker disks for the new online version!

Eye gaze controlled Power wheelchair

eye controlled wheelchair via tablet mounted on wheelchair

EyeTech Digital Systems has partnered with Quantum Rehab to bring eye-controlled wheelchairs to individuals who are unable to use hand controls. EyeTech’s eye tracking technology mounts directly to a tablet PC and allows the user to control the entire computer using eye movements. The system then mounts to the wheelchair. An eye control driving app gives the user the ability to drive hands-free: the driving controls are overlaid on the scene camera, and simply looking at them activates the basic directions and movements of the chair.

Quantum Rehab® products include a range of rehab mobility technologies such as the Q6 Edge® 2.0 and Quantum Series of power bases. www.QuantumRehab.com

EyeTech Digital Systems products include eye tracking technology for a variety of markets such as medical, transportation, entertainment, and augmentative communication. http://www.eyetechds.com/

Irish suppliers

LC Seating and MMS Medical.

GazeSpeak & Microsoft’s ongoing efforts to support people with Motor Neuron Disease (ALS)

Last Friday (February 17th) New Scientist published an article about a new app in development at Microsoft called GazeSpeak. Due to be released over the coming months on iOS, GazeSpeak aims to facilitate communication between a person with MND (known as ALS in the US; I will use both terms interchangeably) and another individual, perhaps their partner, carer or friend. Developed by Microsoft intern Xiaoyi Zhang, GazeSpeak differs from traditional approaches in a number of ways. Before getting into the details, however, it’s worth looking at the background. GazeSpeak didn’t come from nowhere; it’s actually one of the products of some heavyweight research into Augmentative and Alternative Communication (AAC) that has been taking place at Microsoft over the last few years. Since 2013, inspired by football legend and ALS sufferer Steve Gleason (read more here), Microsoft researchers and developers have brought their considerable collective intellect to bear on increasing the ease and efficiency of communication for people with MND.

Last year Microsoft Research published a paper called “AACrobat: Using Mobile Devices to Lower Communication Barriers and Provide Autonomy with Gaze-Based AAC” (abstract and pdf download at previous link), which proposed a companion app to allow an AAC user’s communication partner to assist (in a non-intrusive way) in the communication process. Take a look at the video below for a more detailed explanation.

This is an entirely new approach to increasing the efficiency of AAC and one that, I suggest, could only have come from a large mainstream tech organisation with over thirty years’ experience facilitating communication and collaboration.

Another Microsoft research paper published last year (with some of the same authors as the previous paper), called “Exploring the Design Space of AAC Awareness Displays”, looks at the importance of a communication partner’s “awareness of the subtle, social, and contextual cues that are necessary for people to naturally communicate in person”. Their research focused on creating a display that would allow the person with ALS to express things like humour, frustration and affection: emotions that are difficult to express with text alone. Yes, they proposed the use of Emoji, a proven and effective way of overcoming a similar difficulty in remote or non face-to-face interactions, but they went much further and also looked at solutions like avatars, skins and even coloured LED arrays. This, like the paper above, is an academic paper and as such not an easy read, but the ideas and solutions being proposed by these researchers are practical and will hopefully filter through to end users of future AAC solutions.

That brings us back to GazeSpeak, the first fruit of the Microsoft/Steve Gleason partnership to reach the general public. Like the AACrobat solution outlined above, GazeSpeak gives the communication partner a tool rather than focusing on tech for the person with MND. As the image below illustrates, the communication partner has GazeSpeak installed on their phone and, with the app running, holds their device up to the person with MND as if they were photographing them. The developers suggest a sticker with four grids of letters be placed on the back of the smartphone, facing the speaker. The app then tracks the person’s eyes: up, down, left or right, with each direction meaning the letter they are selecting is contained in the grid in that direction (see photo below).

man looking right, other person holding smartphone up with gazespeak installed

Similar to how the old T9 predictive text worked, GazeSpeak selects the appropriate letter group from each eye movement and predicts the word based on the most common English words. So the app is using AI in the form of machine vision to track the eyes, and also to make the word prediction. In the New Scientist article they mention that the user will be able to add their own commonly used words and people/place names, which one assumes would prioritise them within the prediction list. In the future perhaps some capacity for learning could be added to further increase efficiency. After using this system for a while the speaker may not even need to see the sticker with letters; they could write words from muscle memory. At this stage a simple QR code leading to the app download would allow them to communicate with complete strangers using just their eyes and no personal technology.
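To make the T9 comparison concrete, here is a minimal sketch of how prediction from four letter groups could work. The assignment of letters to gaze directions and the tiny word list are illustrative assumptions, not GazeSpeak’s actual layout or dictionary.

```python
# Sketch of T9-style word prediction from four gaze directions.
# Each eye movement narrows the next letter to one group; whole-word
# candidates are then matched against a frequency-ranked dictionary.

# Hypothetical assignment of letters to the four gaze directions
GROUPS = {
    "up":    set("abcdef"),
    "down":  set("ghijkl"),
    "left":  set("mnopqrs"),
    "right": set("tuvwxyz"),
}

# Reverse lookup: which direction selects each letter
LETTER_TO_DIR = {ch: d for d, letters in GROUPS.items() for ch in letters}

# Toy dictionary, ordered most common first (a real app would use a
# large corpus plus the user's own added words and names)
DICTIONARY = ["the", "and", "tea", "she", "ago", "all"]

def predict(directions):
    """Return dictionary words consistent with the gaze-direction sequence."""
    return [
        word for word in DICTIONARY
        if len(word) == len(directions)
        and all(LETTER_TO_DIR[ch] == d for ch, d in zip(word, directions))
    ]
```

With this grouping, the sequence right, down, up matches “the”: t is in the right-hand group, h in the down group and e in the up group. Adding a user’s own words would simply mean placing them earlier in the ranked list.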

2016 – Technology Trends and Assistive Technology (AT) Highlights

As we approach the end of 2016 it’s an appropriate time to look back and take stock of the year from an AT perspective. A lot happened in 2016, not all of it good. Socially, humanity seems to have regressed over the past year. Maybe this short-term, inward-looking protectionist sentiment has been brewing for longer, but 2016 brought the opportunity to express it politically; you know the rest. While society steps back, technology continues to leap and bound forward, and 2016 has seen massive progress in many areas, particularly those associated with Artificial Intelligence (AI) and Smart Homes. This is the first in a series of posts examining some technology trends of 2016 and how they affect the field of Assistive Technology. The links will become active as the posts are added. If I’m missing something please add it to the comments section.

Dawn of the Personal Digital Assistants

Game Accessibility

Inbuilt Accessibility – AT in mainstream technology 

Software of the Year – The Grid 3

Open Source AT Hardware and Software

The Big Life Fix

So although 2016 is unlikely to be looked on kindly by future historians (you know why), it has been a great year for Assistive Technology, perhaps one of promise rather than realisation. One major technology trend of 2016 missing from this series of posts is Virtual (or Augmented) Reality. While VR was everywhere this year, with products coming from Sony, Samsung, Oculus and Microsoft, its usefulness beyond gaming is only beginning to be explored (particularly within education).

So what are the goals for next year? Harnessing some of these innovations in a way that makes them accessible and usable by people with disabilities at an affordable price. If in 2017 we can start putting some of this tech into the hands of those who stand to benefit most from its use, then next year will be even better.

Tobii Power Clinic Online Webinars

Tobii Dynavox, the AAC arm of Tobii Group, the industry-leading eye gaze/tracking solutions manufacturer, are currently producing a series of webinars on their range of products. Although they have completed 8 webinars to date, they are all recorded and available on YouTube and the Power Clinic Online page. Taken together these are a fantastic resource for anybody interested in eye gaze input. While some are product specific (for example “Getting Started With Communicator“), others are more general and will have content of interest to both therapists and technicians working in the area (Amazing Calibrations for Eyegaze, AAC and Autism – Strategies for challenging behaviour & EyeGaze Strategies for Reading!).

Tobii users, especially those new to the technology, will be particularly interested in Eyegaze Windows Control for beginners! and Advanced Windows Control (eyegaze). If you would rather the live experience, make sure to sign up for next week’s webinar, AAC AND MUSIC (PERFECT SYMPHONY), where you will be able to ask questions and contribute to the discussion.

We have created a playlist of all the Tobii webinars to date on our YouTube channel if that makes it easier for you to access and share them (click here).

Assistive Technology (AT) in the era of the Digital Schoolbag

child using tablet computer to study biology- zooming in on screen on mid section of human skeleton

Increasingly schools are opting for what is sometimes termed a digital schoolbag. This involves the purchase of an electronic device, usually an iPad, with a package of digital textbooks pre-installed. Digital textbooks are undoubtedly a step in the right direction in terms of accessibility and are indeed essential for many students with disabilities. There are students, however, who may need to use a different platform (hardware and/or operating system – OS) because of compatibility issues with their Assistive Technology. Currently the most popular platform being adopted by schools is Apple iOS, with parents being directed to purchase an iPad from a contracted supplier. Many readers of this article will be well aware of all the great inbuilt accessibility features within iOS; however, if you are a user of eye gaze or speech recognition (for access), iOS does not currently support your chosen AT.

It is understandable why, from a school’s perspective, having all students use identical standardised devices would be preferable, and there are plenty of reasons why Apple iOS would be the obvious choice. There is a concern, however, that the small minority who may need to use other platforms because of access difficulties could be put at a disadvantage or perhaps not be able to participate fully in all activities. One of the leading school suppliers has assured us that the textbooks can be accessed on Windows, iOS and Android, and as these textbooks are sourced from the same few publishers one can assume this applies to all suppliers. It is therefore up to the schools to ensure all lessons utilising technology are identical whenever possible, and equivalent when not, regardless of the device or platform a student is using. Parents, particularly those whose children use Assistive Technology, should not feel pressured by schools to purchase technology that isn’t the optimum for their child’s needs. If a therapist or AT specialist has recommended a particular solution that differs from what is being suggested by the school, the priority should obviously be the student’s needs. When it comes to AT it is the school’s responsibility to accommodate the different needs of its students, just as it was before the digital schoolbag. The use of technology within our schools is to be embraced, but it is important that schools ensure the curriculum is open and in no part dependent on one particular platform or device. That would just see us swapping one form of inequality for another, and that’s not progress.

If anyone would like advice on what technologies are available to support access, literacy and productivity on any platform they should feel free to contact us here in the National Assistive Technology Service in Sandymount, Dublin.

Accessible Gaming & Playing Agario with your Eyes

We in Enable Ireland Assistive Technology Training Service have long recognised the importance of gaming to many young, and not so young, assistive technology users. It’s a difficult area for a number of reasons. Firstly, games (and we are talking about video games here) are designed to be challenging: if they are too easy they’re not fun, but if too difficult the player will also lose interest. Successful games manage to get the balance just right. When it comes to physical dexterity, as well as the other skills required for gaming (strategy, spatial awareness, timing), this often involves game designers taking a one-size-fits-all approach which frequently doesn’t include people with physical, sensory or cognitive difficulties. There are two methods of getting around this which, when taken together, ensure a game can be accessed and enjoyed by a much broader range of people: difficulty levels (not a new concept) and accessibility features (sometimes called assists). Difficulty levels are self-explanatory and have been a feature of good games for decades. Accessibility features might include the ability to remap buttons (useful for one-handed gamers), automate certain controls, subtitles, high contrast and magnification.

Another challenge faced when creating an alternative access solution to allow someone to successfully play a video game is that you need a pretty good understanding of the activity: how to play the game. This is where we often have difficulty, and I’d imagine other non-specialist services (general AT services rather than game accessibility specialists like SpecialEffect or OneSwitch.org.uk) also run into problems. We simply do not have the time required to familiarise ourselves with the games or keep up to date with new releases (which would allow us to better match a person with an appropriate game for their range of ability). We try to compensate for this by enlisting the help of volunteers (often from Enable Ireland’s IT department, with whom we share office space), interns and transition year students. It’s often the younger transition year students who bring us some of the best suggestions, and last week was no exception. After we demonstrated some eyegaze technology to Patrick, a transition year student visiting from Ardscoil Ris, Dublin 9, he suggested we take a look at a browser-based game called agar.io. I implore you not to click that link if you have work to get done today. This game is equal parts addictive and infuriating, but in terms of playability and simplicity it’s also very accessible, with simple controls and a clear objective. Using your mouse you control a little (at first) coloured circular blob; think of it as a cell, and the aim of the game is to eat other little coloured cells and grow. The fun part is that other players from every corner of the globe are also controlling cells and growing; if they are bigger than you they move a little slower but can eat you! Apart from the mouse there are two other buttons: the spacebar allows you to split your cell (which can be used as an aggressive or defensive strategy) and the “W” key allows you to shed some weight.
We set up the game to be played with a Tobii EyeX (€119) and IRIS software (€99). IRIS allows you to emulate the mouse action with your eyes and set up two onscreen buttons (called interactors) that can also be activated using your gaze; the video below should make this clearer.
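The interactor idea is essentially dwell selection: an onscreen region fires its action once your gaze has rested inside it for long enough. Here is a logic-only sketch of that mechanism, assuming nothing about IRIS’s actual implementation; the region layout, dwell time and callback are all illustrative.

```python
# Sketch of a dwell-activated "interactor": a screen region that fires
# its action once gaze samples have stayed inside it for a dwell period
# (e.g. a corner button that presses spacebar to split the cell in agar.io).

class Interactor:
    def __init__(self, x, y, w, h, action, dwell=0.8):
        self.rect = (x, y, w, h)      # region in screen coordinates
        self.action = action          # callback, e.g. synthesise a key press
        self.dwell = dwell            # seconds gaze must remain inside
        self._enter_time = None       # when the gaze entered the region

    def contains(self, gx, gy):
        x, y, w, h = self.rect
        return x <= gx < x + w and y <= gy < y + h

    def update(self, gx, gy, now):
        """Feed one gaze sample (position + timestamp); fire on dwell."""
        if self.contains(gx, gy):
            if self._enter_time is None:
                self._enter_time = now
            elif now - self._enter_time >= self.dwell:
                self.action()
                self._enter_time = None   # reset so it doesn't repeat-fire
        else:
            self._enter_time = None       # gaze left: abandon the dwell
```

In use, each gaze sample from the tracker would be fed to every interactor via `update()`; looking away before the dwell elapses cancels the selection, which is what makes this forgiving for players whose eyes are also steering the cell.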

Big thanks to Patrick for suggesting we take a look at Agar.io and helping us set it up for eyegaze control. I’ll leave the final words to him. “I found playing Agar.io with gaze software really fun. I think you have just as much control with your eyes as with your mouse. If an interactor was placed in the corner of the screen to perform the function of the spacebar (splits the cell in half) it would be beneficial. I believe it would be a very entertaining game for people who can only control their eyes, not their arms.”

Assistive Technology Webinars


webinar graphic

Are you looking for free expert training and advice in assistive technology?

Then consider signing up for a webinar.  There are lots of webinars available within various areas of assistive technology.  Some have a charge, but there are many freely available for anyone to take part in.

A webinar is a live meeting that takes place over the web.  The meeting can be a presentation, discussion, demonstration, or an instructional session.  Participants can view documents and applications via their computers, while joining in the discussion by audio or via a live Q&A text area.

Many assistive technology suppliers and organisations are using webinars as a way to share information.  Below is a list of a few online webinars that you can register for, or whose archived sessions you can listen to.

Inclusive Technology

http://www.inclusive.co.uk/events/webinars

The Great Lakes ADA Center

http://www.ada-audio.org/Webinar/AccessibleTechnology/Schedule/#fy2015Session6

ATIA Online Professional Development

http://www.atia.org/i4a/member_directory/feResultsListing.cfm?directory_id=8&viewAll=1

Don Johnston Incorporated

http://donjohnston.com/webinars/#.VecAe_lViko

AbleNet University Live Webinars

https://www.ablenetinc.com/resources/live_webinars/

Iowa Assistive Technology Professional Development Network

https://www.education.uiowa.edu/centers/icater/webinars

Inclusive Technology AT Webinars

participant of webinar

Inclusive Technology have hosted a number of assistive technology webinars.  If you missed them, they have been recorded and are available to watch at the following link: recordings here!

Although you won’t have access to the live Q&A, the recordings will still be very useful to watch.  All webinars are FREE – all you require is an Internet connection and a computer.

Webinars include: Using Technology to Support Reading and Writing, iPad Access for Physical Disabilities, Using Tobii Eye Gaze and many more.

Eye Pointing and Using an ETRAN Frame

Child looking at an Etran Frame

For some non-verbal children, choosing pictures or symbols with their hands is not an option, due to conditions that make it difficult for them to reach, point to or grasp. For these children, we need to look at ways for them to select pictures or symbols, without undue effort or stress, so that they can communicate in an easy, timely manner.

Children naturally look at interesting pictures placed in front of them. They may not yet have the idea that looking intentionally at a picture causes a particular outcome (i.e. looking at the picture of the book results in a story being read to them), but they may gradually make this connection over time (cause and effect) if the outcome of their selection, whether intentional or not, is consistently carried out.

For children making both intentional and unintentional choices, eye pointing (selecting a picture or symbol by looking at it) can be a powerful way of communicating. While many children may start with eye pointing to items in their visual field, it may be necessary to formalise this system through the use of a consistent set-up. An Etran frame is one method of doing this.

An Etran frame is like a large square, see-through donut, with a space in the centre (see picture below). It is usually made from a transparent plastic material, such as Perspex. It sometimes has a base or is mounted on an arm, so that the facilitator does not need to hold it and has both hands free to interact with the child and their selections. The frame should be placed in front of the child, with picture symbols attached, facing toward the child. The hole in the middle is used by the facilitator to see the child and follow their gaze, to interpret what picture or symbol the child is looking at.

Positioning

The child should be positioned well, with the Etran frame square on, in front of them. If possible it should be mounted, to prevent it moving around.

The partner should be directly opposite the child and able to make eye contact through the hole in the centre of the frame.

Using the Etran frame

Plan to use 2-3 locations initially – with the top left and bottom right being the first positions to introduce pictures.

The bottom right will always be used as an “All Done”/”Stop” indicator. The vocabulary at the top will vary according to activity.

Etran frame

Make it fun

The child needs to get the idea right from the start that this is a good game. Use highly motivating objects, activities or toys and stick symbols directly onto the frame.

Attaching items to the frame

Blu-Tack or Velcro is really the simplest and best method.  If you stick on picture symbols, make sure that you write what they are on the back, so that you can see which is which! (You cannot read the child’s eye movements from the front if you have to keep popping around to their side of the frame, or “sitting on his or her shoulder”, to see what they are pointing at.)

Child looking through Etran frame

How many objects or pictures?

Not too few (boring), not too many (overwhelming). Two, then quickly onto three is good. Move up gradually.

First steps

Model the activity so that the child knows what they’re supposed to do. Let the child see you sticking pictures/symbols to the frame.

Prompting

When the child looks at the object for 1-2 seconds, use the selection frame around the symbol s/he looked at (for visual reinforcement), and say “you looked at this. It’s a ____. Will we use/do_____?” (for auditory reinforcement). The reason for introducing the selection frame is for the future, when the child may move onto a high tech device and the selection is indicated with a colour frame around their selection.

Scanning

If the child doesn’t look at the symbols, say, “Where is the _____? Let’s look for the _____. You help me find it”.

Take your finger and slowly point to the item fixed in the top-left position (top left from the child’s perspective; always start at this same position). Say “Is this it?” Move your finger slowly and smoothly along to the next item along the top of the frame, if a second picture is being used. Try to take the child’s eyes with you as you move your finger: “Is this it?” Keep doing this until you get to the one you want. Then say “Aha, we found it, look at the _____! Here it is!”

Take the selection frame and hold it around the symbol for 2-3 seconds, so that the child knows that this is the choice s/he has selected. For example, the picture below shows the “Stop/All done” symbol selected.

Etran frame

Ideas for using the Etran Frame.

Book reading – Use a “Turn the page” symbol in conjunction with the “All done/Stop” symbol. A second symbol for “Read it again” or “new book” can be introduced later.

Blowing bubbles/listening to music/any activity the child enjoys – use a “more” or “again” symbol with the “All done/Stop” symbol

Choice making – two symbols plus the “All done/Stop” symbol are required for this, i.e. book vs. teddy, walk vs. car, visit Granny vs. visit shops, jack-in-the-box vs. popper, etc. Even choosing clothes, colours, toys etc.

Playing games – Simon Says is a great game for group activities. Use a couple of actions such as “dance”, “blow raspberries”, “sing”, “wiggle your bum”, “hop” etc., so that the child can tell others what to do.

Use teddies/dolls for pretend play. Ask the child which s/he wants to play with first (“teddy” and “doll” symbols) and then choose what to do, i.e. “have a drink”, “have a bath”, “go to bed” etc.