Eye Control – Inbuilt EyeGaze Access for Windows 10

Just yesterday Microsoft announced what is possibly their biggest step forward in functionality within their Ease of Access accessibility settings since Windows 7. Eye Control is an inbuilt feature that facilitates access to the Windows 10 OS using a low-cost eyegaze peripheral, the Tracker 4C from Tobii. More below about what you can actually do with Eye Control, but first a little background on how this has come about.

Steve Gleason and his son

Former American Football professional and MND (ALS) sufferer Steve Gleason (above) challenged Microsoft in 2014 to help people affected by this degenerative condition through the advancement of eye tracking technology. This initial contact led to the development of a prototype eyegaze-controlled wheelchair, which received lots of publicity and generated increased awareness in the process. However, it was never likely to progress to a product that would be available to other people in a similar situation. What this project did achieve was to pique the interest of some of the considerable talent within Microsoft in the input technology itself and its applications, particularly for people with MND.

A combination of factors felt on both sides of the Atlantic has proved problematic when it comes to providing timely AT support to people diagnosed with MND. Eyegaze input is the only solution that allows successful computer access as the condition progresses, eye movement being the only ability left in the final stages of the illness. Historically, however, the cost of the technology meant that insurance, government funding or private fundraising was the only means by which people could pay for eyegaze equipment. This usually resulted in a significant delay which, due to the often aggressive nature of MND, meant valuable time was lost and the solution often arrived too late. This situation was recognised by Julius Sweetland, who back in 2015 led the development of OptiKey, an Open Source computer access/AAC solution designed to work with low-cost eye trackers. Interestingly, some of the innovative features of OptiKey seem to have made it to Eye Control on Windows 10 (multi-key selection, called Shape Writing in Eye Control – see gif below).

Demo of Shape Writing in Eye Control – it works like swiping on a touch keyboard: dwell on the first letter of a word, glance at subsequent letters, then dwell on the last letter, and the word is entered.
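To make that dwell/glance pattern concrete, here is a minimal, hypothetical Python sketch of how a shape-writing keyboard might pick candidate words from a gaze path. The word list and the matching rule are my own illustration, not Microsoft's actual decoder.

```python
# Hypothetical sketch (not Microsoft's actual decoder): a word is a candidate
# if its first and last letters match the dwelled keys and all of its letters
# appear, in order, along the path of glanced keys.

def is_subsequence(word, path):
    """True if every letter of `word` appears in `path` in order."""
    it = iter(path)
    return all(letter in it for letter in word)

def shape_write_candidates(gaze_path, dictionary):
    """gaze_path: the letters the gaze passed over; dwell marks first and last."""
    first, last = gaze_path[0], gaze_path[-1]
    return [w for w in dictionary
            if w[0] == first and w[-1] == last and is_subsequence(w, gaze_path)]

# Example: the gaze dwells on 'h', glides over 'e', 'q', 'l', 'l', dwells on 'o'.
words = ["hello", "halo", "help", "hero", "hollow"]
print(shape_write_candidates("heqllo", words))  # ['hello']
```

A real implementation would rank candidates with a language model rather than return every match, but the gist is the same: the noisy path over the keyboard narrows the dictionary down to a handful of plausible words.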

Since the initial Steve Gleason wheelchair hack there has been a steady stream of high-quality research papers coming from people at Microsoft on the subject of eyegaze input and MND solutions. This should have been a hint that something like Eye Control was on the horizon. Eyegaze input has promised to break into the mainstream several times over the last decade; however, with Eye Control and device support included in the core Windows OS, it has never been this close.

For more background on the path to Eye Control see this blog post from Microsoft: From Hack to Product, Microsoft Empowers People with Eye Control for Windows 10

To find out how to get early access to Eye Control, or for more information on its functionality, read this post from Tobii (be warned, there are still bugs): How to get started with Eye Control on Windows.

Hands-free Minecraft from Special Effect

Love it or hate it, the game of Minecraft has captured the imagination of over 100 million young, and not so young, people. It is available on multiple platforms (mobile devices via the Pocket Edition, Raspberry Pi, computer, Xbox or PlayStation) and it looks and feels pretty much the same on all of them. For those of us old enough to remember, the blocky graphics will hold some level of nostalgia for the bygone 8-bit days, when mere blobs of colour and our imagination were enough to render Ghosts and Goblins vividly. This is almost certainly lost on the main cohort of Minecraft players, however, who would most probably be bored silly by the two-dimensional, repetitive and predictable video games of the 80s and early 90s.

The reason Minecraft is such a success is that it has blended its retro styling with modern gameplay and a mind-bogglingly massive open world where no two visits are the same and there is room for self-expression and creativity. This latter quality has led it to become the first video game to be embraced by mainstream education, being used as a tool for teaching everything from history to health and empathy to economics.

It is, however, the former quality, the modern gameplay, that we are here to talk about. Unlike the aforementioned Ghosts and Goblins, Minecraft is played in a three-dimensional world using either the first-person perspective (you see through the character's eyes) or the third-person perspective (as if a camera were hovering above and slightly behind the character). While this undoubtedly offers a more immersive and realistic experience, it also means controlling the character and playing the game is much more complex and requires a high level of dexterity in both hands to be successful. For people without the required level of dexterity this means not only a risk of social exclusion, being unable to participate in an activity so popular among their peers, but also the possibility of being excluded within an educational context.

Fortunately, UK-based charity Special Effect have recognised this need and are in the process of doing something about it. Special Effect are a charity dedicated to enabling those with access difficulties to play video games through custom access solutions. Since 2007 their interdisciplinary team of clinical and technical professionals (and of course gamers) have been responsible for a wide range of bespoke solutions based on individuals' unique abilities and requirements. Take a look at this page for more information on the work they do and to see what a life-enhancing service they provide. The problem with this approach, of course, is reach, which is why their upcoming work on Minecraft is so exciting. Based on OptiKey, the Open Source eyegaze AAC/computer access solution by developer Julius Sweetland, Special Effect are in the final stages of developing an on-screen Minecraft keyboard that will work with low-cost eye trackers like the Tobii EyeX and the Tracker 4C (€109 and €159 respectively).

Minecraft on-screen keyboard

The inventory keyboard

Minecraft on-screen keyboards

The main Minecraft on screen keyboard

Currently being called 'Minekey', this solution will allow Minecraft to be played using a pointing device like a mouse or joystick, or even totally hands-free using an eyegaze device or headmouse. The availability of this application will ensure that Minecraft is now accessible to many of those who have previously been excluded. Special Effect were kind enough to let us trial a beta version of the software and, although I'm no Minecraft expert, it seemed to work great. The finished software will offer a choice of onscreen controls: one with smaller buttons and more functionality for expert eyegaze users (pictured above) and a more simplified version with larger targets. Bill Donegan, Projects Manager with Special Effect, told us they hope to have it completed and available to download for free by the end of the year. I'm sure this is news that will excite many people out there who had written off Minecraft as something just not possible for them. Keep an eye on Special Effect or ATandMe for updates on its release.

Bloom 2017 ‘No Limits’ Grid Set

You may have heard about or seen photos of Enable Ireland's fantastic "No Limits" garden at this year's Bloom festival. Some of you were probably even lucky enough to have visited it in the Phoenix Park over the course of the Bank Holiday weekend. In order to support visitors, but also to allow those who didn't get the chance to go to share in some of the experience, we put together a "No Limits" Bloom 2017 Grid. If you use the Grid (2 or 3) from Sensory Software, or you know someone who does, and you would like to learn more about the range of plants used in Enable Ireland's garden, you can download and install it by following the instructions below.

How do I install this Grid?

If you are using Grid 3 you can download and install the Bloom 2017 Grid without leaving the application. From Grid Explorer:

  • Click on the Menu Bar at the top of the screen
  • In the top left click the + sign (Add Grid Set)
  • A window will open (pictured below). In the bottom corner click on the Online Grids button (you will need to be connected to the Internet).

Grid 3 screenshot

  • If you do not see the Bloom2017 Grid in the Newest section, you can either search for it (enter Bloom2017 in the search box at the top right) or look in the Interactive Learning or Education categories.

If you are using the Grid 2, or you want to install this Grid on a computer or device that is not connected to the Internet, you can download the Grid set at the link below. You can then add it to the Grid as above, except select the Grid Set File tab and browse to where you have the Grid set saved.

For Grid 2 users:

Download Bloom 2017 Grid here https://grids.sensorysoftware.com/en/k-43/bloom2017

Makers Making Change – Canada provides $750,000 to fund development of Open Source AT

Makers Making Change have a mission: to "connect makers to people with disabilities who need assistive technologies". This is also our mission and something we've talked about before; it is likewise the goal of a number of other projects, including TOM Global and Enable Makeathon. Makers Making Change, which is run by Canadian NGO the Neil Squire Society and supported by Google.org, differs from previous projects sharing the same goal in a couple of ways. Firstly, their approach: they are currently concentrating their efforts on one particular project, the LipSync, and touring the North American continent holding events, called Buildathons, where groups of makers get together and build a quantity of these devices. This approach raises awareness about their project within the maker community while also ensuring they have plenty of devices in stock, ready to go out to anybody who needs them. Secondly, thanks to the recent promise from the Canadian government of funding to the tune of $750,000, they may be on the verge of bringing their mission into the mainstream.

Canada has always had a well-deserved reputation for being at the forefront of Assistive Technology and Accessibility. It is one of only a handful of nations the rest of the world looks to for best-practice approaches in the area of disability. For that reason this funding, announced by the Minister of Sport and Persons with Disabilities, Carla Qualtrough, may have a positive effect even greater than its significant monetary value, and far beyond Canada's borders. Minister Qualtrough stated the funding was "for the development of a network of groups and people with technical skills to support the identification, development, testing, dissemination and deployment of open source assistive technologies." Specifying that it is Open Source assistive technologies they will be developing and disseminating means that any solutions identified have the potential to be reproduced by makers anywhere in the world. It is also interesting that the funding is to support the development of a network of groups and people rather than specific technologies, the goal here being sustainability. Neil Squire Society Executive Director Gary Birch said: "This funding is instrumental in enabling the Neil Squire Society to develop, and pilot across Canada, an innovative open source model to produce and deliver hardware-based assistive technologies to Canadians with disabilities." Hopefully this forward-thinking move by the Canadian Government will inspire some EU governments to promote and maybe even fund similar projects over here.

What is the LipSync?

The LipSync is an Open Source sip-and-puff, low-force joystick that can enable access to computers or mobile devices for people without the use of their hands. Sound familiar? If you are a regular reader of this blog you are probably thinking of the FLipMouse; they are similar devices. I haven't used the LipSync but from what I've read it offers slightly less functionality than the FLipMouse, though this may make it more suitable for some users. Take a look at the video below.

If you want to know more about the LipSync, have a look at their project page on Hackaday.io, where you will find build instructions, a bill of materials, code and a user manual.

If the idea of building or designing a technology that could enhance the life of someone with a disability or an older person appeals to you, either head down to your local makerspace (Ireland, Global) or set a date in your diary for Ireland's premier Maker Faire, Dublin Maker, which takes place in Merrion Square, Dublin 2 on Saturday July 22nd. We'll be there showing the FLipMouse as well as some of our more weird and wonderful music projects. There will also be wild, exciting and inspiring demonstrations and projects from makerspaces/groups and Fab Labs from around the country and beyond. See here for a list of those taking part.

Accessibility Checker for Word Tutorial

The Accessibility Checker feature has been part of Microsoft Office for the last few iterations of the software package. It provides a fast and easy way to check whether the content you are producing is accessible to users of assistive technology. By making accessibility accessible, Microsoft have left no room for excuses like "I didn't know how…" or "I didn't have time…". You wouldn't send a document full of misspellings to all your colleagues because you were in a hurry, would you? The one criticism that could have been levelled at Microsoft was that perhaps they didn't provide enough support to new users of the tool. As I said above, it's easy to use, but sometimes users need a little extra support, especially when you are introducing them to something that may be perceived as additional work. Thankfully Microsoft have filled that gap with a six-part video tutorial which clearly explains why and how to get started using the Accessibility Checker. Part 1 is a short introduction (embedded below), followed by a video on each important accessibility practice: alternative text, heading styles, hyperlinks, file naming and tables. Each video is accompanied by a short exercise to allow you to put your new skill into practice immediately. The whole tutorial can be completed in under 20 minutes and should be a requirement for anybody producing documents for circulation to the public. Have a look at the introduction video below.

Global Accessibility Awareness Day – Apple Accessibility – Designed for everyone Videos

Today, May 18th, is Global Accessibility Awareness Day, and to mark the occasion Apple have produced a series of seven videos (also available with audio description) highlighting how their products are being used in innovative ways by people with disabilities. All the videos are available in a playlist here and I guarantee that if you haven't seen them and you are interested in accessibility and AT, they'll be the best 15 minutes you spend today! Okay, the cynical among you will point out that this is self-promotion by Apple, a marketing exercise. On one level of course it is; they are a company and, like any company, their very existence depends on generating profit for their shareholders. These videos promote more than Apple, however; they promote independence, creativity and inclusion through technology. Viewed in this light, these videos illustrate to people with disabilities how far technology has moved on in recent years and make them aware of the potential benefits to their own lives. Hopefully the knock-on effect of this increased awareness will be increased demand. Demand these technologies, people; it's your right!

As far as a favorite video from this series goes, everyone will have their own. In terms of the technology on show, to me Todd “The Quadfather” below was possibly the most interesting.

This video showcases Apple's HomeKit and its range of associated products, and how they can be integrated with Siri.

My overall favorite video, however, is Patrick's: musician, DJ and cooking enthusiast. Patrick's video is an ode to independence and creativity. The technologies he demonstrates are Logic Pro (Digital Audio Workstation software) with VoiceOver (Apple's inbuilt screen reader) and the object-recognition app TapTapSee, which, although it has been around for several years now, is still an amazing use of technology. It's Patrick's personality that makes the video though; this guy is going places, and I wouldn't be surprised if he had his own prime-time TV show this time next year.

FLipMouse – Powerful, open and low-cost computer access solution

The FLipMouse (Finger- and Lip mouse) is a computer input device intended to offer an alternative for people with access difficulties that prevent them from using a regular mouse, keyboard or touchscreen. It is designed and supported by the Assistive Technology group at the UAS Technikum Wien (Department of Embedded Systems) and funded by the City of Vienna (ToRaDes project and AsTeRICS Academy project). The device itself consists of a low-force joystick (requiring minimal effort to operate) that can be controlled with the lips, a finger or a toe. The lips are probably the preferred access method, as the FLipMouse also allows sip-and-puff input.

man using a mounted flipmouse to access a laptop computer

Sip and puff is an access method that is not as common in Europe as it is in the US; however, it is an ideal way to increase the functionality of a joystick controlled by lip movement. See the above link to learn more about sip and puff, but to give a brief explanation: a sensor monitors the air pressure coming from a tube. Thresholds can be set (depending on the user's ability) for high pressure (puff) and low pressure (sip). Once a threshold is passed, it can act as an input signal such as a mouse click, switch input or key press, among other things. The FLipMouse also has two jack inputs for standard ability switches, as well as infrared in (for learning commands) and out (for controlling a TV or other environmental controls). All these features alone make the FLipMouse stand out against similar solutions; however, that's not what makes it special.
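The thresholding idea above can be sketched in a few lines of Python. This is an illustration of the concept only, not the FLipMouse firmware; the threshold values and action names are invented for the example.

```python
# Illustrative sketch of sip-and-puff thresholding (not the actual FLipMouse
# firmware): the threshold values and action names here are invented.

SIP_THRESHOLD = -20   # pressure units below ambient (hypothetical scale)
PUFF_THRESHOLD = 30   # pressure units above ambient (hypothetical scale)

# Which input event each gesture triggers would be configurable per user.
ACTIONS = {"sip": "right_click", "puff": "left_click"}

def classify(pressure):
    """Map one pressure sample to an input event, or None inside the deadband."""
    if pressure >= PUFF_THRESHOLD:
        return ACTIONS["puff"]
    if pressure <= SIP_THRESHOLD:
        return ACTIONS["sip"]
    return None  # neutral band between the thresholds: no event

# A stream of sensor samples: one puff and one sip among neutral readings.
samples = [0, 5, 35, 2, -25, 0]
print([e for e in map(classify, samples) if e])  # ['left_click', 'right_click']
```

The adjustable thresholds are the key accessibility feature: a user with very weak breath control can have them set close to ambient pressure, while a stronger user can set a wider neutral band to avoid accidental activations.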

Open Source

The FLipMouse is the first of a new kind of assistive technology (AT) solution, not because of what it does but because of how it's made. It is completely Open Source, which means that everything you need to make this solution yourself is freely available. The source code for the GUI (Graphical User Interface) used to configure the device, the code for the microcontroller (Teensy LC), a bill of materials listing all the components and design files for the enclosure are all available on their GitHub page.

The quality of the documentation distinguishes it from previous Open Source AT devices. The IKEA-style assembly guide clearly outlines the steps required to put the device together, making the build not only as simple as some of the more advanced Lego kits available but also as enjoyable. That said, unlike Lego this project does require reasonable soldering skills and a steady hand; some parts are tricky enough to keep you interested. The process of constructing the device also gives much better insight into how it works, which will undoubtedly come in handy should you need to troubleshoot problems at a later date.

Although, as stated above, the AsTeRICS Academy provide a list of all components, a much better option in my opinion is to purchase the construction kit, which contains everything you need to build your own FLipMouse, right down to the glue for the laser-cut enclosure, all neatly packed into a little box (pictured below). The kit costs €150 and all details are available from the FLipMouse page on the AsTeRICS Academy site. Next week I will post some video demonstrations of the device and look at the GUI, which allows you to program the FLipMouse as a computer input device, accessible game controller or remote control.

FlipMouse construction kit in box

I can’t overstate how important a development the FLipMouse could be for the future of Assistive Technology. Giving communities the ability to build and support complex AT solutions locally not only makes them more affordable but also strengthens the connection between those who have a greater requirement for technology in their daily lives and those with the creativity, passion and in-depth knowledge of emerging technologies: the makers. Here’s hoping the FLipMouse is the first of many projects to take this approach.

Tobii Tracker 4C

Tobii Gaming, the division of Swedish technology firm Tobii responsible for mainstream (and therefore low-cost) eye trackers, has released a new peripheral called the Tracker 4C (pictured below).

Contents of the Tracker 4C box: eye tracker, documentation, two magnetic mounts

Before getting into the details of this new device, I first want to highlight that although this eye tracker can be used as a computer access solution for someone with a disability (it already works with OptiKey and Project IRIS), it is not being marketed as such. What this means in practice is that it may not provide the reliability of Tobii's much costlier Assistive Technology (AT) eye trackers, such as the PC Eye Mini. So if eye tracking is your only means of communication or computer access and you have the funds, I would recommend spending that extra money. That said, many people don't have the funds, or perhaps they have other more robust means of computer access and just want to use eye tracking for specific tasks like creating music or gaming. For those people the Tracker 4C is really good news, as it packs a lot into its €159 price tag and overcomes many of the weaknesses of its predecessor, the Tobii EyeX.

The big improvement over the EyeX is the inclusion of the EyeChip, which was previously only found in Tobii's much more expensive range of AT eye trackers. The EyeChip takes care of most of the data processing before sending it on to the computer. The result is that much less data is passed from the eye tracker to the computer (100 KB/s compared to 20 MB/s) and the CPU (Central Processing Unit) load is much lower (1% compared to 10%). This allows it to work over an older USB 2.0 connection and means most computers, even budget ones, should have no problem running this device (unlike the EyeX, which required a high-end PC).

All this must have come at some compromise in performance, right? Wrong. The Tracker 4C actually beats the EyeX in almost every category. Frequency has risen from 70 Hz to 90 Hz, a slightly longer operating distance is possible (0.95 m), and the maximum screen size has increased by 3 inches to 30 inches. This last stat could be the deciding factor that convinces PC Eye Mini users to buy the Tracker 4C as a secondary device, as the Mini only works with a maximum screen size of 19 inches. The Tracker 4C also offers head tracking, but as I haven't tested the device I'm unsure how this works or if it is something that could be utilised as AT. Watch this space: the Tracker 4C is on our shopping list and I'll post an update as soon as we get to test whether it's as impressive in real life as it seems on paper.

The table below compares specs for all of Tobii's current range of consumer eye trackers. Where information was not available I have added a question mark and, if appropriate, a speculation. I am open to correction.

| Eye Tracker Models | Tobii Eye Tracker 4C (Gaming) | Tobii EyeX* (Gaming) | Tobii PC Eye Explore (Assistive Technology) | Tobii PC Eye Mini (Assistive Technology) |
| --- | --- | --- | --- | --- |
| Cost | €159 | €109 | €680 | €2000 |
| Size | 17 × 15 × 335 mm (0.66 × 0.6 × 13.1 in) | 20 × 15 × 318 mm (0.8 × 0.6 × 12.5 in) | 20 × 15 × 318 mm (0.8 × 0.6 × 12.5 in) | 170 × 18 × 13 mm (6.69 × 0.71 × 0.51 in) |
| Weight | 91 g | 91 g | 69 g | 59 g |
| Max Screen Size | 27 in (16:9) / 30 in (21:9) | 27 in | 27 in | 19 in |
| Operating Distance | 50–95 cm (20–37 in) | 50–90 cm (20–35 in) | 45–80 cm (18–32 in) | 45–80 cm (18–32 in) |
| Track Box Dimensions | 40 × 30 cm at 75 cm (16 × 12 in at 29.5 in) | 40 × 30 cm at 75 cm (16 × 12 in at 29.5 in) | 48 × 39 cm (19 × 15 in) | >35 × 30 cm ellipse (>13.4 × 11.8 in) |
| Tobii EyeChip | Yes | No | No | Yes |
| Connectivity | USB 2.0 (integrated cord, USB 2.0 BC 1.2) | USB 3.0 (separate cord) | USB 3.0 | USB 2.0 |
| USB Cable Length | 80 cm | 180 cm | 180 cm | Short; extension needed in some situations |
| Head Tracking | Yes (not powered by EyeChip) | No | No | No |
| OS Compatibility | Windows 7, 8.1 and 10 (64-bit only) | Windows 7, 8.1 and 10 (64-bit only) | Windows 7, 8.1 and 10 (64-bit only) | Windows 7, 8.1 and 10 |
| CPU Load | 1% | 10% | 10% | ? (unconfirmed but similar to Tracker 4C) |
| Power Consumption | 1.5 W | 4.5 W | ? (unconfirmed; suspect same as EyeX) | 1.5 W |
| USB Data Transfer Rate | 100 KB/s | 20 MB/s | ? (unconfirmed; suspect same as EyeX) | ? (unconfirmed but similar to Tracker 4C) |
| Frequency | 90 Hz | 70 Hz | 55 Hz | 60 Hz |
| Illuminators | Near Infrared (NIR 850 nm) only | Backlight Assisted Near Infrared (NIR 850 nm + red light 650 nm) | ? (unconfirmed; suspect same as EyeX) | ? |
| Tracking Population | 97% | 95% | ? (unconfirmed; suspect same as EyeX) | ? |
| Additional Software | Tobii Eye Tracking Core Software | Tobii Eye Tracking Core Software | Gaze Point (mouse emulation software) | Windows Control |


* The specs given here are taken from those listed at https://help.tobii.com/hc/en-us/articles/212814329-What-s-the-difference-between-Tobii-Eye-Tracker-4C-and-Tobii-EyeX- (accessed 08/03/2017). Because the weight listed is 91 grams, I suspect these specs are for the first-generation Tobii EyeX (the more recent EyeX weighs 69 grams). The current EyeX specs are probably similar to the PC Eye Explore, but I cannot confirm this.

GazeSpeak & Microsoft’s ongoing efforts to support people with Motor Neuron Disease (ALS)

Last Friday (February 17th) New Scientist published an article about a new app in development at Microsoft called GazeSpeak. Due for release on iOS over the coming months, GazeSpeak aims to facilitate communication between a person with MND (known as ALS in the US; I will use both terms interchangeably) and another individual, perhaps their partner, carer or friend. Developed by Microsoft intern Xiaoyi Zhang, GazeSpeak differs from traditional approaches in a number of ways. Before getting into the details, however, it's worth looking at the background. GazeSpeak didn't come from nowhere; it's actually one of the products of some heavyweight research into Augmentative and Alternative Communication (AAC) that has been taking place at Microsoft over the last few years. Since 2013, inspired by American Football legend and ALS sufferer Steve Gleason (read more here), Microsoft researchers and developers have brought their considerable collective intellect to bear on the subject of increasing the ease and efficiency of communication for people with MND.

Last year Microsoft Research published a paper called "AACrobat: Using Mobile Devices to Lower Communication Barriers and Provide Autonomy with Gaze-Based AAC" (abstract and pdf download at previous link), which proposed a companion app to allow an AAC user's communication partner to assist (in a non-intrusive way) in the communication process. Take a look at the video below for a more detailed explanation.

This is an entirely new approach to increasing the efficiency of AAC, and one that, I suggest, could only have come from a large mainstream tech organisation with over thirty years' experience facilitating communication and collaboration.

Another Microsoft research paper published last year (with some of the same authors as the previous paper), called "Exploring the Design Space of AAC Awareness Displays", looks at the importance of a communication partner's "awareness of the subtle, social, and contextual cues that are necessary for people to naturally communicate in person". Their research focused on creating a display that would allow the person with ALS to express things like humor, frustration and affection: emotions difficult to express with text alone. Yes, they proposed the use of emoji, a proven and effective way of overcoming a similar difficulty in remote or non-face-to-face interactions, but they went much further and also looked at solutions like avatars, skins and even coloured LED arrays. This, like the paper above, is an academic publication and as such not an easy read, but the ideas and solutions being proposed by these researchers are practical and will hopefully filter through to end users of future AAC solutions.

That brings us back to GazeSpeak, the first fruit of the Microsoft/Steve Gleason partnership to reach the general public. Like the AACrobat solution outlined above, GazeSpeak gives the communication partner a tool rather than focusing on tech for the person with MND. As the image below illustrates, the communication partner has GazeSpeak installed on their phone and, with the app running, holds the device up to the person with MND as if photographing them. They suggest a sticker with four grids of letters is placed on the back of the smartphone, facing the speaker. The app then tracks the person's eyes: up, down, left or right, each direction meaning the letter they are selecting is contained in the grid in that direction (see photo below).

Man looking right; another person holds up a smartphone with GazeSpeak installed

Similar to how the old T9 predictive text worked, GazeSpeak selects the appropriate letter from each group and predicts the word based on the most common English words. So the app is using AI in the form of machine vision to track the eyes, and also to make the word prediction. In the New Scientist article they mention that the user will be able to add their own commonly used words and people/place names, which one assumes would prioritise them within the prediction list. In the future perhaps some capacity for learning could be added to further increase efficiency. After using this system for a while the speaker may not even need to see the sticker with letters; they could write words from muscle memory. At that stage a simple QR code leading to the app download would allow them to communicate with complete strangers using just their eyes and no personal technology.

Create inclusive content with Office Mix and Sway

Here in Enable Ireland's AT service we have been investigating the Office Mix plugin for PowerPoint as a way to create more engaging and accessible eLearning content. We are still at the early stages and haven't done any thorough user testing yet, but so far it shows some real promise.

From the end user's perspective it offers a number of advantages over the standard YouTube-style hosted video. Each slide is marked out, allowing the user to easily skip forward or back to different sections. So you can skip ahead if you are comfortable with a particular area of the presentation or, more importantly, revisit parts that may not have been clear. The table of contents button makes this even easier by expanding thumbnail views of all the slides, which link directly to the relevant sections of the video. There is also the ability to speed up or slow down the narration. Apart from the obvious comic value, this is actually a very useful accessibility feature for people who may be watching a presentation made in a language not native to them, or by someone with a strong regional accent. On the flip side it's also a good way to save time, the equivalent of speed reading.

From the content creator's perspective it is extremely user-friendly. Most of us are already familiar with PowerPoint, and these additional tools sit comfortably within that application. You can easily record your microphone or camera and add the result to a presentation you have already created. Another feature is "inking", the ability to write on slides and highlight areas with different colour inks. You can also add live web pages, YouTube videos (although this feature did not work in my test), questions and polls. Finally, the analytics will give you very good insight into which areas of your presentation might need more clarification, as you can see if someone chooses to look at a slide a number of times. You can also see if slides were skipped or questions answered incorrectly.

Below is a nice post outlining some ways to create inclusive content using Office Mix and Sway, Microsoft's other new(ish) web-based presentation platform. Below that is a much more detailed introduction to Office Mix using… yes, you guessed it, Office Mix.

How Office Mix and Sway can help with student inclusion – Gerald Haigh