Google expands hands-free and eyes-free interfaces on Android

As part of Global Accessibility Awareness Day 2024, Google is introducing a few updates to Android that should be helpful to people with mobility or vision impairments.

Project Gameface allows players to use their face to move the cursor and perform common desktop clicking actions. It’s now coming to Android.

The project lets people with reduced mobility use facial movements, such as raising an eyebrow, moving their mouth or turning their head, to activate various functions. There are basic elements like a virtual cursor, but also compound gestures: for example, you can mark the start and end of a swipe by opening your mouth, moving your head, then closing it.

It's customizable based on a person's abilities, and Google researchers are working with Incluzza in India to test and improve the tool. Certainly, for many people, being able to simply and easily play thousands of games (well, millions probably, but thousands of good ones) on Android will be more than welcome.

There’s a great video here that shows the product in action and being customized; Jeeja, in the preview image, talks about adjusting how far she needs to move her head to trigger a gesture.

This kind of granular adjustment is every bit as important as being able to set the sensitivity of your mouse or trackpad.

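To make the mechanics concrete, here is a minimal, hypothetical sketch of how the swipe gesture described above might be wired up. It assumes a face-tracking model (such as the MediaPipe Face Landmarker that Gameface builds on) emits per-frame scores between 0 and 1 for expressions like mouth openness, and it maps a mouth-open score to the start and end of a swipe, with the trigger threshold exposed as a per-user setting. The class, score names and threshold values are illustrative, not Gameface's actual parameters.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Point = Tuple[float, float]


# Hypothetical per-frame input: blendshape-style scores in [0, 1]
# as produced by a face-tracking model each camera frame.
@dataclass
class FaceFrame:
    mouth_open: float  # 0.0 = mouth closed, 1.0 = fully open
    head_x: float      # normalized head position driving the cursor
    head_y: float


class SwipeGesture:
    """Open your mouth to start a swipe, move your head, close it to finish."""

    def __init__(self, mouth_threshold: float = 0.5):
        # The threshold is the per-user sensitivity setting.
        self.mouth_threshold = mouth_threshold
        self._start: Optional[Point] = None  # where the swipe began

    def update(self, frame: FaceFrame) -> Optional[Tuple[Point, Point]]:
        """Return (start, end) when a swipe completes, else None."""
        is_open = frame.mouth_open >= self.mouth_threshold
        if is_open and self._start is None:
            # Mouth just opened: anchor the start of the swipe here.
            self._start = (frame.head_x, frame.head_y)
        elif not is_open and self._start is not None:
            # Mouth closed again: the head's travel since then is the swipe.
            start, self._start = self._start, None
            return start, (frame.head_x, frame.head_y)
        return None


# Example: a lower threshold makes the gesture easier to trigger
# for someone with a smaller range of motion.
gesture = SwipeGesture(mouth_threshold=0.3)
for frame in [FaceFrame(0.1, 0.2, 0.5), FaceFrame(0.6, 0.2, 0.5),
              FaceFrame(0.7, 0.8, 0.5), FaceFrame(0.1, 0.8, 0.5)]:
    swipe = gesture.update(frame)
    if swipe:
        print("swipe from", swipe[0], "to", swipe[1])
```

The adjustable threshold is the interesting part: turning it down is the kind of change Jeeja describes in the video, letting a smaller movement trigger the same gesture.
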
Another feature for people who can’t easily use a keyboard, on-screen or physical: a new text-free mode in Look to Speak that lets people choose and send emoji, either alone or in groups, standing in for a sentence or an action.

You can also add your own photos, so someone can keep common phrases and emoji on speed dial, as well as frequently used contacts attached to photos of them, all accessible with just a few glances.

For visually impaired people, there are a variety of tools (of varying effectiveness, no doubt) that identify items seen by the phone’s camera. The use cases are endless, so sometimes it’s best to start with something simple, like finding an empty chair or recognizing someone’s key fob and pointing it out to them.

Users will be able to add custom object or location recognition so that the instant description feature gives them what they need and not just a list of generic objects like “a cup and a plate on a table.” What cup?!

Apple also showed off some accessibility features yesterday, and Microsoft has a few as well. Take a minute to look through these projects, which rarely get the mainline treatment (although Gameface did) but are of major importance to those for whom they are made.
