Google has extended Project Gameface, an open-source project aimed at making tech devices more accessible, to Android, where it can now be used to control the smartphone interface. The project was first unveiled at Google I/O 2023 as a hands-free gaming mouse controlled with head movements and facial expressions. It is designed for people with physical disabilities who cannot use their hands or voice to operate devices. Keeping the functionality the same, the Android version adds a virtual cursor that lets users control their device without touching it.

In an announcement on its developer-focused blog, Google said, “We’re open-sourcing more code for Project Gameface to help developers build Android apps to make every Android device more accessible. Through the device’s camera, it seamlessly tracks facial expressions and head movements, turning them into intuitive and personalized controls.” The company also encouraged developers to use the tools to add accessibility features to their own apps.

Project Gameface collaborates with the Indian organization Incluzza, which supports people with disabilities. Through the collaboration, the project learned how its technology could be extended to different use cases, such as message input, job search, and more. It uses MediaPipe’s facial landmark detection API and the Android Accessibility Service to create a new virtual cursor for Android devices. The cursor follows the movement of the user’s head, tracked via the front camera.
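The head-to-cursor mapping described above can be sketched in a few lines. The sketch below is illustrative, not the project's actual implementation: it assumes a normalized (0 to 1) head position such as MediaPipe's nose-tip landmark, and the `gain` parameter and centering logic are hypothetical choices.

```python
# Hypothetical sketch: translating a tracked head position into a
# virtual cursor position, loosely following the approach described.
# Landmark coordinates are normalized (0..1), as MediaPipe reports
# them; the gain value and calibration center are illustrative.

def head_to_cursor(nose_x, nose_y, screen_w, screen_h,
                   center=(0.5, 0.5), gain=2.0):
    """Map a normalized head/nose position to screen pixels.

    Movement away from the calibrated center is amplified by `gain`
    so small head motions can reach the screen edges, then clamped
    to the visible area.
    """
    dx = (nose_x - center[0]) * gain
    dy = (nose_y - center[1]) * gain
    x = min(max((0.5 + dx) * screen_w, 0), screen_w - 1)
    y = min(max((0.5 + dy) * screen_h, 0), screen_h - 1)
    return int(x), int(y)
```

With the head centered, the cursor sits at the middle of the screen; tilting far to one side pins it to the edge rather than letting it run off-screen.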

The API recognizes 52 facial gestures, including raising an eyebrow, opening the mouth, moving the lips, and more. These 52 gestures can be mapped to a wide range of functions on the Android device. One interesting feature is swiping, which users can employ to swipe through the home screen. To create a drag effect, users define a start and end point: for example, opening the mouth to mark the start, moving the head, and closing the mouth again at the end point.
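The open-mouth drag gesture just described can be modeled as a small state machine. The sketch below is a hypothetical illustration, not the project's code: the blendshape name `jawOpen` follows MediaPipe's face blendshape naming, while the threshold value is an assumption.

```python
# Hypothetical sketch of the drag gesture described above: opening
# the mouth marks the drag start point, and closing it again after
# moving the head marks the end point. The 0.5 threshold on the
# "jawOpen" blendshape score is an illustrative choice.

class DragRecognizer:
    def __init__(self, open_threshold=0.5):
        self.open_threshold = open_threshold
        self.start = None  # cursor position where the mouth opened

    def update(self, jaw_open_score, cursor_pos):
        """Feed one camera frame; return (start, end) when a drag completes."""
        mouth_open = jaw_open_score > self.open_threshold
        if mouth_open and self.start is None:
            self.start = cursor_pos          # drag begins
        elif not mouth_open and self.start is not None:
            drag = (self.start, cursor_pos)  # drag ends
            self.start = None
            return drag
        return None                          # drag still in progress, or idle
```

Each frame, the recognizer receives the mouth-open score and the current cursor position; it reports a completed drag only once the mouth closes, which is what turns a continuous head motion into a discrete start/end pair.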

It should be noted that although this technology has been made available on GitHub, it is now up to developers to build applications with it so that it reaches more users. Apple also recently introduced a feature that uses eye tracking to control the iPhone.
