Google is introducing two new features to make Android more accessible for those with mobility or speech disabilities. Both features use your phone’s front camera to track facial expressions and eye gestures, allowing you to navigate the Android interface, chat with others, or write text-to-speech messages without using touch controls or Google Assistant voice commands.
The first feature, called Camera Switches, was first spotted in the August Android 12 beta. It builds on Android’s existing Switch Access tool, an accessibility feature that lets you operate an Android phone using external input devices. Camera Switches serves the same basic purpose, though it replaces physical buttons with face and eye gestures.
Like Switch Access, the new Camera Switches tool is fully customizable. You can move your eyes right and left to jump between an app’s UI elements, for example, or raise your eyebrows to pause and play music. Camera Switches even allows you to calibrate gestures to improve accuracy and prevent false positives.
Android’s new Project Activate tool makes eye and facial gestures even more useful. It allows you to pair facial gestures with programmable actions. You could smile to send a pre-written text message to a family member or caretaker, for example. Or if you have a speech disability, you could use Project Activate to trigger common text-to-speech audio messages, such as your name or your favorite Starbucks order.
Not only that, but Project Activate lets you trigger sounds with facial or eye gestures, allowing you to express your emotions (or sense of humor) without pre-written messages. In examples provided by Google, people use this feature to play airhorn or party sounds while watching football games.
We aren’t entirely sure when Google will roll out these new accessibility features. Our best guess is that they will arrive alongside Android 12, which should release in the coming weeks. Hopefully, they won’t be exclusive to Android 12.