Google is building upon its accessibility features with Android 12 on the Pixel 6, so we tested whether they actually make life easier.

Photo: Sam Rutherford/Gizmodo

For a decade, many other disabled people and I have watched ads for smartphones that promised to improve accessibility with slick, cutting-edge, and life-changing tech. Let’s be honest: historically, there has been a lack of awareness on the part of tech companies toward those with disabilities. But in the past five years, there have been steady—and significant—improvements. (Perhaps because those companies want more money.) I was especially excited to try Google’s new Pixel 6 and Pixel 6 Pro, which have exclusive accessibility features that build on the ones included in Android 12, which just rolled out to all Pixels.

I have a scratched cornea in one eye, a deaf left ear, weakened and slurred speech, and dexterity issues with my hands, so I wanted to test the new Pixels’ accessibility features to see if they actually worked. Spoiler alert: While the phone still doesn’t feel quite the same for me as it does for someone without disabilities, the gap is narrowing.

Faster, More Accurate Voice Recognition

The Pixel 6 and Pixel 6 Pro are built on Google’s in-house Tensor chip, which promises faster on-device machine learning and features powered by artificial intelligence. One of the features that stands to benefit from Tensor is voice recognition. I have been trying to find a voice recognition program that works for me since 1997. A few years ago, I worked with Google’s Project Euphonia, and after I spoke thousands of repeated phrases, Google provided me with an app built on an algorithm tuned to my speech. The app augmented the dictation on my phone in everything from texts and comments to Google Docs and Google Assistant.

But the voice recognition that comes with most smartphones has not been a viable option for me and many people with speech impairments. Our voices are not deciphered the way unimpaired voices are, resulting in mishaps like calling the wrong person at midnight or hearing “I love new paintings” as “Hi love, you painting?” Because my words are frequently misheard, my favorite feature of the new dictation mode was the ability to say “clear” and delete the text instead of having to touch Gboard. When I used voice recognition with Google Assistant or said short, declarative sentences, the voice recognition was flawless. Google’s Voice Access, which allows you to control your phone hands-free, was much easier to use because of this accuracy. However, if I didn’t clear my throat often, or if I used voice recognition at night when my voice gets tired, it was less accurate.

Compared to past Pixels, the voice recognition is much more precise—it’s just not perfect. I tested the phones with words like vexatious, vicissitude, and tricky specific names like Megan Thee Stallion in Google Docs. For me, the voice recognition got around two sentences out of every four right. I should note that I tested the Pixel 6 phones right out of the box.

Source: https://gizmodo.com/the-pixel-6s-accessibility-features-make-me-hopeful-for-1847997736
