
iPhones Can Now Automatically Recognize and Label Buttons and UI Features for Blind Users (techcrunch.com)

Apple has always gone out of its way to build features for users with disabilities, and VoiceOver on iOS is an invaluable tool for anyone with a vision impairment -- assuming every element of the interface has been manually labeled. But the company just unveiled a brand new feature that uses machine learning to identify and label every button, slider and tab automatically. From a report: Screen Recognition, available now in iOS 14, is a computer vision system that has been trained on thousands of images of apps in use, learning what a button looks like, what icons mean and so on. Such systems are very flexible -- depending on the data you give them, they can become expert at spotting cats, facial expressions or, as in this case, the different parts of a user interface. The result is that in any app now, users can invoke the feature and a fraction of a second later every item on screen will be labeled. And by "every," they mean every -- after all, screen readers need to be aware of everything a sighted user would see and be able to interact with, from images (which iOS has been able to create one-sentence summaries of for some time) to common icons (home, back) and context-specific ones like "..." menus that appear just about everywhere. The idea is not to make manual labeling obsolete; developers know best how to label their own apps, but updates, changing standards and challenging situations (in-game interfaces, for instance) can lead to things not being as accessible as they could be.
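For context on what "manually labeled" means here, below is a minimal UIKit sketch (the class name and strings are hypothetical) of the kind of annotation developers normally supply by hand, and which Screen Recognition tries to infer when it is missing:

    import UIKit

    // Hypothetical icon-only control drawn by hand; VoiceOver has nothing to
    // announce for it unless the developer (or Screen Recognition) supplies a label.
    final class PlayGlyphView: UIView {
        override init(frame: CGRect) {
            super.init(frame: frame)
            isAccessibilityElement = true   // expose the view to VoiceOver
            accessibilityLabel = "Play"     // what VoiceOver speaks for it
            accessibilityTraits = .button   // announce it as a button
        }

        required init?(coder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }
    }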

Comments Filter:
  • Apple has always gone out of its way to build features for users with disabilities

    Sure, that low-contrast UI stuff really helps everybody see what's on screen.

    (and getting lower-contrast with every generation...)

  • Now if only iPhone buttons could be recognized as buttons by sighted users too...
  • This might actually be able to help sighted people too. So many times I'm using a new app and I'm not even aware that I can tap certain things, or what action those things accomplish. I find touch apps to be pretty bad in general at making it clear how to get things done. On a traditional GUI, the elements stand out more and it's usually clear what you can click on and what you can't. Also, you can hover over most things and see a tool-tip of what the button will do. Things like this seem to be absent on touch interfaces.

  • Can it recognize those 'buttons' that are just static text in a slightly different color and weight, with no other borders or distinguishing features, that Apple loves oh so much?

  • Sighted users used to be able to see what was a button because it had a raised border, and could see what was an input field because it had a sunken border.
    You could see immediately how to interact with a display.

    Then the GRAPHIC ARTIST WANKERS removed those cues because they didn't give a shit about users and only cared about a "clean" - i.e. UNUSABLE - look.

    • by tlhIngan ( 30335 )

      Then the GRAPHIC ARTIST WANKERS removed those cues because they didn't give a shit about users and only cared about a "clean" - i.e. UNUSABLE - look.

      Graphic Artists probably less so than "UX designers".

      They're the ones who railed against buttons that look like buttons, because those limit you to designing things that look like physical objects.

      Yes, too much skeuomorphism is bad - the green felt and faux leather looks in certain iOS apps were horrible, yet at the same time they were a useful design element. After all,

  • I've tried Android's "accessibility" features for blind people, and I distinctly remember them having that exact feature without any (fake) "AI" being necessary. Because all app UIs were already built with the same toolkit, and all buttons already had readable labels. (All Unicode symbols and emojis have associated textual descriptions, so they work too.)

    • by _xeno_ ( 155264 )

      That's how VoiceOver currently works. One of the base UIKit classes implements an interface that tells accessibility tools what the "human label" is for each element. (And then just to be assholes, Apple also handles specific elements like UIButton and UILabel differently based solely on class type.)

      The problem is getting people to fill them out, and fill them out correctly. Each element has three accessibility fields that can be filled out: "label," "hint," and "identifier." What do these do and what are you expected to put in each?
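      For reference, those three fields correspond to standard UIKit accessibility properties; a minimal sketch (the button and strings here are hypothetical) of how they differ:

          import UIKit

          let saveButton = UIButton(type: .system)
          saveButton.setTitle("💾", for: .normal)

          // "label": what VoiceOver speaks when the element gains focus.
          saveButton.accessibilityLabel = "Save"
          // "hint": spoken after a short pause; describes what activating it will do.
          saveButton.accessibilityHint = "Saves the current document"
          // "identifier": never spoken; used by UI tests and tooling to locate the element.
          saveButton.accessibilityIdentifier = "document.saveButton"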

  • How soon before it can figure out you're an idiot and hopefully block you from posting on Slashdot?
