Apple announces new accessibility features, including eye tracking

xnx
68 replies
1d

I love accessibility features because they might be the last features developed solely with the benefit of the user in mind. So many other app/OS features are designed to steal your attention or gradually nerf usefulness.

snoman
19 replies
1d

Every attention thief is absolutely thrilled at the idea of tracking your eyes. Let’s all imagine the day where the YouTube free tier pauses ads when you’re not actively looking at them.

Shit. I’m turning into one of those negative downers. I’m sorry. I’ve had too much internet today.

Tagbert
15 replies
1d

If this is at all like the eye tracking in Vision Pro, it is only available to the OS and apps are not given access to the data.

ben_w
8 replies
22h42m

The system one is, but advertisers could always roll their own and see if they can get away with "you can only view this content if you give us permission to use your camera".

favorited
6 replies
22h22m

At least on iOS, I can't imagine that happening - apps are not allowed to demand you grant permissions unrelated to the actual functionality. From App Review Guidelines (5.1.1) Data Collection and Storage, (ii) Access:

Apps must respect the user’s permission settings and not attempt to manipulate, trick, or force people to consent to unnecessary data access. For example, apps that include the ability to post photos to a social network must not also require microphone access before allowing the user to upload photos.

Lots of iOS apps today really want to mine your address book, and constantly spam you with dialogs to enable it, but they don't go as far as disabling other features until you grant them access, because they'd get rejected once someone noticed.

burnerthrow008
2 replies
19h29m

Fortunately apps in the EU don’t have to pass through Apple’s anticompetitive review process, so developers are free to ignore that rule if they simply distribute the app via an alternative store.

Unfortunately, poor Americans cannot taste the freedom that Europeans have to be abused by developers.

favorited
0 replies
11h37m

Fortunately, the Americans who create these products extract more value from the EU than they invest, or they wouldn’t bother trading with you <3

ben_w
0 replies
12h29m

I was thinking similar (only without the snark); but then I realised this is almost certainly not compatible with GDPR.

amlib
2 replies
22h5m

It's not that hard to come up with ways to circumvent system restrictions; after all, advertisers are a fierce adversary and have shown many clever ways of invading users' privacy in web browsers and mobile apps. In the case of eye tracking I could see a situation where the system perhaps feeds the "malicious" app in question a hint of which widget is currently being gazed at by the user. You could then just build a giant grid of invisible widgets covering the whole app window and use that to reconstruct all the eye tracking happening inside your app.
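
To make that concrete, here's a purely hypothetical sketch in SwiftUI. It assumes the OS exposed a hover-style hint for gaze, which (as replies below point out) it deliberately does not; .onHover is the ordinary pointer API standing in for that imagined hint, and the grid size and logging are invented for illustration:

    import SwiftUI

    // Hypothetical gaze sniffer: tile the window with invisible cells and log
    // whichever one the system reports as "hovered". Purely illustrative --
    // iOS does not forward gaze to apps; .onHover is a pointer API used here
    // as a stand-in for the imagined per-widget hint.
    struct GazeSnifferView: View {
        let rows = 40
        let cols = 20

        var body: some View {
            GeometryReader { geo in
                let cellW = geo.size.width / CGFloat(cols)
                let cellH = geo.size.height / CGFloat(rows)
                ForEach(0..<(rows * cols), id: \.self) { i in
                    Color.clear
                        .frame(width: cellW, height: cellH)
                        .contentShape(Rectangle())
                        .position(x: (CGFloat(i % cols) + 0.5) * cellW,
                                  y: (CGFloat(i / cols) + 0.5) * cellH)
                        .onHover { inside in
                            // Reconstruct a coarse gaze trace, cell by cell.
                            if inside { print("gaze near cell \(i)") }
                        }
                }
            }
        }
    }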

miki123211
0 replies
13h18m

This is not about technical restrictions or finding weird legal loopholes; Apple's guidelines don't work that way. It's the spirit that matters, not the letter.

dialup_sounds
0 replies
20h29m

The system doesn't supply data like widget highlight states to apps for exactly that reason.

LordKeren
0 replies
22h22m

This would not pass the App Review process

mistrial9
2 replies
23h43m

for now - it is naive to think this is safe

sroussey
1 reply
23h26m

Eye tracking has been on iOS for many years (as an option for Face ID called Attention).

stacktrust
0 replies
23h20m

Avoidable via iPhone SE3 and iPad Air, which both use Touch ID.

happyopossum
1 reply
18h37m

Until the EU comes in and forces them to ‘open up’ the functionality in the name of ‘fairness’…

Tijdreiziger
0 replies
15h17m

If you look at the current things the EU has forced them to open up (app distribution and NFC payments), both of them are things that Apple was already actively monetizing.

To compare this to a non-monetized accessibility feature is a bit disingenuous.

The whole notion of ‘fairness’ is that gatekeepers should allow others to compete on even footing. That can be fulfilled by granting EITHER everybody OR nobody access to the market (but not: only yourself).

bun_at_work
0 replies
21h29m

Until people complain that Apple is being anti-competitive by not making vision tracking open, or allowing third-party eye-tracking controls, etc. etc.

carl_dr
1 reply
22h39m

Watch the Black Mirror episode “Fifteen Million Merits”, to see how this might end up.

vundercind
0 replies
16h45m

“Fifteen Million Merits” is maybe the Black Mirror episode least about the future. It reads best as entirely a comment on the society we already have.

A big clue is that the world in it doesn’t make a ton of internal sense and the episode makes absolutely no effort to smooth that over. The questions it raises and leaves open without even attempting an answer are on purpose. You’re supposed to go “god, bicycling to make little pellets? Just to buy useless trash? God, why? It’s so pointless,” or, “they truly look down on and are mean to the people who get demoted, even though they’re basically guaranteed to end up that way someday, when their health or good fortune run out? Just cruelty for no reason that’s highly likely to hit them some day, too? That’s insane!”

… because actually it’s about now (or, the year it came out) and the point is to get you to connect the dots and realize we’re doing the same crap and just don’t notice. It’s only “sci fi” as some sleight of hand to give you an alien’s eye view of our own actual society as a way to prompt reflection. The core story of the corruption of a “revolutionary” to serve the purposes of the system is some bog-standard media studies stuff pertaining to today, not sci fi. The reveal that they could just fucking stop all this and go outside (it’s not some alien world or a post-apocalypse after all) is yet another thing that’s supposed to make you go “what’s wrong with them? Oh. Right. It’s us.”[1]

Long winded way to say: we can’t “end up” at Fifteen Million Merits future-world, because it’s just our own world we’re already in.

[1] Some read the final scene as depicting yet another viewscreen, but this is such a wildly weaker reading as far as how effective the scene is that I can’t believe it’s intended that way.

kevin_thibedeau
0 replies
22h47m

Wait for human attention detection to become mandatory to view DRMed content on the telescreen.

ljm
17 replies
21h56m

I don't love that solid UX gets pushed under the accessibility rug, as an option you might never find.

I don't care how cynical it sounds, user experience became user exploitation a long time ago. Big Tech have been running that gimmick at too-big-to-fail scale for the last decade or so.

philistine
10 replies
21h7m

Accessibility benefits everyone, but in the basics you’re right. Too many simple straightforward options are now strictly inside accessibility. At least on the Apple side.

And don’t get me started on hidden command line settings.

nottorp
5 replies
20h50m

> Too many simple straightforward options are now strictly inside accessibility.

<cough> Reduce Motion. Is it an accessibility feature or does it just get rid of an annoyance and is good for everyone?

nottorp
2 replies
12h27m

Yes they can, but I don't and I still hate needless animations and turn them off. The point is, why is it in "accessibility" when it should be more visible?

filleduchaos
1 reply
4h28m

It's plenty visible right where it is to people who don't have odd hangups about "accessibility".

nottorp
0 replies
4h9m

So would people with perfect eyesight and no motion sickness discover it?

Personally I forgot about it on my latest phone because I had a fresh prescription for my glasses and didn't need to enlarge the font at the moment :)

ornornor
0 replies
20h41m

Saves battery too along with cross fade transitions :)

a_wild_dandan
1 reply
21h2m

I'm here to intentionally get you started on hidden CLI settings. Learn me somethin'!

ibll
0 replies
19h26m

not OP, but macOS has a ton of options available with arcane commands. my favorites are the auto-hide dock speed setting, the third hidden window minimise animation, and the hidden system accent colours usually locked to the colourful iMacs
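
for reference, roughly these incantations (typed from memory, so treat them as a sketch and double-check the keys before pasting):

    # dock auto-hide speed
    defaults write com.apple.dock autohide-time-modifier -float 0.15
    killall Dock

    # the hidden third minimise animation ("suck", next to genie and scale)
    defaults write com.apple.dock mineffect -string suck
    killall Dock

    # pretend to be a colourful iMac to unlock its accent colour
    # (the enclosure number picks which colour)
    defaults write -g NSColorSimulateHardwareAccent -bool YES
    defaults write -g NSColorSimulatedHardwareEnclosureNumber -int 4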

gumby
0 replies
20h43m

> Too many simple straightforward options are now strictly inside accessibility

From outside, it feels like these are the only people with the freedom to improve the user experience at all. So they have to hide their work in the Accessibility preferences.

PaulStatezny
0 replies
20h18m

You're getting a lot of agreement from other HN users, but I'm not sure it's fair to criticize Apple for putting these kinds of features under Accessibility.

There's nothing that inherently "locks out" people who don't have a recognized disability from exploring these features. Furthermore, most of Apple's "accessibility" features are related to Vision/Hearing/etc (and categorized as such), so I think it's reasonable to consider them accessibility features.

Clearly based on other comments here, plenty of people discover these features and find them useful.

hbn
4 replies
20h56m

The iPhone has a hidden accessibility setting where you can map a double and/or triple tap on the back of your phone to a handful of actions. I use this to trigger Reachability (the feature that brings the entire UI halfway down the screen so you can reach buttons at the top) because phone screens are so damn big that I can't reach the opposite top corner with my thumb even on my 13 mini without hand gymnastics. And the normal Reachability gesture is super unreliable to trigger ever since they got rid of the front Touch ID home button.

happyopossum
1 reply
18h40m

> The iPhone has a hidden accessibility setting

This isn’t “hidden”. It was even called out and demonstrated in the iOS 14 keynote.

hbn
0 replies
3h26m

Perhaps I could rephrase that it's hidden within Accessibility settings, not that it's an accessibility setting that is furthermore hidden.

Most people don't go into that menu to look around for things they might want to use, because features that almost everyone could benefit from are alongside settings for people with visual and hearing impairments.

square_usual
0 replies
18h15m

Unironically calling this feature "hidden" is why things are the way they are now. It's not hidden! You can find it if you go through the settings app! But because it isn't in your face all day, every now and then people will talk about this "super secret feature" and then a PM somewhere has to make a nag feature to advertise it.

ornornor
0 replies
20h42m

Double tap is reachability for me and triple tap is to make the display very dim so that at night at the lowest brightness setting, I can get it even lower. It resets after a while so even if I forget to switch it off my screen won’t stay dim for the next few days while I wonder why it’s so damn dark.

dmazzoni
0 replies
17h19m

Let's say you're a developer at a big software company (not necessarily Apple, this happens everywhere) and you want to add a new optional setting.

The bar is pretty high. There are already hundreds of settings. Each one adds cost and complexity. So even if it's a good idea, the leadership might say "no".

Now let's say this same setting makes a big difference in usability for some people with disabilities. You just want to put it in accessibility settings. It won't clutter the rest of the UI.

You just turned your "no" into a "yes".

jonpurdy
12 replies
23h32m

I often use them to get around bad UI/UX (like using Reduce Motion), or to make devices more useful (Color Filters (red) for using at night).

Even outside of this, able-bodied folks can be disabled due to illness, surgery, injury, etc. So it's great to see Apple continuing to support accessibility.

gerry_shaw
11 replies
23h20m

The red color filter for outside when trying to preserve night vision is a great tip. Some apps have this built-in but much better to have the OS change it everywhere.

Recommend creating a Shortcut to toggle this setting.

justsomehnguy
5 replies
22h47m

> Recommend creating a Shortcut to toggle this setting.

Huh?

It's not built-in?

Android (or at least Moto) has had it for years, with auto-enable on a schedule or at sunrise/sunset.

cqqxo4zV46cp
1 reply
19h36m

Kinda feels like you could’ve done more of a cursory glance to see what functionality was actually being talked about before going for the “Android already does this!!” comment.

justsomehnguy
0 replies
3h8m

Kinda feels like you could've been more like [0] instead of being an ass, while totally missing that what I was surprised about was the lack of automagic, not the feature itself.

https://news.ycombinator.com/item?id=40371449

ben_w
1 reply
22h45m

There's a built in "night shift", but that just changes the colour temperature, it doesn't make everything monochrome red.

throwanem
0 replies
15h2m

No, that's (part of) what Color Filters is for.

carl_dr
0 replies
22h44m

iOS has also had “Night Shift” for several years. The parent is talking about a full on red colour filter, like astronomers might use.

PaulStatezny
1 reply
20h25m

Not just for outside, but also at public gatherings like concerts! I went to a concert last month and used a red color filter to record a couple short videos without being a big distraction to the audience behind me.

Dim backlight + Red color filter can make the screen almost invisible to those around you.

wishfish
0 replies
16h8m

Thanks for the reminder. Wish I had remembered this feature during the aurora photography I was doing. I set the phone brightness to the lowest setting but the red filter would have helped even more.

throwanem
0 replies
22h57m

You can also add it to the accessibility shortcut, available anywhere by triple-clicking the power button.

rpastuszak
0 replies
9h27m

I think you can just add it to Control Centre, no need for shortcuts. I made an app for reading in the dark, minimising the amount of light hitting your eyeballs, but I'm still using the red color filter every night.

The app is overall darker and more "strict" with how it handles content (esp. images) though: https://untested.sonnet.io/Heart+of+Dorkness and midnight.sonnet.io

Overall, reducing the number of photons is a fun metric to play with when building something/messing with prototypes.

dylan604
0 replies
22h57m

The issue I've seen when the app itself offers a red filter is that if the app calls an OS-native widget like the keyboard, that widget does not get filtered. The system-level accessibility feature does filter the OS widget. I would almost rather the app's setting just enable the OS filter, but I can understand why that might not be possible.

stacktrust
7 replies
1d

> developed solely with the benefit of the user in mind

Hopefully accessibility features are never artificially segmented to higher priced devices.

Loughla
3 replies
1d

At least in the US, they kind of can't be. The disability community is pretty up front about lawsuits.

pxc
1 reply
21h18m

That's because the ADA has no enforcement mechanism other than lawsuits, isn't it? Our whole legal disability rights infrastructure is designed to be driven by lawsuits, and sits inert if nobody sues.

miki123211
0 replies
13h21m

That's actually a good thing, especially if there are people specializing in bringing these lawsuits en-masse.

Over here in Europe, where there's no lawsuit culture, some laws (not necessarily accessibility-related, our legislation in that area is far weaker) are violated almost without repercussions. When the government is the only party that can bring charges and the government is complacent / ineffective, nobody actually gets charged and nothing gets done.

There's also the problem of incentives: if you can get a lot of money from a lawsuit, you have a far better incentive to sue and find a competent lawyer. You may even deliberately look for and sue violators as a moneymaking scheme; some companies in the US do this. This puts pressure on businesses to comply with the law. Even if you're piss-poor and never sue anybody, you still benefit.

If this lawsuit culture doesn't exist, the best you can do is write a report to the government and hope they actually act on it. Many people don't know how to write these, and since there's no money in it, getting a lawyer to do it for you is an expense that nobody is going to help you recoup.

The people handling these reports don't help either, they're usually 9-to-5 salaried employees who aren't judged too hard on performance, so they have far less of an incentive to actually pursue cases.

stacktrust
0 replies
1d

iOS 17 audio image descriptions for blind people via Image Magnifier should work on all iPhones, but do not work on iPhone SE3 and iPhone 11 Pro. Audio image descriptions do work in iPhone 12 Pro. Lidar in 12 Pro increases accuracy, but should not be mandatory. Hopefully this is a bug that can be fixed, since text descriptions were still functional on the lower-end devices.

Source: purchased devices until finding one that worked, since Apple docs indicated the feature should work on all iPhones that can run iOS 17.

Edit: audio descriptions in Magnifier are non-functional on iPad Air, working on M2 iPad Pro.

corps_and_code
1 reply
1d

I wonder if that would be legal, at least in the US. That feels like it'd be a violation of the ADA?

diebeforei485
0 replies
15h9m

It would only be a violation if it's purely software locked.

If it requires a chip that supports specific operations, and entry tier devices have an older chip, that wouldn't be a violation.

miki123211
0 replies
13h34m

Some of them are, at least on Apple's side, but it's always for a good technical reason. Screen recognition is only available on devices that have a neural chip; things that require lidar don't work on devices without lidar, and so on.

Google is worse at this: TalkBack multi-finger gestures used to be Pixel- and Samsung-exclusive for a while, even though there was no technical reason for it.

Apple has a different problem: many accessibility features aren't internationalized properly. Screen recognition still has issues on non-English systems, as do image descriptions. VoiceOver (especially on Mac) didn't include voices for some of the less-popular languages until very recently, even though Vocalizer, their underlying speech engine, has supported them for years. Siri has the same problem.

2OEH8eoCRo0
4 replies
21h17m

Funny, I was just thinking it was so that they can get more attention-economy eyeballs for ads.

0xEF
3 replies
21h4m

This will happen. These features are always ushered in as ways to make someone's life easier, and often that is exactly what it does, for a time, before some product manager figures out how they can maximize profit with it.

Growth at all costs, I guess.

cqqxo4zV46cp
0 replies
19h32m

Don’t say “I guess” as if you aren’t the one making the rather baseless accusation. What other accessibility features have been abused?

bentruyman
0 replies
13h43m

Can you name a single accessibility feature where this has happened ever? Kinda seems like you just made up some fake reality.

astrange
0 replies
15h8m

Apple doesn't have product managers. (More importantly, the hardware has been technically capable of eye tracking since Face ID was added.)

tambourine_man
0 replies
19h17m

That’s a profound and surprising insight. You’re absolutely correct.

lucb1e
0 replies
20h6m

Wouldn't a lot of the companies that build in accessibility do it from a viewpoint of gaining an even wider reach and/or a better public image?

I don't see optimizing for that as bad. If they think we'll love the product more by making it better for a given audience, especially if I'm in that audience, I'm happy. Does that mean this company now gets richer? Perhaps, and that's fine by me

hanniabu
0 replies
18h11m

Accessibility features can be used to steal attention too

ErigmolCt
0 replies
22h11m

Accessibility features stand out as user-centric developments, love that

Shank
22 replies
1d

> Vehicle Motion Cues is a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles.

This excites me so, so much! I can't really use my phone as a passenger in a car without getting motion sick after 1-2 minutes. This seems like it might be a promising thing to try.

droopyEyelids
15 replies
1d

Have you noticed any correlation between how hungry you are and how fast motion sickness kicks in?

astrange
8 replies
23h14m

I'm not sure why, but I feel like I only get motion sickness in the back of Priuses. It must be something about their braking curve.

I don't sit in enough EVs to tell if they're the same.

rootusrootus
2 replies
22h13m

Some people never really learn how to use one-pedal driving, so they end up just going back and forth between accelerating and decelerating. That'll make me motion sick in a hurry, and I bet that is fairly universal (among people prone to motion sickness in cars, that is). So in that sense, any EV or hybrid is potentially a problem, depending on the driver.

teekert
1 reply
9h14m

Ah yes, I never get motion sick, except for when I'm in the car with just such a driver: A person with sine-foot.

yurishimo
0 replies
6h34m

This is giving me flashbacks to the time I took an Uber home from the airport in Dallas. 25 minutes of an older gentleman (65ish) just modulating between gas pedal and brake pedal the entire time. It was awful, and I wish I wasn't such a coward at the time and had told him after he dropped me off.

121789
2 replies
22h18m

Teslas are especially bad for me. I think it’s the rough suspension and fast acceleration/deceleration

rootusrootus
1 reply
22h10m

The instant-on power and braking takes some getting used to. For the folks who have trouble mastering it, my recommendation is chill mode. It has a much softer acceleration profile, mostly eliminating the harsh starts you might be experiencing.

underdeserver
0 replies
4h22m

Chill mode is all upside in my book. There's still a ton of power when you need it, it's easier on the tires (and thus your wallet), and you get jerked around less.

yc-kraln
0 replies
22h15m

Toyota's hybrids are the worst. I never get motion sick except as a passenger in any Toyota hybrid

kylehotchkiss
0 replies
20h29m

I suspect most people's interactions with Priuses are Uber rides. Maybe Uber drivers just pick up bad habits from the platform incentives (drive fast = get more rides).

Shank
2 replies
23h52m

It’s really interesting you say this. Is this a known correlation? I feel like now that you mention it, it’s incredibly fast if I’m hungry.

toast0
0 replies
23h38m

I went on a cruise, and had significant (for me) motion sickness that only got better once I ate --- of course, I was avoiding eating because I didn't feel well, so that seems like the wrong choice.

KolmogorovComp
0 replies
23h22m

It is a known correlation.

deinonychus
0 replies
20h42m

Yes, sort of. I don’t necessarily have to feel hungry but if I’m on an empty stomach or just haven’t eaten in a while, the odds I get motion sickness are much higher.

If I’m riding somewhere to go get dinner, I have to sit in the front passenger seat. After dinner with a full belly? Throw me in the back and I’ll be fine.

danaris
0 replies
21h46m

I regularly drive two family members around—one gets motion sick much faster and more frequently when hungry, while the other gets motion sick the same either way.

Does make me wonder what the difference is there.

ErigmolCt
0 replies
22h3m

I have not. For me, it does not matter. The ride begins - the motion sickness kicks in

jncfhnb
3 replies
4h30m

Vaguely related anecdote.

I used to get bad nausea from aggressive physics-y VR games. But I heard people claim it can be grinded through. So I did that, and they were right. I can do VR games without needing to vomit, although it’s still uncomfortable.

However… I am now much more sensitive to non VR motion sickness. :|

tashoecraft
1 reply
4h14m

I played games my whole life and was shocked I had near instant VR motion sickness in sim racing. Can confirm it can be grinded through, recognize the feelings and stop immediately.

cathalc
0 replies
3h23m

Very similar experience. My instinct would be to fight the sickness and push through, but in reality you need to stop immediately and try again in a few hours. Your tolerance tends to build exponentially!

sumtechguy
0 replies
3h39m

I have had good luck with just closing one eye. But that is very tiring to do for long periods.

ErigmolCt
0 replies
22h6m

I have motion sickness... It's so hard to move around, and I am still not able to find what works best for me.

yreg
20 replies
20h21m

“Accessibility is for everyone, including you, if you live long enough. And the alternative is worse. So your choice is death or you are going to use accessibility features.” – Siracusa

jiggawatts
18 replies
19h50m

I aimed for the upvote button, but they’re so tiny that my fat finger hit the downvote button by accident, and then I had to retry the action. This is what people mean when they say accessibility is for everyone, all of the time.

pseudosavant
9 replies
19h26m

I have a tremor and I run into this issue on HN all the time. I need to zoom in a lot to be sure I'll hit it.

datahack
4 replies
19h14m

I don’t (yet) have accessibility challenges beyond glasses but hitting the tiny arrows is incredibly difficult. How come HN doesn’t update to be more accessible? It’s been a long time… I’m surprised it hasn’t been talked about by the team there.

ipqk
1 reply
19h4m

Do you think Paul Graham or Garry Tan give a shit about accessibility?

dgfitz
0 replies
18h21m

No.

avtar
1 reply
18h56m

As someone who has carried out accessibility audits, I can unfortunately attest to this topic being a blind spot in tech circles. I remember hanging out with fairly senior frontend devs from a FAANG company who didn't know what purpose skip links served on websites. It can also be an uphill battle to advocate for remediation once design and development work is already baked in.

brookst
0 replies
15h2m

Yep. And I think there’s an interesting implicit bias where younger / more junior developers often get tasked with things like that, so they see no problem.

avtar
2 replies
19h16m

Try https://www.modernhn.com if you haven't already. UI elements have more spacing around them, especially if zoomed in.

ebcase
1 reply
16h29m

Hadn’t heard of this before, it looks great! Need it on mobile though, and would be happy to pay a reasonable fee for it.

llm_trw
0 replies
14h13m

You can install the add-on on mobile Firefox.

Ironically, because of add-ons I now use Firefox exclusively on mobile.

spike021
0 replies
16h31m

Same here with essential tremor. Rarely do people think of us.

vasco
4 replies
19h47m

Zoom the website in, the browser has accessibility built in and hackernews zooms in fairly well.

Edit: I seem to be missing something, as this is getting downvoted. I genuinely cannot use HN under 150% zoom, so I thought this was a basic comment I was making.

uoaei
3 replies
19h30m

Accessibility isn't just about possibility, it's about ergonomics.

You could integrate a differential equation by lining rocks up in a large desert as a computer, but you wouldn't say that solution is "accessible" to a human in the same way it would be with a CPU, a monitor, and functioning sight.

vasco
2 replies
19h24m

Ah, so people are annoyed that the zoom isn't appropriate by default; I get it. Thanks, I was getting extremely confused. That being said, I use different zoom levels for different websites, and the browser remembers them for me; I like how the feature works right now, and I have loads of myopia.

If people made it "big enough" to be inclusive by default on some websites, I'd have to zoom out as well. So my point is that to me it's more important to zoom in correctly (many websites don't) than to be "big enough" to start with.

uoaei
0 replies
18h1m

I agree about zoom levels on different websites when on desktop. Zoom works differently on mobile vs desktop which is its own challenge as far as ergonomic use. On mobile, to get the up/downvote buttons to the size I want them, I'd have to scroll side-to-side to read comments.

Tijdreiziger
0 replies
15h27m

I mean, even at 175% zoom, the vote arrows are pretty close together on a touchscreen. And I’m barely 30.

I always end up double-checking whether the ‘unvote’ link is present. If it says ‘undown’, I know I’ve made a mistake (or vice versa).

contravariant
1 reply
19h3m

I've hidden and reported so many posts on the front page.

I can only hope that the algorithms take into account that there's a decent chance someone trying to hit a link or access the comments will accidentally report a post instead.

furyofantares
0 replies
18h41m

Every once in a while I wonder where a post disappeared to. Eventually I worked out I was accidentally hiding them, and found dozens of hidden posts I was interested in when I looked at my profile.

I still do it all the time. It's a problem on both desktop and mobile. I've sent mail about it to dang before and did get a response; it's definitely a known issue.

rkagerer
0 replies
16h21m

This may be an unpopular opinion but I'm just happy zoom works as expected on this site and don't mind if once in a while I have to correct a misvote.

I appreciate the information density, and that HN hasn't adopted some "contemporary" UI that's all whitespace and animations.

(And yes I agree the buttons are awkward)

devinprater
13 replies
1d2h

I wonder what new voices will be added to VoiceOver? We blind people never, ever thought Eloquence, an old TTS engine from 20 years ago now, would ever come to iOS. And yet, here it is in iOS 17. I wouldn't be surprised to see DECtalk, or more Siri voices. More Braille features are amazing, and they even mentioned the Mac! VoiceOver for Mac is notoriously never given as much love as VoiceOver for iOS, so most blind people still use Windows, even though they have iPhones.

I was expecting to see much better image descriptions, but they've already announced a ton of new stuff for plenty of other disabilities. Having haptic music will be awesome even for me, adding another sense to the music. There is just so much new accessibility stuff, and I can't wait to see what all is really new in VoiceOver, since there's always new things not talked about at WWDC or in release notes. I'm hoping that, one day, we get a tutorial for VoiceOver, like TalkBack on Android has, since there are so many commands, gestures, and settings that a new user never learns unless they learn to learn about them.

bombcar
7 replies
1d1h

The image description stuff is already surprisingly good - I noticed when I got a photo text while driving and it described it well enough for me to know what it was.

skunkworker
6 replies
23h24m

Same, a family member sent a photo while I was driving and over CarPlay it was described fairly accurately.

slau
5 replies
21h40m

It’s sometimes awesome, and often extremely basic. “Contact sent you a picture of a group of people at an airport”. Amazing. “Contact sent you a screenshot of a social media post”. Useless. We know iOS can select text in pictures, so Siri can clearly read it. It knows it’s SoMe, so why not give me the headline?

zimpenfish
2 replies
12h56m

> It knows it’s SoMe, so why not give me the headline?

There's certainly a non-zero number of times this could go horribly wrong (I'm guessing CarPlay doesn't have any sense of who is in the car with you) and defaulting to not reading them out is the safest option (but they could definitely add a toggle for this, yeah.)

slau
1 reply
9h54m

The same is true with reading out text messages. I’ve disabled it for CarPlay now after receiving a mildly raunchy text with a car full of colleagues. It’s still useful on the headphones though.

withinboredom
0 replies
4h17m

Only if you have apple headphones though. If you've got some other thing, for some reason, it doesn't know how to tell you anything.

wwilim
0 replies
5h18m

"Wife sent you a photo of a nude woman in the shower"

MBCook
0 replies
15h26m

I’m hoping this shows up in iOS 18.

asadotzler
3 replies
20h34m

My friends who use synthetic voices prefer the cleanliness of the older, familiar voices. One friend listens at about 900 WPM in skim mode, and none of the more realistic voices work well at those rates.

coldpie
1 reply
3h18m

Every once in a while I'll hear a blind person's phone audio while I'm out and about and it sounds like an unintelligible stream of noises, but they're interacting with it and using it and getting information from it. It's amazing, a whole other world of interaction from what I experience with my phone. I kind of want to learn how they're interacting with it.

cancerhacker
0 replies
5h12m

Back in the 90s we reverse engineered / patched the classic MacInTalk to speed up playback. Our testers had it cranked so fast as they navigated the UI that to me it sounded more like musical notes than text.

gield
0 replies
7h21m

> I'm hoping that, one day, we get a tutorial for VoiceOver

Maybe it's not feasible for you but if you're ever near an Apple Store, you could definitely call them and ask whether they have an accessibility expert you could talk to. In the Barcelona Apple Store for example, there is a blind employee who is an expert at using Apple's accessibility features on all their devices. He loves explaining his tips and tricks to anyone who needs to.

Baeocystin
12 replies
16h8m

One of the most important accessibility features they could bring back is a physical home button on at least one iPad.

I am completely serious. I work with a lot of older adults, many of whom have cognitive challenges, and the lack of a physical button to press as an escape hatch when they get confused is the #1 stumbling block by a mile.

lowkey
6 replies
15h52m

I agree. I have a last gen base iPad with Touch ID and a home button. I am pretty tech savvy but actually prefer this form factor.

foooorsyth
5 replies
15h32m

Touch ID is so drastically superior to Face ID in so many common “iPad scenarios”, e.g. lying in bed with your face partially obscured.

I don’t understand Apple’s intense internal focus on Face ID only. Face ID with Touch ID as an option, for when a device is flat on the table or when your face is obscured, is so much nicer.

sleepingreset
1 reply
15h26m

FaceID builds profitable user habit loops. More real-estate on the screen to show things, easier to thoughtlessly purchase things if your password is a glance at the camera, etc.

I don't think this is a user-focused decision, I believe it's a profit-focused one.

astrange
0 replies
15h13m

You can't buy things just by looking at the camera, there are forced button presses and the App Store tends to make you enter your password.

mortenjorck
1 reply
14h48m

I actually can’t use Touch ID, at least year-round.

I have naturally very dry skin, and no matter how much moisturizer I used in colder months, I had to constantly have it relearn my fingerprint, to the point where I would just give up and use a passcode instead.

Face ID, on the other hand, has been flawless since my first phone with it in 2018 (minus some mask-related challenges during the pandemic). It and ProMotion are basically the two reasons I bought an iPad Pro over an iPad Air.

aikinai
0 replies
14h14m

My mother had the same issue. TouchID would stop recognizing her finger less than a day after setting it up.

diebeforei485
0 replies
15h14m

> I don’t understand Apple’s intense internal focus on Face ID only.

It isn't. Just a few days ago, the new iPad Air was announced with Touch ID.

Only iPad Pro uses Face ID. For iPad Pro users who use it for work and unlock it hundreds of times a day when in an office or during a commute, Face ID is vastly superior.

astrange
3 replies
15h14m

When there were physical buttons, it was very popular in Asia to enable the accessibility option to put a virtual one on screen instead, because they were afraid the physical one would break. So it was kind of useless in the end.

diebeforei485
2 replies
15h12m

> because they were afraid the physical one would break

On early models it was actually quite common for the button to stop working after a year or two. The durability has since improved, but habits die hard.

astrange
0 replies
13h57m

It actually improved by no longer being a real button; instead it was a fake indent with a pressure sensor that did a haptic "click". But it still took up a lot of space on the front.

Terr_
0 replies
14h26m

I have a pet theory that certain older folks are cautious with technology because they grew up with stuff that would break expensively if you pressed the buttons in the wrong order.

Then when they see younger people just randomly explore-clicking to get things working--because their experience is that it's a safe tactic--that can get misinterpreted as expertise, leading to: "Wow, kids these days just know technology."

JimDabell
0 replies
15h40m

It’s not physical, but if the problem is that they need something visible rather than a gesture, you can put an always-on menu button on the screen that has a home button in it.

https://support.apple.com/en-sg/111794

dagmx
11 replies
1d3h

This is one major advantage to Apple sharing a foundation across all their devices. Vision Pro introduced eye tracking to their systems as a new input modality, and now it trickles down to their other platforms.

I am surprised CarPlay didn’t have voice control before this though.

simlevesque
4 replies
1d

CarPlay devices aren't really powerful.

andrewmunsell
2 replies
1d

CarPlay is rendered by the phone itself, so it's not strictly a function of how powerful the car infotainment is. You've been able to talk to Siri since the beginning of CarPlay, so additional voice control is really just an accessibility thing.

joezydeco
1 reply
23h27m

Some cars already have a voice control button on the wheel for their existing system which, if done correctly, is overridden by Siri+CarPlay. Which is really nice when it works.

dhosek
0 replies
4h50m

I’m able to control CarPlay by just saying “Hey Siri.” Siri’s abilities tend to fluctuate based on what Apple is doing on the server, the phase of the moon, and whether I remembered to sacrifice a chicken that morning, but otherwise, it seems to work fine.

Tagbert
0 replies
23h53m

CarPlay devices (car components) are essentially playing a streaming video of a hidden display generated by the phone. CarPlay also lets those devices send back touch events to trigger buttons and other interactions. Very little processing is done on the vehicle.

BTW if you are plugged in to CarPlay and take a screen shot, it will include the hidden CarPlay screen.

jsheard
4 replies
1d

Is this really likely to be downstream of the Vision Pro implementation? I would think that eye-tracking with specialized hardware at a fixed position very close to the eye is very different to doing it at a distance with a general purpose front facing camera.

dmicah
1 reply
1d

Typically eye-trackers work by illuminating the eyes with near infrared light and using infrared cameras. This creates a higher contrast image of the pupils, etc. I assume Apple is doing this in the Vision Pro. Eye-tracking can also be done with just visible light, though. Apple has the benefit of knowing where all the user interface elements are on screen, so eye-tracking in this on the iPhone or iPad doesn't need to be high precision. Knowledge of the position of the items can help to reduce the uncertainty of what is being fixated on.
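
As a rough illustration of that last point, snapping a noisy gaze estimate to a known element can be as simple as a nearest-target search. A hypothetical sketch (the type, names, and tolerance are made up, not an Apple API):

    import Foundation
    import CoreGraphics

    // Hypothetical "snap the gaze to a known element" sketch. The system
    // knows every target's frame, so a low-precision gaze estimate only has
    // to pick the nearest plausible one.
    struct GazeSnapper {
        let targets: [CGRect]   // frames of on-screen elements, in points

        func snappedTarget(for gaze: CGPoint, tolerance: CGFloat = 60) -> CGRect? {
            // Target whose center is closest to the raw gaze estimate.
            let best = targets.min { distance(gaze, to: $0) < distance(gaze, to: $1) }
            // Only snap when the gaze is plausibly near it at all.
            guard let hit = best, distance(gaze, to: hit) <= tolerance else { return nil }
            return hit
        }

        private func distance(_ p: CGPoint, to rect: CGRect) -> CGFloat {
            hypot(p.x - rect.midX, p.y - rect.midY)
        }
    }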

neverokay
0 replies
23h58m

So there isn’t much more to it than getting a good resolution image of the eye from any distance, every millisecond.

Precision is the issue: we are mostly moving our eyes in about 8 directions, and we don't know how to measure the focusing of the eye's lens with a camera yet (unless that too is just a matter of getting a picture).

Squinting would be the closest thing to physically expressing focusing. So the camera needs to know I’m looking left with my eye, followed by a squint to achieve precision. Seems stressful though.

Gonna need AI just to do noise cancelling of the involuntary things your eyes do, like pupil dilation and blinking.

iloveyouocean
0 replies
1d

The 'tracking eyes' part is different, but once you have eye position data, the 'how the eyes interact with the interface' could be very similar.

dagmx
0 replies
16h50m

Not the hardware side, but the software side. Implementing all the various behaviours and security models for eye tracking and baking them into SwiftUI means it translates over more easily once they figure out the hardware aspect. But the iPads and iPhones with Face ID have had eye tracking capabilities for a while, just not used in the UI.

callwhendone
0 replies
1d1h

It would be amazing if it gets carried over to the Mac.

roughly
9 replies
1d

Accessibility settings are really a gold mine on iOS for device customization (yes, I agree, they shouldn’t be limited to accessibility).

I’m particularly interested in the motion cues and the color filters for CarPlay - I have color filters set up to enable every night as kind of a turbo Night Shift mode (deep orange-red color shift), and would love to do the same for CarPlay.

I also completely forgot iOS had a magnifier built in!

ryandrake
5 replies
1d

Accessibility features tend to be superpowers though, and I'm glad Apple gates them behind permissions and opt-ins. We all know of applications that try to trick the user into granting them inappropriate access to the device through the Accessibility APIs. I think Dropbox still begs you to grant it Accessibility access so its tendrils can do who-knows-what to your system.

With great power comes great responsibility.

burntwater
3 replies
20h34m

Guaranteed that marketers are salivating at the idea of eye tracking in apps and websites. It's an amazing feature that absolutely needs to be gatekept.

klausa
2 replies
11h2m

I wonder if it'll use the same architecture as visionOS, where the vision tracking events and UI affordances are processed and composited out-of-process, with the app never seeing them.

nindalf
1 reply
8h3m

That's probably how it'll go because it's the path of least resistance. A button will already have a listener for tap, so the OS translates the vision tracking into a "tap" and triggers the relevant code. There's no point telling the app about vision tracking because apps wouldn't already have a handler for that event. And for privacy reasons, there's no need to start now.
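
As a toy sketch of that translation layer (all names hypothetical; the real out-of-process compositor is far more involved): dwell on one target long enough and the app simply receives an ordinary tap.

    import Foundation
    import CoreGraphics

    // Toy "gaze -> tap" layer. The app never sees gaze data; it only receives
    // the ordinary tap this layer synthesizes after a sustained dwell.
    final class DwellSelector {
        private var currentTarget: CGRect?
        private var dwellStart: Date?
        private let dwellTime: TimeInterval = 0.8   // assumed dwell threshold
        private let synthesizeTap: (CGPoint) -> Void

        init(synthesizeTap: @escaping (CGPoint) -> Void) {
            self.synthesizeTap = synthesizeTap
        }

        // Called by the (hypothetical) system layer with whichever element the
        // gaze currently rests on, or nil when the user looks elsewhere.
        func update(target: CGRect?) {
            guard let target = target else {
                currentTarget = nil
                dwellStart = nil
                return
            }
            if currentTarget != target {
                currentTarget = target     // gaze moved to a new element
                dwellStart = Date()
                return
            }
            if let start = dwellStart, Date().timeIntervalSince(start) >= dwellTime {
                // Dwell satisfied: deliver a plain tap at the element's center.
                synthesizeTap(CGPoint(x: target.midX, y: target.midY))
                dwellStart = nil           // require a fresh dwell for the next tap
            }
        }
    }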

klausa
0 replies
7h44m

iPadOS has hover states and pointer events; those could arguably be triggered by eye tracking.

roughly
0 replies
21h48m

It varies. Things like keyboard control, absolutely, but mostly I've used it for stuff like "don't make an animated transition every time I change pages like an overcaffeinated George Lucas" or "actually make Night Shift shift enough to be useful at night". I also use the background sounds to augment noise cancellation while taking a nap. All of those are just useful things or personal settings, not necessarily attack vectors.

gnicholas
0 replies
13h8m

My favorite is the "allow ANC with just one AirPod in". I have no idea why this would be an accessibility feature. If I turn on ANC, then I don't want it to be disabled just because I'm listening with one ear!

astrange
0 replies
23h12m

Well, they aren't really limited to accessibility, but they are hidden there. It's sort of like a convenient excuse to get UI designers off your back if you want to ship customization.

SSLy
0 replies
10h48m

FYI you can also make a turbo Night Shift by scheduling a toggle of Reduce White Point, yep, in accessibility settings.

dakial1
9 replies
23h48m

Eye tracking is not an accessibility feature, it is an advertising optimization feature disguised as an accessibility feature.

devmor
3 replies
23h47m

Eye tracking is absolutely an accessibility feature. Just because you don't need it, and it can be abused, does not mean it isn't an absolutely game changing feature for some people.

nofunsir
2 replies
23h26m

He didn't say it wasn't an accessibility feature, just that it was disguised as one.

Just because it's not a game changing feature for some people, doesn't mean its primary function isn't advertising.

yreg
0 replies
20h27m

The user did say verbatim “Eye tracking is not an accessibility feature”

devmor
0 replies
22h45m

Something being disguised as something explicitly implies it is not that thing.

jcotton42
1 reply
23h27m

My presumption is that apps will not be able to access this data, at least without some sort of permission gate.

sroussey
0 replies
23h17m

Indeed they haven’t, for all the years it was limited to Face ID’s Attention option.

simonw
0 replies
23h19m

Apple are very good about not making this kind of thing available to apps that don't have an explicit reason to need it.

joshstrange
0 replies
21h33m

You can say a lot of negative true things about Apple, but this is just silly. There is no way Apple is going to expose that data to the underlying apps, in the same way they refused to do it in Vision Pro. I'd bet a good bit of money it works the same way: it's a layer on top of the app that the app can't access, and from the video it looks like that's exactly how it works.

Aloisius
0 replies
20h46m

It allows people with ALS to navigate their device with their eyes.

Microsoft added a similar feature to Windows about seven years ago.

MagicMoonlight
7 replies
7h13m

This is why Apple is the best. They’ll make features like this because it’s the right thing to do, even if it won’t really make any difference to 99.99% of their customers.

They really earn consumer loyalty.

s_dev
1 reply
4h59m

The accessibility features are also useful for automated UI testing.

> They’ll make features like this because it’s the right thing to do

While the positivity is great to see, I'd temper the expectation that they're simply doing the right thing; they're definitely acting in their perceived interest. People can and often do the right thing -- large companies rarely do.

huijzer
0 replies
4h13m

> People can and often do the right thing -- large companies rarely do.

There are companies which try (and succeed) to be honest "citizens". Berkshire and Costco have quite good reputations.

cainxinth
1 reply
5h59m

This was the right thing to do, but I doubt “the right thing to do” was the primary motivator for it. This is smart marketing that helps position them as the more caring, inclusive, and capable tech brand. It also expands their reach more than a fraction of a percent. 13% of Americans have some form of disability.

bdzr
0 replies
5h11m

I feel like it also helps them get an edge in the healthcare sector, which has buckets of money in the mix.

withinboredom
0 replies
4h14m

Do you realize how bad Apple's accessibility issues were before? This is just marketing and I doubt many people are going to drop thousands of bucks to ditch years of tooling they're already intimately familiar with (aka, Windows), but this is an effort to try and entice some people. That's all it is. Marketing.

ncgl
0 replies
2h54m

I read this very skeptically.

When I hear eye tracking I immediately think of advertisers targeting me based on what I look at, NOT quadriplegics using Apple devices.

Maybe I'm a cynic

bloopernova
0 replies
3h56m

I'm hoping they'll add this to macOS too. Eye tracking would be great for end-user UX research.

I'd also like to see what sort of weird stuff people come up with to use eye tracking. Games could use that in interesting ways.

tootie
5 replies
1d

Quibble, but this isn't "eye tracking", it's "gaze tracking". Eye tracking is detecting where your eyes are. Gaze tracking is what you're looking at.

crancher
1 reply
1d

Well then it's both?

tootie
0 replies
22h40m

Could be, but not necessarily. Eye tracking usually means it's tracking the eyes of one or more people in a video. Gaze tracking usually requires your eyes stay pretty steady and close to the tracker.

bogtog
1 reply
18h3m

Where are you getting this language point from? If I look up any company selling "eye trackers", their products are all meant to track where you're looking, e.g., https://www.tobii.com/

tootie
0 replies
15h40m

Interesting. I picked it up working with Tobii devices a few years ago actually. I guess they updated their vocabulary.

astrange
0 replies
23h10m

This is how I feel about "face recognition" (it should mean recognizing whether something is a face or not), but it is common to use eye tracking this way.

s3p
5 replies
1d1h

I wonder if this announcement had anything to do with the bombshells OpenAI and Google dropped this week. Couldn’t this have been part of WWDC next month?

joshstrange
1 reply
21h30m

I think it's more of "clearing the decks" for stuff that didn't make the cut for WWDC. I assume WWDC is going to be all about AI and they couldn't find a good spot to put this announcement. "Clearing the decks" isn't a very kind way to refer to this accessibility tech since Apple has always been better than almost everyone else when it comes to accessibility. I don't see this as "we don't care, just announce it early" as much as "we can't fit this in so let's announce it early".

dhosek
0 replies
4h50m

As noted elsewhere, Apple always does their accessibility announcements in advance of WWDC.

kergonath
0 replies
23h23m

They do it every year at the same time. Also, it’s a small announcement, not a keynote or the kind of fanfare we have at WWDC or the September events. This does not seem calibrated to be effective in an advertising war with another company. All this to say, probably not.

Tagbert
0 replies
23h57m

As Terramex pointed out, this is tied to a particular, relevant event.

It’s also pretty common for Apple to preannounce some smaller features that are too specialized to be featured in the WWDC announcements. This gives them some attention when they would be lost and buried in WWDC footnotes.

It is also probably an indication that WWDC will be full of new features and only the most impactful will be part of the keynote.

anyfoo
5 replies
17h49m

I have better than 20/20 vision (yes, really) and no mobility problems, but there are some macOS accessibility features that I love.

One is Zoom: I hold down two modifier keys and scroll, and can instantly zoom in to any part of the screen. It is extremely smooth; the more you scroll, the higher you zoom, instantly and at a high frame rate. Great when you want to show someone something, or just zoom into a detail of something if the app doesn’t support it or is too cumbersome. Or “pseudo-fullscreen” something.

The other one is three finger drag on the trackpad. Why that isn’t default is beyond me. It means that if you use three fingers, any dragging on the trackpad behaves as if you held the button down. It’s so convenient.

eviks
2 replies
15h35m

It's better to have an easy way to hold a mouse button via the keyboard with your left hand and keep using one finger on the touchpad, rather than do the whole three-finger drag.

anyfoo
1 reply
15h23m

Not for me, no.

But sounds like something accessibility options may also provide, and I can see how it may be better for a lot of people.

eviks
0 replies
14h52m

Which one have you tried?

simonbarker87
0 replies
11h22m

The default way to drag on a trackpad is something I’ve never gotten good with so I always enable drag lock the second I get a new laptop. Ideally I would switch to the three finger gesture but after 15 years of drag lock I just can’t get my brain to switch over.

floydnoel
0 replies
4h44m

three finger dragger for life here. whenever I see colleagues struggling to drag items on a Mac, I show them three finger dragging and it blows them away. total game-changer!

Ocha
4 replies
23h22m

Music haptics can be a cool way to teach someone how to dance and “feel the beat”

burntwater
1 reply
20h18m

I'm severely hearing impaired and enjoy going to dance classes - swing, salsa, etc. If I'm standing still, I can easily tune into the beat. But once I start moving, I quickly lose it on many songs; dance studios aren't known for having large sound systems with substantial bass. I don't know that this specific setup would fix anything -- it would need some way of syncing to the instructor's iPhone that is connected via bluetooth to the studio's little portable speaker. But it's a step in the right direction.

While on the topic, I can hear music but almost never understand lyrics; at best I might catch the key chorus phrase (example: the words "born in the USA" are literally the only words I understand in that song).

A few months ago I discovered the "karaoke" feature on Apple Music, in which it displays the lyrics in time with the music. This has been game-changing for me. I'm catching up on decades' worth of music where I never had any idea what the lyrics were (filthy, that's what they are. So many songs about sex!). It has made exercising on the treadmill, elliptical, etc. actually enjoyable.

pests
0 replies
14h31m

> A few months ago I discovered the "karaoke" feature on Apple Music, in which it displays the lyrics in time with the music.

JSYK Spotify has had this for years too; it's just under the "View Lyrics" button, but it does highlight the sentence/word karaoke-style. It used to be a "Spotify app" back in that genre of the service.

NeuroCoder
0 replies
23h9m

There's a lot of interesting things we can do with haptics since they're relatively cheap to put in stuff. Hopefully accessibility gets the software and applications further along soon

ErigmolCt
0 replies
21h59m

Using haptics in music to enhance rhythm perception and dance skills. Sounds really cool!

simonw
3 replies
23h20m

macOS has had a version of eye tracking for a while, it's really fun to try out.

System preferences -> Accessibility -> Pointer Control

Then turn on the "Head pointer" option.

kridsdale1
0 replies
20h33m

VisionOS has this too. I mapped it to a triple click of the button for when eye tracking becomes inaccurate.

adriancooney
0 replies
3h20m

Cool. I'm a bit unsettled that my camera's green dot didn't turn on for it though.

RaoulP
0 replies
4h27m

Cool! That works surprisingly well. But how do you click while using this? Clicking on the trackpad doesn't work, when it's tracking my head.

tallytarik
2 replies
21h28m

“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO.

Does this line actually mean anything? Press releases are so weird.

sussmannbaka
0 replies
12h26m

Sure it means something, it’s pretty low on the PR talk scale even. Let’s see:

> We believe deeply in

"We", in this case, refers to Apple and as a company can’t believe in anything, it specifically refers to their employees.

> the transformative power of innovation

A new feature will bring forth change

> to enrich lives

In order to make someone’s life better.

So far so good, at least no synergies being leveraged for 10x buzzword transformation. Let’s see if it passes the test: There’s a bunch of new accessibility features, that certainly fits the bill for innovation. As anecdata, at least one of these will drastically change how I interact with my devices, so there’s the transformative power. I will be able to use these new accessibility features to improve both the way I work and how I privately use technology, so one could argue my life is being positively enriched by this transformative power brought about by innovation. Do they "believe deeply" in this? That is only for Tim Cook and his employees to know, but they’re one of the few companies pushing advances in accessibility past the finish line, so they might!

astrange
0 replies
14h59m

I always wonder why press releases include claims that a person said a paragraph or more of text that no one would ever say out loud, but this one actually does sound like something he'd say. In a keynote video, anyway.

myth_drannon
2 replies
1d

Eye-gaze devices (tablet with camera + software) cost around $20K; even if this offers 1/4 of the features, it's good news for those who can't afford one.

asadotzler
1 reply
20h20m

Don't be ridiculous. Solid hardware and software combos for Windows cost a small fraction of that. The convenient and decent Tobii PCEye costs like $1,250 and a very nice TMS5 mini is under $2,000. Your bullshit was off by at least an order of magnitude.

boneghost
0 replies
19h7m

Let's be fair and compare similar products. Do you have any examples of $2000 mobile devices that support eye tracking on the OS level? The products you mention look like they're extra hardware you strap to a Windows PC. Certainly useful, but not quite as potentially life-changing as having that built into your phone.

gpm
2 replies
20h7m

I wonder if the Vision Pro has enough microphones to do the acoustic camera thing? If so you could plausibly get "speech bubbles over people's heads", accurately identifying who said what.

I imagine that could be pretty awesome for deaf people.

stubish
1 replies
10h30m

Deaf people I know certainly wouldn't want to be wearing a headset, say, in a restaurant. But a phone or tablet app that can do live speech-to-text would be good enough in many cases, if it can separate voices into streams. Anything like this available?

gpm
0 replies
3h16m

I don't want to say it's certainly not available, but I doubt it. Maybe someone has done something trying to recognize different voices by differences in pitch/volume/...

The Vision Pro has six mics, which likely enables it to locate the sources of sound in space by the amount of time it takes a sound to reach each mic (i.e. acting as an "acoustic camera"). Tablets and phones aren't going to have that hardware, unfortunately.
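
To make the "acoustic camera" idea concrete, here's a rough sketch of its core building block: estimating the time difference of arrival (TDOA) between two mics with a brute-force cross-correlation. Everything below (the chirp, sample rate, delay) is synthetic and purely illustrative; a real system would use calibrated mic positions and an FFT-based correlator:

    // Find the lag (in samples) at which `sig` best matches `ref`.
    // A positive lag means the sound reached `sig`'s mic later.
    import Foundation

    func estimateDelay(sig: [Double], ref: [Double], maxLag: Int) -> Int {
        var bestLag = 0
        var bestScore = -Double.infinity
        for lag in -maxLag...maxLag {
            var score = 0.0
            for i in 0..<ref.count {
                let j = i + lag
                if j >= 0 && j < sig.count { score += sig[j] * ref[i] }
            }
            if score > bestScore { bestScore = score; bestLag = lag }
        }
        return bestLag
    }

    // Synthetic test: the same up-chirp reaches mic B 24 samples
    // (~0.5 ms at 48 kHz) after it reaches mic A.
    let fs = 48_000.0
    let chirp = (0..<4_800).map { (i: Int) -> Double in
        let t = Double(i) / fs
        return sin(2 * Double.pi * (440 + 2_000 * t) * t)
    }
    let micA = chirp + Array(repeating: 0.0, count: 24)
    let micB = Array(repeating: 0.0, count: 24) + chirp
    print(estimateDelay(sig: micB, ref: micA, maxLag: 100))  // ~24

With four or more mics, several pairwise delays are enough to triangulate a source's position in 3D, which is what would let you pin a caption to the right speaker.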

And yeah, obviously wearing a headset is a pretty big compromise that's not always (or maybe even often) going to be worth it. This was more of a thought of "the hardware can probably do this cool thing for people already using it" than "people should use the hardware so it can do the cool thing".

fitsumbelay
2 replies
23h49m

my first thought regarding eye-tracking: "whose accessibility to what|whom?"

devmor
1 replies
23h47m

Do you have another idea beyond the accessibility of users without fine motor control to the phone's screen?

smegsicle
0 replies
21h59m

which ads you are looking at

sandspar
1 replies
8h34m

It's funny how the post is about enhanced surveillance technology and the text sentiment of the comment thread is overwhelmingly positive.

Surveillance technology: >:(

Surveillance technology with the word "accessibility" in the title: :)

kalkr
0 replies
6h45m

of course it’s surveillance technology.

an iPhone constantly broadcasts your location to third parties, can deduce where you work and live, understands the unique shape of your face, has a built-in microphone and multiple kinds of cameras, stores all of your conversations, stores all of your photos, stores your health information, and can figure out when you go to bed... all on a completely closed and proprietary operating system.

it’s like asking “why hasn’t anyone mentioned that we’re all using a website right now”

rado
1 replies
23h43m

They keep announcing these bombastic accessibility features while simple things like tabbing remain frustratingly broken. The macOS “allow this app to access that” dialog supports shift+tab, but not tab.

pcl
1 replies
8h31m

Bummer the eye tracking is iOS only. I’ve been wanting focus-follows-eyes for decades.

npongratz
0 replies
1h54m

I share the sentiment. I've long noticed, in situations with grids of terminal windows, that focus-follows-eyes would be so much faster and present less friction than using hotkeys or focus-follows-mouse.

mdm_
1 replies
22h31m

My first thought upon seeing the Haptic Music feature is to wonder how long until they make compatible headphones and I can relive my high school years, walking around listening to nu-metal and hip-hop with a Panasonic Shockwave walkman.

dhosek
0 replies
4h49m

Surely this could be a headphone feature, possible without any support from the player.

killjoywashere
1 replies
17h25m

Imagine an eye-tracking loupe function on mobile: fit the same amount of text but bubble up the part under foveolar gaze. Save on readers* everywhere.

* readers are those glasses you can pick up at the drug store for $17.99.

skygazer
0 replies
11h24m

It took me a couple read-throughs to understand what you meant, but yes, on-screen gaze-sensitive magnification would be amazing, I agree.

tngranados
0 replies
4h0m

That's basically how the Apple Vision Pro works.

ein0p
1 replies
12h26m

If accurate enough, it seems like eye tracking could be useful beyond accessibility, e.g. to control iPads without dragging one's greasy fingers all over the screen. iPad already supports pointers.

superb_dev
0 replies
12h8m

I was thinking the same; it would make having an iPad under my monitor much less cumbersome.

cush
1 replies
20h43m

All these features look amazing! That car motion sickness feature especially. Can’t wait to try it!

petre
0 replies
20h31m

At least I used to stay off the phone while I was in the car. Not the case now. Thank you, Apple, but I'd rather be sick than look at your phone in a car.

EasyMark
1 replies
21h34m

How can I make sure it's off? Is it off by default?

yreg
0 replies
20h26m

Yes

1oooqooq
1 replies
3h15m

Remember how Google added JS APIs to detect active tabs and rendering intersection? Complex webapps could use less battery and have richer interaction.

Everyone rejoiced. Nobody implemented anything useful.

Soon, advertisers, Google included, were selling ad packages by "viewability". lol.

Can't wait for Safari to lead with "eyed" ads.

sharkjacobs
0 replies
2h49m

I know this is a joke, but for anyone who was wondering,

all data used to set up and control this feature is kept securely on device, and isn’t shared with Apple.

It's not an API which apps can use; it's a system pointer, like a mouse or trackpad.

xyst
0 replies
17h38m

eye tracking feature is interesting … for ad-tech companies

trilorez
0 replies
15h24m

Haptics in the Music app will be great but it’s not exactly “new”, considering my haptic music app has been out for several years now: https://apps.apple.com/us/app/phazr/id1472113449

toasted-subs
0 replies
18h51m

Gotta make sure they get all the eyeballs, then make sure they don't get to date anybody and become homeless.

tap-snap-or-nap
0 replies
17h18m

This will have just enough capability limits and annoyances to convince you to purchase a Vision Pro. It also makes eye tracking more widely accepted, by making it available to everyone on their existing devices.

syngrog66
0 replies
17h34m

While we're on the topic of accessibility, I'd like to point out the following lovely facts:

* on Android, in the built-in popup menu for copypasta, the COPY button is wedged in tightly (by mere millimeters) between the CUT and PASTE buttons on either side of it. A harmless one packed in between two destructive, lossy, and usually irreversible ones.

* the size of an adult human fingertip is no secret (and in fact 99.99% of humans have at least one!)

* Google's staff consists of humans

* Google supposedly hires ONLY "smart" people and their interview process involves some IQ-like tests

* Android has had this obvious anti-pattern for many years now, perhaps 10+?

stacktrust
0 replies
1d1h

iOS 17 Image Descriptions are quite good, but audio descriptions don't seem to work on non-Pro devices, even though text descriptions are being shown on the screen and the audio menu is present and activated. Is that a bug?

Even on Pro devices, audio image descriptions stop working after a few cycles of switching between Image Magnifier and apps/home. This can be fixed by restarting the app and disabling/enabling audio image descriptions, but that breaks the use case when an iPhone is dedicated to running only Image Magnifier + audio descriptions, via remote MDM with no way for the local blind user to restart the app.

On-device iOS image descriptions could be improved if the user could help train the local image recognition by annotating photos or videos with text descriptions. For a blind person, this would enable locally-specific audio descriptions like "bedroom door", "kitchen fridge" or specific food dishes.

Are there other iOS or Android AR apps which offer audio descriptions of live video from the camera?

squeral
0 replies
2h44m

Can't wait for my son to try this. He has general coordination issues due to a brainstem injury, but his eyes are probably the part of the body he can control best. I'm not a fan of Apple's software and didn't have a great experience with the Vision Pro, but I am excited to try it out.

rsingla
0 replies
15h44m

While it's a significant step forward for accessibility, it also invites us to consider how such technologies could integrate into everyday use for all users. This could enhance ease of use and efficiency, but it also requires careful consideration of privacy safeguards.

pmichaud
0 replies
2h47m

I hope this works for people with one eye. There are dozens of us—dozens!

jfoster
0 replies
6h42m

Reminder: eye tracking means an always-on camera.

I'm not fully against it, but it's good to keep in mind the implications.

hprotagonist
0 replies
20h26m

Very curious to see how well eye tracking works behind a motorcycle visor. Thick leather gloves, bike noise, and a touch screen and audio interface are not much fun.

elsonrodriguez
0 replies
3h33m

Didn’t Mark Rober say he worked on some motion sickness stuff at Apple?

devmor
0 replies
23h49m

I am excited for Vocal Cues - my main frustration with Siri is how poorly it comprehends even explicit instructions.

One thing I wish Apple would implement is some kind of gesture control. The camera can detect fine details of face movement; it would be nice if that were leveraged to track hands that aren't touching the screen.

For example, if I have my iPhone or iPad on the desk in front of me, and a push notification that I don't need obstructs the content on the screen, I would love to be able to swipe my hand up towards the phone to dismiss it.
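
The building blocks for this already exist, for what it's worth: Vision's hand-pose detection can find a wrist in camera frames. Below is a hedged sketch of how the swipe-up-to-dismiss idea could be prototyped. VNDetectHumanHandPoseRequest is a real Vision API, but the threshold and the dismissal callback are invented for illustration:

    import Vision
    import CoreVideo

    final class SwipeAwayDetector {
        private var lastWristY: CGFloat?
        var onSwipeUp: (() -> Void)?    // e.g. dismiss the notification banner

        // Feed this one camera frame at a time.
        func process(pixelBuffer: CVPixelBuffer) {
            let request = VNDetectHumanHandPoseRequest()
            request.maximumHandCount = 1
            let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
            try? handler.perform([request])

            guard let hand = request.results?.first,
                  let wrist = try? hand.recognizedPoint(.wrist),
                  wrist.confidence > 0.5 else { return }

            // Vision's normalized coordinates put y = 0 at the bottom,
            // so an upward swipe means y jumps up between frames.
            if let previous = lastWristY, wrist.location.y - previous > 0.15 {
                onSwipeUp?()
                lastWristY = nil
            } else {
                lastWristY = wrist.location.y
            }
        }
    }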

chaostheory
0 replies
18h2m

Nice to see features from Vision Pro make it onto other Apple products

bcx
0 replies
19h51m

This is a good time to remind everyone that tomorrow, May 16th, is Global Accessibility Awareness Day (GAAD) (https://accessibility.day), and that there are over 176 events worldwide celebrating the progress we are all making at improving accessibility in our products -- and plenty of learning opportunities for beginners and experts.

badbart14
0 replies
1d

I definitely get a good amount of motion sickness when using my phone in a car, so I'm super interested in the motion sickness cues and whether they'll work. The dots look like they may get in the way a bit, but I'm willing to take that tradeoff. My current car motion sickness mitigation system is these glasses with liquid in them that supposedly help your ears feel the motion of the car better (and make you look like Harry Potter).

alphabetting
0 replies
23h9m

Google yesterday also open-sourced their accessibility feature for Android and Windows that controls the cursor using head movements and facial gestures:

https://github.com/google/project-gameface

GiorgioG
0 replies
22h14m

My wife is a hospice nurse and from time to time she'll have a patient without any ability to communicate except their eyes (think ALS) - for these folks in their final days/weeks of life this will be a godsend. There are specialized eye-tracking devices, but they're expensive and good luck getting them approved by insurance in time for the folks in need near the end of their lives.

ChrisMarshallNY
0 replies
20h13m

I'm a believer in accessibility features. The difficulty is often in testing.

I use SimDaltonism to test for color-blindness accessibility, and in the last app I wrote, I added a "long press help" feature that responds to long-presses on items by opening a popover containing the accessibility label and hint. Makes testing much easier, and doubles as user help.
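
For anyone who wants the same trick, a minimal UIKit sketch of the idea: reuse the accessibility label and hint that VoiceOver already speaks as visible long-press help. The wiring below is illustrative, not the actual code from the app mentioned above:

    import UIKit

    extension UIViewController {
        @objc func showAccessibilityHelp(_ gesture: UILongPressGestureRecognizer) {
            guard gesture.state == .began, let view = gesture.view else { return }
            // Reuse the strings VoiceOver already reads for this element.
            let alert = UIAlertController(
                title: view.accessibilityLabel ?? "No label",
                message: view.accessibilityHint ?? "No hint",
                preferredStyle: .actionSheet)
            alert.addAction(UIAlertAction(title: "OK", style: .default))
            // On iPad, anchor the sheet to the pressed view as a popover.
            if let popover = alert.popoverPresentationController {
                popover.sourceView = view
                popover.sourceRect = view.bounds
            }
            present(alert, animated: true)
        }
    }

Wiring it up is one line per control: control.addGestureRecognizer(UILongPressGestureRecognizer(target: self, action: #selector(showAccessibilityHelp(_:)))).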

C-Loftus
0 replies
19h3m

This is awesome and I love the UX, although I can't help but feel a bit sad that we need to always rely upon Apple and Microsoft for consumer accessibility.

It would be so great if more resources could be allocated for such things within the Linux ecosystem.