I love accessibility features because they might be the last features developed solely with the benefit of the user in mind. So many other app/os features are designed to steal your attention or gradually nerf usefulness.
Vehicle Motion Cues is a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles.
This excites me so, so much! I can't really use my phone as a passenger in a car without getting motion sick after 1-2 minutes. This seems like it might be a promising thing to try.
Have you noticed any correlation between how hungry you are and how fast motion sickness kicks in?
I'm not sure why, but I feel like I only get motion sickness in the back of Priuses. It must be something about their braking curve.
I don't sit in enough EVs to tell if they're the same.
Some people never really learn how to use one-pedal driving, so they end up just going back and forth between accelerating and decelerating. That'll make me motion sick in a hurry, and I bet that is fairly universal (among people prone to motion sickness in cars, that is). So in that sense, any EV or hybrid is potentially a problem, depending on the driver.
Ah yes, I never get motion sick, except for when I'm in the car with just such a driver: A person with sine-foot.
This is giving me flashbacks to the time I took an Uber home from the airport in Dallas. 25 minutes of an older gentleman (65ish) just modulating between gas pedal and brake pedal the entire time. It was awful, and I wish I hadn't been such a coward at the time and had told him after he dropped me off.
Teslas are especially bad for me. I think it’s the rough suspension and fast acceleration/deceleration
The instant-on power and braking takes some getting used to. For the folks who have trouble mastering it, my recommendation is chill mode. It has a much softer acceleration profile, mostly eliminating the harsh starts you might be experiencing.
Chill mode is all upside in my book. There's still a ton of power when you need it, it's easier on the tires (and thus your wallet), and you get jerked around less.
Toyota's hybrids are the worst. I never get motion sick except as a passenger in any Toyota hybrid
I suspect most people's interactions with Priuses are Uber rides. Maybe Uber drivers just pick up bad habits from the platform incentives (drive fast = get more rides)
It’s really interesting you say this. Is this a known correlation? I feel like now that you mention it, it’s incredibly fast if I’m hungry.
I went on a cruise, and had significant (for me) motion sickness that only got better once I ate --- of course, I was avoiding eating because I didn't feel well, so that seems like the wrong choice.
It is a known correlation.
Yes, sort of. I don’t necessarily have to feel hungry but if I’m on an empty stomach or just haven’t eaten in a while, the odds I get motion sickness are much higher.
If I’m riding somewhere to go get dinner, I have to sit in the front passenger seat. After dinner with a full belly? Throw me in the back and I’ll be fine.
I regularly drive two family members around—one gets motion sick much faster and more frequently when hungry, while the other gets motion sick the same either way.
Does make me wonder what the difference is there.
I have not. For me, it does not matter. The ride begins - the motion sickness kicks in
Vaguely related anecdote.
I used to get bad nausea from aggressive, physics-y VR games. But I heard people claim it can be grinded through. So I did that, and they were right. I can do VR games without needing to vomit, although it’s still uncomfortable.
However… I am now much more sensitive to non VR motion sickness. :|
I played games my whole life and was shocked I had near instant VR motion sickness in sim racing. Can confirm it can be grinded through, recognize the feelings and stop immediately.
Very similar experience. My instinct would be to fight the sickness and push through, but in reality you need to stop immediately and try again in a few hours. Your tolerance tends to build exponentially!
I have had good luck with just closing one eye. But that is very tiring to do for long periods.
A similar app that's on Android: https://play.google.com/store/apps/details?id=com.urbandroid...
Unsure if it actually works though, my personal test results are mixed.
I have motion sickness... It's so hard to move around for me and I am still not able to find what works best for me
Accessibility is for everyone, including you, if you live long enough. And the alternative is worse. So your choice is death or you are going to use accessibility features. – Siracusa
I aimed for the upvote button but they’re so tiny that my fat finger hit the downvote button by accident and then I had to retry the action. This is what people mean when they say accessibility is for everyone, all of the time.
I have a tremor and I run into this issue on HN all the time. I need to zoom in a lot to be sure I'll hit it.
I don’t (yet) have accessibility challenges beyond glasses but hitting the tiny arrows is incredibly difficult. How come HN doesn’t update to be more accessible? It’s been a long time… I’m surprised it hasn’t been talked about by the team there.
Do you think Paul Graham or Garry Tan give a shit about accessibility?
No.
As someone who has carried out accessibility audits, I can unfortunately attest to this topic being a blind spot in tech circles. I remember hanging out with fairly senior frontend devs from a FAANG company who didn't know what purpose skip links served on websites. It can also be an uphill battle to advocate for remediation once design and development work is already baked in.
Yep. And I think there’s an interesting implicit bias where younger / more junior developers often get tasked with things like that, so they see no problem
Try https://www.modernhn.com if you haven't already. UI elements have more spacing around them, especially if zoomed in.
Hadn’t heard of this before, it looks great! Need it on mobile though, and would be happy to pay a reasonable fee for it.
You can install the addon on mobile Firefox.
Ironically, because of addons I now use Firefox exclusively on mobile.
Same here with essential tremor. Rarely do people think of us.
Zoom the website in; the browser has accessibility built in, and Hacker News zooms fairly well.
Edit: I seem to be missing something as this is getting downvoted. I genuinely cannot use HN under 150% zoom so thought this was a basic comment I was making.
Accessibility isn't just about possibility, it's about ergonomics.
You could integrate a differential equation by lining rocks up in a large desert as a computer, but you wouldn't say that solution is "accessible" to a human in the same way it would be with a CPU, a monitor, and functioning sight.
Ah, so people are annoyed that the zoom isn't appropriate by default, I get it. Thanks I was getting extremely confused. That being said I use different zoom levels for different websites, and the browser remembers them for me, I like how the feature works right now, and I have loads of myopia.
If people made it "big enough" to be inclusive by default on some websites, I'd have to zoom out as well. So my point is that, to me, it's more important that zooming in works correctly (on many websites it doesn't) than that things are "big enough" to start with.
I agree about zoom levels on different websites when on desktop. Zoom works differently on mobile vs desktop which is its own challenge as far as ergonomic use. On mobile, to get the up/downvote buttons to the size I want them, I'd have to scroll side-to-side to read comments.
I mean, even at 175% zoom, the vote arrows are pretty close together on a touchscreen. And I’m barely 30.
I always end up double-checking whether the ‘unvote’ link is present. If it says ‘undown’, I know I’ve made a mistake (or vice versa).
I've hidden and reported so many posts on the front page.
I can only hope that the algorithms take into account that there's a decent chance someone trying to hit a link or access the comments will accidentally report a post instead.
Every once in a while I'd wonder where a post had disappeared to. Eventually I worked out I was accidentally hiding them; when I looked at my profile I found dozens of hidden posts I was interested in.
I still do it all the time. It's a problem on both desktop and mobile. I've sent mail about it to dang before and did get a response, so it's definitely a known issue.
This may be an unpopular opinion but I'm just happy zoom works as expected on this site and don't mind if once in a while I have to correct a misvote.
I appreciate the information density, and that HN hasn't adopted some "contemporary" UI that's all whitespace and animations.
(And yes I agree the buttons are awkward)
I wonder what new voices will be added to VoiceOver? We blind people never, ever thought Eloquence, an old TTS engine from 20 years ago now, would ever come to iOS. And yet, here it is in iOS 17. I wouldn't be surprised to see DecTalk, or more Siri voices. More Braille features is amazing, and they even mentioned the Mac! VoiceOver for Mac is notoriously never given as much love as VoiceOver for iOS is, so most blind people still use Windows, even though they have iPhones.
I was expecting to see much better image descriptions, but they've already announced a ton of new stuff for plenty of other disabilities. Having haptic music will be awesome even for me, adding another sense to the music. There is just so much new accessibility stuff, and I can't wait to see what all is really new in VoiceOver, since there are always new things not talked about at WWDC or in the release notes. I'm hoping that, one day, we get a tutorial for VoiceOver, like TalkBack on Android has, since there are so many commands, gestures, and settings that a new user never learns unless they know to go looking for them.
The image description stuff is already surprisingly good - I noticed when I got a photo text while driving and it described it well enough for me to know what it was.
Same, a family member sent a photo while I was driving and over CarPlay it was described fairly accurately.
It’s sometimes awesome, and often extremely basic. “Contact sent you a picture of a group of people at an airport”. Amazing. “Contact sent you a screenshot of a social media post”. Useless. We know iOS can select text in pictures, so Siri can clearly read it. It knows it’s SoMe, so why not give me the headline?
It knows it’s SoMe, so why not give me the headline?
There's certainly a non-zero number of times this could go horribly wrong (I'm guessing CarPlay doesn't have any sense of who is in the car with you) and defaulting to not reading them out is the safest option (but they could definitely add a toggle for this, yeah.)
The same is true with reading out text messages. I’ve disabled it for CarPlay now after receiving a mildly raunchy text with a car full of colleagues. It’s still useful on the headphones though.
Only if you have Apple headphones though. If you've got some other thing, for some reason, it doesn't know how to tell you anything.
"Wife sent you a photo of a nude woman in the shower"
I’m hoping this shows up in iOS 18.
My friends that use synthetic voices prefer cleanliness of the older and familiar voices. One friend listens at about 900 WPM in skim mode and none of the more realistic voices work well at those rates.
Every once in a while I'll hear a blind person's phone audio while I'm out and about and it sounds like an unintelligible stream of noises, but they're interacting with it and using it and getting information from it. It's amazing, a whole other world of interaction from what I experience with my phone. I kind of want to learn how they're interacting with it.
Here's a great demo by a blind software engineer that explains some of it: https://youtu.be/wKISPePFrIs?si=YOLcW9b2uyLXLn59
Back in the 90s we reverse engineered / patched the classic MacInTalk to speed up playback. Our testers had it cranked so fast as they navigated the UI that to me it sounded more like musical notes than text.
I'm hoping that, one day, we get a tutorial for VoiceOver
Maybe it's not feasible for you but if you're ever near an Apple Store, you could definitely call them and ask whether they have an accessibility expert you could talk to. In the Barcelona Apple Store for example, there is a blind employee who is an expert at using Apple's accessibility features on all their devices. He loves explaining his tips and tricks to anyone who needs to.
One of the most important accessibility features they could bring back is a physical home button on at least one iPad.
I am completely serious. I work with a lot of older adults, many of whom have cognitive challenges, and the lack of a physical button to press as an escape hatch when they get confused is the #1 stumbling block by a mile.
I agree. I have a last gen base iPad with Touch ID and a home button. I am pretty tech savvy but actually prefer this form factor.
Touch ID is so drastically superior to Face ID in so many common “iPad scenarios”, eg. laying in bed with face partially obscured.
I don’t understand Apple’s intense internal focus on FaceID only. FaceID, with TouchID as an option for when a device is flat on the table or when your face is obscured, would be so much nicer.
FaceID builds profitable user habit loops. More real-estate on the screen to show things, easier to thoughtlessly purchase things if your password is a glance at the camera, etc.
I don't think this is a user-focused decision, I believe it's a profit-focused one.
You can't buy things just by looking at the camera, there are forced button presses and the App Store tends to make you enter your password.
I actually can’t use Touch ID, at least year-round.
I have naturally very dry skin, and no matter how much moisturizer I used in colder months, I had to constantly have it relearn my fingerprint, to the point where I would just give up and use a passcode instead.
Face ID, on the other hand, has been flawless since my first phone with it in 2018 (minus some mask-related challenges during the pandemic). It and ProMotion are basically the two reasons I bought an iPad Pro over an iPad Air.
My mother had the same issue. TouchID would stop recognizing her finger less than a day after setting it up.
I don’t understand Apple’s intense internal focus on FaceID only.
It isn't. The new iPad Air announced just a few days ago has Touch ID.
Only iPad Pro uses Face ID. For iPad Pro users who use it for work and unlock it hundreds of times a day when in an office or during a commute, Face ID is vastly superior.
When there were physical buttons, it was very popular in Asia to enable the accessibility option to put a virtual one on screen instead, because they were afraid the physical one would break. So it was kind of useless in the end.
because they were afraid the physical one would break
On early models it was actually quite common for the button to stop working after a year or two. The durability has since improved, but habits die hard.
It actually improved by no longer being a real button, instead it was a fake indent with a pressure sensor that did a haptic "click". But it still took up a lot of space on the front.
I have a pet theory that certain older folks are cautious with technology because they grew up with stuff that would break expensively if you pressed the buttons in the wrong order.
Then when they see younger people just randomly explore-clicking to get things working--because their experience is that it's a safe tactic--that can get misinterpreted as expertise, leading to: "Wow, kids these days just know technology."
It’s not physical, but if the problem is that they need something visible rather than a gesture, then you can put an always-on menu button on the screen that has a home button in it.
This is one major advantage to Apple sharing a foundation across all their devices. Vision Pro introduced eye tracking to their systems as a new input modality, and now it trickles down to their other platforms.
I am surprised CarPlay didn’t have voice control before this though.
CarPlay devices aren't really powerful.
CarPlay is rendered by the phone itself, so it's not strictly a function of how powerful the car infotainment is. You've been able to talk to Siri since the beginning of CarPlay so additional voice control is really just an accessibility thing
Some cars already have a voice control button on the wheel for their existing system which, if done correctly, is overridden by Siri+CarPlay. Which is really nice when it works.
I’m able to control carplay by just saying “Hey Siri.” Siri’s abilities tend to fluctuate based on what Apple is doing on the server, the phase of the moon and whether I remembered to sacrifice a chicken that morning, but otherwise, it seems to work fine.
CarPlay devices (car components) are essentially playing a streaming video of a hidden display generated by the phone. CarPlay also lets those devices send back touch events to trigger buttons and other interactions. Very little processing is done on the vehicle.
BTW if you are plugged in to CarPlay and take a screen shot, it will include the hidden CarPlay screen.
Is this really likely to be downstream of the Vision Pro implementation? I would think that eye-tracking with specialized hardware at a fixed position very close to the eye is very different to doing it at a distance with a general purpose front facing camera.
Typically eye-trackers work by illuminating the eyes with near infrared light and using infrared cameras. This creates a higher contrast image of the pupils, etc. I assume Apple is doing this in the Vision Pro. Eye-tracking can also be done with just visible light, though. Apple has the benefit of knowing where all the user interface elements are on screen, so eye-tracking in this case, on the iPhone or iPad, doesn't need to be high precision. Knowledge of the position of the items can help to reduce the uncertainty of what is being fixated on.
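A rough sketch of that snapping idea, assuming a gaze estimate already converted to screen coordinates (the function and names are hypothetical):

```swift
import Foundation

// Hypothetical sketch: snap a noisy gaze estimate to the nearest on-screen element.
// `elements` stands in for the frames of interactive controls the OS already knows about.
func snapGaze(_ gaze: CGPoint, to elements: [CGRect], maxDistance: CGFloat = 60) -> CGRect? {
    // An element that directly contains the gaze point wins outright.
    if let hit = elements.first(where: { $0.contains(gaze) }) {
        return hit
    }
    // Otherwise pick the closest element center, but only within the uncertainty radius.
    let nearest = elements.min { distance(from: gaze, to: $0) < distance(from: gaze, to: $1) }
    guard let candidate = nearest, distance(from: gaze, to: candidate) <= maxDistance else {
        return nil
    }
    return candidate
}

private func distance(from point: CGPoint, to rect: CGRect) -> CGFloat {
    let center = CGPoint(x: rect.midX, y: rect.midY)
    let dx = point.x - center.x
    let dy = point.y - center.y
    return (dx * dx + dy * dy).squareRoot()
}
```

The point being that a few degrees of angular error matters much less when the answer only has to be "which of these buttons", not "which pixel".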
So there isn’t much more to it than getting a good resolution image of the eye from any distance, every millisecond.
Precision is the issue: we mostly move our eyes in about 8 directions, and there's no real precision because we don't yet know how to measure the focusing of the eye's lens with a camera (unless that too is just a matter of getting a picture).
Squinting would be the closest thing to physically expressing focusing. So the camera needs to know I’m looking left with my eye, followed by a squint to achieve precision. Seems stressful though.
Gonna need AI just to do noise cancelling of involuntary things your eyes do like pupil dilation, blinking.
The 'tracking eyes' part is different, but once you have eye position data, the 'how the eyes interact with the interface' could be very similar.
Not the hardware side, but the software side. Implementing all the various behaviours and security models for eye tracking and baking it into SwiftUI means it translates over more easily once they figure out the hardware aspect. But the iPads and iPhones with FaceID have had eye tracking capabilities for a while, just not put to use in the UI.
It would be amazing if it gets carried over to the Mac.
Accessibility settings are really a gold mine on iOS for device customization (yes, I agree, they shouldn’t be limited to accessibility).
I’m particularly interested in the motion cues and the color filters for CarPlay - I have color filters set up to enable every night as kind of a Turbo-night shift mode (deep orange-red color shift), would love to do the same for CarPlay.
I also completely forgot iOS had a magnifier built in!
Accessibility features tend to be superpowers, though, and I'm glad Apple gates them behind permissions and opt-ins. We all know of applications that try to trick the user into granting them inappropriate access to the device through the Accessibility APIs. I think Dropbox still begs you to grant it Accessibility access so its tendrils can do who-knows-what to your system.
With great power comes great responsibility.
Guaranteed that marketers are salivating at the idea of eye tracking in apps and on websites. It's an amazing feature that absolutely needs to be gatekept.
I wonder if it'll use the same architecture as visionOS, where the vision tracking events and UI affordances are processed and composited out-of-process, with the app never seeing them.
That's probably how it'll go because it's the path of least resistance. A button will already have a listener for tap, so the OS translates the vision tracking into a "tap" and triggers the relevant code. There's no point telling the app about vision tracking because apps wouldn't already have a handler for that event. And for privacy reasons, there's no need to start now.
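A minimal sketch of what that could look like from the app's side, in plain SwiftUI with nothing eye-tracking-specific:

```swift
import SwiftUI

// The app's side of the story: just an ordinary tap handler.
// If the system translates a gaze "dwell" into a tap, this code never learns
// whether the trigger was a finger, a pointer, or the user's eyes.
struct PlayButton: View {
    @State private var isPlaying = false

    var body: some View {
        Button(isPlaying ? "Pause" : "Play") {
            isPlaying.toggle()   // fires the same way for touch, pointer, or gaze
        }
    }
}
```

The privacy win is that only the resulting interaction crosses into app code; the gaze data itself never does.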
iPadOS has hover states and pointer events; those could arguably be triggered by eye tracking.
It varies. Things like keyboard control or that kind of thing, absolutely, but mostly I've used it for stuff like "don't make an animated transition every time I change pages like an overcaffeinated George Lucas" or "actually make night shift shift enough to be useful at night". I also use the background sounds to augment noise cancellation while taking a nap. All of those are just useful things or personal settings, not necessarily attack vectors.
My favorite is the "allow ANC with just one AirPod in". I have no idea why this would be an accessibility feature. If I turn on ANC, then I don't want it to be disabled just because I'm listening with one ear!
Well, they aren't really limited to accessibility, but they are hidden there. It's sort of like a convenient excuse to get UI designers off your back if you want to ship customization.
FYI you can also make turbo night shift by scheduling toggling of white point balance, yep, in accessibility settings
Eye tracking is not an accessibility feature, it is an advertising optimization feature disguised as an accessibility feature.
Eye tracking is absolutely an accessibility feature. Just because you don't need it, and it can be abused, does not mean it isn't an absolutely game changing feature for some people.
He didn't say it wasn't an accessibility feature, just that it was disguised as one.
Just because it's not a game changing feature for some people, doesn't mean its primary function isn't advertising.
The user did say verbatim “Eye tracking is not an accessibility feature”
Something being disguised as something explicitly implies it is not that thing.
My presumption is that apps will not be able to access this data, at least without some sort of permission gate.
Indeed they haven’t for all the years it was limited to FaceId’s attention option.
Apple are very good about not making this kind of thing available to apps that don't have an explicit reason to need it.
You can say a lot of negative true things about Apple but this is just silly. There is no way Apple is going to expose that data to underlying apps in the same way they refused to do it in Vision Pro. I'd bet a good bit of money it works the same way where it's a layer on top of the app that the app can't access and from the video it looks like that's exactly how it works.
It allows people with ALS to navigate their device with their eyes.
Microsoft added a similar feature to Windows about seven years ago.
This is why Apple is the best. They’ll make features like this because it’s the right thing to do, even if it won’t really make any difference to 99.99% of their customers.
They really earn consumer loyalty.
The accessibility features are also useful for automated UI testing.
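For example, a sketch of an XCUITest that leans on accessibility metadata rather than pixel positions; the screen and identifier names here are hypothetical:

```swift
import XCTest

// UI test that locates elements by their accessibility identifiers/labels.
// The better the app's accessibility annotations, the less brittle this is.
final class LoginUITests: XCTestCase {
    func testLoginButtonIsReachable() {
        let app = XCUIApplication()
        app.launch()

        let username = app.textFields["usernameField"]   // accessibilityIdentifier
        let loginButton = app.buttons["Log In"]          // accessibility label

        XCTAssertTrue(loginButton.waitForExistence(timeout: 5))
        username.tap()
        username.typeText("demo")
        loginButton.tap()
    }
}
```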
They’ll make features like this because it’s the right thing to do
While the positivity is great to see -- I'd temper the expectation that they're simply doing the right thing; they're definitely acting in their perceived interest. People can and often do the right thing -- large companies rarely do.
People can and often do the right thing -- large companies rarely do.
There are companies which try (and succeed) to be honest "citizens". Berkshire and Costco have quite good reputations.
This was the right thing to do, but I doubt “the right thing to do” was the primary motivator for it. This is smart marketing that helps position them as the more caring, inclusive, and capable tech brand. It also expands their reach more than a fraction of a percent. 13% of Americans have some form of disability.
I feel like it also helps them get an edge in the healthcare sector, which has buckets of money in the mix.
Do you realize how bad Apple's accessibility issues were before? This is just marketing and I doubt many people are going to drop thousands of bucks to ditch years of tooling they're already intimately familiar with (aka, Windows), but this is an effort to try and entice some people. That's all it is. Marketing.
I read this very skeptically.
When I hear eye tracking I immediately think of advertisers targeting me based on what I look at, NOT quadriplegics using Apple devices.
Maybe I'm a cynic
I'm hoping they'll add this to macOS too. Eye tracking would be great for end-user UX research.
I'd also like to see what sort of weird stuff people come up with to use eye tracking. Games could use that in interesting ways.
Quibble but this isn't "eye tracking" it's "gaze tracking". Eye tracking is detecting where your eyes are. Gaze tracking is what you're looking at.
Well then it's both?
Could be, but not necessarily. Eye tracking usually means it's tracking the eyes of one or more people in a video. Gaze tracking usually requires that your eyes stay pretty steady and close to the tracker.
Where are you getting this language point from? If I look up any company selling "eye trackers", their products are all meant to track where you're looking, e.g., https://www.tobii.com/
Interesting. I picked it up working with Tobii devices a few years ago actually. I guess they updated their vocabulary.
This is how I feel about "face recognition" (it should mean recognizing whether something is a face or not), but it is common to use eye tracking this way.
I wonder if this announcement had anything to do with the bombshells OpenAI and Google dropped this week. Couldn’t this have been part of WWDC next month?
I think it's more of "clearing the decks" for stuff that didn't make the cut for WWDC. I assume WWDC is going to be all about AI and they couldn't find a good spot to put this announcement. "Clearing the decks" isn't a very kind way to refer to this accessibility tech since Apple has always been better than almost everyone else when it comes to accessibility. I don't see this as "we don't care, just announce it early" as much as "we can't fit this in so let's announce it early".
As noted elsewhere, Apple always does their accessibility announcements in advance of WWDC.
Tomorrow (today in some timezones) is Global Accessibility Awareness Day: https://en.m.wikipedia.org/wiki/Global_Accessibility_Awarene...
They do it every year at the same time. Also, it’s a small announcement, not a keynote or the kind of fanfare we have at WWDC or the September events. This does not seem calibrated to be effective in an advertising war with another company. All this to say, probably not.
As Terramex pointed out this is tied to a particular, relevant event.
It’s also pretty common for Apple to preannounce some smaller features that are too specialized to be featured in the WWDC announcements. This gives them some attention when they would be lost and buried in WWDC footnotes.
It is also probably an indication that WWDC will be full of new features and only the most impactful will be part of the keynote.
I have better than 20/20 vision (yes, really) and no mobility problems, but there are some macOS accessibility features that I love.
One is Zoom: I hold down two modifier keys and scroll, and can instantly zoom in to any part of the screen. It is extremely smooth, the more you scroll the higher you zoom, instantly and with high frame rate. Great when you want to show someone something, or just zoom into a detail of something if the app doesn’t support it or is too cumbersome. Or “pseudo-fullscreen” something.
The other one is three finger drag on the trackpad. Why that isn’t default is beyond me. It means that if you use three fingers, any dragging on the trackpad behaves as if you held the button down. It’s so convenient.
It's better to have an easy way to hold a mouse button via the keyboard with your left hand and keep using one finger on the touchpad, rather than doing the whole three-finger drag
Not for me, no.
But sounds like something accessibility options may also provide, and I can see how it may be better for a lot of people.
Which one have you tried?
The default way to drag on a trackpad is something I’ve never gotten good with so I always enable drag lock the second I get a new laptop. Ideally I would switch to the three finger gesture but after 15 years of drag lock I just can’t get my brain to switch over.
three finger dragger for life here. whenever I see colleagues struggling to drag items on a Mac, I show them three finger dragging and it blows them away. total game-changer!
Music haptics can be a cool way to teach someone how to dance and “feel the beat”
I'm severely hearing impaired and enjoy going to dance classes - swing, salsa, etc. If I'm standing still, I can easily tune into the beat. But once I start moving, I quickly lose it on many songs; dance studios aren't known for having large sound systems with substantial bass. I don't know that this specific setup would fix anything -- it would need some way of syncing to the instructor's iPhone that is connected via bluetooth to the studio's little portable speaker. But it's a step in the right direction.
While on the topic, I can hear music but almost never understand lyrics; at best I might catch the key chorus phrase (example: the words "born in the USA" are literally the only words I understand in that song).
A few months ago I discovered the "karaoke" feature on Apple Music, in which it displays the lyrics in time with the music. This has been game changing for me. I'm catching up on decades worth of music where I never had any idea what the lyrics are (filthy, that's what they are. So many songs about sex!). It has made exercising on the treadmill, elliptical, etc actually enjoyable.
A few months ago I discovered the "karaoke" feature on Apple Music, in which it displays the lyrics in time with the music.
JSYK Spotify has had this for years too; it's just under the "View Lyrics" button, but it does highlight the sentence/word karaoke style. It used to be a "Spotify app" back in that era of the service.
There's a lot of interesting things we can do with haptics since they're relatively cheap to put in stuff. Hopefully accessibility gets the software and applications further along soon
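One toy example of the kind of thing that's cheap to build today (this is not Apple's Music Haptics, just a sketch using Core Haptics, with a made-up class name):

```swift
import CoreHaptics

// Sketch: play a transient "thump" on every beat of a 120 BPM track,
// as a crude way to let someone feel the rhythm through the device.
final class BeatHaptics {
    private let engine = try? CHHapticEngine()

    func playEightBeats() throws {
        guard let engine else { return }   // no haptics hardware available
        try engine.start()

        let beatInterval = 60.0 / 120.0    // seconds per beat at 120 BPM
        let events = (0..<8).map { beat in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.4),
                ],
                relativeTime: Double(beat) * beatInterval
            )
        }
        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    }
}
```

The hard part, presumably, is deriving a good beat/texture track from arbitrary music, not the playback itself.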
Using haptics in music to enhance rhythm perception and dance skills. Sounds really cool!
macOS has had a version of eye tracking for a while, it's really fun to try out.
System preferences -> Accessibility -> Pointer Control
Then turn on the "Head pointer" option.
VisionOS has this too. I mapped it to a triple click of the button for when eye tracking becomes inaccurate.
Cool. I'm a bit unsettled that my camera's green dot didn't turn on for it though.
Cool! That works surprisingly well. But how do you click while using this? Clicking on the trackpad doesn't work, when it's tracking my head.
“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO.
Does this line actually mean anything? Press releases are so weird.
Sure it means something, it’s pretty low on the PR talk scale even. Let’s see:
We believe deeply in
"We", in this case, refers to Apple and as a company can’t believe in anything, it specifically refers to their employees.
the transformative power of innovation
A new feature will bring forth change
to enrich lives
In order to make someone’s life better.
So far so good, at least no synergies being leveraged for 10x buzzword transformation. Let’s see if it passes the test: There’s a bunch of new accessibility features, that certainly fits the bill for innovation. As anecdata, at least one of these will drastically change how I interact with my devices, so there’s the transformative power. I will be able to use these new accessibility features to improve both the way I work and how I privately use technology, so one could argue my life is being positively enriched by this transformative power brought about by innovation. Do they "believe deeply" in this? That is only for Tim Cook and his employees to know, but they’re one of the few companies pushing advances in accessibility past the finish line, so they might!
I always wonder why press releases include claims that a person said a paragraph or more of text that no one would ever say out loud, but this one actually does sound like something he'd say. In a keynote video, anyway.
Eye Gaze devices (tablet with camera + software) cost around $20K, so even if this offers 1/4 of the features, it's good news for those who can't afford one.
Don't be ridiculous. Solid hardware and software combos for Windows cost a small fraction of that. The convenient and decent Tobii PCEye costs like $1,250 and a very nice TMS5 mini is under $2,000. Your bullshit was off by at least an order of magnitude.
Let's be fair and compare similar products. Do you have any examples of $2000 mobile devices that support eye tracking on the OS level? The products you mention look like they're extra hardware you strap to a Windows PC. Certainly useful, but not quite as potentially life-changing as having that built into your phone.
I wonder if Vision Pro has enough microphones to do the acoustic camera thing? If so you could plausibly get "speech bubbles over peoples heads", accurately identifying who said what.
I imagine that could be pretty awesome for deaf people.
Deaf people I know certainly wouldn't want to be wearing a headset, say, in a restaurant. But a phone app or tablet that can do live speech-to-text would be good enough in many cases if it can separate voices into streams. Anything like this available?
I don't want to say it's certainly not available, but I doubt it. Maybe someone has done something trying to recognize different voices by differences in pitch/volume/...
The vision pro has six mics, which likely enables it to locate the sources of sound in space by the amount of time it takes a sound to reach each mic (i.e. acting as an "acoustic camera"). Tablets and phones aren't going to have that hardware unfortunately.
And yeah, obviously wearing a headset is a pretty big compromise that's not always (or maybe even often) going to be worth it. This was more of a thought of "the hardware can probably do this cool thing for people already using it" than "people should use the hardware so it can do the cool thing".
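For the curious, a back-of-the-envelope sketch of the time-difference-of-arrival math behind the "acoustic camera" idea, for just a single pair of mics (constants are illustrative):

```swift
import Foundation

// With two mics a known distance apart, the delay between them constrains the
// direction of the sound source: sin(angle) = c * Δt / d, with the angle
// measured off broadside (perpendicular to the line between the mics).
func bearingFromDelay(delaySeconds: Double,
                      micSpacingMeters: Double,
                      speedOfSound: Double = 343.0) -> Double? {
    let sine = speedOfSound * delaySeconds / micSpacingMeters
    guard abs(sine) <= 1 else { return nil }   // physically impossible delay
    return asin(sine) * 180 / .pi              // degrees
}

// e.g. a 0.2 ms lead on a pair of mics 15 cm apart puts the source roughly 27° off-center.
let angle = bearingFromDelay(delaySeconds: 0.0002, micSpacingMeters: 0.15)
```

With several such pairs (six mics give you plenty), the intersecting direction estimates can localize a source in 3D, which is what makes the "speech bubble over the speaker's head" idea plausible.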
my first thought regarding eye-tracking: "whose accessibility to what|whom?"
Do you have another idea beyond the accessibility of Users without fine motor control to the phone's screen?
which ads you are looking at
It's funny how the post is about enhanced surveillance technology and the text sentiment of the comment thread is overwhelmingly positive.
Surveillance technology: >:(
Surveillance technology with the word "accessibility" in the title: :)
of course it’s surveillance technology.
an iphone constantly broadcasts your location to third parties, can deduce where you work and live, understands the unique shape of your face, has a built-in microphone and multiple kinds of cameras, stores all of your conversations, stores all of your photos, stores your health information, can figure out when you go to bed.. all on a completely closed and proprietary operating system.
it’s like asking “why hasn’t anyone mentioned that we’re all using a website right now”
They keep announcing these bombastic accessibility features while simple things like tabbing remain frustratingly broken. The macOS “allow this app to access that” dialog supports shift+tab, but not tab.
https://support.apple.com/guide/mac-help/use-your-keyboard-l... - Keyboard navigation
https://support.apple.com/guide/mac-help/navigate-your-mac-u... - Full Keyboard Access (accessibility feature, goes beyond just tabbing between elements)
It's annoying that tabbing between UI elements is off by default on macOS. It's one of the first things I turn on with a new mac.
Bummer the eye tracking is iOS only. I’ve been wanting focus-follows-eyes for decades.
I share the sentiment. I've long noticed, in situations with grids of terminal windows, that focus-follows-eyes would be so much faster and present less friction than using hotkeys or focus-follows-mouse.
My first thought upon seeing the Haptic Music feature is to wonder how long until they make compatible headphones and I can relive my high school years, walking around listening to nu-metal and hip-hop with a Panasonic Shockwave walkman.
Surely, this would be a headphone feature possible without having to be supported by the player.
Imagine an eye-tracking loupe function on mobile: fit the same amount of text but bubble up the part under foveolar gaze. Save on readers* everywhere.
* readers are those glasses you can pick up at the drug store for $17.99.
It took me a couple read-throughs to understand what you meant, but yes, on-screen gaze-sensitive magnification would be amazing, I agree.
Eye tracking coupled with the show grid feature would seem like using a computer the way that people do in movies https://www.youtube.com/watch?v=UxigSW9MbY8
It's basically how the Apple Vision Pro mainly works.
If accurate enough it seems like eye tracking could be useful beyond accessibility to eg control iPads without dragging one’s greasy fingers all over the screen. iPad already supports pointers.
I was thinking the same, it would make having an iPad under my monitor much less cumbersome
All these features look amazing! That car motion sickness feature especially. Can’t wait to try it!
At least I used to put the phone down while I was in the car. Not the case now. Thank you Apple but I'd rather be sick while looking at your phone in a car.
how can I make sure it's off? Is it off by default?
Yes
remember how google added js apis to detect active tabs. and rendering intersection. complex webapps could use less battery and have richer interaction.
everyone rejoiced. nobody implemented anything useful.
soon, advertisers, google included, were selling ad packages by "viewability". lol.
can't wait for safari to lead with "eyed" ads.
I know this is a joke, but for anyone who was wondering,
all data used to set up and control this feature is kept securely on device, and isn’t shared with Apple.
It's not an API that apps can adopt and use; it's a system pointer, like a mouse or trackpad
eye tracking feature is interesting … for ad-tech companies
Haptics in the Music app will be great but it’s not exactly “new”, considering my haptic music app has been out for several years now: https://apps.apple.com/us/app/phazr/id1472113449
Gotta make sure they get all the eyeballs, then make sure they don't get to date anybody and become homeless.
This will have just enough capability limits and annoyances for you to be convinced to purchase a Vision Pro. It also makes eye tracking more widely accepted by making it available to everyone by using their existing devices.
while we're on the topic of accessibility I'd like to point out the following lovely facts:
* on Android, in the builtin popup menu for copypasta, the COPY button is wedged in tightly (by mere millimeters) between the CUT and PASTE buttons on either side of it. A harmless one packed in between two destructive, lossy ones, and usually irreversible.
* the size of an adult human fingertip is no secret (and in fact 99.99% of humans have at least one!)
* Google's staff consists of humans
* Google supposedly hires ONLY "smart" people and their interview process involves some IQ-like tests
* Android has had this obvious anti-pattern for many years now. perhaps 10+?
iOS 17 Image Descriptions are quite good, but audio descriptions don't seem to work on non-Pro devices, even though text descriptions are being shown on the screen and the audio menu is present and activated. Is that a bug?
Even on Pro devices, audio image descriptions stop working after a few cycles of switching between Image Magnifier and apps/home. This can be fixed by restarting the app and disabling/enabling audio image descriptions, but that breaks the use case when an iPhone is dedicated to running only Image Magnifier + audio descriptions, via remote MDM with no way for the local blind user to restart the app.
On-device iOS image descriptions could be improved if the user could help train the local image recognition by annotating photos or videos with text descriptions. For a blind person, this would enable locally-specific audio descriptions like "bedroom door", "kitchen fridge" or specific food dishes.
Are there other iOS or Android AR apps which offer audio descriptions of live video from the camera?
Can't wait for my son to try this. He has general coordination issues due to a brainstem injury but the eyes are probably the part of the body he can better control. I'm not a fan of Apple's software and didn't have a great experience with the Vision Pro but I am excited to try it out.
While it's a significant step forward for accessibility, it also invites us to consider how such technologies could integrate into everyday use for all users. This could enhance ease of use and efficiency, but it also requires careful consideration of privacy safeguards.
I hope this works for people with one eye. There are dozens of us—dozens!
Reminder: eye tracking means an always-on camera.
I'm not fully against it, but it's good to keep in mind the implications.
Very curious to see how well eye tracking works behind a motorcycle visor. Thick leather gloves, bike noise, and a touch screen and audio interface are not much fun.
Didn’t Mark Rober say he worked on some motion sickness stuff at Apple?
I am excited for Vocal Cues - my main frustration with Siri is how poorly it comprehends even explicit instructions.
One thing I wish Apple would implement is some kind of gesture control. The camera can detect fine details of face movement, it would be nice if that were leveraged to track hands that aren't touching the screen.
For example, if I have my iPhone or iPad on the desk in front of me, and a push notification that I don't need obstructs the content on the screen, I would love to be able to swipe my hand up towards the phone to dismiss it.
Nice to see features from Vision Pro make it onto other Apple products
This is a good time to remind everyone that tomorrow, May 16th, is Global Accessibility Awareness Day (GAAD) (https://accessibility.day), and that there are over 176 events worldwide going on to celebrate the progress we are all making at improving accessibility in our products -- and plenty of learning opportunities for beginners and experts.
I definitely get a good amount of motion sickness when using my phone while in a car so I'm super interested about the motion sickness cues and if they'll work. The dots look like they may get in the way a bit but I'm willing to take that tradeoff. My current car motion sickness mitigation system is these glasses that have liquid in them that supposedly help your ears feel the motion of the car better (and make you look like Harry Potter)
Google yesterday also open sourced their accessibility feature for Android and Windows that controls the cursor using head movements and facial gestures
My wife is a hospice nurse and from time to time she'll have a patient without any ability to communicate except their eyes (think ALS) - for these folks in their final days/weeks of life this will be a godsend. There are specialized eye-tracking devices, but they're expensive and good luck getting them approved by insurance in time for the folks in need near the end of their lives.
I'm a believer in accessibility features. The difficulty is often in testing.
I use SimDaltonism to test for color-blindness accessibility, and in the last app I wrote I added a "long-press help" feature that responds to long-presses on items by opening a popover containing the label and hint. Makes testing much easier, and doubles as user help.
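A rough sketch of that kind of long-press help in SwiftUI (hypothetical names, not necessarily how any particular app does it):

```swift
import SwiftUI

// Long-pressing a control pops over its accessibility label and hint, which
// doubles as a quick way to audit what VoiceOver would announce for it.
struct HelpOnLongPress: ViewModifier {
    let label: String
    let hint: String
    @State private var showingHelp = false

    func body(content: Content) -> some View {
        content
            .accessibilityLabel(label)
            .accessibilityHint(hint)
            .onLongPressGesture { showingHelp = true }
            .popover(isPresented: $showingHelp) {
                VStack(alignment: .leading, spacing: 8) {
                    Text(label).font(.headline)
                    Text(hint).font(.subheadline)
                }
                .padding()
            }
    }
}

extension View {
    func helpOnLongPress(label: String, hint: String) -> some View {
        modifier(HelpOnLongPress(label: label, hint: hint))
    }
}
```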
This is awesome and I love the UX, although I can't help but feel a bit sad that we need to always rely upon Apple and Microsoft for consumer accessibility.
It would be so great if more resources could be allocated for such things within the Linux ecosystem.
Every attention thief is absolutely thrilled at the idea of tracking your eyes. Let’s all imagine the day where the YouTube free tier pauses ads when you’re not actively looking at them.
Shit. I’m turning into one of those negative downers. I’m sorry. I’ve had too much internet today.
If this is at all like the eye tracking in Vision Pro, it is only available to the OS and apps are not given access to the data.
The system one is, but advertisers could always roll their own and see if they can get away with "you can only view this content if you give us permission to use your camera".
At least on iOS, I can't imagine that happening - apps are not allowed to demand you grant permissions unrelated to the actual functionality. From App Review Guidelines (5.1.1) Data Collection and Storage, (ii) Access:
Lots of iOS apps today really want to mine your address book, and constantly spam you with dialogs to enable it, but they don't go as far as disabling other features until you grant them access, because they'd get rejected once someone noticed.
Fortunately apps in the EU don’t have to pass though Apple’s anticompetitive review process, so developers are free to ignore that rule if they simply distribute the app via an alternative store.
Unfortunately, poor Americans cannot taste the freedom that Europeans have to be abused by developers.
Fortunately, the Americans who create these products extract more value from the EU than they invest, or they wouldn’t bother trading with you <3
I was thinking similar (only without the snark); but then I realised this is almost certainly not compatible with GDPR.
It's not that hard to come up with ways to circumvent system restrictions; after all, advertisers are a fierce adversary and have shown many clever ways of invading users' privacy in web browsers and mobile apps. In the case of eye tracking I could see a situation where the system feeds the "malicious" app in question with a hint of which widget is currently being gazed at by the user. You could then just build a giant grid of invisible widgets covering the whole app window and use that to reconstruct all the eye tracking happening inside your app.
This is not about technical restrictions and finding weird legal loopholes, Apple's guidelines don't work that way. It's the spirit that matters, not the letter.
The system doesn't supply data like widget highlight states to apps for exactly that reason.
This would not pass the App Review process
for now - it is naive to think this is safe
Eye tracking has been on iOS for many years (as an option for faceid called attention).
Avoidable via iPhone SE3 and iPad Air, which both use TouchID.
Until the EU comes in and forces them to ‘open up’ the functionality in the name of ‘fairness’…
If you look at the current things the EU has forced them to open up (app distribution and NFC payments), both of them are things that Apple was already actively monetizing.
To compare this to a non-monetized accessibility feature is a bit disingenuous.
The whole notion of ‘fairness’ is that gatekeepers should allow others to compete on even footing. That can be fulfilled by granting EITHER everybody OR nobody access to the market (but not: only yourself).
Until people complain that Apple is being anti-competitive by not making vision tracking open, or allowing third-party eye-tracking controls, etc. etc.
Watch the Black Mirror episode “Fifteen Million Merits”, to see how this might end up.
Fifteen Million Merits is maybe the Black Mirror episode that's least about the future. It reads best as entirely a comment on the society we already have.
A big clue is that the world in it doesn’t make a ton of internal sense and the episode makes absolutely no effort to smooth that over. The questions it raises and leaves open without even attempting an answer are on purpose. You’re supposed to go “god, bicycling to make little pellets? Just to buy useless trash? God, why? It’s so pointless,” or, “they truly look down on and are mean to the people who get demoted, even though they’re basically guaranteed to end up that way someday, when their health or good fortune run out? Just cruelty for no reason that’s highly likely to hit them some day, too? That’s insane!”
… because actually it’s about now (or, the year it came out) and the point is to get you to connect the dots and realize we’re doing the same crap and just don’t notice. It’s only “sci fi” as some sleight of hand to give you an alien’s eye view of our own actual society as a way to prompt reflection. The core story of the corruption of a “revolutionary” to serve the purposes of the system is some bog-standard media studies stuff pertaining to today, not sci fi. The reveal that they could just fucking stop all this and go outside, it’s not some alien world or a post apocalypse after all is yet another thing that’s supposed to make you go “what’s wrong with them? Oh. Right. It’s us.”[1]
Long winded way to say: we can’t “end up” at Fifteen Million Merits future-world, because it’s just our own world we’re already in.
[1] Some read the final scene as depicting yet another viewscreen, but this is such a wildly weaker reading as far as how effective the scene is that I can't believe it's intended that way.
Wait for human attention detection to become mandatory to view DRMed content on the telescreen.
I don't love that solid UX gets pushed under the accessibility rug, as an option you might never find.
I don't care how cynical it sounds, user experience became user exploitation a long time ago. Big Tech have been running that gimmick at too-big-to-fail scale for the last decade or so.
Accessibility benefits everyone, but in the basics you’re right. Too many simple straightforward options are now strictly inside accessibility. At least on the Apple side.
And don’t get me started on hidden command line settings.
<cough> Reduce Motion. Is it an accessibility feature or does it just get rid of an annoyance and is good for everyone?
Yes--some people can become ill with certain types of motion on a screen [1].
[1]: https://www.a11yproject.com/posts/understanding-vestibular-d...
Yes they can, but I don't and I still hate needless animations and turn them off. The point is, why is it in "accessibility" when it should be more visible?
It's plenty visible right where it is to people who don't have odd hangups about "accessibility".
So would people with perfect eyesight and no motion sickness discover it?
Personally I forgot about it on my latest phone because I had a fresh prescription for my glasses and didn't need to enlarge the font at the moment :)
Saves battery too along with cross fade transitions :)
I'm here to intentionally get you started on hidden CLI settings. Learn me somethin'!
not OP, but macOS has a ton of options available with arcane commands. my favorites are the auto-hide dock speed setting, the third (hidden) window-minimise animation, and the hidden system accent colours usually locked to the colourful iMacs
From outside, it feels like these are the only people with the freedom to improve the user experience at all. So they have to hide their work in the Accessibility preferences.
You're getting a lot of agreement from other HN users, but I'm not sure it's fair to criticize Apple for putting these kinds of features under Accessibility.
There's nothing that inherently "locks out" people who don't have a recognized disability from exploring these features. Furthermore, most of Apple's "accessibility" features are related to Vision/Hearing/etc (and categorized as such), so I think it's reasonable to consider them accessibility features.
Clearly based on other comments here, plenty of people discover these features and find them useful.
The iPhone has a hidden accessibility setting where you can map a double and/or triple tap on the back of your phone to a handful of actions. I use this to trigger Reachability (the feature that brings the entire UI halfway down the screen so you can reach buttons at the top) because phone screens are so damn big that I can't reach the opposite top corner with my thumb even on my 13 mini without hand gymnastics. And the normal Reachability gesture is super unreliable to trigger ever since they got rid of the front Touch ID home button.
This isn’t “hidden”. It was even called out and demonstrated in the iOS 14 keynote.
Perhaps I could rephrase that it's hidden within Accessibility settings, not that it's an accessibility setting that is furthermore hidden.
Most people don't go into that menu to look around for things they might want to use cause features that almost everyone could benefit from are alongside settings for people with visual and hearing impairments.
Unironically calling this feature "hidden" is why things are the way they are now. It's not hidden! You can find it if you go through the settings app! But because it isn't in your face all day, every now and then people will talk about this "super secret feature", and then a PM somewhere has to make a nag feature to advertise it.
Double tap is reachability for me and triple tap is to make the display very dim so that at night at the lowest brightness setting, I can get it even lower. It resets after a while so even if I forget to switch it off my screen won’t stay dim for the next few days while I wonder why it’s so damn dark.
Let's say you're a developer at a big software company (not necessarily Apple, this happens everywhere) and you want to add a new optional setting.
The bar is pretty high. There are already hundreds of settings. Each one adds cost and complexity. So even if it's a good idea, the leadership might say "no".
Now let's say this same setting makes a big difference in usability for some people with disabilities. You just want to put it in accessibility settings. It won't clutter the rest of the UI.
You just turned your "no" into a "yes".
I often use them to get around bad UI/UX (like using Reduce Motion), or to make devices more useful (Color Filters (red) for using at night).
Even outside of this, even able-bodied folks can be disabled due to illness, surgery, injury, etc. So it's great to see Apple continuing to support accessibility.
The red color filter for outside when trying to preserve night vision is a great tip. Some apps have this built-in but much better to have the OS change it everywhere.
Recommend creating a Shortcut to toggle this setting.
Huh?
It's not built-in?
Android (or at least Moto) has had it for years, with auto-enable on a schedule or at sunrise/sunset.
Kinda feels like you could’ve done more of a cursory glance to see what functionality was actually being talked about before going for the “Android already does this!!” comment.
Kinda feels like you could've been more like [0] instead of being an ass, while totally missing that what I was surprised about was the lack of automagic, not the feature itself.
https://news.ycombinator.com/item?id=40371449
There's a built in "night shift", but that just changes the colour temperature, it doesn't make everything monochrome red.
No, that's (part of) what Color Filters is for.
iOS has also had “Night Shift” for several years. The parent is talking about a full on red colour filter, like astronomers might use.
Not just for outside, but also at public gatherings like concerts! I went to a concert last month and used a red color filter to record a couple short videos without being a big distraction to the audience behind me.
Dim backlight + Red color filter can make the screen almost invisible to those around you.
Thanks for the reminder. Wish I had remembered this feature during the aurora photography I was doing. I set the phone brightness to the lowest setting but the red filter would have helped even more.
You can also add it to the accessibility shortcut, available anywhere by triple-clicking the power button.
I think you can just add it to Control Centre, no need for shortcuts. I made an app for reading in the dark, minimising the amount of light hitting your eyeballs, but I'm still using the red color filter every night.
The app is overall darker and more "strict" with how it handles content (esp. images) though: https://untested.sonnet.io/Heart+of+Dorkness and midnight.sonnet.io
Overall, reducing the number of photons is a fun metric to play with when building something/messing with prototypes.
The issue I've seen when the app itself offers a red filter is that OS-native widgets the app calls, like the keyboard, do not get filtered. The system-level accessibility feature does filter the OS widget. I would almost rather the app's setting just enable the OS filter, but I can understand why that might not be possible.
> developed solely with the benefit of the user in mind
Hopefully accessibility features are never artificially segmented to higher priced devices.
At least in the US, they kind of can't be. The disability community is pretty up front about lawsuits.
That's because the ADA has no enforcement mechanism other than lawsuits, isn't it? Our whole legal disability rights infrastructure is designed to be driven by lawsuits, and sits inert if nobody sues.
That's actually a good thing, especially if there are people specializing in bringing these lawsuits en-masse.
Over here in Europe, where there's no lawsuit culture, some laws (not necessarily accessibility-related, our legislation in that area is far weaker) are violated almost without repercussions. When the government is the only party that can bring charges and the government is complacent / ineffective, nobody actually gets charged and nothing gets done.
There's also the problem of incentives, if you can get a lot of money from a lawsuit, you have a far better incentive to sue and find a competent lawyer. You may even deliberately look for and sue violators as a moneymaking scheme, some companies in the US do this. This puts pressure on businesses to comply with the law. Even if you're piss-poor and never sue anybody, you still benefit.
If this lawsuit culture doesn't exist, the best you can do is write a report to the government and hope they actually act on it. Many people don't know how to write these, and since there's no money in it, getting a lawyer to do it for you is an expense that nobody is going to help you recoup.
The people handling these reports don't help either, they're usually 9-to-5 salaried employees who aren't judged too hard on performance, so they have far less of an incentive to actually pursue cases.
iOS 17 audio image descriptions for blind people via Image Magnifier should work on all iPhones, but do not work on iPhone SE3 and iPhone 11 Pro. Audio image descriptions do work in iPhone 12 Pro. Lidar in 12 Pro increases accuracy, but should not be mandatory. Hopefully this is a bug that can be fixed, since text descriptions were still functional on the lower-end devices.
Source: purchased devices until finding one that worked, since Apple docs indicated the feature should work on all iPhones that can run iOS 17.
Edit: audio descriptions in Magnifier are non-functional on iPad Air, working on M2 iPad Pro.
I wonder if that would be legal, at least in the US. That feels like it'd be a violation of the ADA?
It would only be a violation if it's purely software locked.
If it requires a chip that supports specific operations, and entry tier devices have an older chip, that wouldn't be a violation.
Some of them are, at least on Apple's side, but it's always for a good technical reason. Screen recognition is only available on devices that have a neural chip, things that require lidar don't work on devices that don't have lidar and so on.
Google is worse at this, Talkback multi-finger gestures used to be Pixel and Samsung exclusive for a while, even though there was no technical reason for it.
Apple has a different problem: many accessibility features aren't internationalized properly. Screen recognition still has issues on non-English systems, and so do image descriptions. VoiceOver (especially on Mac) didn't include voices for some of the less-popular languages until very recently, even though Vocalizer, their underlying speech engine, has supported them for years. Siri has the same problem.
Funny, I was just thinking it was so that they can get more attention-economy eyeballs for ads.
This will happen. These features are always ushered in as ways to make someone's life easier, and often that is exactly what it does, for a time, before some product manager figures out how they can maximize profit with it.
Growth at all costs, I guess.
Don’t say “I guess” as if you aren’t the one making the rather baseless accusation. What other accessibility features have been abused?
Can you name a single accessibility feature where this has happened ever? Kinda seems like you just made up some fake reality.
Apple doesn't have product managers. (More importantly, the hardware has been technically capable of eye tracking since Face ID was added.)
That’s a profound and surprising insight. You’re absolutely correct.
Wouldn't a lot of the companies that build in accessibility do it from a viewpoint of gaining an even wider reach and/or a better public image?
I don't see optimizing for that as bad. If they think we'll love the product more by making it better for a given audience, especially if I'm in that audience, I'm happy. Does that mean this company now gets richer? Perhaps, and that's fine by me
Accessibility features can be used to steal attention too
Accessibility features stand out as user-centric developments, love that