
Recent 'MFA Bombing' Attacks Targeting Apple Users

lloeki
93 replies
1d10h

"recent"?

This happened to me and my wife (each starting a few days apart) in 2021, or maybe 2022 but no later. It started with a couple requests a day, then ramped up to every hour or something. IIRC we also both got a couple SMS claiming to be from Apple.

As soon as it ramped up I set up both accounts to use recovery keys, a move I had planned anyway on the grounds that it should not be in Apple's power (or that of someone coercing or subverting Apple, be it law enforcement or a hacker) to get access to our accounts. This obviously stopped the attackers dead in their tracks.

For similar reasons I set up advanced data protection as soon as it was available and disabled web access. Only trusted devices get to see our data, and only trusted devices get to enroll a new device.

JKCalhoun
78 replies
1d8h

I was unsure what this Recovery Key was: https://support.apple.com/en-us/109345

It is kind of scary too — lose the key and no one can get you back into your account.

pmontra
25 replies
1d8h

A recovery key is a randomly generated 28-character code

That's easy to back up. You can even print it and bury it in a sealed box in the garden, or put it in a book, or whatever. It depends on who you are protecting against.

Klathmon
21 replies
1d6h

But you shouldn't ONLY store it in a box or in your house.

That means you're one natural disaster away from losing everything.

As much as it can "weaken" security, an electronic backup is still recommended for most

alistairSH
18 replies
1d6h

As much as it can "weaken" security, an electronic backup is still recommended for most

Maybe I'm being dense (probably), but where would you save it?

iCloud? No, that doesn't work - you need the key to access iCloud.

Some other cloud storage service? No, that doesn't work - you need your phone to generate a token for access and your phone was destroyed in the same fire as the paper backup.

Seems like the safe choice is a lock box at a bank or similar. Or a fireproof safe at home.

s3krit
6 replies
1d2h

Engraving onto something like titanium would be better than a fireproof safe - safes are only fire-rated for a limited time (I want to take a stab in the dark and say about 90 minutes?). This is how I have backed up some (since retired) crypto seed phrases in the past.

tshaddox
5 replies
1d1h

Where do you keep the titanium plate? I'd be more worried about losing it due to a natural disaster than merely having it destroyed beyond readability in a natural disaster.

dylan604
4 replies
1d1h

What happens if there's a typo in the engraving? Who's doing the engraving? How much do you trust the people you are providing the key to do it? When does the paranoia kick in vs being diligent?

0cf8612b2e1e
3 replies
1d

This was at least an innovation in the bitcoin community. There are several assemble-at-home systems where you can build a physical manifestation of a secret: metal cards you punch with a hammer and nail, or a tube where you string along metal letters of the password.

dylan604
1 replies
23h57m

Sure, sounds perfect. Let me send some crypto person who has invested in a home stamping kit the secret to my crypto wallet. At least they won't know what it's for, so they can't hijack my wallet. Phew. Had me nervous that committing the cardinal sin of sharing my secret with someone I don't know was going to come back to haunt me.

0cf8612b2e1e
0 replies
23h41m

You assemble it at home? You do not send anyone your secrets.

Also, the idea is simple enough you could DIY your own version with stuff from any hardware store.

Klathmon
5 replies
1d5h

Personally, I encrypt my backup/recovery/setup keys in a CSV file using a password that I have memorized, and send them to family members to store in their accounts/cloud storage.

But safe deposit boxes are a good choice too; just be careful to balance your own convenience. If you can't easily update your backups, you're really unlikely to include new accounts in them.

thfuran
1 replies
1d4h

That also means you can't easily update passwords.

danieldk
0 replies
1d3h

You could put your passwords in 1Password or iCloud Keychain, so you only need to back up those credentials.

tacocataco
1 replies
1d2h

What happens if you suffer a TBI and can't remember the password?

I guess you'd have bigger problems at that point.

tshaddox
0 replies
1d1h

Perhaps an estate lawyer could be trusted with the information in case you become incapacitated or dead.

fennecbutt
0 replies
1d3h

Doesn't that just mean that Apple's X character key is protected only by a password presumably of lesser length?

I suppose a phrase works too, and easy to remember.

bombcar
1 replies
1d5h

Get it tattooed on a (normally not seen) part of your body. Like under your hair! ;)

Of course, a code like that can be in multiple places, possibly where it won’t be recognized as such.

alistairSH
0 replies
1d2h

And pray you never need to update the passcode!

I'm imagining this spiraled around somebody's upper thigh... "fakePassw0rdo̶n̶e̶t̶w̶o̶t̶h̶r̶e̶e̶four"

zrm
0 replies
22h44m

Keep one copy in your fire-resistant safe at home. Then encrypt a copy, give the encrypted copy to your best friend and the decryption key to a family member, or keep one of these things in your desk at work. Neither of them have access unless they both figure out what it is and collude with each other, but you have a recovery system in case you lose your own copy.
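One way to make that split concrete is a 2-of-2 XOR secret split, where each share alone is useless noise. A minimal sketch (the function names, and using a raw XOR split rather than a full encryption scheme, are my own illustration of the idea above):

```python
import secrets

def split(secret: bytes) -> tuple[bytes, bytes]:
    # One share is uniform random noise; the other is secret XOR noise.
    # Either share alone reveals nothing about the secret.
    share1 = secrets.token_bytes(len(secret))
    share2 = bytes(a ^ b for a, b in zip(secret, share1))
    return share1, share2

def combine(share1: bytes, share2: bytes) -> bytes:
    # XOR the shares back together to recover the secret.
    return bytes(a ^ b for a, b in zip(share1, share2))
```

Give one share to your best friend and the other to a family member: neither learns anything unless they collude, and you keep your own full copy at home as the primary.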

tzs
0 replies
1d1h

One possibility is to encrypt a copy with a key that you are pretty sure you can remember, and store that encrypted copy someplace public on the web. Periodically check that you do still remember the key.

The conventional way to do this would be encrypt it with a symmetric cipher keyed from a password or passphrase. I've been using an unconventional approach where the secret you have to memorize is an algorithm rather than a password/phrase. Programmers might find an algorithm easier to memorize than a passphrase.

Here's an example of this general idea. The algorithm is going to be a hash. This one will take a count and a string, and output a hex string. In English the algorithm is:

  hash the input string using sha512 giving a hex string
  while count > 0
    prepend the count and a "." to the current hash, apply sha512, and decrement count
The recovery code I want to back up is 3FAEAB4D-BA00-4735-8010-ADF45B33B736.

I'd pick a count (say 1969) and a string (say "one giant leap for mankind"), actually implement that algorithm, and run it on that count and string. That would give me a 512-bit number. I'd take "3FAEAB4D-BA00-4735-8010-ADF45B33B736" and turn it into a number too (by treating it as 36 base-256 digits). I'd xor those two numbers, print the result in hex, and split it into 2 smaller strings so it wouldn't be annoyingly wide.

Then I'd save the input count, input string, and the output:

  1969 one giant leap for mankind
  ed428dffa23f4f14ae2a7b7e842019fc11b5726d726b96c11ec266758be67cb0
  f2a78a320a85df809afe83c6c7840e2d175cceadb455260735405cd047459cc9
I'd then delete my code.

I could then do a variety of things with the "1969 one giant leap for mankind" line and the two hex strings. Put them in my HN description. Include them in a Reddit comment. Put them on Pastebin. Take a screenshot of them and put it on Imgur.

To recover the code from one of those backups, the procedure is to implement the algorithm from above, run it with the count and string from the backup to get the 512-bit hash, take the hex strings from the backup, xor them with the hash, and then treat the bytes of the result as ASCII.

Then delete the implementation of the algorithm. With this approach the algorithm is the secret, so should never exist outside your head except when you are actually making or restoring from backup.

When picking the algorithm, take into account the circumstances you might be in when you need to use it for recovery. Since you'd probably only be needing this if something so bad happened that you lost most of your devices and things like your fireproof safe, you might want to pick an algorithm that does not require a fancy computer setup or software that would not be in a basic operating system installation.

The algorithm from this example just needs a basic Unix-like system that you have shell access to:

  #!/bin/sh
  # Usage: hash.sh COUNT WORD...
  COUNT=$1
  shift
  # Initial hash of the input string
  KEY=$(printf '%s' "$*" | shasum -a 512 | cut -d ' ' -f 1)
  # Repeatedly prepend "COUNT." and re-hash, counting down to 1
  while [ "$COUNT" -ge 1 ]; do
    KEY=$(printf '%s' "$COUNT.$KEY" | shasum -a 512 | cut -d ' ' -f 1)
    COUNT=$((COUNT - 1))
  done
  echo "$KEY"
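For completeness, here is a sketch of the same scheme end to end in Python. The zero-padding of the secret to the hash width is my own choice (the original xors the numbers directly, which works out the same for the unused high bytes); function names are made up:

```python
import hashlib

def chain_hash(count: int, s: str) -> bytes:
    # The hash described above: sha512 the string, then repeatedly
    # prepend "count." and re-hash while counting down to 1.
    key = hashlib.sha512(s.encode()).hexdigest()
    while count >= 1:
        key = hashlib.sha512(f"{count}.{key}".encode()).hexdigest()
        count -= 1
    return bytes.fromhex(key)  # 64 bytes

def encode(secret: str, count: int, phrase: str) -> str:
    # Assumes the secret is at most 64 bytes.
    pad = chain_hash(count, phrase)
    raw = secret.encode().ljust(len(pad), b"\x00")  # zero-pad to 64 bytes
    return bytes(a ^ b for a, b in zip(raw, pad)).hex()

def decode(backup_hex: str, count: int, phrase: str) -> str:
    pad = chain_hash(count, phrase)
    raw = bytes(a ^ b for a, b in zip(bytes.fromhex(backup_hex), pad))
    return raw.rstrip(b"\x00").decode()
```

`encode("3FAEAB4D-BA00-4735-8010-ADF45B33B736", 1969, "one giant leap for mankind")` produces the hex you'd post publicly; `decode` with the same count and phrase recovers the code.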

CydeWeys
0 replies
1d2h

Some other cloud storage service? No, that doesn't work - you need your phone to generate a token for access

You definitely don't need your phone for access. I use Yubico security keys for everything like this. I have several of them that are on all my accounts and I don't keep them in the same place.

tacocataco
1 replies
1d2h

Why can't you bury a 2nd box in the yard of a friend who lives across the country?

CatWChainsaw
0 replies
20h6m

Okay, and what happens when your friend moves? You buried it years ago, so they forget to dig it up, what with everything else going on in their life at moving time.

forgotmyinfo
1 replies
1d5h

Never underestimate the security and safety of a hidden piece of paper! If it's good enough for wills for the last 500 years, it's good enough for a password.

lxgr
0 replies
21h29m

A better analogy would be a piece of paper with your username.

Finding somebody’s will doesn’t give you access to any of their data or funds.

fortran77
0 replies
1d4h

I keep one-time keys between pages of some books on my shelf, and a copy in a safe deposit box. I suppose if I were publicly known to have tons of money in "crypto" or were a target of a nation-state, this wouldn't be safe enough. But I think it's OK for my gmail and OneDrive, etc.

josephcsible
23 replies
1d4h

Such a high risk of being locked out permanently is more than most people can stomach. Why can't they offer a last-resort option like showing up in person at an Apple Store with government-issued photo ID?

toomuchtodo
10 replies
1d4h

Because they aren’t required to by law. I have filed comments with the FTC that this recovery path should be legally mandated for digital accounts, I encourage others to do the same. It doesn’t have to be an Apple Store (insider risk, see SIM swapping analogy); could be USPS or another government identity proofer they partner with. Login.gov uses USPS for in person identity proofing, for example.

Your data and account ownership interest doesn’t disappear because of failure to possess the right sequence of bytes or a string. Can you imagine if your real estate or securities ownership evaporated because you didn't have the right password? Silliness.

AnthonyMouse
5 replies
22h22m

This should not be required by law because many people specifically don't want it. I'm content to keep my own redundant copies of a recovery key and suffer the consequences of my own actions, rather than allowing someone to steal my account just because they made a convincing fake ID or hacked some government system. In general centralized identity systems are a single point of failure and hooking more things into them is a bad thing.

Your data and account ownership interest doesn’t disappear because of failure to possess the right sequence of bytes or a string.

Somehow you have to establish that you are the owner of the account, in a way that nobody else can do it. This is very much not a trivial problem, and government IDs don't provide any kind of solution to it.

If you need a driver's license, how do you get a driver's license? With a birth certificate? Okay, how do you get a copy of your birth certificate when you don't have a driver's license?

If there is a path to go from your house burning down and you having zero documents to you having a valid ID again without proving you've memorized or otherwise backed up any kind of secrets, an attacker can do the same thing and get an ID in your name. This is why identity theft is a thing in every system that relies on government ID. Requiring all systems to accept government ID is requiring all systems to be subject to identity theft.

toomuchtodo
4 replies
21h55m

I argue for and advocate that this capability should exist, but not be mandatory. If you do not want to tie your personal identity to your digital identity, certainly, you should be able to not do so and rely solely on a cryptographic primitive, recovery key, or other digital mechanism to govern access of last resort. If your account access is lost forever, it's on you and that was a choice that was made.

Somehow you have to establish that you are the owner of the account, in a way that nobody else can do it. This is very much not a trivial problem, and government IDs don't provide any kind of solution to it.

This is actually very easy. You can identity proof someone through Stripe Identity [1] for ~$2/transaction. There are of course other private companies who will do this. You bind this identity to the digital identity once, when you have a high identity assurance level (IAL). Account recovery is then trivial.

If you need a driver's license, how do you get a driver's license? With a birth certificate? Okay, how do you get a copy of your birth certificate when you don't have a driver's license?

This is government's problem luckily, not that of private companies who would need to offer account identity bootstrapping. Does the liquor store or bar care where you got your government ID? The notary? The bank? They do not, because they trust the government to issue these credentials. They simply require the state or federal government credential. Based on the amount of crypto fraud that has occurred (~$72B and counting [2]), the government identity web of trust is much more robust than "not your keys, not your crypto" and similar digital-only primitives.

NIST 800-63 should answer any questions you might have I have not already answered: https://pages.nist.gov/800-63-3/ (NIST Digital Identity Guidelines)

[1] https://stripe.com/identity

[2] https://www.web3isgoinggreat.com/charts/top

(customer identity is a component of my work in financial services)

ohmyiv
2 replies
16h26m

If you need a driver's license, how do you get a driver's license? With a birth certificate? Okay, how do you get a copy of your birth certificate when you don't have a driver's license?

Using vitalchek, you can order a BC with a notarized document, using two people who have valid IDs as people to vouch for your identity. I've done it for multiple clients.

AnthonyMouse
1 replies
10h39m

Interesting to see a modern variant of compurgation still in active use.

So if I'm understanding this correctly, if me and one of my friends both have a valid ID, we can get anybody's birth certificate?

ohmyiv
0 replies
52m

The person who needs the BC also has to see the notary. But, for the most part, yes, it's that easy to obtain a BC using vitalchek.

Note: The notary will record the ID #s and other info of the two ID holders. So if something goes wrong, the two ID holders will be on the hook as well.

Once the notarized document is submitted to vitalchek, they'll process the request.

Of course, one would still have to know a few details from the BC (parents, location, etc) to get vitalchek to submit the request to the county/city registrar.

AnthonyMouse
0 replies
20h37m

This is actually very easy. You can identity proof someone through Stripe Identity [1] for ~$2/transaction.

"Pay someone else to do it" is easy in the sense that doing the hard thing is now somebody else's problem, not in the sense that doing it is not hard. That also seems like a compliance service -- you are required to KYC, service provides box-checking for the regulatory requirement -- not something that can actually determine if someone is using a fraudulent ID, e.g. because they breached some DMV or some other company's servers and now have access to their customers' IDs.

This is government's problem luckily, not that of private companies who would need to offer account identity bootstrapping.

But it's actually the user's problem if it means the government's system has poor security and allows someone else to gain access to their account.

Based on the amount of crypto fraud that has occurred (~$72B and counting [2]), government identity web of trust is much more robust than "not your keys, not your crypto" and similar digital only primitives.

The vast majority of these are from custodial services, i.e. the things that don't keep the important keys in the hands of the users. Notably this number (which is global) is less than the losses from identity theft in the US alone.

The general problem also stems from "crypto transactions are irreversible" rather than "crypto transactions are secured by secrets". Systems with irreversible transactions are suitable for storing and transferring moderate amounts of value, as for example the amount of ordinary cash a person might keep in their wallet. People storing a hundred million dollars in a crypto wallet and not physically securing the keys like they're a hundred million dollars in gold bars are the fools from the saying about fools and their money.

kccqzy
3 replies
1d2h

Well previously when stock trades involved exchanging physical certificates, I could imagine that ownership could evaporate if you lost that piece of paper. Or just think about cash: you do lose that ownership when you lose that magical piece of paper. It's a simpler world when what you have physically determines what you own.

toomuchtodo
0 replies
1d1h

People want a just world (imho, n=1, based on all available evidence, etc), recourse, and protections, not a simple world. Interestingly, cash will likely be the last to go in the near future from a “possession of value” as the world goes cashless (although whether this is "good" or "bad" can be argued in another thread).

https://en.wikipedia.org/wiki/Cashless_society

josephcsible
0 replies
1d2h

If the deed to land or the title to a car gets destroyed, what happens? It doesn't suddenly forever become unownable.

recursive
3 replies
1d2h

How would this work? If this was possible, that would mean an Apple employee is verifying the ID. This has failure modes. See SIM swapping attacks.

josephcsible
1 replies
1d2h

Aren't SIM swapping attacks only such a problem because you can get a new SIM without showing up in person with ID?

recursive
0 replies
1d

No, they're also a problem because you can run into a storefront and snatch the employee's authenticated tablet, regardless of what company policy is.

mlyle
0 replies
1d2h

There's a wide set of possible approaches between "let any employee validate any ID" and "never let someone into an account that they have lost the credential to."

E.g. you could make it costly to attempt, require a notarized proof of identity -and- showing up at the Apple store, and enforce an n-day waiting period. A different employee does the unlock (from a customer service queue) than accepts the paperwork.

We don't lock people out of financial accounts forever when they forget a credential. It could definitely be solved for other types of accounts.

lazide
3 replies
1d3h

Have you seen how easy it is to get fake government ID? It’s damn near a rite of passage for teenagers so they can buy alcohol. $20-$50 if you know the right person or can navigate the dark web.

I’m not sure you want that to be the absolute best digital security you can get.

SoftTalker
1 replies
1d3h

Yes it is vulnerable to an attacker who is willing to present himself in person with a fake ID to target a specific account. However it's not scalable or remotely exploitable.

lazide
0 replies
1d3h

Since it requires a human looking at an ID and then pressing a button, the system triggered by the button press is likely quite exploitable no? Or even worse, scanning and storing an ID, which allows spoofing if those get compromised.

Recovery key isn’t susceptible to that - and isn’t susceptible to fake-id-spotting-ability or bribeability of staff either.

josephcsible
0 replies
1d2h

Okay, then also require a photo when opting in to this, and make sure the person who shows up looks like said photo too.

semiquaver
1 replies
1d3h

Why can't they offer a last-resort option like showing up in person at an Apple Store with government-issued photo ID?

This is easy to defeat and completely subverts the purpose of the system. If you are not comfortable with self-custody then don’t opt in.

nottorp
0 replies
3h25m

Only because you don't have proper IDs over there in the US?

I'd say a lot of identity problems are there because companies have to identify people without an official ID somehow...

matwood
0 replies
1d3h

I think their option for last resort is the trusted contact.

hedora
0 replies
23h59m

This is the default behavior if you don't turn this stuff on. They store your account recovery key in an escrow device.

The main problem is that walking into an apple store with a government-issued warrant works just as well as walking in with a government-issued ID.

nerdjon
11 replies
1d5h

You can set up a recovery contact in case you do lose the key. I just set that up with my partner, and the chance of losing the key and both of us losing all of our Apple devices is, I think, fairly slim.

I also stuck that key in 1Password (sure it's less safe, but if my 1Password was breached I have far bigger problems than this key being retrieved).

Then keep a hard copy in a safe. I've been contemplating sending my parents (who live several states away) a safe with keys on a sheet of paper without context, one that only I have the combination to. But not sure yet.

rjbwork
6 replies
1d5h

Then keep a hard copy in a safe. Been contemplating sending my parents a safe (who live several states away) with keys on a sheet of paper without context that only I have the combination too. But not sure yet.

A friend of mine who was (maybe is? he knows I'm not a fan so we don't talk about it much) big into crypto stores his secrets in similar safes with trusted friends and family around the country. I think it's a good idea for things like this tbh.

nerdjon
5 replies
1d5h

I think it is a good idea in theory too, but there is just that voice that says "well, now that key is out of my possession" and it scares me a bit.

I think I might need to look up whether there is a known pattern to these keys, such that it could be easily figured out what one is even on a sheet with no context. Particularly 1Password, which I think has a pattern, if I remember correctly.

e40
2 replies
1d4h

Particularly 1Password which I think is a pattern if I remember correctly.

What does that mean?

hackinthebochs
1 replies
1d4h

Probably that the key has features that allow 1Password (and potentially anyone) to recognize that it's a 1Password key. E.g. fixed size, patterns of spaces or dashes, specific digits, embedded error correction, etc.

nerdjon
0 replies
1d2h

Yeah that is what I mean.

Similar to how a lot of package companies have a certain pattern, length, whatever for their tracking numbers. If there is a somewhat reliable way to say "this is a 1Password key" or "this is an iCloud key", it means that even without context it could be an issue.

IncreasePosts
0 replies
1d1h

Or, just apply some simple, easy to remember permutation to the key that no one would be likely to guess - eg rot13 the key, or add 1 to every character, move the first 14 characters of the key to the end of the key, etc.
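A sketch of that idea, combining two of the suggested transforms (rot13, plus moving the first 14 characters to the end); the function names are invented for illustration:

```python
import codecs

def disguise(key: str) -> str:
    # rot13 the letters (digits and dashes pass through unchanged),
    # then move the first 14 characters to the end.
    rot = codecs.encode(key, "rot13")
    return rot[14:] + rot[:14]

def undo(disguised: str) -> str:
    # Reverse the rotation, then rot13 again (rot13 is its own inverse).
    rot = disguised[-14:] + disguised[:-14]
    return codecs.decode(rot, "rot13")
```

Worth saying plainly: this is obfuscation, not encryption. It only helps if whoever finds the sheet doesn't suspect a transform was applied, which is exactly the threat model the comment above describes.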

lxgr
2 replies
21h27m

How many people own a safe? I personally don’t know anybody that does. I do know that safes sometimes get stolen.

saaaaaam
1 replies
10h54m

Obviously, you personally don’t know anyone who discloses that they have a safe. If you have a safe, you are keeping something valuable secure. The fewer people who know you have something valuable that needs to be secured, the better. If people don’t even know your safe exists, that reduces the chances of it being compromised.

lxgr
0 replies
6h15m

I know for a fact that many of my friends don’t own a safe, and I don’t think I’m an outlier here.

I don’t doubt that many people do, but it’s still not a solution for the majority of Apple users.

catlikesshrimp
0 replies
1d1h

Hard copy? Etch the string into a hard surface. My favorite is a rock in my garden. The characters face the ground, shielded from erosion. The visible surface of the rocks (all of them) is painted white for aesthetics.

Survives a fire, earthquake. No tornadoes or tsunamis here. Nobody has stolen any such rocks from here.

jareklupinski
5 replies
1d3h

lose the key and no one can get you back in to your account

sounds like a feature

"Want to totally restart your entire digital life? Just rip up your key :) Never worry about something from your past coming back to you ever again!"

bilbo0s
2 replies
1d1h

Only if you do everything at Apple.

You make posts on twitter, it's not protected the same way.

dylan604
1 replies
1d1h

I want to be upset that you've made a comment so obvious, yet sadly, there will be people in the wild who don't understand the silos platforms build. However, I doubt any of them are here reading this, but you never know.

hedora
0 replies
1d

Go read any thread about passkeys. :-)

headmelted
1 replies
1d1h

That seems like the worst option. Everything up to the free tier would stay there forever with no way for you to ever request it to be deleted.

sgjohnson
0 replies
22h27m

Turn on Advanced Data Protection before you rip up the key. Then it's all as good as deleted.

m_a_g
3 replies
1d6h

When you set up a recovery key, you turn off Apple's standard account recovery process.

However, if you lose your recovery key and can’t access one of your trusted devices, you'll be locked out of your account permanently.

I considered it before but I think it's just too much risk as I rely heavily on iCloud. On the other hand, I don't see the risk with the current method if you're smart enough not to fall for things like MFA bombing tactics.

asd4
2 replies
1d5h

The security researcher in the article was concerned about accidentally confirming the prompt on his watch.

I don't think it's a matter of being "smart enough". Human error can easily creep in when dismissing tens or hundreds of prompts.

hunter2_
1 replies
1d4h

The prompt UX should step into a special "bombed" mode when a frequency threshold is crossed, at which point accepting a prompt has fat-finger protection such as double confirmation steps, and declining all (or perhaps all that share a commonality, like same initiating IP address) becomes possible.
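A sliding-window sketch of that frequency-threshold idea (the class name and thresholds are invented for illustration, not anything Apple ships):

```python
import time
from collections import deque

class PromptGuard:
    # Track recent prompt timestamps; once more than max_prompts
    # arrive within window_secs, the UI should enter a "bombed"
    # mode with fat-finger protection and a decline-all option.
    def __init__(self, max_prompts=3, window_secs=3600):
        self.times = deque()
        self.max_prompts = max_prompts
        self.window = window_secs

    def record_prompt(self, now=None) -> bool:
        # Returns True when the threshold is crossed.
        now = time.monotonic() if now is None else now
        self.times.append(now)
        # Drop timestamps that have aged out of the window.
        while self.times and now - self.times[0] > self.window:
            self.times.popleft()
        return len(self.times) > self.max_prompts
```

The decline-all grouping could hang off the same object, keyed by whatever commonality (e.g. initiating IP) the prompts share.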

lazide
0 replies
1d3h

Or, you know, not allow this kind of brute forcing at all?

themagician
1 replies
1d1h

You can regenerate a new key from any logged in device, so you have to lose the key AND every device.

lxgr
0 replies
21h28m

Except if Apple decides, based on undocumented heuristics, that you do in fact need the key, as far as I’ve heard.

fennecbutt
1 replies
1d3h

Put in safe deposit boxes at 2 different banks or something, I guess.

lxgr
0 replies
21h26m

That raises the TCO of iCloud considerably.

lloeki
0 replies
23h17m

lose the key and no one can get you back into your account.

Incorrect: only Apple cannot.

You can voluntarily declare:

- recovery accounts: these trusted accounts can help you authenticate anytime.

https://support.apple.com/en-us/HT212513

- legacy contacts: these trusted contacts can access your account in the event of your death.

https://support.apple.com/en-us/102631

As for the "lose recovery key" situation: it is no different from hardware-token 2FA + recovery codes. Print multiple copies and spread them among trusted third parties.

ThomasBb
0 replies
1d3h

“ If you use Advanced Data Protection and set up both a recovery key and a recovery contact, you can use either your recovery key or recovery contact to regain access to your account.”

antihero
4 replies
1d7h

Also, buy some (at least three) YubiKeys and use them for your Apple ID verification instead of the dumb push MFA.

https://support.apple.com/en-gb/HT213154

someguydave
2 replies
1d5h

But is it the case that the Yubikey is essentially treated the same as a trusted device? What if I want to untrust my devices and only trust Yubikeys (without removing the devices from my iCloud account)?

antihero
1 replies
1d3h

I don’t seem to have the push option now

someguydave
0 replies
1d1h

Yes, but my understanding is that you can remove the Yubikey without possessing it, just with a "trusted device". I want to mark all of my devices untrusted (wrt iCloud account changes) and rely only on Yubikeys.

someguydave
0 replies
1d5h

From your apple doc:

“ When you use Security Keys for Apple ID, you’ll need a trusted device or a security key to:

Sign in with your Apple ID on a new device or on the web

Reset your Apple ID password or unlock your Apple ID

Add additional security keys or remove a security key”

Yubikeys do nothing except enlarge your attack surface.

fortran77
3 replies
1d4h

Wow! You'd think they'd rate limit these! Once you've done it twice, go to once every 15 minutes, then an hour, then 4 hours, then a day, etc. Like bad logins.
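The escalating schedule described above could look something like this (numbers mirror the comment; nothing here is Apple's actual policy):

```python
def reset_delay(attempts: int) -> int:
    # Seconds a requester must wait before the next reset prompt:
    # first two attempts are free, then 15 min, 1 h, 4 h, then a day.
    schedule = [0, 0, 15 * 60, 60 * 60, 4 * 60 * 60]
    if attempts < len(schedule):
        return schedule[attempts]
    return 24 * 60 * 60  # cap at one day per attempt
```

Even capped at one prompt a day, a bombing campaign drops from hundreds of pop-ups to a trickle, which is the whole point of backoff on bad logins too.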

nilsherzig
1 replies
1d2h

That would allow me to log you out of your accounts

prepend
0 replies
1d1h

No, it wouldn't affect login status. Just a delay between reset attempts.

No reset actually occurs until one prompt is accepted.

WorldMaker
0 replies
1d

Krebs notes that the recovery form does have some form of CAPTCHA on it, which mostly just goes to show that CAPTCHA systems are a poor and increasingly deficient rate limiter.

ETA: Also, from a user-experience standpoint, even one attempt a week is still enough to deeply annoy a user getting popups on all their devices. This is one of those cases where rate limits probably still can't solve the user irritation.

viraptor
2 replies
1d10h

It's not a recent approach, but this is a recent campaign using it against many people. Someone likely got a list of hacked passwords from some recent dump and is going through the Apple accounts on it.

lloeki
0 replies
1d10h

I ventured as much. Given the amount of messages and the personal details gathered, I also guess attacker tools have been significantly improved or streamlined.

criddell
0 replies
1d7h

How would that explain Chris’ experience at the Genius Bar?

vdddv
1 replies
1d6h

Interesting that using the recovery key stopped the issue for you, but does not seem to do its job now. From the article "Ken said he enabled a recovery key for his account as instructed, but that it hasn’t stopped the unbidden system alerts from appearing on all of his devices every few days.

KrebsOnSecurity tested Ken’s experience, and can confirm that enabling a recovery key does nothing to stop a password reset prompt from being sent to associated Apple devices. "

hx833001
0 replies
1d6h

A password reset prompt is sent to the devices, but unfortunately the article leaves out that the prompt only enables you to reset the password on the device that receives the prompt. So it is not a security issue, just an annoyance.

rvz
23 replies
1d10h

Yet another reason why phone number verification is the most insecure way to verify users, and it doesn't matter if a company like Apple is using it or your bank with its so-called 'military-grade encryption'. The point still stands [4], with countless examples [0] [1] [2] [3].

Unless you want your users to be SIM swapped, there is no reason to use phone numbers for logins, verification and 2FA.

[0] https://news.ycombinator.com/item?id=36133030

[1] https://news.ycombinator.com/item?id=34447883

[2] https://news.ycombinator.com/item?id=27310112

[3] https://news.ycombinator.com/item?id=29254051

[4] https://www.issms2fasecure.com

yieldcrv
9 replies
1d10h

I think we should start doing product liability lawsuits to any organization capable of having user financial data affected from their account, that is using SMS one time codes as either default, enabled by default, and the heaviest legal remedies to financial organizations where that's the only option

we should also update PCI DSS compliance, or whatever the relevant security standard is, to classify SMS one-time codes as totally insecure

we can also reach out to the insurers these companies use and tell them to force removal of SMS one-time codes

do a multi-pronged assault on SMS one-time passcodes

guappa
7 replies
1d9h

I think the more urgent thing is to not use the social security number both as the ultimate secret, and also as a number you must give to hundreds of people.

yieldcrv
3 replies
1d8h

non sequitur, make a different thread for that cause

chgs
1 replies
1d8h

Not sure what sms one time codes has to do with this story either

yieldcrv
0 replies
1d6h

It’s one of the MFA methods Apple allows

guappa
0 replies
1d8h

Well, if you fine companies for using SMS for security… you should put the CEO in jail for authenticating with social security numbers… going just by the number of people affected by skimmed SMS versus stolen SSNs.

klabb3
0 replies
1d8h

both as the ultimate secret, and also as a number you must give to hundreds of people

Don’t forget the final nail in the coffin, which completes the trifecta: it’s entirely immutable - damage radius = infinite.

ImPostingOnHN
0 replies
1d5h

I think the more urgent thing is to end world hunger.

CatWChainsaw
0 replies
20h45m

That. I'm in favor of stopping this societal wave of making phone numbers the equivalent of digital SSNs (they're critical for digital life, everyone wants them, nothing good happens when you hand them out that freely).

adrr
0 replies
1d1h

Never going to happen on the consumer side. Consumers lose their devices way too often to make TOTP or passcodes viable.

Financial institutions can detect if your phone number has been ported or forwarded.

The bigger threat is phishing and password sharing between accounts. I ran tech at an investment firm / neobank with over a million customers and never saw an attack on SMS 2FA. We had email 2FA for a while, and a significant number of people shared passwords between their email and their bank.

saagarjha
8 replies
1d10h

This has nothing to do with SIM swapping or phone numbers.

jasode
4 replies
1d10h

>phone numbers.

On the official Apple reset form, the "phone number" is one of the id options the hackers can use to MFA bomb the target:

https://iforgot.apple.com/password/verify/appleid

The gp proposes a different "private identification string" that's not public. Public IDs such as "email address" or "phone number" are susceptible to what this article is talking about.

consp
1 replies
1d9h

On the official Apple reset form, the "phone number" is one of the id options the hackers can use to MFA bomb the target

Funny thing is you cannot set a passphrase or equivalent recovery code unless you have an Apple device. So users who have an Apple account for development purposes (I hate Apple device UX and won't ever use anything Apple again other than to approve releases and manage certificates) and own no Apple products are stuck using their phone number.

EasyMark
0 replies
1d2h

I used to be hardcore about stuff like this, but as I grew older I guess I gave up some of my morality and bought things like $150 iphone # and moved on with life if it was making me $$$.

xamarin
0 replies
1d10h

Yes, like password :)

gruez
0 replies
1d8h

Given that the gp was talking about victims being "SIM swapped", I strongly suspect he's referring to the classic sim swap attack where you sim swap, then use the newly registered sim to receive a password reset code. If it just involves discovering your phone number, you wouldn't need to sim swap at all.

The gp proposes a different "private identification string" that's not public. Public IDs such as "email address" or "phone number" are susceptible to what this article is talking about.

This is a non-starter for the general public. If they can barely remember their password what are the chances they'll remember a "private identification string" or whatever?

isoprophlex
1 replies
1d10h

TFA talks specifically about a victim buying a brand new phone, registering a new appleid, and getting MFA bombed immediately when putting in his old SIM...

klabb3
0 replies
1d8h

and getting MFA bombed immediately when putting in his old SIM...

I think it’s technically unrelated to the SIM, but rather to create the new Apple ID he used his existing (compromised, lol) phone number for “verification” or something. Which is weird in a way because then Apple must allow multiple accounts per phone number?

xamarin
0 replies
1d10h

That is not true. Please read the article: he even bought a new phone, and this did not stop the attack, because of the same phone number. I would not even call this an MFA attack, as they did not need his password. It is more like a password-recovery attack.

mhdhn
3 replies
1d10h

What's the recommended alternative for mere mortal hackers?

antihero
1 replies
1d7h

Use HSMs for your Apple ID MFA.

EasyMark
0 replies
1d2h

that brings in a whole new world of complexity and change that isn't for everyone.

rsync
0 replies
17h39m

I host my phone number at twilio and have built an SMS firewall between my public phone number and my actual SIM number.

Flatten texts to ASCII-256, blacklists/whitelists, priority tagging, SMS cc'd to an email box, multi-number simul-receive, and so on.

Well, you asked ...
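This is not rsync's actual implementation, but the core routing decision of such an SMS firewall could be sketched like this (the function name, keyword list, and block/allow lists are all illustrative assumptions):

```python
def filter_sms(sender, body, blocklist=(), allowlist=(),
               keywords=("verification", "code", "apple")):
    """Decide what to do with an inbound SMS before relaying it to the
    real SIM: drop it, flag it as priority, or forward it as-is."""
    # Normalize to plain ASCII so lookalike Unicode can't dodge keyword checks.
    ascii_body = body.encode("ascii", "ignore").decode()
    if sender in blocklist:
        return ("drop", ascii_body)
    if sender in allowlist:
        return ("forward", ascii_body)
    lowered = ascii_body.lower()
    # Messages mentioning verification codes get flagged for extra scrutiny.
    if any(k in lowered for k in keywords):
        return ("priority", ascii_body)
    return ("forward", ascii_body)
```

In a real setup this function would sit behind a Twilio inbound-message webhook, with the "forward" action relaying to the actual SIM number and a copy cc'd to a mailbox.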

Zetobal
14 replies
1d10h

Same problem with Instagram. It's insane that so many giant companies have no rate limits in their recovery flows.

WatchDog
13 replies
1d9h

The problem with adding rate limits, at least a global per user rate limit, is that you then create a new denial of service issue, preventing people from being able to recover their account.

AtNightWeCode
7 replies
1d9h

Rate limiting per user is mostly a thing of the past. You set other rate limits and various rules and then get the rate limit per user for free.

millzlane
6 replies
1d6h

Rate limiting per user is mostly a thing of the past

Someone please tell this to fidelity. After 3 wrong password attempts they lock your account.

k8svet
4 replies
1d6h

Fidelity are clowns. They've spent an impressive effort breaking every god damn third party integration AND using Akamai to block scraping. I can scrape Ameriprise fine, but no matter how creative I get Fidelity gives back a weird error on login.

(This is on top of them not sending any actionable email when changing my contributions to 0 in between pay periods)

I'm rolling my 401k out as often and fast as possible. I hate American banks so much.

ryandrake
1 replies
1d5h

Fidelity are clowns. They've spent an impressive effort breaking every god damn third party integration AND using Akamai to block scraping.

What’s funny/sad is they probably pat themselves on their back thinking their security is so advanced and awesome. Financial services web integrations are all total clown shows.

EasyMark
0 replies
1d2h

but can't you buy API access? I would assume that's more of a business decision to promote paid for API access, rather than "security" against scraping.

mananaysiempre
1 replies
1d6h

us[e] Akamai to block scraping

Would https://github.com/lwthiker/curl-impersonate help? Haven’t tried with Akamai, but did help with another widely used CDN that shall remain unnamed (but has successfully infused me with burning hate for their products after a couple of years’ worth of using an always-on VPN to bypass Internet censorship and/or a slightly unusual browser).

k8svet
0 replies
20h37m

I'm using this to fill forms interactively and emulate a user. https://github.com/rust-headless-chrome/rust-headless-chrome

Afaict, it drives a stock Chromium instance. I'm not sure how Fidelity is detecting it, but they detect it even in normal headful mode. Idk if there's some JS that notices there are no mouse movements.

It's just not worth the headache. I despise bending over backwards for companies like this. But obviously I have no choice since they're my 401k plan facilitator.

dpkirchner
0 replies
1d4h

And they convert usernames to sets of digits so they can be entered more easily on phones. Naturally this results in a lot of collisions.

tdudhhu
1 replies
1d9h

Why? You can rate limit the business logic but still show the user the default flow.

For example: if a user requests a password reset link 10 times a minute, you can send the link only once but display every time that a reset link was sent by email.
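A minimal in-memory sketch of that debounce (the `ResetLinkSender` name and the outbox stand-in are made up for illustration; a real service would persist state and key on hashed identifiers):

```python
import time

class ResetLinkSender:
    """Send at most one real reset email per cooldown window,
    while always reporting success to the requester."""

    def __init__(self, cooldown_seconds=300, clock=time.monotonic):
        self.cooldown = cooldown_seconds
        self.clock = clock
        self._last_sent = {}  # account id -> timestamp of last real send
        self.outbox = []      # stand-in for a real mailer

    def request_reset(self, account_id):
        now = self.clock()
        last = self._last_sent.get(account_id)
        if last is None or now - last >= self.cooldown:
            self._last_sent[account_id] = now
            self.outbox.append(account_id)  # actually email the link here
        # Always the same user-visible response, so an attacker can't tell
        # whether a mail was sent (or whether the account even exists).
        return "If an account exists, a reset link has been sent."
```

Ten rapid requests produce one email, yet the attacker sees an identical response every time.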

WatchDog
0 replies
1d9h

This flow is a bit different from a password reset email, it's a notification with a direct call to action, allow or deny.

You can't debounce them like you can with a reset password email flow.

With a typical password reset email, the actual password resetting is done by the user after they click the link in the email, only someone with access to the email can proceed, and they can only proceed on the same device that they clicked the email link.

In this flow, there is no further on-device interaction.

forgotmyinfo
0 replies
1d1h

You're telling me Facebook, with its billions of dollars and leetcode interviews, can't figure this out? That is outside the realm of computable functions?

faeriechangling
0 replies
1d3h

If you’re getting DOSed by identical prompts you already can’t recover your account since you’ll likely hit the wrong prompt. There’s no protection here against an MFA fatigue attack attempt.

EasyMark
0 replies
1d2h

it wouldn't be hard to add to the app though. obviously if you get a flood it's bullshit and more than a couple can be ignored. It's not rocket surgery

_def
13 replies
1d10h

I wonder how long it will take until another goal of these phone calls will be to gather enough samples to convincingly clone your voice.

ManBeardPc
5 replies
1d9h

There is already a variant where they try to get someone to say "yes" and then use a recording of it as "proof" that you agreed to some contract.

nebula8804
1 replies
1d

OMG that explains so much. I kept getting these calls where they would ask "Am I speaking with the head of the household?"...crap

cozzyd
0 replies
16h25m

Good thing for Google call screening ...

14
1 replies
1d

I actually don’t answer unknown callers with “hello” or any words at all. I simply say “mmmhhmm” or make a dumb sound; if it is automated, it will trigger the automatic message. Someone asked why and I said voice cloning software; they said wtf, you have nothing to steal. It just feels risky, idk why.

guappa
0 replies
1d9h

Phone providers have been doing this one in Italy for over a decade.

gruez
2 replies
1d8h

You're probably not going to get a voice clone from someone saying "hello?" 100 times. However, you don't really need to "MFA bomb" people to clone their voice; just call them with a plausible-sounding reason that will cause them to engage in an extended conversation (e.g. "hey, this is your uber/doordash driver/doctor/school/daycare").

rainbowzootsuit
1 replies
1d7h

I just really want to hear you say "passport" !

snowwrestler
0 replies
15h2m

Wait. A computer matched her with him? I don't think so.

rvz
1 replies
1d9h

Exactly this.

Another reason not to use phone calls (or phone numbers) to verify users, even with so-called 'voice identification' or 'voice ID', which can easily be broken with advanced voice cloning.

_def
0 replies
1d8h

Recently I was baffled by how far we've come with this. It doesn't work perfectly, but it could be enough to fool someone. Just one pip install and a short voice sample away: https://github.com/coqui-ai/TTS

Sarkie
1 replies
1d9h

Good fucking point this

ted_bunny
0 replies
1d3h

Bad comment, this. Just upvote.

tanelpoder
12 replies
1d

There's an important omission in the article and the top comments here don't mention it either: Accidentally tapping "Allow" does not allow the attacker to change the password on their web browser. When you tap Allow on your device, you are shown the 6-digit pin on your device and you can use it to change your password on your device. The final part of the attack is that the attacker calls you using a spoofed Apple phone number and asks you to read out the 6-digit pin to them. If you choose to give out the 6-digit pin to the attacker over an incoming phone call, then they can use it in their browser to reset your password.

It's surprising that Krebs chose to omit this little detail in the security blog and instead seemed to confirm that someone could completely give away access to their account while sleeping.

WheatMillington
5 replies
23h12m

He describes this in the very first paragraph of the article:

Assuming the user manages not to fat-finger the wrong button on the umpteenth password reset request, the scammers will then call the victim while spoofing Apple support in the caller ID, saying the user’s account is under attack and that Apple support needs to “verify” a one-time code.

rootusrootus
4 replies
23h2m

That seems to be an entirely different point. Krebs suggests repeatedly that all you need to do to get hacked is click "Allow" in the push notification. This is demonstrably false.

"Assuming the user manages not to fat-finger the wrong button" means "assuming the user clicks Don't Allow". They call on the phone to try and convince the user to say Allow next time.

Of course that's kinda BS too, because the only time "Allow" gives you a six digit code is if you successfully authenticate your apple ID on a new device. If you get the reset password dialog, the result of Allow is not a six digit code, it just allows you to reset the password. Yourself. On your device.

WheatMillington
3 replies
22h18m

Are you reading the second half of the sentence I posted? Sorry but I'm not understanding where you are coming from - Krebs lays out clearly in the first paragraph how the attack works and you seem to be deliberately ignoring that.

rootusrootus
1 replies
22h12m

No? I thought I specifically addressed that. They call you on the phone and ask for a code you won't have, even if you hit Allow.

What I find interesting is that Krebs didn't do any legwork to verify the claims before publishing.

trompetenaccoun
0 replies
3h55m

Why wouldn't you have the code? I thought your device shows the code when you press 'allow'.

misnome
0 replies
11h16m

Are you reading the first half of the sentence you posted? They are clearly implying pressing the wrong button would be dangerous.

It’s a bit confused about what exactly the problem is, so is a little self-contradictory (including elsewhere in the article).

madcadmium
4 replies
11h46m

It's in the article:

Ken didn’t know it when all this was happening (and it’s not at all obvious from the Apple prompts), but clicking “Allow” would not have allowed the attackers to change Ken’s password. Rather, clicking “Allow” displays a six digit PIN that must be entered on Ken’s device — allowing Ken to change his password. It appears that these rapid password reset prompts are being used to make a subsequent inbound phone call spoofing Apple more believable.

throw20240328
3 replies
8h54m

I did not see this when I read the article. Upon rereading it now, I see this:

Update, March 27, 5:06 p.m. ET: Added perspective on Ken’s experience.

Internet archive confirms that this was the edit: The paragraph you quoted was added to the article the next day.

trompetenaccoun
2 replies
3h49m

Anyone who edits news articles, blog posts or such without clearly disclosing the edit immediately loses my trust. It's a huge problem these days where everything is online instead of in print, but most people do not want to take responsibility for sloppy research or misleading reporting. And that's part of the reason why there is so much misinformation, it sometimes comes from trusted sources too, not just anonymous social media users.

shkkmo
1 replies
2h38m

I agree with you.

However, in this case, the edit is disclosed at the bottom of the article. Do you think this isn't sufficient? Does the edit disclosure need to contain a link to a diff of the changes or does it need to be at the top?

tanelpoder
0 replies
48m

If you look into the edits in Wayback Machine, you see that previously, the "Ken's experience" was:

"Unnerved by the idea that he could have rolled over on his watch while sleeping and allowed criminals to take over his Apple account, Ken said ..."

Once the article was updated, the original sentence implying that criminals could take over your account while you are sleeping was completely rewritten to say the 180 degree opposite - completely reversing what the initial sensational content said. In reality it is not possible to accidentally hand over your account to attackers by accidentally tapping Allow on your watch in your sleep.

The update disclosure only says: "Added perspective on Ken’s experience."

mattmaroon
0 replies
21h44m

Fair, and good to know, but I could still easily see reasonable people (not just 80 yr olds with their Obamaphone) falling for this.

And even if not, there's a severe annoyance factor here that could be simply removed by Apple rate limiting these requests. Why can they send you hundreds of these in a short time?
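One common way to implement that kind of limit is a per-account token bucket; here is a toy sketch (the capacity and refill rate are arbitrary assumptions, not anything Apple actually does):

```python
import time

class TokenBucket:
    """Allow a small burst of password-reset prompts per account,
    then refill slowly (e.g. 3 prompts, one new token per hour)."""

    def __init__(self, capacity=3, refill_per_second=1 / 3600,
                 clock=time.monotonic):
        self.capacity = capacity
        self.refill = refill_per_second
        self.clock = clock
        self._state = {}  # account -> (tokens remaining, last timestamp)

    def allow_prompt(self, account):
        now = self.clock()
        tokens, last = self._state.get(account, (self.capacity, now))
        # Credit tokens for the time elapsed since the last request.
        tokens = min(self.capacity, tokens + (now - last) * self.refill)
        if tokens >= 1:
            self._state[account] = (tokens - 1, now)
            return True
        self._state[account] = (tokens, now)
        return False
```

A burst of requests drains the bucket after three prompts; an hour later, exactly one more is allowed, which keeps legitimate recovery working while killing the bombing pattern.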

mavamaarten
8 replies
1d7h

I've been getting these on my LinkedIn account for the past couple of days. Every few hours I get an email with a magic login link. They seem legitimate, originating from various locations around the globe.

standing_user
5 replies
1d7h

Happened to me yesterday. I was baffled, but then I found that you can request the one-time password using just the email associated with the LinkedIn account, so the password wasn't compromised.

I changed the password and the main email, and in LinkedIn's privacy settings I removed the visibility of the email.

pinebox
3 replies
1d5h

Linkedin will silently change your visibility settings without your consent.

estiaan
2 replies
1d3h

Do you have a source for that? Or any more info? It’s not that I doubt it; I ask because some details like my work email, job title, and place of employment have been leaking into the hands of marketing companies and I am trying to figure out how.

forgotmyinfo
1 replies
1d1h

Your own company could've sold it to data brokers. Look into Equifax's Work Number score, it includes fun things like where you worked and how much you made. But no, let's not unionize or anything.

azinman2
0 replies
4h20m

What I don’t get is what is the ploy here. I’m getting them too, but have no indication my email is hacked. Therefore what’s happening?

m-p-3
0 replies
1d7h

I get these too, I wish I could turn the feature off in my account, especially since I already have multiple forms of 2FA (TOTP, Passkeys).

donovanallen
0 replies
23h25m

It's been Uber for me

type_Ben_struct
6 replies
1d11h

I’m still disappointed by Apple's implementation of security keys. I want to be able to prevent all 2FA methods other than security keys, but it still seems possible in certain flows to authorize a new login with another iOS device, making it vulnerable to this attack.

EasyMark
0 replies
1d2h

I would if I was doing something that needed heavy security, but I'm just a boring average joe. My critical accounts are protected by TOTP on one (backed up) device only; other things are kind of "good enough" with passkeys and passwords. If I ever become a criminal mastermind or double agent I'll probably dive into such methods though.

dm
1 replies
1d8h

What flows have you found not to use security keys?

ThePowerOfFuet
0 replies
1d8h

All of them.

lloeki
0 replies
1d10h

Interesting. I was contemplating moving to security keys (which, according to the setup flow, "replace verification codes"), but IIUC you're saying one can still fall back to verification codes in some flows?

prmoustache
5 replies
1d8h

I have hated Push MFA since it was introduced.

How hard is it to just type a code, really? In the end, to fight push bombing, you end up with a push notification that asks you for a code anyway.

antihero
3 replies
1d7h

You can instead opt to use HSMs for your Apple ID MFA. I have 3x YubiKeys in various locations for this exact purpose.

https://support.apple.com/en-gb/HT213154

gruez
1 replies
1d4h

They mention "FIDO® Certified* security keys", this presumably means physical keys only, and not soft keys like the ones that keepassxc/bitwarden provides? If so that might be too much of a hassle for me. I care about my security, but I don't care enough to drop $100 on 3 separate security keys, and finding 3 separate places to keep them secure.

hocuspocus
0 replies
1d1h

You need two keys, not three.

But yes, I wish you could use one hardware key as backup and one software key for day-to-day usage, or at least use the security key in a trusted device (up to you whether to have a circular dependency on your main device).

someguydave
0 replies
1d1h

It does not help you when a trusted device is stolen; the YubiKeys can be disabled if they unlock your phone or device.

gruez
0 replies
1d8h

At least for iCloud sign-ins (not sure about password resets, too lazy to check), clicking "Allow" doesn't allow the sign-in; it only displays a 6-digit code that you have to enter to log in.

rekoil
3 replies
1d4h

At some point the ability to trigger these prompts (or ones like them, like the Bluetooth-based setup new device prompts that were in the news last year) on Apple devices is itself the problem right?

Obviously it must be possible to reset one's password, but from the article it's apparently possible to make 30 password reset requests in a short amount of time.

What possible non-malicious reason could there be for that to happen?

gruez
2 replies
1d4h

None, it's just that they haven't bothered adding a check for them. This isn't necessarily an indictment of them. It makes sense in hindsight, but between sprints, OKRs/KPIs, and promotion packets, it's easy to let non-sexy functionality like this slip through the cracks.

forgotmyinfo
1 replies
1d1h

It's distressing and sad that we've come to expect so little from the trillion-dollar market cap companies to which we are beholden to participate in modernity.

zubairshaik
0 replies
20h8m

It's not as alarming if we just reframe it. Apple's software is written by developers, like many HN readers, and they follow similar internal processes. There is nothing inherent about having a large market cap that makes everyone involved superhuman. Some issues always slip through the cracks.

I'm surprised to see this comment on HN where many readers see how the sausage is made. There's no secret sauce, no matter how far up in FAANG/MANGA you get.

WarOnPrivacy
3 replies
1d3h

he received a call on his iPhone that said it was from Apple support.

"I said I would call them back and hung up," Chris said, demonstrating the proper response to such unbidden solicitations."

We're long-conditioned to assume that calling a large company and reaching a human will be difficult to impossible - and if we succeed, it will be an unpleasant experience. Much more so for a major tech company.

As far as this scam succeeds, it's partially due to intentional business designs.

someguydave
1 replies
1d1h

This is true, and it is because the public is mostly too inept to be responsible for themselves

WarOnPrivacy
0 replies
21h6m

This is true, and it is because the public is mostly too inept to be responsible for themselves

So why is an inept public responsible for major corps' choices to mostly remove phone-to-human customer service - and not the corporate poisoning by MBAs?

metanonsense
0 replies
19h58m

A few weeks ago, we had a major problem with our Apple developer account (which is registered to my name). For days, I tried everything to avoid calling customer support (for the above reasons) and only agreed when our release team started panicking. I was more than surprised at how incredibly good Apple‘s support team was. Recovering from the problem was quite difficult (and the circumstances that led to it made me question Apple’s SW dev capabilities), but the support experience was simply perfect.

woadwarrior01
2 replies
1d5h

Quite shocking how oblivious a lot of ostensibly tech savvy people are to the existence of hardware security tokens. Yubikeys have been around for over 15 years now, although Apple only added support for hardware tokens recently.

https://support.apple.com/en-us/HT213154

someguydave
0 replies
1d1h

They don’t help in the case that your unlocked phone is stolen

recursive
0 replies
1d2h

I know they exist. I just don't really know how they work or what they do.

rootusrootus
2 replies
22h53m

This seems like it is entirely a human problem, not any kind of technical failure. The fix is the same as it always was -- people need to be trained to say no by default, do not trust inbound calls ever, and never ever share your credentials.

If you follow that advice, this attack poses no risk other than annoyance. If you do not give your password to the creep who calls you claiming to be apple support, you will be okay.

dimgl
0 replies
22h45m

people need to be trained to say no by default, do not trust inbound calls ever

This really sucks though. It basically means that our current phone system is inherently broken and something that was potentially useful before is no longer useful due to malicious actors.

ascorbic
0 replies
21h24m

A system that lets an attacker send hundreds of push notifications, effectively making a phone unusable until you click "allow" is a technical failure. So is one that lets an attacker spoof Apple's caller ID. Sure, that one is a failure with caller ID in general, but it's not beyond Apple's ability to special-case its own numbers.

chrisjj
2 replies
1d8h

even though I have my Apple watch set to remain quiet during the time I’m usually sleeping at night, it woke me up with one of these alerts.

So... Apple Watch "quiet" is broken??

brookst
1 replies
1d5h

I find sleep focus mode much more reliable than the silent switch. It’s confusing they have both.

aareet
0 replies
1d5h

I think it is just a transition period until they can get rid of models with the switch in their lineup. Since the action button is now configurable, it could soon turn back into just focus modes as the configurable way to silence your phone

chatmasta
2 replies
23h25m

he received a call on his iPhone that said it was from Apple Support (the number displayed was 1-800-275-2273, Apple’s real customer support line)

This happened to me exactly once, and it was two days after I ordered a new MacBook from the online Apple Store. Since I was expecting a shipment, I almost picked it up. But instead I called Apple Support myself, and asked if they had called me, and they said they had not.

gnicholas
1 replies
18h10m

Did you order right after a new model was released (as many people do), or did they just get lucky in calling you soon after you placed an order?

chatmasta
0 replies
16h20m

It was in December 2020, so maybe? I can't remember. I had just incorporated an Inc. (and received seed funds, but those weren't publicly available or announced), so maybe some of that info was a trigger. But it was surprisingly well-timed, for sure. It was within three days of placing the order.

I suppose it could have also been as simple as "it's Christmas shopping time." I remember what was most surprising was seeing the caller ID, which I think was actually "Apple Inc," and which was saved as a contact in my phone.

mcintyre1994
1 replies
1d6h

That message is horribly designed if it allows a password reset to happen on any other device after you click allow. It specifically says "Use this iPhone to reset". I'd have assumed it asks the person who clicked allow to set a new password, on the same device they clicked allow.

Then again if it shows on the watch too (and isn't just mirroring a phone notification, since it ignores quiet mode), I can't imagine the idea is you click allow on your watch and then type a password on its keyboard?

fortran77
0 replies
1d4h

That message is horribly designed if it allows a password reset to happen on any other device after you click allow

This was a lifesaver when my 90-year-old mother forgot her iMac password (and I forgot that I had created a second admin account on her machine). After getting locked out of the iMac, we were able to reset it because we were able to get into her iPad (which she forgot the PIN to, but fortunately we found it written down).

honzaik
1 replies
1d9h

I am confused. What does happen after clicking allow? Does Apple just provide a password reset form to the person on the iForgot website or does it show up only on the device?

viktorcode
0 replies
1d4h

I think it will show you the confirmation code on the device. Then the scammer will call to learn the code.

fennecbutt
1 replies
1d3h

B-but iPhones are secure and are the best and Apple spends so much money on security to keep us safe and don't need any government/EU oversight at all. Proof that Apple's "it's for your own good" has always just been marketing.

(Don't get me wrong, let's go after Google, MS, Sony, et al too!!!)

ghodith
0 replies
1d3h

I don't see where EU regulations would have helped in this case.

shuntress
0 replies
22h26m

It still seems wrong to me that we, as a society, have basically accepted this level of crime as just a constant sort of background noise in daily life.

paul_h
0 replies
11h50m

The fatigue part: if you clicked allow, and the hackers called you for the second step, but you responded "I understand you're a hacker and are wanting to steal from me in some way, but I am only going to give you incorrect pin numbers, so please stop with the reset dialogs and update your database not to try it again with me" .. would they stop? /s

nerdjon
0 replies
1d5h

The lack of rate limiting is surprising, either on the server side or the OS side (or both).

I mean they already lock my iPhone after too many failed attempts with my passcode and it gets longer each time, I feel like the lock here should be the same.

A better prompt would also go a long way.
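The escalating lockout described above can be sketched as a simple doubling delay (the thresholds and cap here are arbitrary assumptions for illustration, not Apple's actual values):

```python
class EscalatingLock:
    """After a few free attempts, double the lockout delay on each
    further failure, similar in spirit to the iPhone passcode lockout."""

    def __init__(self, free_attempts=3, base_delay=60, max_delay=3600):
        self.free_attempts = free_attempts
        self.base_delay = base_delay
        self.max_delay = max_delay
        self.failures = 0

    def record_failure(self):
        self.failures += 1

    def current_delay(self):
        """Seconds the account stays locked after the latest failure."""
        over = self.failures - self.free_attempts
        if over <= 0:
            return 0
        # 60s, 120s, 240s, ... capped at an hour.
        return min(self.max_delay, self.base_delay * 2 ** (over - 1))
```

Applied to reset prompts, the first few requests sail through, but a bombing run quickly hits hour-long waits between prompts.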

kevrmoore
0 replies
19h57m

This happened to me about 2 yrs ago. It catches you off guard when you receive a spoofed call from Apple Care as you are being bombarded with PW reset requests from your iCloud. Of course, the hacker is really good and answers all the Apple-related questions fluidly. I believe my account data came from the big Ledger hack, so they were targeting crypto holders. iCloud security was so weak back then!

chefandy
0 replies
1d3h

I've been too immersed in university happenings recently. It took me clicking on the link and reading until "password reset feature" to realize that this wasn't some bizarre phishing attack involving Masters of Fine Arts degrees.

JohnMakin
0 replies
1d1h

my mfa applications do not work on any other device, even if it’s restored from icloud. However, this would still be incredibly concerning.

CodeWriter23
0 replies
3h7m

I think the way the attacker probes whether the victim is using an iPhone is by spamming via Messages, Beeper-style, through Apple's Messages servers and interpreting the error codes.