23andMe confirms hackers stole ancestry data on 6.9M users

skummetmaelk
69 replies
8h6m

This disaster is the perfect counter-argument to those always saying "why do you care so much about privacy. It doesn't affect you when I share things. You can just choose not to do it", except no, I can't choose when we're relatives and you chose to share our genome.

It is so obvious that your relatives sharing their genomic data with 23andMe reveals a lot of information about you. We can only hope people will realize that this also holds true for collecting behavioural data on other people sharing the same background as you.

fsckboy
30 replies
7h18m

I'm in favor of privacy, and I'm willing to go more out of my way to not share than the vast majority of people, but I'm also in favor of individual choice, and I can't think of a privacy model that would disallow other people from sharing their information just because you have some matching information.

kfk
19 replies
7h12m

FYI, the police are now able to find criminals by finding DNA sequence similarities with your relatives. Not saying this is good or bad, I am just saying you don't know the extent of the impact on your personal freedom when your relative's DNA is shared.

pfdietz
14 replies
4h38m

I can help track down distant family members who have committed crimes? Sounds like a plus.

I think the angst about this comes from men who don't want their status as fathers of illegitimate children (or, rapists when they were younger) unmasked.

mschuster91
10 replies
4h16m

I can help track down distant family members who have committed crimes? Sounds like a plus.

It's no longer so easy when the definition of "crime" gets expanded. Let's take this scenario:

- you're a first generation Chinese immigrant in the US

- a nephew of yours is in China and critical of the CCP

- you decide to have your genome scanned into 23andme or whatever to determine if you are at risk of genetic illness

- your nephew sprays an anti-CCP tag on a wall somewhere

- the Chinese police gather DNA evidence from a carelessly discarded spray can, but don't have fingerprints, so they can't immediately link the can to your nephew

- the Chinese government, either via a legal subpoena or via espionage, gets its hands on your genetic profile from the genetic analytics company

- the Chinese government finds your data, now knows that the sprayer must be related to you in some way, and forces every member of your family to submit to a DNA test

Sounds dystopian? Yes. But this is exactly where we are headed. Police here in Germany already run DNA tests over petty vandalism [1].

[1] https://www.fuldaerzeitung.de/fulda/fulda-bahnhof-neuhof-dna...

pfdietz
9 replies
4h14m

It's precious that you imagine not getting your DNA sequenced will provide any sort of shield against dystopian governments.

This sort of thing looks more like a psychological crutch than an actual effective action.

dylan604
8 replies
3h57m

That’s not what the comment was driving at. At all. It’s about how data you think is innocent can be used in a manner you never thought about nor intended for dark purposes.

pfdietz
3 replies
3h54m

Fair. On the other hand, I'm a bit surprised that anti-immigrant forces in the US haven't made DNA sampling compulsory for new immigrants. The argument would be that they would be harder to track down by these techniques, because their ancestry information is not as available, giving them an "unfair" advantage over white Americans.

peyton
1 replies
3h4m

I don’t think that fits with how people on this side of the pond think about immigration.

dylan604
0 replies
2h16m

It fits with how some people think about immigration on either side of the pond. That some is close to 50% on the western side of the pond.

kasey_junk
0 replies
2h28m

The US does do DNA collection for anyone it detains whether they end up being granted legal status or not.

They were processing so much DNA that they had to write a special rule allowing border agents to _not_ collect it if it would cause operational difficulties to do so.

https://www.federalregister.gov/documents/2020/03/09/2020-04...

mschuster91
3 replies
3h24m

I had actually intended to point out the dangers of "scope creep". Everyone is happy with a lot of pretty invasive stuff - dragnet surveillance, targeted surveillance (i.e. bugs placed in a suspect's home/car/computer/phone), DNA and fingerprint mass tests, no-knock raids - in severe crime cases such as terrorism, murder, rape, child sexual exploitation or abduction. So far, so good, and almost all Western countries have had such provisions for decades, introduced under the premise "it's only going to be used for <prior list of severe crimes>".

But in recent years, the scope of said "severe" crimes list has expanded massively across the Western world, driven by both powerful industry lobbies (such as the copyright cartels) and "concerned citizens" aka authoritarians in disguise... and now you get a DNA investigation over roughly €4,000 in damage to broken glass and a ticket machine. No matter what: this scope creep is not justifiable.

On top of that comes the risk of "what if our governments and the tools/data they and society (both in the form of individuals and companies) possess fall into the hands of authoritarians". For a long time this risk has been laughed off, but nowadays both the far-right (in Europe and the US) and the far-left (in Southern America) have seriously raised the probability of such a scenario.

pfdietz
1 replies
2h31m

Why is DNA investigation supposed to be limited to "severe" crimes? It's just another investigative tool. The idea that it should be limited implies there's something sordid about it. Why should I accept that implication?

An amusing thing here is that the arguments against DNA were also made against the use of photography, back in the 1800s. At some point people have to realize that personal unease is not an argument.

dylan604
0 replies
1h23m

At some point people have to realize that personal unease is not an argument.

That's not how it works though. If you find enough other people who have the same uneasiness, then you can form groups that get people elected to make rules that force everyone else to comply with your uneasiness.

dylan604
0 replies
2h15m

Everyone is happy with a lot of pretty invasive stuff

I beg to differ. The fact we're even having this discussion means not everyone is happy with the situation. Maybe Stockholm Syndrome has kicked in for you, but I'm still resisting

Majestic121
2 replies
4h22m

Why do you make this issue gendered, and if you do, why would it impact only men who fathered illegitimate children, and not cheating mothers?

watwut
0 replies
2h42m

Because in the case of a cheating mother, it is clear she is the mother. And to confirm the fatherhood of a husband or partner, no external registry is needed or helpful.

pfdietz
0 replies
4h15m

The mother will already be connected to the child. The father is what would be needing tracking down.

I mean, wasn't that completely obvious?

HenryBemis
3 replies
6h14m

Well they can narrow it down to the family, unless it was the very DNA giver who left that DNA sample at the scene of the crime.

And since 23andme (and, I assume, others) don't do these anonymously, there is no hope. Unless people use someone as a proxy (i.e. I (1) give my sample to a male colleague to send in as his (2), he (2) gives his sample to someone else to send in as his (3), and so on...). Police would eventually find the guilty party in the case of a crime, but the 23andmes of this world would be selling confused (wrong) data.

stef25
1 replies
5h8m

There are plenty of cases where DNA is found at the crime scene, run through a database, and a match is found with a relative. Then the cops start looking at the family and boom, there's your shady uncle with priors, and they've got their guy.

quickthrower2
0 replies
4h54m

Yes, it has come up a few times on Forensic Files, usually on cold cases.

dylan604
0 replies
4h0m

If this was someone trying to fly under the radar by using this scheme to buy burner phones or some such, sure. But this is literal DNA, so even in your attempts to obfuscate, they’d know the name and the sample do not line up, but then be able to link the sample to a family and then figure out who you really are

skummetmaelk
7 replies
7h8m

I can think of an easy model. Disallow collection of personal information. Pull the rug out from under "services" which are really just data collection fronts turning a profit from selling your data instead of the primary service/good for money transaction.

23andMe could still have operated legally under this scheme. They could have done the analysis and sent you a printed sheet. But no, they had to store everything to be able to double dip by selling the data to pharma companies and whoever else would pay for it.

If you can't turn a profit without underhandedly selling your users' data, you deserve to fail.

mewpmewp2
2 replies
6h5m

What about people who would want to donate their data to further the research?

seszett
1 replies
5h33m

They can enrol in studies at actual research institutes (non-profit, so they don't benefit from selling data, and probably publicly funded).

acuozzo
0 replies
4h49m

non-profit so they don't benefit from selling data

Non-profit in the US is a tax status. Many CEOs of non-profits enjoy multi-million dollar salaries and bonuses.

maxerickson
1 replies
4h42m

They are frank about also selling the data for research; it is not underhanded. It's even opt-in...

For example, they talk about it on this page, which is linked from the about menu (so available with pretty small effort): https://www.23andme.com/research/

I expect lots of people also like that they get updates when information about new markers becomes available.

sonicanatidae
0 replies
48m

Do I trust them to opt me out? Not at all. It's safer to just assume your data is being used regardless, because it's free money to them. If/when they get caught selling data marked as Opted Out, they'll get a pittance of a fine, paid with other people's money, and bonuses for making numbers that quarter.

You're welcome to trust them, but not I.

stef25
0 replies
5h11m

Disallow collection of personal information

It's all about the money, always. So not gonna happen.

joshuahedlund
0 replies
4h4m

They could have done the analysis and sent you a printed sheet.

Could they tho? The ancestry analysis itself is based on the data of other users in other parts of the world?

epicide
0 replies
2h29m

Nobody has perfect 100% individual choice/freedom. By itself, maximizing for it is a non-argument. The best explanation I've heard is that "my rights end where yours begin (and vice versa)". That is not an easy line to draw, so the debate becomes where exactly do we, as a society, decide to draw that line. (Noting that this also is never a singular, fixed answer)

Even without defining a specific model around how genetic data should be handled, I think it's more than fair to say that most people right now don't even consider how their choice to sign up for 23andme might affect their relatives (already born or otherwise). Even if they do, in my experience, it's only to a very surface-level degree.

croes
0 replies
6h24m

But if it's genetic information, it's not your data alone. It's your data, your parents' data, your children's data, etc.

losvedir
23 replies
4h51m

This disaster is the perfect counter-argument to those always saying "why do you care so much about privacy. It doesn't affect you when I share things. You can just choose not to do it"

While I agree it's a perfect counter-argument to that, is that what people always say? I'm not sure I've heard that argument as much as "why do you care so much about privacy?" full stop. As in, they don't really understand why anyone should care about privacy. And this isn't really a counter argument to that, any more than any other breach. And to be fair it's not really even a counter argument to that until you show the harm that came from it. What do you think will happen to people who had their ancestry data stolen here?

cj
14 replies
3h43m

I think the more common one I've heard is "Why do you care about privacy if you have nothing to hide?"

In the case of 23andme, it's a perfect answer: We don't know what's hiding in our DNA and I don't know how people will use that against me in the future.

pbhjpbhj
13 replies
3h27m

So, the reason for privacy is because the profit motive of capitalism is not sufficiently restrained as to protect citizens from being abused by corporations?

pc86
3 replies
2h30m

Be careful you don't break something with those gymnastics.

The immediate concern I had with this story is nefarious groups or individuals purchasing this data to target people with violence based on their ethnicities. Imagine if the genome of millions of Europeans was available on the black market in 1930s Europe.

bigtunacan
2 replies
57m

Considering one of the hacker's first actions was to offer for sale data identifying people of Jewish or Chinese descent I think that's a very valid concern.

jstarfish
1 replies
35m

Did anybody actually buy it though? This could be misdirection, or just misguided marketing based on historical instances of abuse. China isn't known for trying to repatriate descendants, and it's not exactly difficult to find Jews.

Ancestry data would certainly be of interest to a particular demographic known to discriminate by caste. There's no escaping your low-class heritage when anyone can look up your stolen DNA profile on the black market.

fragmede
0 replies
10m

"not exactly difficult"...

I'm not Jewish, but I feel like there's some sort of reason for them not wanting a list of who they are and where they live to exist.

partitioned
3 replies
3h14m

Or a rival country could create a virus that targets 80% of their enemy's population and only 20% of their own.

pc86
2 replies
2h30m

This is tin-foil hat nonsense.

jstarfish
0 replies
46m

Unless you speak Kikongo.

_jas
0 replies
2h10m

It is becoming far easier than you are aware, then. Sam Harris and Rob Reid discussed this at length a few years ago.

https://www.samharris.org/podcasts/making-sense-episodes/spe...

jairuhme
2 replies
3h17m

How do you make the leap to it being an issue of capitalism? There are plenty of bad actors who could use this information (or other hacked info) who are not a corporation seeking profit.

uoaei
0 replies
14m

Capitalism isn't about corporations, it's about capital.

lvass
0 replies
2h20m

Like North Korea which by far has the most state sponsored cyber thugs per capita.

wongarsu
1 replies
2h58m

My go-to is "what if literal nazis come to power and use this information to kick-start their eugenics program", but I guess rampant capitalism is also on the threat list.

wahnfrieden
0 replies
2h26m

There are already businesses that practice eugenics based on illegal data like this or illegal maps

CaptainZapp
3 replies
3h35m

What do you think will happen to people who had their ancestry data stolen here?

Sounds like an absolute treasure trove for a life insurance company. Or, would you disagree?

jalk
2 replies
3h23m

Yes, but one would hope that if an insurance company were caught using stolen data to calculate premiums, that would be the end of that company, with jail time for management (like the VW leaders responsible for the emissions-testing cheating).

psychlops
0 replies
1h28m

jail time for management

Funny! We all know it would be a lone rogue engineer that did it in the end and management would apologize on their behalf.

hnbad
0 replies
3h18m

That assumes they do so in a really stupid and straightforward way. LLMs already exist to "AI-wash" copyrighted material in ways that technically don't violate copyright. I'm pretty sure someone will find a way to create a dodgy shell company around a foreign B2B service that recycles this data for them in a way that is technically legal to use.

"Feed personal data into this service and it'll spit out a risk assessment based on a model built on 6.9M historical health data sets."

nulcow
0 replies
1h17m

I'm not sure I've heard that argument as much as "why do you care so much about privacy?" full stop.

I'm not sure I've ever heard anyone I know mention privacy at all, as if they're totally ignorant of it. In reality, the majority of people will just let Google or Microsoft do whatever with their personal information as long as the product or service is slightly more convenient than the last one.

mannykannot
0 replies
3h56m

You are not likely to hear the statement you are discussing unless you fairly frequently get into situations where someone says something like "why do you care so much about privacy?" and you then attempt to debate the issue.

It is not necessary to show actual harm from this breach for it to defeat the tacit premise behind the statement you are discussing, which is that their profligacy with their personal data cannot, by itself, reveal any of your personal data.

hulitu
0 replies
3h14m

"why do you care so much about privacy?"

Do you discuss family problems with all your neighbours? With strangers?

How would you feel if your employer knew everything you did last night?

White_Wolf
0 replies
1h18m

I wonder if that could be used as a list of possible organ donors. I don't know what else (data) is stored there tbh, but if it helps narrow down the search for a kidney or heart for someone rich...

spacebacon
5 replies
4h17m

Maybe we all shouldn’t be so Quic to create bad ideas.

dylan604
4 replies
3h55m

Right, like the person posting an idea on an Internet forum was the first and only person to have that idea. Security through obscurity does not work. It’s much better to open up the curtains and let the sunlight in. It’s the best disinfectant. At least then everyone is working with all of the information.

spacebacon
2 replies
3h7m

Blinds open btw. I’m picking up what you are gracefully throwing down but not without checks and balances.

dylan604
1 replies
2h12m

My checks are bouncing off the heavily skewed balance. These internet posts are pretty much the only "checks and balances", and they do diddly squat.

spacebacon
0 replies
1h15m

Written word anywhere always records to the record.

spacebacon
0 replies
3h26m

Touché, although people often identify what pharmacy they prescribe to with parroting other people’s phrases.

Melting_Harps
2 replies
6h56m

This disaster is the perfect counter-argument to those always saying

Personally speaking, I think Equihax was the better counter-argument; at least with 23andme YOU as a customer had to DECIDE to use their services and weigh the pros-cons of doing so, with Equihax I was forced into a rating system to determine my eligibility in a system that hoovers up any and all data sold to them by 3rd parties and holds all my personal information in order to complete anything from a loan application to a job application.

And when they were found to have been breached, no effective recourse was made; instead of admitting fault for the very high probability of identity theft being the end result, a token 'credit system monitoring' service was offered, which once again relies on these credit agencies who share/distribute this information without my consent. The agencies that created the problem are let off scot-free and never suffer any consequences.

In short, it's a naive argument made from often ignorant and self-defeating practices that make others worse off because of their complacency and refusal to take privacy seriously.

skummetmaelk
0 replies
6h53m

Completely true. However, Equifax was probably hard to wrap your head around. Whereas 23andme might seem a lot more personal and private to the average person. Of course, nothing is likely to come of this regardless.

SAI_Peregrinus
0 replies
3h26m

Not identity theft. Libel. There's a high probability a bank will libel people whose info Equifax leaked. They'll do that because they depend solely on the same (largely public) data companies like Equifax collect to identify loan applicants.

adolph
1 replies
5h0m

To clarify, genomic data was not reported stolen. It sounds like the breach was about genealogical data.

The stolen data included the person’s name, birth year, relationship labels, the percentage of DNA shared with relatives, ancestry reports and self-reported location.

sandos
0 replies
3h17m

Yes, and remember that such data is commonly widely shared, because it's mostly about long-dead people.

The real breach is for recently deceased people (here the time span varies greatly, but dead for ~100 years is definitely enough if you ask me) and for living people. Actually in Sweden the death info is generally publicly available right away, more or less. You can buy USB sticks with ~all deaths up until very recently.

mejutoco
0 replies
2h45m

Agree. Alternatively: how much do you earn? Do you mind if I read your physical mail? Can I have a key to your home?

I think it is difficult for some people to think about abstract ideas. When you bring it to the physical world everyone understands it is vexing.

hotpotamus
0 replies
2h3m

I guess I'm feeling a bit philosophical today, but in some sense, aren't we all part of a shared data structure given that we are all somewhat related? While there are a few bits that make us individuals, there is much that is shared, to the point that privacy doesn't seem truly possible.

ekianjo
0 replies
2h29m

nobody will listen to your counterargument. They don't care.

mrtksn
22 replies
7h34m

To anyone who used the service: do you know if you can use it anonymously? I wanted to try it but I was afraid of exactly this.

How feasible is it to use a payment method and an address that don't directly connect you to your samples?

poyu
8 replies
7h28m

I signed up with a fake name, delivered to a friend's address, and used a Privacy.com single-use credit card. No one is forcing you to use real credentials anywhere online.

fsckboy
6 replies
7h15m

there is no credential more comprehensively solid than your genome

mtlmtlmtlmtl
5 replies
6h56m

Yeah it's sort of like using Tor to log in to Facebook and expecting Facebook not to know who you are because your IP is obscured.

mrtksn
4 replies
5h55m

More like using Tor to create an FB account with made-up information and expecting FB not to know who you are because your IP is obscured and you use a fresh browser profile. Your DNA alone doesn't reveal your identity; there's DNA everywhere and it's not hard to come by. The analysis is only valuable if it can be connected to a persona, otherwise it's just a chemical reaction.

mtlmtlmtlmtl
3 replies
5h42m

No, your analogy makes no sense. A fake facebook account is not identifying information. DNA is. They can figure out who you are based on the DNA alone, if they have enough data from your relatives.

mrtksn
2 replies
5h26m

There are billions of addresses in the world and you can look them up online. An address is identifying information only when it is attached to an identity. It's exactly the same here: DNA is everywhere, and it's only identifying when connected to an identity.

mtlmtlmtlmtl
1 replies
4h59m

It can be tied to your identity, that's the point. Because they'll have your relatives' data.

mrtksn
0 replies
4h32m

They can match me with my DNA if they already have complete knowledge of my family structure and all their DNA. So if all Kardashians but one send their DNA for tests providing their true identities and the last one sends anonymously, they can infer the identity of the last one.

So yes, there's a risk, but it's not much different from going outside and leaving behind hair or saliva unaccounted for. That's Putin-level paranoia IMHO (he is known to have men collecting his poo etc. when outside).

Maybe it could be useful for insurance companies to match you and price you according to your DNA, but they are not allowed to do that, and they can only exist within a legal structure.

mrtksn
0 replies
5h59m

No one is forcing you to use real credentials anywhere online.

Definitely not true, many services require KYC and others do it to prevent fraud.

shatnersbassoon
6 replies
7h28m

Given the nature of the service, you should probably treat it as pseudonymous at best. You're handing over irrevocable genetic data which will link you to relatives. Whatever their data protection assurances are, you have to imagine the worst-case scenario - massive data leakage. And if this happens then you will in all likelihood be identifiable.

mrtksn
4 replies
6h2m

Obviously I don't trust any assurances, but on the other hand I'm handing over my genetic data all the time, because parts of me end up in the hands of 3rd parties whenever I go outside; they just happen not to run tests on them.

If I can have a test that is not connected to my persona, then from their perspective the data would be as valuable as running a test on hair from a barbershop, or picking up a random food leftover and testing the saliva left on the half-consumed food.

shatnersbassoon
1 replies
5h46m

There's quite a big difference between the DNA you leave in a forensic sense all around you (which would take a lot of effort and expense for someone to gather against your will) versus serving it up on a plate in a format that is trivially searchable for close matches against any relatives. The value is in the fact that millions of people's DNA is gathered in the same place in the same format, not the individual value of your DNA. It's a classic network effect.

mrtksn
0 replies
5h25m

What's the difference between taking the DNA from the trash and from a plate delivered with fake information?

saalweachter
1 replies
5h0m

I mean, what is your threat model, and what are you trying to get out of DNA/ancestry services?

If you're worried about getting arrested for one of the many crimes you've committed, and you want to meet far-flung relatives, you're kind of in a bind here. Assuming unlimited cooperation between the police and the DNA/ancestry services -- which your threat model in this case would require -- even if they don't have your name or address, they could fabricate a half-sibling/double-cousin in the system and have them reach out to you. What, are you not going to talk to and eventually meet the cop pretending to be your previously unknown half-sibling/double-cousin?

mrtksn
0 replies
4h19m

My threat model is about the changing world, essentially activists who feel they have the right to decide who gets to live where. I usually pass by looks, fail by accent, and I intend not to give them another data point they can use.

ta1243
0 replies
6h43m

Indeed, if you have a cousin who has already filled in the data about your grandparents, and they do a scan showing "John Doe" is the descendant of Arthur, Betty, Charlie, and Debbie, then you're already linked with a "shadow profile". Chances are they'll have the data from your cousin with your name added too, so now they can link the DNA of "John Doe" with the shadow profile of "YourActualName, cousin of Timmy TakeMyData".

The only solution to this is regulation and enough incentive that companies have to treat data as they treat radioactive waste material. Storing data should be a liability, not an asset.

mo_42
1 replies
7h27m

I had the same question once and dug around the internet a bit. It seems like they don't verify your information.

I'd recommend using a fake name, but not something like "Ano Nymous", as their rules don't allow that.

mrtksn
0 replies
6h0m

Right, if they don't do any extra identity verification, then a non-personal mailing address and a single-use payment method should do.

clarionbell
1 replies
6h13m

Find a private lab to do the test. It may be a bit more involved, and cost more money, but not by that much.

mrtksn
0 replies
5h55m

Do private labs have methods to provide similar information to 23andMe?

sandos
0 replies
3h11m

Your DNA is shared with others, so any amateur genealogist is likely to be able to find out who, approximately, you are.

The exact info, though, should be easy to protect. Just don't give it away.

But remember that other close family members are very likely to know who you are, and they may share that info by accident. Normally you can't share the data of anyone living on these services, since that is illegal pretty much anywhere, but it's enough if one user happens to mark you as dead in their tree and has filled in your real data.

petesergeant
0 replies
4h47m

I used a fake name and a masked email address, and no issues so far. I've told close family members that if they get a sibling, child, or uncle show up called "James Brown" not to freak out too much

spacebacon
12 replies
5h58m

It's imperative to recognize that the healing of cross-generational trauma is a journey within an individual's lifetime, rather than resorting to a confrontational approach that employs genetic data to incriminate individuals based on what science implies about their genetic makeup. The flaws inherent in this data—biased, widespread, and contaminated—highlight its unreliability. Relying on such data, which historically proves to be often incorrect, represents a narrow perspective. Using it to calculate high probabilities for targeting individuals in specific scenarios fosters more animosity than extracting genuine, overarching truths.

e2le
5 replies
5h12m

I struggle to understand what is written here.

spacebacon
2 replies
4h52m

Sorry, if Google didn’t make so many world changing products so quickly I’m sure I would have less criticism of Google in general and fit in more easily here.

mkl
1 replies
4h39m

Does what you wrote have something to do with Google? I don't see it.

spacebacon
0 replies
4h10m

I’m sorry, I don’t understand what e2le has typed here.

cdolan
1 replies
5h2m

Me too but I think OPs point is “this was a pointless product in the first place, and can only lead to harm”

spacebacon
0 replies
5h1m

Yeah

spacebacon
4 replies
4h48m

Anyone else notice that downvotes go hand in hand with Google criticism here?

spacebacon
3 replies
4h45m

Yeah, it’s almost as if a swarm of google investors and employees launch radical campaigns against critics.

spacebacon
2 replies
4h42m

Maybe we all shouldn’t be so QUIC to bad protocols.

spacebacon
1 replies
4h30m

Keep downvoting yourself into the deepest hole you can imagine Google.

concordDance
0 replies
4h3m

Personally I'm downvoting because your posts seem irrelevant or incoherent and I want to discourage that sort of thing on Hacker News. Though in your particular case I strongly suspect you're having a psychotic break and need immediate medical help.

barrysteve
0 replies
4h24m

Yes, genetic testing is susceptible to pseudoscience, something like phrenology.

It is much healthier to offer genetic improvements for flaws than to arbitrarily incriminate by way of DNA makeup.

mikewarot
10 replies
5h11m

I never seriously considered using 23 and me. Not because of hackers, but rather what government would do with that information. I don't want to be responsible for some random relative getting charged with a crime just because I was curious about my family tree.

knallfrosch
5 replies
4h50m

If that's your only concern, you need to read up on something called "Nazis." Imagine what they would do with a database of genetic information.

dfxm12
1 replies
3h24m

Look at what they did without it. Godwin's Law aside, the point is, if a sufficiently powerful group is set on doing something, they'll do it. Such a group won't let "facts" or "accuracy" get in the way. Look at McCarthyism.

m_eiman
0 replies
2h59m

People bent on doing evil things are going to do evil things, but perhaps it's a good idea to not build systems that will let their evil be faster and more efficient.

ribosometronome
0 replies
3h34m

That Nazis could use something for truly evil purposes is hardly limited to this, but the concern, I think, is. You can imagine Nazis transporting Holocaust victims and soldiers by rail, because they did, but I've never heard anyone argue against adding more rail infrastructure because if Nazis take over they could use the rail infrastructure to enable genocide.

pfdietz
0 replies
4h34m

They'd populate the database by making participation compulsory.

maximus-decimus
0 replies
3h59m

Imagine what a racist government would do if they were able to tell who's black!

Sure, now they can start hating on people who have the gene that makes Cilantro taste like soap, but a lot of genetic things are already visible so I don't see this as being fundamentally different.

ribosometronome
1 replies
3h39m

Have they used these sorts of databases to charge crimes other than murder and rape? In the cases I have seen solved by way of this technology, I would feel very glad that something I did led to stopping someone doing seriously bad things.

satellite2
0 replies
2h26m

You have no guarantee on the type of regime under which you will live in a few years nor on the rules it will enforce. Actually, democracies are perfect regimes for rapid rule change without requiring a regime change.

al_borland
1 replies
3h4m

I’d really like my 23 and Me info, but I assumed it was only a matter of time before they were hacked or sold to an untrustworthy organization willing to sell out users to make a quick buck.

If the test was done, the results were sent, and then my test data/info were destroyed on their end, or if I could do a home test where the data never left my home, then I’d do it.

I struggle to understand why companies hold on to all this data. It is a huge liability. In this case, maybe it is so they can identify familial relationships, but is that feature worth the risk?

flavius29663
0 replies
1h8m

I struggle to understand why companies hold on to all this data

To sell it, did you miss the news? https://www.bloomberg.com/news/articles/2023-10-30/23andme-w...

yoaviram
9 replies
7h39m

Something does not add up.

"23andMe said the data breach was caused by customers reusing passwords"

Yet 14,000 accounts were breached in one go? Where did these passwords come from? Maybe there was another related breach (something like lastpass can explain this)?

Also, using the "DNA Relatives" features the hackers were able to access personal information relating to 6.9 million individuals. That means each one of the original 14,000 accounts had about 492 unique relatives. What am I missing?

tgv
1 replies
4h43m

14000 accounts <-> 7M users? That ratio is a bit skewed.

nickthegreek
0 replies
14m

It's because you can get some data if you opt in to relationship tracking. For instance, my 23andme shows about 1,500 genetic relatives. So say you are Jewish and opt in to this feature: someone who gets into your account can see some number of other Jewish people and their names. This is the data that was stolen, from my understanding, not the actual raw DNA of those individuals. With the current Gaza/Israel conflict, you can see why someone having a list of the names of thousands of Jewish people might cause some concern.

quickthrower2
1 replies
4h45m

Genghis Khan got hacked?

RcouF1uZ4gsC
0 replies
4h42m

I think Genghis Khan usually did the hacking…

With a sword.

tzs
0 replies
1h12m

Seems plausible to me, assuming that my situation on 23andMe is about average when it comes to the number of DNA relatives and the vulnerability of my relatives to being hacked.

A quick search says 23andMe has 14 million customers, so 14000 accounts breached would be 1 in 1000 accounts breached.

The DNA relatives listing for me lists just over 1500 people. If each of those accounts had a 1/1000 probability of being hacked, the probability none of my relatives were hacked would be (1-1/1000)^1500 = 0.223. The probability that at least one of my relatives was hacked would then be 0.777.

I'd then expect, based on my assumption that I'm typical, about 10.8 million people to have had relatives with hacked accounts, which is close enough to 6.9 million that the latter seems plausible.
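
As a quick sanity check, here's a minimal Python sketch of the same arithmetic, using this comment's own assumed figures (14M customers, 14k breached accounts, ~1,500 DNA relatives per profile) rather than any confirmed numbers:

    # Back-of-the-envelope check of the estimate above. The inputs are the
    # assumptions from this comment, not confirmed figures from 23andMe.
    customers = 14_000_000
    breached_accounts = 14_000
    relatives_per_user = 1_500

    p_account_breached = breached_accounts / customers          # ~1 in 1000
    p_no_relative_breached = (1 - p_account_breached) ** relatives_per_user
    p_any_relative_breached = 1 - p_no_relative_breached

    print(f"P(no relative breached)  = {p_no_relative_breached:.3f}")   # ~0.223
    print(f"P(any relative breached) = {p_any_relative_breached:.3f}")  # ~0.777
    print(f"Expected affected users  = {customers * p_any_relative_breached / 1e6:.1f}M")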

thaumasiotes
0 replies
7h28m

For a breach to be caused by password reuse, it must be the case that a set of usernames and passwords got leaked somewhere else. If the usernames and passwords were leaked from 23andMe, that wouldn't be a breach related to password reuse, it would just be someone who found and cracked 23andMe's list of credentials.

It isn't even slightly surprising that a list of credentials leaked from some other website (or a composite list built from leaks from several sites) might have 14,000 users in common with 23andMe.

llm_nerd
0 replies
5h10m

Yet 14,000 accounts were breached in one go?

That part isn't super surprising beyond the technical issue of the data usurpers probably not being metered or flagged for continuously logging into different accounts. They could have used a massively distributed network to pull all the data, but there probably simply wasn't the detection or protection.

Having said that, in logging into my account to verify how many relatives are shown to add this response, 23andme refused to let me login and demanded that I reset my password because of password reuse. I have always had a very strong password on this account, and it isn't reused anywhere. I even have 2FA on. So it seems that the company isn't entirely comfortable with the notion that it was reused passwords behind it...

However after resetting my password that I never reused anywhere, the DNA relatives panel shows 60 pages of relatives, with each having 25 relatives. So 1500 relatives could be pulled. Grabbing that for 14000 random accounts would be a pretty formidable network someone could build.
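
As an illustration of the kind of metering/flagging described above, here is a purely hypothetical Python sketch (the threshold is made up, and this is not anything 23andMe is known to run) that flags source IPs logging into unusually many distinct accounts in one time window:

    # Purely illustrative: count how many distinct accounts each source IP
    # logs into within a window and flag the outliers.
    from collections import defaultdict

    MAX_DISTINCT_ACCOUNTS_PER_IP = 5  # hypothetical threshold, not a real rule

    def suspicious_ips(login_events):
        """login_events: iterable of (source_ip, account_id) pairs from one window."""
        accounts_by_ip = defaultdict(set)
        for ip, account in login_events:
            accounts_by_ip[ip].add(account)
        return {ip for ip, accounts in accounts_by_ip.items()
                if len(accounts) > MAX_DISTINCT_ACCOUNTS_PER_IP}

    # One IP quietly trying 50 different accounts stands out immediately.
    events = [("203.0.113.7", f"user{i}") for i in range(50)] + [("198.51.100.2", "alice")]
    print(suspicious_ips(events))  # {'203.0.113.7'}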

PeterisP
0 replies
5h42m

I don't find that surprising at all. There are publicly available large lists that compile many historical password breaches, they are easy for attackers (or anyone) to access and it's quite reasonable to expect that at least 0.1% of anyone's users (14k accounts out of 14m+ users) will be reusing a password that has been leaked elsewhere, unless you explicitly attempt to detect and invalidate such passwords e.g. as in yesterday's discussion on HN about Troy Hunt's work.
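
For reference, a minimal Python sketch of that kind of check against Troy Hunt's public Pwned Passwords range API (the k-anonymity endpoint, where only the first five hex characters of the SHA-1 hash leave your server). This is an illustrative example of the technique, not anything 23andMe is known to have implemented:

    # Check how often a password appears in known breaches via the
    # Pwned Passwords range API; reject or force a reset on a nonzero count.
    import hashlib
    import urllib.request

    def times_seen_in_breaches(password: str) -> int:
        sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
        prefix, suffix = sha1[:5], sha1[5:]
        url = f"https://api.pwnedpasswords.com/range/{prefix}"
        with urllib.request.urlopen(url) as resp:
            body = resp.read().decode("utf-8")
        # Response is one "<hash-suffix>:<count>" pair per line.
        for line in body.splitlines():
            candidate, _, count = line.partition(":")
            if candidate.strip() == suffix:
                return int(count)
        return 0

    if __name__ == "__main__":
        print(times_seen_in_breaches("password123"))  # large count: reject it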

taurath
9 replies
6h50m

How much does that have to do with their TOS update, which went out on Thanksgiving Day (the perfect time to get lost in everyone's inboxes)? The TOS update somehow tries to forbid class actions, requires you to go through an "informal" 60-day process before any legal action, and forces you into binding arbitration.

Functionally you as a customer have next to no legal rights, according to 23andMe lawyers who cooked this up.

martin_a
8 replies
6h40m

Does this hold up in court?

At least in Germany any contracts that are heavily in favor of one side will be declared void if it comes to court.

Cthulhu_
4 replies
6h36m

I'm not a lawyer so I can't answer the question, but it will definitely complicate going to court, and I'm confident that the company has more lawyers and more money for lawyers than the average user. A class action suit may follow, but only if enough people and lawyers are willing, and it'll likely end up with a pittance in damages paid in a settlement, eventually.

onetimeuse92304
3 replies
6h20m

What really interests me is "Are ToS changes absolutely binding?"

I am also not a lawyer. But I think there are two types of changes to ToS.

One is purely administrative. For example, they might change the methods available to reach support. For example, they might say that you are no longer able to send a fax to get support help. Or that their domain name changed.

Second is something that changes the service that you are receiving. For example, when you bought their product they said "we are offering free support for all owners of our doodad", but then one day they decide that support is now a paid option for all existing customers.

So the question is: can ToS changes that change the service/product you have already paid for be binding without your consent? If they decide to introduce extra protection from class actions, a requirement for arbitration, etc., is this just administrative or is it actually changing the service you are receiving by restricting your rights?

wongarsu
1 replies
2h54m

Not just "are the changes binding" but also "are the changes relevant". The changes might be binding for future services, but previous services were provided under the old terms; so you can make an argument that any arbitration clause in the new terms doesn't apply to services rendered before the new terms took effect.

blcknight
0 replies
6m

Most terms of service include automatic acceptance of future changes, so if you continue to use the service the terms apply to the past too.

“23andMe may make changes to the Terms at any time. If we make a material change to the Terms, we will notify you, such as by posting a notice on our website or sending a message to the email address associated with your account. By continuing to access or use the Services, you agree to be bound by the revised Terms.”

toyg
0 replies
4h10m

It doesn't really matter in practice, because the discipline around contract law is (typically) based on reasonableness of terms - I can't sign TOS that say I'm a slave, nor will TOS apply if such a clause is introduced at a later stage.

Regardless of whether terms are changed for administrative or product reasons, what matters is the reasonableness of terms imposed on the other party.

marcus0x62
0 replies
3h58m

Within limits, in the United States, yes. https://www.law.cornell.edu/wex/adhesion_contract_%28contrac...

jkaplowitz
0 replies
6h19m

In the US, yes, these kinds of contracts are unfortunately upheld under current laws and court rulings that are valid nationwide. Of course there are limits and exceptions, but those boundaries have been broadened in recent decades to allow these kinds of terms anywhere in the US, even for consumer contracts.

Stranger43
0 replies
5h45m

Depends a lot on the jurisdiction. Several European courts have thrown out EULAs and ToS agreements as entirely invalid in a business-to-consumer relationship, but I'm sure some American court somewhere will declare them valid contracts.

nroets
7 replies
5h34m

Isn't it time governments started to regulate passwords? (They already regulate a lot of privacy issues like medical and financial history. Some governments even regulate the use of fingerprint authentication for employees.)

For example, any website that allows users to choose normal, easy-to-remember passwords should have to meet a long list of requirements: security audits, bug bounties, capital reserves to deal with class action suits.

Then small websites would either implement OpenID or encourage their users to use password managers.

stef25
3 replies
5h6m

Isn't it time governments start to regulate passwords ?

Nah we're good. They already regulate cookies and it's a dumpster fire.

mclightning
2 replies
4h16m

dumpster fire for who?

nroets
0 replies
4h1m

Cookie consent is just a nuisance.

Like the cancer warnings in California.

concordDance
0 replies
4h5m

Users constantly having to reclick on popups.

koheripbal
1 replies
3h25m

That would not have protected 23andme from this issue. Password complexity does not stop password reuse.

nroets
0 replies
3h12m

I'm only suggesting password complexity as a compromise so that websites wouldn't need to implement OAuth. (Do you have a better compromise?)

Password complexity encourages the use of password managers. They in turn remove the need to reuse passwords.

BrandoElFollito
0 replies
2h10m

Please have a look at the "recommendations" of your national security agency (except NIST).

It is all outdated, written by people who thought they understood security in the 90's.

The French ANSSI recommendations are ridiculous. The German as well. Having an incompetent gov org forcing you to apply their "best practices" is terrible.

We already have this for banks, and look at what they have done: all possible "don't do it!" implementations neatly on one page (source: Boursobank, Fortuneo and other French banks, who reply that the security is "reviewed by experts").

gcanyon
5 replies
4h14m

<puts on tinfoil hat>

Does anyone think privacy of any real sort is maintainable going forward? Machine learning algorithms are learning to identify people just by their walk -- no face recognition required. Algorithms are moving toward being able to decipher text just by the audio of the keyboard being typed on.

In short, given a gestalt of ALL public data and sufficiently advanced algorithms is there really a way for people to maintain what we today consider reasonable privacy without extraordinary measures, unfailingly applied?

To be clear, I'm not value-judging the situation, just expressing what I think the ongoing trend is.

pembrook
2 replies
3h33m

Agreed, but also "privacy" is an abstraction that layers over the actual thing that people are worried about.

Any answer to "why do you care about hiding this information?" can be boiled down to the fear that "[person or group] might use [private data item] to create [bad outcome] for me."

So the thing people actually care about is the risk of bad outcome, not the actual data itself.

If your theory is correct, then the focus should be on the prevention of asymmetric power imbalances in societal transactions that can even create [bad outcome].

acuozzo
1 replies
1h54m

the focus should be on the prevention of asymmetric power imbalances in societal transactions

The rules governing social systems built to obscure the jungle (e.g., political, legal, and penal systems) can always be trumped by that which they were chosen to tame. This is the unfortunate reality of our wetware.

the thing people actually care about is the risk of bad outcome, not the actual data itself

"Bad" is subjective, no?

Is it good or bad if a father learns that his teenage son is not his own?

nulcow
0 replies
1h13m

"Bad" is subjective, no? > Is it good or bad if a father learns that his teenage son is not his own?

I feel like that would depend on the person. If the father wanted to know and the son didn't, that would be good for one and bad for the other, and vice versa.

dylan604
0 replies
3h52m

Does anyone think privacy of any real sort is maintainable going forward

Probably not, but it doesn't mean we can't guide the conversations about how it looks in the future. Sitting idly by just means they win, but discussing it in the open means that we might be able to put some safeguards in place.

Oh, who am I kidding. We’re all screwed and evilCorp will win so we’re just wasting our energy and making ourselves crazy fighting. Resistance is futile

agnosticmantis
0 replies
3h7m

Machine learning algorithms are learning to identify people just by their walk...

Turns out the UK government was working on privacy-preserving walks decades ago: https://youtu.be/eCLp7zodUiI

yashasolutions
4 replies
6h21m

It does feel at this point that any company collecting data will be hacked; it's only a matter of "when" and not "if"...

prawn
1 replies
5h45m

Penalties should be strong enough that sites and apps do not collect more than an email address without very good reason. Just wanting to contact me with marketing literature is not a good reason.

Lio
0 replies
3h57m

I can't help but think that what we need is a class action suit that imposes penalties strong enough that insurance companies insist on proper audits of what data is actually needed and what is just a financial loss waiting to happen.

wahnfrieden
0 replies
2h16m

Not because it’s hard work to protect themselves. But because it’s typically not a business priority (at top middle and via coercion and incentives, the bottom/workers too) to invest in security. Most of these big hacks are via well known threats that can be caught in typical good-faith auditing

eimrine
0 replies
6h14m

They even don't need to be hacked if the Government wants to receive/alter the data.

bartwr
4 replies
4h2m

Something super creepy that happened to me recently: a hospital I'd been to a few months ago called me and asked me to participate in some DNA analysis program. They said "oh and the best part? You don't need to do anything! We will use blood samples we collected the last time." I obviously declined, but it was a huge wtf to me - they stored biological samples associated with me without informing me and can do a post hoc DNA analysis. This is just insane, and proof of how nonexistent privacy laws in the US are. (In the EU they cannot freeze any samples without consent, and unfrozen ones are OK for at most a few days.)

me_me_me
2 replies
3h38m

Well they did call you and tried to trick you into letting them use it.

But assuming they obey the law, they did not use your samples.

So there are privacy laws in place? Also they could have been cleaning old results/samples and this was one step.

bartwr
1 replies
3h15m

How do I know they didn't use them? They already did something with my biological samples (storing them for a different purpose than the one for which they drew my blood) without my consent or informing me.

And also, could e.g. the police use it?

me_me_me
0 replies
2h51m

I am not sure, only guessing. But why would they ask for permission in the first place?

They already did something with my biological samples

I can only guess you were tested for something in the hospital. Samples are sent to the lab (a separate department) to be tested. If additional tests need to be performed, they can use the blood they already received. The samples are kept ready in case additional tests are requested by a doctor. Doctors don't care how it's done; they don't have time to tell the lab that patient X has gone home.

After some time the samples need to be destroyed, due to their expiry date. Before destroying them, the lab contacted you and asked for DNA permission.

And also - could eg. police use it?

I don't know that. But you might want to check how medical data is protected in your jurisdiction.

kzrdude
0 replies
1h18m

Sweden has a registry of blood samples of every person born in Sweden since 1975: https://sv.wikipedia.org/wiki/PKU-registret (Swedish-only wiki page)

Predictably? Amusingly? The police never had access to this data until a government minister was murdered in 2003, when a sample from the suspect was retrieved. From what we know it has not been used since. So we can be cynical, but under the circumstances, police use of the registry has not yet taken hold and is guarded by the courts.

mrep8
3 replies
7h2m

"hackers stole" == "company sold"

Alifatisk
1 replies
6h41m

sed s/"hackers stole"/"company sold"/g

Alifatisk
0 replies
18m

Was my comment that bad?

tremon
0 replies
3h47m

False

meyum33
1 replies
7h36m

In my country, where the police have a LOT of power, I'm more worried about inadvertently getting my relatives arrested for old closet crimes. Am I being too dramatic?

Sebb767
0 replies
7h26m

If the police have that much power, they'd probably just find another pretense to arrest your relatives if the need to put pressure on you arises.

mensetmanusman
1 replies
4h23m

Nice data for a super biological weapon though.

mclightning
0 replies
4h22m

Imagine the next Hitler... As an immigrant with his data on 23andme, I am pretty scared. I am sure one of the 10 ethnicities I am affiliated with will eventually be hated by some group.

kylehotchkiss
1 replies
7h35m

I bet a lot of insurance risk adjusters will make an order for the data, optimize their models a bit, raise a few rates, and make a few new millionaires in the process.

I’m being snarky but isn’t it really scary that 2% of Americans could be impacted by something like that?

ribosometronome
0 replies
3h29m

That sounds like an issue with the healthcare system rather than genetic analysis, no?

jmount
1 replies
3h10m

So the original "14,000" leak is 0.1% of the customers, but now 6.9M is claimed. The directly breached accounts are only ~0.2% of those affected (14,000/6,900,000), so roughly half the customer base is indirectly leaked?

sandos
0 replies
2h54m

Yes, likely at different levels. For example, if I log into my MyHeritage account, I have 27,000 DNA matches which I can download a .csv of, which includes the matching parts. I can also access the trees of those users who share theirs. This really does not include personal data, _except_ the information about where our DNA is identical or half-identical. Which has potentially far-reaching implications, but the personal data should not really be available here unless people have opted in to sharing it.

Since it's a site about sharing data, it's not weird that it's easy to extract data from it. That is sort of the purpose.

exo762
1 replies
3h22m

23andMe should be destroyed, including all copies of its database. I hope EU will fry them.

sandos
0 replies
3h14m

How about every other site that does, more or less, the same thing?

codeulike
1 replies
3h6m

haveibeencloned dot com

no-reply
0 replies
1h22m

Only $24,799.00 + transfer fee, get yours today! /s

alexbeloi
1 replies
7h43m

Fun fact, they can sometimes narrow down crime scene DNA to just a single person by having enough partial matches from their (potentially distant) relatives. I can't remember which DNA database was used, but some cases were solved this way, IIRC it introduced a bunch of legal questions about if you can search a database in that way.

I think this was the article that talked about this (apologies for the paywall): https://www.nytimes.com/2021/12/27/magazine/dna-test-crime-i...

mrweasel
0 replies
4h49m

There have also been a number of false positives, because many believe that DNA is infallible. What people tend to forget is that the DNA tests used by law enforcement only use a very small subset of DNA markers. This means that if you're already in a DNA database, you can get an unpleasant knock on the door just because you have 10 DNA markers in common with some random criminal.

Danish police only upgraded from 10 DNA markers to 16 in 2021, forcing them to review 12,000 cases and redo the DNA tests, resulting in at least one person having their sentence reversed. No word on how many were falsely suspected, but I assume more than a few.

23B1
1 replies
6h21m

"Luckily we are now offering a genome monitoring service. For only $79.99 per month you can be sure that you're alerted any time someone tries to access your genetic record!" - 23andMe

eimrine
0 replies
6h15m

The scumbags received what they deserve. Just don't be a 23andme user.

spacecadet
0 replies
6h2m

I'm sure someone just gave them access... Given the number of social engineering attacks on dumbass SaaS companies, it seems easy when they all cut corners and overwork everyone...

solardev
0 replies
2h35m

If the hackers could "leak" this data to the public, it would be a tremendous genetic dataset for future generations of pirate researchers...

pbhjpbhj
0 replies
3h24m

If the data exists, it's going to get out.

In this case the data is your genome.

I was thinking about getting a DNA testing kit for my parents, and my conclusion was that I'd have to advise them that they'd need to be comfortable with their genome being public, because over time leaks are inevitable. As the UK marches on towards increased fascism, the chance that a Tory government will demand access to such data "for security purposes" gets higher.

major505
0 replies
4h33m

Well, it was a matter of time before something like that happened.

madethemcry
0 replies
4h1m

Small world. Only yesterday I read that great comment from user adameasterling about credential stuffing in another thread [1]

Troy Hunt is such a treasure. And for us web application developers, there is no excuse for not having protection against credential stuffing! While the best defense is likely two-factor, checking against Hunt's hashed password database is also very good and requires no extra work for users!

That user even listed 23andMe [2] as an example, but that comment is from 60 days ago. This incident is referenced in the TechCrunch article.

[1] https://news.ycombinator.com/item?id=38521106

[2] https://news.ycombinator.com/item?id=37794379

lunarimiso
0 replies
1h22m

Does genetic data count as personal information? Just wondering as an EU citizen.

hirvi74
0 replies
2h21m

23andMe is just pissed that someone or a group of hackers stole the same data they were selling to other companies.

harha_
0 replies
3h55m

I really want to see information about my DNA. So it's very tempting to just give in, pay and send a sample to something like 23andMe...

bohadi
0 replies
2h34m

Very regrettable and very predictable.

anotheraccount9
0 replies
5h8m

How convenient.

al_be_back
0 replies
6h12m

There was a craze for DNA analysis 10+ years ago, the idea being "if we can analyze e-commerce transactions, why not human DNA!", with the USP being mostly Health rather than Ancestry. That's flopped, in my view.

Recent genome sequencing research is revealing that a gene (downstream) doesn't actually necessitate a health/medical condition (upstream) [1]. I think we need the highest security measures, user education and regulation when it comes to DNA, medical records, and biometric data (face, finger, iris, voice etc).

Charles Darwin & Co documented their theory of evolution well; there's enough ancestry there for most, I think, at least as a solid starting point/platform. My guess would be that if there were more education around the theory of evolution (science), there would be less interest in Ancestry services (DNA-based), leaving only a Medical case for them, and hence demanding greater protection/security.

[1] A biological relativity view of the relationships between genomes and phenotypes, Denis Noble - https://doi.org/10.1016/j.pbiomolbio.2012.09.004

agnosticmantis
0 replies
2h38m

Can we (folks who didn't use 23&me but may still be affected because our relatives might have) file a class action lawsuit?

We haven't signed any licensing agreements with 23&me waiving our privacy, so presumably we still have some rights?

DeathArrow
0 replies
7h26m

Cool, now they can hack the DNA of millions and transform them into frogs or something.

AndyMcConachie
0 replies
3h52m

So people in my family have used 23andMe but I'm assuming my data is also compromised. I've never used the service because I think it's kind of weird and gross. But it probably doesn't matter that much if both of my parents and my brother have. Health insurance companies in the future can still charge me different prices based on my risk profile.