
Meta to pay Texas $1.4B for using facial recognition without users' permission

joeamroo
42 replies
1d

This is going to be a slippery slope as more governments start to use fines as an alternative type of tax unique to these tech companies. I don't know how Meta/Google can react to these fines (except the whole opt-in part, but then you have a tradeoff with usability, and people against it usually think that Meta cares about their data outside the aggregate).

toomuchtodo
12 replies
1d

This is great news until privacy legislation catches up. Big Tech should be fearful of stepping on a mine financially.

autoexec
11 replies
1d

Oh yeah, big tech is going to be shaking in their boots. I'm sure Meta is really crying about the $1.4 billion they lost while rolling around in the $134 billion in revenue they made last year. They've even got a nice, easy payment plan, which lets them invest and earn a return on the $225 million they're going to be paying each year from 2025 to 2028.

mulmen
6 replies
21h52m

Revenue isn’t profit. This is grade-school finance. Meta’s net income in 2023 was $39 billion. $1.4 billion is 3.5% of worldwide net income, for one US state. It’s an unsustainable penalty for Meta if more states and jurisdictions issue similar penalties.
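
A quick sanity check on that percentage, sketched in Python (the $39.1B FY2023 net income figure is the one cited downthread, not something I've re-verified):

```python
# Rough check: what share of Meta's FY2023 net income is the Texas settlement?
settlement = 1.4e9        # Texas settlement, USD
net_income_2023 = 39.1e9  # Meta FY2023 net income, USD (figure cited downthread)

share = settlement / net_income_2023
print(f"{share:.1%}")  # prints "3.6%", i.e. the ~3.5% quoted above
```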

autoexec
5 replies
20h24m

Revenue isn’t profit.

No, and I didn't say that it was. Reported revenue was just the data Meta had made available. Unless I've missed it somewhere, they don't explicitly state exactly how much profit they made last year. I think it's reasonable to assume it was several times more than the $1.4 billion fine, though, which is really the point. If Meta/Facebook makes even just tens of billions in profit, $1.4 billion could easily be a sustainable penalty. The more years they're hit with a fine that size, and the more other states start demanding their cut of the action, the less sustainable it becomes, but for all we know, paying this $1.4 billion fine (over several years) to Texas could actually end up being profitable for Meta.

How much money did they make off the data they've been collecting and abusing since 2011? How much money will they make in the future from what they learned by abusing that facial recognition data for nearly 15 years? If it ever amounts to more than the fine, or if other incentives make it justifiable to shareholders then Meta is better off for having broken the law.

mulmen
4 replies
18h49m

Unless I've missed it somewhere, they don't explicitly state exactly how much profit they made last year.

They state this in their financial reports and it is readily available on financial news websites. I’m not sure how you found revenue without also finding net income (aka profit).

Type “meta profit” into a search engine and click the first result. This immediately gave me the answer in Bing, DuckDuckGo, Google, Kagi, and Yahoo.

autoexec
3 replies
18h26m

The first result in DuckDuckGo took me here (https://www.nytimes.com/2023/07/26/technology/meta-earnings-...), which is paywalled; what is readable gives a number for a single quarter, but not for the whole of last year.

Google takes me here (https://economictimes.indiatimes.com/tech/technology/meta-po...), where it's the same story.

You mentioned Kagi, so I thought I'd try asking kagi's AI search: "how much did Meta make in profits last year"

That gave me:

According to the available information:

In 2023, Meta Platforms reported annual revenue of $134.902 billion, which was a 15.69% increase from 2022. [1][2] However, the information does not explicitly state Meta's profit for 2023.

The closest relevant information is that in 2022, Meta's total operating profit declined from $46.8 billion in 2021 to $28.9 billion. [3] Additionally, in Q4 2023, Meta reported revenue of $40.11 billion, which was a 25% year-over-year increase. [4]

So while we don't have the exact profit figure for 2023, the available data suggests Meta's profits were likely substantial, though potentially lower than the previous year's $46.8 billion. [3]

mulmen
1 replies
17h32m

The first result for me is https://investor.fb.com/investor-news/press-release-details/... on all of those engines. It’s really easy to find. Any search engine should show you this in the first handful of results. Or just look it up on Yahoo finance. This is really basic stuff.

autoexec
0 replies
16h41m

I should have skipped the search engines altogether and just pulled up the Wikipedia article for Meta. They have it listed right at the side of the page.

Aloisius
0 replies
17h38m

Profit = net income, which is in their income statement.

For FY2023, it was $39.1 billion.

autoexec
1 replies
20h11m

We need protections in law, but I can't say I'm a big fan of KOSA. It not only fails to address the problem for anyone other than children, it enables a lot of harm. Censorship isn't the solution. A better approach would be ending the buying and selling of personal data, outlawing ads targeted at individuals (as opposed to targeting content/context), and requiring companies to apply the same policies and prices to all of their customers, no matter who the customer is or how much money they have in the bank.

toomuchtodo
0 replies
19h51m

Sometimes censorship is the solution, because humans are tricky and every human comes with their own baggage. Not always of course, but there is always nuance.

xkcd1963
0 replies
7h49m

We are not always going forward...

surfingdino
11 replies
1d

There is fuck all that those companies need facial recognition for. It is simply not needed and is just a massive invasion of privacy.

autoexec
6 replies
1d

They need it so that they can spy on you. It's not needed, but many companies are built on surveillance capitalism and an increasing number of companies are using that surveillance to gain a huge advantage over their customers. The more a company knows about you, the easier it is for them to take advantage of you.

There's a lot of money to be made exploiting the most intimate details of our lives. Nobody "needs" that money, but they sure don't want to leave it on the table when the government isn't going to stop them from violating our privacy and then stuffing their pockets with our cash.

jart
5 replies
22h23m

If there's a lot of money to be made, could you give some concrete examples that have wide applicability? Ideally I'd like to hear something better than just selling it to an advertiser or data broker.

autoexec
3 replies
19h31m

Examples of how companies and people can make more money by exploiting the massive amounts of private data being collected and sold? I guess that's fair. No company will tell you when they exploit your data to their advantage. It's hidden.

Prices can be set according to the data companies have on you and the assumptions they make using that same data. The price you're asked to pay for something when you shop online isn't always the same price your neighbor would be asked to pay for the exact same item. There's lots of potential here too when restaurants don't publicly disclose their prices but insist that you use a cell phone app or scan a QR code just to see a menu: your prices don't have to be the same as those of the person in line behind you for the same foods. Physical retailers have been trying to get this going for a long time.

"For example, ZipRecruiter, an online employment marketplace, indicates that it could increase profits by 84% by experimenting with personalized prices" (https://link.springer.com/article/10.1057/s41272-019-00224-3)

Fast food chain Wendy's tried to move the needle closer to personalized pricing (aka discriminatory pricing) when they said they were moving to surge pricing, so you'd never know how much a burger was going to cost until you'd already waited in line at the drive-through and were told what price you were getting. They backed down due to consumer backlash, but their desire to squeeze every last dime possible out of you by leveraging big data and algorithms is still there.

The hotel/airfare/travel industry has been doing this for a very long time already (https://www.cnet.com/tech/services-and-software/mac-users-pa... and https://millionmilesecrets.com/guides/are-airlines-raising-y...)

Health insurance companies want your data so they can charge you more for not moving enough, or because people in your zip code were logged eating more fast food, or because you've been spending too much on alcohol at the store.

https://www.propublica.org/article/health-insurers-are-vacuu...

https://www.ama-assn.org/practice-management/digital/insurer...

A lot of the tracking we see is explicitly trying to assess traits like intelligence, education level, and mental illnesses including dementia and bipolar disorder.

Here for example is a pizza shop that will "create a profile about you reflecting your preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes." You know, just normal pizza shop stuff! (https://pie-tap.com/privacy/)

Companies and scammers alike can easily target uneducated and low-intelligence individuals, and machine learning algorithms can detect when bipolar people are in a manic phase, what time your ADHD meds usually start to wear off, or when someone with Alzheimer's starts sundowning, and they can jump at those chances to hit people with ads, scams, and manipulations when they think their target is weakest, most confused, or most impulsive. Even without a diagnosis, your mental health is a huge business opportunity for the person willing to exploit it (https://www.politico.com/news/2022/01/28/suicide-hotline-sil...)

The data being collected on you is increasingly used for really big things, like whether you get a job or whether your rental lease application gets approved, but it's also used for really trivial things, like determining how long to leave you on hold when you call a company (https://www.nytimes.com/2019/11/04/business/secret-consumer-...)

Companies aren't collecting huge amounts of facts about you and your life because it's fun for them. They pay a lot of money to purchase, collect, store, maintain, backup, and scrutinize all that data, and they do it because it's making them money. Almost always, their increased profits come at your expense.

jart
2 replies
18h29m

All the things you mentioned, like discrimination, sound like ways that companies safeguard against losing money, not earning it like I asked. They all also sound like money losers to me in practice. For example, if someone refuses to hire me because they found out something unsavory about me from a data broker, then they're really only hurting themselves. If someone tries to charge me more because they think I can afford it, then they're going to lose my business or get bad PR on X once I realize I'm being treated unfairly. The best you could probably argue is that preventing corporations from having knowledge about people will protect the most vulnerable members of our society who can't fight or fend for themselves and are willing to tolerate being treated poorly.

autoexec
1 replies
17h57m

All the things you mentioned, like discrimination, sound like ways that companies safeguard against losing money, not earning it like I asked.

A distinction without a difference? A company that can raise its prices for you because it knows (or thinks) you can afford the price hike isn't safeguarding anything; it's just screwing you out of money. How would you feel if you got an awesome 20% raise at work, only to find that the next day the top 10 things you most frequently buy at the store were all suddenly 20% more when you went to pay for them? Why shouldn't a loaf of bread cost a percentage of your total income?

if someone refuses to hire me because they found out something unsavory about me from a data broker, then they're really only hurting themselves.

I could agree that it might not be smart for them to lose candidates based on random crap dug out of a background search, but it happens, and however dumb it is for the company, you would still be out that job. They'll never tell you why you didn't get hired. You just get ghosted.

If someone tries to charge me more because they think I can afford it, then they're going to lose my business or get bad PR on X once I realize I'm being treated unfairly.

All the recent inflation shows that companies can get away with arbitrarily raising prices a whole lot as long as they all do it around the same time. Have you been boycotting them all recently? I've gone out of my way to avoid eggs associated with Cal-Maine Foods over price gouging (https://www.newsweek.com/egg-producers-accused-price-gouging...) but it's not been easy. Not every brand I see in stores advertises itself as being related to them. Boycotts are growing more difficult thanks to the massive consolidation of our food industry (https://www.theguardian.com/environment/ng-interactive/2021/...).

Most of the time when you end up paying more because of big data, you'll never be told that. All of this is almost totally hidden from consumers. You'll just be charged more, or get a bill that seems higher, and you won't be told why. You're very likely already paying more for some things because of big data. Same with store policies: you ask a store what their return policy is and they'll tell you one thing, while the next person who asks gets told something different. You can't feel cheated, or even like you're being treated special because your good consumer score is so high, because you don't even know there are multiple policies in effect depending on who asks.

I'd be willing to bet that a lot of people on Twitter have complained about companies like Comcast, AT&T, Tyson Foods, Facebook, 3M, Monsanto, etc., but what has it accomplished? Many of the most wealthy and powerful companies in the US are also the most hated by the public. They don't have good reputations to protect. They just don't have to care whether you like them or not.

The best you could probably argue is that preventing corporations from having knowledge about people will protect the most vulnerable members of our society who can't fight or fend for themselves and are willing to tolerate being treated poorly.

It would protect all of us. No one can "fend for themselves" and everyone is being treated poorly. You are already being treated poorly, and you will continue to be until it hurts a company's profits to treat you badly. Right now, they're not just getting away with it; they're looking for ways to screw you over even more than they already do, in new and innovative ways, using resources unlike anything you'll ever have. It's highly asymmetric warfare where consumers are divided into buckets and ultimately conquered.

jart
0 replies
17h33m

How would you feel if you got an awesome 20% raise at work, only to find that the next day the top 10 things you most frequently buy at the store were all suddenly 20% more when you went to pay for them?

This is how the California economy works and it's something that I like, because if I'm allowed to have more money, then I can use my brain to figure out a way to not be scammed out of it like everyone else.

As for the rest, I don't think indignant agitators online who stir up fear are really representative of public opinion. Yes corporations tend to be slimy, but that's because people are slimy. I don't want to live in a society that takes away my freedom just to prevent the worst of us from exploiting the weakest of us.

surfingdino
0 replies
20h4m

Mass surveillance. Paid for by your tax dollars.

qeternity
3 replies
1d

This is such a ridiculous attitude. Facial recognition in my photo albums is hugely useful. It makes searching for people a breeze. Just because you don't have a need for it does not mean it is "simply not needed".

can16358p
1 replies
1d

While I get the point, IIRC the issue is about using those features _without consent_.

nox101
0 replies
22h54m

Can I consent to scan my photos if you're in them? I can certainly manually write the names on a physical photo of the people in the photo. That used to be a fairly common practice before digital.

wil421
0 replies
23h44m

Facebook does not need to use facial recognition on me and we both know they use it for more than tagging photos. If my phone does it I am asked for permission.

jonas21
6 replies
1d

Or worse, to target specific companies that the government doesn't like for whatever reason (arguably, this is already happening in the EU).

can16358p
3 replies
1d

That's the EU's whole weaponized business model, compensating for lagging technological development (I mean lagging in "big tech", not in tech in general).

immibis
2 replies
23h25m

Have you considered the Occam's razor possibility that Europe genuinely doesn't want these companies doing business the way they do, rather than it being a conspiracy to increase government revenue? Remember, it's often illegal to take a photograph in public in Germany, and for this reason Germany's Google Street View imagery is over a decade old.

can16358p
1 replies
19h49m

I of course considered it. If they genuinely think that way, it's even worse IMHO.

If it was just a tactic it could at least have a sensible (though evil) explanation.

immibis
0 replies
4h26m

If I understood correctly, you think people genuinely wanting privacy, and preferring to have more privacy and no Facebook rather than less privacy and more Facebook, is inherently worse than actual corruption. This says more about you than about the EU.

tossandthrow
0 replies
1d

How is this happening in the EU?

tomComb
0 replies
23h31m

Now we're getting to the real point of this.

It is usually whatever company isn't friendly enough to the current government.

In Canada, it is any company that dares to compete with the telecom companies.

_trampeltier
4 replies
1d

Maybe, but already two years ago, if you read the fine print when buying a ticket for, e.g., Cirque du Soleil, you accepted that they can use your face in the video for AI training.

lokar
3 replies
1d

I went to a show last week. I did not buy the tickets or even see one. I did not agree to anything.

lokar
0 replies
23h47m

Except there you were shown the agreement and had to click or whatever.

For event tickets, you are not even made aware there is an “agreement”.

_trampeltier
0 replies
23h37m

But somebody did buy the ticket for you. Today you accept a lot when buying a ticket. Usually you also agree to things like letting them use your picture in commercials/media and so on. Just next time, take the time and read the full EULA when buying a ticket for a large event.

autoexec
1 replies
1d

I don't know how Meta/Google can react to these fines

They'll react by lobbying for fines while also lobbying to limit the amount of those fines. They love fines. Fines are something they can budget for, and fines let them violate the law as long as they're willing to pay the government a fee/toll/bribe. Without fines they might be held meaningfully accountable for their crimes. The last thing they want is to face the risk of ending up in prison the way you or I most certainly would for repeatedly ignoring the law.

phyrex
0 replies
19h51m

They’re also an economic moat to stop other networks from emerging. Facebook can pay those fines, $upstart can’t.

ta1243
0 replies
1d

I don't know how Meta/Google can react to these fines

How about by not breaking the law?

freejazz
0 replies
23h43m

I don't know how Meta/Google can react to these fines

They could start by not breaking the law...

Zambyte
0 replies
1d

I do hope informed consent is a slippery slope. That would be stellar.

qwerty456127
34 replies
1d1h

Do fines to big corporations even work? I tend to suspect paying them is a part of the business model - they will keep doing whatever they want to generate huge profits covering whatever fines they may have to pay.

suprjami
8 replies
23h37m

So if every US state sues Meta and settles for ~1.5B that's halved the value of the company.

One more infraction with the same consequences and the company has no money left.

That sounds pretty effective to me?

kelnos
4 replies
23h1m

That doesn't work, though. In practice, most states probably won't sue for this. And even if they do, Texas is one of the most populous states in the US. Presumably the per-state settlements will be proportionate to the number of people harmed. So it's not going to be 1.5B * 50 = $75B.
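
To make the proportionality point concrete, here's a back-of-envelope sketch; the population figures are rough assumptions of mine, not from the thread:

```python
# If every state settled in proportion to population, the total would scale
# with the US/Texas population ratio, not with a flat 50x multiplier.
# Population numbers are rough estimates (assumptions for illustration).
texas_settlement = 1.4e9
texas_pop = 30e6   # ~30M Texans
us_pop = 335e6     # ~335M US residents

proportional_total = texas_settlement * (us_pop / texas_pop)
print(f"${proportional_total / 1e9:.1f}B")  # prints "$15.6B", well short of the naive 50x figure
```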

suprjami
3 replies
21h30m

It doesn't matter if the states do or not. It's a deterrent.

You go into your quarterly earnings call and say "we're conducting illegal business practices which expose us to up to $75B of risk" and see how the investors like that.

You don't get caught every time you speed, but the fine certainly is a deterrent against doing 100mph everywhere.

JohnFen
2 replies
20h41m

But that's not what they say to shareholders. These deals always come with "admitting no fault," so they don't have to tell anyone they're conducting illegal activities. They're perfectly free to chalk the fine up to overzealous prosecution or whatever. I think the case usually made is "we didn't do anything wrong, but it was cheaper to settle than to fight."

suprjami
0 replies
20h14m

Venture capitalists aren't stupid. Even if Zuck says everything is okay they come to their own conclusions.

Obviously everything is NOT okay, otherwise the company wouldn't be getting fined in court for breaking the law.

VCs know just as well as we do that Facebook's business model (privacy-invading targeted advertising) exists because law has not caught up to technology yet. That's starting to change, as this settlement shows.

If Meta suffers many more of these legal settlements, each wiping >1% off the annual profit, then investors will start to divest. The value of the company will fall.

Keep in mind this was a settlement amount, which suggests the legal liability was actually a lot higher than $1.5B.

So I think my statement is supported: fining corporations works.

IG_Semmelweiss
0 replies
16h28m

Under GAAP, the company is legally required to disclose it in the financial statements (FS). Since the FS are the primary way the company communicates performance to investors, they are in fact saying this to investors.

It's called a legal provision or legal reserve: they need to set aside money for the eventual expense of the settlement. This is going to eat right into EPS.

And any material change in the provision will be discussed. Plus, any investor worth their salt is going to be poring over FS disclosures, including any legal provisions.

Admission of fault has nothing to do with the FS. It's purely an insurance topic.

echelon
2 replies
23h3m

Their market cap is approximately 1000x larger.

Their annual income is approximately 100x larger.

xvector
1 replies
12h35m

Wrong. This is 3.5% of their annual income.

echelon
0 replies
4h11m

Your figure is net income, not gross. They can arrange those numbers as they like.

mbreese
6 replies
1d

That’s still a fine of 1% of revenue, which seems like a small percentage but is still a sizable hit. It’s enough to make investors notice, though probably not substantially if it’s a one-off event. If it isn’t a one-off, and states/countries start issuing fines all over the place, then there might be more of an issue.

dartos
3 replies
1d

Ain’t no amount of money that can compare to infinite growth and mass media control.

j_maffe
2 replies
22h50m

This is just factually incorrect. Facebook doesn't have "infinite growth"

ta1243
0 replies
7h50m

The entire economy is based on infinite growth, and most of it is based on exponential growth, which is of course a problem.

Indeed the concept of the infinite growth may well explain the Fermi paradox through the concept of a "light cage"[0]

https://pureportal.strath.ac.uk/en/publications/the-light-ca...

dartos
0 replies
12h4m

Tell big tech leadership and investors that

ta1243
0 replies
1d

If the illegal practice raised $5b in extra revenue then that's just good business sense.

dantyti
0 replies
1d

as noted in the AP article, it's at least the second settlement:

In 2021, a judge approved a $650 million settlement with the company, formerly known as Facebook, over similar allegations involving users in Illinois.

spywaregorilla
0 replies
1d

You should cite net income when trying to make a point on this.

BurningFrog
4 replies
18h46m

Incentives always work.

If the expected fine for X is larger than the expected profit, companies will not do X.
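
That claim can be sketched as a simple expected-value comparison; the discount for enforcement probability is what lets "fines as a cost of doing business" persist. All numbers here are hypothetical:

```python
# Minimal expected-value sketch of the deterrence argument.
# A rational firm does X only if expected profit exceeds the expected penalty,
# and the penalty is discounted by the probability of actually being fined.
def worth_doing(profit: float, fine: float, p_enforcement: float) -> bool:
    return profit > fine * p_enforcement

# A $10B fine against $5B of profit deters only if enforcement is likely:
print(worth_doing(5e9, 10e9, 0.9))  # False: the firm is deterred
print(worth_doing(5e9, 10e9, 0.1))  # True: the fine is just a cost of doing business
```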

AnthonyMouse
2 replies
18h17m

That's actually kind of the problem here. The assumption built into this is that the company is doing the math. More likely the company is just so big that if even one mid-level person makes a bad call, the consequence gets propagated across a billion users. But there are so many mid-level people that it's guaranteed to happen.

Which is the same reason that "higher penalties" wouldn't work either. The problem is not that the penalties aren't sufficient to deter, the problem is that the right hand doesn't know what the left hand is doing, so even if the lawyers tell them "you must never do this" they've got thousands of independent chances to screw it up. One person who doesn't heed their training and oof.

The actual problem is that these companies are too big. Mistakes will always happen, but one mistake by one team in a huge bureaucracy shouldn't affect a population the size of a major country.

photonthug
0 replies
3h27m

Let’s go nuts and just take for granted the good faith and best intentions of megacorp, against all evidence. Even then, we all know that ignorance of the law doesn’t get people out of a speeding ticket. Why should the argument from ignorance hold up then with megacorp, who is staffed with hundreds of legal compliance professionals? Better yet, why should it hold up for their nth example of systematic abuse?

BurningFrog
0 replies
3h19m

I have some sympathy for the idea that these fines are a form of "stochastic taxation". There are so many rules, arbitrarily enforced and ever changing, that a large company will bleed off a billion now and then whatever you do.

I think that definitely happens, but I'd guess it's the smaller part of the equation.

khafra
0 replies
12h37m

Seems to be a pretty strong incentive for companies to make sure that, for anything unethical that they actually want to do, the fines will be large enough to satisfy the median voter, but small enough not to impede their business.

Empirically, large companies seem to have little problem following this incentive.

Detrytus
1 replies
1d

Well, I would expect that, in addition to paying money, this settlement also obliges Meta to stop doing whatever they were doing. And if they don't, I would expect the fine for a repeat offense to be much higher, like 10x. At some point that would make ignoring the law unprofitable.

AmVess
1 replies
1d

Nope. They get passed on to the consumer, anyway.

The only way to stop this type of behavior is to throw execs in jail and take their personal assets off them.

xvector
0 replies
12h34m

For facial recognition in 2011? Extreme take.

tediousgraffit1
0 replies
21h47m

from the better AP News article linked by the top comment:

The company announced in 2021 that it was shutting down its face-recognition system and deleting the faceprints of more than 1 billion people amid growing concerns about the technology and its misuse by governments, police and others.

ta1243
0 replies
1d

Based on how long this was going on, it's about $5 per user per year.
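
That figure roughly checks out; here's the arithmetic with assumed inputs (the duration and the Texas user count are my guesses for illustration, not from the article):

```python
# Rough reconstruction of the "$5 per user per year" figure.
settlement = 1.4e9
years = 13           # roughly 2011 through 2024 (assumption)
texas_users = 20e6   # assumed number of affected Texas users

per_user_per_year = settlement / (years * texas_users)
print(f"${per_user_per_year:.2f}")  # prints "$5.38"
```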

nickburns
0 replies
23h4m

It's even more insulting than a fine. It's a voluntary payment in settlement that admits no liability or wrongdoing.

josefritzishere
0 replies
23h57m

I can report yes. California has been using them for years after passing the CCPA to motivate better data privacy policy and procedure.

edm0nd
0 replies
20h54m

Not really, imo.

Getting fined for doing something illegal really just means it's legal for a price.

TiredOfLife
0 replies
22h54m

They stopped doing facial recognition, so yes.

DoubleDerper
0 replies
1d

Corporate Crime and Punishment: An Empirical Study

"Put simply, for large companies, criminal penalties may be just another cost of doing business—and quite a low cost at that."

source: https://scholarship.law.upenn.edu/faculty_scholarship/2147/

CursedUrn
24 replies
1d

Hopefully other states and countries sue them too. Facebook has taken massive liberties with our private data and they should be held accountable for it.

qingcharles
16 replies
23h30m

They already did. They settled this same kind of suit in Illinois a couple of years back and paid everyone a few hundred dollars apiece. Google too.

burningChrome
14 replies
21h39m

This is always my frustration with class action lawsuits. The people who've been affected by the company never get anything of value. It's always the attorneys and the state that started the suit that make all the money.

The worst was back in the day when Chevy used some bogus primer on their trucks. They refused to admit it was their fault that owners' paint was coming off, sans any rust. A class action was started, and over the course of several years the government and owners finally won.

What did the owners get? Coupons for $100 off a paint job at the local Chevy dealer.

So not only did the owners not get anything to help them fix their problem, they were forwarded to a local dealership, where the dealership could take advantage of them again, charge them thousands of dollars, and essentially make back all the money being paid to the attorneys and the government.

twodave
7 replies
21h23m

If you have damages that are worth more, then you can excuse yourself from the class and sue them individually.

dataflow
6 replies
19h33m

What if your damages aren't worth the expense of suing them?

nyokodo
5 replies
16h30m

What if your damages aren't worth the expense of suing them?

That’s what small claims court is for.

dataflow
4 replies
16h23m

You don't think hiring lawyers to successfully sue a megacorp ends up costing you more than $10k?

twodave
2 replies
16h17m

Depending on the lawyer’s anticipated likelihood of winning it can be free.

dataflow
1 replies
14h53m

Let me get this straight, you're saying I should take a megacorp to small claims court because lawyers that will inevitably end up costing >$10k might take my case for free if the case is strong enough? Does that logic make any sense to you?

twodave
0 replies
5h32m

I’m not suggesting that you do anything. I’m just pointing out that, for example, people sue insurance companies every day, for free and win.

colejohnson66
0 replies
7h3m

Many small claims courts don’t allow lawyers.

jfengel
4 replies
21h17m

Much of the time the actual harm to individuals is fairly small, but there are so many that the aggregate harm is hard to ignore.

It's a quandary. You can ignore it, but that encourages criminal behavior. Or you can pursue it, but proving your case beyond a reasonable doubt is time consuming and difficult. Often, there is no money to pay the lawyers in advance, so they expect to be compensated for the risk of getting nothing.

If the lawyers worked for free you might get double or triple the settlement, but it's hardly better to get $300 off a paint job than $100. The cash equivalent is probably only $30.

shiroiushi
0 replies
14h12m

If the lawyers worked for free you might get double or triple the settlement, but it's hardly better to get $300 off a paint job than $100. The cash equivalent is probably only $30.

No, the cash equivalent is probably $-200 or so.

The problem is that the individuals affected didn't get anything of value. They got a $100 off coupon at the dealership for a paint job there. Which means the dealership is just going to inflate the price by at least that much, and dealerships are already known to have higher prices than competing businesses anyway.

If they had gotten a $100 voucher for a paint job at any paint business, or better yet a $100 check to spend as they wish, then this would have cost GM something at least. But instead, they stood to profit from the "payout".

nyokodo
0 replies
16h31m

proving your case beyond a reasonable doubt is time consuming and difficult

Civil cases aren’t generally required to be decided beyond a reasonable doubt but merely with a preponderance of the evidence. It’s more that court in general is often time consuming and difficult.

they expect to be compensated for the risk of getting nothing

It depends on the lawyer and the case, e.g. most class action suits the lawyers charge nothing up front but take a cut of the damages.

Y_Y
0 replies
20h9m

I'm not in that class, but I think that (like any other case) cash is the right way to settle, in the absence of a mutual agreement to the contrary. The beneficiaries didn't get to bargain for a voucher, and shouldn't have to go to the dealer in order to be made whole.

The only thing that seems fair to me is if the settlement is held in trust and disbursed with minimum friction to anyone eligible. If Chevy wants to give those people a $100 voucher for thirty bucks if they show their eligibility for the action then go ahead!

BobbyTables2
0 replies
14h16m

The problem in large class settlements is the “damages” awarded per capita are often a trifle compared to a similar situation with a smaller set of plaintiffs.

Look at the award from Equifax data breach settlement… The CFPB fine was about $0.66 per person. The class action award averages to about $2 per person.

Could your bank plaster your personal info on a billboard for two years, but make it up to you with a nice paper cup of coffee? (How would you even prove damages?!?). Don’t get your hopes up — that coffee would be too expensive…

Would $1000 be fair for you? Multiply that by 150 million…

Same with the airbag recalls.

The problem is worse than lawyers’ cuts… Proper restitution isn’t even possible (orders of magnitude greater than company’s assets).

At the individual level, a $10k automobile can do far more damage to people and property than just $10k…

But, As a society, we often fail to punish companies that do more harm than good.

krabizzwainch
0 replies
18h15m

I absolutely agree with you. However, living in Illinois I received a $397 dollar check. I had to go look it up to make sure I got the number right. On one hand, a massive chunk of the $650 million went to the lawyers. But $397 for each user, when over a million signed on to the lawsuit is pretty significant.

nilamo
0 replies
17h7m

Let's not pretend like that is a positive outcome. All of your personal data being irrevocably made available to persons unknown, and used for unknowable purposes for untold decades... but at least you got $100 one time.

awahab92
3 replies
16h9m

dont kid yourself. the government spies and violates civil liberties more than any corporation.

texas just wants money and so they steal it legally.

if americans had any principles, they would sue the government until it acted responsibly. instead they are complacent to be bullied

moate
1 replies
14h34m

This is a stupid fucking take. We got some whataboutism, victim blaming, and nothing about the actual issue?

1- the government absolutely violates civil rights on the regular. It also hands out these rights, and adjudicates when they’re violated. If you believe the system is flawed on a systemic level, the solution cannot be “make the system do what you want by turning it against itself” but rather “rise up and overthrow the system” since the former is untenable.

2- is the “texas wants money so they steal it legally” saying that bad actors can’t do the right thing ever, because that’s untrue. The state is doing what you seem to be arguing is a good thing (limiting intrusions) but because they also do bad things, they “just want money”? Who cares what the motive is if the result is correct?

You can hate the state all day (Hello, full blown anarchist here myself) but “x is worse than y, so y isn’t even a problem” is asinine baby thinking.

freedomben
0 replies
2h56m

Great comment, other than "This is a stupid fucking take" which is unnecessary and incendiary. I'm glad your comment wasn't flagged, though with that opening salvo it probably should have been.

Moldoteck
0 replies
11h50m

Oh look, someone is using whataboutism to simp for some big corpo's profits

minklebottom
0 replies
20h9m

I despise Facebook, but this is absurd. There's no world in which there was 1.4b of harm done to Texans.

This is no better than theft, and it will be defended exclusively by people who have never created a single thing of value in their entire lives.

knodi123
0 replies
20h35m

Facebook has taken massive liberties with our private data

Gotta remember the quote from Zuckerberg during the early days, when someone asked him how he got all that private data. He said, and I quote, "People just submitted it. I don't know why. They 'trust me'. Dumb fucks."

chrischen
0 replies
13h23m

A lot of our profile pics are actually public, and models like OpenAI's probably ingested those too.

jmakov
18 replies
23h19m

This is a trend not only among banks but more and more among big tech - just include the (future) fine in your product price, then settle (if they investigate at all). No harm, nobody goes to prison, everybody happy.

andsoitis
6 replies
23h12m

This pattern is not exclusive to “big companies”.

We all know individuals who explicitly do the same w.r.t. speeding on the highway, driving in the HOV lanes, etc.

So it seems to me that there’s a fundamental human trait (“what can I get away with” ??) that warrants thinking about as well.

lcnPylGDnU4H9OF
3 replies
22h55m

speeding on the highway, driving in the HOV lanes

This seems like it's apples to oranges. The people choosing to do these things aren't doing so because they consider it to be a cost of business, expecting that they'll generate more in revenue, but because they think the likelihood of being caught and the gravity of the offense are relatively low. This practice of disregarding regulations because the fines can be factored as a cost is fairly well confined to large corporations.

(I guess someone being paid per mile driven from advertising decals on their car would get a business benefit from the speeding; they may even factor in the probability of being caught with the amount of the fine to determine their speeding decisions. That's nobody I know, though.)

MassiveQuasar
2 replies
22h47m

Time is money in the end. Speeding reduces time spent on the road and allows more time being spent elsewhere. The analogy still holds, just more abstractly.

consp
1 replies
22h43m

Which is the reason why around here your license is void (for a certain time, as in years terms) if you speed too much (plus you can show up for a behavioural course which not only costs quite a bit for people with less money but also takes three days for those who can afford it). And no chauffeur is going to risk that since it will void the means of living.

edit: I do not see any reason not to void a "license of cooperation in [state/country]" in case of continued breaking of laws.

htrp
0 replies
22h4m

The trick (at least in the USA) is to use an out of state licence which isn't connected to the DMV system of the state you're driving in. Sure you'll get fines, but your licence won't be revoked.

dietr1ch
1 replies
23h1m

So it seems to me that there’s a fundamental human trait (“what can I get away with” ??) that warrants thinking about as well.

Idk, I suspect many "crimes" like jaywalking and speeding are more around, this rule errs too hard on safety, but I know I can do this safely, and if it's not safe enough I'll get to pay the consequences (fines, accidents, dying). That's nothing like violating the privacy of other people for profit and no real penalty (actual jailtime for those involved, not just a --fine-- tax on the profits)

petsfed
0 replies
21h4m

this rule errs too hard on safety, but I know I can do this safely, and if it's not safe enough I'll get to pay the consequences (fines, accidents, dying)

The analogy here is that people, like corporations, are frequently very bad at assessing where the appropriate line for safety actually is, doubly so when the appropriate line personally inconveniences them. Some rules are perhaps too stringent, but frequently, the guidelines are akin to safe working loads, with safety margins built in, rather than do-not-exceed limits. Anyone who has to understand either of those things will tell you that if your operational envelope exceeds the safe working limit, you will eventually fail, catastrophically.

It's certainly true that traffic fines are not similar to corporate fines in the sense that you don't lose your chemical manufacturing license after committing 15 points worth of chemical safety infractions, but for other kinds of infractions, the fine for people is also frequently just part of the cost of doing the thing.

All of that said, I 100% agree that company leadership should see jail time for a variety of infractions, with a sort of inverted burden of proof as it pertains to determining who is at fault in an offending organization: you can only pass the buck down as far as the highest person who you can prove beyond a shadow of a doubt deliberately hid information from those above them.

CydeWeys
4 replies
22h30m

$1.4B is serious money though. You can't be paying that out left and right, and there's no way Meta made more than $1.4B on doing facial analysis of photos uploaded by Texans. This was an actual loss by them, and thus an error.

whythre
1 replies
22h16m

It also seems to leave the door open to similar outcomes in other jurisdictions, compounding the potential loss.

queuebert
0 replies
22h0m

Finally an advantage for the people of the US being a republic of states instead of a monolithic democracy.

glaucon
1 replies
19h33m

1.5 days revenue which they have negotiated to pay over the course of five years.

CydeWeys
0 replies
18h26m

The right figure to compare against is their annual profit, which is around $45B. A single state in the US in a single enforcement action against them taking off over 3% of their entire profit for the year is a big deal.

jldugger
2 replies
23h13m

Also feels like a trend among states to end-run the commerce clause. Want to tax imports despite the Constitution? Just pass regulations that only affect out of state industries and fine them for non-compliance.

kelnos
1 replies
23h7m

I think that's fair, given that the commerce clause is interpreted so ridiculously broadly that the 10th amendment might as well not exist.

cryptonector
0 replies
20h39m

It should really be flipped. Wickard should be overruled, as a result reducing the federal power to regulate intra-state affairs, but state power better circumscribed by the Commerce Clause: no one state should be able to dictate nationwide commerce terms on account of its economy's size. Whether that would prevent Texas from regulating Meta as in this case is not clear -- the courts would have to figure that out.

neongodzilla
1 replies
22h13m

No harm? What about the people who lost their privacy without consent? That doesn't get repaired.

hamasho
0 replies
22h8m

I think GP meant no harm for those corporations.

renonce
0 replies
22h1m

The settlement, announced Tuesday, does not act as an admission of guilt and Meta maintains no wrongdoing.

In 2011, Meta introduced a feature known as Tag Suggestions to make it easier for users to tag people in their photos. According to Paxton’s office, the feature was turned on by default and ran facial recognition on users’ photos, automatically capturing data protected by the 2009 law. That system was discontinued in 2021, with Meta saying it deleted over 1 billion people’s individual facial recognition data.

The 2022 lawsuit

We are pleased to resolve this matter, and look forward to exploring future opportunities to deepen our business investments in Texas, including potentially developing data centers

Each statement makes it increasingly harder to view this as a fine rather than a tax. An offence that lasted 11 years and got prosecuted a year after it ended can be explained in no other way than as an excuse dug out of the ground to extract a ransom.

dantyti
8 replies
1d

This one seems like the better of the two, as it includes Q124 profit and revenue numbers, as well as the stock's movement

tediousgraffit1
7 replies
21h44m

it also suggests some important details, like the fact that this was already opt-in:

At the time, more than a third of Facebook’s daily active users had opted in to have their faces recognized by the social network’s system.

nkrisc
4 replies
8h7m

I don’t believe for a moment that a third of users would consensually opt-in to anything less than free money.

I can’t tell you how many newsletters and other BS I unknowingly “opted-in” to, for example.

bicx
3 replies
4h4m

I believe they asked things like “Get notified when you appear in your friends photos” or something along those lines. My memory is fuzzy though. Phrasing it like that would probably get more opt-ins. It’s a lot less scary sounding.

erichocean
2 replies
3h42m

I kind of wish 3rd parties could set the text of what is being opted into.

It's way too easy for companies to mislead consumers about what they are agreeing to.

freedomben
1 replies
2h59m

Yes, but the other side is true too. For example, if the prompt just said "Allow Facebook to scan all your personal photos and do facial recognition, and also capture some of that data for future use?" then it's not at all clear why they are asking for this. Also it's going to all be "no" from the users.

Much better would be a compromise, that describes both why and what. Though, that will be a lot of text which isn't good either. Hard problem.

Teever
0 replies
1h8m

I guess it's a hard problem, but is it a worthwhile problem?

wizzwizz4
0 replies
19h58m

Many of Facebook's "opt-in" things aren't actually consent, so I'm prejudicially wary of any numbers Facebook gives. (I have no insight into this particular case.)

gerdesj
0 replies
19h34m

How was the "opt in" offered?

In general, when I come across some sort of opt in I get offered a choice between allowing something to work and it not working and having some sort of a sub optimal experience. I can't even try before I buy either because once I have consented, the data/image/whatever is already "released".

Most opt-in choices, unless coerced by laws, will be heavy handed nudges at best.

I'm an IT consultant and stand more of a chance than most at making an informed choice but please don't wave "opt in" as some sort of laundering procedure. I am deliberately juxtaposing with money laundering.

tomwheeler
7 replies
1d

Was Texas the victim? If not, why shouldn't the actual victims get the money?

dantyti
3 replies
1d

when someone litters on my lawn the administrative fine goes into the local/state budget, not my pocket.

equivocates
1 replies
1d

when someone steals your tv, the restitution is paid to the victims.

marcosdumay
0 replies
23h44m

Restitution is paid to the victims; fines are paid to law-enforcing organizations.

selectodude
0 replies
1d

When Illinois won a judgement, I received a check in the mail.

olyjohn
2 replies
1d

You split the money up among the users, everybody gets a $1.50 check in the mail that costs $0.50 to mail out. Most of the checks will get tossed in the trash. Might as well just burn the money.

davidgay
1 replies
23h48m

$1.4 billion / 30 million people ≈ $47/person.

Do the math before being generically cynical?
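The per-capita math above, sketched with the rough figures from this thread (a $1.4B settlement split across ~30M Texans; both are approximations, and nothing suggests the money will actually be distributed this way):

```python
# Rough per-capita arithmetic from the comment above.
# Both inputs are approximations quoted in the thread, not official figures.
settlement = 1.4e9   # $1.4 billion settlement
population = 30e6    # ~30 million Texans
per_person = settlement / population
print(f"${per_person:.2f} per person")  # about $46.67
```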

insane_dreamer
0 replies
18h7m

the $1.50 is after administrative fees ;)

jeffwask
5 replies
23h46m

When do I get my $46.60, or better yet my $933 if we only count active Facebook users? Either way, it still seems like a great price for everyone's biometrics.

the_sleaze_
2 replies
23h8m

Undoubtedly a great price. At times like this I'd like to see the entire extent of the profits split up among the plaintiffs.

“Wherever I'm going, I'll be there to apply the formula. I'll keep the secret intact. It's simple arithmetic. It's a story problem. If a new car built by my company leaves Chicago traveling west at 60 miles per hour, and the rear differential locks up, and the car crashes and burns with everyone trapped inside, does my company initiate a recall? You take the population of vehicles in the field (A) and multiply it by the probable rate of failure (B), then multiply the result by the average cost of an out-of-court settlement (C). A times B times C equals X. This is what it will cost if we don't initiate a recall. If X is greater than the cost of a recall, we recall the cars and no one gets hurt. If X is less than the cost of a recall, then we don't recall.”

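The quoted formula (from Fight Club) reduces to a simple expected-cost comparison. A minimal sketch with entirely made-up numbers:

```python
# Sketch of the recall formula quoted above: X = A * B * C,
# compared against the cost of a recall. All numbers below are hypothetical.
def should_recall(vehicles_in_field, failure_rate, avg_settlement, recall_cost):
    # Expected total settlement cost if no recall happens
    expected_settlement_cost = vehicles_in_field * failure_rate * avg_settlement
    # Recall only when expected settlements exceed the recall cost
    return expected_settlement_cost > recall_cost

# 1M vehicles, 0.01% failure rate, $500k per settlement, $200M recall:
# expected settlements are $50M, so the formula says don't recall.
print(should_recall(1_000_000, 0.0001, 500_000, 200_000_000))  # False
```

The downthread reply makes the point that this is only disreputable if the settlement costs fail to reflect the actual harm.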
singleshot_
0 replies
21h16m

The Coase equation.

AnthonyMouse
0 replies
18h25m

People always quote this like the company is doing something disreputable, but that's exactly what you want them to do as long as the cost of the lawsuits accurately represent the amount of the harm.

Recalling a hundred million cars at the cost of a hundred billion dollars because there is a 0.5% chance that one person across the entire hundred million could die is the wrong choice. Otherwise all products would have to be recalled continuously, because nothing is perfectly safe. Example: Many cars have been sold without lane departure warning systems, even though that could cause some people to die. Should they all be recalled?

The equation above is how you calculate the cutoff for whether to do the recall. What would you propose that they do instead?

al_borland
1 replies
22h25m

These penalties need to be much higher if we're going to hope for any change in behavior.

Maybe every state should sue, so it would be $70B. Of course that still doesn't make a dent when Meta's market cap is $1.174 T. It's a rounding error.

conductr
0 replies
21h46m

The penalty would be greater if they continued violating. First time penalties should probably work that way. However, there needs to be a 3 strike rule or something similar, where penalties of any type by the same entity grow exponentially and ultimately you get banned from operating. Status quo is a series of slaps on the wrists for differing infractions. It just trains them to hide their wrongdoing better.

delichon
5 replies
23h48m

This was the first lawsuit Paxton’s office argued under a 2009 state law that protects Texans’ biometric data, like fingerprints and facial scans. The law requires businesses to inform and get consent from individuals before collecting such data.

I hope it only restricts business, because I have an awkward amount of face blindness and would love to have an app that could put names to them. I wonder if the maker of such an app would be liable for my use of it in Texas.

criddell
4 replies
23h23m

As long as it’s opt-in, there wouldn’t be a problem.

Without that, I don’t think it could work. Does your need for an accessibility device trump my right to privacy?

warkdarrior
0 replies
22h22m

Where can I tell the Texas government (or Illinois, etc) that I opt-in to all facial-recognition software available to users?

tediousgraffit1
0 replies
21h43m

this is a really important point that's not clear to me - from the [ap news article](https://apnews.com/article/texas-attorney-general-meta-settl...) linked in the top comment -

At the time, more than a third of Facebook’s daily active users had opted in to have their faces recognized by the social network’s system. Facebook introduced facial recognition more than a decade earlier but gradually made it easier to opt out of the feature as it faced scrutiny from courts and regulators.

delichon
0 replies
19h1m

I don't think I should need your permission to use a prosthetic to overcome a disability in performing an essential human function.

CobrastanJorji
0 replies
16h50m

Anything you can legally opt-in to online is effectively complete freedom to companies. Nobody reads click-through agreements or terms of service. Sites will just add "and also we can collect your biometric info" to the terms, and people will keep clicking accept.

Without some very strong, EU-type rules around how you have to ask, it's just another thing lawyers will know to add to the terms.

wwilim
4 replies
12h48m

Suing big tech for ethical issues seems to be a reliable source of income for governments these days

xvector
3 replies
12h38m

Write enough laws and a company is bound to break one of them at some point. Simple probability. Easy source of income for an indefinite period.

Lio
1 replies
12h28m

If your business is dedicated to removing people’s legal right to privacy then you are probably going to break the law at some point.

Why not just act ethically in the first place?

xvector
0 replies
13m

Most privacy laws today are incredibly overwrought and vague. In this case, face recognition was illegal even if you opted into it. Try being a startup that needs any amount of customer data to operate, then read the GDPR and realize you're better off just moving to the US.

Regardless, you are missing the point. It is a very trivial calculus to realize that if you craft enough complicated and vague privacy laws, companies are bound to violate it, no matter what they do or how hard they try. All you have to do is craft a set of laws in which any company can always be interpreted as in violation of one.

If I was a state, I would go out of my way to craft these vague, overwrought laws so I could have a reliable source of an extra few billion dollars here and there whenever I needed it. If I was a regulator or legislator, I'd do it for the career clout of "going after the big bad guys." And no one will ever complain, because "big companies are evil and capitalism has never done anything for humanity," so the Overton window can only ever move in one direction.

And this is how we end up with the undeniable technological stagnancy of the EU, where they completely missed the www and mobile revolutions, and will certainly miss the AI revolutions. How many Industrial Revolutions can you miss out on before you fade into irrelevancy, having lost a meaningful portion of your financial/economic/military/technological power? I guess we will find out.

preciousoo
0 replies
12h21m

oh brotherrr

dantyti
4 replies
1d

I think this adware company is pure evil, but this feature seems like something a user might actually opt in to (as noted in the AP story, quite a few did).

Interesting to see that while meta at least publicly scraps the feature, a similar one on iOS Photos is not even an opt-in – you can't turn it off.

phyrex
2 replies
19h54m

The difference is that the information gets processed, stored, and accessed only on your own devices with Apple, and on metas servers in the case of Facebook.

warkdarrior
1 replies
13h39m

How does information about my face get processed, stored, and accessed only on my own devices? Without Apple's help?

phyrex
0 replies
8h7m

I don’t understand the question? iPhoto detects faces and you can assign one of your contacts to that face?

ta1243
0 replies
1d

It will be trivial to get users to opt in, just bury it in the "terms of service" for whatsapp/instragram/facebook/etc and you're set.

Network effects mean you can't realistically say no.

29athrowaway
4 replies
22h11m

Will they delete the information? Or they purchased Texan facial data for 1.4b?

tediousgraffit1
3 replies
21h46m

the ap news article linked by the top comment indicates that they deleted 'faceprints':

The company announced in 2021 that it was shutting down its face-recognition system and deleting the faceprints of more than 1 billion people...

29athrowaway
2 replies
20h40m

What does deletion mean anyways? some bullshit soft deletion without any auditing that will enable them to keep using the data?

phyrex
1 replies
19h54m

Meta does get audited

shmeeed
0 replies
11h37m

Real question, how do you audit that data got deleted with certainty, and not just locked away out of reach for the time being? To me this seems like a fact hard to prove.

websap
3 replies
1d

This is kinda foolish. I find this feature to be extremely useful. Governments need to do better at reducing the burden of choice for end users.

I as an end user of an app shouldn't have to go through every feature, how it is implemented and if that meets my personal privacy bar. Sensible defaults are important.

causal
1 replies
1d

Governments need to do better at reducing the burden of choice for end users.

Arguably that's what Texas is doing here. Requiring permission to apply facial recognition feels like a very sensible default.

autoexec
0 replies
1d

What Texas is doing here is telling companies that they don't need your permission to steal your facial recognition data. Companies just need to consider that after several years of profiting from that facial recognition data they might one day have to pay just a tiny fraction of the money they make to the state of Texas. Cost of doing business really.

downrightmike
0 replies
1d

yes, but that would require a good, better than California's protections, federal GDPR like law, and data brokers are actively lobbying against it.

eftychis
3 replies
22h46m

Equitable and more specifically injunctive relief is the way out of this abuse of "crime is the cost of doing business" mentality. (That a lot of us raise here.)

What could that mean: Meta or whichever company breaks the law, loses ownership and rights to anything that is the result of the crime.

If it's a model, Meta can not use that or any other version of the model that utilized data illegally acquired. And that model becomes property of the victims.

TechDebtDevin
1 replies
22h34m

That would be the equivalent of confiscating a fleet of trucks from a trucking company whenever a driver broke a law. A little extreme and will never happen. China doesn't even operate like this.

eftychis
0 replies
22h3m

Not unheard of for some crimes, eg drugs or money laundering.

Securities fraud also essentially offers that: all money made out of the fraud and gains on that belong to the victims.

California's Unfair Competition framework also dictates essentially payment of the proceeds of the "unlawful activity" and taking actions to undo it.

Again this is already an existing relief for certain crimes or civil torts committed by individuals.

We just have not legislated to apply it to you know the other "persons," the companies.

xvector
0 replies
12h29m

This would just create a perverse incentive for the government to confiscate everything it can. Then you would see laws being crafted that are increasingly ambiguous and difficult to follow in order to enable greater confiscation of goods.

Asset forfeiture isn't something we should be cheering on.

swamp40
2 replies
23h43m

Is this why Illinois made it illegal too? For a nice big payout?

dmitrygr
1 replies
23h41m

Or maybe because the public wants this? I sure as hell want facial recognition made more difficult. I can’t change my face so I would prefer it not be used as a key in any database

jimbob45
2 replies
1d

Were Facebook's lawyers asleep at the wheel for this one? It seems like they could have thrown in a clause about "pictures uploaded may be subject to facial recognition software" and no one would have batted an eye. How is it possible that they dropped the ball so hard here?

staticman2
0 replies
22h45m

Putting it in the TOS wouldn't help. If my friend takes a picture of me and posts it on Facebook, I have in no way agreed to Facebook's TOS.

chmod775
0 replies
1d

I wouldn't be surprised if they had that. They also could've added a clause that Facebook now owns your house, and it'd probably be roughly as interesting to a court.

The fact is that many things buried in EULAs and whatnot are not really enforceable nor constitute consent. Some things have to be agreed to more explicitly than putting them on page 50 of your fine-print.

It's especially problematic when companies start doing something you didn't directly sign up for or couldn't have expected to happen when you did. I don't think that many people who signed up for a social network in 2015 expected that their photos would later be scanned. It might surprise people even now.

tediousgraffit1
1 replies
22h6m

The attorney general’s office did not say whether the money from the settlement would go into the state’s general fund or if it would be distributed in some other way.

so where's that money going to wind up?

warkdarrior
0 replies
13h46m

The money goes to the AG's re-election campaign.

sensanaty
1 replies
21h43m

There really needs to be a 3-strike rule type of thing with fines like these. It's ridiculous to me that they can continue to violate people's privacy without their consent, get fined a percentage of a percentage of the money they actually made on the practice, and that's the end of it.

These fines should be exponential in nature, and aggressively so. The 4th-in-a-row fine of this nature should basically take everything they earned in the whole year. Let's see how quick and efficient they suddenly become once there's actual consequences.

That, or Zucc and his cronies should be getting jailed. I'm fine with either option, or preferably both.

xkcd1963
0 replies
8h0m

Yes. And facebook is not too big to fail. There are thousands of alternatives to Facebook, Whatsapp and Instagram. And the few employees they have easily find another job.

mattfrommars
1 replies
19h48m

Anyone know how much money the legal firm gets from this? Their cut should be substantial since we are talking $1.4 billon dollars here.

lern_too_spel
0 replies
19h43m

It's the AG's office filing the suit, not a class action.

kazinator
1 replies
23h52m

Governmental cash grab; the actual users whose faces were subject to recognition won't see a cent, assuming anything is actually paid. Those not in Texas will not benefit indirectly in any way.

enobrev
0 replies
22h49m

Certainly possible. At the very least, in IL, I got a check for somewhere around $400.

farceSpherule
1 replies
22h20m

The fine is nothing. Not even 10% of their net profits for 2023.

galangalalgol
0 replies
21h37m

If every state sues and gets almost 10%, that could be an issue...

rideontime
0 replies
18h55m

Now do Clearview AI.

jrochkind1
0 replies
15h46m

Consumer Reports, a nonprofit consumer advocacy organization, commended Paxton for the lawsuit but noted it was an outliner.

outlier, presumably. i remember when newspapers had good copy editing.

josefritzishere
0 replies
23h58m

Texas passed a pretty solid CCPA-like data privacy law (TDPSA) which went into effect July 1st of 2024. That start date was announced when it was passed in June of 2023. Meta needs to get with the times; they're wildly out of compliance.

jameson
0 replies
13h23m

Meta introduced a feature known as Tag Suggestions to make it easier for users to tag people in their photos. According to Paxton’s office, the feature was turned on by default and ran facial recognition on users’ photos, automatically capturing data protected by the 2009 law.

Meta should also be forbidden from using any features derived from this data, or open source any trained model from the data. 1.4B settlement is too small compared to the long term gains of a company

hobo_in_library
0 replies
19h6m

So who gets the money?

hamasho
0 replies
21h48m

Probably that data was used to train AI models too. I hope we establish a legal framework that prevents training models without proper permission, and the companies that have already trained their models will get fined and those models will be banned from commercial use.

I enjoy the rapid progress of LLMs. ChatGPT and Claude are already a critical part of my daily work. But I don't like the current situation where VCs and start-ups use unpermitted data to train the models, don't respect content creators, and take advantage of the lack of regulations.

dreamcompiler
0 replies
17h9m

Ken Paxton is still under federal investigation for various crimes, but I suppose even criminals do the right thing sometimes.

blitzar
0 replies
13h31m

It's a write-off for them. They just write it off. All these big companies just write everything off.

bankcust08385
0 replies
13h32m

Maybe the state can direct some funds to TxDOT to properly mark hazards on their easements near roadways and properties to avoid my mom crashing her car into a hidden, unguarded, unmarked culvert.

Noumenon72
0 replies
22h3m

Huh, I wondered why this feature stopped being available. It would be even more useful now that you could use AI to say "find me all photos of George riding his bike" or "find all pictures of me with Dave".

Dowwie
0 replies
22h20m

Do the users get any of that money?