
Google Ordered to Identify Who Watched Certain YouTube Videos

addicted
130 replies
15h44m

There are different incidents here.

The first one where the police uploaded videos and wanted viewer information is absolutely egregious and makes me wonder how a court could authorize that.

The next one, which I didn’t fully understand but which appeared to be in response to a swatting incident where the culprit is believed to have watched a specific camera livestream, and where the police provided a lot of narrowing details (time period, certain other characteristics, etc.), appears far more legitimate.

godelski
117 replies
14h31m

I don't understand how either of these is remotely constitutional. They sure aren't in the spirit of it.

They asked for information about a video watched 30k times. Supposing every person watched that video 10 times, AND supposing the target was one of the viewers (it really isn't clear that this is true), that's 2,999 people who have had their rights violated in order to search for one. I believe Blackstone, who heavily influenced the founding fathers, has something to say about this[0]. That's literally 30x the ratio in its 100:1 form.

I don't think any of this appears legitimate.

Edit: Oops, forgot the link. [0] https://en.wikipedia.org/wiki/Blackstone%27s_ratio

mingus88
61 replies
14h13m

Cell phone tower data has been used for a decade now in pretty much the same way.

Did you happen to pass by a cell tower in a major city around the time a crime was committed? We all have.

Well, your IMEI was included in a cell tower dump. Probably dozens of times.

Did you happen to drive your car over any bridge in the Bay Area lately? Did a municipal vehicle pass you and catch your license plate with their ALPR camera?

Guess what? Your name went through an LEO database search if they wanted to find a perp for that time/location.

Privacy has been dead for a long time. The worst part is people don’t care.

The Snowden files changed nothing. If there was ever a point in history where people would have given up their cell phones for their civil liberties, that would have been the time to do it.

godelski
22 replies
13h29m

Cell phone tower data has been used for a decade now in pretty much the same way.

I was mad then. I'm more mad now. Stop making these arguments, because it isn't as if one implies the other. And who the fuck cares if someone wasn't mad then but is now? What's the argument, that you're a hipster? That's not solving problems. I don't want to gatekeep people from joining the movement to protect rights. I don't care if they joined as tin foil hats or just yesterday, after having literally been complicit in these atrocities. If you're here now, that's what matters.

Privacy has been dead for a long time. The worst part is people don’t care.

Bull, and bull.

There are plenty of people fighting back. I'm pretty sure that me getting ads in languages I don't speak is at least some good sign. Maybe I can't beat the NSA, sure, but can I beat mass surveillance? Can I beat 10%? 50%? 80%? 1% is better than 0%, and privacy will die when we decide everything is binary.

People care. People are tired. People feel defeated. These are different things. If people didn't care, Apple (and even Google) wouldn't advertise themselves as privacy conscious. Signal wouldn't exist, and it wouldn't have 50 million users. It's not time to lie down and give up.

The Snowden files changed nothing.

They didn't change enough, but that isn't nothing.

alfiedotwtf
21 replies
11h2m

> The Snowden files changed nothing.
> They didn't change enough, but that isn't nothing.

The biggest change IMHO was that the entire industry got off its collective assets and finally moved to HTTPS.

wutwutwat
8 replies
8h12m

The world’s largest MITM

chgs
6 replies
8h3m

Tech bros love it. And Tailscale. And SaaS as a whole. Data sovereignty means you can't be mined by the adtech industry, so it's not cool.

vitno
4 replies
6h30m

Calling out tailscale here is odd considering it's peer-to-peer and encrypted.

chgs
3 replies
6h24m

With keys controlled by a central entity

Handprint4469
2 replies
5h43m

do you have a source for that?

mikehotel
1 replies
3h40m

Tailscale [0] says the private keys never leave the device.

“Security

Tailscale and WireGuard offer identical point-to-point traffic encryption.

Using Tailscale introduces a dependency on Tailscale’s security. Using WireGuard directly does not. It is important to note that a device’s private key never leaves the device and thus Tailscale cannot decrypt network traffic. Our client code is open source, so you can confirm that yourself.”

0. https://tailscale.com/compare/wireguard
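
As a rough sketch of the key model that quote describes (not Tailscale's actual code): WireGuard identities are Curve25519 key pairs, and in a coordination-server design only the public half ever needs to leave the device. Assuming the Python cryptography package:

  from cryptography.hazmat.primitives import serialization
  from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

  # Generated and kept on-device; in the model described above it is never uploaded.
  private_key = X25519PrivateKey.generate()

  # Only the public key would be registered with a coordination server or peers.
  public_bytes = private_key.public_key().public_bytes(
      encoding=serialization.Encoding.Raw,
      format=serialization.PublicFormat.Raw,
  )
  print(public_bytes.hex())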

sdht0
0 replies
1h32m

To add to that, they also provide Tailnet Lock [0], which protects against the only way the coordination server could mess with a tailnet: connecting unauthorized nodes.

[0] https://tailscale.com/kb/1226/tailnet-lock

j45
0 replies
2h3m

Not sure what the issue is with Tailscale, especially since you can self-host a Headscale server locally to get the same effect.

pbhjpbhj
0 replies
8h5m

Lol, I'm a bit slow ... some USA TLA runs Cloudflare, right?

rolph
0 replies
2h23m

A single layer of encryption is for the stone age.

If [peccadillo] must remain secret when your neighbour is investigated for [crime?], then encrypt at least twice, and obfuscate the original message.
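
For what it's worth, a minimal sketch of the "encrypt at least twice" idea with two independent keys (purely illustrative, using the Python cryptography package's Fernet; not an endorsement of the threat model):

  from cryptography.fernet import Fernet

  # Two independently generated keys; compromising one layer alone still leaves
  # the other layer of encryption intact.
  inner = Fernet(Fernet.generate_key())
  outer = Fernet(Fernet.generate_key())

  ciphertext = outer.encrypt(inner.encrypt(b"the original message"))
  assert inner.decrypt(outer.decrypt(ciphertext)) == b"the original message"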

PeterStuer
6 replies
9h52m

It had nothing to do with Snowden but with Google ranking algo changes. Google has a commercial interest in hindering competitors in the ad brokering market from observing info on the wire.

mike_hearn
3 replies
7h0m

It had everything to do with Snowden. Source: I was at Google at the time he started leaking.

Before Snowden encryption was something that was mostly seen as a way to protect login forms. People knew it'd be nice to use it for everything but there were difficult technical and capacity/budget problems in the way because SSL was slow.

After Snowden two things happened:

1. Encryption of everything became the company's top priority. Budget became unlimited, other projects were shelved, whole teams were staffed to solve the latency problems. Not only for Google's own public-facing web servers but for all internal traffic, and they began explicitly working out what it'd take to get the entire internet encrypted.

2. End-to-end encryption of messengers (a misnomer IMHO but that's what they call it) went from an obscure feature for privacy and crypto nerds to a top priority project for every consumer facing app that took itself seriously.

The result was a massive increase in the amount of traffic that was encrypted. Maybe that would have eventually happened anyway, but it would have been far, far slower without Edward.

lern_too_spel
1 replies
2h49m

You were at Google at the time, but your memory of the ordering of events is off. Google used HTTPS everywhere before Snowden.[1][2] HTTPS on just the login form protects the password to prevent a MITM from collecting it and using it on other websites, but it doesn't prevent someone from just taking the logged in cookie and reusing it on the same website. That was a known issue before Snowden, and Google had already addressed it. Many other websites, including Yahoo, didn't start using HTTPS everywhere until after Snowden.[3] I know because this was something I was interested in when using public WiFi points that were popping up at the time. I also remember when Facebook moved their homepage to HTTPS.[4] Previously, only the login form POSTed to an HTTPS endpoint, but that doesn't protect against the login form being modified by a MITM to have a different action for the MITM to get your password, rendering the whole thing useless.

What changed after Snowden was how Google encrypts traffic on its network, according to an article quoting you at the time.[5]

[1]https://gmail.googleblog.com/2010/01/default-https-access-fo...

[2]https://googleblog.blogspot.com/2011/10/making-search-more-s...

[3]https://www.zdnet.com/article/yahoo-finally-enables-https-en...

[4]https://techcrunch.com/2012/11/18/facebook-https/

[5]https://arstechnica.com/information-technology/2013/11/googl...

KennyBlanken
0 replies
4h47m

That's nice and all, but the "why" is more important than the "what".

Google was driven not by some panicked rush to protect user privacy, but by the need to protect Google's collection and storage of user data.

Google has 10+ years of my email. It doesn't treat that like Fort Knox because it gives a shit about my privacy; it treats it like Fort Knox because it wants to use that data for itself and provide services to others based on it.

You do know that Google was heavily seed-funded by the NSA, right?

kevin_thibedeau
0 replies
4h7m

There was literally a PowerPoint slide in the released docs implying they had backdoored Google's internal servers.

Jare
1 replies
10h55m

I thought this was driven by ISPs inserting their own ads in normal HTTP.

dbdudbdiddjc
0 replies
8h58m

…no, it was definitely the “SSL added and removed here” slide

jpc0
14 replies
8h46m

There is a distinction I tend to make here.

If some person was able to pick me out from a lineup because they physically saw me then that wasn't private and privacy laws don't apply.

So, for instance, capturing my face on CCTV in a public place isn't a privacy violation; same with my license plate in a public place.

However what happens on my private property is a privacy violation if it is recorded without consent.

Certain information isn't private, and it being stored is fine. Where the line gets drawn is what's up for debate.

I surely would want my contact details and name saved by a company that I intend to do business with, in either direction. However, if they spam me with information, I should be able to lodge a harassment claim against them. It's not a privacy issue but a decency issue.

fmobus
8 replies
6h43m

That notion isn't universal. In Germany, for instance, I can't install a camera pointing to the street.

jpc0
6 replies
6h17m

I understand that completely. Just wanted to give a different viewpoint on that.

I'm all for finding a balance; it's just that many times people are against surveillance that does actually improve security or enforcement but only mildly infringes on their "rights", when in reality they never had privacy in that situation to start with and the use of technology didn't substantially change that.

Youtube being forced to give up personal information based on who viewed a video is something I don't see as an issue. How is this any different from any other website getting the exact same order?

If you are doing something shady, you know how to obfuscate that information. If you aren't, sure, your "privacy" was "violated", but it was violated in a way that was legally allowed, and by law enforcement at that.

Living in a surveillance state where I have no choice but for the government to be able to track every single transaction I make financially and being able to link my cell number amongst other details directly to me, I feel like if I had to try to fight that I would only be causing myself undue anxiety and I've got enough legitimate reasons to be anxious.

kortilla
2 replies
4h30m

and the use of technology didn't substantially change that.

This is complete BS. Technology made it scalable to track where everyone is and query it historically. This used to require tailing someone so it couldn’t be done at scale.

jpc0
1 replies
3h13m

That same technology has also dramatically increased the cost of doing that.

Data isn't free and processing big data isn't cheap. As much as Google has the data, that means they need to store that data.

You know what used to happen before and still happens now? An example: I live in a restricted access area. Restricted in the sense that to get in, some guy needs to take your name and license plate.

For many, many business parks in my country that is still the de facto standard. There isn't really a camera watching that, other than general CCTV that probably doesn't have the resolution to pick up the text on our license plates. It's cheaper for them to literally pay a guy to stand at a boom and get that information than to install the technology required to track it automatically.

Nextgrid
0 replies
2h52m

Data isn't free

The adtech industry made data and its processing not just free (as in more than covered by the ad revenue) but outright profitable.

This is frankly a once-in-a-lifetime gift to the government: we've not only built an unaccountable industrial-grade spying machine, but the government doesn't even have to pay for it, as it pays for itself and incentivizes its own expansion.

KennyBlanken
1 replies
4h37m

"Hunters don't kill the innocent animals - they look for the shifty-eyed ones that are probably the criminal element of their species!

If they're not guilty, why are they running?"

jpc0
0 replies
3h20m

I never said any of that.

What I said is that, on this specific point, a smart criminal won't get caught, and you too can very easily obfuscate that very same data.

partitioned
0 replies
5h59m

Thank you for so eloquently explaining the bootlicking, privacy-doesn't-matter mindset I’ve never understood. Also, sorry that I can’t come up with a less harsh way to say that.

KennyBlanken
0 replies
4h41m

Unless you're wealthy and powerful.

I guarantee the very wealthy or politically powerful have plenty of very-well-hidden cameras surrounding their properties.

Those rules are to keep you from catching and proving the powerful doing something they shouldn't.

zakki
2 replies
3h54m

So, when you're on your own property, a cell tower shouldn't be allowed to let your mobile phone register? Because it will record your IMEI while you are on your private property.

williamcotton
0 replies
3h45m

Yeah but the electro-magnetic spectrum is a limited public good. You don’t own your broadcasted radio waves in the same way you own your house. Your cellphone is a pollutant.

Etherlord87
0 replies
1h27m

Both the radio waves his cell phone emits and the information (voltage changes on an ADSL line or photons moving in an optical fiber) used to communicate over the Internet actually leave his home, and are then recorded. So I think in nature it's the same as sending a letter. So let's symmetrically consider that you send a letter, and the police/an agency asks the post office to attach, to each letter's information (from, to, weight, stamp...), the phone number from their database. If that happens for all letters going through a given sorting room, I can understand how that's an abuse.

scarface_74
0 replies
3h8m

If some person was able to pick me out from a lineup because they physically saw me then that wasn't private and privacy laws don't apply

It’s not an invasion of privacy. But it is a problem for other reasons

https://nobaproject.com/modules/eyewitness-testimony-and-mem....

chgs
0 replies
8h1m

However what happens on my private property is a privacy violation if it is recorded without consent.

And the biggest enablers of violation are things like Ring doorbells and dashcams. There is no recourse in my country; I don’t know about the US.

Governmental and commercial CCTV has checks and balances. Domestic footage just goes onto planet-wide databases with no control.

verisimi
6 replies
11h19m

Privacy has been dead for a long time. The worst part is people don’t care.

A lack of privacy is integral to the technocratic future that has long been planned. I think AI, smart (spy) devices/meters/etc. are part of the control structure, and a de-anonymised internet (why Elon bought Twitter) is part of this same fine-grained control grid.

romafirst3
3 replies
9h27m

You honestly think there is some masterplan? I love a good conspiracy theory, but that is nuts.

Nobody has made a plan, so it hasn't been long planned. Shit is just happening, and we as a society, a culture and a race are adapting to it like we've always done.

verisimi
0 replies
3h47m

Not a masterplan, but definitely guides. I mean, there are all sorts of meetings where the rich and powerful meet and arrange this or that - e.g. Bilderberg, the WEF, the Trilateral Commission, the UN, the WHO, etc. These aren't elected bodies, but somehow all the governments act in tandem according to their pronouncements. It's as if voting doesn't matter, as if it's merely a pressure release valve.

Elsewhere I posted about Technocracy Inc, which Elon's grandfather was involved with (https://newsinteractives.cbc.ca/longform/technocracy-incorpo...). Just two generations later, and you have this guy apparently putting out electric cars, spaceships, neural laces, etc. It's more coherent than you think. Bill Gates's father headed Planned Parenthood.

It seems to me that there is a group of very rich individuals who try to shape policy and direct this or that. It's really a very natural state, I think - we see this everywhere - e.g. a headmaster will direct a school towards this or that, and a CEO does the same in their company. This is the same principle, except at a far higher level. I don't think it's even debatable that this is the case, tbh.

starspangled
0 replies
6h6m

We know of the ruling class regularly getting together to plan, and surely they meet and collude far more often than we know about. Of course they are making plans.

preordained
0 replies
7h24m

Well, I guess we're just naturally evolving and adapting into enslaving ourselves, just following nature's course... There doesn't need to be a "master plan", but clearly there are some working themes

godelski
1 replies
11h7m

You can have privacy and many of these things. You may be interested in homomorphic encryption, or the weaker guarantee of differential privacy. There are also such things as zero-knowledge proofs.
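
To give a flavor of what differential privacy looks like in practice, here is a minimal sketch of the classic Laplace mechanism (the function name and the numbers are mine, purely illustrative):

  import numpy as np

  def noisy_count(true_count: int, epsilon: float) -> float:
      # A counting query has sensitivity 1 (adding or removing one person changes
      # the count by at most 1), so Laplace noise with scale 1/epsilon gives
      # epsilon-differential privacy for the released number.
      return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

  # e.g. publish roughly how many accounts watched a video without exposing anyone
  print(noisy_count(true_count=30_000, epsilon=0.5))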

But I think it is far easier to build these technologies without the encrypted aspects or the privacy protections. Then it is more of a "never let a tragedy go to waste" situation. Paved with good intentions, right? This will always happen, though, as we rush and are unable to do things the right way. Often the right way takes the same amount of time but appears to take longer, and that appearance is enough, even if the wrong way actually takes longer. It also matters at what resolution we're concerned with.

So I'm saying it is mostly stupidity, not evil. Though evil loves and leverages stupidity.

verisimi
0 replies
10h50m

So I'm saying it is mostly stupidity, not evil. Though evil loves and leverages stupidity.

It's been long planned. Look into Technocracy Inc. Zbigniew Brzezinski, co-founder of the Trilateral Commission and former National Security Adviser, wrote about it in Between Two Ages: America's Role in the Technetronic Era:

the increasing availability of biochemical means of human control augment the political scope of consciously chosen direction, and thereby also the pressures to direct, to choose, and to change.

Joshua Haldeman was a leader of Technocracy Incorporated in Canada from 1936 to 1941, but eventually became disillusioned with both the organization and the country, and packed up his young family to start life anew in South Africa.

In June 1971, Haldeman’s daughter Maye gave birth to his first grandson. His name is Elon Musk.

In 2019, Musk tweeted, “accelerating Starship development to build the Martian Technocracy.”

Musk’s estimated net worth today is more than $150 billion US. He’s clearly done very well inside the price system his grandfather would have railed against. But Musk has not completely abandoned his Technocracy roots.

https://newsinteractives.cbc.ca/longform/technocracy-incorpo...

detourdog
4 replies
3h46m

The concept introduced by the Supreme Court regarding pen registers is consistent with all the examples you have given.

Anytime you willingly share data with a third party, the law assumes you aren't keeping it private.

https://en.wikipedia.org/wiki/Pen_register

If you want to keep something private don't share it outside of your house.

Geezus_42
3 replies
3h41m

Except that existing in modern society requires giving immense amounts of personal information for even basic transactions.

morkalork
1 replies
3h3m

It's beyond absurd and desperately needs to be addressed. Too bad both the government and corporations stand to lose so much that I doubt it will be treated seriously.

detourdog
0 replies
1h37m

I personally think that the Apple antitrust case is being pushed due to their privacy stance.

Apple looked at the pen register cases and realized the best position to be in as a third party is to not possess usable data.

The US case, from my point of view, is trying to force Apple to share user data with third parties.

detourdog
0 replies
1h39m

We all have choices to make. I avoid all sorts of things people consider indispensable.

Two examples: not having an Amazon Prime account, and running my own mail server.

skybrian
3 replies
12h16m

This sounds scary, and yet I seem to be unharmed.

sriram_malhar
0 replies
9h59m

Shall we wait on the laws until you personally come to some harm?

ikekkdcjkfke
0 replies
10h15m

One example is criminal gangs buying from data brokers to scam the elderly.

alt227
0 replies
2h3m

Then they came for me. And there was no one left to speak out for me

riedel
2 replies
11h42m

IMHO the problem here is really transparency. There can, IMHO, be situations in which this could be reasonable. But the concrete cases might be questionable, as we are probably not talking about a capital crime.

In Berlin there used to be a notification system for people who had been subjected to cell surveillance. It was recently stopped [0]. IMHO we need the same for all IP assignment or account lookups. The problem is that we, individually, and particularly vulnerable groups like journalists and activists, might be subject to far more of such activities than we know.

[0] https://netzpolitik.org/2024/rolle-rueckwaerts-berlin-beende...

godelski
0 replies
10h8m

I don't think anyone is saying that rights can't be infringed upon for any reason. The issue is that there needs to be sufficient reason. Is this sufficient reason? I think the action would be sufficient reason were it specifically targeted at the individual under suspicion. But a dragnet is not. Those innocent people were not under suspicion and were not doing anything wrong or illegal.

Terr_
0 replies
10h59m

notification system

More generally, imagine if every citizen was entitled to a yearly report on how many times law enforcement received records containing their name or personally identifying information, except in cases that are formally unsolved and in progress.

So a line item might be something like:

    {Ref ID}, {Date}, "All Youtube accounts that watched {Video Title}"

salawat
0 replies
3h0m

Most people don't know. Or if they know, they don't understand the implications. As computer scientists, part of our whole shtick is to try to spread that knowledge far and wide. Most, I hazard, spend precious little time on that particular responsibility.

mistrial9
0 replies
6m

"people do not care" - Please stop repeating this false statement. When you repeat it you give it legitimacy, and take the time when other statements could be made.

Most people are helpless to make change. Greater than one million adults serve in uniform services of some kind where they literally must comply. The ad budgets and massive, overflowing volumes of money generated by "surveillance capitalism" buy the consent of the mercenary finance occupations. None of this means "nobody cares"

andsoitis
0 replies
12h15m

Privacy has been dead for a long time. The worst part is people don’t care.

I would argue “people don’t care” because… there isn’t a high enough number of people who suffer negative consequences from “their privacy being invaded”.

ametrau
0 replies
13h4m

You play right into their hands by being demoralised (and trying to spread that to others)

jameshart
38 replies
13h51m

Blackstone: "It is better that ten guilty persons escape than that one innocent suffer"

So not sure where you got the impression he's okay with up to 100 people being disturbed so we can catch one bad guy.

But then, he wasn't really talking about that was he? Better the guilty go free than the innocent suffer what? He was, essentially, talking about the principle of innocent until proven guilty; that innocent people shouldn't suffer by being punished for a crime unjustly.

2999 innocent people, in your formulation, though, are not being punished for a crime. They're not even being accused of a crime.

AnthonyMouse
33 replies
13h13m

innocent people, in your formulation, though, are not being punished for a crime. They're not even being accused of a crime.

They are, however, being harmed.

It's easier to use historical examples because they're not afflicted with modern politics.

The FBI was known to investigate and harass civil rights leaders during the civil rights movement. Suppose they want to do that today.

Step one, come up with some pretext for why they should get a list of all the people who watched some video. It only has to be strong enough to get the list, not result in a conviction, because the point is only to get the list. Meanwhile the system is designed to punish them for a thin pretext by excluding the evidence when they go to charge someone and their lawyer provides context and an adversarial voice, but since their goal here isn't to gather evidence for a particular investigation, that's no deterrent.

Step two, now that they have the list of people interested in this type of content they can go on a fishing expedition looking for ways to harass them or charge them with unrelated crimes. This harms them, they're innocent people, therefore this should be prevented. Ideally by never recording this type of information to begin with.

There is a reason good librarians are wary about keeping a history of who borrowed a particular book.

echelon
15 replies
12h53m

They are, however, being harmed.

No they're not. Which ones will live a day less of their lives?

If I'm on surveillance footage near a crime scene, police have the right to look for me and question me. This isn't any different. It's just different sets of photons and electrons.

I respect the rights to privacy, but a crime happened, and the police have the tools to investigate. It's barely an inconvenience.

The burden of proof will still be on the investigators and prosecution to find out and show beyond a shadow of a doubt who performed the swatting.

josefx
5 replies
11h52m

No they're not. Which ones will live a day less of their lives?

There are cases like the Madrid bombing where US agencies cast a wide net over possible suspects using data about people who had converted to Islam, and then used a bad fingerprint match (which everyone told them was garbage) to terrorize one suspect for weeks. They had no evidence that the guy was involved, they had no evidence that any of their suspects was involved, but they had a narrative and were happy about every bit of data that supported it. Meanwhile, Spain convicted the actual bombers.

godelski
2 replies
11h28m

What's your point? Someone wrongly being detained/prosecuted/charged/hanged/etc. in the past does not make it right now. It doesn't make it any less wrong then. Nor are the people of a government homogeneous. Especially America. The great American pastime is shitting on America. But if you're going to dismiss a wrong because wrong was done in the past (or even let it slide or be apathetic), that is enabling. Being upset and angry is very different from apathy.

josefx
1 replies
11h14m

My point was that it was wrong. I do not agree with casting wide nets in the hope that someone might fit whatever profile the police or other agencies have pulled out of their asses.

godelski
0 replies
9h50m

Your point came off as whataboutism. This may not be what was intended, but this is how I interpreted it and it appears that others did as well. Thank you for clarifying and I'm sorry we miscommunicated.

andsoitis
1 replies
11h37m

There are cases like the bombing in Madrid where the US agencies cast out a wide net over possible suspects using data about people who converted to Islam and then used a bad finger print match (which everyone told them was garbage) to terrorize one suspect for weeks.

Some hyperbole in your telling of the story and failure to mention that he was awarded restitution. According to Wikipedia:

Brandon Mayfield (born July 15, 1966) is a Muslim-American convert in Washington County, Oregon, who was wrongfully detained in connection with the 2004 Madrid train bombings on the basis of a faulty fingerprint match. On May 6, 2004, the FBI arrested Mayfield as a material witness in connection with the Madrid attacks, and held him for two weeks, before releasing him with a public apology following Spanish authorities identifying another suspect.[1] A United States DOJ internal review later acknowledged serious errors in the FBI investigation. Ensuing lawsuits resulted in a $2 million settlement.

https://en.wikipedia.org/wiki/Brandon_Mayfield

What point are you trying to make with this example?

partitioned
0 replies
5h54m

That it should never have happened?

AnthonyMouse
3 replies
12h34m

Which ones will live a day less of their lives?

The ones who, having had their political inclinations revealed to adversarial law enforcement, then become subject to harassment for those views which should have been private.

If I'm on surveillance footage near a crime scene, police have the right to look for me and question me.

The question is whether they should have the right to seize the surveillance footage by force if the proprietors would rather protect the privacy of their users. The third party doctrine is wrongful.

And given that it exists, so is keeping records like this that can then be seized using it.

The burden of proof will still be on the investigators and prosecution to find out and show beyond a shadow of a doubt who performed the swatting.

This is assuming they're trying to prosecute a particular crime rather than using a crime as a pretext to get a list of names.

And it's about the principle, not the particular case. Suppose a protester commits a crime and now they want a list of all the protesters. Any possibility for harm there?

andsoitis
2 replies
12h7m

The question is whether they should have the right to seize the surveillance footage by force if the proprietors would rather protect the privacy of their users.

If there was a crime committed outside your home and you have surveillance footage that has captured passers-by, you would not offer it to the police, because you would rather protect the privacy of all the anonymous passers-by when one of them is likely the culprit?

That strikes me as highly unlikely. And if you wouldn’t, I am willing to bet that most people would. Why care about the privacy of anonymous passers by when you can help catch the perpetrator and increase safety around your home?

godelski
1 replies
11h34m

If there was a crime committed outside your home and you have surveillance footage that has captured passers by, you would not offer it to the police?

These are not the same. You might think the difference is subtle, but I'll tell you that that subtlety matters. And matters a lot.

And tbh, these two scenarios are quite different.

jameshart
0 replies
3h48m

I think the analogy is rather strong. Where does it differ?

godelski
2 replies
11h36m

This isn't any different. It's just different sets of photons and electrons.

And a dictator is just another set of cells and organic compounds? You can't break things down into this, because then literally everything is the same. Literally everything you see is just a different set of photons and electrons. But those things have real effects. They aren't fungible. I don't care that my partner sees pictures of me naked, but I sure do care if cops or "the government" see them, despite it being "just a different set of photons and electrons."

The burden of proof will still be on the investigators and prosecution to find out and show beyond a shadow of a doubt who performed the swatting.

The burden of proof is step by step. I don't think I should have to cite the 4th Amendment but

  The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and ***no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.***
The setup was to treat the government as an adversary. One needs to understand positive rights vs. negative rights[0]. Obviously rights are not infinite, but there should be friction. It doesn't matter if the thing is seemingly innocent or inconsequential; what matters is power. Perception shifts and creeps, and this is why people take a stand at what might seem trivial. [1]

[0] https://en.wikipedia.org/wiki/Negative_and_positive_rights

[1] https://encyclopedia.ushmm.org/content/en/article/martin-nie...

jll29
0 replies
5h12m

The setup was to treat the government as an adversary. Needing to understand positive rights vs negative rights

+1 on citing the constitution's wisdom of treating the government itself as adversarial, due to the enormous power it has.

+1 on pointing at the difference between positive and negative rights in this context.

echelon
0 replies
40m

There was a swatting incident and they have a time window where they'd like to corroborate IP logs.

An IP address is no different from a license plate on camera. It's a lead and the evidence was gathered at a crime scene. Nobody's home is being entered into. Nobody's iCloud account is being unlocked and ransacked. Gathering these logs alone won't lead to those things happening either.

I'm all for limits on power, but this seems to be entirely reasonable. This isn't a fishing expedition. IANAL, but I don't see how the 4th would be violated with either a court order or willing third party handing over the logs.

If the investigators get the IP logs, they shouldn't then be able to take those logs and ask the ISP for everything that those people were doing. The burden will be on the investigators to find more evidence linking one of those IPs to the call.

More crime will happen digitally year by year. Swatting has already entered the public consciousness. Just wait until people start strapping bombs to FPV drones or calling grandma with your voice.

We shouldn't stop at the software stack as some kind of impenetrable legal barrier that shrouds investigation. We should respect and enhance limits on power, but we also need to modernize the judicial tools to tackle the new reality.

The framers couldn't have imagined "swatting". The law needs to understand this. It should provide scoped-down investigatory tools that simultaneously guard and respect our constitutional rights and privacy. Access to anything beyond the scope of an actual crime that took place should be restricted.

akira2501
1 replies
12h43m

Which ones will live a day less of their lives?

Your liberties encompass so much more than this, and a government that treads on them recklessly does far more damage than simply wasting an individual's time.

It's barely an inconvenience.

You assume it's not. How would you verify this? Why should you have to?

echelon
0 replies
59m

Your liberties encompass so much more than this

You don't have a liberty from being investigated if they have evidence. They're not snooping around in your home without cause. The swatter was watching the live stream, and the timestamped IP logs can corroborate.

Just because it was an IP address and not a face or license plate on camera doesn't make it any different. You can't hide behind a chosen technology stack as a shield when the fundamentals of the case are the same.

dllthomas
15 replies
12h36m

Surely you are not contending that Blackstone was of the position that no innocent person should be investigated, however briefly, unless it results in at least 10 convictions.

I very much agree that (some, probably minimal) harm is being done to these people. Pretending that they "suffer" in the sense Blackstone was using the word is disingenuous.

AnthonyMouse
9 replies
12h21m

Being investigated is a red herring. The problem is from the other end. Your premise is that a person being investigated when they're innocent of the original crime is basically harmless because the investigation will come to naught. The actual issue is that if they can find a pretext to get a list of all of the people who viewed some content they don't approve of, now they have a list of targets with which to play "bring me the man and I'll find you the crime" and that is a harm in need of preventing.

dllthomas
4 replies
11h43m

Your premise is that a person being investigated when they're innocent of the original crime is basically harmless because the investigation will come to naught.

Not at all. I would say that it's usually (not always!) small in the particulars but adds up in aggregate, and that we should be a lot more careful with how much surveillance we allow.

I just would also say that the kinds or amounts of harm being done there are manifestly not what Blackstone was talking about in his "formulation" as it leads immediately to absurd conclusions that go very well past the present case.

I will note here that "there is a concern here analogous to Blackstone's ratio" is a different thing than, paraphrasing what was upthread, "this is substantially more extreme than Blackstone's ratio should forbid".

And in case I haven't said it in thread anywhere, I share concerns about surveillance. I just think if we are enlisting support from historical figures, we should find a quote where they're talking about the question or acknowledge the distance, rather than pretending the quote means something it didn't - that will only turn off those who might be persuaded.

AnthonyMouse
3 replies
11h2m

"there is a concern here analogous to Blackstone's ratio" is a different thing than, paraphrasing what was up thread, "this is substantially more extreme than Blackstone's ratio should forbid".

I agree with this. What's happening here is different than the scenario in the original ratio, even though it's a similar concern.

I just would also say that the kinds or amounts of harm being done there are manifestly not what Blackstone was talking about in his "formulation" as it leads immediately to absurd conclusions that go very well past the present case.

If we direct ourselves to the case at hand, I'm not sure that a general rule that the government can't compel innocent bystanders to assist an investigation against their will would even be a net negative, much less cause serious problems. When a crime is committed people will generally be inclined to help bring the perpetrators to justice, because who wants thieves and murderers and so on going unpunished? Whereas if someone is disinclined to help, we might consider that they could have a reason, e.g. because the law being enforced is unjust or they believe the investigation is not being conducted in good faith, or they simply don't trust the government with the information, at which point the ability to refuse acts as a reasonable check on government power.

I just think if we are enlisting support from historical figures, we should find a quote where they're talking about the question or acknowledge the distance, rather than pretending the quote means something it didn't - that will only turn off those who might be persuaded.

I feel like historical quotes tend to detract from discussions in general, because they're effectively an appeal to authority and then the discussion turns to exactly where we are now, debating whether the current situation can be distinguished from the original, which is a separate matter from whether what's happening in the modern case is reasonable or satisfactory in its own right.

godelski
2 replies
10h19m

What's happening here is different than the scenario in the original ratio, even though it's a similar concern.

Correct me if I'm wrong, but I'm pretty sure Blackstone wrote about negative or natural rights.

In fact, let me pull out more context around the exact quote. He specifically addresses direct punishment, but immediately after comes the burden of having to defend one's innocence. Which is exactly the case here.

  Fourthly, all presumptive evidence of felony should be admitted cautiously, for the law holds that ***it is better that ten guilty persons escape than that one innocent suffer.*** And Sir Matthew Hale in particular lays down two rules most prudent and necessary to be observed: 1. Never to convict a man for stealing the goods of a person unknown, merely because he will give no account how he came by them, unless an actual felony be proved of such goods; and, 2. Never to convict any person of murder or manslaughter till at least the body be found dead; on account of two instances he mentions where persons were executed for the murder of others who were then alive but missing.

  Lastly, it was an antient and commonly-received practice that as counsel was not allowed to any prisoner accused of a capital crime, so neither should he be suffered to exculpate himself by the testimony of any witnesses.
I would not be surprised if Blackstone found the act of investigation without the qualification of sufficient suspicion as gross injustice and directly relevant to his intent. As this is a less inconvenient version of locking everyone in a room and interviewing them checking their pockets for stolen goods before they leave. The negative or god given right of innocence is innate. The punishment is the accusation and search, which is an explicit infringement on the natural right. Yes, rights can be infringed upon, but not without due cause and not simply because one is in a position of authority.

I know that this is a point of contention in this (these) discussions, but I stand by that a right is being violated and harm is being done by the simple act of investigation. Mass surveillance (which is mass investigation), is an infringement on our god given rights. The point is to have friction for the infringement of rights. All rights can be violated, but they must need sufficient reason. It does not matter if these rights seem inconsequential or not. Because at the end of the day, that is a matter of opinion and perspective. Blackstone was writing about authoritarian governments and the birth of America was similarly founded on the idea of treating government as an adversary. These were all part of the same conversation, and they were happening at the same time.

I do not think I am taking the historical quote out of context. I think it is more in context than most realize. But I'm neither a historian nor a lawyer, so maybe there is additional context I am missing. But as far as I can tell, this is all related and we should not be distinguishing investigation (or from the other side of the same coin, exculpation) from punishment as these are in the same concept of reducing one's rights. They are just a matter of degree.

https://oll.libertyfund.org/titles/sharswood-commentaries-on...

AnthonyMouse
1 replies
9h34m

He specifically addresses direct punishment but immediately after is the nature of having the duty to defend one's innocence.

The issue is that the ratio can't mean much outside the realm of a criminal conviction when any of the rest of it would need a different standard.

Suppose we want to evaluate if it's reasonable for the police to search your residence for a murder weapon. Should we let 100 guilty people go free to avoid one search of an innocent person? That's probably not right, a search is enough of an imposition to require probable cause, but if you had to prove the crime to the same level as would be necessary for a conviction in order to get a warrant then searches would always be superfluous because they could only happen in cases where guilt is already fully established without the results of the search.

Conversely, with this YouTube kind of situation where the police want data on large numbers of people, the majority of whom are fully expected to be innocent, they're not even reaching probable cause for those people. Which is a lesser standard for justifiable reasons but it's still not one which is being met for those people. And so it's still a problem, but it's a different problem with a different standard.

godelski
0 replies
9h25m

I find this interpretation odd. I do not see the numbers as meaningful in a literal sense but rather in a means of making a point and a grounding for the surrounding abstraction. I think the point is to explicitly discuss these bounds and view them as a spectrum. To think of them in the abstract but to push back against authority.

Certainly Blackstone was not saying that infringement of rights (punishment) should not happen under any circumstance. Rather that there should be significant friction and that we should take great care to ensure that this is not eroded.

bowsamic
3 replies
11h18m

This is true of literally any investigation though

AnthonyMouse
2 replies
11h1m

It isn't. If the police are investigating John Smith and they get a warrant for the files of John Smith then they don't also get the files of anybody else along with them.

jameshart
0 replies
3h50m

How do police find out that John Smith is the person whose files they want to get a warrant for?

Maybe because John Smith was one of only eleven people who signed in to a building on the day a crime took place, and he signed out right after the crime happened.

But should the police not look at the sign-in sheet at the building because that will infringe the privacy of ten innocent people?

godelski
0 replies
10h3m

And importantly, there has to be sufficient reason for investigating John Smith. It can't be arbitrary (he looks funny, has a limp, is black, is gay, plays Doom, is a Muslim, etc). Rights can be infringed, but they need reason. And they need good reason.

godelski
4 replies
12h25m

I would contend that Blackstone was of the position that no innocent person who was not of sufficient suspicion of committing a crime should be investigated.

These are innocent bystanders. There is nothing suspicious about their activities other than they did something that a suspected criminal did. A perfectly legal activity? To take this to the ridiculous side, are we going to investigate everyone who took a poop at a specific time because a criminal did?

https://www.youtube.com/watch?v=DJklHwoYgBQ

dllthomas
1 replies
11h56m

Blackstone was of the position that no innocent person who was not of sufficient suspicion of committing a crime should be investigated

I expect so. But pretending that's what he was talking about in the quote you were referencing is going to undermine your (our, probably) position with those not already convinced.

godelski
0 replies
9h56m

I'm not convinced I am taking him out of context[0]. Was Blackstone not also discussing natural rights? I see him as viewing punishments as infringements on one's rights. As a spectrum. And those rights even include the simple aspect of the presumption of innocence. My best understanding is that so much of this is literally about the mundane and simple. Because natural rights are... well... natural. They are things we have until we don't. That's why they are called negative rights: they need to be removed, not given. Punishments (infringements) can range from extremely minor to major. But they are still one and the same, because it is about the concept in the abstract. Or rather, in generalized form.

As far as I can tell, this is explicitly within the context of the quote.

That said, I do see your point and appreciate your feedback. Maybe this can be an opportunity to turn this into a lesson? It seems too cumbersome to bring up from the get-go and similarly backfire. But discussing in the abstract is a difficult task when it is neither a natural way of thinking nor is it a common way that is taught. But I still think it is an important tool and something that does distinguish humanity. I am open to suggestions (I think HN of all places is more likely to be successful when discussing things in the abstract, but it is still general public).

[0] https://news.ycombinator.com/item?id=39798280

TeMPOraL
1 replies
11h29m

Wait. Aren't "innocent bystanders" literally the first people the police wants to get ahold of, to interview and yes, potentially investigate if there's something off? People don't spontaneously become suspects, as if by radioactive decay; some degree of investigation comes first and is what turns "innocent rando" into a suspect.

godelski
0 replies
10h55m

Aren't "innocent bystanders" literally the first people the police wants to get ahold of, to interview

Yes. But that's not the same

potentially investigate if there's something off?

If you're asking for information from people who witnessed a crime *and they are volunteering information* (which is not investigating those people and not accusing them of a crime, nor is declining to volunteer information a suspicious activity), and they then turn up suspicious evidence, then yes, that creates grounds for investigation. It is true that things are not static, time exists, and entropy marches on.

That's the difference. There is nothing that these people did that warrants suspicion. These people are not being asked or questioned. This was not done voluntarily. They didn't even know this was happening to them. This was a thing imposed upon them, full stop.

I want to give a scenario to help make things clear. Suppose I send nudes to my partner. The government intercepts these without my knowledge, looks at them, deletes them, and literally nothing else happens. Is this okay? I did not know this happened to me. No "harm" has befallen me. And as far as I know, nothing has changed in my life. But then later I find out this happened. Let's say 20 years later. I feel upset. Do you not think I am justified in being upset? I think I am. My rights were violated. It is worse that it was done in secrecy, because it is difficult for me to seek justice. It is because I have the right to privacy. It is a natural, de facto, negative, god-given right. They put my information at risk by simply intercepting it and making a copy. It was unnecessary and unjustified.

jameshart
0 replies
3h59m

In your story, the injustices are 1) the police going on a fishing expedition, and 2) the police using the data gained through an investigation to unjustly harass people. Those are bad things and we should have laws to prevent that and punish people who do so.

I agree it would be bad if they were making the request in furtherance of a conspiracy to do either of those things.

But the police asking Google for a list of people who viewed a video is in itself not one of those things. It's similar to them asking a business owner whose business has a camera overlooking a street near a crime scene to hand over surveillance footage (which will include innocent passers-by), or asking a business that sells a product known to have been used by a criminal to provide a list of purchasers of that product (which will include innocent purchasers).

Many such businesses will voluntarily hand over such information to assist with an enquiry. Some businesses might refuse, or might choose not to have such information.

And this is why judges are involved in the process of issuing warrants and grand juries in the process of issuing subpoenas when the police or a prosecutor want to compel the production of evidence of that sort.

But it just seems inevitable that, at the beginning of an investigation into a crime where the perpetrator is unknown, the first step is to identify possible suspects; by definition not all of the people so identified will end up being investigated. How are the police to do that if they can’t ask anyone for information that might bring innocent people’s names to their attention?

I appreciate it seems idealistic maybe, but it feels to me that we need rules that ensure ‘coming to the attention of the police in the course of an investigation’ is genuinely harmless; not rules that assume it automatically exposes you to harm.

godelski
3 replies
13h37m

Well I forgot to link but from the wiki

  Other commentators have echoed the principle. Benjamin Franklin stated it as: "it is better 100 guilty Persons should escape than that one innocent Person should suffer"
I went with Franklin because we are specifically talking about America, but let's be honest, the number doesn't matter and it seems you agree. Let's focus on that. Because I'm 100% with you: these aren't even people who have been accused. And even those accused have rights.

https://en.wikipedia.org/wiki/Blackstone%27s_ratio

jameshart
2 replies
13h27m

You've still got the ratio backwards. Franklin says if you know 101 people watched a video, and 100 of them are guilty of a crime, you can't just round up all 101 and throw them in jail. I.e., if you have a standard of punishment that would convict even one innocent person for every 100 guilty people it catches, it's not a good standard.

Which I think we can all agree with.

But that's not what's happening here, is it?

godelski
0 replies
13h11m

you can't just round up all 101 and throw them in jail

Yes. "100 people" (or whatever) had their rights violated. Sure, not as bad as jail, but it is still in the spirit. I'm not sure why you think I have it backwards, I think we're just using different perspectives.

But I'm not into being pedantic if we understand one another.

chii
0 replies
13h14m

so it is considered punishment to have the watch information revealed to the state?

okr
3 replies
11h55m

If it is an order by a court, then I think it is OK. Then it is not mass surveillance, and for solving a crime it is useful.

I wonder what kind of video it is. Maybe a shared link, so only people who secretly knew about it could watch it, and they have become suspects. Is it mentioned in the Forbes article?

And I wonder if people abuse videos on YouTube by encrypting the content with a key that is then shared separately.

wolverine876
2 replies
10h44m

If it is an order by a court, then i think it is ok. Then it is no mass surveillance and for solving a crime it is useful.

Why can't a court order be mass surveillance? In these cases, the videos were viewed 30,000 times and more than 130,000 times (if I understand the latter correctly). How is that not mass? Nobody suggests that more than a few of those people are suspects.

okr
1 replies
8h55m

My understanding of mass surveillance is that the masses are surveilled. :) But here it's a court that allows extracting log data for a specific case, and it happened after the fact.

I draw a distinction between leaving loggable traces of living (which we leave all the time, no matter what) and occasionally filtering them to reconstruct the past.

jll29
0 replies
5h7m

That's right: even if the sample size is N=30,000, it is still a one-time, point-in-time event controlled/approved by the proper legal authority. There will be an audit trail of said approval, and the process will be documented.

In contrast mass surveillance is just "oh, we have a BIG database, and we query whenever for whatever purpose, and nobody knows who searched for what and when and why, and nobody EXTERNAL TO THE AGENCY needs to approve it (lack of control). And today, Bob, who works for the police, background-searched his new girlfriend as well."

CobrastanJorji
3 replies
13h58m

I agree that it's egregious, but Blackstone was talking about punishment, especially relating to execution, and I think perhaps the ratio can be adjusted significantly downward when the cost to the innocent is much lower. Otherwise, the only reasonable search would be one where a government official is already certain of your guilt.

godelski
2 replies
13h7m

I'd consider your rights being violated "punishment."

Blackstone was talking in the abstract. Clearly Franklin was too considering many of the other things he's known for saying.

andsoitis
1 replies
12h1m

I'd consider your rights being violated "punishment."

You are wrong. Punishment is when you impose a penalty as retribution for an offense.

godelski
0 replies
11h21m

You can call me wrong, but maybe first ensure you understand negative rights.

I understand why you think I'm wrong, but I hope you understand why I think that way. We can disagree, and that is fine, but let's not act as if there are objective answers in social constructs.

Because, I do think punishment is an imposed penalty. In this case, on your rights. Rights are abstract, and these are not binary nor clearly discrete. House arrest is not jail, nor are fines. But as communicated to you elsewhere, the 4th amendment is about ensuring friction for removing someone's negative rights.

But I disagree that punishment is only a penalty imposed as retribution for an offense. You imply that this requires an actual offense to have been committed. I assure you that punishment can be imposed for any arbitrary reason. I can also assure you that punishment is a spectrum, from extremely minor (as I think we'd agree it is in this case) to extremely harsh.

vages
1 replies
13h58m

Did you forget the link to Blackstone?

changoplatanero
1 replies
14h23m

The second one seems a lot more narrow and more legitimate

dirtyhippiefree
0 replies
14h7m

I remember Larry Ellison (Oracle) in the news saying that Privacy Is Dead a quarter century ago; long enough ago that a 2018 article merely referred to the quote as famous, no date needed. That article was written six years ago...

https://securitycurrent.com/privacy-is-dead-long-live-privac...

naasking
0 replies
13h4m

Supposing every person watched that video 10 times AND supposing the target was one of the viewers (it really isn't clear that this is true), that's 2999 people who have had their rights violated to search for one

I think whether their rights are violated depends entirely on what sort of information is handed over. Consider acquiring surveillance footage that has plenty of foot traffic, but a suspect is known to have passed by. The police are typically permitted to review that footage even though plenty of innocent people were captured on that video.

mistermann
0 replies
3h30m

I don't think any of this appears legitimate.

It isn't.

Democracy is fake.

Our justice system is fake.

Everything is fake.

(Where "fake" = "not what they are advertised or perceived to be.)

If all of the same things were occurring in another country, or even better: in a video game, you would have little difficulty and zero aversion to accepting these facts.

However: put a person into these things, and the brain malfunctions.

If you are a reasonably normal person, your mind will now be filled with objections to this proposition, reasons why I am incorrect. But if you were to state those objections, I can punch holes in every single one of them without even breaking a sweat.

We live in a literal simulation, but not the kind that everyone has been hypnotized to believe is the only kind possible - have you ever noticed that when the notion of simulation theory comes up, it is always The simulation theory (Nick Bostrom's)?

This is a pretty neat trick eh? And there seems to be nothing that can be done about it, because people will fight tooth and nail (using Meme Magic, aka "The Facts", "The Reality", etc) against being extracted.

Thankfully, it is simultaneously hilarious. Well, except the part where millions of children are dying, but nobody cares....but oh boy when a big scary "pandemic" comes along, pull out all the stops.

I hope that there is a Hell, because I would like to see every single member of this despicable 21st century society end up there some day. Seeing justice finally being served for once would be worth suffering for eternity.

lolinder
0 replies
4h27m

OP explicitly agrees with you that the 30k is illegitimate, but that's the only one you address. What's your take on the one where the police became aware that they were being watched on a YouTube livestream while responding to a bomb threat and obtained a court order asking for information about who was viewing a set of local livestreams at the specific times where they were searching for the bomb?

What makes that one different than a court order demanding that a business release security footage that covers the scene of a crime for the time window in which the crime occurred? Or would you consider such a court order to also be illegitimate?

iteygib
0 replies
5h51m

Any kind of search can be deemed constitutional if it goes through a warrant process, which is the point of warrants. This story is less about how the information was taken and more about whether or not the warrant process and 4th Amendment rights were properly followed.

This would then be mixed in with the question of whether or not new forms of data (like video views) would equate to previous forms of similar data searches that police have obtained warrants for (like reviewing CCTV).

ChuckMcM
6 replies
14h17m

I certainly concur with this.

On the one hand, narrow warrants that reveal a lot of people (a classic example is warrants on motels to provide the names of everyone who checked in, or was registered, on a certain date) are certainly constitutional and have been upheld many times.

On the other hand, the first one seems odd.

godelski
3 replies
13h5m

  In a just-unsealed case from Kentucky reviewed by Forbes, undercover cops sought to identify the individual behind the online moniker “elonmuskwhm,” who they suspect of selling bitcoin for cash, potentially running afoul of money laundering laws and rules around unlicensed money transmitting.

  In conversations with the user in early January, undercover agents sent links of YouTube tutorials for mapping via drones and augmented reality software, then asked Google for information on who had viewed the videos, which collectively have been watched over 30,000 times.
This is the first case. This doesn't seem that narrow to me.

AnthonyMouse
2 replies
12h43m

who they suspect of selling bitcoin for cash, potentially running afoul of money laundering laws and rules around unlicensed money transmitting.

Wait, what? So is Bitcoin illegal to use as a currency now? Special casing exchanges for cash seems completely pointless if you could just buy some of <any commodity> for cash and then turn around and sell it back to the same person for the same amount in Bitcoin, but if every customer has to do KYC of the merchant when they're paying with Bitcoin, how is that ever going to be feasible?

godelski
1 replies
11h15m

I think a charitable interpretation is that they were suspicious of the money being used illegally. But I'm not sure there's enough information here to make this clear. Because clearly there's misinformation being spread. Claiming bitcoin is anonymous...

initplus
1 replies
13h50m

The first is a somewhat clever attempt to unmask someone an undercover investigator was already talking to. Police should have narrowed the scope of the warrant by only asking for data on viewers within a narrow window after they sent the link.

Even better might have been to directly link to some service that they already control on a honeypot URL, and then gone after the ISP for customer details.
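Purely as an illustration of that idea (not anything from the case), here is a minimal sketch of what such a honeypot link server could look like on the investigator's side: a unique per-recipient token, a log of whoever fetches it, and a redirect to the real destination. Every name here (the token, the port, the destination URL, the log file) is a made-up placeholder.

  from http.server import BaseHTTPRequestHandler, HTTPServer
  from datetime import datetime, timezone

  TOKEN = "a1b2c3"  # hypothetical: unique per recipient, so a hit identifies who clicked
  DESTINATION = "https://example.com/real-page"  # hypothetical final destination

  class HoneypotHandler(BaseHTTPRequestHandler):
      def do_GET(self):
          if self.path == f"/v/{TOKEN}":
              # log the visitor's timestamp, IP and user agent, then redirect
              with open("hits.log", "a") as f:
                  f.write(f"{datetime.now(timezone.utc).isoformat()} "
                          f"{self.client_address[0]} "
                          f"{self.headers.get('User-Agent', '')}\n")
              self.send_response(302)
              self.send_header("Location", DESTINATION)
              self.end_headers()
          else:
              self.send_response(404)
              self.end_headers()

  if __name__ == "__main__":
      HTTPServer(("0.0.0.0", 8080), HoneypotHandler).serve_forever()

With a per-recipient token like this, a single hit in the log already ties a click to the person the link was sent to, which is exactly why a much narrower request would have sufficed.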

ametrau
0 replies
13h2m

Nah actually pretty dumb overall. And sending an open link when a private one looks the same is even more dumb.

phire
2 replies
10h4m

> The first one where the police uploaded videos and wanted viewer information is absolutely egregious and makes me wonder how a court could authorize that.

The police didn't upload the videos. It's not entrapment, and it doesn't sound like the actual content of the videos is illegal.

Instead, they had an open communication channel with their target and were able to send them various links to youtube videos.

Their theory being if they can find any user who clicked on all (or most of) those links, it's probably their target. And it's unlikely some random user would have accidentally viewed all those videos.

The actual request for the raw list of all viewers seems unconstitutional to me. Too broad; it gives the police a lot of information about all users who watched just one of the videos. But I suspect a much narrower request, where Google identified the target user and passed just that user's info on, would be constitutional.

csomar
1 replies
7h27m

But I suspect a much narrower request, where Google identified the target user and passed just that user's info on, would be constitutional.

Isn't that worse? Essentially making Google do the job of the police and the police having to trust the work of Google for it.

phire
0 replies
1h21m

I don't see any problem with trust.

The police will still get the exact same raw data of that one target user. The change just means that they won't get any data on other users.

supposemaybe
0 replies
7h28m

If you see a YT video which is remotely dodgy, don’t watch it… It could very well have been planted there as bait.

And it’s great that Google is trying to fight back, a little. Though I wonder whether they’d do the same for us non-American Brits (doubtful).

nonethewiser
65 replies
15h44m

I'm increasingly coming to the opinion that anonymity isn't guaranteed, so you should assume everyone knows what you do and who you are. So you should probably just use your real name and do way less online.

Haven't fully swallowed this pill, but it's feeling inevitable.

rapind
20 replies
15h39m

100%. We probably shouldn’t protest or even discuss non-conforming ideas. Just agree with the current rulers on all things to be safe. Also make sure to vote for the right leaders because who knows how long that’ll remain private.

scubbo
10 replies
15h16m

Not sure if you're being downvoted because people think you're serious, or because they dislike sarcasm.

globalnode
5 replies
15h11m

I've seen sarcasm downvoted here before; it's usually a literal crowd here on HN.

Hnrobert42
4 replies
15h3m

A “literal crowd” sounds mildly pejorative. I think it’s more that HN prefers productive, rational discussions. Sarcasm is passive-aggressive and a more circuitous route to the point than a literal one. Last, sarcasm isn’t usually even funny. When it is, it’s only funny to those who already agree with the point.

uoaei
1 replies
14h53m

Critical feedback is still actionable. We don't need to guard adults from hearing difficult things and feeling difficult emotions.

c0pium
0 replies
12h5m

That’s not what’s being suggested at all though. The problem with sarcasm is it’s boring but feels oh so clever to the person who writes it.

ametrau
1 replies
12h57m

Let’s say block headed and pretty dumb.

Hnrobert42
0 replies
7h46m

Name calling is rarely useful. It certainly is not in this case, in addition to being patently false.

JKCalhoun
1 replies
14h31m

I get that it is sarcasm, but disagree with the implication that the OP was suggesting complete surrender.

If you don't like it, walk away, seems reasonable to me. We don't own these corporate web sites and can only vote with our eyeballs (so to speak).

rapind
0 replies
14h19m

I thought (still do) OP was being sarcastic too and I was just playing into it.

rapind
0 replies
14h26m

Anyone thinking I was serious is a canary in the privacy coal mine.

That being said, I suspect it was just an unfortunate use of words (current / right leaders) that might lead some people to think I was being politically tribal. (nothing could be further from the truth)

Brian_K_White
0 replies
10h42m

I downvoted because to me it tried to say what someone else should not talk about.

I don't disagree that wrong things should not be tolerated and that giving up and accepting is no answer.

Whenever someone tries to tell a complainer to shut up, I frequently point out that in the entire history of the Earth, not one thing ever got better by accepting things as they are. It's one of my favorite things to point out. So I'm very much in the reject giving up camp.

But I don't think it's necessarily giving up or cooperating to merely explore any and all other possible solutions to any given problem, and that comment struck me that way.

My impression might be unjust, and by disclosing it I may take a few arrows myself, but for once, a downvote is explained. :)

naruhodo
4 replies
14h21m

Because of the possibility that the leaders will change - or change their opinions - in the future, the only safe course of action is to express no opinion whatsoever.

janice1999
1 replies
5h39m

Your reading habits, TV show preferences, etc. already reveal your political beliefs, and they have already been categorised by advertisers like Google/Facebook/Apple/Microsoft and sold to countless data brokers and government agencies.

egeozcan
0 replies
4h36m

You don't have to read, and you don't have to watch TV. However, that also says a lot about someone. There's no escape.

rapind
0 replies
14h5m

I'm on to you sir... trying to fix reddit.

LAC-Tech
0 replies
13h22m

Can't do that either, "your silence speaks volumes", etc etc

zeroCalories
0 replies
14h27m

You won't dare post anything actually transgressive.

dogman144
0 replies
2h26m

I mean do all that but don’t tweet it and leave your phone at home. That is the issue.

Brian_K_White
0 replies
11h1m

We also, apparently, shouldn't even try to discuss and figure out any other possible approaches or responses to any given problem that might exist.

We don't care if this wall might possibly be easy to simply walk around and obviate, we shouldn't even look, or even talk about looking. The only rational way to attack any problem is to just look exactly in the direction you were led to look, bang your head on that same spot forever.

2OEH8eoCRo0
0 replies
2h30m

Or go back to how things were: Keep it to yourself and discuss spicy topics among close friends. Friends assume good faith. People on the internet tend to assume the worst interpretation possible and don't give any benefit of the doubt.

godelski
15 replies
14h40m

We're on a tech forum known to have some of the best and brightest and visited by tech giants. If anyone can solve this problem, it is us. If we are the ones giving up, then who is there to make things right?

As I see it, our only choice is to make privacy and anonymity trivial. Not for techies, but for our tech illiterate grandparents. Push hard for tools like Signal where people can get encryption without having to think about encryption. People want privacy and security but they just don't know how or don't understand what leaks data. But there's the clear irony that the sector __we__ are critical to is the one who is creating this problem.

I'm not ready to swallow that pill. I'm unconvinced we have to. Clearly __we__ can do something about this, even if that is just refusing to build such things, let alone building defenses. Apathy is no different from supporting this authoritarian takeover, because that's what it is. Authoritarian creep.

andsoitis
3 replies
11h47m

We're on a tech forum known to have some of the best and brightest and visited by tech giants. If anyone can solve this problem, it is us. If we are the ones giving up, then who is there to make things right?

You think the world’s geniuses are hanging out here? The world’s brightest are here, and you’re going to inspire them to solve what you frame as a very high priority? There are much bigger problems to solve.

I really think your vanity is warping your perspective.

skidd0
0 replies
4h0m

The privacy of the world's populace sounds like a pretty big problem to me considering the damage that can be caused by that information getting into the wrong hands.

sandspar
0 replies
11h8m

There are many extremely intelligent people who post here. There may not be any Gandalf the Greys here, but there are dozens of tribal shamans.

godelski
0 replies
9h35m

You think the world’s geniuses are hanging out here?

Maybe. But they at least frequent here.

I really think your vanity is warping your perspective.

I think you undervalue yourself. I don't see myself as a big cog, but neither am I deluded into believing that being a cog in a much larger and more complex machine means I have little to no importance. Lesser, but non-zero. Were I to have the vanity you suspect, I would not be calling for your support; I would use my ego to solve it alone. But I am not. I can't do this alone. Nor am I drumming up people to collect wood and assign tasks; I am trying to help people find a longing for the endless immensity of the sea. I am trying to help us realize we aren't inconsequential and that together, we have meaningful power. The big cog may be shiny and may have a lot more power, but it is still supported by a thousand smaller ones.

I'm under no illusion about it: people here work for Google, Meta, Apple, Amazon, Microsoft, and so on. Do you really think differently?

bdw5204
2 replies
13h59m

Any truly reliable privacy and anonymity tool that isn't created by the government will probably be made illegal by the government. Failing that, using it will make you a target of the government's security apparatus. If you create a cryptocurrency that can't be traced[0] or an anonymous marketplace where people can buy and sell anything they want[1], you're going to end up on the wrong end of US government trade sanctions or US drug laws. Running a Tor exit node gets your IP address blocked by much of the internet and can even get you a visit from the FBI[2]. Tor itself only exists because it was created by the US Navy as a tool for dissidents in dictatorships to be able to access the internet.

The only way to solve the problem would be to elect politicians who would either dismantle most of the surveillance system or address crime and terrorism so decisively that there was no longer any plausible threat to justify continuing to maintain a mass surveillance apparatus in which case it would (hopefully) eventually wither away as part of budget cuts once politicians forget why it was even "necessary" in the first place. There is no solution to political problems without obtaining and using political power to solve them.

The strategy of eliminating the system's justification isn't foolproof, though, because the bureaucracy that runs the military draft (Selective Service) somehow still exists even though the draft ended around half a century ago and is almost certainly never coming back. Politicians only noticed it existed a few years ago, just long enough to debate whether to extend the wrong of draft registration to women in addition to men. The eventual decision was to leave the status quo intact[3]. The sensible option of abolishing that relic of a past rights violation, rather than continuing to waste money on maintaining the bureaucracy, was not seriously considered. That means the direct route is almost certainly the better approach.

[0]: https://www.theregister.com/2022/08/10/github_tornado_cookie...

[1]: https://en.wikipedia.org/wiki/Silk_Road_(marketplace)

[2]: https://www.reddit.com/r/TOR/comments/rjgq8s/ok_so_what_has_...

[3]: https://www.politico.com/news/2021/12/06/ndaa-women-draft-dr...

knightoffaith
1 replies
13h20m

cryptocurrency that can't be traced

Monero is still going strong, as far as I know.

zeroCalories
1 replies
14h30m

Solving these issues won't make you much money, and anyone that gets close will invite more heat than the center of the sun. Better to divest. Keep an email and phone for essential services like banking but avoid all other activity.

godelski
0 replies
9h33m

Do you build tech for the money? It is not why I do it. Yes, I need to earn a living. But it is exactly that. What is necessary for living. What is the point of earning money if it is not to better our lives? Why is money the only way we can improve our lives?

tredre3
1 replies
13h13m

    If anyone can solve this problem, it is us.
People on this forum (including myself) are the ones creating the tools that enabled this problem.

Any tech we create to "solve" this issue will be worked around and/or used to cause more problems.

Tech isn't the solution.

godelski
0 replies
9h42m

You're right that tech isn't the solution, but it also is. A hammer is part of the toolset for solving homelessness. It can also be used to create homelessness. We can build homes or tear them down. Hell, we can even smack someone on the side of the head with one.

Tech is too abstracted, and we must concentrate on the application. There is time for abstraction and time for specification. Tech is used to extract information as well as tech is used to protect information. These are actions, not objects or attributes.

And yes, it isn't the only tool in the toolbox. But it is a tool everyone here shares in common. It is a tool that many here are using to create this problem. One that many are probably not even aware that they are contributing to. But due to the commonality of our community and the commonality in its usage to create or exacerbate the problem, it is worth mentioning and considering.

Don't pass the buck. There are no singular causes nor solutions. So if we dismiss something because it is incomplete, we will never create any solution.

richrichie
0 replies
9h16m

The so-called best and brightest of this forum are more likely to be censorship-loving control freaks who melt and disintegrate at the sight of a different opinion or political belief.

Those of us who were born in the sixties or seventies know how the tech industry was once led by and populated with people who intensely cared about freedom of speech and despised government agencies.

ejb999
0 replies
7h31m

"It is difficult to get a man to understand something when his salary depends upon his not understanding it." -- Upton Sinclair

I have no hope that the people who created the very tools that led to these problems, are in anyway going to try and solve this problem.

Nextgrid
0 replies
2h22m

If anyone can solve this problem, it is us

We've literally created this problem by making industrial-scale stalking profitable and socially-acceptable. We've created an entire self-sustaining industry that spies on everyone, is not accountable and that the government can just ask for data when needed.

2OEH8eoCRo0
0 replies
1h56m

best and brightest and visited by tech giants. If anyone can solve this problem, it is us.

I'd say this egotistical god-complex is exactly what got us into the current mess.

jrockway
12 replies
15h12m

I think it's all about how many clues you leave behind. If you make a HN account that you only access via Tor through a browser with Javascript turned off and stick your writing through some AI editing service, it's probably pretty difficult to trace anything back to you. If you stream yourself 16 hours a day every day, your nickname probably isn't saving you from much, as it only takes one person to go "oh I know them" and then your secret's out. So like everything, it's about striking a balance. Who is out to get you, and how much do you like doing things online? Just a question you can ask yourself before you move into a cabin in the woods and work on your novel 24/7 or whatever. (Publish it under a pen name, though, obviously.)

bas
8 replies
15h2m

Consider: if your adversary is the NSA, CIA, or (maybe) FBI, you’ve already lost the game you’re playing.

no-dr-onboard
4 replies
14h47m

You would be surprised at how easily they can be thwarted by simple technical maneuvers.

YMMV, but in my experience a lot of people have this bogeyman caricature of who the feds really are. The reality is that these are government agencies that pay significantly below market rate for really intense, highly demanding work shrouded in multiple layers of government-grade red tape.

lazide
2 replies
14h37m

The issue is they have time. Lots and lots of time. And they keep records.

So if you get high enough on the list, it’s like those ‘immortal snail’/snail assassin scenarios.

Even Bin Laden got taken out and dumped in the ocean eventually.

So like Jan 6th - it had better work, or your goose is very likely cooked eventually.

Mathnerd314
1 replies
13h6m

I think the biggest trick is to move around, so it isn't as simple as getting a single address. Like with Bin Laden, a lot of the work was figuring out where he was. And Ross Ulbricht, maybe he wouldn't have been caught so easily if he had changed hosting providers occasionally and the VPN had listed 100 internet cafes in different cities as connecting IP addresses instead of just one. Certainly that's the way Tor works, always hopping around routers. It's maybe a bit pointless though: once they get your legal name, it's pretty much a matter of time.

lazide
0 replies
1h32m

Damn, the snail assassin analogy works a lot better than I expected!

knightoffaith
0 replies
14h36m

I think it's not a bad idea to overestimate the power of the government to track you; the common wisdom on the internet to make this assumption is probably a good thing so people are motivated to be as safe as possible.

On the other hand, it seems like the Tor users who get caught make clear, glaring mistakes in their opsec. And I always remember how long it took to catch the Unabomber, and how they apparently only managed to catch him because of his brother.

rockskon
0 replies
13h8m

That's no reason to make it easy on them. Their ability to do bulk surveillance is limited by resources. Don't lower their resource requirement.

anamax
0 replies
12h41m

They're not trying to get everyone.

They just have to make it painful enough for enough people to get the vast majority of the rest to "fly right."

I'm certain that this is not terrorism.

Dudhbbh3343
0 replies
13h30m

It entirely depends on how motivated and how much resources they're willing to dedicate to finding you. They're probably not going to go to great lengths to catch a single copyright violation, so simple precautions may be good enough.

If you're a legit threat to national security, then yeah, they're probably going to find you no matter what you do.

godelski
0 replies
14h38m

If you make a HN account that you only access via Tor through a browser with Javascript turned off and stick your writing through some AI editing service, it's probably pretty difficult to trace anything back to you.

This is already too hard. But anything that can be done needs to be wrapped up into a trivial to use interface. It has to be for everyone, not just people who are technologically {capable,knowledgeable} and have the time and energy to do this all the time every time. It needs to be standard.

Of course, we should fight this from both ends. Many ends. We shouldn't collect the data. We shouldn't process it. And we should build defenses.

bdw5204
0 replies
13h49m

If you're looking for privacy from your current and possibly future employers, you can obtain that by using a pseudonym online and taking basic measures to make yourself hard to dox. If you want privacy from the US government, that's not going to work.

Also, getting doxxed isn't entirely bad because it can open doors as well as closing them. Depends on how you leverage it. You just don't want the US government and/or the government where you live as your adversary.

afro88
0 replies
14h34m

But by doing this (Tor etc), you've also potentially identified yourself as a person of interest who warrants further scrutiny. It raises the question: what are you trying to hide?

jll29
1 replies
4h49m

What you say is indeed one possible way to deal with it.

Treat it as a public postcard signed with your name, and never for a minute assume that someone doesn't link what you say to your identity.

This mode of operating means you will be more polite when angered by some troll online, as you are not hiding behind some pseudonym.

And at least you won't be shocked when a website does what Glassdoor recently did, i.e. convert from pseudonyms to people's real names WITHOUT CONSENT OR WARNING. Surely, by always using your real name, you will not bitch about your employer on a website where your name is shown as the poster, and you will still want to get promoted, or at least retained as an employee.

Too
0 replies
4h17m

So the whole internet will become like LinkedIn. Let the horror begin.

cbolton
1 replies
7h56m

I think that's exactly what Google's ex-CEO said years ago:

If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place

which you can read either as a terrible "nothing to hide nothing to fear" comment or as a good warning about the factual state of things.

Kim_Bruning
0 replies
6h8m

So one's privacy posture should be part of the complete security posture, and should ideally start at

"DEFAULT DENY ALL"

After which you can -of course- start opening up ports and start trusting people with information. Even if done imperfectly, one's attack surface is at least under some sort of control. I mean -at least- a semblance of control can be taken, however aspirational in practice. It allows conscious control of one's information flows.

As you may have experienced yourself a posture of "DEFAULT ALLOW ALL" is effectively impossible to manage, since tracking down and plugging new leaks faster than they show up is pretty much like bailing out a boat with -well- a squillion leaks (and more every minute).

Getting muggles to a safe default posture is going to be difficult. However, seeing the growing awareness in society it might not be impossible.

Think of nascent privacy initiatives by the EU (no matter how (in)effective as yet). Or you could think of starting school programs akin to "just say no" for instance, promoting more conscious and careful online behavior. It might never be perfect, but some level of herd resilience might be attainable?

AdamJacobMuller
1 replies
15h3m

you should probably just use your real name

Nonsense.

JKCalhoun
0 replies
14h30m

Ha ha.

wiseowise
0 replies
10h50m

Yes, agent Smith.

ilrwbwrkhv
0 replies
14h20m

And that's why these social media networks ruined the internet. Companies like Facebook forced "real identities". Now everyone suffers for giving in.

dogman144
0 replies
2h27m

Talk to anyone in advanced privacy work or out of government -> full stop, yes: if you're not taking Snowden-style measures (Tails OS), really reconsidering where and how your phone travels around with you, and tightening your browser controls, it's done.

Tracking, and the firms that do it, is incredibly extensive and hard to beat (i.e. a browser ad you just scroll by can fingerprint you well enough).

dghughes
0 replies
4h41m

I'm also waiting for the day, which is pretty much here now, when you will have to use a real name for any sign up form on any website. Something verifiable and not John Smith at phone 123 456 7890.

datavirtue
0 replies
14h33m

Glassdoor is doxing people, so...

brikym
0 replies
14h35m

Black mirror is far too accurate and coming true far too soon.

Terr_
0 replies
15h18m

There's a crucial distinction here between the pragmatic and the normative, or else there's a feedback trap where accepting it as normal makes it even more common.

In other words you can plan around the worst case, but don't let go of the opinion/social-value that it's too-common and wrong and aberrant.

JKCalhoun
0 replies
14h36m

I more or less do that. Not really related to privacy, but I find that if I post as myself, I am more honest, less likely to troll, more considerate of others when I post. For me it's healthier.

mschuster91
29 replies
16h31m

“This is the latest chapter in a disturbing trend where we see government agencies increasingly transforming search warrants into digital dragnets. It’s unconstitutional, it’s terrifying and it’s happening every day,” said Albert Fox-Cahn, executive director at the Surveillance Technology Oversight Project.

If companies would respect the spirit behind GDPR and not store data that is not needed to fulfill a user's requests and protect the data that they must have in a way that makes dragnet searches impossible, this would not be a problem.

Instead, we have sites that aren't ashamed to inform you about literally thousands of external ad broker, tracking, notification, and whatever-else integrations.

To u/decremental: you seem to be shadowbanned, here's an Archive link: https://archive.ph/kAXQ1

loeg
16 replies
16h8m

If companies would respect the spirit behind GDPR and not store data that is not needed to fulfill a user's requests and protect the data that they must have in a way that makes dragnet searches impossible, this would not be a problem.

Saving user watch history is useful for users. Sure, make it optional, but I find it really useful that youtube shows me if I've already watched a video, and that I can find recently seen videos in my watch history.

boppo1
8 replies
15h52m

There should really be levels of history. I don't want YouTube keeping every video I've ever watched on file, so I have history off. However, now, if I leave a video and come back to it the next day, my place in the video is lost. Lemme specify the length of time to keep history pls.

kvmet
5 replies
15h29m

Why does allowing them to store your watch history mean that they MUST also share it with 3rd parties?

Why isn't "I want _you_ to know but I don't want you to use it against me." an option?

ddingus
4 replies
15h16m

Because dollars.

Being able to sell histories means being able to sell supposedly more effective ads. It also shows that ads were viewed, and by whom.

Precision of any kind in demographics is worth a lot of money.

CatWChainsaw
3 replies
1h50m

As far as I can tell this is just the truth, so I'm curious who downvoted and why. Guilty consciences?

loeg
2 replies
43m

It conflates data collection with selling data. Google, Meta, etc do not sell their users' data (for purely commercial reasons).

genewitch
0 replies
20m

the ol' "hashed minutae" and "in aggregate" chestnut. Meta at one point allowed you to target ads to a specific person (nevermind the cambridge analytica stuff.) Google is in the business of selling time, eyeballs, and mindshare to advertisers.

Google and meta (et al) receive money in exchange for the info of their user base. That they're playing 3 card monte (shell game?) with the data to "hide PII" - which has been mathematically proven to be impossible (currently, perhaps forever) seems a hair not worth splitting, considering their market caps.

Put simply, if there wasn't a financial reason to do the data collection google would simply not do it. You don't get rich shareholders by writing a lot of checks to seagate.

CatWChainsaw
0 replies
23m

I always understood "selling data" to be shorthand for the whole metrics-for-ads transaction, seems obviously that "selling" the data itself would be a one-time sale but being able to offer "updated insights" every time someone runs a new ad campaign is repeat business.

zzo38computer
1 replies
15h10m

Can you store the history locally, perhaps if you use one of the other programs for accessing YouTube rather than directly?

They shouldn't need to keep your history; you can use your own programs to do so.
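As a rough sketch of what "your own programs" could look like, here is a tiny local history store. The file location, record format, and function names are arbitrary choices for illustration, not anything an existing YouTube client actually uses.

  import json
  import time
  from pathlib import Path

  HISTORY_FILE = Path.home() / ".local_watch_history.json"  # arbitrary location

  def load_history():
      if HISTORY_FILE.exists():
          return json.loads(HISTORY_FILE.read_text())
      return []

  def record_watch(video_id, position_seconds=0.0):
      # Replace any older entry for the same video, then append the new one.
      history = [h for h in load_history() if h["video_id"] != video_id]
      history.append({
          "video_id": video_id,
          "position_seconds": position_seconds,  # lets you resume where you left off
          "watched_at": time.time(),
      })
      HISTORY_FILE.write_text(json.dumps(history, indent=2))

  # Example: remember that you stopped a video at 3m32s
  record_watch("dQw4w9WgXcQ", position_seconds=212.0)

The point is only that "remember my place in the video" is a purely client-side feature; nothing about it requires the history to live on the server.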

rand0mx1
0 replies
5h44m

Try the FreeTube app to store history locally.

alisonatwork
3 replies
15h28m

Any client software can do this for the users who want it. There is no need for the service provider to track user watch history on the server by default.

transcriptase
1 replies
15h20m

I’m sure it might be useful if your primary business is something like, oh let’s say, delivering advertisements based on users preferences and behaviour.

hobs
0 replies
15h16m

At the end of the day, the recommendation algo is why TikTok grew so fast and is so sticky. Users want the companies to remember stuff, just not to use that memory against them (ya right).

judge2020
0 replies
15h18m

This also powers their recommendation algorithm. Arguably the only reason 99.99% of people use YouTube.

dpkirchner
1 replies
15h29m

I'd prefer to have that sort of thing stored locally, perhaps synced between all of my browsers/computers (directly, without storing the history unencrypted on servers).

jart
0 replies
15h21m

Even if Chrome did that, it wouldn't stop Edge from importing Chrome data and uploading it to Microsoft without asking permission.

nonethewiser
0 replies
15h48m

That doesn’t require them to know who you are.

wging
1 replies
15h52m

The comment in question appears in incognito mode. I don't think they're banned.

pvg
0 replies
15h50m

The comment got vouched to life.

pierat
1 replies
16h18m

One and the same on HN.

pvg
0 replies
15h53m

It's not a shadowban if you're told you're banned and it doesn't come from the Shadoban region of Japan.

j33zusjuice
1 replies
16h17m

How is a banned account making visible comments? Also, how did you find that post?

rootusrootus
0 replies
16h10m

You can see dead comments by enabling showdead in your HN profile.

decremental
1 replies
15h19m

That's true, I'm just banned but it still lets me comment and that's fine with me. No pesky fake internet points to worry about.

balls187
0 replies
14h29m

This account is the Illuminati.

throwaway237289
2 replies
16h2m

If companies would respect the spirit behind GDPR and not store data that is not needed to fulfill a user's requests and protect the data that they must have in a way that makes dragnet searches impossible, this would not be a problem.

This is ridiculous. The authorities are literally the ones demanding information, so that they can abuse that power.

And you're blaming the people with information?

Stop fighting the wrong war: hold the people with power accountable. That's the government here. They can use force and put you in jail. That's power.

tomrod
0 replies
15h57m

Both, not either.

kevinmchugh
0 replies
15h46m

Anyone collecting data needs to consider how the data can be misused. Whether that be by a (repressive) government, a hacker, or a future purchaser of the data

tamimio
12 replies
14h45m

The court orders show the government telling Google to provide the names, addresses, telephone numbers and user activity for all Google account users who accessed the YouTube videos..

Hopefully that clarifies for some folks why these big tech/social media companies insist on having your phone number as “2FA for security” despite all the SIM-swap attacks... simply for this moment. Because you might be using a VPN, and your address/name aren’t in your Google account, but your phone number definitely is. It’s even worse if you’re using an Android, as they will probably pull out all your app/browsing history too.

IshKebab
4 replies
3h10m

"Guys, we know our users' names, addresses, all of their emails, browsing history, location history and contacts... but we're missing the critical information! Their phone numbers! Can anyone come up with a security justification for asking for it?"

-Nobody ever.

Come on, use your brain. Even if you are talking about smaller entities who might otherwise only have names and emails, why would they want phone numbers? They don't care about identifying you. And even if they did they already have your email and name.

Step away from the tin foil...

Nextgrid
2 replies
2h26m

why would they want phone numbers

Because it is trivial to make a burner/secondary email address, but much less trivial to do the same with a phone number. Furthermore, everyone adds phone numbers to their contacts but very few add emails, so phone numbers are much more valuable from the perspective of inferring social graphs.

Both of these are extremely valuable for adtech and generic "growth & engagement" scum, which is why all companies matching these criteria started effectively requiring phone numbers. The 2FA/security angle is just an excuse for the true reason behind it.

IshKebab
1 replies
2h4m

None of that is related to providing identities to the government, which was his tin foil hat conspiracy theory for why 2FA is used.

I'd buy the spam reduction angle - it's a bit easier to get an email address than a phone number. But I have never seen a service require 2FA (except things like NPM and PyPI; but that's clearly for security) so I don't think it's that either.

I think it's pretty clear that the reason really is security. There's no conspiracy.

Nextgrid
0 replies
1h56m

None of that is related to providing identities to the government

Agreed. But I disagree that the true reason is security. The true reason is better stalking which is valuable to adtech scum which now happens to be the vast majority of consumer-grade tech.

I have never seen a service require 2FA

Try registering on Twitter. They'll let you register, but then randomly suspend your account for alleged ToS violations (even if the account was outright inactive), while giving you the option of instantly unbanning yourself following phone number verification. Microsoft will randomly lock out MS accounts without a phone number attached and will require a phone number for "security" upon the next login (the security angle being very dubious considering they don't have a number on file to compare to, so even an attacker can pass this challenge just fine). Etc.

There's no conspiracy.

It's true, there's no conspiracy; it's just business and can be explained by common sense and economics. Phone numbers help with tracking people. Adtech makes more money the better it can target its ads. Most consumer tech nowadays is intertwined with adtech. Said consumer tech thus optimizes for higher profit by collecting more data to help adtech.

dogman144
0 replies
2h29m

Brief counter, based on adtech knowledge.

Fingerprinting down to a user, especially for a bulk request, without something to anchor on like a device ID (or phone number), is harder than you make it out to be. The end of third-party cookies and so on has had an effect.

kevincox
2 replies
6h45m

I'm not saying that there aren't other motives, but there are legitimate security concerns.

Credential stuffing is a huge issue for large providers, and requiring 2FA is a huge mitigation. Sure, a targeted attack will make the SIM swap, but that is a huge difficulty upgrade from generic credential stuffing.

dogman144
1 replies
2h31m

Source - am a fairly experienced security engineer.

It’s a nonsense argument to say Google can't handle credential stuffing without SMS 2FA in place, as opposed to pushing all 2FA via Google Authenticator and using its very wide reach and talented security team for baseline credential-stuffing defenses. Security tools for this, even for companies without Google's very talented security team, are pretty good.

Wanting a hard phone number is a pure identification play and also about the more likely pragmatic concern (than cred stuffing) of using Google for burner accounts.

kevincox
0 replies
2h27m

How do you handle credential stuffing? Attackers will use a huge number of regular residential IPs or VPNs that you would expect to see logins from. How do you tell a credential-stuffing attempt from a regular login? Both come from unknown IPs at normal login rates, and both have valid credentials.

chatmasta
2 replies
14h28m

You can avoid this with Google by using a virtual WebAuthN device (ironically via Chrome devtools), and then you will unlock the ability to enroll in MFA with a QR code for an OTP URL.

getcrunk
0 replies
14h0m

This sounds like it’s new though? And maybe is for testing/dev and will go away?

c0pium
0 replies
12h14m

Which really underscores that all of the MFA stuff is actually about security. Because of course it is.

caskstrength
0 replies
9h59m

No, it doesn't "clarify" anything like that. If google doesn't have phone numbers of some subset of the accounts requested... they will just specify so in their response to law enforcement since it is completely legal and google is not currently obligated by law to have phone numbers of all users of Youtube. Sundar isn't going to prison because of that or anything.

Saying that some PM at Google decided decade ago something like "hey guys let's build a database of our user's phone numbers to satisfy some theoretical future dragnet surveillance request from law enforcement and tell our users that it is for their own security" is actually quite ridiculous conspiracy theory if you think about it.

FpUser
10 replies
16h13m

"...undercover agents sent links of YouTube tutorials for mapping via drones and augmented reality software, then asked Google for information on who had viewed the videos, which collectively have been watched over 30,000 times."

For fuck's sake, there are thousands upon thousands of companies and individuals with totally legit commercial interest in this tech. I bet Zwift and the likes are doing just that.

anamax
5 replies
16h8m

undercover agents sent links of YouTube tutorials for mapping via drones and augmented reality software, then asked

Entrapment?

initplus
2 replies
13h42m

Entrapment isn’t just “any time police trick you during an investigation”. It is only entrapment if the police actually induce you to commit the crime you are charged with. Viewing these videos is not a crime.

genewitch
0 replies
27m

And yet, DeLorean.

anamax
0 replies
12h47m

Presumably the police think that people who viewed those videos are somewhat likely to have committed some crime.

At least some of those people wouldn't have thought of committing that crime without being exposed to those videos.

That's not at the level of a cop saying "hey, let's go rob a bank" to some sap, but ...

koolba
0 replies
16h5m

I read that as “they sent a list of URLs”. Not, “they uploaded a bunch of videos”.

FpUser
0 replies
16h3m

Entrapment is a practice in which a law enforcement agent or an agent of the state induces a person to commit a "crime" that the person would have otherwise been unlikely or unwilling to commit.

Viewing the links and using the technology they cover is anything but. This is trying to make crime suspects out of thin air. So much for a fucking free democratic society.

squeaky-clean
1 replies
16h2m

I didn't take it that those videos were illegal. But they tried to trick the criminal into watching some videos with low view counts that they would be interested in, and then planned to ask YouTube who viewed all of those videos during that time.

Like an IP-gathering honeypot website. Except they probably assumed the criminal would be too smart to click a link to some random website. But they would visit some YouTube links. And then they knew Google would roll over and give them the info of all viewers.

It's a scummy plan, but mostly because Google will give away that info without putting up any fight.

declan_roberts
0 replies
15h34m

We learned from Nest that they won't even require a warrant before handing over the code to a smart lock to law enforcement.

No judge needed. Just ask and say "open sesame"

patrickmay
0 replies
3h36m

They should have just paid for targeted ads on those videos. It wouldn't take many to narrow the pool.

crooked-v
0 replies
15h55m

"Mapping via drones" is from what I understand the #2 income source for professional drone operators, right after visual or IR maintenance surveys of industrial building exteriors.

rufus_foreman
7 replies
14h54m

I'm all for freedom and individual rights and all of that. But you also have to look at what is going on here.

This person is suspected of selling Bitcoin for cash.

It says so right in the article:

"undercover cops sought to identify the individual behind the online moniker “elonmuskwhm,” who they suspect of selling bitcoin for cash"

Says it right there.

To anyone who thinks that is acceptable or that there is some sort of overreaction by the authorities, I don't even need three words to refute you. Three numbers is enough:

9-11

politician
4 replies
14h49m

This person is suspected of selling Bitcoin for cash.

There must be more to it than that; buying and selling Bitcoin isn't illegal.

roenxi
1 replies
14h37m

Well, in the US, in general it might be. The police are saying that it is probably a crime under money laundering and unlicensed money transmitting laws. The issue is that this particular set of laws is mostly outrageous and quite unintuitive in practice. They criminalise a lot of quite reasonable activity - in this case, selling bitcoin for cash.

Although the thread root refutation makes no sense. Freely trading bitcoin leading to 9-11 is going to have to involve some flexible mental gymnastics since 9-11 happened in a pre-Bitcoin world. In practice AML style laws are generally targeting tax evasion.

politician
0 replies
3h50m

Perhaps the police are trying to find a way to apply asset forfeiture because Bitcoin is so valuable? That would make more sense than AML because police forces don’t generally investigate those crimes — enforcement is usually the domain of FinCen and banks.

balls187
1 replies
14h38m

Per the GP—this is likely about someone laundering money for terrorist organizations that the US is investigating.

palmfacehn
0 replies
14h0m

The year is 2024. Everyone who disagrees with my political views is a confirmed national security threat and a terrorist. I hope they throw the book at these bitcoin boys.

ejb999
1 replies
6h21m

Did 9-11 happen because of bitcoin? Not sure I understand your point?

Dibby053
0 replies
5h27m

I'm pretty sure it's satire.

knightoffaith
6 replies
15h10m

For those who have not already heard of it, check out invidious: https://invidious.io/

A list of public instances (which you probably want to use if you're concerned about being identified) here: https://docs.invidious.io/instances/

recursive
5 replies
15h1m

Videos still get served by youtube. Maybe you know something I don't, but I wouldn't assume this is a way to get absolute anonymity against YT.

knightoffaith
2 replies
14h52m

Well, I don't know if there's any way to be absolutely anonymous on the internet.

It seems to me that there's still good reason to prefer using invidious on an instance used by many people as opposed to youtube directly. Can't make perfect the enemy of the good.

knightoffaith
0 replies
12h42m

Yep, you're right, thank you. Definitely be sure to check "proxy videos" in preferences.

An aside on invidious collecting data - I am a lot more comfortable with some random guy in Europe having my data than Google.

jraph
1 replies
5h25m

There's an option to use Invidious as a proxy, though I don't know if public instances offer this (and you probably need an invidious account).

knightoffaith
0 replies
3h37m

Public instances do offer this, and there's no need to sign up (I do this on public instances without an invidious account).

aprilthird2021
5 replies
16h31m

With the TikTok ban, jawboning, and now this, it feels like the US really wants to control the narrative its citizens see, now much more than previously...

falsaberN1
3 replies
2h34m

TikTok has a narrative? I thought it was all about cooking recipes that may or may not destroy you if you make them as instructed.

mardifoufs
2 replies
2h20m

Yes. It totally switched the narrative on Israel (and that's great imo). They didn't ban videos of Palestinians getting murdered and bombed, like other platforms did. It's a bit insane since even my apolitical friends got exposed to the raw videos and now it really shaped their opinion. I won't push it since I know this never actually happens, but it got zoomers to maybe even consider the conflict important enough to be an issue that swings their vote (the issue is that we don't vote lol).

For me that's awesome. But it really really really didn't sit well with members of Congress I guess, and boomers in general.

falsaberN1
1 replies
2h16m

Huh. I guess it's obvious I don't use TikTok (I'm past 30, I have no business going there) but I had no idea it had actual political content.

mardifoufs
0 replies
1h20m

Ha! Yeah I get it, I don't use it either. That's why it was so shocking to me too. It just started popping out at all times, in a weirdly "bipartisan" fashion too. When I asked about it, almost everyone said that they saw it on tiktok, or heard about it there.

I'm usually not super into video content, so even though I'm "younger" I never got into it, but imo it's better than, say, Facebook for this type of stuff. Even with the algorithm you still get exposed to people you don't follow, by design, which makes it harder to spread fake news in an echo chamber, so I'm not too concerned about the narrative control aspect.

As I said, most of the Palestine related stuff (that I started looking into afterwards) wasn't some pundits or grifters trying to get people outraged about a fictional "other side", it was mostly raw footage of what was happening in Gaza.

Clubber
0 replies
14h15m

the US really wants to control the narrative its citizens see

They had control for decades, then traditional television/cable media started to collapse. They're trying to take control of "new media" on the internet.

balls187
3 replies
14h54m

The order was merely a nicety. I cannot imagine any company succeeding at taking a stand against the United States federal government when it really wants something.

Likely the same with any other major nation state (I bet the Chinese government once showed Google it could access all their data at will, in order to compel cooperation, while the US is given full access at any time for any reason).

xethos
0 replies
14h34m

I don't like Apple. I'm cheering on the current lawsuit brought by the feds. I do not, and never have owned, an iPhone, and Apple is not the bastion of privacy they sell themselves as.

That said, credit where it's due: https://en.wikipedia.org/wiki/Apple%E2%80%93FBI_encryption_d...

px43
0 replies
13h49m

Eh, no. Are you familiar with MUSCULAR? It was in the Snowden leak. Basically Google kept telling the NSA to fuck off, so the NSA dug a hole and tapped some fiber to exfiltrate clear text communications between data centers, which was apparently a lot of important stuff back then. Within weeks of the MUSCULAR leaks, all Google datacenters were communicating with each other with proper SSL termination.

Are you familiar with Aurora? It was that time in 2009 when a Chinese military unit broke into Google and started poking around for information about some known dissidents. They were expunged quickly and the level at which Google stepped up their security game is unparalleled in the history of infosec.

Yes, the NSA or various law enforcement groups can get information from Google if it goes through the proper channels, and there's probably a handful of intelligence community insiders who hand data out of Google from time to time as well, even as part of various intelligence sharing agreements, but the idea that the NSA just has free access to whatever they want there is ridiculous.

c0pium
0 replies
14h9m

What are you basing this on? The US government gets told no about things it really wants literally all the time.

Edit: this is also not the federal government, state and local government has even less juice than the little the Feds possess.

randomdev3
1 replies
15h56m

If you're doing shady shit, you're not using your own IP, let alone a Google account...

colechristensen
0 replies
15h55m

You overestimate the average kind of person doing shady shit.

Animats
1 replies
14h26m

"In conversations with the user in early January, undercover agents sent links of YouTube tutorials for mapping via drones and augmented reality software, then asked Google for information on who had viewed the videos, which collectively have been watched over 30,000 times."

Huh? Why? Is this because some country doesn't like people having good mapping technology? Israel and China object to precision mapping, but the US historically has not.

templeosenjoyer
0 replies
6h28m

It was likely a fake mutual interest and they hoped "elonmuskwhm" would watch the videos so they could gather some data about them. That's how I interpreted it anyway.

stainablesteel
0 replies
3h39m

It's eerily similar to the federal government forcing Elon to give up every piece of data on every person who ever liked a Trump tweet.

The water is already boiling; it's time to jump out.

someotherperson
0 replies
13h11m

Casual reminder that Invidious[0] and Piped[1] exist. Farside[2] can automatically redirect you to a working instance, for example: https://farside.link/invidious

If you add a redirector plugin for your browser, you can add a capture for something like this:

  https?://(.*?\.)?youtube\.com/(.*)
And push it to something like this:

  https://farside.link/invidious/$2
For example: https://farside.link/invidious/watch?v=Ag1AKIl_2GM

[0] https://github.com/iv-org/invidious

[1] https://github.com/TeamPiped/Piped

[2] https://farside.link/

self_awareness
0 replies
11h32m

1. Upload a video to YouTube.
2. Make it non-public, but viewable by anyone with the link.
3. Post the link to some shady underground forum full of people who want to do bad things to you for money.
4. Build the network of connections by tracking who watched the non-public link, and see who knows who in this degenerate world of extortion and theft.
5. Send the message to criminals that if they want to continue what they're doing, then even YouTube isn't safe for them.
6. Profit.

TBH, if the above were true, I would even be happy. But I'm biased, because I sincerely hate thieves and extortionists.

phkahler
0 replies
7h22m

Why can't the police track down someone who calls in a threat using phone records? How can people be more anonymous on the phone, with caller ID, than on YouTube?

maxlin
0 replies
3h4m

https://archive.is/2fCUs

(I've never seen a paywalled article here without an archive link buried somewhere in the thread, so I'm carrying my part of that responsibility.)

decremental
0 replies
16h36m

Someone post the pirate link so we can read this.

datavirtue
0 replies
14h34m

Isn't that what they already do? I feel like I'm taking crazy pills.

atlgator
0 replies
3h0m

Just waiting to get pinched for my secret shame: bourbon hunting videos.

I don't even drink bourbon.

arp242
0 replies
13h40m

Somewhat related story from a few years ago:

Justice Department withdraws FBI subpoena for USA Today records ID'ing readers - https://news.ycombinator.com/item?id=27408647 - Jun 2021 (174 comments)

anon115
0 replies
11h20m

fuck em fuck all those monkeys

WillieCubed
0 replies
14h42m

"The court orders show the government telling Google to provide the names, addresses, telephone numbers and user activity for all Google account users who accessed the YouTube videos between January 1 and January 8, 2023."

Interesting aside: Viacom used a similarly broad request back in 2008 [1] in its lawsuit that nearly put YouTube out of business in its infancy. This time, it's the government making the request, and Google has way more data to potentially provide.

[1]: https://web.archive.org/web/20100702111029/http://afp.google...

Razengan
0 replies
9h45m

All this surveillance and cops still can't catch day-to-day petty thieves and scammers.

MilStdJunkie
0 replies
13h39m

Eleven hundred steaming pantloads of hot bullshit. This is a textbook fishing trip. The real zinger of all this... the thing that has me riled... is that this is going to be used to put away who knows how many poor-ass sadsack innocent citizens who can't afford counsel, before the poooh-leeeeease finally bust that one guy who has the money to afford legal representation. Upon which everyone goes, oh right, this was really illegal, SILLY US.

Someone posting up illegal videos? We already got laws for that, you sons of bitches. WATCHING videos? Like, "everyone in the country who watches this video?" Get the hell out of here.

HeartStrings
0 replies
11h1m

Why are you linking to paywalled scam sites?

Eisenstein
0 replies
15h19m

Let's walk through this rationally:

Take the crime they are investigating and ignore that; it is a red herring. Say that there is a stalker who hacked someone's devices and pulled private information and videos from them, and posted them publicly. Something unequivocally bad. We would all agree that there needs to be a way to investigate and stop this person and seek justice for the victims.

Until recently, taking this to the cops would get you a blank stare and nothing would happen. At least now in certain places it is taken seriously. But traditional investigative methods don't work. They would need to get access logs to the places where that data was uploaded, at the very least.

Cops are not particularly great at investigating (most crime is solved by confessions or by catching the perpetrator in the act). They don't have any reason not to just blindly grab all the data and sort through it by hand, because that's what they know how to do, and if they request it and get denied, why would they care? They either try something else or say 'fuck it'.

Local Judges see a request for records and a block of time and it seems reasonable to get that info. No one is sitting and explaining to them the implications of allowing cops access to personal information from viewer logs. Historically, the lower courts are not the place to seek enlightened rulings.

Google? I honestly have no love for them, and they bend over for China every day for worse stuff, so I would be surprised if they gave two shits about their users, except that if this becomes common enough they might actually have to devote support to these requests, and they hate giving support, so they might fight it just for their own self-interest.

The problem is that there is no settled case law here and no clear legislation, and I would hesitate to take any important cases to the Supreme Court until at least Mr. "Is This a Pube on My Coke Can" is gone.

Bu9818
0 replies
11h18m

Use Invidious, use RSS, use yt-dlp, use Tor.
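
For the yt-dlp and Tor parts, a minimal Python sketch using yt-dlp's Python API (this assumes yt-dlp is installed and a Tor daemon is listening on its default SOCKS port, 9050; the URL is just a placeholder example):

  import yt_dlp  # pip install yt-dlp

  # Route all of yt-dlp's requests through the local Tor SOCKS proxy.
  opts = {"proxy": "socks5://127.0.0.1:9050"}

  with yt_dlp.YoutubeDL(opts) as ydl:
      # Download the video; use ydl.extract_info(url, download=False)
      # if you only want the metadata.
      ydl.download(["https://www.youtube.com/watch?v=Ag1AKIl_2GM"])

The same thing works from the command line with yt-dlp's --proxy flag.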