
Microsoft AI spying scandal: time to rethink privacy standards

Maro
107 replies
15h7m

Companies like Google have had access to our full email, search, location, photo roll, video viewing, docs, etc. history for 10+ years. I don't think also having our LLM prompts fundamentally changes this picture.

I guess this is what the shifting-baseline argument refers to.

Having said that, I think it's rational for almost all people not to care about giving up privacy for all these (addictively) amazing tools. Most people don't do anything worth privacy protecting. Having worked in data/engineering at bigtech, it's not like there's a human on the other side reviewing what each user is doing. For almost all people the data will just be used for boring purposes: building models for better marketing/ads/recommendations. A lot of the models aren't even personalised; the user is just represented by some not-human-readable feature vector that, again, nobody looks at.

Hell, I have multiple Google Home devices that are always on and listening, and the thing's internal model is so basic and non-personalized that after multiple years it still has trouble parsing me when I say "Play jazz" and "Stop", even though these are the 2 commands I exclusively use. Sometimes it starts playing acid rock, and when I say "Stop" it starts reading me stock quotes.

ivlad
13 replies
14h56m

You choose to have Google spying devices at home. You choose to have Gmail. You may have no choice for Internet search for quite some time, but you certainly choose your browser.

Start there.

capital_guy
12 replies
14h44m

You choose to have Gmail

Disagree with this. Self hosting email is notoriously difficult. Gotta give the data to somebody. Plus, your work email is going through MSFT or GOOG 99% of the time.

justinclift
8 replies
13h54m

self hosting email is notoriously difficult

Yet people do it anyway. It's not the impossible task you're making it out to be.

nvy
7 replies
13h32m

The act of hosting postfix/dovecot is not in itself difficult. Debugging deliverability issues is, though. And it's time-consuming.

justinclift
5 replies
13h25m

And doesn't change anything I said at all.

nvy
4 replies
12h50m

It kinda does, though. It's why I stopped self-hosting. I bet if you emailed my gmail address it wouldn't come through, through no fault of your own.

You can have DMARC, SPF and DKIM all correctly configured on a clean IP and some mail server at Microsoft will still drop your mails because it's having a hard day and it feels like it.
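(For anyone unfamiliar: the three mechanisms mentioned are all just DNS TXT records. A minimal sketch of what a correct setup looks like, where the domain, IP address, selector and truncated key are all placeholders:)

```
; SPF: lists the hosts allowed to send mail for the domain
example.com.                  IN TXT "v=spf1 ip4:203.0.113.10 -all"

; DKIM: publishes the signing public key under a chosen selector
mail._domainkey.example.com.  IN TXT "v=DKIM1; k=rsa; p=MIGfMA0G..."

; DMARC: tells receivers what to do when SPF/DKIM checks fail
_dmarc.example.com.           IN TXT "v=DMARC1; p=quarantine; rua=mailto:postmaster@example.com"
```

And the point stands: all three can be correctly in place and a large receiver may still silently drop or spam-folder your mail.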

justinclift
3 replies
11h41m

You can also route outbound email through an existing place like smtp2go.com, which stops all of that outbound hassle. :)

That place in particular (which I use and can recommend) even has a (permanent?) free tier. ;)

intern4tional
1 replies
10h37m

This makes it something that the average person cannot do; you're suggesting something that already requires more time and resources than 75% of the population has access to.

If you're serious about this, then go talk to a non-tech person and tell them to self-host email and see how they do. Look at their challenges, build a solution, and then offer it.

justinclift
0 replies
10h6m

Please don't move the goalposts by introducing "non-tech" people into this.

That's not at all what the conversation is about, and routing through an external place removes a whole bunch of hassle compared to setting up and maintaining outbound email.

You might not like it for some reason, but that's on you.

nvy
0 replies
4h2m

If you need a third party to deliver your mail on your behalf, are you really self hosting?

What's the point? At that stage you've already conceded the deliverability problem, so now you're just wasting time administering dovecot and keeping up with security patches.

jstanley
0 replies
13h15m

The greatest trick the centralised email services ever pulled is convincing people that a faulty spam filter is the sender's problem.

sosodev
2 replies
14h31m

What about the numerous other email providers?

moring
0 replies
14h15m

The numerous other email providers are... numerous. Every discussion like this ignores, to an absurd extent, how hard it is for non-tech people to gather information on these topics and make an informed choice: which email providers care about which aspects of privacy, which aspects of privacy and information security even exist, which email providers even exist, what they are doing with your data, which parts of what they are doing are a problem...

You can't even ask tech people to make a choice for you because they all say different things.

Other domains like cars, medicine, construction, whatever have established standards because they have recognized that individuals simply _cannot_ make an informed choice, even if they want to. I'm tempted to say that only information technology likes to call the user "unwilling" and "lazy" instead, but individuals from other domains actually do that too. Luckily, the established standards are mandatory, so their opinion doesn't count.

TeMPOraL
0 replies
14h20m

Rounding error that doesn't matter, because the recipients of any e-mail sent from those providers are likely on mailboxes backed by Google or MSFT anyway.

amatecha
11 replies
12h48m

Most people don't do anything worth privacy protecting.

That rationale sounds great (albeit dismissive/invalidating) until something you've done (and have provided ample digital evidence of) becomes illegal or is otherwise used against you.

Oh actually, what's your email password? I mean, since you're not doing anything worth keeping private, right?

You think there isn't a human reviewing the data of what each user is doing, but there absolutely could be, and there's no reason there can't be, like when Tesla employees were viewing and exfiltrating footage/imagery from customers' vehicles. Not just one or two people but apparently disparate _groups_ of employees. https://www.reuters.com/technology/tesla-workers-shared-sens...

gbalduzzi
3 replies
10h41m

You think there isn't a human reviewing the data of what each user is doing, but there absolutely could be, and there's no reason there can't be

I'm not sure there is a solution to this problem, unless we accept losing a lot of features in our products and switch to E2E services.

The only alternative I can think of is some required audit of the measures in place to prevent employees from accessing data, but I'm not sure how effective that would be.

ItsBob
2 replies
10h17m

I'm not sure there is a solution to this problem, unless we accept losing a lot of features in our products and switch to E2E services.

I'm not so sure.

Based on the figures I could find for 2021, Google's ad revenue was about $60B with approx 3B users (I asked various chat bots...).

So, if you extrapolated and did some slightly dodgy maths (there were other factors but I can't be arsed typing them), it would cost about $6-$7 per month per user if Google stopped all ads and, by extension, tracking & data mining. This is for Google to maintain their current figures.

Take these figures with a big pinch of salt though...

That $ figure is across the whole of Google, so that covers every single product they have; if you just use Gmail then it might only be $2 a month.

So, if Google wanted to make the same money without tracking, ads etc., they could, but the temptation to sell and mine your data would be strong!

Anyway, my point is that it's possible to do it but would people pay for it? I pay for my email with Fastmail so there is a single data point for you :)
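For what it's worth, the back-of-envelope arithmetic can be checked directly. Note that the $60B figure quoted above actually works out to under $2/month; it is Alphabet's reported 2021 advertising revenue of roughly $209B that lands in the $6 range (user count still the rough ~3B estimate):

```python
# Back-of-envelope: what Google would have to charge each user per month
# to replace ad revenue. $60B is the chatbot-sourced figure quoted above;
# ~$209B is Alphabet's reported 2021 advertising revenue. ~3B users assumed.
def monthly_cost_per_user(annual_ad_revenue: float, users: float) -> float:
    return annual_ad_revenue / users / 12

low = monthly_cost_per_user(60e9, 3e9)
high = monthly_cost_per_user(209e9, 3e9)

print(f"${low:.2f} per user per month")   # about $1.67
print(f"${high:.2f} per user per month")  # about $5.81
```

Either way, the qualitative point survives: an ad-free, tracking-free Google would cost each user something on the order of a few dollars a month.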

the_other
0 replies
8h11m

Likewise. I pay for Fastmail, I contribute to Signal. Hell, I'm even paying for Kagi even tho' its results are fairly useless, in the hope that paying for privacy will lead to a better service over time.

n+2

soco
0 replies
8h46m

I pay for all apps and services I regularly use as a matter of principle, even if they have a fine free version. It's not even that much, a couple dozen bucks a month, so here's a second data point.

Maro
3 replies
12h10m

Good point about future exposure.

A counter-argument: people already do all sorts of illegal (misdemeanor?) things every day, tracked by big tech, and nothing happens. Some examples:

• speeding: your smart phone knows what road you're on, what the speed limit is, and that you're over it

• movie piracy: Chrome knows, your OS knows, your ISP knows, your VPN provider knows, any device that is listening can tell you're watching a movie that you shouldn't be able to watch at a non-cinema GPS location

• ..

8372049
1 replies
10h10m

20 years ago, legal firms sending out threatening letters to people they could identify on torrent trackers was commonplace.

chaoskanzlerin
0 replies
8h58m

It's still quite commonplace in Germany in 2024. Typically, they claim around 1000€ in said letters, and refusing will have the case go to court, which usually rules in favor of said legal firms.

bigfudge
0 replies
11h47m

Have there never been cases where law enforcement requested that sort of information and used it? I don’t think we know.

andoando
0 replies
1h36m

I worked at Tinder and we had full access to all the messages in plaintext, and ability to look up users by phone number, email, etc.

WhackyIdeas
0 replies
9h21m

Yep. Even NSA agents spy on their exes. So of course Microsoft employees will be spying on people.

It’s not about privacy, it’s about having control of your own destiny. Not having some shitty OS maker think they are God in a 1984 world.

This is the warped kind of stuff which happens to companies who join the NSA Prism program… give it a few years and all they care about is power and money and playing spy.

There’s probably an NSA agent rubbing the higher ups at these companies off and telling them they are God as they finish. Or taking them on tours of the office and showing their cool exploding pens and other James Bond tech. Whatever it is, Microsoft is more than just invested.

Recall? Give me a break. That is the most in your face global surveillance tech I have heard of to date.

Springtime
0 replies
4h50m

It's a common trap in privacy debates to assume privacy = hiding something bad/illegal, whether now or something deemed illegal in the future. While this can be true, it's only one aspect. Take benign examples where privacy would be useful and it's unrelated to that.

Eg: I was reading recently in the WSJ how various insurers are using satellite and drone imagery of customers' roof conditions to deny them coverage. However, this has been abused: even brand new roofs have been marked as bad despite evidence provided and push-back. The insurers were collecting imagery and making decisions but not providing evidence on their end. According to someone working for an insurer, they expect soon to be taking images daily for such purposes.

Here, various incorrectly affected parties have done nothing illegal/bad/wrong, but they're losing control of, and insight into, processes that affect them in real ways. These aspects are part of what Daniel Solove outlined in his privacy taxonomy, where he broke privacy down into the different things that comprise it (information collection is distinguished from processing, etc.).

> much of the data gathered in computer databases is not particularly sensitive [...] Frequently, though not always, people’s activities would not be inhibited if others knew this information. I suggested a different metaphor to capture the problems – Franz Kafka’s The Trial, which depicts a bureaucracy with inscrutable purposes that uses people’s information to make important decisions about them, yet denies the people the ability to participate in how their information is used.

devjab
9 replies
13h15m

I think it’s interesting that you call out Google. It’s not that I disagree, but from the European enterprise perspective you can say that Microsoft has access to virtually everything. Banks, Healthcare, Defense, Public Services and so on, everyone is using the Office365 product line and almost everything is stored in Azure.

I don’t begrudge Microsoft, I think they are a fantastic IT-business partner from an enterprise perspective. They are one of the few tech companies in the world that actually understands how an Enterprise Organisation wants to buy IT and support, and they’ve only ever gotten better at it. As an example, when I worked for a Danish city, someone from Seattle would call us with updates on major incidents hourly. Which is something you can translate to a CTO being capable of telling their organisation that Microsoft is calling them with updates on why “e-mail” isn’t working. So I actually think Microsoft is great from that side of things.

I don’t think we should’ve put all our data into their care. We live in a post-Snowden world, and even here in Denmark we recently had a scandal where it was revealed that our government lets the NSA spy on the internet traffic leaving the country. I get that’s the way it is when you’re a pseudo vassal state; we’ve always had our government secrecy regarding the US and Greenland. It also makes me wonder how secret anything we’ve let Microsoft have access to really is.

choeger
5 replies
12h57m

This. The next time there's a real disagreement over trade policy, Europe is going to be fucked. Microsoft has access to literally everything, and nobody even seems to understand that, because nobody understands what "cloud" or even just "online vs. offline" means nowadays. It's a bit scary.

devjab
4 replies
12h32m

This is another big issue, but the EU does know and care about it. My current employer falls under the critical infrastructure category (we’re finance/energy) and that means we’re required to have contingency plans for how to exit Microsoft in a month. Not just theoretical plans, but actual hands on plans that are to some degree tested once in a while.

The issue is how impossible it is to exit Microsoft, and this is where I’m completely on board with your scary part. We can exit Azure relatively painlessly from the digitalisation perspective (well, not financially painless, but still). IT operations will have fun replacing AD/Entra ID, though, but all our internal software can be moved to a Kubernetes cluster and be ready to accept external authorisation from Keycloak or whatever they plan to move to.

But where is the alternative to Office365? Anyone on HN could probably name a bunch, but where is the alternative for people who don’t really “use” computers as such? The employees who basically think a PC “is” Office365. As in, we could probably switch their Windows to Linux and they might not notice, as long as they still had Office365.

This is where the EU currently doesn’t really have an answer. We have a strategy to exit Office365, but I’m honestly not sure our business would survive it.

yourusername
1 replies
11h44m

My current employer falls under the critical infrastructure category (we’re finance/energy) and that means we’re required to have contingency plans for how to exit Microsoft in a month. Not just theoretical plans, but actual hands on plans that are to some degree tested once in a while.

If those plans exist and there is even a tiny chance you can pull that off, I'm impressed. In most organizations it would be an almost impossible challenge even to upgrade all their servers to a new OS in a month. I don't think I've ever seen an organization of more than 100 employees that could reasonably migrate their cloud provider, identity source and operating system in a month. Endpoint operating system upgrades often take a year (or more).

jononor
0 replies
10h36m

Most organizations do not spend any time even thinking about that, nor considering it in their decision processes, nor preparing for it. An organisation that does will have an IT architecture that limits exposure in the first place. For example, they might choose not to have any Windows servers at all. They might have a thin-client or web-oriented workflow for endpoint applications, which makes switching Windows out on employee machines easier. They might already have multiple OSes in use, to check that critical systems can be accessed without Windows. That said, it is of course a big endeavour.

nonrandomstring
0 replies
9h11m

This is a big deal in cybersecurity education. I'm in the UK doing it. We have a dilemma: industry is desperate for fresh new cybersecurity recruits to fill an enormous skills gap. In the UK, Microsoft is a "preferred supplier" for lots of organisations, even defence stuff, and to get our students past the gatekeepers they pretty much need "365". Regardless of whether they can recompile a Linux kernel and do protocol analysis with Wireshark... no 365, no job. Not even tier-1 support.

By contrast, my last cohort of master's students worked on things like critical infrastructure, national security, long-term resilience, hybrid interoperability... everything that Microsoft is not and makes worse.

So there's a schism between academic understanding and industrial reality that makes cybersecurity really rather hard to fix.

So I have to walk into a classroom and say:

  "Heads-up! We're going to be learning about 365 administration this
   week, about Active Directory, and this and that... which are all
   okay products and make a lot of admin tasks easier. BUT!! The only
   reason is so you can walk into a job. Because this US company has
   the UK tech sector by the balls. As soon as you're working, forget
   everything you hear in these lectures, because it's dangerous
   BigTech mono-culture that's antithetical to the real values of
   cybersecurity. Take the principles. Reject the products. Look at
   other tools that do the same. Have a backup plan." 
And I hope they took enough from Ross Anderson's SecEng book, and from the BSD/Linux classes and my other lectures, to go out there and start undoing the harm.

robertlagrant
2 replies
12h42m

Which is something you can translate to a CTO being capable of telling their organisation that Microsoft is calling them with updates on why “e-mail” isn’t working. So I actually think Microsoft is great from that side of things.

This is exactly it. Execs want to sound in charge of situations, even if it's just a person who can be shouted at. Microsoft can employ very expensive, individualised call centre staff in expensive suits to read out to you a service status page.

devjab
1 replies
12h29m

I agree, but I also think it’s bigger than the ego of C-types. The fact that Microsoft calls you with updates also has a near-magical impact on organisational culture in general. It’s the “oh, ok” gestalt that every employee feels, the thing that makes them resign themselves to waiting instead of being angry, and what not.

Sure there is ego, but a lot of C types are frankly good enough to work beyond that part of the equation.

robertlagrant
0 replies
10h31m

I wasn't necessarily talking about ego, but more about how other people in the C-suite will react differently knowing that someone's calling with updates regularly.

TaylorAlexander
8 replies
14h12m

I don't think also having our LLM prompts fundamentally changes this picture

Is that what is being discussed? The biggest issue with Microsoft's new AI announcement was that their system was going to take screenshots of your computer every second and process them with AI. That means they could have way, way more data about you than LLM prompts.

https://arstechnica.com/ai/2024/06/windows-recall-demands-an...

moontear
6 replies
13h7m

No, this is not what the article nor this discussion is about. Windows Recall is completely local, and hence actually in line with the article's argument that stuff was local long before it moved to the cloud. The screenshots Recall takes pose a whole different security and privacy problem: that someone (a bad actor, your employer, your partner) may access them and glean a lot of information about you they shouldn't get. Recall is not about "Microsoft having way, way more data about you". Other MS products, sure; Recall isn't the point here.

logicchains
4 replies
12h29m

Windows Recall is completely local

I'm pretty confident if the NSA or whatever asks MS for those screenshots, they've got a way of making them non-local. The EU is already pushing for mandatory local scanning for CSAM, do you think they wouldn't also extend this to Windows Recall snapshots once the technology is there?

Maro
3 replies
12h7m

I don't take Microsoft devices seriously, esp. PCs; except for the Xbox, they never last or gain widespread adoption.

crawfishphase
2 replies
11h41m

Microsoft Surface

Maro
1 replies
10h37m

I've never seen one irl.

vundercind
0 replies
21m

I’ve seen exactly one. Got one at a dev agency I worked at as a Windows test device.

This was like a 3rd or 4th generation one, I think. I was really excited to finally get my hands on one because they were supposed to be really good.

TL;DR it was mediocre-leaning-bad judged as a laptop, and a terrible tablet. I can’t figure out how they got anything but bad press for the things.

dspillett
0 replies
9h13m

> Windows Recall is completely local

Until it isn't, due to future changes, or to a malicious 3rd party managing to make use of bad security decisions & bugs.

meowface
0 replies
13h9m

Yes, that's what this article is about. It doesn't mention user device surveillance.

Animats
7 replies
13h55m

Companies like Google have had access to our full email, search, location, photo roll, video viewing, docs, etc. history for 10+ years. I don't think also having our LLM prompts fundamentally changes this picture.

Which is why I don't have a Google account.

Their search is OK, but their other services aren't that great.

sillysaurusx
6 replies
13h37m

What do you use for email?

wizhi
0 replies
13h13m

I can recommend Runbox. It's a paid service, but I really think that's for the better.

naavis
0 replies
12h51m

I have been very satisfied with Fastmail.

layer8
0 replies
8h22m

GMX

jjav
0 replies
11h29m

What do you use for email?

Self host

behnamoh
0 replies
13h15m

Google it.

Animats
0 replies
13h16m

ISP's IMAP server and Thunderbird.

czl
6 replies
14h58m

not like there's a human on the other side reviewing what the user is doing

With LLMs looking at all this data, if you want to persecute or narrowly propagandize those who are X (X = pro-Israel, anti-Israel, pro-Trump, anti-Trump, etc.) it can be done much better than before. The "humans on the other side" will be using all this data to narrowly find people.

Maro
5 replies
14h24m

Narrowly?

Half of people are pro-Trump, half of people are pro-Israel/Palestine.

_heimdall
1 replies
14h3m

Stats are really misleading when boiled down to binary decisions.

While polls generally show that roughly half of US voters plan to vote for Trump, that's in the context of only being given the option of Trump or Biden. Most polls I remember seeing since 2016 show roughly 1/3 of the US really consider themselves Trump supporters.

The Israel/Palestine question has similar problems. A binary poll question sets the context that a respondent needs to be on one side or the other, and that supporting or opposing both sides isn't an option. It also puts respondents in a position to have to pick a side regardless of how much or little they may know about the situation. With no more context, a 50:50 split could mean simply that most people don't know enough to decide and randomly pick a side instead.

apantel
0 replies
12h55m

Nobody randomly picks a side. People who can’t make well-informed decisions simply follow the lean of whatever biases they have. Slightly hawkish or conservative? Pro-Trump. Bleeding heart? Pro-Palestine.

roenxi
0 replies
14h13m

Yeah but you don't need to target them all. Most people are pretty much useless, politically speaking, they just do what they're told by someone else. If you can identify who is doing the telling and target them specifically, large groups of people will otherwise be docile.

Historic attempts to apply that theory have been broad-brush to say the least [0]. With LLMs and access to enough data the authoritarians can get really fine-grained about when they take people out the next time they seize enough power. Anyone attempting to do something politically uncomfortable for the incumbents will be at serious risk in a fine-grained way that has not previously been possible.

Half of people are pro-Trump, half of people are pro-Israel/Palestine.

I don't think it is 50-50, more like 20-30% for Trump and I don't have a read on the Israel/Palestine stats. Trump has a dedicated core of supporters but I'd suggest a lot of the people polling for him just don't see a better option.

[0] Eg, I was reading up on https://en.wikipedia.org/wiki/Intelligenzaktion the other day

lazide
0 replies
14h14m

It’s not that simple - each group has a bunch of sub-groups which respond to specific propaganda tactics/buttons.

Same with abortion/anti-abortion, guns/anti-gun, and any of thousands of other topics.

czl
0 replies
9h5m

Yes, for a search to be considered narrow, the resulting group should be small and specifically defined by precise criteria, not encompassing a significant portion of the population. *Notice that once criteria are stacked, groups can get small.* E.g. which young males on your street (or in your building) who like Trump but not Israel were protesting at city hall today? Big datasets let authorities / advertisers / ... answer questions like that. The answer will often be a tiny "narrow" fraction of the population (of your city / state / country / ...).
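The stacking effect is easy to see in a toy simulation (all attributes and probabilities below are invented for illustration): each criterion alone keeps a large fraction of the population, but the intersection is tiny.

```python
import random

# Toy population with invented attributes and probabilities, purely to
# illustrate how stacked criteria narrow a group.
random.seed(0)
N = 100_000
people = [
    {
        "age_group": random.choice(["young", "middle", "old"]),
        "sex": random.choice(["m", "f"]),
        "likes_trump": random.random() < 0.30,   # each filter alone is broad
        "likes_israel": random.random() < 0.50,
        "at_city_hall_today": random.random() < 0.01,
    }
    for _ in range(N)
]

# Stack the criteria: the surviving group shrinks multiplicatively.
matches = [
    p for p in people
    if p["age_group"] == "young"
    and p["sex"] == "m"
    and p["likes_trump"]
    and not p["likes_israel"]
    and p["at_city_hall_today"]
]

# Expected fraction: 1/3 * 1/2 * 0.30 * 0.50 * 0.01 = 0.025% of the population
print(f"{len(matches)} of {N:,} match all five criteria")
```

Five broad filters leave only a few dozen people out of a hundred thousand; add "on your street" and you are down to individuals.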

renegat0x0
4 replies
11h49m

Normalization of abusive behavior is not OK! If one company has abused you for 10+ years, that does not make it OK for other companies to abuse you!

If big data were not lucrative, they could not sell your data. If your data were not valuable, Facebook and Google would immediately remove it from their servers without hesitation.

Mantra.

- I don't care about privacy

- I don't care big tech has access to all my data

- I don't care that Google has access to all my politicians' data

- I don't care that I am building a worse future for society

- I don't care that I am being recorded while having sex in a Tesla

- I don't care that I am building a surveillance state

- I don't care that my data are being sold to China, to India, to wherever the highest bidder lives

- I don't care how my data are being used. I don't care if my data are being used to train military robot dogs that will be used for wars

- I don't care that I will not receive insurance because my medical data are sold wherever

- I don't care about privacy

- I don't care about privacy

- I don't care about privacy

IshKebab
2 replies
8h53m

Right, but I think his point was that we accepted this decades ago, and this latest case hasn't changed anything at all. In other words, the baseline hasn't shifted.

renegat0x0
0 replies
8h40m

No, you are not correct. The amount of surveillance IS rising. At first it was search data, then emails, then maps, social interactions, car sensors, smart speakers.

Now literally everything you do is captured, repackaged and sold. Corporations are also getting more creative about selling your data. They have more data, so they sell more.

Every drama adds more to the privacy nightmare.

Sorry, but you cannot say that "nope nothing has changed for privacy in last 10 years".

dbspin
0 replies
7h40m

'We' didn't accept it. The internet expanded and the mass-media conversation moved on to the next shiny thing. The majority of geeks have vehemently opposed mass surveillance every time it's been revealed (Snowden, Assange). The majority of normies are completely unaware of the extent to which companies like Palantir and Clearview AI sell their most personal information to 'law enforcement', and to which Microsoft, Google and Facebook own and manipulate their information to fuel advertising.

tomalbrc
0 replies
9h18m

"I have nothing to hide"

justinclift
3 replies
13h55m

Companies like Google have had access to our ...

I hear people saying things like that occasionally, and have to wonder... why did you do that to yourself?

And why are you assuming everyone else was similarly unwise?

"Companies like Google" certainly haven't had that kind of info from me, nor from several people I know, as we've avoided the vast majority of their products.

YetAnotherNick
2 replies
13h30m

"Companies like Google" certainly haven't had that kind of info from me

Do you use any search engine? Then yeah, your data is funnelled to google/microsoft.

justinclift
0 replies
13h25m

Kagi

jononor
0 replies
10h20m

Only the data on what is being searched, which is of course quite a lot. But it requires active effort to leak data that way. Many types of sensitive information are not natural to put there, say conversations with others or a diary. And other things can be actively avoided, like searching for illegal things, which in some countries could be simple things like gay communities etc., unfortunately.

hackermatic
2 replies
14h17m

A lot more people are vulnerable to abusive partners than you may think, and that's a threat model most of these products never consider.

Thorrez
1 replies
10h54m

Would local hosting be any better against the abusive partner threat than these products? Disclosure: I work at Google.

hackermatic
0 replies
3h33m

Local hosting/processing is a good thought, but it only helps in limited circumstances, because partners you haven't separated from yet are likely to have physical access to your devices.

It's one of the big criticisms of Microsoft Recall: the database is locally generated and encrypted at rest, but practically, any user in the same home with device access can probably access it, and bypass any efforts you've made to delete your browsing history or messages.

Remember that abusers are often controlling and suspicious, so disabling Recall, denying them access to your devices, or changing your passwords is enough to set them off because you appear to be hiding something (maybe making plans to leave or report them).

Plausible deniability can be an important feature for activists and regular people alike. You can't always predict when a relationship goes south like this, or get out of it as soon as it does, or afford and hide a burner phone.

One of my friends remarks that tech companies should have a social worker and a public defender on staff for threat modeling these things.

charles_f
2 replies
2h21m

Having worked in data/engineering at bigtech, it's not like there's a human on the other side reviewing what each user is doing

There doesn't need to be a human inspecting stuff, just a computer doing the filtering and reacting to it. The example the article uses already mentions that they found the nefarious use. Given the current trend of politics, notably in the US, it's not far-fetched to imagine a future where searching for things like, say, abortion becomes illegal, and tech companies get coerced into sharing that kind of information.

The fact that MSFT found that Copilot was being used by hackers suggests that they already have the entire set of tools to do exactly that.

wilsonnb3
1 replies
2h9m

Given the current trend of politics, notably in the US, it's not far-fetched to imagine a future where searching for things like, say, abortion becomes illegal, and tech companies get coerced into sharing that kind of information.

This is the weakest of the privacy arguments to me. If we are in the "evil government" hypothetical future, why do they need my real searches? They can just as easily lie, make something up, use some other piece of data to persecute who they want, or do away with all of that pretense because they don't need it.

digging
0 replies
48m

You're imagining a fictional, united evil government acting as a single entity. In real life right now, any given government has numerous bad actors at various levels who can and do use data to persecute people and groups they dislike. Most of the time, they still need real data, but it's a messy multidimensional spectrum and sometimes they can fabricate it. Another thing you're missing is that they're using this data to find the people they want to persecute.

__MatrixMan__
2 replies
13h53m

Most people don't do anything worth privacy protecting.

That may be true, but most people still benefit indirectly from the actions of the few who do have something to hide (e.g. protestors, journalists, whistleblowers).

If we want the masses to continue to benefit from the actions of those few, then we need to find a middle ground. Someplace where the masses are private enough that a truly private individual can hide among them without sticking out like a sore thumb. You don't need to hide, you just need to be able to hide.

FloatArtifact
0 replies
12h53m

I would say that statement comes out of ignorance, not informed knowledge. And even those who do know feel helpless in the face of the industry-standard treatment of privacy.

2Gkashmiri
0 replies
13h16m

this is literally the reason given for 100% adoption of https (because what if)

newrotik
1 replies
12h29m

Privacy is (a) freedom.

The reason why people care about privacy is not necessarily because giving up privacy has some directly observable negative effect. But, simply, living without freedom sucks.

I don't want you to know my personal information not because you could/would do something nefarious with it. I don't want you to have it simply because it's none of your business.

A4ET8a8uTh0
0 replies
7h5m

I am personally starting to think that this is the framing we need for this conversation. It needs to be discussed as a freedom as opposed to a right for no other reason than 'expectation of privacy' was very successfully neutered.

midtake
1 replies
11h52m

Information about your preferences is weaponized in the current era. As an example that some might find relatable, if you support Trump and you work for a FAANG, you might just lose your job.

8372049
0 replies
9h49m

While I agree with the general sentiment, I think the problem in this specific case is lack of job security in the US. In most European countries getting fired for something like this would never fly.

Madmallard
1 replies
13h36m

You have no idea what future advantage could be taken of people using this data. Seriously, it's a completely normal instinct to never overshare in conversation. Everyone knows this. This is the equivalent of extreme oversharing at all times.

jononor
0 replies
10h17m

Yeah, just the potential for gossip- and slander-type attacks makes it sane not to share everything. Our cloud services have probably long had the power to discredit or blackmail any user.

Aerroon
1 replies
4h31m

I don't think that's really true. Google has had pretty serious privacy guards in place, where if you don't want Google to collect info on you (for advertising), they won't.

Also, lots of people use other email services than Gmail, don't share their location information, don't share their photo roll nor share their internet history with Google. And a lot of video viewing is obviously not done on YouTube.

Maskawanian
0 replies
2h48m

Google has had pretty serious privacy guards in place, where if you don't want Google to collect info on you (for advertising), they won't.

How do we know this? They may claim this, however their incentives as an advertising company would provide strong pressure against this.

xinayder
0 replies
7h46m

Most people don't do anything worth privacy protecting.

Bringing back the same argument in favor of protecting privacy:

"Saying you don't care about privacy protection because you have nothing to hide is the same as saying you don't care about protecting free speech because you have nothing interesting to say"

I killed my social media accounts because I disagree with my pictures being used for training AI models. My friends, who aren't tech savvy people, keep mocking me for caring about it, and when I confront them about big tech knowing much more about our private lives than ourselves, they simply respond "well, I don't care anyways, it won't make a difference in my life".

wruza
0 replies
9h23m

You are welcome to live in any country that doesn’t value privacy just to see what it’s like after you surrender it.

verisimi
0 replies
12h41m

Most people don't do anything worth privacy protecting.

There are lots of reasons why you don't hand your personal information to everyone, why you wear clothes even though it might be warm enough not to, etc.

But the key point for me, is that knowing you are being watched, or even suspecting it, changes behaviour.

You cannot be you online. You do things differently, edit yourself. It is a form of manipulation. Which was always the point. The panopticon was conceived as the perfect prison to control others.

https://www.wikipedia.org/wiki/Panopticon

timeon
0 replies
11h32m

I have multiple Google Home devices that are always on and listening

No wonder you don't care too much about privacy, but this is not normal for everyone.

throw283725
0 replies
10h23m

Most people don't do anything worth privacy protecting.

Do you feel the same way about TikTok?

sneak
0 replies
12h16m

It’s not Google you need to worry about. It’s the state that can compel access to Google’s (and Apple’s, and Meta’s) data without a warrant. Big tech doesn’t run concentration camps, but governments can and do, including the USA.

Are you confident that the state will always be friendly to people like you? How about the people you support politically?

longerd2
0 replies
10h47m

You really need a google device and an internet connection to be able to say "stop" to stop your music? Wow. My old "smart" watch could do that offline xD This can't be hard to do on your own, without a google device listening at all times...

lesostep
0 replies
7h51m

>Most people don't do anything worth privacy protecting

I won't disclose why the Russian government considers me a part of three different terrorist groups. But they also tried to declare people who watch anime a terrorist organization. And they have tried that at least three times in the last ten years. Sooner or later they will succeed.

Also, wasn't there a recent problem in USA where women were uninstalling period tracking software, because it could report someone as pregnant for having irregular period? I know at least 4 causes for missed period, do judges know that much?

We don't protect our privacy because we are imperfect; we do it because assholes and/or stupid people exist. It's like middle school. You might not be ashamed of having a crush, but you wouldn't tell everyone who they are, because some of your classmates are assholes who would care too much.

jononor
0 replies
10h47m

Advertisement has the purpose of trying to get money from you, or to make you change your mind on political issues. These things are worth protecting. Companies using your private information for such purposes is an attack.

jetsetk
0 replies
9h30m

Most people don't do anything worth privacy protecting

One of the worst takes I've ever read. There is something called metadata. Even if you don't do anything that is worth protecting explicitly, the data about your 'worthless data' enables perpetrators to see the patterns of your daily life. You can reconstruct so much by just gathering metadata over a certain time span: knowing when someone usually interacts with devices, social media etc.

I don't want everybody to be able to derive when I'm sleeping, going to work or going on vacation. Hopefully, it is obvious to you that even a simple thief could use such information (if leaked) to know when it is the best time to go on a heist in your apartment.

There is a nice CCC presentation by David Kriesel called SpiegelMining, available on YouTube. It is in German, but the autogenerated subtitles are good enough to understand everything. He downloaded Spiegel Online newspaper articles over a certain period of time and was able to derive lots of information about the authors based on the articles' metadata (publishing timestamps, author initials etc.)

jc6
0 replies
14h3m

It's not some accident that Google ended up with all this data.

There is a reason they give you free stuff like video, email, chat, search etc.

People have forgotten that the only way (in the past) to solve the info explosion that results when networks grow is trying to understand people's needs better. That was the intention behind data collection. But of course the story went off the rails when advertisers, marketers and politicians found value in all that data.

But there is an upper bound to how much value there is, just like there is an upper bound to how much milk you can extract from a cow.

Once you build huge, ever-scaling infra assuming there is no upper bound, and then an upper bound is hit, what happens?

Nothing good. Excess cows start getting slaughtered. The larger the system, the greater the overrun, and the more slaughtering. Don't expect what you got for free to stay free. Expect all your data to be sold off at fire sales.

intended
0 replies
11h2m

Most people don't do anything worth privacy protecting.

Absolutely, it's very rare for people to have things like passwords, bank accounts, confidential documents, secrets, fears, weaknesses.

And as we know, everyone applies good infosec practices; none of us have a txt file sitting in a folder with all our passwords.

So on average, there isn't a growing collection of confidential data that our models are getting trained on.

Matter of fact, if everyone who reads this were to randomly start loudly talking about “EAR-WAX REMOVAL” or “LOW LIBIDO” in close proximity to a friend’s phone or smart TV, there's no impact. They don't end up seeing some interesting and potentially embarrassing ads.

It’s not like we live in a world where bad actors exist, maybe in some distant country, who try and take resources away.

——-

EDIT: I came back to edit this because I felt I was perhaps too snarky and as a result having fun at expense to your argument.

For what you said - it should be understood that it is easy to end up in a situation where privacy or PII is converted into a “problem” which has to be minimized.

This is a position that then leads to many other far too complex failure states.

Our future selves are better served by thinking of data we generate, as default private, and that all private data is “heavy”

graemep
0 replies
9h49m

Most people don't do anything worth privacy protecting.

Except for people whose employer does not approve of their politics, or whose family does not approve of their religion, or whose community does not approve of their sexuality, or whistleblowers, or journalists, or people who support causes the government disapproves of, or........

That covers an awful lot of people.

Pretty much anyone living in, or with any links to anyone living in an authoritarian state too.

eesmith
0 replies
12h22m

Most people don't do anything worth privacy protecting.

The flip side is false positives.

Have a scanned photo from when your grandmother bathed you as a baby? Google may identify it as child porn and shut down your account. https://timesofindia.indiatimes.com/city/ahmedabad/google-la...

"Gujarat high court has issued notice to state govt, Centre and Google India Pvt Ltd after the tech giant blocked an engineer’s account citing “explicit child abuse”. The engineer had uploaded a photo showing his grandmother giving him a bath as a two-year-old."

"... his client could not even access email and his business was suffering. The blocking was like a loss of identity for Shukla, a computer engineer, most of whose business depended on communication through the internet. Shukla had requested Google to restore his account, but in vain."

andrepd
0 replies
11h14m

Most people don't do anything worth privacy protecting.

I can only describe this take as disgusting. Saying you don't care about privacy because you have nothing to hide is like saying you don't care about freedom of speech because you have nothing to say.

Privacy is a fundamental pillar of society, for without privacy there is no freedom. We already see the chilling effects across many situations, we have seen them for the past decade+ at least. It's only the beginning.

__loam
0 replies
10h22m

Privacy doesn't matter until someone in a vulnerable group gets killed because someone doxxed them. Hell, even some semi-private people have been killed by things like swatting. Privacy matters and we should pass laws to protect it, just like we have the 4th amendment to protect against unlawful search and seizure.

SebFender
0 replies
7h37m

Having said that, it's easy not to care - until something significant happens to you or your family...

How many clients has my team helped when it was just too late? Many if not most.

One fundamental flaw humans have is not to care until it's too late.

Why should I care about my gutters? Until the day the basement is filled with water because of these "ridiculous" gutters...

LightHugger
0 replies
12h15m

Most people don't do anything worth privacy protecting.

This is blatantly wrong. If I'm a large corporation, I can use the information you think is worthless against you via first-degree price discrimination and countless other targeted mechanisms. You simply haven't thought about it enough; as soon as you do, you will realize that where no privacy exists, people will develop (and have been developing) mechanisms to take advantage of and capitalize on this state of affairs.

JohnFen
0 replies
3h37m

For almost all people the data will just be used for boring purposes to build models for better marketing/ads/recommendations.

You say that like it's a mundane and acceptable use, but this is the primary thing I want to avoid my data being used for, and is a huge part of why I get very concerned over privacy issues.

BrenBarn
0 replies
13h42m

Companies like Google have had access to our full email, search, location, photo roll, video viewing, docs, etc history for 10+ years. I don't think also having our LLM prompts fundamentally changes this picture..

Well, but that's sort of the point of the article. You're comparing against a baseline where our privacy is already eroded. If you compare to an earlier (say, pre-web) baseline it's quite different.

AlexandrB
0 replies
4h0m

Most people don't do anything worth privacy protecting.

This is true of many fundamental rights. Most people don't say anything worth speech protecting either. The catch is that when you do need to say something worth saying or do something worth privacy protecting if the right wasn't there all along you're kind of screwed.

openrisk
83 replies
12h30m

Trying to understand consumer privacy behaviors outside the prevalent social contract that the vast majority of people operate under is bound to misinterpret what is happening and why.

We live in a regulated "supermarket" economy. What surfaces on a screen is entirely analogous to what surfaces on a shelf: People check the price and make their choices based on taste, budget etc. They are not idiots, they operate under a simplifying assumption that makes life in a complex world possible.

The implicit assumption central to this way of organising the economy is that anything legally on sale is "safe". That it has been checked and approved by experts that know what they are doing and have the consumer interest as top priority.

People will not rush back home to their chemistry labs to check what is in their purchased food, whether it corresponds to the label (assuming that such a label even exists), and what the short- or long-term health effects might be. They don't have the knowledge, resources or time to do that for all the stuff they get exposed to.

What has drifted in the digital economy is not consumer standards, it is regulatory standards. Surfacing digital products with questionable short and long term implications for individuals and society has become a lucrative business, has captured its regulatory environment and will keep exploiting opportunities and blind spots until there is pushback.

Ultimately regulators only derive legitimacy from serving their constituencies, but that feedback loop can be very slow and it gets tangled with myriad other unrelated political issues.

alvah
36 replies
10h28m

"The implicit assumption central to this way of organising the economy is that anything legally on sale is "safe". That it has been checked and approved by experts that know what they are doing and have the consumer interest as top priority.

People will not rush back home to their chemistry labs to check what is in their purchased food, whether it corresponds to the label (assuming that such a label even exists), and what the short- or long-term health effects might be. They don't have the knowledge, resources or time to do that for all the stuff they get exposed to."

What you describe is a feature of a high-trust society, where you don't have to double-check every single transaction or interaction you enter into, but can take most statements on trust. This allows people to get on with the fundamental task at hand, rather than dealing with the overhead of checking their food in the chemistry lab, or whatever the equivalent is for the specific transaction.

I have read suggestions that this was a major contributor to the growth of the Western economies, relative to other low-trust societies. If this was the case, we are in for a bumpy ride, as we seem to be rapidly changing from a high-trust to a low-trust society.

soco
17 replies
9h14m

"I have read suggestions that this was a major contributor to the growth of the Western economies, relative to other low-trust societies." I'm not sure I follow, what are other low trust societies? Otherwise I'm with you here - living in a cabin in the woods survivalist-mode does nothing to progress a society.

_heimdall
9 replies
5h58m

living in a cabin in the woods survivalist-mode does nothing to progress a society.

Not everyone's goal is to progress a society though. If one's goal is to live a quiet life and do what makes them happy, what's wrong with living in a cabin in the woods?

That would only be a fundamental problem if everyone owes something to society. That's a much different conversation though, whether everyone is born into a debt that must be paid back to society.

tsukikage
3 replies
5h26m

If handiwork and subsistence farming are not what makes you happy, living in a cabin in the woods will not make you happy, because when you cannot outsource them to the rest of society, nearly all your time will be spent doing those things in order to survive.

Even once these basics are sorted, you will only live happily outside society as long as you are lucky enough to stay healthy.

_heimdall
2 replies
3h8m

If handiwork and producing or finding your own food is what makes you happy, then why does it matter whether you are outsourcing to society?

The second sounds like a separate goal unto itself. There's absolutely nothing wrong with that goal, or with having multiple goals, but if you start by saying doing X makes you happy then it doesn't really make sense to say doing X won't actually make you happy because you aren't doing Y.

tsimionescu
0 replies
1h56m

I think the point is more that there are a very limited number of very specific lifestyles that can exist outside of a society. If you happen to thrive in one of those lifestyles, awesome, cabin in the woods works great for you.

But you can't do that if your passion is making music, or mathematics, or computer programming, or electrical tinkering, etc. There just isn't an option to follow the vast majority of pursuits except if you also engage in society.

pessimizer
0 replies
2h0m

People don't have the physical or mental ability to live alone. Their version of "alone" is a world where there are institutions that exist to protect their property, guarantee their transactions, and where they are supplied with a massive amount of high-quality manufactured goods. Paying for them doesn't make you somehow independent of society, it's the nature of society. You're trading bits of paper with government promises printed on them.

soco
2 replies
5h0m

The comment I was replying to was talking about the growth of economies and societal progress. It's not about owing; it's about what is happening. And I think we agree that if the goal of everybody in a society is to live a quiet life, there will be no progress. Maybe we'd even witness the contrary: a regress of said society, to the extent we can call sparse people living by themselves in the woods a "society". If that sounds negative and you feel the need to defend it, maybe it's because you actually agree it's a negative for the society, while being good for the individual.

_heimdall
1 replies
3h10m

and I think we agree that if the goal of everybody in a society is to live a quiet life, there will be no progress

That's actually where it gets really interesting though. Progress isn't absolute; it's relational and requires first defining the goal. If one's goal is to live a quiet life where they minimize their dependence on others, then living in a cabin in the woods and finding their own solutions for food and water is progress. That obviously doesn't fit a larger society where the goal is generally increasing dependence on and trust in the larger society, but neither is right or wrong.

pessimizer
0 replies
1h55m

If one's goal is to live a quiet life where they minimize their dependence on others, it is incumbent on them to figure out how to keep anybody who wants what they have from just coming in and taking it. That requires a society. Your deed to your land is civilization. Part of societal progress is making it so that deed can be trusted to keep people from just taking your cabin in the woods and throwing you out.

This has to be negotiated with the people who would want to take your cabin in the woods and throw you out.

notjoemama
1 replies
2h56m

Not everyone's goal is to progress a society though.

Thank you. That was my initial thought too. Why is progress the goal? Not everything has to "progress" at all times. What progress needs to be made anyway? And towards what end? Who decides that?

There's an inherent good to stopping progress and spending some time in a cabin in the woods.

If we never stop and enjoy now, then why bother with tomorrow?

_heimdall
0 replies
30m

The real challenge I've had with "progress" as a goal is that it so frequently misses the context of what we're trying to progress towards.

The idea seems to be that starting with what we have today and taking another step forward is always the right move. Never go backwards, and it's okay if we don't define our goals beforehand as long as we keep moving our feet.

whearyou
1 replies
3h44m

For a case study, Israel vs eg Egypt.

Though, for Israel that is changing for the worse - less trust in the government and each other.

notjoemama
0 replies
3h0m

They're having the same problems with political extremism as the US. To be clear, this is both that political figures are leaning towards the extreme ends of their ideologies, as well as a culture of dogma and isolation that pushes them and their citizens towards extremes.

goda90
1 replies
5h55m

Societies with lots of corruption, adulteration, theft, forgery, counterfeit, etc that goes under-punished. If you are frequently burned by your transactions and interactions with business, government, etc you're going to have low trust.

HarryHirsch
0 replies
4h30m

There's a wrinkle to that - here's a video from the Indian trading standards authority on detecting adulterated salt: https://youtu.be/x3CWvI_AWkU

That's the government actually doing its job for once, but you can't check at home for data safeguarding.

rolph
0 replies
1h41m

I live in a cabin in the woods; we have different lifestyles too. I'm very technical, my neighbour 'grizzly annie' is the opposite, but we trade back and forth.

AlienRobot
14 replies
6h22m

I've always found interesting that we've grown up hearing don't install random programs from the internet/don't access random websites.

Then you go use linux and everyone copy-pastes commands other people wrote straight into the terminal.

screamingninja
4 replies
5h1m

everyone copy-pastes commands other people wrote straight into the terminal

I know a lot of people that use Linux and not many of them operate this way. Most care about their software sources. "Everyone" is certainly not the case.

g15jv2dp
3 replies
4h52m

And yet when I complained about `curl | sh` on HN the other day, I got ridiculed. "Everyone" is too much, but even on a purportedly "hacker" website, people find the idea of perusing a shell script before executing it preposterous.

dcow
1 replies
2h5m

But `curl | sh` is no less secure. Download this file and execute it: functionally the same outcome. Tell me how doing that is materially different from `apt-get install`. Both employ signing and checksums (just with different PKI). One delegates trust to a package maintainer while the other trusts the author directly. I truly don't understand the paranoia and consider it tinfoil-hat security theater.

tetris11
0 replies
1h53m

the package maintainer has to go through a web of trust in their FOSS ecosystem to be allowed to distribute their packages.

A github author just has to put up a repo and hope that their fanbase aren't too versed in the language

neilv
0 replies
4h21m

Something that's hard to remember, but helps a little: if you get 3 people saying stupid things, that's only 3 people -- not necessarily representative of the people out there.

bayindirh
3 replies
6h13m

Then you go use linux and everyone copy-pastes commands other people wrote straight into the terminal.

This is exactly the same bias in action. When I started using Linux nobody was doing that, and even if somebody gave you a script, its actions were verifiable by reading the relevant man pages.

I still don't run install.sh files I didn't read or at least skimmed, for example.
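
The "download, read, then run" habit can be sketched in a few lines. The installer here is a stand-in written locally so the example is self-contained; with a real project you would first fetch the script to disk (the URL below is illustrative, not a real endpoint):

```shell
# Create a stand-in for a downloaded installer. In practice you would:
#   curl -fsSL https://example.com/install.sh -o install.sh
printf '#!/bin/sh\necho "installing..."\n' > install.sh

# Skim for anything surprising before executing: network fetches,
# sudo, eval, recursive deletes, and so on.
grep -nE 'curl|wget|sudo|eval|rm -rf' install.sh || echo "nothing flagged"

# Only run it once you have actually read it.
sh install.sh
```

Saving the script to disk also means the file you inspected is the file you run, which a straight `curl | sh` pipe cannot guarantee.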

roblabla
2 replies
4h53m

Do you also audit the sources of the programs installed by that install.sh? Do you make sure the binaries and sources match? If not, why? What makes the shell script so special that it must be audited with care, but the binaries are fine?

bayindirh
1 replies
4h40m

I do not use any "install.sh" that installs freestanding binaries. All of the ones I use just set up repositories, and I make sure that the repositories are the correct/legitimate ones. If I have to install a freestanding binary, I compile it from source and install that.

Since all the repositories are signed, it would take a big breach to compromise these packages, since the infrastructure is generally distributed: different servers, keys, etc.

What makes the shell script so special that it must be audited with care...

Because I need to know what changes I'm incorporating into my system(s), and plan accordingly, or prevent any change which is not in line with my system administration principles.

...but the binaries are fine?

They are not fine, but they are signed at multiple levels and checksummed, so they are a lower risk.

dcow
0 replies
2h12m

Your risk model is kinda perverse. You're saying you trust package maintainers because they sign things. So if I send you a signed script that checksums itself before running will you run that without audit?

It’s trust all the way down and always has been. You just have a different idea of how you formally signal and convey trust than someone else.

I paste commands into the terminal because I can read exactly what they do and they are delivered over a connection where my user agent has verified the TLS certificate of the server. In fact I’m electing to trust directly rather than transitively the source of the software.

The only thing signing files prevents is modification in transit (and at rest on macOS/iOS and Windows; Linux doesn't do that). Linux is rife with time-of-check vs. time-of-use race conditions.

singpolyma3
1 replies
6h10m

Don't access random websites?

Sounds like a very small internet.

AlienRobot
0 replies
5h26m

It does feel like that sometimes doesn't it.

robdar
1 replies
4h29m

It’s worse than that. Find a random blog that gives you shell commands that add random repositories to your apt sources.list, add the signing keys, and install packages from the repo, all through a paste to the command line.

Maken
0 replies
2h17m

I used to do that, but nowadays I tend to stick with either my distro or developer repositories. Internet is a wild place.

Spivak
0 replies
33m

I think the replies saying how terrible this is are missing it. The Linux community is a high-trust community and has continuously earned that trust over and over again. The times when it's been broken are so few it's newsworthy each time it happens.

Anyone who's like "well I don't copy/paste shell code into my terminal" is just virtue signaling. I'm willing to bet their editor Vim/Emacs/VSCode is overflowing with plug-ins and code written by just some guy on Github. I bet they've run containers that were written by just some guy too.

It's a really cool feature that you can just download a random binary off Github, run it, and not really have to worry about it.

datavirtue
1 replies
3h52m

Business moves at the speed of trust.

jrexilius
0 replies
3h5m

That is a very good summary. I'm stealing that line..

jrexilius
0 replies
3h2m

Having worked in many low-trust countries, I very much agree with that assertion. And seeing the effects of the trust-decay in our own, and the trajectory it sets, reinforces that view.

peacechance
11 replies
11h37m

Incorrect. We operate in a military smorgasbord where the war-hungry U.S. government gets an unlimited budget of trillions of dollars year after year, and their modus operandi is COLLECT IT ALL ... EXPLOIT IT ALL. The government believes that they are the only ones entitled to privacy, for National Security™ purposes, and their crimes are covered by criminal and conspiratorial compartmentalization. They go after journalists and whistleblowers like Julian Assange and Edward Snowden with dictatorial fury, because the true enemy of fascist governments is Privacy for the People.

Your perspective is cute business-school brainwashing, but the real fact is that businesses like the FAANGs have realized that they can tap into fascist daddy's money faucet and sidestep the shelf-joke analysis you portrayed. Why worry about stocking shelves with good products to maybe sell units to individual consumers when you can sell to one big monstrous entity that can sign a billion-dollar contract to ingest and process everything against the masses, who barely have $100 to pay for a piece of software.

We have a U.S. military problem here.

meitham
8 replies
11h26m

The downvoting of this comment clearly shows how HN users are in denial of the US's atrocious behaviour on the world stage, to the extent of sanctioning international institutions such as the ICC!

novariation
6 replies
11h0m

To be very honest, I'm tempted to downvote this, not because I would disagree if this were the topic at hand, but because it reads like a deranged rant that's only loosely related to the comment it's replying to.

bbarnett
5 replies
8h45m

Not to mention, if you compare the US, its citizens' intentions, its government's intentions, and yes, the DoD's intentions to all but a tiny handful of countries, the US is a golden boy of good.

Compare today's US to, for example colonial powers. The US is pure in comparison.

Compare the US to what happens to journalists in Russia, China, Iran. The US is a beacon of justice.

Is the US perfect? Nope. But do a little comparative analysis, and the result is that the world has never, ever seen such a peaceful empire.

The rancor often displayed on such comments makes me wonder.

dbspin
4 replies
7h44m

While I agree that the tone of parent is... unhelpfully agitated and aggressive. The broad strokes critique of US foreign (and domestic) policy is on point.

The US began as a corporate colonial project - Virginia Company, London Company, Plymouth Company, Massachusetts Bay Company etc. It proceeded to expand through genocide of indigenous populations and wars of conquest - Cherokee–American Wars, Mexican American War, Spanish American War, Quasi-War with France and on and on.

From the first the US (contrary to domestic myth) was colonial, expansionist and interventionist. US involvement in theatres around the world has initiated, prolonged and expanded conflicts - from Vietnam, Cambodia and Laos to Iraq and Afghanistan. The US has toppled literally dozens of democratically elected governments and helped elect numerous authoritarian dictators. The list of countries where the US has engaged in 'regime change' is so long it could fill up this comments character limit but to cite a few Hawaii, Panama, Honduras, Nicaragua, Mexico, Haiti, Philippines, Korea, Venezuela, Libya, Palestine, Syria etc. America has funded (and continues to fund) genocides, death squads, torture sites (its own as well as those of its allies). The US has replaced democratic leaders in countries as friendly as Australia as recently as 1975.

At the risk of turning this comment into a letter to the editor of Foreign Policy - the US is not remotely a 'golden boy of good'. On balance Pax Americana has kept Europe at peace, but at the cost of keeping Africa, Latin and Central America, and parts of South East Asia impoverished and constantly at war.

Is it China? Is it Russia? No. Would they make worse imperial powers? Almost certainly. It's a unique kind of tyranny, one that manages to convince its own elites, against all historic evidence, that it's a 'force for good'.

bbarnett
3 replies
7h31m

the US is not remotely a 'golden boy of good'

Ah, but it is in the context I stated.

Quote:

Not to mention, if you compare the US, its citizens' intentions, its government's intentions, and yes, the DoD's intentions to all but a tiny handful of countries, the US is a golden boy of good.

Note the conditionals. "intentions" and "compared to all but a tiny handful of countries".

Also note the context where I was careful to say "Compare today's US".

In these contexts, the history of the US is meaningless, and only today counts. And I was referencing "colonial powers", which are historical, against US actions today.

I know all the US has done. I also know all the British Empire did. I also know how Russia, China, Iran, and various tiny dictatorships around the world act.

Yes, the US is very much a "golden boy of good" in reference to, and comparison to these things.

Anything and anyone can be made a monster taken out of context.

smatija
2 replies
6h32m

Just look at the atrocities the USA is committing in Yemen today. Or in Palestine, proxied through Israel.

bbarnett
0 replies
1h58m

Ah, the backwards world view, where people defending themselves from relentless aggressors, including ones that call for an end to their existence, are somehow wrong to end those endless attacks.

Israel would be nuts to leave even a wisp of Hamas in power in Palestine, and yes attacking innocent ships gets a defensive response.

Next up! Man attacked by machete wielding lunatic punches him in defense, how dare he!

MediocreSysEgnr
0 replies
5h28m

Sincere question: Is the second half of your assertion that the U.S. government is directing/coordinating the conduct of the Israeli military, or that their lack of action makes them equally culpable/complicit in said actions?

TeMPOraL
0 replies
4h18m

Oh, we know all about the US's atrocious behavior on the world stage. But this is completely irrelevant to the topic at hand.

TLA agencies gonna TLA. But US military does not explain the enshittification and the surveillance economy. That's all on private business, small and large. Ethically challenged entrepreneurs, marketers, advertising industry, and yes, MBAs. But they're not a US phenomenon, they're a global issue.

posterboy
1 replies
8h2m

we are operating in a "breadboard"?

wizzwizz4
0 replies
9m

Buffet is probably a better translation, in this context: I don't think the emphasis was on it being bread. (The bread of the metaphorical smorgasbord signifies… resources? Pokémon? something that isn't bread, anyway.)

nonrandomstring
8 replies
12h7m

Not sure that regulatory capture explains the poor quality of digital goods and how people value them. I think it's that cargo cults (what we have) are antithetical to true technological societies (which is what we say we want)

A fault I see in Bruce Schneier's article, and the general hypothesis of "frame/baseline shifting" - it's not that people have been conditioned or forgotten the value of privacy, but that we're looking at the world through ever smaller lenses. The story in the article is that science is guilty of that too. The decline has been around education and perspective in general, not just attitudes to a small issue like "privacy". We thought the internet would widen our scope. It narrowed it.

In the UK we have a great travel and culture show by Romesh Ranganathan. (I highly recommend it, so get on your VPN to watch BBC or find it on the torrents). It will cheer you up [0]. He's a funny guy. But also the cultural vista is breath-taking. Looking at life in central Africa it's great to be reminded of the diversity of humankind. Watch for the little things - like a whole bus queue of people, none of whom are on mobile phones.

What the Internet promised - the great "conversation of mankind" - never emerged. Instead we got cat memes and social control media that forced people into ever smaller parochial silos. HN is no different. Here it's cool to have a bleak outlook on humanity and technology. "Oh it's too late... we're doomed... oh woe is me!" C'mon hackers... what happened to the joy of shaping the world? :)

Bruce Schneier appeals to a macro systems-theory metaphor, and mentions Daniel Pauly [1]. But he neglects some of the more profound lessons that Forrester, Meadows and actually Norbert Wiener gave us about feedback and the empty dream of cybernetic governance. Nothing as big as humanity will fit in a bottle that small unless you're willing to destroy it in the process. And what you're destroying is the very innovative base that gave you the technology in the first place. This is why they should teach history, geography and other cultures in schools.

[0] https://en.wikipedia.org/wiki/Romesh_Ranganathan

[1] https://oceans.ubc.ca/2023/05/19/daniel-pauly/

whilenot-dev
3 replies
11h1m

We thought the internet would widen our scope. It narrowed it.

What the Internet promised - the great "conversation of mankind" - never emerged. Instead we got cat memes and social control media that forced people into ever smaller parochial silos.

The internet economy enabled browsing on a global scale and, because no kind of agency could keep up with its development speed, pushed everything into an "on demand" culture - what evolved with it is a higher responsibility for consumers. I feel the state of the economy before the internet was on a scale of "everything is possible" (in the western hemisphere at least), only to discover years later that "our greed has consequences we just couldn't see before". Our cognitive load feels higher than ever just to cover the basics of survival and not get screwed over, or screw someone else over, constantly.

This is why they should teach history, geography and other cultures in schools.

I think systems thinking should become a thing in schools early on too. Every defined system is a mere conceptual view with artificial boundaries - a system without boundaries isn't a system, but a universe. It's good to enable new resources to think about new systems, or rethink existing ones, but we also should discuss the boundaries required for the concept to make any sense. I feel that this understanding sometimes gets lost, and its naivety gets regretfully relabeled as "innovation".

soco
1 replies
8h56m

In a MBA-led economy anything which doesn't create immediate monetary value gets pushed aside - humanities and education in general, for instance. And before somebody jumps: we did have innovation before this just as well.

nonrandomstring
0 replies
4h54m

MBA-led

There are no "leaders" with MBAs

nonrandomstring
0 replies
8h51m

we also should discuss the boundaries required for its concept to make any sense. I feel that this understanding gets sometimes lost, and its naivety gets regretfully relabeled as "innovation".

Good observation. It gets blamed on globalism, liberalism, post-modernism, capitalism, and a bunch of other things, but somehow we got the wrong idea that all boundaries are bad. Boundaries must be broken!

I think that came from the language of science and technology. We made "Smashing barriers!" a synonym for progress. But it is a childish, iconoclastic and directionless notion of "progress", for its own sake.

Psychologically at least, a lack of boundaries is a kind of madness, it's disinhibited, intrusive, lacking self-control - and to the extent there's a political theory of mind it leads to wars and internal unrest.

We raised "connectedness" to supernatural hoodoo, but connectedness for its own sake is a catastrophe for systems. It violates the principles of modularity, decoupling and appropriate cohesion.

nonrandomstring
0 replies
11h21m

Ouch. Thanks.

TheRealDunkirk
1 replies
5h17m

what happened to the joy of shaping the world?

We were vastly outnumbered and overwhelmed by sociopaths who exploited the tools to make our lives better by capturing everything of value in the market, and subsequently enshittifying it. Our cynicism has been well earned and deserved.

nonrandomstring
0 replies
4h58m

You misheard the reveille for the sound of the retreat. Can't wear cynicism on your chest with pride can you? Take back what's yours.

pjc50
6 replies
8h59m

Schneier used to talk about "the Exxon Valdez of privacy", the idea that there would be a single giant spill that had significantly bad effects enough that it would force change.

That has basically not happened. It sometimes seems that the situation has got worse in terms of public debate, due to the usual bad-faith actors. For example, the TikTok discussion is not framed around privacy in general but focuses on "China bad". With the implication that an algorithmic megacorp controlling political sentiment through feeds is completely fine so long as it's Americans doing it. And the voting security discussion: there were questions about voting machines long before 2020, but partisan attacks focused on discrediting valid results.

frapaconi
3 replies
5h6m

Yeah keep in mind that both political parties heavily use private data farms to narrowcast campaign advertising. Facebook is just one of those. Basically there’s a private sector spying industry and the government enjoys the benefits too.

TikTok is messed up for many reasons. One being it’s a waste of everyone’s time. Two, foreign nations getting a treasure trove of video footage on a massive populace in the age of AI. This WILL lead to a faked mass-casualty event using real identities of people to cause political unrest. Just a matter of time—and data.

Going back to the article though, the social media baseline needs to be fixed as well. The OG Facebook platform was actually GOOD—lots of good-natured value in human connections. Then they started cutting in the bad shit reducing its purity.

TikTok took that to a whole new level and put in some home-grown Singaporean crack. Ironically, China and Singapore have hefty penalties for drugs—and yet they thrust this onto western sensibilities. After all, it's the WILD WEST.

frapaconi
1 replies
2h30m

Got lots of down-votes. HN needs to implement a feature requiring a REASON for down votes.

solarpunk
0 replies
51m

you are basically making the same old "digital fentanyl" argument that nancy fucking pelosi has been making for the last year.

it's just not that compelling.

TeMPOraL
0 replies
4h43m

Ironically, China and Singapore have hefty penalties for drugs—and yet they thrust this onto western sensibilities. After all, it's the WILD WEST.

One almost wants to say, revenge for the opium.

tracker1
0 replies
4h6m

If a major credit agency leak didn't do it, I doubt anything will... at least in terms of a large-scale event. We went through the likes of Melissa and "ILOVEYOU" and several others around Y2K (and even Y2K itself). In the end, life goes on for the most part. What people don't see, they're oblivious to, and they have short-term memories on top.

throw4847285
0 replies
1h7m

a single giant spill that had significantly bad effects enough that it would force change.

That's the premise of the Brian K Vaughan/Marcos Martin comic "The Private Eye." Unfortunately, the premise is more of an aesthetic than anything else. Still, it's a beautiful aesthetic and a fun read.

Draiken
6 replies
6h18m

All of these systems you mentioned have already failed. We are simply very slow to accept it.

Even in the (almost) mainstream media with shows like John Oliver, you can see just how deep the degradation of all our regulatory entities really is.

In today's world, almost everything we purchase is pure garbage. The food is overly processed, loaded with chemicals and added sugar; the furniture is made from cheap materials and shoddily built; the buildings we live in were built cutting corners and skirting laws as much as possible; the software we use is mostly garbage thrown together to "ship fast and break things"; the doctors we go to are uninformed and trained to dismiss you as fast as possible; the car repair service wants to rip us off based on our lack of knowledge. I could go on forever here.

It's honestly a joke. We are all the proverbial boiling frogs and allowed everything around us to fall into ruin (and continue to do so).

This assumption you presented only works for the uninformed masses. If you investigate for 5 seconds into anything we purchase, we pierce the veil and see how it's mostly utter garbage. Most of these regulatory institutions became a facade where big corporations get a pass and we pretend they are doing a good job.

Regulators can't ever win this battle. Even if they aren't corrupt, they can't be insulated from the influence of big corporations, and they are, and will always be, underfunded, because regulation inevitably means lower profits. We can't have that, can we?

Aerroon
4 replies
4h34m

I think it's fitting that the parent poster used supermarkets. They're a relatively recent thing and often seen as lower quality. In my country it's still relatively common for people to forage for mushrooms and berries in the forests. There's no regulatory oversight over that, but people are still fine.

Supermarkets and this kind of regulation hasn't necessarily stood the test of time. It's hard to say if it will considering how every system that gets built seems to slowly get corrupted or bent in somebody's favor.

spdustin
1 replies
3h40m

The forests in which people forage for mushrooms and berries have no need to optimize for profit. The moment humans get involved in the supply chain, the need arises for some form of (idealized, for sure) oversight from agencies.

skybrian
0 replies
2h38m

Evolution results in other incentives. Beware the assumption that nature is somehow benevolent. It can go either way.

Some mushrooms are poisonous. They evolved to protect themselves from being eaten. On the other hand, some plants evolved so their fruit will be eaten and spread their seeds.

And then there are diseases and parasites.

skybrian
0 replies
2h37m

People are fine if they didn’t eat the wrong mushrooms.

ericjmorey
0 replies
3h14m

"People that didn't get seriously ill or die are fine" is a bad argument.

carlosjobim
0 replies
50m

In today's world, almost everything we purchase is pure garbage. The food is overly processed and with excessive use of chemicals and added sugar, the furniture is from cheap material and shoddily built, the building we live in was built cutting corners ...

Everything that existed in the past is still being made and is available for purchase. You just have to pay the price. If you are a commoner with a job, your real income is a tiny fraction of what a worker was paid in the past, thanks to inflation. So mass produced garbage is what is available within your budget.

More than half of the working age population in industrialized nations do not work (including many who are employed). A good part of the population have never worked a day in their life and will grow old and die without ever having done anything useful. Everybody else has to pay for their sustenance, and one way is the general decay you see.

djtango
2 replies
8h51m

You chose an analogy close to my heart! I have managed to convince myself that most "food" in the supermarket is inedible poison!

It's exhausting and a real nuisance to my quality of life but I equally refuse to knowingly consume excess additives unless completely in a pinch.

Needless to say I'm also very suspicious of online businesses. Although I'm actually getting a bit fatigued/defeatist about privacy issues. We're all so overwhelmingly in this ship that I don't know what I really stand to gain by constantly hamstringing myself digitally...

If we wake up in the worst case scenario, I'm sure I have enough of a footprint I wouldn't be able to meaningfully hide much from a determined bad actor...

kenjackson
0 replies
3h54m

It's exhausting and a real nuisance to my quality of life but I equally refuse to knowingly consume excess additives unless completely in a pinch.

Why not? Unless you have severe allergies the excess additives will probably do very little to your health. And the quality of life and exhaustion wins probably make it a net win for you. Is it a principled stand?

A4ET8a8uTh0
0 replies
7h20m

<< I don't know what I really stand to gain by constantly hamstringing myself digitally...

I hear that and I will admit that lately it does seem to get in the way a lot more often than expected ( my bank in my recent interaction removed branch connection for a specific application and moved everything to an online process, to which direct link is an ad tracker ). Still, wife now has gotten used to no ads ( almost ) anywhere and is starting to see what the benefits can be.

<< determined bad actor

Sadly, if I am targeted, there is no escape. I am privacy conscious, but I also have a family and try to live in a society in a sufficiently comfortable manner. It is a tough balancing act. In a sense, it is not that different from guerrilla warfare. There is no defense against sufficiently motivated entity that is not exhausting. My only solace is that I am a low value target so it does not seem that likely. Still, what are the odds of me pissing off someone sufficiently important? Non-zero for certain.

benterix
2 replies
9h31m

The implicit assumption central to this way of organising the economy is that anything legally on sale is "safe". That it has been checked and approved by experts

This applies to certain products only, and even in the EU, where these laws are stricter, some products are regularly withdrawn from sale (mostly for quality issues, not intentional actions), even as far as drugs are concerned. It's one of the reasons I tend to buy mostly fresh products and from bio shops - to increase my chances.

soco
0 replies
9h7m

I like the way you put it, "to increase my chances", because I've heard more than once from adepts of bio products promoting them as if they were the holy grail. Bio producers can make mistakes too; they can be negligent or even nasty too. It's just that chances are higher with them of getting better-quality produce. If you can afford it, that is.

TeMPOraL
0 replies
4h33m

There are a lot of products. Mistakes are a statistical certainty at this scale, and regular recalls are a sign the system is working, at least to an extent (I'd say it works quite well, safety-wise).

Interesting choice wrt. bio foods. I'm the opposite: I don't trust organic/bio food at all. There are a lot of vanity products clearly meant to make you pay more to feel you're choosing a healthier alternative. Plus, I trust the established industrial processes more - they're thoroughly regulated and tested. A devil I know. Bio/organic stuff, who knows what they're doing - and what poisons they're spraying on their produce that let them keep the "organic" label but are otherwise more toxic than industrial pesticides.

guardiang
0 replies
10m

It's way past time that consumer-first became modus operandi for Tech. LLMs (GPT-4 quality or better) have the potential to enable that future. It'll take an avalanche of high quality products coming out of small-footprint companies. Each company utilizing LLMs to shore up any limitations due to their size, and every one of them having a consumer-first mindset from the get-go. It can be done.

ericjmorey
0 replies
3h6m

I think you're romanticizing the past that brought us to the current state of consumer markets. Until the 1930s in the US, the primary guiding principle was caveat emptor and it took a lot of regulations to change expectations.

ProdSecBurner
0 replies
5h21m

This is the liberal viewpoint (I subscribe to it too). The conservative viewpoint is to mostly do what our ancestors did (given it worked for many generations). My problem with this approach is that it limits progress and favours folks who are already doing well.

Citizen_Lame
0 replies
8h20m

Regulatory standards have not drifted. They have been captured by companies as big as some countries. The entire political and social system has been slowly eroding, reverting back to Charles Dickens's times. We are just living in the times where the cracks are visible enough that we can clearly see them.

Tech bros never faced the same scrutiny as regular industries do (or at least there were no consequences).

1vuio0pswjnm7
0 replies
7h48m

The article referred to the beliefs of fisheries scientists and what the authors have suggested are their analogous counterparts today in the "security community". It is not focused on the beliefs of computer users.

For many years on HN I have seen comments that constantly try to shift the focus away from articles like these and place blame on computer users. This is old hat. We are past that nonsense.

This article is addressed to the "security community". But some HN commenter is trying once again to shift the focus to computer users, making sweeping, generalised, incorrect, and ridiculous assumptions about them.

There is a commercial motive for destroying privacy and collecting data. Computer users are not voluntarily "giving up on privacy", or "trading away their data", with informed consent, to enrich software developers. It is being taken, without notice, often unbeknownst to the victims. There is no "contract".

Regulation will continue to gain momentum as it is obvious to anyone outside of the so-called "tech" companies and their supporters that this is a bad deal for society.

The problem is not the behaviour of the victims, it is the behaviour of the perpetrators.

whoknowsidont
36 replies
15h38m

I agree. Philosophically, politically and financially (in terms of groups I donate to).

But it's too late. The frame shift is too great for most people. Getting people to care about this let alone vote in a manner that would cause actual change is so far out I have a better chance of winning the lottery.

I don't know what to do other than to support and educate where I can.

I do hope I'm wrong.

EGreg
17 replies
15h23m

Vegans can talk all they want

But the Impossible Burger did more than talking ever did

Government tried antitrust for decades w telcos

But the costs of long distance calls dropped to practically zero when open VoIP protocols came out

The Web disrupted AOL, MSN, CompuServe, TV, Newspapers, Magazines etc.

Wikipedia disrupted Britannica and Encarta

The answer: open source must get good enough that people can switch from big tech corporation-controlled solutions and then they will be in control of their own software !! :)

kbenson
11 replies
14h54m

Caring about privacy has always been a rich person's game, since the poor have little to pay with other than their data. That's worth keeping in mind.

mindslight
5 replies
14h35m

But the products we're talking about "paying" for here are continually dropping in price - email, media sharing, social networking, etc.

A basic cloud host that can take care of a whole family can be had for $5/mo. Perhaps that's still too much for your exemplar poor to spend on a hope. But the amount extracted by the surveillance industry is always increasing, and eventually that levee will break.

I'd say there are much bigger impediments - needing to invest time for setup, needing to trust that a given solution won't turn out to be a lemon or even backfire, and a lack of straightforward, packageable solutions to many of the technical problems.

kavalg
4 replies
13h42m

It's not the cloud cost that is stopping people from doing this. It is the installation and administration cost. That requires time and expertise that most either don't have or are not willing to spend.

mindslight
3 replies
12h51m

"Poor" on its own generally refers to monetary cost, which is what I was responding to.

I agree with you. Time poor is a definite problem, even (or especially) among people that aren't money poor, and it comes with its own Vimes's Boots analogy. For example, the many articles we read here about the centralized service of the week creating some kind of problem for their users, which could have been dodged with a little outlay of time to make better tech choices in the first place.

kbenson
2 replies
2h33m

Poor, as I intended it, was meant to encompass many things. Working poor people have less money, but they also often have less free time because they have less money. If you have plenty of money you can shop around for jobs that fit your schedule better, or, if your job allows it, take time off, or outsource some work you would otherwise have to do yourself (housecleaning, maintenance, landscaping, child care when you're unavailable, cooking, etc.). Additionally, if you're in a family, there's a higher chance that both parents will need to work full time, meaning there's less free time for all the things mentioned above - whereas if one parent worked part time or not at all, a lot of those things could be done while the other is working, leading to more free time in the evening for both.

Given that, who's more likely to look into free alternatives to apparently free services online? The middle-class person with a spouse who works part time, takes care of many of the chores, and pays a handyman, contractor, or repairman to fix appliances and household problems? Or the poor person who works full time, whose spouse also works full time, and who, when they're done working, is busy doing the chores that life requires because paying someone else is not feasible for them?

Time is money, and the working poor have neither.

mindslight
1 replies
1h3m

It feels like you're stretching to fit this into some memetic narrative of "poor", when most people are time poor until you get to the upper class. While middle-class people are more likely to pay for things that need skilled repair (due to knowing fewer people who can do such work informally), they're not paying for things they can do themselves, like housecleaning, landscaping, etc. - paying someone else for large chunks of time is an indicator of being upper class.

The point isn't to discount anyone's struggle, but rather to look at the actual mechanisms that hinder adoption of libre software. And apart from some relatively affordable table stakes, I don't think the financial cost is really one of those. The attention cost of self-actualizing and the ambiguity of using a non-advertised solution are though.

kbenson
0 replies
35m

While middle class people are more likely to pay for things that need skilled repair (due to knowing fewer people who can do such work informally), they're not paying for things they can do themselves like housecleaning, landscaping, etc

It's not that they're paying for all of them, but that they have the option of paying for them and I would hazard that a good portion of middle class families with both parents fully employed opt for offloading at least some of that some of the time if they live in a portion of the country where middle class means you actually have disposable income. E.g. many people pay for larger landscaping projects, or hire a handyman to do cleanup around the outside of the house or even might have a maid come in once a month. This might be less common now that the middle class has eroded to some degree, but I think that's a problem of a shrinking middle class, not of those being things the middle class doesn't do.

And apart from some relatively affordable table stakes, I don't think the financial cost is really one of those.

I'm not sure I agree with that. Having spare hardware to run something, a stable place to put it, and paying for the power it uses (a minor but increasing cost) all play a part.

The attention cost of self-actualizing and the ambiguity of using a non-advertised solution are though.

I agree, and I think the working poor have less attention to spare because of less free time, but also that they generally have more and larger worries that make this problem seem insignificant by comparison. If your major worries are making rent, having enough money for food, repairing your car so you can get to work and take the kids to school, spending quality time with your family, and setting up a solution so that your privacy is protected from companies that want to monetize you, which one will get the least attention? Middle-class families probably have at least one fewer of those concerns, and every major concern that's more important than using free software to avoid a small but persistent exploitation is something that competes for attention.

huygens6363
3 replies
13h43m

The “poor” seem to scrape by just enough to get access to Netflix, sports.. I don’t think this is a money issue.

jononor
2 replies
9h28m

Having to choose between those things and privacy is definitely a money issue! It is not surprising that privacy loses in that case - it is opening up another huge can of worries. Entertainment is a source of relaxation and de-stressing, as well as social connection (notice how much people talk about and group around the topics you mention).

huygens6363
1 replies
6h37m

Having to choose between what is actually good for you and what is “relaxing” and then deciding to divert all available funds to “relaxation” is not a money issue.

jononor
0 replies
1h17m

I think privacy is very important. But a social network is definitely more so! And many have that via sports, for example. And if someone thinks it best to spend money on Netflix over Fastmail, I definitely am not gonna shame them and say "that is not good for you"! Especially not if they are poor.

freehorse
0 replies
11h27m

The only reason that privacy issues exist online in the first place is the advertising business. The advertising business increases the costs of everything. So, enjoy your "free youtube videos" that contain advertisements, for which you pay through the products you buy that are advertised. Privacy being "rich people's game" is BS because essentially everybody pays the advertising costs in a rather horizontal fashion.

scarface_74
2 replies
14h45m

And if you want to be noticed on the web, you have to go through Google. If you want to sell anything you either have to go through Facebook or Amazon.

The web is “open”. But Google and to a lesser extent Apple control the browsers to access it and they can control the platforms.

Meet the new boss…

jononor
1 replies
9h22m

You do not have to. But there is a considerable cost to shying away from the dominant actors, adding on top of the existing difficulties... But it probably can be done. It would be very interesting, especially in the HN community, to build some entrepreneurial practice around "succeeding without surveillance-capitalist organizations", as a parallel to the "succeeding without venture capital" practices (bootstrapping, tiny teams) that have already been building up over the last few years. We can be the frontrunners.

scarface_74
0 replies
4h16m

And how will your alternative get users without advertising? Where are you going to advertise if not Google and Facebook?

Ben Thompson talks about “Aggregation theory” all of the time. The companies that have the power have that power because that’s where the users are.

How do you compete with “free with advertising”?

We talk about how the web killed TV networks. But streaming services are making deals with cable companies to bundle their ad supported tier with regular TV and with their internet only packages. Oh and the largest cable company owns NBC.

When “cord cutters” get rid of pay TV and use YouTube TV where most of the money still goes back to the same providers - how is this any different?

wdh505
0 replies
15h7m

I love this answer. I don't have a large enough graphics card to host a local open-source GPT-4o yet. I like seeing the options grow though.

reverius42
0 replies
12h12m

2024 will be the year of the Linux Desktop.

poikroequ
11 replies
15h10m

I've seen first hand just how little the average person cares about privacy. The type of people who believe they have nothing to hide. My friends and family don't understand why I don't post everything to social media. No I'm not going to install this app and scan all my receipts. BuT wHy?!?!?!

I think many people here on hackernews live in a bubble and just don't understand what is an average human being. They surround themselves with like-minded tech savvy individuals and fail to comprehend how there isn't stronger support for privacy.

teh_infallible
3 replies
14h42m

I suspect that people care about privacy more than we think, but the technology feels overwhelming to them, so they bury their heads in the sand.

I don’t even browse in “private mode.” Not because I don’t care, but because I assume it won’t really change anything.

hierophantic
0 replies
7h4m

I just don't agree at all.

My experience is that people not only do not care about privacy but look at people who care about privacy as having something to hide or paranoid about something that simply doesn't matter.

TeMPOraL
0 replies
14h17m

"Private mode" is not private, and it even says so in the browser. The nickname "porn mode" is a much more accurate term.

8372049
0 replies
9h43m

Private mode has very little impact on what you share with the world/through the network. It's there to keep your browsing private from people you live with etc., not from anyone else.

yazzku
1 replies
14h14m

Disagree with the latter part. You're also not going to have a large effect on those people, but that doesn't mean you should give up on what you believe just because other people can't even bother with it. I stick to my preferences of not using corporate stuff very much. If somebody wonders why and asks, I'll explain. Whether they then decide to do anything about the answer, I don't really care. If they do, then good for them.

What's the alternative anyway? Give up and suck Google's tit every day? It's "fine" if people who don't have the background to understand this issue do so. But if you, having the background and belief to object to such corporate control still choose to look away and do so, then you have willingly demoted yourself to a pathetic Google tit sucker.

poikroequ
0 replies
13h16m

I personally care very much about my privacy, but people are people, and most people only care about what directly and immediately affects them. That's the problem with privacy concerns: the negative effects are subtle and gradual. It's very hard to make people care for that reason.

It's the same reason why it's taken decades to take serious action against climate change. It's only in recent years with extreme weather events and record breaking temperatures that people are finally starting to care because they're finally experiencing the negative effects of climate change. But only to a certain extent, you're still not going to see many people trading in their vehicles for ebikes anytime soon.

pennybanks
1 replies
14h43m

I agree. People just don't care. And if everyone did, we would be paying cash for the online products we use for free.

bn-l
0 replies
10h37m

It’s the “free” price. I used to think it was great! Now it always makes me uneasy.

renonce
0 replies
14h57m

You don’t want to be responsible for some random receipt 10 years ago that allegedly commits tax fraud, allow influencers to manipulate you into buying things or supporting campaigns that you immediately regret 1 minute later, receive lots of spams in your email and social accounts, etc, and the stakes get higher as you climb up the social ladder. Many of these don’t matter for people with little stakes as they have nothing to lose.

kbenson
0 replies
14h57m

I think many people here on hackernews live in a bubble and just don't understand what is an average human being.

Plenty of us understand, but it's depressing and usually trying to explain to others that have a rosier view of how people will react feels a lot like trying to get the people themselves to care, which is to say it's hard, thankless, and depressing, so few people bother.

blitzar
0 replies
8h43m

I think many people here on hackernews live in a bubble

and create apps to scan all your receipts, your retinas and then track your movements and sell the data to the highest and the lowest bidder in exchange for a slightly newer Maserati.

JumpCrisscross
3 replies
14h40m

it's too late. The frame shift is too great for most people

The capacity of the current crop of privacy advocates to make a cogent case is lacking. Peoples’ capacity to recognise and rearrange themselves in defence of a novel threat is not. If anything, our present malaise is one of allergic reactions to phantom threats.

whoknowsidont
2 replies
13h57m

Peoples’ capacity to recognise and rearrange themselves in defence of a novel threat is not.

Yet the two major topical events one could easily point to escape you.

JumpCrisscross
1 replies
10h6m

Yet the two major topical events one could easily point to escape you

Sure, as with you the trend line.

Your wit would be stronger with substance. Which events do you cite, and how?

whoknowsidont
0 replies
2h19m

Your wit would be stronger with substance

It would be wasted on you, as you are clearly acting dishonestly.

pennybanks
1 replies
14h45m

Is it such a problem if people don't care? It's not like they aren't aware.

I mean, it's not too complicated to get to a desired privacy standard for the most part, so it seems like the privacy-conscious in general get angry on behalf of other people, who don't even care.

It's because of those people that we can even use these products for free. I mean, they aren't making ad revenue off of us...

myaccountonhn
0 replies
5h33m

I think people do care, but are either not empowered or don't really know how/why it happens.

Many times, I’ll discuss the issue with less tech-literate people, and when they learn about all the spying that’s going on, they feel really bothered by it, but not empowered enough to do anything about it.

IMO it’s a bit like organic, unprocessed and local food. My personal experience is that I only started caring about it after really learning about agriculture.

ekianjo
10 replies
15h32m

Have people (except a minority) actually shown that they care about privacy? I fail to see any large movement that is organized to fight back, so the logical conclusion is that most people don't care.

karlgkk
6 replies
15h19m

Privacy is a major factor in Apple's ad campaigns. They do a pretty good job and offer fairly strict options… compared to Microsoft or Google. (Hey, the bar is low!)

A lot of people care, enough that Apple thinks running major ad campaigns on the issue will move the needle

okdood64
3 replies
14h53m

I guarantee you that if they stopped taking privacy seriously and had serious public missteps in it, it would barely affect their sales.

bee_rider
2 replies
14h8m

If that is the case, what are they, ideological privacy adherents? Just dumb or fond of wasting money? I’m beginning to wonder how much this guarantee by a random internet person is even worth!

shiroiushi
1 replies
13h25m

Apple can use it in their marketing to try to convince some Apple holdouts to switch. The people who already have Apples will tout privacy to their non-Apple friends because they heard about it from Apple. Basically, it doesn't hurt, and can only help, and it sounds better than many other things they could tout as features. The most important thing in their ads isn't exactly what they're touting (privacy features), but that they're keeping themselves in peoples' minds and making them feel good somehow.

bee_rider
0 replies
6h35m

Sure, the comment I replied to said it would barely affect sales. If you are saying that it is just one feature among many, I agree.

I mean, no successful computer system is sold based on just one single feature. So, I suppose skimping on this particular feature out wouldn’t kill Apple. But I think they’ve identified somewhere where they have a fundamental advantage where their main competition, Google, an ad company, has trouble responding.

The most important thing in their ads isn't exactly what they're touting (privacy features), but that they're keeping themselves in peoples' minds and making them feel good somehow.

I don’t really see the difference… I mean, making people feel good about their products and buy them is the intended outcome of most companies’ moves. The features are how they do that. They can’t be ranked in terms of importance; one is the goal, the other is the means.

nomercy400
0 replies
11h6m

Caring about privacy is like caring about sustainability.

Great for ad campaigns, but when push comes to shove it always becomes a secondary or tertiary concern.

dorkwood
0 replies
14h49m

They do that to cover their own asses, not because it sells more phones.

marginalia_nu
0 replies
15h1m

How do you propose someone would fight back? What would such a movement do? How would it, well, move? It's not like Microsoft and the like are subject to the whims of popular opinion.

8372049
0 replies
9h27m

Well, there's the EU...

isodev
8 replies
13h55m

one that respects people’s privacy rights while also allowing companies to recoup costs for services they provide

Big tech is a bit like tobacco companies at some point - we’ve convinced people and ourselves that certain things are good, even healthy while they’re actually the opposite.

We, the people who build and maintain these services, are in a position to offer pushback to ensure that humans and their privacy rank higher than any other concern.

shiroiushi
5 replies
13h31m

We, the people who build and maintain these services, are in a position to...

Speak for yourself. Most people in tech do not work for Google or Microsoft. Working in tech alone doesn't give you some kind of power to offer pushback in these tech companies, just like being a random worker in the agriculture sector doesn't give you power to offer pushback against the tobacco companies.

iamkonstantin
1 replies
13h26m

I think the op means that this is a mindset to apply on any kind of software or tech project. Think apps supported by ads or deploying tracking toolkits by default. I was looking for a blog template the other day - so many come loaded with Google analytics and silly cookie prompts for no good reason. These are things we can influence.

shortrounddev2
0 replies
4h59m

If the boss says we're adding third party tracking, then that's what's happening. Software engineers have little to no control over the product. The only power I have is to quit my job

gigel82
1 replies
11h34m

And you think people that work at Google or Microsoft have power to pushback against the shit their employers do? LOL

shiroiushi
0 replies
10h0m

I never said they did, but the OP clearly thought so. I'm just pointing out that most of us don't work there, so we don't have whatever power the OP thinks we have.

hereme888
0 replies
3h15m

I think "most people in tech" have the choice to not use Big Tech spying services that keep gathering the data.

But how common is that choice?

behnamoh
1 replies
13h8m

we, the people

Unfortunately, most engineers I've seen (I come from an engineering background) care mostly about technical stuff and lack the EQ to see the bigger picture.

Longhanks
0 replies
11h40m

Sure, because any attempt to be morally right instead of pleasing the shareholders is struck down by management. You might even risk penalizing your future career.

Most engineers are engineers, not company politicians. That doesn’t mean they don’t see the bigger picture; it’s just not worth dealing with the corporate bullshit (or the risk that comes with it).

Narhem
8 replies
14h6m

What about Apple illegally going through my MacBook then my Linux laptop.

Forget these legal grey areas. These organizations actively deal with corporate espionage and no one does anything about it.

shiroiushi
6 replies
13h28m

How exactly did Apple go through your Linux laptop (assuming you didn't physically hand it to them)?

Narhem
5 replies
12h39m

They gave me extremely specific jobs after so I’m not gonna sue them.

I was working on a specific thing in university so without illegally probing my devices they probably wouldn’t have found a good job match.

The related item was on my Linux laptop though. So extremely illegal regardless.

garbagewoman
3 replies
12h9m

You became an Apple employee after you found out they had hacked your computer?

nehal3m
2 replies
12h1m

Looking at their post history they seem a little paranoid. Apparently Reddit bans their accounts as soon as they make them, too.

immibis
1 replies
3h45m

Reddit does this to my accounts too. That's not paranoia. Somehow, the only accounts that stick are the ones where I just log out of my old account and don't clear my cookies.

You can see if it happens, because after your first two comments (always exactly two) you can go to your profile page in a logged-out browser and get a 404. This is a standard Reddit shadow-ban.

An easy way to demonstrate it is to create an account via Tor Browser, without a verified email address.
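The logged-out 404 check described above can be sketched as a small script. Note this is an illustration of the commenter's described heuristic, not an official Reddit API; the `old.reddit.com` URL pattern and the status-code interpretation are assumptions:

```python
import urllib.request
import urllib.error

def profile_status(username: str) -> int:
    """Fetch a user's profile page with no cookies (i.e. logged out)
    and return the HTTP status code."""
    url = f"https://old.reddit.com/user/{username}"
    req = urllib.request.Request(url, headers={"User-Agent": "shadowban-check/0.1"})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # urlopen raises on 4xx/5xx; the status code is on the exception
        return e.code

def looks_shadow_banned(status: int) -> bool:
    """Per the heuristic above: a 404 on a logged-out profile view is the
    classic shadow-ban signature; 200 means the profile is publicly visible."""
    return status == 404
```

Usage would be `looks_shadow_banned(profile_status("some_user"))`, checked from a browser session or IP with no login state so that the logged-out view is what's actually tested.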

nehal3m
0 replies
2h37m

I stand corrected. Thank you.

8372049
0 replies
9h32m

Occam's razor is probably a good match for this story.

cryptonym
0 replies
10h12m

Throwing out such a statement without a shred of proof is ridiculous.

Even *if* that happened, as long as you provide nothing tangible, you should keep it to yourself.

Barrin92
6 replies
14h55m

Genuinely odd article. I mean, the observation is right, there's a baseline shift in privacy, but that's just... civilizational development. Modern technology, like virtually anything in modernity, means you share data, at least when it's used collectively. Unlike ecological decline, this is usually a voluntary trade-off and not obviously bad. You can go and live off the grid. People make these choices because they don't value privacy as much as they value other things.

Calling this espionage is just a misuse of language, because espionage is done with malicious intent and without the knowledge of the subject.

There's never a real argument in these articles about privacy that addresses the fact that most users, not because they're all trapped in some false consciousness, accept tools that give them some safety in exchange for privacy. If you ask people, do you want Microsoft to scan your stuff if it finds some malware most will just say yes. None of this is a scandal to the majority of users. Which is why there's never any outrage.

henriquez
2 replies
14h50m

Espionage is not necessarily done with malicious intent if it’s for the good of the country, right? Doesn’t justify trampling peoples’ human rights in any particular instance tho

lazide
1 replies
14h10m

Hah, plenty of invasions (including the current Ukrainian one) are done under the auspices of ‘freeing’/‘liberating’ the invaded. Plenty of examples.

And COINTELPRO and MKULTRA were supposedly for the good of the country/public.

Always be skeptical of anything being justified as ‘for the greater good’.

henriquez
0 replies
5h30m

MKULTRA was some overprivileged Ivy League frat douchebags with security clearances playing God with drugs and prostitutes, not really espionage in my book.

Espionage against citizens in violation of the constitution is immoral, but spying on other countries is fair game. It’s a huge part of U.S. military dominance.

8372049
1 replies
9h18m

Calling this espionage is just a misuse of language because espionage is done with malicious and intent and without knowledge of the subject.

State-based intelligence is not "malicious", and so you are misusing language yourself. State-based intelligence can be roughly divided into three sectors: Civilian, military and counter-terrorism. In peacetime, strategic civilian intelligence is by far the biggest, and is used to back trade agreements, various political decisions and so on. It's simply about preserving the country's interests.

With "malicious" out of the way: Most technological espionage is absolutely done with intent and without the knowledge or at least informed understanding of most subjects.

Hizonner
0 replies
5h36m

It's simply about preserving the country's interests.

... and damage the interests of those being spied on. Which is the whole reason they didn't just offer up the information to begin with.

If I steal your car to "preserve my interests", it's still malicious.

Hizonner
0 replies
5h38m

Modern technology just like virtually anything in modernity, at least when used collectively means you share data.

A whole lot of "modern technology" has been deliberately architected to "need" more sharing than is actually necessary to achieve its function.

oefrha
3 replies
14h39m

Snowden has shown us that Microsoft and every other big tech will happily give NSA the keys. Given that OpenAI is fully beholden to Microsoft both in terms of ownership and compute, I have to assume NSA gets whatever they want from OpenAI, directly or through Microsoft.

Now, apps and services are integrating AI like crazy, penetrating just about every type of information, and much of these integrations are sending data to OpenAI, voluntarily. People are sending their private thoughts to OpenAI as they brainstorm, as they write, before fully fledged ideas are even formed. NSA must be enjoying this transparency now.

blitzar
0 replies
8h37m

Snowden has shown us that nobody cares.

The disclosures of Snowden should have brought about profound change in this entire debate (> 10 years ago).

Instead he is a refugee with not much more to do than to shill crypto.

benreesman
0 replies
14h29m

These concerns are doubtlessly well-founded based on disclosures, particularly the Snowden disclosures.

But as for infinite data and definitely smart enough people to have ten SOTA LLMs in flight?

I don’t think TAO/EG needs any help from fucking Microsoft.

TiredOfLife
0 replies
7h51m

Snowden showed only what his handlers told him to show.

kisamoto
3 replies
12h50m

People are individuals; some really don't mind offering up their privacy for lower costs (their data is mined and doesn't belong to them) or additional features.

That doesn't mean it should be the norm, though. If enough people actually care about their privacy, there is room in the market for both cheap, privacy-invasive products and expensive, privacy-respecting ones.

Shameless plug: I launched Cognos[0] a few days ago that encrypts your AI Prompts and Outputs. No risk of leaks/hacks/being used for training.

If you like generative AI but are worried about putting all your data into ChatGPT and don't know how/want to run inference infrastructure yourself it could be something for you.

- [0] https://cognos.io/cognos-beta-is-live/

oefrha
2 replies
7h7m

I had a look at your offering, and unfortunately I’m not convinced. Obviously you need to send plain text to the actual AI model in use, which is confirmed by

Sending a message involves transmitting the message to the server in plaintext over TLS encrypted connections. Your plaintext message is encrypted with the conversation public key and saved before being forwarded to your AI model of choice, again in plaintext over a TLS encrypted connection. When the AI model has generated a response, this is returned to our server. This response is also encrypted with the conversation public key and saved before being forwarded back to you.

The model operator still gets everything, just not trivially traced back to the user and not collected in one place; in addition you get everything and there’s no way to verify you’re not inspecting and/or storing clear text user info, or hacked to do so. It’s basically a pinky promise from an unknown entity (no offense but you can see that from the user point of view) whose main value proposition is that pinky promise.

kisamoto
1 replies
6h2m

Thanks - I appreciate the honest words and no offense taken.

Yes I try to make it as obvious as possible that this is not end-to-end encrypted. Using provider APIs over their products (e.g. ChatGPT) does already offer some privacy benefits and I do give the user transparent choice over where to send their prompt.

But I am (currently) an unknown entity. How could I improve my offering to build up that trust?

I'm happy to even have a video call to answer any questions if that would help :) I see myself as offering a similar service to ProtonMail so perhaps I also need to look at how they built up trust in the early days too.

oefrha
0 replies
1h27m

Thanks for the candid response.

How could I improve my offering to build up that trust?

I don’t know, it’s a hard bootstrap problem. With ProtonMail, even if the privacy promise isn’t upheld, at least they’re another email provider, and ProtonMail-to-ProtonMail emails probably don’t end up with Google, so it could still be slightly better. Here your service is a middle man, so if the privacy promise isn’t upheld, it’s sort of strictly worse.

Maybe you can play up the multi-provider aspect? But that market has fierce competition from existing, often deep-pocketed alternatives, e.g. Quora’s Poe. They’re more convenient too since they don’t try to be private.

rokizero
2 replies
12h0m

It would have been fair if the author had mentioned that Microsoft is very open about this. I went to a Microsoft training where the instructor also made clear that the service is being monitored. Suspicious messages get flagged and are reviewed under a four-eyes principle. [1]

At least in the EU (I'm told), they are required by law to keep logs to ensure that their AI services are not being used for harm.

I'm glad that their system worked.

[1] https://learn.microsoft.com/en-us/legal/cognitive-services/o...

Hizonner
0 replies
5h42m

Very impressive. How many actual users waste days of their lives in Microsoft training to get basic information?

And, no, burying it in the ToS isn't being "very open" either. "Very open" would be putting a big visible banner on every page of the UI.

... and the abuse still wouldn't be acceptable even if Microsoft actually were being forthcoming about it.

mFixman
2 replies
10h3m

Dumb question: but how is this different from Apple capturing all your files in Time Machine for "convenience reasons"?

Both programs are a terrible invasion of privacy installed in all of their computers, but the backlash against Microsoft seems much larger than the one against Apple.

lutrinus
1 replies
9h47m

As far as I'm aware, Time Machine is a local backup, which means the data never even reaches Apple's servers.

Additionally, you can and should encrypt your data, which Apple allows you to do using FileVault.

Again, the encryption key never even reaches Apple's servers unless you use the iCloud Keychain. And even if you do store your key there, Apple wouldn't have the key for your iCloud Keychain and thus couldn't do anything with that data.

mFixman
0 replies
6h6m

It's still a privacy nightmare. Your encryption key "never reaches Apple's servers" until the company that built your computer and signs your safe executables force-installs some binary that gives them data about every single backup you ever made.

Microsoft is not breaking new ground in killing your privacy. Customers already don't know or don't care that $bigcorp is watching everything you do and choose to give them all their data.

henriquez
2 replies
14h52m

Boiling frog metaphor is much more emotionally compelling for the same argument.

nehal3m
0 replies
11h56m

Yeah and laughter is not actual medicine. The metaphor works though even if it's based on a myth.

submeta
1 replies
12h45m

Those who play this down say: "Most of us have nothing to hide." See where our western democracies are heading, with a potential Trump as the next president; with a Germany attacking, silencing, and cancelling anyone who opposes the slaughter in Gaza; with professors being cancelled for liking a tweet. Truth and thought police are coming to the western world, with all the data they need to prosecute anyone deviating from the officially accepted opinion.

8372049
0 replies
9h28m

Yes, but it's also more nuanced than that. Russia is actively feeding these trends to undermine and destabilize western countries, for example. The internet is a tool for this as much as a cause of it.

hiddencost
1 replies
15h38m

Why would anyone think that MSFT was doing anything other than everything they were allowed under the law?

INGSOCIALITE
0 replies
15h35m

...and more. because they work directly with/for "the law" - and have enough money to pay any fine that could possibly be levied on them

ein0p
1 replies
14h51m

Did Microsoft ever explain how “generative AI” was used for these “attacks”? Because I smell bullshit and I’ve yet to see anything concrete.

wmf
0 replies
13h42m

It could be something as simple as using Copilot to write code for malware. It's like saying criminals use roads or electricity.

codr7
1 replies
5h55m

If they want to get rid of Windows completely, they're doing a pretty good job.

It's turning into a total disaster.

And there are viable options these days...

immibis
0 replies
3h47m

Not really. There's Apple (less spying, more lock-in) and there's Linux (build-your-own operating system construction kit for DIY-loving folks. The prefabs aren't good enough that you can ignore what's underneath.).

abpavel
1 replies
14h44m

I suggest reading "Bastard Operator from Hell". Historically, ISPs could read all email, see all AOL chats and all search queries. They could see you entering your payment information. If we go by the fisheries example, there was no privacy in the 90s. It's a reasonable expectation that you trust someone with your information. I feel like the author falls into an imaginary-world fallacy, where people don't know that meat comes from killed animals, or that cooking is not normally done in latex gloves.

wuiheerfoj
0 replies
14h37m

They _could_ see those things, but they didn’t do anything with them at the scale that’s happening now

Edit: in fact, before widespread TLS practically everybody could see them…

victor22
0 replies
14h14m

Step 1 - ban the countries you want banned

step 2 - blame something vaguely coherent

step 3 - ?????

step 4 - profit

user90131313
0 replies
15h9m

I personally want to say thanks to the productive engineers, managers, and everyone who worked on these features in SV, in the USA, and all over the world, making these companies happen. Without them we might have a different reality today, but hey, they were just following the managers and never had a chance to stop this. How could they know? :(( Truly sorry for them, while they are mostly worth a lot of money and enjoy having these spying giants. True work of art. One of the most beneficial works to society.

Next round is AI killer robots, and it's already happening with AI people. After it's done, each engineer will be worth at least $10 million, and we will have great killing robots choosing people based on algos. It will be the algo that kills people, not them, for sure. :) Can't wait for that future. Great dystopia we build here. Thank you, AI people, too.

thund
0 replies
13h19m

It’s called monitoring and it’s all automated. When something triggers then of course someone is expected to investigate. It’s how you run a service responsibly. All big corps do that and things would be much worse otherwise.

sharpshadow
0 replies
1h0m

The analogy with the sea is great also in territorial questions, international waters and piracy of course.

oglop
0 replies
4h1m

Good thing that we saved all those fish since learning about relative baselines…

A very nothing article. It amazes me people think some private company owes them anything like privacy or free speech. That’s not how that relationship works. It’s an exchange relationship. Transactional. Not an ethical relationship. The only motivation for ethics here is PR. That’s the best you can do.

notarobot123
0 replies
8h44m

we need to step back and look at what a healthy technological ecosystem would look like: one that respects people’s privacy rights while also allowing companies to recoup costs for services they provide.

User-community owned services/infrastructure could reasonably be the way towards services that run well, recoup costs and prevent abuses (both of the service and of users). Are there any examples of this kind of thing for AI yet?

maverick74
0 replies
6h58m

The biggest problem is just one:

Nobody Cares

(at least 95% does not)

We can say M$ tried to fight the spy-system (you can see that with the Scroogled campaign) but learnt that - say - 95% didn't care...

so why couldn't they get rich the way Google and Facebook were getting?!?!

Problem solved!

everyone is eating what they ordered.

Bon Appetite

mark_l_watson
0 replies
5h16m

Personally, I think I lose a lot of productivity chasing privacy. A few examples:

I like to use Proton Mail even though other options have more features and convenience. I strongly favor running LLMs locally using Ollama, even though APIs like those from Groq, Mistral, OpenAI, etc. are more convenient if I have an Internet connection. I enjoy and find Duckduckgo useful and pleasant to use, but Google and Bing are flashier. I also run in Apple’s Lockdown mode, and prefer private browser tabs even though there is a delay getting auto-logged in.

If you have time, books like Privacy is Power and Surveillance Capitalism are well worth reading.

I don’t advise non-technical friends and family to go down the privacy rabbit hole, but I personally choose to do so.

junto
0 replies
11h51m

In other news, two thieves used a Volkswagen Golf to flee the scene of a robbery yesterday along the A12 motorway. Both Volkswagen and the builder of the A12 motorway will be prosecuted for supporting this criminality.

figassis
0 replies
13h16m

shifting baseline syndrome - didn’t know there was actually a term for it. I just call it generational amnesia.

ffhhj
0 replies
2h49m

1980's: we all wanted personal computers

1990's: we all wanted the Internet

2000's: we all wanted social networks

2010's: we all wanted mobile devices

2020's: we just want to get rid of shitty search and AI surveillance

dbttdft
0 replies
6h28m

we need to step back and look at what a healthy technological ecosystem would look like

...aaand as usual, mr tech commentator was right up until this point. There doesn't need to be a balance. People are always talking about "balance". What balance? TV/radio didn't spy on you to operate a profitable business in the broadcast days. I can do anything on an Amiga computer, with probably not-bad UI compared to the latest versions of Windows, and it will never have to phone home for anything. This opinion itself, ironically, is just a shifting baseline. You are talking about "balance" (translation: compromises) because the 3 maintained pieces of software for your domain (such as a camera in your house) are by two scum corporations and the 3rd is some garbage-quality open source software. Nothing stops someone from making actually good software/hardware, closed or open.

crimsoneer
0 replies
10h32m

Saying something is a scandal in a headline doesn't actually make it a scandal.

ajb
0 replies
7h40m

A lot of this is due to the failure of people's privacy instincts. People do have a strong instinctive desire for privacy, but only against other identifiable people. When private data are revealed to people they know or can be connected to their home address, then they are very upset. But they don't react when 'faceless' organizations collect their data, and use it in ways that are opaque, even when this is strongly to their disadvantage (such as HR departments sharing salary data).

Anti-privacy interests are completely aware of this and avoid triggering the visceral response. For example, the spy agencies always give the example of "no-one is listening to your calls", as if the risk of surveillance was the guys transcribing from reel-to-reel tape recordings, not the dossier available to the powerful on each citizen. But the guy listening to our words on the tape is the one we have a visceral response to. Similarly, Facebook's advertising terms (and presumably those of other companies) are careful to ban advertisers from revealing the targeting and giving the impression that there is a person at the other end who knows all about you.

I think the upshot of this is that we need to personalise it for the wider public, not just talk about abstracts like privacy. Bring out the fact that your next employer has that data when negotiating your salary. That companies spend thousands of hours figuring out how to manipulate you based on your data. That at the press of a button, the security services could assemble a dossier on you, and make it available to anyone powerful that you have annoyed.

a-dub
0 replies
13h53m

i thought they were getting flak for building a local activity log for a local assistant?

the linked sources don't really talk about any kind of user activity logging.

i suppose the point stands though: modern tech is kinda like 70s and 80s wingnut nightmares come true. the tv actually does watch you now, and when you read something in the standard way, detailed information about your demographics and behavior are shared and then stored within milliseconds across tens of parties.

how many entities with an ein get notified when you read that ieee spectrum article? too many!

Joel_Mckay
0 replies
4h56m

So it is a Key-logger rebranded as "AI", and I'm sure Microsoft protected themselves in the EULA.

Did this violate the legal definition of an "expectation of privacy"?

In most WIPO countries, you can't legally record other people's conversations unless both parties give you prior consent (hence the quality assurance disclaimer on phone services). Single-party consent does not mean what most people think it means, and has launched many lawsuits.

I usually recommend this list after a windows 11 "offline" install:

https://github.com/StellarSand/privacy-settings/blob/main/Pr...

They sure don't make it easy to un-dork your PC, but it works if you need to run the OS for that one legacy application that people refused to port to Mac/Linux.

Good luck, and make sure to sue them in a jury trial... They won't change their behavior unless there are fiscal disincentives. =)

Fer24
0 replies
14h17m

Standard: that word was changed to mean something close to an absurd imposition. The healthy way is for something to become a standard because the tool was found very useful for solving a certain problem.

The modern idea of standards is a ridiculous imposition that only benefits a small group of people.

DeathArrow
0 replies
10h47m

The defaults have changed. In the '90s we assumed privacy by default. Now we assume spying by default. I don't think we can move back in time, so we have to work around that spying and lack of privacy.