That Tweet links to this [1] description which, in glancing at the text, seems to indeed be accurate:
----
According to the latest draft regulation dated 28 May (Council document 9093/24), which is presented as “upload moderation”, users of apps and services with chat functions are to be asked whether they accept the indiscriminate and error-prone scanning and possibly reporting of their privately shared images, photos and videos. Previously unknown images and videos are also to be scrutinised using “artificial intelligence” technology. If a user refuses the scanning, they would be blocked from sending or receiving images, photos, videos and links (Article 10). End-to-end encrypted services such as Whatsapp or Signal would have to implement the automated searches “prior to transmission” of a message (so-called client-side scanning, Article 10a). The initially proposed scanning of text messages for indications of grooming, which is hardly being used to date, is to be scrapped, as is the scanning of voice communication, which has never been done before. Probably as a concession to France, the chats of employees of security authorities and the military are also to be exempted from chat control.
----
Strange times we live in. Entertaining, but strange.
[1] - https://www.patrick-breyer.de/en/majority-for-chat-control-p...
Yes. There are people with problems. Instead of helping them fix their problems, it's just easier to take nice things away from everybody.
Strange times indeed.
TBF what else are governments going to do to prevent child exploitation and other bad stuff on these platforms?
Perhaps if the platforms made more of an effort to block the bad behaviour, there would be an argument against laws targeting this kind of stuff.
Neither Whatsapp nor Signal can do anything about it, since they don't know the content of the messages. That is the whole point of their protocol. That is the whole point of privacy.
Nobody falls for that crap. We all know that CP is being presented as a scapegoat here, because "how can you be against something that MIGHT help against CP!?!?!", while in fact it'll end up being used to spy on everything.
Nonsense. Whatsapp owns both endpoints. They could know perfectly well what you write, when you write it, to whom, and anything their heart desires by way of their analytics. The messages themselves contain no business value to them. They could send them by carrier pigeon for all they care, as long as the client is their product.
No. A user owns each endpoint. Whatsapp provides a service to the owners of the endpoints.
Yes, Whatsapp is in a position to act unethically and steal information, but that does not make Whatsapp the owner of anything.
Whatsapp is not something you can compile or inspect easily. They own the endpoint, in that specific meaning. They may not have root access on the device, but inside the client nothing is out of scope.
It is their client. Any data you enter into the client is data they 0wn.
You can definitely decompile WhatsApp on Android to inspect it. I'm sure security researchers do this regularly, including those looking for a bug bounty that could be life-changing.
They don't have to steal information in order to block inappropriate content. The app itself can detect and block without external intervention.
Presumably they would use edge-based hash scans or AI models to detect unsavory content. But if the content is so extreme as to be unsavory, they will likely be legally required to report it to law enforcement (LEO).
The next steps are LEO seizing your device(s), or LEO having WhatsApp start sending all your messages to them for review.
What happens when LEO adds the hash of a state-loathed meme?
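To make the "edge-based hash scan" idea concrete, here is a minimal, dependency-free sketch. Real deployments use *perceptual* hashes (PhotoDNA-style) that survive resizing and re-encoding; exact SHA-256 matching and the blocklist entry below are purely illustrative assumptions:

```python
import hashlib

# Hypothetical blocklist of digests of known-bad files. A real system
# would ship (or fetch) this list and use perceptual hashing instead of
# exact matching.
BLOCKLIST = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def scan_before_send(payload: bytes) -> bool:
    """Return True if the payload may be sent, False if it is blocked."""
    return hashlib.sha256(payload).hexdigest() not in BLOCKLIST

assert scan_before_send(b"holiday photo bytes")        # unknown content passes
assert not scan_before_send(b"known-bad-image-bytes")  # listed content is blocked
```

Note that whoever controls the blocklist controls what gets flagged, which is exactly the "state-loathed meme" concern: the client has no way to know *why* a hash is on the list.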
They "could", maybe. But since you don't seem to have more information, we have to stay on the assumption that they're still using the Signal protocol and can't see what the message contents are.
They could use the signal protocol AND see the contents
Signal could still see the contents of your messages. Anything you enter into their app could be scanned or sent back in plaintext to some server, all prior to actual transmission via their protocol.
The only way to ensure that can't happen is to inspect the code and compile it yourself, or at least validate the hash of the binary you're installing. But we've also recently learned with the xz fiasco that you'll need to be sure to add checks all the way down.
Of course, you could always encrypt before entering the text into signal, but at that point why use signal?
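As a toy illustration of "encrypt before entering the text into Signal", here is a one-time pad built on the standard library. This is a sketch only: a practical scheme would need authenticated encryption and an out-of-band key exchange, and all names here are made up for the example:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: the key is as long as the message, must be shared
    out of band, and must never be reused."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

ct, key = otp_encrypt(b"meet at noon")
assert otp_decrypt(ct, key) == b"meet at noon"
```

You would paste only the ciphertext into the messenger, so any client-side scanner sees random bytes — which is the parent's point: at that stage the messenger is just a dumb pipe.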
You're right. But apparently every time this topic comes up we cry that this is abuse of our freedom.
Tech is not good or bad but it does have unintended consequences and open new ways for abuse. This is tough to swallow for us who work in tech but is obvious to everyone else. If we stay in denial and do not volunteer to help use tech smartly to compensate for bad side effects, they will vote in some dumb law and some bad guys will exploit it for surveillance later.
There is in fact no "smart" way to compensate for this particular "bad side effect".
Either your communications are spied on to weed out unapproved material, or they're not. And there is no way to make the system architecture care about which material is allowed to be "unapproved".
The right answer here is just to accept that, beyond a certain point, further reducing the amount of circulating child porn requires unacceptable tradeoffs. Then stop whining and wishing for impossible technical solutions.
There is, in fact. You could build AI directly into the app to detect inappropriate content and block it.
The app has thus not violated end-to-end encryption, nor has that content been exposed to external parties.
Platforms could definitely do more.
And how do you deal with all the false positives? Will they just be deemed collateral damage?
To add to your excellent point: who gets to validate the model's efficacy? How do we know the state hasn't trained it to report users talking about MAGA, or Israel, or those with Chinese national partners?
Government recording all our conversations also has "unintended" consequences.
The track record of such societies is rather terrifying.
You think I disagree? My point is almost exactly the same, we should stop that (by promoting a saner solution).
One way that would put down lots of exploitation and support privacy of adults would be video and online surveillance of all children when not alone, using a parent-controlled computer to detect bad things happening. This could start in kindergarden and school and gradually expand to all spaces that are not home. Children have some right to privacy, but not as strong as adults.
At 16, if the child so wishes, surveillance gets turned off and they are granted more privacy. As with age limits on driving or working, at some point the state says: you are old enough to take responsibility, we won't protect you from harsh life anymore.
This is a targeted, reasonable solution with little collateral damage, that upholds the right to privacy for adults. It's what parents would want, instead of the bureaucrats. And who really, actually cares about safety of the children, parents or bureaucrats?
So, you want children to have no privacy just to give adults a tiny bit more privacy? Are adults really this horrible towards children? Do you really think you would have liked this as a child?
Why the hell not? Do you really think it is ok that your daughter gets constantly video surveilled all throughout puberty? Do you really think that is a lesser evil than your text messages being scanned for some keywords? Would you be happy if there was a camera constantly watching you as you jerked off as a kid?
Only parents would have access to surveillance records. Children often dislike the things their parents make them do, and the power parents hold over them; this would be one more such thing, with great benefits.
Because they are children, they do not have full responsibility for their actions, and they are more vulnerable to abuse, and protecting their safety is more important than protecting their privacy. I want to keep the status quo, where children are protected, and adults have rights. The way stuff is going, we're all getting more like children with one parent called Big Brother.
That is not what I'm suggesting. I'm talking about public spaces (including online) where adults are present. If the kid wants to jerk off, or two or more kids want to make love, they can go home or use some private space like a bathroom.
Being under constant surveillance during the most formative years of your life can leave you extremely mentally unwell.
I hope you're not a parent.
The bad stuff will just move somewhere else as it always has done.
Compromising everyone's privacy will eventually mostly affect innocent people. Or even cause the platforms to cease existing altogether, which looks like a real possibility with Signal. Pedos will just move on to whatever service isn't compromised yet. You can outlaw or hamper secure encryption in some jurisdictions, but due to the generic nature of computers you can't in principle stop people from using secure encryption.
This might only be a temporary state, though. Smartphones are, in that sense, not generic. PCs might follow.
mtl is saying computers are big boxes of math and you can't ban math.
Though given how incredibly clueless politicians have become I wouldn't be shocked if they tried.
Smartphones are general-purpose computers with a bunch of little digital locks, that while strong, are not impervious. Such locks, when used to protect a device owner, are good. The same type of locks, when used to deny a device owner full rights to use their device as they see fit (absent harm done to others), are evil.
Your argument makes as much sense as banning knives because they are sometimes misused to attack people. What about alcohol? Some people drink and drive, we should ban alcohol too!
Banning sale of alcohol in gas stations would not be unreasonable.
It would
Lots of them will probably be selling alcohol years after they are no longer selling gasoline.
CP is a pretext to grab power, the same way terrorism was 20 years ago. If a government actually cared, they would start dismantling the catholic church. Risking a slippery slope fallacy, I see no way governments won't expand the scope of this intrusion. Before you know it, being critical of a certain foreign government [0] or teachers criticising the department of education [1] will be limited.
0 - you know what conflict I mean. Will we have to resort to coded messages wherever we go?
1 - https://www.theguardian.com/politics/2023/oct/21/uk-governme...
The Australian government enacted the same type of "protect the children" laws, and then immediately used it to surveil journalists critical of their policies.
Didn't the UK's web filter contain political websites too?
This argument is more thought provoking than people may think.
What we see is a shift of power. Electronic communications started out as private enterprises, were then mostly taken over by states because of the need for centralization, and have now been almost completely taken over by private enterprises one layer above. Governments are still trying to make sense of what happened and find their role in this new world.
Platforms are centralization at work, and it's not that far fetched to think that states could do a better job than Twitter or Facebook. Platforms have immense power. After all, we mostly agree that Facebook very literally facilitating genocide was not good for society. What we disagree on is how much they knew and how much was circumstantial.
There is also this idea that jurisdictions matter for platforms. The Chinese connections of TikTok's owners are problematic, since we know for a fact that they have the power to influence elections. The American ownership of Facebook is not similarly problematic, largely because the interests of the CIA and other institutions mostly align with ours.
It would not surprise me if the Saudi money financing Twitter/X would turn out to be just as important as the financing of 9/11.
In light of that, it should not be surprising that EU states want to play the game too, even if it will have very little practical effect.
States can already play the game. But this isn't playing the game via competition. This is just stealing data via lawfare
We keep having "anti child exploitation" measures, like all the bloody time. It's been decades by now.
And the problem keeps getting worse.
So either it's just not working whatsoever and this is useless and should stop. Or it's never really about child exploitation.
Just like "temporary measures" against terrorism which have been temporary for 30+ years now (no, it didn't start on 9/11).
Almost like it was never about terrorism.
You're swallowing the propaganda. The problem hasn't changed to speak of.
Why do you take their motives at face value?
Obviously, the moment these platforms lose privacy, the criminals cease communication on them immediately. So they're the last group this is aimed at.
The solution to crime is the same investigation and detective work and anonymous tip offs and so on that it's always been. People going undercover and infiltrating these groups and then bringing them down.
By chasing the criminals off these platforms, all that happens is that the detective work gets harder. Now they've got to find where to start their infiltration all over again.
This outcome is so obvious that the only conclusions available are that the lawmakers are either IQ 60 morons, or that they have malicious intent.
I read somewhere that most crime is committed by people with low IQ. So it actually makes the police's work easier.
Well, we moved in the billions to these chat apps. The state just follows.
As sceptical as I am, I think I want the state to resolve online crimes.
Why should everyone have to suffer so that state's job in catching criminals is made dead easy ? Such criminals are a microscopic minority of the population. Governments - esp in the west - have disinvested in traditional investigation and moved to using mass surveillance as their default operating strategy. And citizens are being made to pay the price.
Just put the entire population in jail, I'm sure you'll be able to prevent a lot of crime that way.
I hope this passes.
I want to see the riots when EU cell phone consumers have to pay for text messages again.
Some men just want to watch the world burn...
Does anyone have to pay for text messages?
People often forget just how much we get from the EU that's taken for granted. Everything from practicalities like no roaming costs and consumer protection, all the way to freedom of movement, peace and overall stability.
I don't think the EU says anything about how much texts should cost...
Edit: I am *obviously* not talking about surcharges for roaming but cost in plans, which I think is the point of the OP when he asks who pays for texts. Pricing is not regulated and has nothing to do with the EU.
It does, quite a lot. For example, it says you should pay the same regardless of where you are in the EU (roaming); it also says you get clear and transparent pricing for your mobile service; it also says all kinds of things about your rights to cancel, change providers, get refunds, etc. You should look it up.
Why the snark? Especially since you're beside the point, so let me rephrase with the help of my lawyer: the EU does not say anything about how much consumers should be charged for text messages by their operator (i.e. "cost"). This was clear from my previous comment...
So where does it say how much operators can charge in their plans? Nowhere.
They are free to say texts are free (included in plan), or texts are charged at 10 euros each, whatever.
The EU only limits surcharges when roaming to other EU countries.
Jeez, guys.
Edit:
I am not backing away from anything; I was rephrasing in response to your strange cross-examination. You guys are being unnecessarily aggressive and argumentative over a simple point.
"Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."
"don't cross-examine."
It's Sunday, guys. Do something positive with your time. Bye.
So you're backing away from your original stance of:
"I don't think the EU says anything about how much texts should cost."
Into a weaker form of "The EU only limits surcharges when roaming to other EU countries.". This was my only comment because your first position is factually incorrect, there are situations in which the EU defines how much texts can cost.
edit: not sure where you see the aggressiveness
Yes, but there is also (1) the push for PNR to keep a complete record of your travel and movements within the EU. Dutch train operator NS is already operating in the spirit of the future "PNR" and makes it harder and harder to buy an "anonymous" train ticket (even for local 15-minute journeys). The so-called anonymous card is linked to the bank account used to top it up. What if you prefer to use cash in your daily life (2), for envelope-style budgeting? Well, your bank will inform the authorities of your anomaly. You will start receiving monthly questionnaires asking you to justify your behaviour and why you might want 2-3-4K EUR in cash every month.
This is part of the trans Atlantic trade agreements. They have a certain responsibility, having negotiated them, but they clearly feel they are beneficial in some way.
All people have to do is vote for the Pirate Party. There are people holding the line in Europe, but the population is worried about war and immigration instead of net neutrality and encryption. These are understandable priorities, but it will not end well.
Crediting the EU with peace while Israel and Ukraine are actively at war is a bit much.
Oh come on, France and Germany were basically in a perpetual war for hundreds of years before, and now there's not even a theoretical chance of any two EU states fighting each other.
Occasional war here and there not directly involving any of the member states is incomparably better than what came before.
Are Israel and Ukraine EU members? Are they fighting members of the EU?
If you have to run every image and text through some sort of "filter" you can't turn off... Yes.
That's a LOT of compute.
Yes. The phone batteries across the EU are going to hate this.
That is already happening, though; this just says it should also happen with E2E encryption, by scanning on-device. If it were that expensive, the costs would have appeared a long time ago.
In Germany at least, unless you opt for a super cheap package (e.g. sponsored/free, or less than EUR 6), calls and texts are unlimited on local (country-level) networks; only data volume is limited. Not sure about other EU neighbours.
In Poland, 4.70€/mo will get you unlimited calls and texts. Can be purchased on pre-paid plans as a 30-day package, no need for post-paid plans.
I pay ~0.10€ for a text message - but I don't know anyone who uses text messages for communication so it's not a problem. I could buy a cheap plan for unlimited text messages, but I don't want a fixed monthly fee and prefer prepaid.
There won't be any riots. This is the scariest part, most people just don't care
France: Hold my beer. https://en.wikipedia.org/wiki/List_of_incidents_of_civil_unr...
While it's a fun meme, if you actually look at the list, the inciting incidents are usually (accusations of) police violence/overreach, reforms to cut labor protections, welfare or public education, and Israeli military operations in Palestine.
In fact, most of the protests seem to neatly fall into the "police violence" (usually against minorities) and "austerity" buckets.
I think the parent meant that France is a country that actively protects its rights via violent protest as opposed to other countries that merely sit and suffer.
And yet France is talking about turning off the internet during these protests so that protestors can't communicate.
A lack of rioting doesn’t mean people don’t care.
When was the last time public outcry actually impacted policy?
Example from last year: https://en.wikipedia.org/wiki/Dutch_farmers%27_protests?uses...
What do they do if I send: AFBC67CEDA7AD?
Ban all “non-intelligible” content?
Who can stop me from hiding information in very normal looking sentences?
If you want privacy, there will always be a way.
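As a toy illustration of hiding information in "very normal looking sentences": each bit of a hidden message can select one of two synonyms, so the cover text reads as an ordinary (if stilted) sentence. Real linguistic steganography needs far more care to look natural; the word pairs below are arbitrary assumptions for the example:

```python
# Each bit picks one of two synonyms; decoding looks up which synonym
# was used at each position. Purely illustrative.
PAIRS = [("big", "large"), ("quick", "fast"), ("said", "stated"), ("car", "vehicle")]

def hide(bits: str) -> str:
    return " ".join(PAIRS[i % len(PAIRS)][int(b)] for i, b in enumerate(bits))

def reveal(sentence: str) -> str:
    return "".join(str(PAIRS[i % len(PAIRS)].index(w))
                   for i, w in enumerate(sentence.split()))

assert reveal(hide("1010")) == "1010"
```

No filter short of blocking all text can detect this in general, which is the point: the channel capacity of ordinary language is impossible to close.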
This is not about preventing you from doing that, it is about preventing services with many users from providing that service to you without a backdoor.
You can send anyone you want an encrypted e-mail or message, but Signal can't facilitate that without the provisions set out in those laws. If these dumb laws get enacted, Signal cannot get away with pretending you are sending gibberish while providing true end-to-end encryption without any client-side scanning or whatever; but you are well within your rights to do so yourself on top of Signal (if that's even possible), they just can't provide an automated means to do it for you.
> they just can't provide an automated means to do that for you
What I'm wondering is whether two separate applications can be set up to communicate automatically, with one handling messaging and the other being responsible for encrypting and decrypting the data.
What would be against the law in that case? The messaging app? The encryption app? Or the interaction you are doing in that moment?
Step one would be determining if anyone actually uses those two apps together. A handful of people? No one cares. Is it now the default way you install Signal (or its two components) and do hundreds of thousands of users do this? Then the next question asked is who is facilitating it and how is that done? Does the backdoored Signal have a plug-in that allows this kind of use? Does Android facilitate that? Those people will likely find themselves in legal trouble.
Of course these laws are dumb, but that doesn't mean they can't be (mis)used to get the desired effect.
Does Android facilitate IPC and services?
That's not what the law cares about. Being able to encrypt stuff end-to-end is not what is being targeted — it is not realistically possible to target that. What is being targeted is millions of people getting private, true end-to-end secure communication, with no content scanning of any kind, through some service. Are you providing that service to millions, like Signal is? This law applies. Are you the size of Meta, and are you implementing some 'clever' two-component solution to sidestep this law? Expect legal trouble.
You can already install a mail client with PGP support. Will K-9 Mail get into trouble if a million users in the EU started privately exchanging keys and using GPG with K-9 Mail? Who knows. These laws are not about such practical details. This is about unlocking massive amounts of signals intelligence to do… who knows what, and those large communication platforms are juicy targets. All it needs is a law to coerce them to cooperate.
Don't expect reasonable arguments from the proponents of such laws, and don't expect to be able to avoid them for millions of users with clever tricks; you'll still fall foul of the spirit of the law, if not the letter.
Actually police and various agencies do, because when most people aren't encrypting, the few that do are suddenly interesting. Some of them will turn out to be organized crime, but some of them are just adults who want to communicate privately.
That might be the next step. Banning encryption. We live in a strange world right now.
The EU does not want to ban encryption, because it is the backbone of e-commerce and banking. There are plenty of public references that show the EU's explicit support of strong encryption.
What some law-and-order types (globally) want, is the means to scan, peek, or otherwise access private communication, especially if that communication is provided by a service used by millions. You can encrypt all you like, but if you use WhatsApp or Signal, laws like these force those services to create a way to eavesdrop. How is probably not defined in the law. Client-side scanning before encryption, having those services act as men-in-the-middle for each conversation; this is all fine, and can use encryption as usual. As long as certain agencies get to have a peek somewhere between those strongly encrypted tunnels.
Neutralizing encryption is real; it is not about forbidding websites and clients using TLS, it's about getting in the middle.
I am using a special keyboard that outputs these sequences automatically. Must my keyboard driver send the keypresses to the EU? Fine, here they are: AFC628BCF627.
I understand Signal is handicapped now, but I couldn’t care less about Signal. I only care about being able to communicate in private. Why don’t we implement this stuff at a lower layer?
Surely the guys on top must see this is an endless game that they will never win? It's not an arms race; it's fundamentally impossible.
You don't seem to understand the point of these laws. It is not about stopping anyone with enough technical know-how from encrypting their communications. This is possible today, and not something which can be easily legislated away without resorting to a much heavier class of draconian laws (at which point you won't be living in a democracy any longer in any case).
This is about making it hard (or impossible) for some perceived group of miscreants to communicate privately. People sharing CSAM (however you define that) or dealing drugs, and stuff like that. Anyone can encrypt their communication, but most people don't do this consciously; the masses just use WhatsApp and Signal and what have you.
You and your special keyboard are not of interest, and unless you start selling these along with a service to route the encrypted messages to thousands of users, you are not the target of this legislation. Take away Signal and WhatsApp and sending an end-to-end encrypted message to your drug dealer without exchanging keys and agreeing on a protocol suddenly isn't as easy as just opening an app. That's the point of this law.
It's a dumb law, but you won't make it go away by playing silly semantic games.
Criminals are using specialty phones, or should I say were using, because these were recently cracked and thus became useless. Catching Taghi in NL was a famous result of that.
Point is that dangerous people will use specialty devices/services and being legal is certainly not one of the requirements.
Again, not what this law is about. This is all about wanting to gather signals intelligence on millions of people automatically; about being flagged when someone uploads CSAM or uses certain keywords. They know it won't stop anyone with the skills and means to use some other encrypted solution.
So they explain it to people as being able to catch criminals, but they know it won't work for that?
How can you maintain such a position? At some point you’ll have to explain your reasons for draconian measures like this.
That’s why I’m spamming people with “it won’t work” because so many seem fooled by this. You will catch exactly zero people with this. The only people you’ll catch will be the ones that you would have caught anyway, because of their nonchalance.
I don’t know anything about law but I know tech.
They may be coming down the pipe, after the soft version gets people macerated.
But you will be hearing from talking heads that you are, and Russia and North Korea are the real dictatorships.
And maybe they will be right, because what is democracy? The word has different meanings to different people, and it won't be difficult to shape the discussion about what our liberal democracy is all about. Maybe it's about accepting who has power now and about protecting the vulnerable. We have seen a bit of this stance and real capabilities in recent years.
If you sell your keyboard as a service to people in questionable lines of business, then yes, you will need to comply with these laws. You can probably also expect a visit from three letter agencies. Which these targeted platform companies also do.
"We kill people based on metadata"
If this sort of surveillance stuff gets accepted, in time you're gonna get noticed, put into a database, and maybe called in for questioning. Why are you using encryption? People who have nothing to hide don't use it.
A timely reminder that 9/11 hijackers communicated in the clear. Examples:
“The semester begins in three more weeks. We've obtained 19 confirmations for studies in the faculty of law, the faculty of urban planning, the faculty of fine arts and the faculty of engineering.” — Mohamed Atta
“Two sticks, a dash and a cake with a stick down. What is it?”[2] — Mohamed Atta
“The first semester commences in three weeks. Two high schools and two universities. ... This summer will surely be hot ...19 certificates for private education and four exams. Regards to the professor. Goodbye.”[2] — Abu Abdul Rahman
Even in China with extreme surveillance and censorship in place, Chinese people have been quite creative in their ways of circumventing censorship. An example approach is cutting and pasting official political videos together in a way that changes their meaning ever so slightly. Automatic censorship algorithms are fooled, and human analysis and censorship are necessary and very expensive to carry out. Other examples are playing with the sounds of words, or using memes or rapidly-changing euphemisms.[3] It's too difficult to automatically censor stuff where people have taken a normal word such as "chair" and send each other images of chairs as a sign of protest. An image of a chair and a birthday cake with a number of candles could indicate a date of protest. Or a chair in a picture with a number of ducks in the background, or a number of chairs stacked on top of each other. And if censors start blocking all chairs, everyone just shifts to buses, or pieces of paper, or bananas, or whatever.
[1] https://community.apan.org/cfs-file/__key/docpreview-s/00-00...
[2] https://www.nytimes.com/2004/12/20/business/technology/on-th...
[3] https://en.wikipedia.org/wiki/2022_COVID-19_protests_in_Chin...
Seriously, who are they expecting to pay for that? AI vision detection run against _every image every person sends to anyone_, among other things, will get ridiculously expensive.
Half of the reason Microsoft is pushing "AI PCs" with special hardware is so they can push their spying to on-device and reduce all the extra costs the data processing they're imagining for things like automatic-screenshot-analysis-every-x-seconds will need.
And they're pretty much experts on spying on users. They've been collecting so much data for so long that apparently they've found a way to utilize what they collect in a way that makes the costs balance out in the end. Whether that's with government access, preferential antitrust treatment, or some actual financial method that directly affects the bottom line, I don't know. Somehow it's worthwhile for them. BUT -- when even Microsoft is looking for more efficient ways to spy on people, and forcing new hardware to support that effort, you know the data collection and analysis technique is definitely not ready to be made a legal mandate.
It doesn't make sense at all for some EU decision makers to decide it's acceptable for their citizens to bear the cost of so much data processing.
...wait, how much do large players in AI contribute to these politicians' campaigns? Or if not them, who is really pushing this? It seems like someone should really try following the money on this one.
Let me take the other side here. The western world couldn't make it without sovereignty. I do realize that it sounds bad for a few states to have such power. But make no mistake: if they don't do it, other actors will, and I think your interests reconcile with a democratic state much more than with the other, crooked actors.
It's just a matter of lesser evil in my humble opinion.
Implemented mass surveillance is proof that democracy is dead.
What if the overwhelming majority of voters want surveillance?
Then put it to a referendum. It's certainly an important enough protection considering that correspondence is usually constitutionally protected.
E2E encryption would prevent anyone from intercepting texts with mathematical certainty.
At least on a mass scale, if we worry about back doors.
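To make the "mathematical certainty" point concrete, here is a toy Diffie-Hellman key agreement, the idea underlying E2E protocols like Signal's. This is only an illustration with deliberately tiny parameters (real systems use X25519 or 2048+-bit groups): a passive eavesdropper who records everything on the wire still only sees the public values, and recovering the shared key from them requires solving the discrete-logarithm problem.

```python
# Toy Diffie-Hellman key agreement -- ILLUSTRATION ONLY.
# The tiny prime below is NOT secure; real apps use X25519 or large groups.
# A wiretapper sees only p, g, A and B, never the private exponents or the
# derived shared key.

import secrets

p = 0xFFFFFFFB  # small toy prime (2**32 - 5); real groups are far larger
g = 5           # generator

# Each side picks a private exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1
b = secrets.randbelow(p - 2) + 1
A = pow(g, a, p)  # Alice sends A over the (monitored) network
B = pow(g, b, p)  # Bob sends B over the (monitored) network

# Both sides compute the same secret; it never crosses the wire.
alice_key = pow(B, a, p)
bob_key = pow(A, b, p)
assert alice_key == bob_key
```

This is why "scan the content in transit" is a dead end against proper E2E encryption, and why proposals like this one resort to client-side scanning before encryption happens.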
This SaaS fad is the underlying technical enabler of the spying, and it needs to go away.
I guess Google is the main culprit by making P2P coms hard.
Asking faceless corps to open their users' mailboxes willy-nilly is way easier than asking the voters to.
You answered yourself: remember Apple's implementation of CSAM detection.
We don’t own our devices anymore, and we now have very limited control over what gets executed on them, so there is nothing stopping developers from running this kind of legal spyware on the device. Our only option, if we don’t like what an app does, is to not use it.
Apple's proposed algorithm was probably the best so far.
The problem is not going away, we in tech are partly responsible and we should promote good ways to deal with it. If we don't then a solution will be found anyway, it'll just be a bad one.
A trusted app running at ARM Exception Level 3 (EL3) could make use of the NPU to offload some of the work?
I don't know how to help folks that didn't treat the apple csam fiasco as a massive wake up call to ditch the ecosystem.
We have Linux phones these days, plus CalyxOS and GrapheneOS. There really isn't a reason to give up on general computing. (Ignoring the proprietary baseband blobs.)
Very good take.
Follow the money article: https://balkaninsight.com/2023/09/25/who-benefits-inside-the...
I will quit using all these platforms then.
Finally, free from WhatsApp voice notes, stickers and images clogging my phone storage. Adieu.
If those are what's bothering you, nothing stops you from quitting right now.
For now I have a Tasker job routinely deleting WhatsApp's media directories. Unfortunately everyone is using it, so I have to stick with it. I hate WhatsApp with a passion.
If this law passes, bye bye.
"want to stick with it"..
don't misrepresent
"have to stick with it"..
don't misrepresent
You admitted yourself that you don't have to, since you'll leave if the law passes. Why are you not willing to admit that it is a "want"?
Do you imagine that people will stop chatting if this law passes? If you hate it so much, why not just stop using it now?
As others have stated, just uninstall it now. Even at my work, my manager wanted us to use WhatsApp to communicate with our offshore teams. I let her know that I would be happy to do so with a company provided phone, but I don't install spyware on my devices, and furthermore don't have an app store so would need to be able to build it from source. But I'd be happy to use signal or email.
Everyone will be using it when the law passes so it’s no different.
Frankly, GB was totally right in leaving, and if I had the ability to vote for stay/leave right now, I'd want to go. But this position is totally unwelcome. You're called a Nazi, or at least far-right, the minute you utter it. This way of dealing with supposed democracy is what makes me want to leave even more. The EU has become a strange beast. The current election ads make that pretty clear. They basically reduce to "vote for us, we're cool, we are democracy", which is almost dystopianly void of real content.
Where are you living, if you don't mind sharing? I don't know about any place where expressing eurosceptic views would get you called "far right", let alone a Nazi.
And I disagree. You take for granted all the good regulation and all the things enabled by the EU, and focus on the one bad regulation we're discussing, which is not even law yet. I, personally, am not looking forward to a future without the EU (I remember my country before it joined, and the progress is immense).
I think elsewhere online this may happen.
Also, some public broadcasters use the phrase "far right" suspiciously often, almost as if it were some kind of effort to softly suggest to people how not to vote. But I must be imagining things, because they would never do that :)
Try the largest German-speaking subreddit, for example.
The UK passed that law before the EU: https://en.m.wikipedia.org/wiki/Online_Safety_Act_2023
Yeah, the UK left because they thought the EU wasn't draconian enough.
GB has their own version of chat control: https://en.wikipedia.org/wiki/Online_Safety_Act_2023
The UK government already passed essentially the same thing.
Article 12 of the Universal Declaration of Human Rights:
No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.
Bugging communications by investigators with special approval is already an exception to this principle. Government bodies that check whether laws conform to the constitution should veto any exception broader than that, so this draft should basically be dead on arrival.
It feels like there are social/political mechanisms at work that nevertheless allow this to happen. They pave the road to Hell little by little, one stone at a time, and this is neither strange nor entertaining. To me, the beginning of this century has similarities with the beginning of the previous one, which is quite worrying.
agreed. i feel the real motivation is much more sinister and much more to do with the geopolitical situation than anything else.
the reality is, if anyone is seriously determined to commit what they know are crimes then there are many solutions, of admittedly varying quality, for having private communications outside of the mainstream apps available on the app store. even signal itself has an apk you can install on android from their website.
so, it's unlikely this will indeed help in the fight against CSAM or whatever else is purportedly motivating this legislation. the end result will be mass surveillance 24/7 on the vast majority of the population, who aren't committing any crimes at all. it seems to me like big brother's wet dream. ironic that this is exactly the thing the US/EU political leadership have been bashing the chinese for since forever.
governments having low level dirt on their populations feels like manufacturing excuses to send people to war and remove their autonomy
Any intelligent criminal will just meet face-to-face to discuss their criminal activities. None of these apps protect against someone taking a photo of the screen and snitching to the authorities about what was said in exchange for less jail time.
there's a good reason for that. it's not controlled by and for the benefit of our oligarchs.
Yeah I truly don't see how this is all that ambiguous of a violation of privacy.
Will my email client using S/MIME have to implement this? Seems kind of ridiculous.
Seems to be targeting platforms. Will it be illegal to send encrypted texts? What is keeping anyone from layering crypto on top of existing messaging?
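Nothing technical stops it, which is the point. As a sketch: a toy stream cipher built from HMAC-SHA256 in a CTR-style construction (Python stdlib only), whose base64 output can be pasted into any chat box. This is an illustration, not a vetted design; in practice you would use an audited tool such as age or libsodium rather than rolling your own.

```python
# Sketch: layering your own encryption on top of any messenger.
# Toy CTR-style stream cipher from HMAC-SHA256 -- illustration only;
# for real use, prefer an audited tool (age, libsodium, GnuPG).

import base64
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream by HMAC-ing nonce + counter blocks."""
    out = b""
    counter = 0
    while len(out) < length:
        block = nonce + counter.to_bytes(8, "big")
        out += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: str) -> str:
    """XOR the message with the keystream; return base64, pasteable anywhere."""
    nonce = secrets.token_bytes(16)
    data = plaintext.encode()
    ct = bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))
    return base64.b64encode(nonce + ct).decode()

def decrypt(key: bytes, blob: str) -> str:
    """Reverse the above: split off the nonce and XOR again."""
    raw = base64.b64decode(blob)
    nonce, ct = raw[:16], raw[16:]
    pt = bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))
    return pt.decode()

key = secrets.token_bytes(32)  # shared out of band, e.g. in person
blob = encrypt(key, "meet at the usual place")
assert decrypt(key, blob) == "meet at the usual place"
```

To the platform, and to any scanner embedded in it, that base64 blob is just opaque text, which is exactly why "targeting platforms" can't reach users who encrypt before hitting send.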
While I do not want to dive into any details on the adverse effects of such stupidities, the EU seems to be taking a strange road toward tech-dependent, overengineered regulation. It seems that this is mostly driven by lobbyists who want to sell compliance services. It also seems there is more value in creating regulation than in making sure it is enforceable.
If you are in the business of selling S/MIME clients as a service to other people, then yes, you would need to implement this if the law passes.
Maybe there's an unintended upside to all this regressive business legislation. With all the focus on the "platforms", then maybe, just maybe, this will be yet another nail among the thousands of nails needed to finally kill them off.
Signal is not selling anything, so my guess is that mere distribution is enough to be covered, and that this would also apply to e.g. Thunderbird.
Seems about right for the EU.
The law says "images, photos, videos and links". What about simple encrypted or password protected zip files?
As far as I know, most messaging platforms allow you to send regular files too. Wouldn't "the bad guys" simply use that as a loophole and continue with their day?
I know the real reason behind the law isn't to actually protect children, but, you know...
When apps created for overthrowing governments in other countries begin to backfire amid record low approval ratings, it all of a sudden turns out that “human rights” are merely a cudgel to beat others with.