If you or I did this, we would already be in jail for phishing plus whatever add-on charges the Feds could file.
Meta has Washington in their pocket so this will never leave civil court. The penalty will be less than the money made, meaning somebody gets a bonus for being creative.
Seriously, how does this not violate wiretapping laws? Does agreeing to the ToS mean you also agree to being spied on in a way that legally shields them? You are deliberately circumventing encryption for malicious purposes. If people got in trouble over DeCSS for circumventing encryption, how is this okay?
Pithy "because they have all the monies" replies not wanted.
Big tech and telecommunications companies are effectively miniature arms of the U.S. government at this point.
As seen with the "Protect America Act" of 2007[0], the government will retroactively cover its own ass and your company's ass if it's deemed important enough to the intelligence apparatus. There isn't a chance in hell that Meta would face criminal charges for wiretapping.
0: https://en.wikipedia.org/wiki/Protect_America_Act_of_2007
I think The Onion nailed it in 2011:
https://www.theonion.com/cias-facebook-program-dramatically-...
Which is clearly a false flag operation, so that whenever someone serious tries to bring this up, they'll be rebuffed because it's an article in The Onion. Those clever bastards!
That, or scathing satire has been a mainstay at The Onion longer than political consternation.
I'm assuming they were doing it for the federal government at this point. There's no reason for them to spy on another app; they can hire almost any developer they want.
Hiring another dev does not give them access to the raw numbers. It's not the same thing at all.
What is described in the article is not some elaborate scheme or novel work of software engineering. Rather, it's exactly what 99% of corporate networks do (proxy server with SSL inspection using a custom root certificate) "to combat cyber threats".
As coincidence would have it, this is the perfect alibi, provided by a snake oil "cybersecurity" app from one of the world's largest companies.
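To make the "not some elaborate scheme" point concrete: once a corporate root certificate is installed in the device's trust store, the proxy just re-signs every site's certificate on the fly and reads the plaintext. Here is a minimal sketch of the logging side, using mitmproxy as a stand-in for whatever commercial appliance a real corporate network would buy (the addon and its names are mine, purely illustrative, not anything from the article):

    # Illustrative sketch only: "SSL inspection" as a mitmproxy addon.
    # The client trusts mitmproxy's root CA, so flows arrive here already decrypted.
    from mitmproxy import http

    class InspectionLogger:
        def response(self, flow: http.HTTPFlow) -> None:
            # Request and response are plaintext at this point, same as on a
            # corporate proxy doing "threat inspection".
            print(flow.request.pretty_url, flow.response.status_code)

    addons = [InspectionLogger()]

Run it with mitmdump -s inspect.py after installing the generated root CA on the client. The point is only that "decrypt everything" is an off-the-shelf capability, not a feat of engineering.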
Every tech company that has promulgated the lie that a VPN operated by a third party provides added security is indirectly responsible for this. Funneling all your traffic through a shady intermediary does no such thing, and in fact often does the opposite.
Doesn't change anything; consent and whether you own the device are everything.
The comparison with VPNs doesn't hold either, because for all their faults VPNs do not decrypt traffic going through them.
99% of corporate networks? That can't be true.
I do know that this is done - in fact, I worked at a pretty major smartphone manufacturer and never logged in to any personal account on work devices. It was pretty obvious just from looking at the security info in Chrome/Firefox that the certificate chain ended in a root signed by the company itself. I used to shout at the top of my lungs to my friends that hey, _this_ is how your information is vulnerable to the corporate overlords, but I guess they weren't as paranoid as I am.
The first thing I checked when moving to my next employer was whether they were intercepting SSL traffic like this. (They weren't; they used Falcon.)
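For anyone who wants to do the same check without squinting at browser UI, a rough sketch (standard library only; the function name is mine, and it assumes Python is using the same trust store as the rest of the corporate machine):

    # Rough check: who actually signed the certificate your machine sees for a site?
    import socket
    import ssl

    def cert_issuer(host: str, port: int = 443) -> dict:
        ctx = ssl.create_default_context()  # default trust store
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()
        # cert["issuer"] is a tuple of RDNs; flatten it into a plain dict
        return dict(rdn[0] for rdn in cert["issuer"])

    # On an inspected network this tends to name your employer's CA (or the
    # inspection vendor's), not the site's public CA.
    print(cert_issuer("example.com"))

Caveat: if the corporate root isn't in whatever trust store Python ends up using, the handshake just fails with a verification error, which is its own tell.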
It's not really spelled out clearly in the article, but this was a specific program where people had to explicitly opt in, in exchange for compensation.
This wasn't simply Facebook hijacking random people's traffic because they accepted the ToS or used the Facebook app.
Not defending the program, but it’s not what a lot of comments are assuming.
Do you have further insights or references on what the "trigger condition" was? This is a new case, separate from the previous litigation related to the VPN app.
The article details how users were lied to about what was being collected and why.
If you lie to someone to get them to sign an agreement, that agreement is voided in nearly any sane jurisdiction on the planet.
It isn't because they have the money, it's because they have given the government access to whatever data they want. When it comes to three letter agencies it really isn't about money, it's about power and in today's digital world data is power.
To answer your specific question, this isn't okay. Both the government and large corporations have been given way too much power and we really have no hope of making any meaningful change until the people reclaim this power and put those in charge out on their ass.
This relates to a much bigger problem of courts upholding contracts even when nobody actually believes they represent an informed and voluntary agreement.
We aren't quite at the Looney-Tunes step of enforcing extra clauses that were hidden in invisibly small print, but things are drifting in that direction.
See also: https://www.law.cornell.edu/wex/adhesion_contract_(contract_...
Your work does this. This is incredibly common on basically every corporate device issued today.
The real issue is the NUX (the new-user onboarding flow), which doesn't look like it made the data collection clear to users.
My work puts a big banner on the login screen that says up front that they can and will record and monitor everything on this machine. And IMO that's fine, because it's their machine. If they wanted to do that to my machine it would be a problem.
I agree it's legally fine, but morally/socially there are ways to go too far.
There's nothing wrong with corporations tracking use of their hardware.
They have to watch for data exfiltration and attempts to download malware, etc.
Don't use a corporate device for anything you don't want work to see.
Use your own. That's not a hard ask.
As written, that means they can secretly enable the camera and microphone to surveil my house, supposedly to check the usage (or non-usage) of the hardware.
Surely that's very "wrong", if not also illegal in most places. Not everything about or near the hardware is fair game.
That's clearly not what was meant.
No, read what they're replying to.
I wrote one sentence about how "there are ways for companies to go too far", which I think is pretty dang uncontroversial and trivially true. However, that user replied with what is clearly a disagreement, complete with corporate justifications and placing sole responsibility on employees to avoid the hardware.
This leads to two competing options:
(A) They simply can't imagine any scenario where a company might "go too far" and be at fault.
(B) Their stance is much milder, but for some reason they are replying to a straw-man argument that isn't what I actually wrote.
Of those two interpretations, I went with (A), but if you think (B) is the more charitable reading...
Or (C): the discussion was about information on the devices and transmitted through them, and I was limiting my opinion that "there's nothing wrong with corporations tracking use of their hardware" to that scope, not extending it to spying on people in their homes via the device peripherals.
No, they shouldn't be flicking on your laptop camera or mic remotely, as these are pretty obviously violations of your privacy.
My rights are not subordinate to my company's; if anything, it should be the reverse. My employment contract is intended for mutual benefit, and the company also reserves the right to privacy from me in some things, even things within the scope of my employment. It should be acceptable to do things outside the scope of your employment using corporate devices, and you should retain a reasonable expectation of privacy when doing so.
No place I’ve worked has ever told their employees that they do this, but most of them do. Some employees I’ve spoken to are quite surprised that their “encrypted” connections are being monitored.
People should probably read their employment agreements and IT usage policy. I'd be surprised if it's not written somewhere.
Besides which, an expectation of privacy while using someone else's computer is the wrong expectation to have.
It's "fine" in the way that I would leave that company at the first opportunity.
I signed a contract with my employer that when I'm using the computer they give me to conduct their business on their behalf, they have the right to observe my usage of that computer.
The situation in this article is completely different.
Our apps would be deplatformed on Android and iOS, and our businesses would be prosecuted by the DoJ and FBI.
Looks like this was the real reason Facebook could not comply with China's data sovereignty laws and had to abandon the market.
The fact that Apple and Microsoft services both work in China shows they are a little more trustworthy.
How so?
Absolutely not. Companies apply different policies in the different countries they operate in. This tells you nothing more than that those companies came to a mutually beneficial agreement with the Chinese Communist Party.
Indeed. Even McDonald's has different menus, local workers, local employee standards, and even different business signage, depending on the country/location.
That's one possible read. The other possible read is that Apple and Microsoft both agreed to let the CCP decrypt all user data, which makes them less trustworthy in my book. You really gonna believe they couldn't have a similar arrangement with the US TLAs after that?
Yes. It's a good opportunity for an ambitious state attorney general to prosecute Facebook, of course.
https://www.naag.org/find-my-ag/
https://www.consumerresources.org/file-a-complaint/