
Is Microsoft trying to commit suicide?

simonw
55 replies
1d3h

Security/privacy concerns aside, Recall doesn't particularly feel like an "AI" feature to me. It's on-device OCR plus a SQLite database that you can then search, right?

Even by today's loose definition of "AI" I'm having trouble making that connection. I guess the OCR is based on machine learning?

Is there an LLM component to Recall that I've missed? If so, can I build websites with prompt injection attacks on them that will target Recall users directly, by getting my rogue instructions indexed into their SQLite database and later fed into the LLM?
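The non-LLM reading of that pipeline really is tiny. A minimal sketch of "OCR text into SQLite, then search", assuming SQLite's FTS5 extension is available; the schema and rows are invented for illustration, not Recall's actual ones:

```python
import sqlite3

# Hypothetical "screenshots through OCR into a searchable store" pipeline.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE VIRTUAL TABLE snapshots USING fts5(captured_at, app, ocr_text)
""")

# Pretend these rows came from periodic screenshots run through on-device OCR.
db.executemany(
    "INSERT INTO snapshots VALUES (?, ?, ?)",
    [
        ("2024-06-01T10:00", "Edge", "invoice 1234 blue bag order"),
        ("2024-06-01T10:05", "Word", "quarterly report draft"),
    ],
)

# Full-text query: any snapshot whose OCR text mentions both "blue" and "bag".
rows = db.execute(
    "SELECT captured_at, app FROM snapshots WHERE snapshots MATCH ?",
    ("blue bag",),
).fetchall()
print(rows)  # [('2024-06-01T10:00', 'Edge')]
```

Nothing in that loop needs a model at query time, which is what makes the "is this AI?" question interesting.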

ben_w
26 replies
1d3h

Even by today's loose definition of "AI" I'm having trouble making that connection. I guess the OCR is based on machine learning?

It's long been noticed that every time AI researchers figure out how to do a thing, it goes from "this is impossible SciFi nonsense" to "that's not real AI".

It's weird for me to actually encounter people doing this.

I remember when OCR was impossible, at any decent quality level, for AI. We used this to stop bots logging into forums, with a thing we called a "Completely Automated Public Turing test to tell Computers and Humans Apart" or CAPTCHA for short. It started off as text, then the text got increasingly distorted until most humans had trouble reading it, then it became house numbers, then it became "click all pictures of XYZ", before mostly disappearing into analytics seeing where you hovered your mouse cursor and which other websites you visited.

mannykannot
9 replies
1d1h

Every time AI researchers figure out how to do a thing...

If we are going to define AI as whatever AI researchers are working on (a definition having something of a bootstrap problem), then the only way the goalposts will not move is when they are not making progress.

ben_w
8 replies
1d

I wonder which other professions exhibit the same effect?

Artists certainly don't, as art remains art even when the artist themselves becomes lost to time.

I suspect politicians may be: every problem they do solve becomes the status quo… though every problem they don't solve also becomes their fault, so perhaps not.

Civil engineers may be on the boundary: people forget that cities are built rather than natural when complaining about immigrants with phrases such as "we are full", yet remember well enough for politicians to score points with what they promise to get built.

yutrin
6 replies
15h15m

It's long been noticed that every time AI researchers figure out how to do a thing, it goes from "this is impossible SciFi nonsense" to "that's not real AI".

I struggle to see how anything we have today is “AI”.

So you think we’ve done it?

We’ve solved the “AI” problem.

We can just stop working on it now?

It's weird for me to actually encounter people doing this.

Rather than posturing, perhaps you could provide us with the definition of “AI” so we can all agree it’s here.

art remains art even when the artist themselves becomes lost to time.

And if statements remain if statements. What’s your point?

I wonder which other professions exhibit the same effect?

I disagree that there is any “effect” worth pondering, but here’s a biting quote which, had it been written by a tech bro with unsubstantiated zeal for wasting planetary resources to engorge the wealth of unethical sociopaths, would have had the word “goalposts” in it, and been the worse for it.

“Fashion is a form of ugliness so intolerable that we have to alter it every six months.” -Oscar Wilde

ben_w
5 replies
9h58m

I struggle to see how anything we have today is “AI”.

Your struggle is inherent in the "AI effect".

So you think we’ve done it?

We’ve solved the “AI” problem.

Calling it "the" is as wrong as calling all medical science "the" problem of medicine.

Replace "AI" with "medicine" and see how ridiculous your words look.

We have plenty of medicine without anyone saying "aspirin isn't medicine" or "heart transplants aren't medicine" or similar, and because nobody is saying that, nobody is saying "oh, so you think we've solved medicine, we can all just stop researching it now?"

So yeah, we've repeatedly solved problems that are AI problems and which people were arguing that no computer could possibly do even as the computers were in fact doing them.

Rather than posturing, perhaps you could provide us with the definition of “AI” so we can all agree it’s here.

"""It is a field of research in computer science that develops and studies methods and software that enable machines to perceive their environment and discover which actions maximize their chances of achieving defined goals"""

Which is pretty close to the opening paragraph on Wikipedia, sans the recursion of the latter using the word "intelligence" to define "intelligence".

yutrin
4 replies
8h50m

"""It is a field of research in computer science that develops and studies methods and software that enable machines to perceive their environment and discover which actions maximize their chances of achieving defined goals"""

This is exactly why I suggested you define the mental model you’re working with, because now I agree with mannykannot’s first gp addressing your original lament over those you interpret as “moving the goalposts”:

If we are going to define AI as whatever AI researchers are working on then the only way the goalposts will not move is when they are not making progress.

Your “definition” of “AI” is full of wishy washy anthropomorphisms like “perceive” and “discover”, and elsewhere is so broad it could apply to just about anything.

An astable multivibrator with two LEDs at each output and a switch fits your definition.

The circuit is a “machine” that “perceives” the switch being pressed then “discovers” which side of the circuit will “maximize its chances of achieving the goal” of illuminating the environment.

We have plenty of medicine without anyone saying "aspirin isn't medicine" or "heart transplants aren't medicine"

People do in fact say blood letting the humors “isn’t medicine”.

So your interpretation of “medicine” is yet another common example that also exhibits your apparently super rare “AI effect”?

But this attempt at analogy is just a distracting digression.

You appear to be changing the rigidity of your definition of “AI” ad hoc to satisfy whatever argument you’re trying to make in that moment.

In this thread alone you refer to products, the “aspirins”, as “AI”, but then claim your definition is that “AI” is a “field of research”, your “medicine”.

Take the press release from the product being discussed here and replace “AI” with “field of research”.

“Corpo just released a ‘field of research’ for your PC.”

Starting to “see how ridiculous your words look” yet?

ben_w
3 replies
8h9m

Your “definition” of “AI” is full of wishy washy anthropomorphisms like “perceive” and “discover”, and elsewhere is so broad it could apply to just about anything.

Neither "perceive" nor "discover" is an anthropomorphism. Not only on a technicality such as "anthropomorphism excludes animals, so you would be saying you think animals can't do those things" (I wouldn't normally bring it up, but if you want to live by the technical-precision sword, you get this), but also for the much more important point that a chess engine doesn't need to have eyes, and neither does a chatbot, nor even a robot: they only need an input and an output.

Definitionally all functions have an input and an output, and if you insist on mathematically precise formulation of "discover" I can rephrase that statement without loss as:

"AI is the field of research for how to automatically create a best-fit function f(x) to maximise the expected value of some other reward function r(f(y)), given only examples x_0 … x_n".

Any specific AI model is thereby some f() produced by this research.
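As a toy illustration of that formulation (entirely my own construction, not anyone's production system): search over candidate slopes for a linear f(x) = w * x, maximising a reward defined as negative squared error over the given examples:

```python
import random

# Toy version of "automatically create a best-fit f(x) to maximise a reward
# r(f(x)), given only examples": random search over slopes w for f(x) = w * x.
examples = [(1, 2), (2, 4), (3, 6)]  # the x_i paired with desired outputs


def reward(w):
    # Higher is better: negative squared error over the examples.
    return -sum((w * x - y) ** 2 for x, y in examples)


random.seed(0)
best_w = 0.0
for _ in range(10_000):
    candidate = random.uniform(-10, 10)
    if reward(candidate) > reward(best_w):
        best_w = candidate

print(best_w)  # close to 2.0, the slope that best fits the examples
```

The resulting best_w is "some f() produced by this research" in miniature; swap random search for gradient descent and the examples for a large corpus and the shape of the definition is unchanged.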

An astable multivibrator with two LEDs at each output and a switch fits your definition.

No, it doesn't, there's no "discover" in that.

And at this point, I could replace you with an early version of ChatGPT and the prompt "give a deliberately obtuse response to the following comment: ${comment goes here}"

People do in fact say blood letting the humors “isn’t medicine”.

So you’re interpretation of “medicine” is yet another common example that also exhibits your apparently super rare “AI effect”?

Those things, which are your examples not mine, were actively shown to not work, and this was shown by the field of research called medicine, so no.

(Also, "apparently super rare" is putting words into my mouth and wildly misrepresents "I wonder which other professions exhibit the same effect?").

Again, what you write here is so wildly wrong that I have to assume that your brand new account is either a deliberate trolling attempt, or a ChatGPT session with a prompt of "miss the point entirely". But of course, I have met humans who were equally capable of non-comprehension before such tools were available. (I think those humans were doing arguments-as-soldiers, but it's hard to be sure.)

You appear to be changing the rigidity of your definition of “AI” ad hoc to satisfy whatever argument you’re trying to make in that moment.

You asked for a definition, you got one, you complained about the definition. That's you being loose.

I have a fixed definition, and am noting how other people change theirs to always exclude anything that actually exists. Which you are doing, which is you being loose.

What would be shifting track, would be to use the observation that you are hard to distinguish from an LLM to introduce a new-to-this-thread definition of AI — but I'm going to say that Turing can keep the imitation game, because although his anthropomorphic model of intelligence has its uses, I view it as narrow and parochial compared to the field as a whole.

In this thread alone you refer to products, the “aspirins”, as “AI”, but then claim your definition is that “AI” is a “field of research”, your “medicine”.

No.

Aspirin is a medicine, an example of a product of the field of research which is medicine.

The equivalence is "[[aspirin] is to [medical research]] as [[route finding] is to [AI research]]". One can shorten "I perform medical research" into "I work in medicine" and not be misunderstood, you are misunderstanding the contraction of "this is an AI algorithm" to "this is AI".

You are the one dismissing the existing solutions in the field of AI and sarcastically suggesting that anyone who says otherwise thinks we've solved all AI problems and can stop researching it now — which is as wrong as dismissing aspirin as "not a medicine" and sarcastically suggesting that anyone who says otherwise thinks "we've solved all medical problems and can stop researching it now".

Take the press release from the product being discussed here and replace “AI” with “field of research”.

“Corpo just released a new ‘field of research’ for your PC.”

I see you're unfamiliar with the entire history of scientific research software, too.

yutrin
2 replies
7h9m

You asked for a definition, you got one, you complained about the definition. That's you being loose.

Wait, I’m trying to help someone who thinks that giving an idiosyncratic definition of a broadly used term when someone asks for one requires that the provided definition be applied universally and accepted as correct without scrutiny?

AI is the field of research for how to automatically create a best-fit function f(x) to maximise the expected value of some other reward function r(f(y)), given only examples x_0 … x_n

Cool, cool another ad hoc change. At least this one is more precise.

I see you're unfamiliar with the entire history of scientific research software, too.

Project much?

https://ai100.stanford.edu/2016-report/section-i-what-artifi...

Read up on: Heuristic Search, Computer Vision, Natural Language Processing (NLP), Mobile Robotics, Artificial Neural Networks, and Expert Systems.

These fields of research either lack reward functions altogether, or did so in earlier iterations.

And at this point, I could replace you with an early version of [redacted]

or [redacted] session with a prompt of "miss the point entirely"

Oh, I see why you are so passionate about these products now.

You can replace or dismiss anyone who points out your shortcomings with them.

ben_w
1 replies
7h0m

Wait, I’m trying to help someone who thinks that giving an idiosyncratic definition of a broadly used term when someone asks for one requires that the provided definition be applied universally and accepted as correct without scrutiny?

That's not even a coherent English sentence.

Cool, cool another ad hoc change. At least this one is more precise.

You asked for it. It's identical in meaning. There's nothing "ad hoc" about this in either the original or the precise form.

Artificial Neural Networks … lack reward functions

Deeply and fundamentally wrong.

But worse than that, the thing you linked to actively denies your own prior claim, which was (and I'm copy-pasting) "I struggle to see how anything we have today is “AI”", even though that quotation is a statement about your own beliefs.

Even by trying to use that source you are engaging as arguments-as-soldiers, using something that contradicts your own other points.

Oh, I see why you are so passionate about these products now.

I've heard much the same projection about my inner state from an equally wrong Young Earth Baptist. Like him, you have yet to demonstrate any understanding.

I've been interested in this since back when NLP couldn't understand the word "not", and back when "I write AI" implicitly meant "for a computer game".

yutrin
0 replies
6h5m

But worse than that, the thing you linked to actively denies your own prior claim, which was (and I'm copy-pasting) "I struggle to see how anything we have today is “AI”", even though that quotation is a statement about your own beliefs.

This is the insidiousness of your ad hoc definition flip flopping. You can claim you meant one and I’m addressing the other, and vice versa, whenever it suits you.

The linked article is about your “field of research” definition of the term “AI” while the quote from my original reply is addressing your “product goalposts” definition.

It's identical in meaning.

Well I’ve tried but am unable to find a dictionary that defines “discover” as relying on reward functions. Can you link me to one?

Here’s a popular dictionary’s definition: https://www.merriam-webster.com/dictionary/discover

> Artificial Neural Networks … lack reward functions

Nice, you removed the pertinent context of “or did so in earlier iterations” with that ellipsis to make your point. Nice.

Deeply and fundamentally wrong.

Read up on Minsky's SNARCs, which used trial-and-error learning without the need for a predefined reward function.

mannykannot
0 replies
19h40m

That's a good question, though I think art is one of the few things that is in some sense a parallel. While masterpieces of the past remain masterpieces today, the scope of what is considered art has been expanding, and while perfectly good work is being done today in styles that would have been recognized as art in the past, it probably will not attract the attention it would have gained if it had been produced in the past.

Perhaps the thing that makes AI different from other aspects of computing, in terms of how its progress is regarded, is that the term invites lofty expectations.

MrRadar
4 replies
1d2h

Yeah, take this XKCD for instance: https://xkcd.com/1425/ That was published 10 years ago but now the second item is (nearly) as trivial to accomplish as the first.

pavel_lishin
2 replies
1d2h

To be fair, we've had multiple research teams and ten years :)

jantissler
1 replies
23h4m

But that's not the point, I think. The point to me is: 10 years ago, this seemed nearly impossible to solve. "Weird, impossible sci-fi stuff". Now I have this exact feature in the Photos app of my iPhone. It not only finds my dog in the photos, but correctly determines its breed. It's amazing! And it works locally on my device, no servers involved.

But I'm sure many people would argue that it is not "real AI", because it is "just XYZ". So as soon as we figure out how to do it, it's declared banal. Like LLMs. I'm baffled by the people who think it's "just" this or "just" that. That little word "just" buries years of research and several breakthroughs.

Somehow, AI is always what AI can't do yet. It's always the next thing. The noise of goalposts being moved is sometimes deafening. And people don't even realize they're doing it.

ben_w
0 replies
22h44m

Mm.

I don't normally like reaching for SciFi in these conversations, as there's always at least one of someone who mistakes a story for reality, and someone who misses the point and accuses the person giving the SciFi as an example of the same…

…but there are two episodes of TNG Trek which come to mind:

1. Elementary, Dear Data: Dr. Pulaski asserts that Data, being a robot, is incapable of solving a mystery to which he does not already know the outcome. Data accepts Dr. Pulaski's challenge and invites her to join them on the holodeck. There, Geordi instructs the computer to create a unique Sherlock Holmes mystery, but accidentally specifies an adversary "who is capable of defeating Data" rather than "Sherlock" and thus shenanigans happen.

2. The Measure of a Man: Lawsuit over "is Data (a) a person with the right to refuse consent; or (b) a thing to be disassembled, studied, replicated, and used as a mechanical servant?"

We've had ongoing arguments about #1 with AI for a long time, even though Chess and Go playing models beating the best human players should've ended this one.

I think we're going to get a lot of real life cases like #2 even when they involve high-fidelity brain uploads.

quectophoton
3 replies
1d1h

It's long been noticed that every time AI researchers figure out how to do a thing, it goes from "this is impossible SciFi nonsense" to "that's not real AI".

https://en.wikipedia.org/wiki/AI_effect

The most extreme example I can find of this is Google Translate; I don't know of anybody who thinks of it as AI.

sebazzz
1 replies
23h42m

It is only because it already existed that it isn't branded as AI.

bsharper
0 replies
19h31m

It's also how Google generally brands consumer-facing products, which is just Google + Noun. Most of their non-enterprise products tend to have unambiguous names (Google Search, Google Maps, Google Calendar, Google Translate).

ben_w
0 replies
1d

The most extreme example I can find of this, is Google Translate; I don't know of anybody who thinks of it as AI.

Huh.

I'm (to my own surprise, given how old the feature is) still astounding people by demonstrating that it has augmented reality mode.

JohnTHaller
2 replies
1d

If someone created the first Bayesian filter for classifying spam today, they would call it 'AI'.
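And it would only take a few lines. A minimal naive Bayes sketch in the style of the classic word-frequency spam filters; the training data and the vocabulary size of 20 are made up for illustration:

```python
from collections import Counter

# Count word frequencies per class, then score a new message with Bayes' rule
# under a word-independence assumption, with Laplace smoothing.
spam = ["win money now", "free money offer"]
ham = ["meeting at noon", "project status update"]

spam_words = Counter(w for msg in spam for w in msg.split())
ham_words = Counter(w for msg in ham for w in msg.split())


def p_spam(message, prior=0.5):
    # P(spam | words) proportional to P(words | spam) * P(spam).
    ps, ph = prior, 1 - prior
    for w in message.split():
        ps *= (spam_words[w] + 1) / (sum(spam_words.values()) + 20)
        ph *= (ham_words[w] + 1) / (sum(ham_words.values()) + 20)
    return ps / (ps + ph)


print(p_spam("free money") > 0.5)      # True
print(p_spam("status meeting") > 0.5)  # False
```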

ben_w
0 replies
1d

P(what you did there | I see it) = P(I see it | what you did there) * P(what you did there) / P(I see it)

theanonymousone
1 replies
1d

Do bot detectors check browsing history of a prospective bot? How?

ben_w
0 replies
1d

All those tracking cookies, or so I'm told.

tpoacher
0 replies
1d2h

To paraphrase the Incredibles, "if everything is AI, then nothing is".

_heimdall
0 replies
22h23m

Well that's really just a side effect of capitalism and marketing. Some AI researchers are working through incremental discoveries needed to potentially have artificial intelligence. Companies come along and roll up each innovation into products and pitch them as some massive AI revolution.

LLMs, at least as they are publicly understood, aren't AI any more than OCR is.

breadwinner
10 replies
1d3h

Demos show that you can search for "Blue bag" even if the words "blue bag" don't appear on the screen. If there was a blue bag in a photo, say in a PowerPoint slide, it will find it.

cjk2
5 replies
1d3h

To be fair Apple Photos does this.

But it does it to your photos not your entire universe.

hu3
1 replies
1d3h

Same for google photos.

I searched for "smiling in beach" and it worked flawlessly.

I reckon Microsoft will want to offload this kind of search processing to user machines with the advent of the "AI processor" computers we saw recently: https://blogs.microsoft.com/blog/2024/05/20/introducing-copi...

mmcnl
0 replies
1d1h

I believe Apple does it completely locally on the device, whereas Google searches in the cloud.

sleepybrett
0 replies
1d2h

spotlight is also indexing pretty much every file you create on your mac.

ethagnawl
0 replies
1d2h

Dropbox has also been doing this for years. It's helpful sometimes and other times completely misses the mark. Personally, I do find it to be creepy and it has me edging closer to using a more privacy-focused cloud solution to backup my important documents, photos, etc.

acdha
0 replies
17h0m

There’s a really interesting angle about how the model of the service fits with the user’s concept and consent here. Apple Photos only indexes things you choose to save there, and you have to grant access to other things like Siri using it, and most people seem to be comfortable with that because it’s easy to understand the scope and opt-out.

Recall is the opposite: on by default and recording everything, which leaves people uncertain whether they can even reliably opt out, or what opting out means if they’re interacting with someone who hasn’t. Combined with other areas of Windows 11 where Microsoft has been doing things like putting ads in the Start menu, it really cuts into users’ trust. The blowback has been pretty impressive for what might otherwise have been a relatively ignored feature.

simonw
1 replies
1d3h

I'd love to know how they are doing this! Are they running an on-device CLIP embedding model of some sort? Could they even be shipping a multi-modal LLM like Phi-3 Vision?

Firmwarrior
0 replies
1d2h

OneDrive and the local Microsoft photos app have been doing it for a decade already.

renewiltord
0 replies
1d3h

My favourite feature in Apple and Google Photos. I use this all the time and have gotten so used to it I get annoyed when it doesn't find something.

cflewis
0 replies
1d3h

In some ways, I find this even creepier than when I thought it was more than an SQLite database.

simonw
3 replies
1d3h

Answering my own question (with the help of replies on Twitter: https://twitter.com/simonw/status/1798368111038779610 )

Best guess is that there's some level of semantic analysis going on beyond just OCR, which tips the balance (at least for me) to being an "AI" feature based on loose current usage of that term.

The clue is that in this demo https://www.youtube.com/watch?v=aZbHd4suAnQ&t=1062s they show searching for "blue pantsuit with sequin lace" finds something that was described as "peacock" in the text. It looks like this is based on embedding search against image embeddings.

And... one of the 3 SQLite databases created by Recall is called "SemanticImageStore" - which suggests to me there may be a CLIP-style image embeddings model being run on-device here.

Those databases contain "diskann" columns which are likely a reference to Microsoft's vector indexing library: https://github.com/microsoft/DiskANN
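If that guess is right, the search itself reduces to nearest-neighbour lookup in a shared vector space, which is exactly what DiskANN exists to make fast at scale. A toy sketch with invented 3-d vectors standing in for what a CLIP-style model would produce:

```python
import math

# Toy embedding search. A real system would get these vectors from an image
# embedding model; the filenames and 3-d values below are made up.
image_index = {
    "slide_07.png": [0.9, 0.1, 0.2],   # pretend: peacock-coloured pantsuit
    "slide_12.png": [0.1, 0.8, 0.3],   # pretend: bar chart
}


def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Pretend this vector came from embedding the text query
# "blue pantsuit with sequin lace" into the same space.
query = [0.85, 0.15, 0.25]

best = max(image_index, key=lambda name: cosine(query, image_index[name]))
print(best)  # slide_07.png
```

The brute-force max over all rows is the part a vector index like DiskANN replaces with an approximate-nearest-neighbour structure.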

osolo
2 replies
1d2h

My team worked on this feature and I can confirm that yes, semantic search is being done for both text and images. This allows you to do fuzzy searches like you mentioned in your comment and use words to match images. Everything runs on your device (all the models, the vector database, etc.) in order to preserve privacy.

simonw
1 replies
1d1h

That's really impressive. Can you share which embedding models you are using for this? Also, is Phi-3 or Phi-3 Vision involved?

osolo
0 replies
22h18m

I don't think they've announced publicly what models we're using. I don't think there's any particular reason for this, but just in case, I can't name them here. I'll see if this can be addressed in a blog post or something.

I can tell you that Recall isn't using Phi. Rather, it's using a collection of models that are much more tuned (and therefore much more efficient) for the feature.

zamalek
2 replies
1d3h

It probably has RAG.

simonw
1 replies
1d3h

Is that confirmed?

Microsoft have been boasting that it all runs on-device, so are they running RAG using an on-device model (Phi-3 or similar) as part of the Recall feature?

rihegher
0 replies
1d3h

I had this confirmed during the Q&A portion of a demo presented by Microsoft. All the processing is supposed to happen locally using the NPU capacity of the hardware.

jrm4
2 replies
1d3h

I really can't fathom what it takes to casually be like "security/privacy concerns aside" here.

simonw
0 replies
1d3h

I decided not to fully type out "obviously the security/privacy concerns are the most important thing here and should not be ignored... but aside from that, does anyone know if..."

codetrotter
0 replies
1d3h

I think you are misreading the comment. Saying aside from those concerns does not dismiss the concerns themselves. It’s just expressing that this comment will speak about some other things, and not those things.

ryandrake
1 replies
1d2h

Recall doesn't particularly feel like an "AI" feature to me.

The word "AI" is becoming a pure marketing checkbox. You can't release any new product in 2024 without somehow attaching the word "AI" to it. It doesn't matter whether it actually uses AI or an LLM.

paulmd
0 replies
23h44m

Well, this one does, so what’s the point of this observation in this context?

randomdata
1 replies
1d3h

I assume you are working with the AI definition that is along the lines of: "Problems a computer may not be able to solve"?

OCR was absolutely considered AI before we knew how to do it. Now that we understand it, it's just computing, of course. But it still is reasonably considered AI by any other definition of AI.

KeplerBoy
0 replies
1d3h

Isn't SotA OCR done using CNNs? So it's AI even in the modern sense.

maciejgryka
0 replies
1d3h

I have no actual info on this, but I always assumed they'd compute some multimodal embeddings of the screenshots to then retrieve semantically relevant ones by text? And yeah, they'd have to do it using on-device models, which doesn't seem out of reach?

giantg2
0 replies
1d3h

I assume the AI part is using some model to interpret the stuff on the screen (maybe even non-text), which is then used in the search results. That's just a guess. I'm not really that interested in the Recall product, so I haven't dug into it.

EasyMark
0 replies
1d

But it doesn't just recognize characters, though? It recognizes things happening in video and scenes in pictures without text. For example, you could prompt "find my photos and videos with dogs in them".

bnchrch
37 replies
1d3h

Among the many logical fallacies already pointed out, I would like to add one more.

This assumes that Microsoft's future IS Windows.

I don't believe that has been the case since Satya.

And once you realize that, you can see the future is very bright, regardless of some implementation gaffe in a Windows feature.

toast0
22 replies
1d3h

What is their proposed future then?

It's not Xbox, they're losing that war, and it was based on Windows. It's not phones, they already lost that one and that was based on Windows anyways.

It can't be Office; there's less and less of a need for a paid office suite, and MS Office in a browser is rather bad.

It might be Azure, but IMHO the big onramp to Azure is Microsoft pushing corporations to move ActiveDirectory and Exchange into the cloud, but the end of Windows means the end of ActiveDirectory, and there's options for Cloud mail and calendar. Is Azure compelling for many if you're not already there because of Exchange?

And once you realize that you can see the future is very bright

The future may be bright, but how will you know if there's no Windows to see it through? ;p

Bluecobra
9 replies
1d2h

I thought for sure way back when the original Xbox came out that they were playing the really long game (no pun intended) to get a foothold and conquer the living room. After nearly 25 years I don't see that happening, and it seems like they are going to end up going the Sega route and just developing software.

kalleboo
1 replies
1d2h

They really tried to conquer the living room with the Xbox One, making it more about media than games, and the gamers revolted while the non-gamers shrugged. I think that's when the rest of Microsoft wrote it off.

Freebytes
0 replies
23h59m

For one thing, they need to hire someone to be "Chief Executive of Naming Things That Make Sense" because they are terrible with naming stuff.

"I got an Xbox One!" "Oh, the first one?" "No, the Xbox One, not the original Xbox 1."

Or, looking at the .NET evolution:

.NET Framework 4.8
.NET Core
.NET Core 2
.NET Core 3
.NET 5

There is no .NET Core 4 because people would get confused with .NET Framework 4 which is still getting bug fixes and has support. And they wanted people to move from .NET Framework 4 so they called it .NET 5 instead of .NET Core 5.

JohnMakin
1 replies
1d2h

Ironically, the fact that Xbox is so windows-friendly is exactly why I think its sales suffer and why they're going to end up just making software - for me as a gamer, I don't feel the need to own an xbox when I can play any xbox exclusive on my PC (and often with much better performance and peripherals). I can only play Playstation exclusive games on a Playstation, and the Playstation exclusives have been consistently very, very good. Pretty much everything on the Switch is exclusive to that console. So why would a budget-constrained person ever buy an Xbox?

cchi_co
0 replies
3h41m

I think purchasing a new gaming PC with comparable performance to a console like the Xbox Series S can be significantly more expensive than buying the console itself. But yeah the concept of exclusivity is lost here.

AlexandrB
1 replies
1d2h

I thought the same thing. It's crazy to see the state of Xbox today. Microsoft just seems bad at spotting, nurturing, and retaining talented game developers. Look at the recent layoffs where they nuked Tango Gameworks despite HiFi Rush's success.

Going all-in on Gamepass has also devalued their own games in many consumers' minds. Why buy it when you can just get it on Gamepass? I suspect the recurring revenue is not making up for the lost unit sales within any reasonable timeframe.

cchi_co
0 replies
3h49m

Microsoft just seems bad at spotting, nurturing, and retaining talented game developers.

Agree. I think, Sony has generally been more successful in cultivating strong relationships with its first-party studios.

bee_rider
0 replies
1d

The idea of a sort of home-computer situated in the living room instead of the office made a whole lot more sense before smartphones.

ThrowawayB7
0 replies
15h51m

"to get a foothold and conquer the living room"

They did try it, remember? During the Xbox One launch, there was a heavy marketing blitz to position it as a living room multimedia machine and console gamers absolutely tore them to pieces for it. They gave up after that debacle.

Freebytes
0 replies
1d

Well, the name of the company is Microsoft, not Microhard.

cflewis
2 replies
1d2h

AFAICT the future is just Azure and legacy enterprise contracts. Windows has no growth left to offer, stockholders only value growth, hence Windows is not of interest to the company.

bogwog
1 replies
1d2h

That's the genius plan: kill Windows marketshare so that it can grow again in the future!

bnchrch
0 replies
4h21m

I know this is a joke, but.

This strategy would eventually let them rebuild it from the ground up.

WSL is a good bridge for now, but a larger shift to a unix OS is still needed.

Rastonbury
2 replies
1d2h

It may be fun to joke at and hate on them, but look at their revenues, growth, and market shares, instead of thinking as a consumer about where you personally come across them and whether you like them. Near-monopoly in personal OS and productivity, top-2 in cloud and in search (cue the Bing jokes, but it makes them $12bn a year), rights to frontier AI models via OpenAI. The only rival who competes in multiple markets is Google, who still makes 90% of their money from ads and does not have the sales/distribution that MS has. If Microsoft doesn't have a future, then what of the rest of FAANG? None of them are as diversified or have OpenAI. Oh, I forgot GitHub and LinkedIn.

Freebytes
1 replies
1d

I think Google is afraid of Microsoft.

cchi_co
0 replies
3h54m

There are areas where Google might feel competitive pressure from Microsoft.

mike_hearn
0 replies
1d2h

Less need for an office suite? Tell that to all the companies that are not only paying for Office but are now migrating from Slack to Teams, thus becoming even more dependent on it.

lenkite
0 replies
1d1h

What is their proposed future then?

Microsoft's Future is Azure AI POWERED CLOUD. Their CEO explicitly said this. Windoze is just the legacy on-boarding software.

hnben
0 replies
1d2h

What is their proposed future then?

Selling Azure and consulting to governments and big corps.

Is Azure compelling for many if you're not already there because of Exchange?

Most are already on Exchange, I think.

Also: I think MS wants Azure to be compelling even without Exchange. (though I am not sure if that works out)

gary_0
0 replies
1d2h

What is their proposed future then?

Whatever IBM and Oracle do to stay solvent in spite of being irrelevant to most people and even most software people.

dartharva
0 replies
13h26m

there's less and less of a need for a paid office suite

Since when? Is the world's white collar population decreasing?

Sammi
0 replies
9h10m

Have you looked at Microsoft's revenue breakdown by segment? Cloud already dominates the totals, but more importantly it also dominates growth. Gaming, devices, and Windows are shrinking pieces of the MS pie.

Now you understand Microsoft. They have no reason to care about Windows any more.

giancarlostoro
3 replies
1d2h

All they need to do to make Windows great is strip out all of the crap nobody asked for. There's an insane number of sketchy anti-patterns.

AlexandrB
1 replies
1d2h

Windows is increasingly just a marketing channel for Microsoft to push their various other products. Unless you're an enterprise/government customer, the crap is the whole point.

giancarlostoro
0 replies
1d1h

Which is why I migrated to Linux. Proton by Steam plays all the games I care about. I can do all my dev work on Linux just fine.

Freebytes
0 replies
23h54m

We need to create something that looks better! Let us update a small set of applications one at a time and leave the others that look like they are from Windows 98 still in place.

Years later, that look is old. We should update the look. But what about the other stuff we have not updated yet that is still old looking? We will get to it later.

Meanwhile, you have three or four different layouts and designs and patterns throughout the system. It is crazy. They finally updated "System" to look like the rest of Windows, but if you click "Advanced System Settings", it still brings up the old System Properties. And it is crazy that the old one is still easier to navigate and find what you are trying to find.

causal
2 replies
1d3h

Exactly. Windows is almost a loss leader at this point.

mnau
0 replies
1d2h

Hell, .NET is basically a loss leader at this point (with significant offloading to a bunch of volunteers), and it is their ramp to Azure spending.

Sammi
0 replies
9h8m

Is it even? Linux is just as good an on-ramp to MS cloud as Windows. I cannot see how Windows is generating secondary income for MS. All their largest and growing money-making products work just as well on Mac, Linux, or the web.

TacticalCoder
2 replies
1d3h

This assumes that Microsoft's future IS Windows.

Is Microsoft's future a smartphone OS that'll rival the iPhone or Android phones?

malfist
1 replies
1d2h

No, I think microsoft's future is encyclopedia software

toast0
0 replies
1d

That and high end gaming tables.

DrBazza
1 replies
1d2h

This assumes that Microsoft's future IS Windows.

I don't believe that has been the case since Satya.

Indeed. When MS has several multi-billion-dollar streams of revenue - Office, Azure, Xbox, game labels, and so on - and the "cloudy" part of computing is almost exclusively Linux, there are fewer compelling reasons for them to invest in Windows.

With other moves like WSL, 'winget' and 'sudo', it's clear Windows is slowly moving towards a facade that mimics Linux from the CLI.

In 2000, MS was Windows 9x, 2000, and Office.

Consider that in 2004, you'd go to a site and download a Windows-only piece of software, and in 2024, you'll now find Apple-only software, or Linux only.

Arguably, the 'year of the Linux(-like) desktop' has been MacOS for well over a decade. It's a gateway drug to full Linux, with very little friction in comparison to Windows (filesystem paths, Win32 api, etc.).

donmcronald
0 replies
1d

'winget'

IMO the main purpose of WinGet was to kill AppGet, because MS doesn't want a 3rd party to control the package repository for an easy-to-use distribution system.

As far as I can tell, custom WinGet repos need to be run on Azure, so MS basically has a stranglehold on distribution. If you publish anything they don't like, they can label you as malicious and ban your Azure account. Regardless, the friction of running your own repo is high, and most people will capitulate and submit their apps to the official repo for approval (by MS).

Big tech has been relentlessly locking down software distribution for the last decade, and killing everything but the Windows Store, WinGet, etc. has to be a long-term goal for MS.

mrweasel
0 replies
1d1h

This assumes that Microsoft's future IS Windows.

That pretty much leaves the world with a defunct operating system running on 80% of all desktops and laptops. I'm not suggesting that you're wrong; in fact I'd agree that Windows has taken a backseat at Microsoft, but that's pretty scary.

Given the new Exchange Server licensing, I think it's pretty safe to assume that Microsoft would like to exit the on-prem software business at the earliest convenient time. I just question if the world is ready for that, or that's even what we want.

gorjusborg
0 replies
1d3h

Regardless of whether Microsoft == Windows, a company's brand matters, especially when trust is required for adoption of their new offerings.

Microsoft had been trying, it seemed, to improve their image, with some success in recent years.

Their recent decisions have revealed that their priorities are not very aligned with their users. The future may be bright, depending on their goals, but I have no interest in being a customer of a company that makes decisions that are so user hostile and frankly, idiotic.

generalizations
36 replies
1d3h

No, they aren't. They're desperate.

We've realized for some time that not only are language model technologies going to dominate our future, but that they require massive compute to run. We've also been talking about Apple - their silence, but also their hardware capabilities to run LLMs on consumer devices. Every other player (OpenAI, Facebook, Microsoft, etc) is faced with a billion dollar datacenter investment that still provides a worse experience (latency & privacy).

Apple has a huge advantage because they get to pay nothing for a superior compute infrastructure that comes with a better privacy and latency story. As soon as they release something, it's going to be good, and it's going to be free, because it runs fast on consumer hardware and consumer electricity. A billion-dollar datacenter that's upgraded every N years can't compete with that. Microsoft knows this, and they're trying to get out in front of whatever Apple comes up with. That's why they're telling manufacturers to install these "NPU" things - it's a direct equivalent to the iPhone/iPad/Mac compute capabilities.

So no, Microsoft probably knew this would be the reaction. But it's better to get dragged through the mud for lack of trust than for being left behind the cutting edge. In the former case you're just a slimeball; in the latter case you're a moronic slimeball.

causal
12 replies
1d3h

Agree on the Apple strengths, disagree that MS is scared. PCs just aren't where they make most of their money anymore.

Enterprise is extremely lucrative for them, and their headstart in the AI space may be just what they need to close the gap with AWS.

CuriouslyC
9 replies
1d3h

Azure is bad. Google is the one that's going to leverage AI to catch up with AWS, GCP is actually good and they've already partnered with Huggingface.

yesdocs
2 replies
1d2h

Why is azure ‘bad’? Is it because you dislike Microsoft?

In my experience Azure is really good and getting better. Even if AWS is more popular, Azure is gaining traction, is more intuitive and easier to use, and, in my experience, cheaper.

Hikikomori
0 replies
1d2h

Had to do multi cloud with Azure, worked on network infrastructure, vnet, peering, firewall, vpn gateway, etc.

Documentation was almost worthless, leaving a lot of room for interpretation. I encountered and filed more bugs trying to get it up and running than I did in years of using AWS. Support was mediocre. Non-existent IPv6 support. This was a few years ago, and in comparison to AWS.

CuriouslyC
0 replies
1d2h

They've had long-running security issues and performance edge cases that make relying on them a bit of a time bomb.

bruce511
2 replies
1d2h

Features aside, I simply don't trust Google to keep GCP running in the long term. Their propensity to kill projects (even successful ones), plus their gratuitous price hikes (e.g. Google Maps), has made me Google-negative.

Sure, they might run GCP forever. Sure, they may offer some sort of customer service at some point. Sure, they may decide to keep pricing at sane levels. The key word in all that is "might" - and that requires my _trust_.

And for better, or worse, my trust simply isn't there.

magicalhippo
0 replies
22h52m

For us Google wasn't even on the table, due to what you mentioned.

It was AWS or Azure. AWS didn't have a data center in our country and our customers are required by law to keep their data in the country, so easy win for Azure.

CuriouslyC
0 replies
1d2h

I 100% feel you on the google trust. I think it'll end up being central to a lot of their other efforts so I'm expecting it to stick around, but hybrid cloud solutions are just a smart bet anyhow.

bongodongobob
2 replies
1d2h

You realize that Azure is much more than compute right?

yesdocs
1 replies
1d2h

Of course, and why is azure ‘bad’?

bongodongobob
0 replies
1d1h

It's not; it's great if you have a large org.

Zhyl
1 replies
1d3h

What would you do if you had several billion devices that you had root access to, used directly by humans and isn't making you any money in sales these days?

The 2024 answer is to use it to capture data to train AI.

Microsoft could lose 9/10 of their market share (which they won't) and still have millions of devices harvesting training data for them.

lalalandland
0 replies
1d2h

If GDPR compliance is at all questionable, Microsoft will not be able to push this to EU countries. The risk for a business of getting into GDPR trouble is too high.

jerf
7 replies
1d3h

"We've realized for some time that not only are language model technologies going to dominate our future"

Close, but not quite. What we realize right now is that AI has pumped, pumped, pumped a whole lot of stock valuations sky high, and the first faint glimmers of the market asking that some actual money be delivered in proportion to the sky high valuations are starting to appear. (Note carefully the "in proportion to"; it is not enough to make a bit of cash. These valuations are enormous.) The stocks so pumped need results more-or-less right now, since propagating anything they have right now out into the market to actually gather revenue is going to take some time.

I expect a couple of other Hail Mary plays like this to show up too.

And again, let me underline, emphasize, and highlight: this isn't about whether AI can make "any" money. It's about whether it can justify those staggeringly high stock valuations in, say, the next year or so. This is a fairly uphill battle. Even if you are a True Believer that AI will be an economic revolution in literally the next 12 months or so, it's hard not to look at the market and come to the conclusion that right now every AI player is being valued as the inevitable victor and the inevitable beneficiary of 100% (or more) of the AI gain; e.g., Nvidia is being valued as if there will never, ever be any competition for AI hardware despite all the incentives, each AI user valued as if they are the inevitable winner in their respective domain, etc. They can't all win simultaneously like that; it's just impossible.

(On the plus side, it's really just stock valuations. I don't expect an AI winter this time; the tools are much more valuable than they were last time. I mean, I would still very firmly say that what we have right now is grotesquely overhyped, but there's still quite a bit of "there" there. The previous AI hype cycles didn't have anywhere near as much "there" there.)

AlexandrB
3 replies
1d2h

The costs are also largely being ignored at the moment. Both the obvious marginal costs (electricity, hardware for compute, dev time) and the hidden costs (flood of spam and misinformation).

russiancapybara
0 replies
1d2h

Would be awesome to have nuclear power stations dedicated to powering these data centers for inferencing.

I believe Amazon just purchased a nuclear-powered datacenter campus in Pennsylvania for this.

Yoric
0 replies
1d2h

Also, the environmental cost. It's hidden, but won't be forever.

Bluecobra
0 replies
1d2h

I was really surprised to read the other day that Dell isn't making money on AI servers. I played around with the server build tool on their website and was surprised to see that an H100 adds ~$50K to a server! How the heck are these companies spending $100K per server going to see any ROI? Are all these costs going to get written off as R&D? This really seems like the dot-com days, but worse somehow.

https://www.cnbc.com/2024/05/31/dell-shares-fall-ai-servers-...

solidasparagus
1 replies
1d2h

NVIDIA has fundamental hardware and software advantages that will take years and years to catch up on. The general purpose part of GPGPU is no joke, just ask the people trying to build competing hardware and an alternative to the CUDA stack. But...

NVIDIA is massively investing in AI companies with the expectation (explicit or otherwise) that they use the money on NVIDIA hardware. And they are pouring in ridiculous amounts of money. I think this is severely impacting the ecosystem, creating overvalued and overfunded AI startups that lack viable businesses and driving essentially fake revenue to NVIDIA that is unsustainable. The money is basically free for NVIDIA to spend, maybe even net positive between revenue numbers and progress towards monopolizing the AI stack. Time will tell if the bubble will pop or if one of the plays will hit and turn a fake flywheel into a real one, but I definitely agree that the stock valuations are too high.

Wytwwww
0 replies
1d2h

It's not that clear that Apple/local AI is necessarily going to compete directly that much with large models running in datacentres (outside of some relatively limited use cases), and there don't seem to be many signs that Nvidia has any serious plans to enter the consumer market and compromise their insane margins.

mjr00
0 replies
1d2h

Spot on. The current wave of GenAI has created a bunch of new markets, but those markets are, at least currently, the unsavory types: LLM-based virtual girlfriends, AI hentai games running off stable diffusion, SEO and social media spam with ChatGPT, etc.

The best we've got in the "regular" enterprise world is excellent parsing of unstructured documents. Which is, to be sure, hugely valuable and will make a big difference in a lot of industries. But isn't nearly enough to justify the insane evaluations and investment companies are making into AI right now.

Two years into the AI boom, and it's starting to look closer to blockchain and the metaverse than the iPhone.

oldpersonintx
2 replies
1d2h

Apple is in terrible shape on software, AI is driving this home

the only tier-one, network-effects app they control is iMessage

search? someone else

maps? someone else

video? someone else

audio? someone else

and now they will go to someone else for AI

what did they do in the meantime? VR that bellyflopped

boxed
0 replies
1d2h

You forgot iOS. App APIs are also network effect things.

AlexandrB
0 replies
1d2h

Not sure why this is relevant. Apple's forte was always the hardware (and lately the greater focus on privacy). And you could easily say the same about Microsoft:

search? someone else

maps? someone else

video? someone else

audio? someone else

Arguably Apple is at least better positioned than MS for some of these because people actually use Apple Maps and Apple Music. When was the last time someone whipped out Bing Maps?

surgical_fire
1 replies
1d2h

As soon as they release something, it's going to be good

Huh?

How is that a given?

luma
0 replies
1d2h

It is absolutely not true; they already released a model (OpenELM) and it compares extremely poorly versus similar-size models from the likes of MS.

This chart is correct and to scale: https://i.imgur.com/5mjlesU.png

yyggvbb
0 replies
1d2h

I think Microsoft's software teams are already working behind their legal teams. Their business model is more about licensing than making products. Likely they'll force some kind of regulatory situation that's advantageous to them. For example, no AI-capable OSes for government work, or something like that.

pjkundert
0 replies
1d3h

Suddenly, distributed private inference, compute, storage and bandwidth mediated by Holochain at global scale will become … valuable.

luma
0 replies
1d2h

Apple has a huge advantage because they get to pay nothing for a superior compute infrastructure that comes with a better privacy and latency story.

Right now, AI is a pay-$B-to-play and you need the massive datacenter compute for training and skilled researchers to develop. Apple silicon is neat, but let's not pretend like they have any credible story to tell about AI, and their recent deal with OpenAI suggests that they've been forced to punt.

As soon as they release something, it's going to be good, and it's going to be free, because it runs fast on consumer hardware and consumer electricity.

Apple recently released their first public LLM, OpenELM. The largest released model is OpenELM-3B. The Instruct version of this model scores a 24.8 on the MMLU benchmark, which is a pick-one-of-four multiple choice test.

Apple's biggest and best model somehow manages to get less than 25% on a 4 option multiple choice exam. For reference, MS released a similar size model Phi3-mini-4k which scored a 69 on that same exam.

Yes, Apple has some nice local hardware to run models, but they don't have the models. They don't have the models because they don't have the people and they don't have the compute.
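
A quick sanity check on those numbers (model names and scores as quoted above; MMLU is four-option multiple choice, so uniform random guessing scores 25% in expectation):

```python
# Compare the quoted MMLU scores against the random-guess baseline
# for a four-option multiple-choice exam.
random_baseline = 100 / 4  # 25.0%, the expected score from uniform guessing

scores = {"OpenELM-3B-Instruct": 24.8, "Phi3-mini-4k": 69.0}  # figures quoted above

for model, score in scores.items():
    relation = "below" if score < random_baseline else "above"
    print(f"{model}: {score}% ({relation} the {random_baseline:.0f}% chance baseline)")
```

In other words, taking the quoted figure at face value, the larger OpenELM model lands slightly below what blind guessing would average on this benchmark.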

coltonv
0 replies
1d3h

We've realized for some time that... language model technologies going to dominate our future

Citation Needed.

ca_tech
0 replies
1d2h

The realized value of LLMs is going to be the output of a data-to-AI pipeline: from raw data to actions or insights. This is Microsoft's play to control the entire process. They are attempting to abstract away all the "tooling" needed to manage and process data. It falls down because no discretion is left to the consumer. Objectively, the computer is already processing all the data you are interacting with; subjectively, that processing is assumed to be ephemeral. Computer forensics proves this wrong. So, to refine my opening statement: the realized value of LLMs is going to be the output of a "curated" data-to-AI pipeline. Which Microsoft is not providing with this solution.

bee_rider
0 replies
1d3h

It seems like a somewhat far fetched explanation, but then I can’t think of a better one. They come up with this farcical application that nobody wants, to justify the hardware requirement, so that at least the hardware will be out there for when Apple shows them what the actual application is.

amelius
0 replies
1d2h

Just wait until TSMC asks for 30% of Apple's revenue or they'll kick Apple out of their Fabstore.

Fabrication is where the tech advantage lies, not hardware design IP. Anyone nowadays can program VHDL, see open source hardware. The challenge is actually fabricating it.

Wytwwww
0 replies
1d2h

Apple - their silence, but also their hardware capabilities to run LLMs

billion dollar datacenter investment that still provides a worse experience

It's not at all clear to me that those very small local models would compete directly with large LLMs run in datacentres for the same use cases. If that were the case, MS etc. would also require significantly less compute power than is currently expected, so Nvidia would be the real loser (unless they shift to consumer HW).

Also, does Apple have such a huge advantage compared to the newer AMD/Intel/Google/Qualcomm(?) chips with NPUs? Yes, Apple started investing sooner, but all the phones/tablets and the overwhelming majority of laptops are crippled by extremely low memory capacity (in the context of running LLMs). So are they really that far ahead? E.g. what real advantages does the latest iPhone have compared to a Pixel 8 Pro? (Sure, it's still a niche device with a seemingly inferior chip (but more memory), but based on what we know Google seems to be ahead in AI development and will make their solution available on all high-end Android devices.)

As soon as they release something, it's

I doubt that's a given; their software has long ceased to be particularly exceptional, and on the hardware side the only actually innovative product they have released in years has been a flop with a very unclear future.

SideburnsOfDoom
0 replies
1d3h

language model technologies going to dominate our future

I know many people who are _far from convinced_ of that. But that's not the relevant point: Microsoft, Google et al. clearly are.

And watching them collectively, desperately stampede after the mirage, at great cost, is certainly something.

Miraste
0 replies
1d2h

Apple doesn't have as much of an advantage as it appears. Their Neural Engines are fast and getting better, but modern machine learning models need another resource: fast RAM. Unfortunately for Apple, their whole product stack is segmented by RAM. There are millions and millions of Apple devices with lightning-fast processors and 8GB or even 4GB of RAM. These won't be able to do anything like Copilot+ unless they make calls to a datacenter, as Microsoft is doing.

Their existing local models are hit-or-miss already. FaceTime's audio transcription is laughable. Whether this is due to memory requirements, I don't know, but it doesn't bode well for further models.
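
The RAM pinch is easy to make concrete with a back-of-envelope sketch (the parameter counts and quantization levels below are illustrative assumptions, not Apple's actual requirements):

```python
# Rough RAM footprint of just the model weights, ignoring the KV cache,
# activations, and everything else the OS needs. Decimal GB for simplicity.
def weight_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for params in (3, 7):
    for bits in (16, 4):
        print(f"{params}B params at {bits}-bit: {weight_gb(params, bits):.1f} GB")
```

Even a 3B-parameter model at 16-bit weighs in at ~6 GB of weights alone, which simply doesn't fit on a 4GB device (or comfortably on an 8GB one) alongside everything else.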

delichon
18 replies
1d3h

  The behavior of any bureaucratic organization can best be understood by assuming that it is controlled by a secret cabal of its enemies. -- Robert Conquest's Third Law of Politics
I'm grateful that it's not compulsory for me to financially support this particular bureaucracy.

breck
6 replies
1d3h

Wow, thanks for that quote.

When I worked at Azure, one of the things I was responsible for was the "deep link" code to get to the management page for a resource.

So I became intimately familiar with Azure "resource ids". I could not for the life of me figure out why they were designed the way they were designed. They should have just been a random or custom alphanumeric hash/permalink. I asked my questions and raised my concerns internally but failed to get things fixed (politics is not my strong suit).

My most probable explanation for them was that they were designed by a mole Jeff Bezos had gotten into the Azure org to work on behalf of AWS.
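
For anyone who hasn't worked with them: an Azure resource ID is a long hierarchical path rather than an opaque permalink, which is what made the deep-link work fiddly. A rough sketch of the shape (the GUID and names below are made up; the /subscriptions/.../resourceGroups/.../providers/... layout is the documented one):

```python
# Naively split an Azure resource ID into its key/value segments.
resource_id = (
    "/subscriptions/00000000-0000-0000-0000-000000000000"
    "/resourceGroups/my-rg"
    "/providers/Microsoft.Compute/virtualMachines/my-vm"
)

parts = resource_id.strip("/").split("/")
# Pair alternating segments; this is naive because the providers section
# actually nests namespace/type/name, but it works for this simple example.
pairs = dict(zip(parts[0::2], parts[1::2]))

print(pairs["resourceGroups"])   # my-rg
print(pairs["virtualMachines"])  # my-vm
```

Readable and self-describing, but every deep link has to carry (and correctly escape) the entire path, versus a single opaque token.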

gradus_ad
2 replies
1d3h

Incompetence does a fine job of explaining this, no?

throwup238
0 replies
1d3h

Microsoft’s signature Incompetence™ Enterprise Edition no less. Overcomplicating things the enterprise way is exactly what I expect from them.

SigmundA
0 replies
1d3h

Yes, Hanlon's Razor is sort of the other side of the coin here, almost like Clarke's Third Law except:

"Any sufficiently bureaucratic system is indistinguishable from a cabal of enemies"

katbyte
1 replies
1d3h

fwiw i am a fan of azure resource IDs as they make it easy to identify what and where a resource is vs a random magic string you have to go look up

breck
0 replies
21h56m

I think this could be accommodated with changeable permalinks.

I understand your perspective. Thanks for sharing.

NoMoreNicksLeft
0 replies
1d3h

My most probable explanation for them was that they were designed by a mole Jeff Bezos had gotten into the Azure org to work on behalf of AWS.

I've read this five times now, and each time I read it I flip on whether it's a joke or some kind of (semi-)anonymous tip being leaked to a journalist through a pre-arranged drop site.

divan
4 replies
1d3h

How should one understand this quote? I've recently started seeing bureaucratic organisations as simple one-/few-celled organisms. I.e. one person is smart and intelligent, with 90 billion neurones contributing to their behaviour, but a bureaucratic organisation with 1000 people is like an organism with 1000 neurons (or fewer).

constantcrying
1 replies
1d2h

How to understand this quote?

Exactly like it is written. Have you ever been in a 100k+ employee corporation, which has thousands of suppliers?

I have, and I do believe it is a pretty good description. Again and again the simplest tasks turn into monumental disasters; again and again you see incomprehensible "reorganizations" which break up functional structures and turn them into dysfunctional ones; again and again top management forces insane decisions which everyone at the bottom knows will not work.

I've recently started seeing bureaucratic organisations as a simple one-/few-cellular organisms. I.e. one person is smart and intelligent with 90 billion neurones contributing to its behaviour, but bureaucratic organisation with 1000 people is like an organism with 1000 neurons (or less).

That is not a good description at all. Bureaucracy happens because every single node is infinitely complex and responds to wildly different inputs.

lalalandland
0 replies
1d2h

Bureaucracy on a certain level can almost be described as an artificial living organism. It starts to behave in certain ways that nobody can control.

JohnFen
1 replies
1d3h

I think that quote is just a restatement of "We have met the enemy, and the enemy is us."

immibis
3 replies
1d3h

Have you bought a computer recently, or, like, ever? Every OEM pays a Windows tax, even on computers that don't come with Windows.

TacticalCoder
1 replies
1d3h

Have you bought a computer recently, or, like, ever?

A non-Mac computer you mean? I bought a desktop in parts, which I assembled myself and installed Linux on it. I don't doubt there may be some hidden Microsoft tax somewhere in the stack but I certainly didn't pay any Windows license nor any other Microsoft software.

NemoNobody
0 replies
1d2h

Microsoft owns a substantial share of Apple - in the 90s, Gates saved Jobs so that MSFT would have a "competitor"

ben_w
0 replies
1d2h

I remember people saying that in the 90s, I rather assumed it stopped with them getting sued for monopolistic practices.

yyggvbb
0 replies
1d2h

This is laughably paranoid. I bet you’re great with the ladies.

laxd
0 replies
1d2h

Would this be an example of the Yoneda lemma?

elorant
11 replies
1d2h

Recall is the last straw for me; my next rig will be Linux exclusively. I don't like it as a feature, don't understand its usage, and most importantly I hate the way Microsoft tries to force it onto my computer. Fuck that, and fuck them too. Next, I'm moving off Visual Studio to Rider as well. For over a decade now Windows has been a shitshow, with every new version being total crap. Windows 7 was great, 8 was crap. Ten was good after a while; 11 is also crap. They probably don't care about Windows that much anymore since they're making a shitton of money from Azure. But I have work to do and I can't keep fighting my PC.

kalleboo
9 replies
1d2h

What's stopping you from installing Linux this weekend? Why are you putting it off until your "next rig"?

elorant
5 replies
1d2h

Windows 10 works just fine. When it reaches EOL I'll move.

datavirtue
3 replies
1d2h

Explorer tabs are worth the move.

datadrivenangel
0 replies
1d

They changed Alt+Tab behavior. Windows 11 is dead to me.

dartharva
0 replies
13h22m

Absolutely not.

bishbosh
0 replies
1d

I have found them to be an utterly incompetent implementation.

Maybe it's an issue with my setup, but they are shockingly bad in a way that makes them feel hacked together. They lack basic functionality like scroll-wheel click on any navigation buttons. They are constantly slow to open. All while missing basic considerations, like "Back" not being tracked per tab, so opening a new tab and navigating to a folder makes my "Home" the previous folder for all tabs in that window.

Again, maybe it's my setup, but going back to Dolphin I am shocked they get away with something so half-baked.

dartharva
0 replies
13h22m

Exactly the same with me. W10 LTSC 2021 goes EOL in 2027; once that happens, we'll have no choice.

donmcronald
1 replies
1d

What's stopping you from installing Linux this weekend? Why are you putting it off until your "next rig"?

For me it's OneDrive. I wouldn't trust it with a grocery list, but I do work for people that use it and there's no way I'd be able to connect to a share with an unofficial OneDrive app.

Microsoft has so many little hooks that it's really hard to drop Windows if you work for / with anyone that uses anything in the MS365 ecosystem.

ku1ik
0 replies
11h49m

You can run Windows in VM, e.g. with virt-manager, which is a nice GNU/Linux GUI for QEMU/KVM. I keep Windows contained like this, would never let it run directly on bare metal these days. I even used to game on this VM (before switching to Proton) with near-native performance after passing through a Radeon GPU to Windows.

magicalhippo
0 replies
23h0m

Not OP, but a decent RDP alternative is my major blocker. None of the alternatives comes remotely close in terms of performance and features.

I also don't like the backup situation. I really like doing (incremental) full-disk backups due to ease of restore in case of emergencies. But I think ZFS-on-root will scratch that itch.

dartharva
0 replies
13h3m

I have used and tested several Linux distros, from Debian, Ubuntu and Mint to Fedora and Arch. I personally don't like how much time and energy it takes to configure and maintain them, and how there will always be some hardware incompatibility that we are just supposed to suck up.

But Microsoft has become ridiculous. All they have to do is keep one stable base and only ship compatibility and stability updates, but they insist on adding meaningless garbage that slows everything down and makes it difficult to work. I used to think it was common sense that, for software as critical as Windows, you add technological improvements without disrupting existing users' workflows. These geniuses do the exact opposite: keep the same legacy platform they've been using for decades, but push random silly design changes that only serve to piss existing users off.

Ah well.. here's to hoping the influx of new users to desktop Linux ends up convincing hardware manufacturers to provide first-party support. Once W10 support ends, it's done for me.

breck
11 replies
1d3h

Microsoft's achilles heel is Bill Gates' fundamentally wrong model of information, as demonstrated in his 1976 letter[1].

I am not sure if it is possible for Microsoft to maintain its large market cap and revenue streams once the population wakes up to the fact that _digital copyright_ has poisoned our society and needs to go.

That being said, I worked at Microsoft, and it is filled with brilliant, hard working, competitive people of high integrity, so if anyone can figure out how to turn that Aircraft Carrier around and succeed in a world with intellectual freedom, it is Microsoft.

When I started at Microsoft, people made fun of me for installing Ubuntu on my machine the first day, and for my love of Git over sd (Microsoft's previous VCS, which stands for "shit dumpster"). The year I left, Microsoft shipped Ubuntu with Windows and bought GitHub. They've started the turn. I hope they do the right thing, even if it means smaller profits (I'm not sure it does).

[1] https://en.wikipedia.org/wiki/An_Open_Letter_to_Hobbyists

logtrees
9 replies
1d3h

Why do you think that digital copyright has poisoned society and needs to go?

__MatrixMan__
4 replies
1d2h

I'm not who you asked, but I think that digital copyright has poisoned society and needs to go because to enforce it, you need a way to shut content off, so you need to build infrastructure that grants somebody censorship power. That's a high value target, and attracts the attention of precisely the people you'd rather not have censorship power.

The internet infrastructure that allows digital rights holders to protect their intellectual property also lends itself to things like billionaires owning media platforms so that they can influence public opinion: It's about protecting capabilities that third parties use to influence discourse at the expense of the users. If you reshaped the landscape to remove the chokepoints, there would be nothing for those billionaires to buy, and our technology wouldn't be amplifying cults of personality to the extent that it currently is (or rather, it would amplify based on personality, not ownership... an upgrade?).

We're living in a sort of influence aristocracy which is protected by the seats of control that were created to protect digital copyright.

logtrees
3 replies
1d2h

How would you otherwise protect the people who invest their time and resources into building digital tools and products? Without digital copyright isn't it open season on any digital idea since it can just be stolen?

LoganDark
1 replies
1d2h

Are you actually talking about software patents, not digital copyright? Software patents protect ideas (i.e. algorithms), software copyright protects implementations (i.e. source code and binaries).

logtrees
0 replies
1d2h

Both.

__MatrixMan__
0 replies
1d

My primary argument is that however bad it would be if content creators were left unprotected, the sort of platform abuse that we're seeing today is even worse. Happy artists are important, information symmetry (re: markets and re: democracy) is more important.

But yeah, suppose we've made that switch, and now we want a new way to encourage content. I'd propose that we build attribution into our protocols. Many years ago there was controversy over "reaction" videos on youtube. They were being taken down because half of the screen was copyrighted (while the other half included somebody seeing it for the first time).

Rather than having the content exist as a single thing that either is allowed to be public or is not, I'd say that the fight should converge on how much credit was given to the person who put together the reaction video (some), and how much credit goes to the creator of the reacted-to content (more).

When I say "credit" I mean that as you browse your browser would keep track of which content you consume, and which parts of it are attributed to which people (or maybe these are bank accounts or crypto addresses or whatever). Part of paying for internet access would be allocating some fixed amount, $15/mo say... for content. At the end of the month, you'd send the $15 to the creators of your content (split up based on your consumption) and send the ISP proof that you've done so. Otherwise, your ISP collects the $15 and it goes to some nonprofit dedicated to mediating credit disputes and to educating creators on how to ensure that they do in fact get credit for their work.

This way there's no incentive to not pay for the content you're consuming: you the consumer are out the money either way. Also, the browser doesn't have to share the consumption history with the ISP... in order to meet their regulatory burden, the ISP need only see proof that you have done so. Lastly, unallocated money goes towards ensuring that future money is indeed allocated to a content creator, so it's a negative feedback loop which would hopefully converge on a state where most people participate in funding their creators.
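The proportional split described above is simple to state precisely. Here is a minimal sketch, assuming a hypothetical consumption log that maps creator IDs to how much of their attributed content you consumed that month (the function name and units are invented for illustration):

```python
def split_budget(consumption: dict[str, float], budget: float = 15.0) -> dict[str, float]:
    """Split a fixed monthly content budget among creators,
    proportionally to how much of each creator's content was consumed."""
    total = sum(consumption.values())
    if total == 0:
        # Nothing consumed: the whole budget falls through to the
        # ISP-collected nonprofit fund described above.
        return {}
    return {creator: budget * amount / total for creator, amount in consumption.items()}

if __name__ == "__main__":
    # e.g. a reaction video: some credit to the reactor,
    # more to the creator of the reacted-to content.
    log = {"reactor": 1.0, "original_creator": 2.0}
    print(split_budget(log))  # {'reactor': 5.0, 'original_creator': 10.0}
```

The interesting design questions live outside this function: how the browser measures "amount consumed", and how attribution disputes over mixed content (like that reaction video) are mediated.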

If ISP's don't cooperate, we tax the bejeezus out of them and remove their protections re: who gets to use which buried cables such that new ISP's emerge that will cooperate.

LoganDark
3 replies
1d2h

I don't quite know how best to articulate why, but I agree.

Copyright sort of has the wrong idea. It's intended to prevent the copyright owner from missing out on the opportunity to capitalize on their work—similar to software patents (which are disgusting in their own right)—the rationale being the idea that if free copying ("piracy") were allowed to run rampant, the copyright owner would only ever make one sale.

However, this assumes that every pirate is the loss of a customer. This is certainly not the case, neither for me nor for most pirates out there. There are quite a few reasons people pirate software in practice, and almost none of them deprive the copyright owner of a potential customer. Either because the pirate never could have been a customer, or because they still are.

Here are some reasons I have either personally experienced, or commonly seen, for piracy:

- They can't pay at all. Say, kids without bank accounts. They may become customers later when they have money, but there's no use in campaigning against them now unless you want to make an enemy of them!

- They want to try before they buy. Some pirates buy software they like after pirating it. There's no point in buying something they won't use or like, especially if it's non-refundable. So they pirate first, as a sort of demo.

- They can't buy it even if they have money. For example, old media or retro games that are no longer being sold. There's no benefit to restricting people's freedoms here because there's no monetary gain to be lost because no sales are even being allowed.

- They can't own it with their money. Some people simply believe in owning the software they use and the content they watch. These people aren't going to care about streaming or subscriptions because it's viewed as a pointless expense. They'd be willing to pay more up front for real ownership, but if that's not available, they're not going to take a subscription.

- The buyer experience sucks. For example, streaming services or Adobe subscriptions. Many streaming services won't even do 1080p unless you use a smart TV. Adobe doesn't let you buy software to own anymore, only rent access to the latest offering, with whatever new changes their whims dictate. Microsoft has been moving in a similar direction for a while now.

- The price is too high. They aren't the intended customer and the copyright owner doesn't want their business anyway. No money lost by pirating.

- They don't want to give the copyright owner money. If not for piracy they still wouldn't consider the purchase.

But of course, there are still:

- The purchasing process is too annoying or too much effort compared to pirating. If pirating didn't exist as an alternative, maybe they would've had more patience, but even then, there's only so much bullshit that any one potential customer can handle.

- The desire to get something for free as the primary reason to pirate. This is typically how most copyright owners think of it, and yes, it does happen. It could be because of entitlement or because of a different philosophical belief (i.e. information should be free).

See Gabe Newell's "piracy is a service problem, not a money problem". I get all my games on Steam. I don't care about the DRM or the fact that I don't technically own the games. It's so easy, they treat me well as a customer, they respect my rights, so they're the natural choice for me. Not only that, they provide services which are genuinely helpful, such as P2P networking through their servers, which means anyone can host a multiplayer server and have their friends join. It is worth the money.

But I pirate shows I like once they're removed from Netflix or whatever other streaming service, because they're literally not available anymore. Literally, if there were a way for me to pay I would, but I can't pay money even if I want to. The copyright owner is basically telling me to go fuck myself. They shouldn't be allowed to do that and still prevent me from pirating the content that they won't sell me.

So yeah. Digital copyright is dumb.

smaudet
1 replies
1d2h

They shouldn't be allowed to do that and still prevent me from pirating the content

This is the only part I partially disagree with - privacy is a thing, and privacy of personal information is (or should be) a thing as well. This is the whole concept behind GDPR...

In case you don't think this is an actual problem, go look up Hunter Moore and the ethical issues with unfettered dissemination of information.

Of course its completely backwards that "movies" are given better protection than medical or other sensitive information....

The solution is not rampant piracy. For media concerns, all media/games/etc. should be archived by, e.g., the Library of Congress and remain available for purchase, unless determined by due process to be harmful content (revenge porn, for example). This doesn't need to be a convenient process: obtaining such media could be a matter of requesting/buying/shipping encrypted USB drives around, or purchasing a key for encrypted files online; licensing to distributors such as Netflix should still be the norm and the more economically/ergonomically viable option.

But I do at least agree that piracy itself is not the problem, it is a symptom of a larger socio-economic issue.

LoganDark
0 replies
1d2h

Copyright shouldn't be what I have on my personal information. That's the wrong model to use. I'm not selling my own personal information and I'm not miffed by the idea of losing out on customers. In fact, I don't even want customers. My information is already being sold without my consent or involvement, and that's fucked up. I don't want them to come to me, I want them to go to jail.

The correct model is indeed privacy, which is that people should have the right to their PII not being sold or distributed without their express, informed consent. This has nothing to do with piracy or copyright. I'd say people should have the right to their PII not even being recorded in the first place, but consent to recording is often implied when you voluntarily provide it in exchange for a service, so we still need additional protections on top.

I'd still appreciate not being recorded without my express consent, though. I appreciate countries that have laws against it.

I do appreciate GDPR and its effects on data storage, retrieval and deletion. But it's only one piece of a complete set of privacy-preserving regulations, IMHO.

logtrees
0 replies
1d2h

Wouldn't getting rid of digital copyright affect someone who produces the idea and the code, though? By empowering others to just steal their idea and their code.

cjk2
0 replies
1d3h

> That being said, I worked at Microsoft, and it is filled with brilliant, hard working, competitive people of high integrity, so if anyone can figure out how to turn that Aircraft Carrier around and succeed in a world with intellectual freedom, it is Microsoft.

This is no good if the management and marketoids are fuckwits though, which they are.

ivraatiems
9 replies
1d3h

A couple of notes here. Overall I agree with the point being made but there are a few misunderstandings in this post.

Microsoft don't want to be entirely dependent on Intel, especially as Intel's share of the global microprocessor market is rapidly shrinking,

He says Intel, but means amd64 architecture - Intel isn't the only or even the dominant consumer computer processor manufacturer these days, especially for desktops/laptops.

Recall takes snapshots of all the windows on a Windows computer's screen (except the DRM'd media, because the MPAA must have their kilo of flesh) and saves them locally.

I wonder if I could cause Recall not to work by just playing some DRM'd content half-transparent over whatever I was doing...

Now, "unencrypted" is relative; the database is stored on a filesystem which should be encrypted using Microsoft's BitLocker.

If you have that enabled, which most people don't.

Worse: even if you don't use Recall, if you send an email or instant message to someone else who does then it will be OCRd and indexed via Recall: and preserved for posterity.

I don't see this as a problem; there's nothing stopping anybody you send anything to from taking a copy of it. You can't take emails or IMs back. It's just dumb.

Suddenly every PC becomes a target for Discovery during legal proceedings.

This was already true and common.

Some commentators are snarking that Microsoft really really wants to make 2025 the year of Linux on the Desktop, and it's kind of hard to refute them right now.

Or, people will just keep using Windows 10, which is what >70% of the market is already doing.

bentcorner
2 replies
1d3h

I wonder if I could cause Recall not to work by just playing some DRM'd content half-transparent over whatever I was doing...

At that point why not just disable Recall? There will undoubtedly be a setting or, at the very minimum, a group policy option to disable the feature.

ivraatiems
1 replies
1d3h

Suppose your employer or network admin requires you to have it on.

bentcorner
0 replies
1d2h

If that's the case it's not your PC anyways, so there's nothing for you to personally gain by obfuscating any information it gathers.

Shared404
1 replies
1d3h

I don't see this as a problem; there's nothing stopping anybody you send anything to from taking a copy of it. You can't take emails or IMs back. It's just dumb.

I may trust someone personally, but not trust their ability to administer a system. This requires me to do both if I am sending information to somebody.

digging
0 replies
1d2h

Same reason I don't give out my real phone number unless I know how someone handles their phone. Doesn't take any malicious intent to grant Contacts access to all kinds of malicious apps. Other peoples' computers have to be assumed hostile.

digging
0 replies
1d2h

Or, people will just keep using Windows 10, which is what >70% of the market is already doing.

Support ends in late 2025 though. They've been trying to push the upgrade aggressively lately which is what got me to switch over my last machine early, so now I am happily Windows-free. And honestly, Mint is not much more complicated than Windows, while being immeasurably safer to use, it's just that Microsoft has inertia and a propaganda budget.

chucke1992
0 replies
1d3h

I wonder if I could cause Recall not to work by just playing some DRM'd content half-transparent over whatever I was doing.

The hilarious part is that an open clone of the feature - OpenRecall - already exists, so the feature is not going anywhere.

This was already true and common.

Exactly. Plus, removal of emails and history is something that courts frown upon these days.

LoganDark
0 replies
1d3h

I wonder if I could cause Recall not to work by just playing some DRM'd content half-transparent over whatever I was doing...

Protected content works at the compositor level, so all that'd happen is that your screenshots would have a half-transparent black window over them.

BadBadJellyBean
0 replies
1d3h

BitLocker is enabled by default on Windows 11. It's one reason for the TPM requirement.

ChrisLTD
8 replies
1d3h

Microsoft's recent quarterly earnings report showed a 17% rise in revenue, and their stock is up nearly 25% in the last year.

bigstrat2003
3 replies
1d3h

Good for them. Doesn't really matter if their customers start refusing to do business with them because they are scared off by stuff like this.

Rastonbury
2 replies
1d3h

Most of MS revenues are from F1000 corporations whose legal/security teams would never enable these features, and MS would never force questionable features on them. They increased quarterly cloud revenue by $7B; even if only $1B of that is from OpenAI models in Copilot/Azure, they'll pay back their investment in OAI in 2.5 years.

The author is just taking the opportunity to make a hot take on a single terribly implemented product (or data gathering attempt); the big machine doesn't care.

chucke1992
0 replies
1d2h

Most of MS revenues are from F1000 corporate who's legal/security teams would never enable these features and MS would never force questionable features on them.

Actually I think the feature will be enabled - companies have been spying on their users for ages. The thing is that modern security is not about locally stored data - you cannot log in to laptops without passwords, and a lot of devices even require cards (and flash drives).

Modern security is about networking (VPNs and such) and social engineering - phishing and the like. Malware, executables, etc. are not that much of a threat anymore.

bongodongobob
0 replies
1d2h

This is the correct answer. I know you're not supposed to say this but... the HN community is completely out of touch with regards to MS, what they do to make money, and how organizations leverage their products. For most orgs, leaving the M365 ecosystem isn't even remotely an option from any angle. You're not replacing Office with G Suite or any of the open options, no one is going to roll their own LDAP solution, etc.

I just laugh at the comments on every MS article on this site. The business world is bigger than HN users' 7-person startups.

doctorwho42
1 replies
1d3h

Cue comic about value for shareholders and end of the world.

hnben
0 replies
1d2h

their stock is up nearly 25% in the last year

So they performed as well as the average S&P500? I would expect better from a tech giant

Symmetry
0 replies
1d2h

If I could be short Microsoft's OS business while still being long their cloud provider business and their investment in OpenAI I would.

nextworddev
7 replies
1d3h

Microsoft (or any other FAANG) could leak all of our data, and not sure if it will face any real consequences. When was the last time a firm actually went "under" for a data security issue?

Some recent data security breaches:

- Snowflake

- AWS / Capital One

sickofparadox
2 replies
1d3h

Snowflake has actually been fairly consistent on saying that the breach was not of their underlying systems, but customer misconfiguration. I am reminded of the mid 2010s when AWS had certain systems public by default and tons of "data breaches" involved companies simply not making their data buckets private.

datadrivenangel
1 replies
1d

"Our stupid customers didn't turn on the 'safety mode' which our customer support people often tell them to turn off" is such a reassuring PR tack.

sickofparadox
0 replies
23h36m

If the default settings of a system are to make it public, then yes it is the "stupid customers'" fault for not clicking the check box to make it private.

Snowflake can't take the blame just because you went and dreamed up a scenario where customer support tells users to make their data public, then got mad at them for your imagination.

idontknowtech
1 replies
1d2h

What we need are large, mandatory fines for every data breach that happens. Say, $10k to every person whose address gets leaked. Then we'll see companies start treating consumer data as a risk, not just a low-cost asset.

lukev
0 replies
1d3h

Yes, I think everyone understands that leaks happen.

The problem being pointed out here is that you can't leak what you don't collect: this is a lot more data, and a lot more sensitive than what MS has ever collected before.

bnchrch
0 replies
1d3h

Dont forget Equifax!

crims0n
7 replies
1d3h

with dick moves like disabling auto-save to local files in Microsoft Word (your autosave data only autosaves to OneDrive)

Unrelated, but this burned me so hard a few months back - I was so pissed. Worked on a document for hours, saved it but the file was corrupted. No problem, will restore an autosave snapshot... only it was disabled in some update, and only works with OneDrive if you manually toggle the switch. I was flabbergasted.

selimnairb
1 replies
1d2h

This is why I subconsciously hit Cmd-S/Ctrl-S every 15 seconds.

boxed
0 replies
1d2h

On macOS they switched to live state save for documents decades ago.

cjk2
1 replies
1d3h

Stuff like that makes me appreciate Time Machine on my Mac. I have hourly snapshots of everything that changes. I don't even use VCS software when I'm writing code any more unless I have to collaborate with someone else.

Also, there are serious problems with OneDrive version history. On numerous occasions, files have failed to restore, or the UI is broken and unusable.

nothis
0 replies
1d2h

Dropbox is still unmatched as a service not trying to push you into a trillion-dollar monopoly (OneDrive, iCloud, Google Drive). It's admittedly been a while, but I tested a whole bunch of automatic cloud backup software at one point because I got pissed about some annoying niche cases, and Dropbox just worked through everything I threw at it, automatic versioning and recovery included. It doesn't mangle file names, handles huge files and folders with tens of thousands of small files, recovers old versions of accidentally overwritten files, works on Mac and Windows without missing a beat, and sending people files always works.

I guess for collaboration you want to use stuff like Google Docs and advanced versioning software for coding (I'm not talking about coding stuff here, btw, which might throw off some people on hacker news). But Dropbox delivers for "I need a backup and I don't want to think about it except when I mess up and need its help".

pradn
0 replies
1d2h

This is so baffling to me. Is it just to reduce traffic to OneDrive or something?

gigel82
0 replies
1d2h

Well, if you had recall enabled, you could get all your text back out of that :)
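Only half joking: as discussed elsewhere in the thread, Recall is reportedly on-device OCR feeding a local SQLite database, so "getting your text back" would be an ordinary query. A minimal sketch with an entirely hypothetical schema (the real table and column names are not public/stable):

```python
import sqlite3

# Hypothetical Recall-style local index; schema invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE snapshots ("
    "id INTEGER PRIMARY KEY, taken_at TEXT, window_title TEXT, ocr_text TEXT)"
)
conn.execute(
    "INSERT INTO snapshots (taken_at, window_title, ocr_text) VALUES (?, ?, ?)",
    ("2024-06-01T12:00", "report.docx - Word", "Quarterly revenue grew 17 percent"),
)

# Recover lost document text by searching the OCR'd snapshots.
rows = conn.execute(
    "SELECT taken_at, ocr_text FROM snapshots WHERE ocr_text LIKE ?",
    ("%revenue%",),
).fetchall()
print(rows)
```

Which, of course, is exactly why the security people are worried: anything that can read the file can run the same query.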

ARandomerDude
0 replies
1d3h

This is exactly why I use git for anything important. You have to remember to save and commit periodically but it has really saved me.

chx
7 replies
1d2h

If Recall is not already completely illegal in the EU, then Microsoft is doing an any% speedrun from launch to having EU-wide legislation specifically banning its product. There is no scenario where the EU will allow this to exist within its borders. The very valid concern about abusive spouses alone makes for excellent political fodder. You won't find a single soul in the EP who would speak up against legislation framed that way. The newly elected representatives would certainly enjoy an easy win when most of their work will be complicated and very hard.

buzzert
3 replies
21h12m

What if European citizens want this feature? Do they not get a choice?

rcxdude
2 replies
19h24m

I think pretty much everyone who's pissed off about this feature is pissed off because it defaults to on.

chx
1 replies
17h46m

No. Not even close. The problem is that it ships with the OS.

When Apple shipped Airtag and we said it's a fantastic tool for stalkers people said "but this is not new, there are other similar tools blah blah blah". Doesn't matter one whit. It's an Apple product, in accessibility and awareness it's so far above the rest it's not even funny.

Who cares that Rewind.ai exists? The chances of an abusive husband finding and installing that are incredibly low. Because of that, no one howled when it was released. But the chances of the same man clicking MS Recall on? Waaaaaay higher.

It's possible shipping with default off might clear the GDPR but that doesn't matter either, alas.

Of course, there's a lack of women in data science. https://msmagazine.com/2023/01/19/women-data-science-technol...

But as long as the field of data science remains predominantly male and white, it will be difficult to have inclusive advancements in artificial intelligence.

2023 January.

The freakin' UNESCO in 2019 https://unesdoc.unesco.org/ark:/48223/pf0000367416.page=1

the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education.

Today, women and girls are 25 per cent less likely than men to know how to leverage digital technology for basic purposes, 4 times less likely to know how to programme computers and 13 times less likely to file for a technology patent

Of course a product will arise which prominently hurts women, simply because there are not enough women around the table to say "don't do this".

chx
0 replies
4h4m

The techbros downvoting this without leaving a comment as to where I'm wrong just demonstrate how right it is :D

ChrisLTD
2 replies
22h48m

Why would Recall be illegal in the EU?

chx
1 replies
19h39m

1. GDPR Article 6 lists the lawful purposes for processing personal data. The only one that can apply is "if the data subject has given consent to the processing of his or her personal data", and opt-out won't cut it based on previous cases. Microsoft needs to ship Recall default-off in the EU and make it strictly opt-in. Given that it seems a simple keypress might re-enable it, this doesn't look good so far.

2. Even if Microsoft ships it default-off in the EU and makes it strictly opt-in, it's not at all clear whether Recall recording EU citizens' information from chats, emails and everything else on someone else's computer is lawful. I.e.: just because I sent you a message over WhatsApp or Viber doesn't mean I consented to you processing it with Recall.

3. Even if Microsoft manages to clear the GDPR hurdle -- and that's saying something -- the EP and the EC might still take issue with it: it certainly violates the spirit of the GDPR even if not the letter of it, and together with the domestic abuse issue, they will probably show that the EU is not always so slow to legislate.

dartharva
0 replies
13h28m

Recall works locally on the client machine from what I understand. The snapshots don't leave your computer, so all this may not be under GDPR's purview.

ksynwa
5 replies
1d3h

Can someone please enlighten me on the claim that the share of Intel processors is rapidly shrinking?

I don't know enough to believe or doubt it. Apple moving to ARM processors seems like an important enough event for something like this to happen. But from cursory reading, I felt that Apple's M processors are so good because of their SoC nature where there are coprocessing units specialised for common tasks. I just thought that something like this could be achieved by Intel with their processors too but it doesn't look like that's the case.

I am rambling but I hope someone can shed some light here regardless.

twobitshifter
0 replies
1d2h

1) Intel hit a rough patch vs AMD chips for performance/price tradeoffs

2) Apple M Series Chips

3) New Snapdragon ARM chips are poised to do to the rest of laptops what the M series did to Macs. Once someone has all-day battery life they aren't going back to Intel.

smaudet
0 replies
1d2h

https://www.cnbc.com/2024/02/07/arm-earnings-report-q3-2024-... https://www.techpowerup.com/322317/amd-hits-highest-ever-x86...

Did a pretty simple search, but these appear to be the trend being referenced.

The first marks a clear downward trajectory for Intel (on x86) and upward one for AMD, the second mentions high ship numbers for ARM.

I imagine you could slice those to show that Intel's market share is decreasing, at least relative to total market size (which could be increasing).

I just thought that something like this could be achieved by Intel with their processors too but it doesn't look like that's the case.

Nope. Intel has a large set of extensions to support with CISC (slow, more complexity), whereas a RISC architecture has a small instruction set and hence more leeway to do as it pleases (fast, less complexity).

You could probably strip down Intel's CISC architecture to something much simpler/nimbler, but you'd break a ton of existing applications. Or you could go RISC as well, and break even more things...

Or, like Apple, you decide you don't care and effectively deprecate all the "old" applications (they run on a compatibility layer), and use your expertise and vertical integration to design a system from the ground up... Microsoft and Intel would both have to decide to completely abandon their existing product lines and "start over", which they could do, but then they'd also have to work very, very closely with each other (no egos/bosses trying to do things "their" way) to make that work, and probably piss off every other vendor they've ever worked with at the same time...

Which is probably why Microsoft is trying so hard to "just" get ARM working; it's far simpler to target a platform with a large and growing base than to try to work efficiently with Intel...

rob74
0 replies
1d3h

I'm wondering if they mean "Intel" in the "x86" sense or "Intel" as in "CPUs actually built by Intel"? The market share of the latter is probably shrinking faster than the former...

limpbizkitfan
0 replies
1d2h

Intel’s PC market was partially eaten by Apple’s switch to M1 and even earlier, Chromebooks moving to ARM/RISC. New server farms and clusters now have a CPU and a GPU as opposed to the past where you’d have just Xeons everywhere (a market Intel took from Sun). Intel tried to buoy themselves by purchasing Altera and working on discrete GPUs/AI focused peripherals, but AMD bought Xilinx and had (I believe) far better resources and experience in the GPU segment. Most troubling for Intel, they still invest heavily in and own their fabs. If there’s some protective anti-China tariff (certainly, there will be if China and Taiwan do reunify), perhaps Intel could have a slight advantage with their fab locations??

katbyte
0 replies
1d2h

The Apple chips are so much "better" for a bunch of different reasons, including the coprocessors, but the biggest is the move to the ARM instruction set vs Intel's x86 - a change Intel cannot make for its main line of processors.

solidasparagus
4 replies
1d2h

I feel like you are complaining about the first car. Yes, it sucks and, god, I'm not sure it's good for the world. But zoom out a bit and it seems fairly clear to me that this is what the future is going to look like and this is why Microsoft feels compelled to act.

This is the first step toward creating an assistant that automatically does tasks for you on your computer, driven by natural language. The context being collected by Recall is essential for this technology to work effectively, as people frame natural-language instructions with the assumption that the assistant shares their context. And they probably want to collect a training set, although it's excellent (and necessary for trust) that this data is local-only, which might mean this doesn't generate any useful training data. I'm super happy about that.

Despite occasional snarky comments from tech savvy people, people absolutely do want this - normal people like computing to be simpler.

I do agree that this feels a bit early, but I don't know enough to say yet. Rewind AI (the startup) handles privacy by not capturing incognito windows, which I feel is a good start and we need to keep iterating on techniques like that to balance trust against usefulness (of course with the ability to disable the whole thing if you are willing to lose the features). Hopefully Microsoft spent enough time thinking about this, but time will tell.

Fundamentally the need for AI to understand your context to correctly assist you as well as the feedback techniques essential to training effective AI are going to run up against privacy concerns and we're going to have to figure them out as an industry - preferably in public, preferably with a company that takes trust seriously. This is fundamental to the technology IMO. Is Microsoft that company? I don't know, but I wouldn't underestimate how valuable this data would be if it was on Microsoft's server, so I personally am quite happy that they made the right choice there.

wkat4242
1 replies
1d2h

people absolutely do want this - normal people like computing to be simpler.

Of course I want this. I just want to have it myself. Like for the assistant and its data to be truly mine. I don't want Microsoft to run it for me. I don't mind paying for someone to build it but not to operate it. Especially not Microsoft. They're as bad as Google.

I work with Microsoft people daily and they're so brainwashed. They think their company is beyond any reproach, that the products are the best in the market, that using them is a win for us etc.

I hate them and their company with a passion but unfortunately our leadership is easily impressed and not too smart. They just hobble along with whatever Gartner is paid to tell them.

solidasparagus
0 replies
1d2h

What do you mean for the assistant to be yours? There is a core model somewhere that needs to be trained and the cost/technical requirements are prohibitive for that to be done by each individual. If the model is local, is that enough? The data of course should be yours - and until we solve information leaks in AI (never?), it will have to be or the product has huge viability risks.

Agree on data, which is one of the core issues. The expectation of privacy on your device is very high. But that can change, especially with the rise of web app tools that blur the line between your private space, the company's space, and public.

asadotzler
1 replies
23h19m

The first car wasn't automatically rolled out to 100% of horse and buggy users without their permission and no way to return to their buggy. Why wasn't it? Because it wasn't fit for purpose for most horse and buggy users, and it would be decades, almost half a century before it was.

solidasparagus
0 replies
22h29m

You can disable Recall.

amelius
4 replies
1d3h

Looks like their web server committed suicide first.

cgannett
0 replies
1d3h

I was able to get this to load by going to an older antipope.org post and then navigating to the latest blog post.

That, or whatever was wrong was fixed just now.

banish-m4
0 replies
1d3h

    Server: Monty Python's Life of Brian - Crack Suicide Squad, Electronic Regiment/0.1.0

bee_rider
3 replies
1d3h

I don’t think MS has any idea about how to make Windows better. So they are bolting on extra crap and services. Re-invent the UI again.

Maybe nobody wants to work on hard and boring kernel stuff?

baal80spam
2 replies
1d1h

I don't really think they want to make Windows _better_. All they want is to push more ads and other crap in the user's face.

cchi_co
0 replies
3h34m

Yeah, some decisions are really questionable.

bee_rider
0 replies
1d

If they could make money by selling new copies of Windows I’m sure they would. But they ran out of ideas with Windows 7. Since there’s no reason to buy a new copy of Windows, MS needs to turn their customers into a recurring revenue stream.

I wonder if they fired everybody creative a couple years ago while CPUs were stagnant, leaving them in a rough spot to respond now that things are doing ok (I mean we’ll never have 199X-200X again, but there’s competition again at least).

vessenes
2 replies
1d2h

From the title, I thought Charlie would be in full rant mode, but this is middle of the road for him. Windows enshittification continues, and will continue, for sure.

But, he’s absolutely wrong that most people care about whether Recall stores things or not. They really really want what Copilot offers (note I didn’t say delivers, yet), which is labor saving on their knowledge work. My kids use a version of Recall on Mac for classroom lecture notes, and .. they find it helpful. People who have a job of making PowerPoints based on information from emails are going to find Copilot helpful.

These are the people OpenAI believes won’t have jobs eventually, because this sort of work is going from ‘needs human with college degree’ to ‘automatable’ very rapidly.

Charlie doesn’t offer any alternative architectures for this in his post, and that’s because there aren’t really any — MS delivered a locally encrypted database with RAG, and that’s … as good as it gets for this feature set right now. If nobody wants it, it will die. But, a lot of people want it, because they believe they’ll keep their jobs and get to be more productive in the meantime.

enragedcacti
1 replies
1d2h

MS delivered a locally encrypted database with RAG, and that’s … as good as it gets for this feature set right now.

I don't think that's true, at the very least they could have separately encrypted the db with a locking timeout the way that password managers do. That alone would eliminate a number of people's concerns.

Beyond that there are a ton of ways they could make the feature safer. They could have a setting where anytime the DB is unlocked the system uses Windows Hello periodically to confirm that the user that typed the password is still in front of the machine. They could proactively prompt the user to update rules or pause Recall in cases where recall is capturing snapshots the user might not want stored (and I question the overall utility of an AI that can't do that reasonably well for the common cases).
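The password-manager-style lock timeout is a well-understood pattern. A toy sketch of the idea (a hypothetical `AutoLockingStore`, not Microsoft's actual design; a real implementation would zero the decrypted key material rather than just dropping the reference):

```python
import time


class AutoLockingStore:
    """Toy model of a password-manager-style auto-lock: decrypted
    data is dropped after an idle timeout, forcing re-authentication."""

    def __init__(self, timeout_seconds: float):
        self.timeout = timeout_seconds
        self._data = None       # decrypted secrets, held only while unlocked
        self._last_used = None  # monotonic timestamp of last access

    def unlock(self, data):
        """Called after the user authenticates (e.g. via Windows Hello)."""
        self._data = data
        self._last_used = time.monotonic()

    def read(self):
        """Return secrets if still within the idle window, else lock."""
        now = time.monotonic()
        if self._last_used is None or now - self._last_used > self.timeout:
            self._data = None   # drop decrypted material
            self._last_used = None
            raise PermissionError("locked: re-authentication required")
        self._last_used = now   # sliding idle window
        return self._data
```

The point being: none of this is exotic, it's the same sliding-window lock every password manager ships today.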

But all those things would be active reminders to the user of how creepy and dangerous the feature is, so instead they have to pretend that they've thought of everything and that there's no reason to be concerned.

vessenes
0 replies
13h54m

These are fair, but I'd argue that they're still largely security theatre, and, to your point, worsen the emotional experience a lot. As a product manager, I think it's a fair shake to be like "We expect that generally your Windows machine is secure, and we do a little extra here, no worries mate!" Not that I believe the priors embedded in that pitch; I don't. But, most customers do already, and it's probably a bad plan to make them doubt those priors.

Anyway, you said it precisely at the end: don't remind the customers how creepy this is.

gigel82
2 replies
1d3h

Microsoft is pushing this feature into the latest update of Windows 11 for all compatible hardware and making it impossible to remove or disable

They lost me there. This is false, you can definitely disable it.

Also, all these articles talking about how the SQLite database is "plain text": if someone wanted this feature to exist, how would they want the database to be "encrypted"? (assuming they'd still want to access it to search)

enragedcacti
0 replies
1d2h

A good start would be doing what Microsoft already does for Edge's Password Manager. It separately encrypts the database with the user's creds and triggers a prompt for them when the feature is accessed. It doesn't solve all the attack vectors but it would be a good start.

https://learn.microsoft.com/en-us/deployedge/microsoft-edge-...

With Edge specifically, it's not really clear to me when the prompt is required, but it happens at least sometimes, as you can see here:

https://support.microsoft.com/en-us/topic/edit-your-password...

Iwan-Zotow
0 replies
1d2h

This is false, you can definitely disable it.

not in the corporate world

if a feature is enabled or disabled on a corp laptop, there is nothing you can do

chucke1992
2 replies
1d3h

People are using their personal accounts to store corporate credentials in browsers. These days social engineering is a bigger threat than Recall.

Plus corporate laptops are required to use things like Bitlocker and stuff.

rob74
1 replies
1d3h

Ok, then Recall just increases the amount of juicy stuff attackers might find on your machine once they have found the credentials by other means. Forget browsing in private mode, Recall sees and remembers everything!

donmcronald
0 replies
23h51m

And there's no way it stays per-machine. It's almost guaranteed it's going to get synced to all your devices via OneDrive at some point.

breadwinner
2 replies
1d3h

Microsoft is clearly not committing suicide: Look at how much attention this feature has got them. Every tech publication is talking about Windows. Windows is suddenly relevant again! How is that suicide?

It is Microsoft customers who are committing suicide, by leaving this feature on. Users need to be educated on the dangers of this feature, and how to turn it off.

datavirtue
0 replies
1d2h

We went over the slippery slope and have the wind in our hair.

Lord-Jobo
0 replies
1d2h

If someone points a gun at you, says "I am going to shoot you in 5 seconds", and you stand there and get shot instead of diving behind cover, it's still not suicide. It's a dumb death, but the shooter still committed a homicide.

And I do feel like that is a decent analogy for the situation.

andrewdb
2 replies
1d3h

Steel-manned Deductive Argument

Premise 1: AI (specifically, statistical modeling based on hidden layer neural networks) has been increasingly integrated into various technological products and services.

Premise 2: Major companies like Microsoft, Apple, and Intel are intensifying their efforts in AI development and integration, with Microsoft announcing products like CoPilot+ and Recall.

Premise 3: CoPilot+, which is AI-driven, is built on users’ actual usage data, potentially offering more realistic and user-friendly assistance than previous AI iterations like Microsoft’s Clippy.

Premise 4: Recall, another AI feature by Microsoft, aims to enhance user productivity by automatically capturing and storing screenshots and textual content, but it stores this data unencrypted locally, posing significant privacy risks.

Premise 5: The continuous expansion of AI features in technology products often correlates with increased privacy risks and potential legal issues, as evidenced by the concerns surrounding Recall’s handling of personal data.

Conclusion: The rapid integration of AI into technology products, while intended to enhance functionality and user interaction, simultaneously amplifies privacy and security concerns, necessitating a cautious and regulated approach to AI deployment in consumer technologies.

Possible Fallacies in the Argument

Hasty Generalization: Concluding that all AI integrations pose privacy risks based on specific examples like Microsoft’s Recall might not account for other AI integrations that prioritize security and privacy.

Slippery Slope: The argument implies that increasing AI functionalities will inevitably lead to greater privacy and legal issues, which may not necessarily hold true if appropriate measures are taken.

Appeal to Fear: Highlighting the severe privacy risks and potential legal issues may play on the fears of surveillance and loss of privacy, overshadowing potential benefits of AI.

Biased Sample: The argument focuses mainly on Microsoft’s implementations of AI, which may not represent the broader industry approach to AI integration and its implications.

LoganDark
1 replies
1d3h

Is this comment LLM-generated?

andrewdb
0 replies
20h20m

In large part, yes

stephenflanders
1 replies
23h12m

"It's the latest hype bubble now that Cryptocurrencies are no longer the freshest sucker-bait in town"

Strong statement when Bitcoin is sitting at around all-time highs and ETFs are pulling in almost a billion a day.

tim333
0 replies
5h5m

The AI hype is probably more recent though. Bitcoin is becoming mainstream.

dustedcodes
1 replies
1d3h

Last year, at the peak of all the tech lay-offs, someone said that the cheapest way to quickly pull an Elon Musk and fire huge swathes of employees is to buy them a Windows Surface device for work and then wait for the resignation letters shortly after. I wasn't sure if this was a joke or if they were serious; I wouldn't be surprised if it was true lol

datavirtue
0 replies
1d1h

The Surface truly is Apple-priced garbage. I have had several and didn't like any of them. I have a new one now.

We made the mistake of rolling them out with Windows 10. Trackpad and screen input are easy to disable accidentally. Real PITA.

avmich
1 replies
1d3h

Recall takes snapshots of all the windows on a Windows computer's screen (except the DRM'd media, because the MPAA must have their kilo of flesh) and saves them locally.

Can I turn my app's windows into DRM'd media? Just curious.

rossy
0 replies
1d2h

Yes, I think SetWindowDisplayAffinity is the Windows API function.
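For reference, this is a single Win32 call: `WDA_EXCLUDEFROMCAPTURE` (Windows 10 2004+) omits the window from captures entirely, while the older `WDA_MONITOR` just blacks it out. A minimal ctypes sketch (the affinity constants are from winuser.h; the call itself naturally only works on Windows):

```python
import sys

# Display-affinity constants from the Win32 headers (winuser.h)
WDA_NONE = 0x00000000                 # normal: window appears in captures
WDA_MONITOR = 0x00000001              # window shows up black in captures
WDA_EXCLUDEFROMCAPTURE = 0x00000011   # window omitted from captures entirely


def exclude_window_from_capture(hwnd: int) -> bool:
    """Ask Windows to keep the given top-level window out of
    screenshots and screen recordings (and, presumably, Recall)."""
    if sys.platform != "win32":
        raise OSError("SetWindowDisplayAffinity is a Windows-only API")
    import ctypes
    return bool(ctypes.windll.user32.SetWindowDisplayAffinity(
        hwnd, WDA_EXCLUDEFROMCAPTURE))
```

Note that DRM'd video players use a related protection at the compositor level, so this isn't exactly how the MPAA carve-out works, but it is the documented way for an ordinary app to opt its own windows out of capture.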

andrewfromx
1 replies
1d3h

more like a beached whale. we just don't know if it was intentional or they just drifted this way and oops!

cchi_co
0 replies
3h32m

An interesting and sad comparison. Time will show.

OptionOfT
1 replies
1d1h

I searched HN but couldn't find anything, but I could swear I saw something like Recall a while back: a screenshot tool that ran everything through OCR / detection, put it in a DB, and made it searchable.

It was open source on GitHub.

Sidenote: I'm conflicted on the privacy issue here, but on the other hand it seems to be bringing ARM to Windows (again), which I support.
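The basic pattern those tools use (and roughly what Recall is described as doing) is tiny: OCR each screenshot and drop the text into a full-text index. A toy sketch with the OCR step stubbed out (real tools use Tesseract or platform OCR APIs; the table and function names here are made up):

```python
import sqlite3


def make_index(path: str = ":memory:") -> sqlite3.Connection:
    """Create a searchable snapshot index using SQLite's built-in FTS5."""
    db = sqlite3.connect(path)
    db.execute(
        "CREATE VIRTUAL TABLE IF NOT EXISTS shots USING fts5(taken_at, text)")
    return db


def add_snapshot(db: sqlite3.Connection, taken_at: str, ocr_text: str) -> None:
    """Store one screenshot's OCR'd text; in a real tool ocr_text
    would come from OCR run over the captured image."""
    db.execute("INSERT INTO shots VALUES (?, ?)", (taken_at, ocr_text))


def search(db: sqlite3.Connection, query: str):
    """Full-text search over everything that was ever on screen."""
    return db.execute(
        "SELECT taken_at, text FROM shots WHERE shots MATCH ?",
        (query,)).fetchall()
```

Which is also exactly why the privacy objections bite: a plain SQLite file like this is trivially readable by anything running as the same user.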

zitterbewegung
0 replies
1d3h

I think if you replace Microsoft with Google it makes more sense. Microsoft AI in their laptops can fail like their Windows Phone did. But Google has much less of a moat than Microsoft, especially when they are hurting themselves in the same way as Microsoft and are already doing what the article says.

tsunamifury
0 replies
1d3h

No they are using the best training ground they can find to teach AI to do your job.

Every exec right now is selling the story will transform their company into an employee less infinite profit future.

trollerator23
0 replies
1d3h

Eh. Charles Stross went off his rocker long ago. Not worth reading.

tim333
0 replies
4h58m

I'm sure they are not trying to commit suicide. In response to the horror, they'll probably just make Recall easier to switch off and improve security a bit.

As an aside, there was an interesting graph recently showing OS share on all personal devices from 1970 to now. Windows has fallen from nearly all the market around 2000 to about 30% now, as people use iOS, Android and Mac for their stuff. Also, slightly to my surprise, it shows the dominant OS in 1970 as BASIC. Which brings back memories, as my first personal machine was a BBC Micro running that.

https://www.reddit.com/r/pcmasterrace/comments/1d47yuf/the_o...

throwme0827349
0 replies
1d2h

Thank goodness my Windows updates are broken!

tempodox
0 replies
1d3h

This article is balm for my battered nerves. Unfortunately it won't reach, let alone convince, those who have the most urgent need of getting the message. They will have to learn it the hard way.

tempodox
0 replies
1d2h

If things should progress logically for a change, all those organizations that have an obligation of confidentiality will get sued into oblivion for using Windows.

surfingdino
0 replies
1d3h

MSFT's future is safe, it is not a company of innovators, but it is good at selling its products and services.

runjake
0 replies
1d2h

No, I don't think Microsoft is necessarily committing suicide.

However, I believe they are too reactive vs. proactive. In other words, they're making the mistake of focusing too heavily on predicting technological trends, rather than forging their own path and creating their own opportunities.

Admittedly, they are certainly doing this in certain market segments, such as developer tools -- and perhaps, Azure.

But what do I know? I'm a dumb programmer.

roamerz
0 replies
1d3h

I wouldn’t mind the total recall feature if it was OPTIONAL. I would probably use it on a dedicated pc that I would use for diagnostics and issue remediation on. Would serve as a (probably) great tool for creating documentation after the fact where you don’t have the bandwidth to do it in real time.

But to use it on a pc that is used for day to day activities? It would effectively circumvent every policy which I’m required to adhere to. This would be a shit show especially in government where anyone can legally and with force request “all correspondence and electronic records pertaining to xxx”.

renewiltord
0 replies
1d3h

Interesting UI slug. Truncation or word blacklist?

nashashmi
0 replies
1d2h

Suicide won't be committed by missteps. Suicide will be committed by departing from your brand. And this copilot+ and recall is on-brand. And the missteps will be corrected. Remember tick-tock. The ticks suck! The tocks do well. Tick: 95, Windows 2000, ME, Vista, Win8. Tock: Windows 98SE?, XP, 7, 10. I am not sure where to put Windows 11.

Copilot+ is a tick. Let's wait for what is next.

nadam
0 replies
1d2h

'The breaking tech news this year has been the pervasive spread of "AI" (or rather, statistical modeling based on hidden layer neural networks) into everything'

Calling current generative AI tech 'statistical modeling based on hidden layer neural networks' is about as insightful as describing any new development in the tech industry (like personal computers, software as a service, or game engines) as 'usage of stored-program computers' or even 'usage of transistors'. I understand the author's aim is to discredit generative AI this way, but referring to something by one of its very basic, old, low-level building blocks just feels dumb.

msl09
0 replies
1d2h

To answer the question in the title, I don't think that Microsoft is even remotely trying to commit suicide. I think they realized that they are so freaking huge that they can nuke the equivalent of an entire year of revenue and still be in no danger.

So they can just try stupid shit left and right and see what happens, if they get lambasted for it, it's no big loss. If it works they will be genius visionaries, stock prices go up, investors are happy.

It doesn't please me but, it is what it is.

moffkalast
0 replies
1d3h

Always has been.

iancmceachern
0 replies
1d2h

Don't forget Hololens

hurflmurfl
0 replies
1d2h

So after building my latest rig and getting a crazy number of network driver issues, I've finally ditched Windows for Linux. While trying to figure out how to keep track of all the system changes I make, in case I change my distro and need to set up my system yet again, I came across NixOS, which has been working out pretty well for me.

The issue I'm currently battling is how to connect my cloud drive (self-hosted Nextcloud) to the system the way it worked on Windows, and so far no luck. Once I figure it out (if at all), it's going to be perfect.

Surprisingly, even gaming works just fine via steam. I've been able to play whatever I have in my library by "forcing compatibility tool" ( which is proton). Colour me impressed.

Now I just need to confirm that I can run my country's official program for tax-related stuff via wine and I'll have no regrets.

hulitu
0 replies
1d1h

Is Microsoft trying to commit suicide?

Kids those days. Wikipedia has some pages about Microsoft. /s

geor9e
0 replies
16h44m

I'm still not convinced the very loud people online who hate it aren't just a very vocal minority. Personally I plan to leave the feature on. I await real data on what the silent majority of Windows users think. It's not like MSFT would just randomly launch a hugely divisive, frankly shocking, feature to a billion-plus users without a lot of dogfooders and focus group studies at least giving it a passing grade. To me, this doesn't feel any different from browser history. I do get why privacy-enjoyers get the willies from it too. That's what a settings toggle is for.

daft_pink
0 replies
1d3h

I think they simply look at a company like Google and wish that they could get flexible advertising revenues from all their users instead of getting a few bucks from Dell for every machine sold.

If they could achieve that, they could 5x their revenue and make sooo much $’s.

d_burfoot
0 replies
1d3h

Page is down, I assume due to load from HN traffic. Maybe HN should implement some kind of custom CDN, but only for pages that hit the front page?

chfritz
0 replies
1d1h

They've been trying to commit suicide for over 20 years now, they just never succeed.

blackeyeblitzar
0 replies
1d2h

I see it less as suicide and more like an anti competing attempt to get all the gains from the AI revolution, through classic Microsoft practices like bundling (tying AI features and products to other products), using their muscle in OEM partnerships (forcing Copilot keys on everyone), etc. I think they’ll succeed, so I don’t think it’s suicidal. I just think it’s bad for fair competition and our societies.

astro-
0 replies
1d3h

I love that DRM content was where they drew the line. Customer privacy? Eh, whatever. But movies? Let's not poke the bear...

Desktop copilots/assistants have huge potential, but this type of data collection can't possibly end well.

amai
0 replies
1d2h

Microsoft is just the first prey of AI.

alangibson
0 replies
1d2h

We've realized for some time that not only are language model technologies going to dominate our future

Where 'we' is defined as people whose income depends on ever more investment in LLM tech. For the rest of us, they seem hopelessly flawed as a general solution to anything.

aftbit
0 replies
1d2h

I'm always here for a Charlie Stross post. Now what is this all about?

slurping all incoming email for accounts accessed via Microsoft Outlook into Microsoft's own cloud for AI training purposes

Really? ALL incoming email that you just access via Outlook? That's not okay. Does Google or Apple do that too?

aftbit
0 replies
1d2h

Has this post been deranked or something? It went from nearly the top of page 1 to half way down page 3 in under an hour.

SpacePortKnight
0 replies
1d1h

All these fancy AI laptops, and all I want is a fanless laptop like the MBA, but with Intel.

PaulHoule
0 replies
1d2h

This article is so popular it crashed his website, see the archive https://archive.ph/5sSg1

I remember Clippy being the face of an online help system for Office that was highly effective. It's hard to remember that, before Google, the standard for a full-text search facility was that it probably didn't work. People had learned the hard way that it just wasn't worth doing a full-text search of the manual for a software product. Microsoft tried to break through that by putting a better-than-expected search facility behind an engaging character that... didn't work on an emotional level.

Compare to other bad product launches from Microsoft such as OneDrive. Developed shambolically, they didn't do due diligence on the trademark and had to rename it from SkyDrive. They tried to force it down your throat when it was in a state where it didn't work reliably: OneDrive became the default save location for Office, and if OneDrive was flaking out you could not save a file at all! (Annoying as hell for a platform that is always asking me if I want to recover a file I didn't save four years ago.)

There's also the fact that people seriously roll their eyes now when they hear "AI", "Blockchain" and "5G" (especially in the same sentence) There are numerous papers where people apply ML to scientific problems and I'm not so inclined to post them because too many of y'all think that anything like that is a scam... Even though the automatic differentiator in neural network tools is sure helpful for solving differential equations and such.

If I was trying to sell an AI product in 2024 I'd try to avoid the words "AI" and "ML" specifically: if you were working on this stuff ten years ago you were ahead of the curve, if it looks like you just got into it now the automatic assumption is you are behind the curve, not ahead.

NemoNobody
0 replies
1d2h

Ok - unpopular opinion - this would be cool if the data could never leave the machine. I'd never need another screenshot.

Perhaps if this was a service with an on/off button.

I don't need 12 hours of a gaming binge, or any hours of a porn binge, saved second by second for me. I have no use for a record of movies I watched screen by screen - especially if they will simply be black screens.

Total recall of whatever browsing, project work, or file organization/data management stuff I've ever done, as I did it, would be great actually.

JonBarrett92
0 replies
1d2h

“Unfortunately, human beings assume that LLMs are sentient and understand the questions they're asked, rather than being unthinking statistical models that cough up the highest probability answer-shaped object generated in response to any prompt, regardless of whether it's a truthful answer or not.”

I don’t think this is something that will stay true. Users become more educated on what their systems can and cannot do as well as an intuition of their inner workings over time. Getting the shape of the highest probability answer is a very useful tool that shouldn’t be underestimated, especially if it’s understood that way.

As for the Recall related stuff in the rest of the article, I have similar sentiment. Only difference is that I feel like they could really pull off a very useful, similar tool but there are definitely a lot of serious concerns to be solved here first, and rolling it out in its current state is morally irresponsible of those pushing it at Microsoft.

Andrex
0 replies
22h43m

But they’ve been going backwards since 2020, with dick moves like disabling auto-save to local files in Microsoft Word (your autosave data only autosaves to OneDrive)

I missed this when it happened. What the fuck?

1vuio0pswjnm7
0 replies
22h9m

Another possible risk that is not addressed in this submission, or in any other submission on this that I have seen so far, as follows.

Any Microsoft software installed on the computer owner's computer, e.g., the kind that "updates" itself remotely, because Microsoft coerces computer owners into keeping their computers connected to the internet 24/7,^1 can query this SQLite database to learn what the computer owner is doing with that software and also what the computer owner is doing outside of that software, e.g., with other software. The latter is, in effect if not literally, next-level "telemetry".

It is also an internet marketer's wet dream.

Arguably it does not matter whether the data in this database is ever transferred to Microsoft, so long as Microsoft can install and run any software it wants on someone else's computer^2 and that software can query the database to see what the computer owner is doing with their computer.

We have seen how badly software developers want to use so-called "telemetry", by default, at the cost of computer user privacy. As such, this is, IMHO, a plausible risk.

1. Why? See #2.

2. It can, just like a botnet.